
Meta’s update to its AI-powered Ray-Bans makes the case for Apple to make smart glasses

Published Apr 23rd, 2024 6:08PM EDT
Smart(er) Glasses: Introducing New Ray-Ban | Meta Styles + Expanding Access to Meta AI with Vision
Image: Meta


Meta’s Ray-Ban glasses aren’t as popular as our phones yet, and they probably never will be. Still, as the company continues to invest in them and introduce new features, the use cases for such a product become more and more evident, and they suggest that other companies like Apple, Google, and Samsung should dive into the category.

In a blog post, Meta announced a range of new features coming to its Ray-Ban smart glasses. In addition to launching new AI vision features, the glasses can now stream what you are seeing on a video call as well as allow you to control music from streaming services like Apple Music.

Sharing what you see on a video call just got better

This feature is probably the best of the bunch. There are so many times that I’ve found myself on FaceTime with someone and wanted to show them what I’m seeing. Usually, that requires flipping to the rear camera, so that while they can now see what I see, they can no longer see me.

Meta is upgrading this experience by allowing Ray-Ban owners to share what they see on a video call. The new feature is launching first on WhatsApp and Messenger.

Whether you’re watching a legendary set from your favorite musician or mountain biking down an epic trail, now you can share it with friends and family completely hands-free. They’ll see what you see in all its uncut glory.

It’s also good for the everyday activities you can’t live without—like grocery shopping. Not sure which brand of kombucha to buy? Can’t tell if that pineapple is ripe? Now you can hop on a video call with your mom and get her advice based on what you see.

Meta’s Ray-Bans will now let you share what you see while on a video call.

Meta wants to one-up the AirPods Apple Music experience

The next feature is the ability for Meta Ray-Ban owners to access and control their music on Apple Music. As reported by The Verge, the feature wasn’t included in any of the announcements the company made today, but it has appeared in the Meta View app.

Per the report, the integration seems quite deep, as you can request specific songs, playlists, artists, or even Apple Music’s radio stations. It sounds like Apple and Meta have worked together to offer almost the same experience you would get using AirPods and Siri to request music from the service.

Another feature not mentioned in the blog is Apple Music compatibility. However, the feature appeared this morning in the Meta View app, which pairs with the glasses. The instructions note that you’ll be able to control Apple Music hands-free to play any song, playlist, station, or artist. You can also request recommendations based on your listening history.

You can now use Meta’s Ray-Ban glasses to listen to music from Apple Music.

Meta AI with Vision is rolling out in a beta starting today

While not the most flashy feature announced today, Meta’s update to Meta AI on the Ray-Ban smart glasses may be the most impactful. The company says that it “started testing a multimodal AI update in December, which helps you understand the world around you by interpreting what you’re seeing as well as what you’re asking.”

That model is now ready for the big show. Meta AI with Vision is now rolling out to all Ray-Ban Meta smart glasses in the US and Canada. Meta says it is making this available as a beta, so while it’s cool that it’s rolling out, expect to run into the usual quirks when trying it out.

Say you’re traveling and trying to read a menu in French. Your smart glasses can use their built-in camera and Meta AI to translate the text for you, giving you the info you need without having to pull out your phone or stare at a screen. Whether you’re coming up with an Instagram caption, looking for information while sight-seeing, or simply meal prepping for the week, Meta AI with Vision can help you do it hands-free.

Of course, these things take time. While it’s an exciting update, the tech is still early, so there will be times when Meta AI doesn’t get things quite right. Meta says that as it gets more feedback from people using Meta AI on their glasses and continues developing new model architectures, it expects performance to improve over time. In other words, it only gets better from here.

Meta AI with Vision lets you translate a menu you are looking at.

Where are everyone else’s smart glasses?

Seeing what Meta is doing here, I can’t help but wonder where Apple, Google, and Samsung are in the smart glasses space. I know everyone is focused on virtual and mixed reality right now, but it feels like the real long-term product that will dominate the “I wear this on my face” category is something people are already used to wearing: glasses.

With all of the features that Meta is adding here, it’s obvious that this could be an incredibly useful product for a lot of people. As an iPhone user, I would love a similar product from Apple that lets me access Siri and request to listen to music or share what I see on a FaceTime call more easily. Imagine Apple Maps laid over the real world when you’re walking through a city. That’s much more impactful than watching Ted Lasso in a virtual home theater.

I have to say, Meta has a winning product here. The only reason I don’t own smart glasses yet is that I don’t want to buy a pair from Meta. But I can’t deny this is likely a major part of the future.

Joe Wituschek, Tech News Contributor

Joe Wituschek is a Tech News Contributor for BGR.

With more than 10 years of expertise in tech, Joe covers the technology industry's breaking news, opinion pieces, and reviews.