Meta introduces AI and Shazam integration in its smart glasses

Ray-Ban Meta smart glasses already worked well as a camera on your head and a pair of open-ear earbuds, but Meta is now updating them with live AI that works without a wake word, real-time translation between several languages, and Shazam integration for music identification.

Meta first demonstrated most of these features at Meta Connect 2024 in September. Live AI lets you start a “live session” with Meta AI, giving the assistant continuous access to what you’re seeing and letting you ask questions without first saying “Hey Meta.” If you need your hands free to cook or fix something, Live AI lets you keep getting help from the glasses while you concentrate on the task in front of you.

Live translation lets the glasses translate between English and French, Italian, or Spanish. With live translation enabled, when someone speaks to you in one of the selected languages, you’ll hear what they say in English through the glasses’ speakers or see it as a written transcript in the Meta View app. You do have to download the individual language models in advance and switch live translation on before the glasses can act as a translator, but it still feels more natural than pulling out your phone to translate something.

Thanks to the Shazam integration, your Meta smart glasses can also identify any song playing around you. Simply asking “Hey Meta, what song is this?” prompts the glasses to listen through their microphones and identify the track, much like using Shazam on your smartphone.

All three updates gradually bring the glasses closer to Meta’s ultimate goal of a true pair of augmented reality glasses that could replace your smartphone, and the experimental Orion hardware is a preview of that idea in real life. Combining artificial intelligence with virtual and augmented reality also seems to be an idea many tech giants are circling. Google’s newest XR platform, Android XR, is built on the premise that generative AI like Gemini could be the glue that makes VR or AR appealing. It will likely be years before any company ships glasses that can overlay holographic images on your field of view, but for now, smart glasses look like a moderately useful stopgap.

All Ray-Ban Meta smart glasses owners will get Shazam integration as part of the v11 software update. To use live translation and live AI, you’ll need to join Meta’s Early Access Program, which you can sign up for on the company’s website right now.
