aivancity blog

Ray-Ban Meta: see, speak, interact… Artificial Intelligence is coming to your nose

Long considered futuristic gadgets or commercial flops (think Google Glass or Snap Spectacles), smart glasses are making a comeback thanks to dramatic advances in artificial intelligence. Worn as part of everyday life, they no longer just record or stream: they understand, respond, translate, and interpret—all in real time.

It is against this backdrop that Meta, in partnership with Ray-Ban, has launched the second generation of its smart connected glasses. More discreet, more powerful, and above all equipped with a built-in AI assistant, they represent a new milestone in the convergence of everyday objects and automated cognitive capabilities. Far from being mere cameras mounted on temples, the Ray-Ban Meta 2025 is a true portable AI terminal, capable of interacting with the environment, processing voice commands, and providing context-aware responses.

Behind the familiar look of classic sunglasses lies cutting-edge technology. The second generation of Ray-Ban Meta features:

But most importantly, they are the first to integrate Meta’s AI voice assistant—based on Llama 3—directly into the glasses. This assistant can answer questions, analyze a visual scene, summarize a conversation, translate in real time, and even comment on what the user is seeing. The promise? A generative, contextual, and multimodal AI, always within sight (and earshot).

Wearable connected devices, which include smart glasses, represent a rapidly growing market segment. According to IDC, more than 120 million units were sold worldwide in 2024 (including smartwatches, earbuds, and smart glasses). Meta aims to rapidly expand its Ray-Ban Meta lineup, targeting 1 million units sold for the second generation, set to launch in the spring of 2025 [1].

The launch price in France is €369 for the base model, with options available for prescription or photochromic lenses. The primary use remains photo and video capture (80% of users), but Meta notes a 45% increase in voice interactions between 2024 and 2025 [2], a sign that the “conversational AI” aspect is gaining traction.

These figures reflect a fundamental trend: AI is becoming increasingly integrated into the human body, in forms that are ever more discreet, mobile, and personal.

The experience offered by Ray-Ban Meta is based on the interaction between voice, vision, and intent. Users can activate the voice assistant simply by saying “Hey Meta,” then make a natural request such as:

The glasses then use a combination of visual data captured by the camera, contextual information (geolocation, time, history), and voice input to generate a relevant response.
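To make this pipeline concrete, here is a minimal sketch in Python of how such a wake-word-gated, multimodal request might be assembled. All names (`Context`, `build_request`, the scene labels) are hypothetical illustrations of the fusion described above, not Meta's actual API.

```python
from dataclasses import dataclass, field
from datetime import datetime

WAKE_WORD = "hey meta"  # wake phrase mentioned in the article

@dataclass
class Context:
    """Contextual signals: geolocation, time, and interaction history."""
    location: str
    timestamp: datetime
    history: list = field(default_factory=list)

def is_wake(utterance: str) -> bool:
    # The assistant activates only when the utterance starts with the wake phrase.
    return utterance.lower().strip().startswith(WAKE_WORD)

def build_request(utterance: str, scene_labels: list, ctx: Context) -> dict:
    # Strip the wake phrase, keep the natural-language query.
    query = utterance.strip()[len(WAKE_WORD):].strip(" ,")
    # Fuse the three signal sources the article describes:
    # voice input, visual data from the camera, and context.
    return {
        "query": query,
        "vision": scene_labels,
        "context": {
            "location": ctx.location,
            "time": ctx.timestamp.isoformat(),
            "history": ctx.history,
        },
    }

# Example: a user in front of a monument asks what they are looking at.
ctx = Context("Paris", datetime(2025, 4, 1, 10, 0))
utterance = "Hey Meta, what am I looking at?"
if is_wake(utterance):
    request = build_request(utterance, ["monument", "crowd"], ctx)
```

In a real system the resulting structure would be sent to a multimodal model (Meta's assistant is based on Llama 3) to generate the context-aware response; here it simply shows how the three signal streams are combined into one request.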

Meta is also experimenting with more advanced features, such as object recognition, pedestrian navigation assistance (“follow that person in the crowd”), and interaction history that allows users to revisit past conversations.

While the innovation is undeniable, it raises numerous ethical, social, and legal questions. Wearing an AI device on one’s nose in a public place raises sensitive issues:

In France, the CNIL is closely monitoring developments in this area. In May 2025, it reiterated that any recording in public spaces must comply with the GDPR, even if carried out by a private individual using a smart device [3].

The Ray-Ban Meta is part of a broader trend of wearable smart interfaces, alongside the Rabbit R1, the Humane AI Pin, and the Apple Vision Pro. Unlike smartphones or smartwatches, these devices combine AI with eye tracking and voice control, creating a new form of “augmented” interaction.

For now, its use remains limited to early adopters and recreational, professional, or tourism-related contexts. But as the performance of voice and visual AI improves and interfaces become more user-friendly, the widespread adoption of facial AI is becoming a plausible prospect.

The Ray-Ban Meta glasses thus raise a fundamental question: do we want our most everyday tools to become intelligent agents, capable of perceiving, interpreting, and even anticipating our actions and intentions?

In previous posts on this blog, we examined Meta’s technical ambitions through the lens of VivaTech 2025, in “Mistral AI unveils a sovereign high-performance computing infrastructure in partnership with Nvidia,” as well as the rise of embedded AI in “Magistral: Mistral’s artificial intelligence that brings new meaning to automated reasoning.”

1. IDC. (2024). Worldwide Quarterly Wearable Device Tracker.
https://www.idc.com/

2. Meta. (2025). Ray-Ban Meta Usage Data – April 2025 Report.
https://about.meta.com/

3. CNIL. (2025). Wearable Devices and Privacy: Updated Recommendations.
https://www.cnil.fr/
