Meta expands Ray‑Ban glasses with neural handwriting, live captions and navigation
At a glance:
- Neural Handwriting is now available to all Meta Ray‑Ban Display glasses users
- Live captions are added to WhatsApp, Messenger and Instagram voice DMs
- Walking directions launch across the US and in European hubs such as London, Paris and Rome, and a developer preview opens for third‑party apps
What’s new for Ray‑Ban Meta glasses
Meta has announced a suite of updates for its Ray‑Ban Meta smart glasses, all rolling out at the same time. The headline feature is “Neural Handwriting,” a virtual‑ink system that lets wearers compose messages by making subtle finger motions in the air. Previously limited to a small early‑access group, the feature now works across Instagram, WhatsApp, Messenger and the native messaging apps on both Android and iPhone. Meta says the technology pairs with its Neural Band sensor to translate the micro‑movements into typed text in real time.
In addition to handwriting, the glasses can now capture a full‑screen recording that bundles the in‑lens display, the camera view and ambient audio into a single video file. This makes it easier for users to share what they see without juggling separate clips. The recording capability is positioned as a social‑sharing tool, complementing the existing photo capture function.
How Neural Handwriting works
The Neural Handwriting system relies on Meta’s proprietary Neural Band, a thin strap that houses motion sensors and a low‑power processor. When a user moves a finger, the band detects the trajectory and feeds the data into an on‑device AI model that predicts the intended character. The predicted text appears instantly in the selected app, eliminating the need to type on a physical keyboard or use voice input.
Meta emphasizes that the feature works offline, preserving privacy by keeping the handwriting inference on the device. Early testers reported latency under 200 ms, which the company claims is comparable to native typing speeds on a smartphone. The rollout to all users suggests Meta believes the sensor data and AI model are robust enough for mass deployment.
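Meta has not published the Neural Band’s data format or model interface, but the flow described above — capture a stroke, run on‑device inference, append the predicted character to the active app — can be sketched in a few lines. Everything below (the sample type, the toy classifier, the fake stroke data) is a hypothetical illustration, not Meta’s implementation.

```typescript
// Hypothetical sketch of a gesture-to-text loop. The types and classifier
// are invented for illustration; Meta has not published the Neural Band's
// actual data format or model interface.

interface MotionSample {
  timestampMs: number;
  // Simplified: one acceleration vector per sample from the wrist-worn band.
  accel: [number, number, number];
}

// Stand-in for the on-device model that maps a finger stroke to a character.
// A real system would run a trained sequence model; this toy rule just keeps
// the sketch self-contained and runnable.
function predictCharacter(stroke: MotionSample[]): string {
  const energy = stroke.reduce((sum, s) => sum + Math.hypot(...s.accel), 0);
  return String.fromCharCode(97 + (Math.floor(energy) % 26));
}

// Accumulate strokes into message text, the way the glasses append predicted
// characters to the selected app in real time.
function transcribeStrokes(strokes: MotionSample[][]): string {
  return strokes.map(predictCharacter).join("");
}

// Demo: two fake strokes produce two characters.
const fakeStroke = (bias: number): MotionSample[] =>
  Array.from({ length: 20 }, (_, i) => ({
    timestampMs: i * 5,
    accel: [bias, 0.1 * i, 0.05] as [number, number, number],
  }));

console.log(transcribeStrokes([fakeStroke(1.0), fakeStroke(2.5)]));
```

In a production pipeline the classifier would be a trained sequence model running on the band’s low‑power processor, with the sub‑200 ms budget the company cites covering capture, inference and on‑lens display.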
Expanded live captions and navigation
Live captions, Meta’s on‑device speech‑to‑text feature, are being extended to more communication channels. The glasses will now transcribe incoming voice messages in WhatsApp, Messenger and Instagram DMs, displaying the text directly in the glasses’ overlay. The aim is to make conversations more accessible in noisy environments, or for users who prefer reading over listening.
Navigation also gets a boost. Walking directions are now available throughout the United States and in several major European cities, including London, Paris and Rome. Users can request turn‑by‑turn guidance, which appears as a subtle arrow and distance readout in the lens. The expansion follows a phased rollout that began with select U.S. metro areas earlier this year.
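Meta has not said how the in‑lens readout is computed, but an “arrow and distance” display maps naturally onto two standard calculations: the great‑circle distance to the next waypoint and the bearing to it relative to the wearer’s heading. The sketch below is purely illustrative; the coordinates and function names are made up for the example.

```typescript
// Illustrative only: one common way to derive a heads-up "arrow plus
// distance" readout from the wearer's position and the next waypoint.
// This is not Meta's routing implementation.

interface LatLng {
  lat: number; // degrees
  lng: number; // degrees
}

const toRad = (deg: number): number => (deg * Math.PI) / 180;
const toDeg = (rad: number): number => (rad * 180) / Math.PI;

// Great-circle distance in meters (haversine formula).
function distanceMeters(a: LatLng, b: LatLng): number {
  const R = 6371000; // mean Earth radius
  const dLat = toRad(b.lat - a.lat);
  const dLng = toRad(b.lng - a.lng);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(dLng / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

// Initial bearing from a to b, 0-360 degrees clockwise from north.
function bearingDegrees(a: LatLng, b: LatLng): number {
  const dLng = toRad(b.lng - a.lng);
  const y = Math.sin(dLng) * Math.cos(toRad(b.lat));
  const x =
    Math.cos(toRad(a.lat)) * Math.sin(toRad(b.lat)) -
    Math.sin(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.cos(dLng);
  return (toDeg(Math.atan2(y, x)) + 360) % 360;
}

// The arrow the wearer sees is the bearing to the waypoint relative to the
// direction the head is facing.
function arrowAngle(headingDeg: number, wearer: LatLng, waypoint: LatLng): number {
  return (bearingDegrees(wearer, waypoint) - headingDeg + 360) % 360;
}

// Example: a wearer near Trafalgar Square, facing due north.
const here: LatLng = { lat: 51.508, lng: -0.128 };
const next: LatLng = { lat: 51.51, lng: -0.126 };
console.log(Math.round(distanceMeters(here, next)), "m to the next turn");
console.log(Math.round(arrowAngle(0, here, next)), "deg arrow rotation");
```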
Developer preview and future roadmap
Meta opened a developer preview for the Ray‑Ban glasses, allowing creators to build web‑based applications that run natively on the device. The preview also lets developers extend existing mobile apps to the glasses, potentially creating new mixed‑reality experiences. Among the announced projects is Muse Spark, an AI‑powered creative assistant slated to arrive on the glasses this summer.
The developer program includes access to the glasses’ sensor APIs, the Neural Handwriting SDK, and the live‑captioning pipeline. Meta says the goal is to foster a third‑party ecosystem that can enrich the glasses beyond the core messaging and navigation use cases. By lowering the barrier to entry, the company hopes to accelerate adoption and generate a broader range of AR content for everyday use.
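Meta has not detailed those SDK surfaces publicly, so the sketch below invents a minimal runtime interface purely to show the shape a third‑party glasses app could take: subscribing to caption and handwriting events and writing lines to the lens overlay. None of these names (GlassesRuntime, onCaption, onHandwriting, showOverlay) are Meta’s published APIs; they are assumptions for the example.

```typescript
// Hypothetical shape of a third-party glasses app. These interfaces are
// invented for illustration and are not Meta's published developer APIs.

interface CaptionEvent {
  speaker: string;
  text: string;
}

interface GlassesRuntime {
  onCaption(handler: (e: CaptionEvent) => void): void;
  onHandwriting(handler: (text: string) => void): void;
  showOverlay(lines: string[]): void;
}

// A tiny app that mirrors live captions and handwritten notes into the lens
// overlay, keeping only the three most recent lines visible.
function startNotesApp(runtime: GlassesRuntime): void {
  const lines: string[] = [];
  const push = (line: string) => {
    lines.push(line);
    runtime.showOverlay(lines.slice(-3));
  };
  runtime.onCaption((e) => push(`${e.speaker}: ${e.text}`));
  runtime.onHandwriting((text) => push(`note: ${text}`));
}

// In-memory stub so the sketch runs without any real hardware.
function demo(): void {
  let captionHandler: (e: CaptionEvent) => void = () => {};
  let handwritingHandler: (text: string) => void = () => {};
  const runtime: GlassesRuntime = {
    onCaption: (h) => { captionHandler = h; },
    onHandwriting: (h) => { handwritingHandler = h; },
    showOverlay: (overlayLines) => console.log("[overlay]", overlayLines.join(" | ")),
  };
  startNotesApp(runtime);
  // Simulate events arriving from the device.
  captionHandler({ speaker: "Sam", text: "running five minutes late" });
  handwritingHandler("pick up coffee");
}

demo();
```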
FAQ
What apps support the new Neural Handwriting feature on Ray‑Ban glasses?
Neural Handwriting now works in Instagram, WhatsApp, Messenger and the native messaging apps on Android and iPhone.
Which cities now have walking directions available on the glasses?
Walking directions are available throughout the United States and in major European cities including London, Paris and Rome.
How can developers start building for the Ray‑Ban Meta glasses?
Developers can join the newly opened developer preview, which includes access to the glasses’ sensor APIs, the Neural Handwriting SDK and the live‑captioning pipeline.