Live Translation
Real-time multilingual conversations on smart glasses
Defined and drove the hands-free interaction design for Live Translation on Ray-Ban Meta smart glasses. Built the case for the feature, owned the end-to-end UX and partnered with a visual designer to bring real-time multilingual conversations to life across contexts.
Enable real-time multilingual conversations through smart glasses, allowing users to communicate naturally across language barriers without reaching for their phone.
Designing a translation experience for glasses meant working within strict latency, audio and visual constraints while making the interaction feel seamless enough that users stay focused on the conversation, not the technology.
Voice-first translation
Designed the end-to-end voice interaction flow for live translation, defining how users initiate, manage and end translation sessions entirely hands-free. Partnered with a visual designer to complement voice with on-screen elements on display-enabled devices.
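To illustrate the kind of session model this flow implies, here is a minimal sketch in TypeScript. The states, voice phrases and function names are hypothetical stand-ins for illustration, not the shipped command set or Meta's implementation.

```typescript
// Hypothetical sketch of a hands-free translation session as a small state machine.
// States and voice phrases are illustrative assumptions, not the shipped command set.

type SessionState = "idle" | "translating" | "paused";

interface VoiceCommand {
  phrase: string;                            // utterance matched by the voice pipeline
  action: (s: SessionState) => SessionState; // transition, valid only from certain states
}

// Each command only transitions from the states where it makes sense, so a
// misrecognized phrase mid-conversation never puts the session in a bad state.
const commands: VoiceCommand[] = [
  { phrase: "start translation", action: (s) => (s === "idle" ? "translating" : s) },
  { phrase: "pause translation", action: (s) => (s === "translating" ? "paused" : s) },
  { phrase: "resume translation", action: (s) => (s === "paused" ? "translating" : s) },
  { phrase: "stop translation", action: () => "idle" },
];

function handleUtterance(state: SessionState, utterance: string): SessionState {
  const match = commands.find((c) => c.phrase === utterance.toLowerCase().trim());
  return match ? match.action(state) : state; // unknown speech leaves the session untouched
}
```

Modeling the flow this way keeps the interaction predictable: every session has exactly one state, and the only way to change it is a deliberate voice command.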
Translation interaction model
Mapped the end-to-end translation flow, from language detection to real-time audio output, defining how the experience adapts across voice-only and display devices.
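A rough sketch of how such a pipeline might branch on device capability follows. The interfaces, the detectAndTranslate stub and the capability flags are assumptions made for illustration; they are not Meta's internal APIs.

```typescript
// Hypothetical pipeline sketch: detect and translate an utterance, then render
// the result according to what the device can do. All names are illustrative.

interface Device {
  hasDisplay: boolean;
  speak(text: string): void;        // route translated speech to the open-ear speakers
  showCaption?(text: string): void; // only present on display-enabled devices
}

interface TranslationResult {
  sourceLanguage: string;
  translatedText: string;
}

// Placeholder: a real system would call speech-recognition and MT services here.
function detectAndTranslate(utterance: string, targetLanguage: string): TranslationResult {
  return { sourceLanguage: "auto", translatedText: `[${targetLanguage}] ${utterance}` };
}

function deliverTranslation(device: Device, utterance: string, targetLanguage: string): void {
  const result = detectAndTranslate(utterance, targetLanguage);

  // Audio is the primary channel on every device, keeping eyes on the speaker.
  device.speak(result.translatedText);

  // Display devices layer captions on top of audio rather than replacing it.
  if (device.hasDisplay && device.showCaption) {
    device.showCaption(result.translatedText);
  }
}
```

On a voice-only device the same call simply skips the caption branch, which is what lets one flow adapt across hardware instead of forking into two designs.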
From concept to shipped experience
Explored multiple directions through rapid prototyping and voice-script testing. Each iteration was shaped by user feedback and technical constraints.
Shipped Live Translation on Ray-Ban Meta, enabling real-time multilingual conversations hands-free. The feature was demoed live by Mark Zuckerberg at Meta Connect 2024 and featured during the 2025 Super Bowl.