Stephanie De Luna
AI Product Designer · Meta Wearables
Case Study

Live Translation

Real-time multilingual conversations on smart glasses

Overview

Defined and drove the hands-free interaction design for Live Translation on Ray-Ban Meta smart glasses. Built the case for the feature, owned the end-to-end UX, and partnered with a visual designer to bring real-time multilingual conversations to life across contexts.

Role
AI Product Designer
Team
Meta Wearables
Scope
Live Translation
Impact
Devices shipped: 2 (Ray-Ban Meta & Ray-Ban Meta Display)
Goal

Enable real-time multilingual conversations through smart glasses, allowing users to communicate naturally across language barriers without reaching for their phone.

Challenge

Designing a translation experience for glasses meant working within strict latency, audio and visual constraints while making the interaction feel seamless enough that users stay focused on the conversation, not the technology.

Design Approach

Voice-first translation

Designed the end-to-end voice interaction flow for live translation, defining how users initiate, manage, and end translation sessions entirely hands-free. Partnered with a visual designer to complement voice with on-screen elements on display-enabled devices.

User Flows

Translation interaction model

Mapped the end-to-end translation flow, from language detection to real-time audio output, defining how the experience adapts across voice-only and display devices.

Voice-first translation flow
Display translation flow
Prototypes & Iterations

From concept to shipped experience

Explored multiple directions through rapid prototyping and voice-script testing. Each iteration was shaped by user feedback and technical constraints.

Early concept
Iteration
Final design
Outcome

Shipped Live Translation on Ray-Ban Meta, enabling real-time multilingual conversations hands-free. The feature was demoed live by Mark Zuckerberg at Meta Connect 2024 and featured during the 2025 Super Bowl.