
Enable
AR Glasses for Google Assistant
* This project has no affiliation with Google.

Brief
- Designing for Google Assistant
This conceptual design explores AR eyewear powered by Google Assistant that generates real-time subtitles during face-to-face conversations. My goal was to improve daily communication for individuals with hearing impairments.
The project identifies key limitations in existing assistive technologies and proposes a more seamless, intuitive solution through user-centered design.
Pain Points
- AR Glasses

01
Wearability
Existing AR glasses are often bulky and heavy, making them uncomfortable to wear.

02
Compatibility
Integration with other devices and software is often limited, restricting their practicality for various applications.

03
Balance
Because current components are heavy, the weight of existing AR glasses is difficult to distribute and balance on the face.
Ideation Process
- Anti-spill, ergonomic, stackable, unified
Concept
- AR Glasses
This AR glasses concept helps people engage more fully in everyday conversations by displaying real-time subtitles. It was developed with the needs of people with hearing impairments in mind, but it also responds to a growing awareness of attention-related challenges, such as those experienced by people with ADHD, who may find it easier to stay focused with visual support. By making spoken words visible, the glasses aim to create more accessible, inclusive communication in daily life.

Enable - AR Glasses
AR glasses that display automated subtitles for people with hearing impairments.
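
For context on how the captioning concept might translate into software, the sketch below shows one possible pipeline: streaming microphone audio is sent to a speech recognizer, and interim results are drawn on the lens display as soon as they arrive, then replaced by the final transcript. This is a minimal, illustrative sketch only; the names used here (Caption, transcribe_stream, render_on_hud, caption_loop) are placeholders, and no specific Google Assistant API is assumed.

import time
from dataclasses import dataclass
from typing import Iterator

@dataclass
class Caption:
    """One subtitle segment; interim segments are refined until a final one lands."""
    text: str
    is_final: bool

def transcribe_stream(audio_chunks: Iterator[bytes]) -> Iterator[Caption]:
    # Stand-in for a streaming speech-to-text service. A real build would
    # forward each audio chunk to the recognizer and yield its interim and
    # final hypotheses as they come back.
    yield Caption("Hello, how are", is_final=False)
    yield Caption("Hello, how are you today?", is_final=True)

def render_on_hud(caption: Caption) -> None:
    # Stand-in for the display layer on the lenses: interim text overwrites
    # the current line, final text is committed to the subtitle history.
    marker = "final  " if caption.is_final else "interim"
    print(f"[{marker}] {caption.text}")

def caption_loop(audio_chunks: Iterator[bytes]) -> None:
    # Core loop of the concept: microphone audio in, subtitles out, with as
    # little delay as possible so captions keep pace with speech.
    for caption in transcribe_stream(audio_chunks):
        render_on_hud(caption)
        time.sleep(0.05)  # simulate frame pacing on the heads-up display

if __name__ == "__main__":
    caption_loop(iter([b"\x00" * 320, b"\x00" * 320]))  # dummy audio chunks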





Fashion Meets Function
Seamless fit. Limitless clarity.

Stay in every moment
Real-time captions that blend into your day - no labels, just lifestyle.