About the Project
ClearSpeak is a conceptual accessibility product designed as part of a university UX Design course. The project explores how digital tools can better support individuals with hearing difficulties by improving safety, communication, and independence in everyday environments.
The scope of the work includes user research, competitive analysis, problem definition, ideation, and interface design. The goal was not to produce a market-ready product, but to demonstrate a human-centered design process grounded in real user needs and constraints.
This case study documents the full process, from identifying the problem space to proposing a multi-platform solution informed by research insights and iterative prototyping.
The Problem
Individuals with hearing difficulties face daily barriers that affect safety, social connection, and independence.
Critical alerts such as smoke detectors can be missed, creating real safety risks.
Conversations and group interactions are often difficult to follow, leading to social disconnection.
Existing assistive apps and devices frequently fall short in reliability, clarity, or ease of use.
Competitors
Apple, Google, Visualfy, Signly
Interviews
College Student
Highlighted challenges with crossing streets, missing alarms, and keeping up in group projects due to auditory limitations.
Elderly Person
Discussed fears related to safety, instances of missed alarms, and struggles with unclear audio at community events.
Online Personas
Online discussions surfaced deal-breakers, such as the need for simplicity, and preferences, such as wanting a widget.
Research Takeaways
Users prioritize safety, requiring ClearSpeak to deliver clear and accurate information in real-world situations where mistakes can have serious consequences.
At the same time, social inclusion, independence, and ease of use are essential, so real-time, reliable transcription with offline support is critical for trust and everyday adoption.
"How might we enhance social interactions for individuals with hearing loss?"
"How might we ensure simplicity and ease of use for our app?"
"How might we promote independence through visual assistance?"
Feedback From Prototype
Users want greater customization and control, including adjustable preferences, clearer visual hierarchy through color, real-time word highlighting, and the ability to share transcripts or sessions.
At the same time, the AR interface can feel visually overwhelming, indicating a need for simplification and more focused presentation to improve clarity and ease of use.
My Final Solution: ClearSpeak
ClearSpeak was designed in direct response to the challenges uncovered through research, prioritizing safety, clarity, and independence.
Through real-time alerts and live AR transcription, the system keeps users aware of critical sounds while enabling more natural, confident communication.
By unifying these capabilities into a single, intuitive multi-platform experience, ClearSpeak moves beyond fragmented solutions to deliver security, connection, and autonomy for individuals with hearing difficulties.
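The alert flow described above is a design concept rather than an implementation, but as a rough illustration, the sketch below shows one way a detected critical sound could be routed to multiple visual and haptic channels. All names, labels, and thresholds here are hypothetical assumptions for the sake of the example and are not part of the ClearSpeak deliverables.

```typescript
// Illustrative sketch only: routing a detected critical sound to visual and haptic alerts.
// SoundEvent, AlertChannel, and the labels/threshold below are assumptions, not ClearSpeak's actual design.

type SoundEvent = {
  label: "smoke_alarm" | "doorbell" | "siren" | "speech";
  confidence: number; // 0..1, e.g. from an on-device sound classifier
};

type AlertChannel = (message: string) => void;

const channels: AlertChannel[] = [
  (msg) => console.log(`[FULL-SCREEN FLASH] ${msg}`), // stand-in for a visual alert on the phone
  (msg) => console.log(`[HAPTIC PULSE] ${msg}`),      // stand-in for a wearable vibration
];

// Only critical, high-confidence sounds interrupt the user.
const CRITICAL = new Set<string>(["smoke_alarm", "siren"]);
const THRESHOLD = 0.8;

function handleSoundEvent(event: SoundEvent): void {
  if (CRITICAL.has(event.label) && event.confidence >= THRESHOLD) {
    const message = `Critical sound detected: ${event.label.replace("_", " ")}`;
    channels.forEach((notify) => notify(message));
  }
}

// Example: a high-confidence smoke alarm triggers both the visual and haptic channels.
handleSoundEvent({ label: "smoke_alarm", confidence: 0.93 });
```

The point of the sketch is the design choice it reflects: critical alerts bypass normal notification settings and reach the user through more than one sensory channel, which follows directly from the safety priority identified in the research.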
Thank You!