XR Design · Accessibility · Speculative Design

Facilitating seamless and inclusive communication between deaf and hearing people


PROJECT OVERVIEW

In today's interconnected world, sign language users face significant cross-cultural communication barriers, yet existing AR assistive technologies overlook the diverse landscape of sign languages. To address this gap, we created a cross-cultural sign language interpretation system built around AR glasses and a mobile app.

This system goes beyond daily communication interpretation. It features a customisation mode for adding jargon and foreign signs to personal dictionaries, and a learning mode to expand vocabulary. This speculative project aims to create a seamless global communication experience, embracing inclusivity and disability awareness.


👩‍🏫 Role

AR Design Lead

👥 Team

Jennie Lin
Nehal Sharm
Cici Lin

⏱️ Timeline

Nov - Dec 2023
Updates ongoing

🧰 Tools

Lens Studio
Blender

🎨 Deliverables

Product demo video
AR prototype videos

Design Process

💭 Empathise

Walk in deaf people's shoes to unlock insights

📝 Define

From empathy to pain points: framing deaf users' problems

🎨 Design

Dive deep and inspire solutions for deaf users

📦 Implement

Bring ideas to life, testing and iterating the product

Empathise: People with disabilities (deaf signers) face unseen communication barriers.

Learning from the disability community: many problems but few solutions for deaf signers

About 440M people in the world are DEAF

There are 300+ SIGN LANGUAGES in the world, used by more than 72M deaf or hard-of-hearing people.

There are only 10,000 certified American Sign Language interpreters in the US.

Interviews with people with hearing disabilities: “We are not uncommunicable. We are just a linguistic minority.”

We watched talks given by deaf speakers and spoke with deaf friends, including accessibility advocates and everyday deaf signers. Here is what they said.

“Sign languages are not universal. They are just as diverse as spoken languages.”

“We are a linguistic minority with a unique language and culture.”

“The biggest problem for me in sign language communication is jargon in the workplace.”

Research on the assistive tech market: There is no universal and accessible solution across the globe.

To learn how current assistive technologies serve this linguistic minority, and to inspire design opportunities, we conducted market research on three representative products.

Competitor 1: Remote sign language interpreting service

Strengths

  • Multiple interpreter support

  • Integrates seamlessly with Zoom for real-time communication

Weaknesses

  • Limited interpreter availability

  • Technology constraints

  • Limited cross-cultural functionality

Competitor 2: Real-time interpretation mobile app

Strengths

  • Mobile-first platform

  • Real-time interpretation

  • Support for various sign languages

  • Support for multiple conversation participants

Weaknesses

  • Speech-to-text only; cannot translate sign language directly into text

  • Unable to translate complicated words

  • Limited cross-cultural functionality

Competitor 3: Sign language learning app

Strengths

  • Interactive sessions with gamified features

  • A variety of learning resources

  • Personalised learning that adapts to individual pace and progress

Weaknesses

  • Limited sign language support: only offers ASL courses

  • Limited grammar and syntax instruction: focuses mainly on vocabulary and basic phrases

Based on this research, we identified the following key problem for deaf signers communicating across cultures:

“As a deaf signer, I want communication assistance so that I can communicate in cross-cultural environments without barriers.”

Overlooked linguistic minority of sign language users

Varied levels of sign language support resources

Lack of qualified human sign language interpreters

Define

Ideation: key decision on AR glasses

User flow: everyday accessible cross-cultural interpretation

To synthesise all the pain points from the previous research, we created the following persona. Three pain-point keywords are highlighted: cross-cultural communication, jargon, and continuous learning.

Device: AR glasses - visible equipment but invisible technology

After defining the problems, we officially started ideation. A key decision in this phase was the choice of device to host the product. AR glasses showed the most potential for deaf signers to communicate without extra layers of physical barriers.

Features: everyday accessible cross-cultural interpretation

Based on the AR glasses, we sketched three features and their use scenarios, each tailored to address one of the pain points. I mainly designed the first and second modes:

1️⃣ Communication mode

2️⃣ Customisation mode

3️⃣ Learning mode


Final User Flow

Design

Prototyping the system: balancing accessibility and innovation

The whole system consists of two parts: a mobile app for general application management and AR glasses as the primary interaction touchpoint. We prototyped the software for both devices. I was responsible for building the AR software in Lens Studio and designing the interfaces for the AR glasses.
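Conceptually, the split can be expressed as a shared data model: the app owns the user's profile and dictionaries, and the glasses consume them to drive interpretation. The TypeScript sketch below is a hypothetical illustration of that split; all type and field names are our assumptions, not a real schema.

```typescript
// Hypothetical data model shared by the SignSync mobile app and the AR
// glasses; names and fields are illustrative assumptions, not the real schema.

type Mode = "communication" | "customisation" | "learning";

interface SignEntry {
  gloss: string;     // written gloss of the sign, e.g. "DEADLINE"
  language: string;  // sign language code, e.g. "ASL", "BSL"
  videoUrl?: string; // reference clip used by the learning mode
  context?: string;  // optional scenario tag, e.g. "workplace"
}

interface UserProfile {
  preferredLanguage: string;       // the sign language the user signs in
  personalDictionary: SignEntry[]; // jargon and foreign signs added via customisation mode
  activeMode: Mode;
}

// The app manages the profile; the glasses receive it as the source of
// truth for interpretation and learning content.
function syncToGlasses(profile: UserProfile): string {
  return JSON.stringify(profile); // e.g. pushed to the glasses over Bluetooth
}
```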

Product 1: AR-glasses-based sign language communication system

AR Concept image

Product 2: SignSync mobile app - Data Management Terminal

Mobile app UI

Implement

Delivering without AR glasses: realising the AR elements through alternative means

Building on our earlier designs, we created the following AR lenses and product demo prototypes as our current deliverables. In all the demos below, I created the AR effects in Lens Studio.

1️⃣ Product demo

2️⃣ AR prototype 1:
Communication mode

3️⃣ AR prototype 2:
Customisation mode

4️⃣ AR prototype 3-1:
Learning mode (learning levels)

5️⃣ AR prototype 3-2:
Learning mode
(conversation practice)

Reflection

Limitations: Lack of technology and standards for AR glasses design

Even though we completed these prototypes, we realised that some technical limits were hard to overcome and held the project back. However, we came up with workarounds to cope with them.

Lack of AR glasses equipment

Solution: We used a mobile phone + 360-degree camera (150-degree mode) to simulate the AR glasses view.

Tight timeframe and no professional developer

Solution: I built the prototypes in Lens Studio, a quick hands-on tool, for concept delivery.

Next steps for deaf communication: more flexibility

We believe we can still make progress toward our ultimate goal of assisting deaf signers in the near future, from the following perspectives.

Auto-detected language interpretation

The AR glasses could detect the languages used in a conversation and output each interpretation in the language its recipient uses, as sketched below.
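A minimal sketch of how this routing could work, in TypeScript. `Participant`, `interpret`, and `routeInterpretation` are hypothetical names, and the interpreter is a stub standing in for a real translation model.

```typescript
// Route one utterance (signed or spoken) into each recipient's own language.

interface Participant {
  id: string;
  language: string; // e.g. "ASL", "JSL", or a spoken language like "en"
}

// Stub interpreter: a real system would call a translation model here.
function interpret(message: string, from: string, to: string): string {
  return `[${from} -> ${to}] ${message}`;
}

function routeInterpretation(
  message: string,
  sourceLanguage: string,
  participants: Participant[],
): Map<string, string> {
  const outputs = new Map<string, string>();
  for (const p of participants) {
    if (p.language === sourceLanguage) continue; // no interpretation needed
    outputs.set(p.id, interpret(message, sourceLanguage, p.language));
  }
  return outputs;
}

// Example: an ASL utterance reaches a BSL signer and a hearing English speaker.
routeInterpretation("Nice to meet you", "ASL", [
  { id: "a", language: "BSL" },
  { id: "b", language: "en" },
]);
```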

Categorisation in customisation mode

Different environments have their own jargon that only applies in that scenario (e.g., some companies have internal jargon). Signers could choose which sign language libraries are active during communication in a given scenario; a sketch of this lookup follows below.
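A minimal sketch of the scenario-scoped lookup, assuming hypothetical library names and entries throughout.

```typescript
// Jargon signs are grouped by context, and only the libraries the signer
// enables are consulted during interpretation.

type SignLibrary = Map<string, string>; // gloss -> interpretation

const libraries: Record<string, SignLibrary> = {
  general: new Map([["HELLO", "hello"]]),
  "company-internal": new Map([["SPRINT", "two-week development cycle"]]),
};

function lookupSign(gloss: string, enabled: string[]): string | undefined {
  // Scenario-specific libraries take priority over the general one.
  for (const name of [...enabled, "general"]) {
    const hit = libraries[name]?.get(gloss);
    if (hit !== undefined) return hit;
  }
  return undefined;
}

// Example: at work, the internal library is switched on, so workplace
// jargon resolves to its company-specific meaning.
lookupSign("SPRINT", ["company-internal"]); // "two-week development cycle"
```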

Flexibility of sign language learning

Offer diverse sign language learning resources that users can choose from and learn at their own pace.

To move forward, we created a revised user flow (below) to guide our next steps.