Designing for Deaf users
A conversation between a Deaf developer and a UX Designer
After meeting Emmanuelle Aboaf at Paris Web, I wanted to explore how deafness can be addressed in UX design. Emmanuelle, who was born deaf, describes herself as “bionic” due to her two cochlear implants. A Fullstack Angular .NET developer and tech enthusiast, she is also active in feminist communities. She regularly speaks at conferences to promote accessibility, both in real life and online.
We discussed her experience as a deaf user online, and as a developer, touching on the various challenges she faces. She shared her insights on how UX designers can better address the needs of Deaf users. Our conversation also covered the progress of assistive technologies, perceptions of disability, and the ongoing need to advocate for accessibility.
This interview was originally conducted in French. It reflects Emmanuelle's personal experience and the specific context of living as a Deaf person in France, which may differ from other people's experiences, whether in France or abroad.
Deafness, everything I didn't know
I noticed that you always say "deaf or hard-of-hearing". Isn’t someone deaf regardless of the degree of hearing loss?
Yes, that’s true. Being deaf or hard-of-hearing is essentially the same thing. You can have mild, moderate, severe, or profound hearing loss like I do. But some people prefer to describe themselves as hard-of-hearing because it distances them from the stigma of disability. It can also depend on how you became deaf. If you lose your hearing suddenly, I can understand why you might prefer to say, "I’m hard-of-hearing." Accepting the loss of a sense is a long and difficult process. Some people feel ashamed, so they’ll say they’re just losing a bit of hearing. This is especially true for young people—it happens to a lot of them after going to concerts. We really underestimate our auditory health. It’s really not easy, and I can understand why they might not want to say they’re deaf. As for me, I was born deaf, so I don’t know what it’s like to hear. Also, if I were to say “I’m just hard-of-hearing”, it would downplay my disability and its impact on my life. But I say "deaf or hard-of-hearing person" out of respect for how people choose to define themselves.
Since I’m short-sighted, I wouldn’t be able to see you without my contact lenses. Yet no one says I’m visually impaired. What’s the difference?
Poor eyesight is more common than being deaf, so there’s more stigma around deafness—it seems insurmountable to people. We’re immediately pitied as “the poor deaf-mutes.” But the term "deaf-mute" is completely outdated. It’s obsolete and almost medieval. Some deaf people choose not to speak, either because they haven’t learned to, don’t like their voices, or express themselves through sign language instead. But their vocal cords work perfectly fine. Being deaf and having damaged vocal cords—and therefore being mute—is extremely rare.
For those born deaf, learning to speak requires speech therapy. I spent 20 years in speech therapy to learn how to control my voice, and yet I still have a distinct voice, an accent. A deaf person’s voice depends on many factors: the environment, education, hearing aids, and speech therapy.
If learning to speak is already difficult for a deaf person, how did you learn English?
Through phonetics. But it’s hard for me to understand spoken English. I can’t lip-read someone in another language because I haven’t trained myself to lip-read in English. Lip-reading in English is completely different from lip-reading in French. Even with hearing aids or implants, we have to train our ears and brains to identify sounds. It’s like machine learning: "this is a barking dog" and "this is a car horn." That’s how I learned to hear and recognise everyday sounds. At school, I already had a lot of extra work to do, with plenty of support and speech therapy sessions. That’s why I was exempt from a second language in high school; otherwise it would have been too much. On the other hand, I’m fluent in sign language, and it’s definitely a language in its own right. With French, English, and French Sign Language (LSF), I’m trilingual after all!
I imagine it’s even more complicated for people who are both deaf and blind?
Yes, it’s a minority within our community. We refer to these individuals as deafblind. We’re trying to include them more and more, particularly thanks to the work of Thomas Soret at the Unanimes association. Thomas has Usher syndrome, a progressive condition that gradually causes blindness in addition to hearing loss. For people who are deafblind, the needs are different. For example, they can’t see subtitles, which is why we also need to provide a transcript that can be read with a Braille display.
Transcripts must be provided, whether it’s for a video or a podcast. In fact, this is part of the accessibility regulations, which state that all multimedia content must have a text-based transcript as an alternative. When no transcript is provided, we are shut out of information that everyone else can access.
When transcripts are available, I’m so happy and I don’t hesitate to share them with my community. But sometimes, I’ve come across situations where transcripts are only accessible to subscribers. Why do we have to give personal information to access these transcripts when others can just listen to the podcasts without signing up? It’s not very fair.
Can the way you lost a sense impact how you use assistive technology?
Yes, of course. People who have been deaf for a long time or since birth have developed their other senses. For example, we've learned to read subtitles. We read very quickly because we don’t access information through hearing. However, if you lose your hearing as you get older and the subtitles move too fast, you will struggle. You haven’t trained yourself to do that. But that’s why there’s been a charter for subtitle quality in France since 2011. There are best practices for colours, character count, number of lines, etc. Sometimes it’s frustrating because it forces the subtitlers to summarise a sentence. And when you can hear a little with implants, you can notice the difference, which is unsettling.
But to go back to your question about people with multiple disabilities, there are currently discussions underway about adapting subtitle colours for deaf and hard-of-hearing people. The colour code has existed since the 1980s. For example, when there’s a noise indication, we use red. Magenta is for music. Green is for foreign language transcriptions. So, for people who are colourblind, this can be a problem.
We also develop other senses like touch. And recently, Apple added a new feature: music haptics. It’s so cool to be able to feel the music in a different way through vibrations.
Being deaf at work: a developer's perspective
I guess you sometimes work remotely. How do video meetings work for you?
Generally, I rely on automatic captions and pray that they work. For example, we’re talking on Microsoft Teams right now, and I’m using the automatically generated captions to understand you. I can’t read your lips because the resolution of your webcam isn’t good enough. I can hear you with my implants, but there’s a difference between hearing and understanding. Luckily, the captions can help, even though they’re not perfect. There can be a lot of errors depending on the context, the environment, and the person speaking. The more technical the subject, the less it works. When I speak, there are lots of mistakes because AI struggles with accents.
When I have meetings with several people talking at the same time, automatically generated captions often have errors, especially when the subject is technical. AI can’t process that properly. So I plan ahead with Tadeo, a French online transcription service. The person from Tadeo will join the meeting and write captions of much better quality for me. Of course, these services are only available during office hours and subject to availability. So you need to be well-organised.
Do these transcription services completely solve the accessibility issue in these situations?
Yes, although the people who write the captions aren’t always familiar with the technical terms of my job. So I have to send them a glossary in advance. It’s normal, they can't know everything about every industry. But to answer your question, these services help me a lot. It’s not just Tadeo, there’s also Elioz and Le Messageur, which work very well.
How does the transcription service work? Does the person join your meeting?
The person from the transcription service can join the meeting via audio or access it through a router. And I have three screens. One for the meeting, one for the code we’re inspecting, and one for the transcript, which only I can see.
Do these services only provide transcription?
No, you can also have a French Sign Language (LSF) or Cued Speech (CS) interpreter. CS is a system that makes spoken language more accessible to deaf and hard-of-hearing people. You make hand signs near the face to distinguish sounds that look similar when reading lips. And it doesn’t just work for work meetings. When I need to make a commercial call, like calling a bank or an insurance company, some websites offer you the option to go through relay centres available in LSF, CS, or transcription.
A Deaf person's user journey: overcoming digital barriers
Let's talk about user journeys on digital platforms. As a deaf person, can you use a CAPTCHA?
I've had problems with CAPTCHAs before, of course. In addition to my cochlear implants, I also wear glasses. Sometimes I struggle to distinguish the letters, so I have to regenerate the code every time. But I never use the audio option because I know it won’t help me. Despite my implants, the sound of different letters can be hard for me to distinguish. For example, between P and B, or F and S, I can’t hear the difference. I rely solely on context. That’s why it’s very complicated for me when someone spells out words or numbers. But you know what, let’s do the test together.
What alternative to CAPTCHA can we offer to secure a login?
The checkbox "I'm not a robot" works fine. Or, you can have maths problems to solve. If it’s a simple calculation, it shouldn't be an issue for people with intellectual disabilities or cognitive impairments. I’m terrible at maths, but 1+1=2, I can manage. There are plenty of valid alternatives to secure a login.
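The arithmetic check Emmanuelle mentions can be sketched in a few lines. This is a minimal illustration, not a production anti-bot measure; the function and field names are invented for the example, and the answer is assumed to be kept server-side rather than sent to the browser.

```typescript
// Minimal sketch of an accessible CAPTCHA alternative: a simple
// arithmetic challenge generated server-side and checked on submit.

interface Challenge {
  question: string; // shown to the user, e.g. "What is 1 + 1?"
  answer: number;   // kept server-side, never sent to the client
}

function makeChallenge(a: number, b: number): Challenge {
  return { question: `What is ${a} + ${b}?`, answer: a + b };
}

function checkAnswer(challenge: Challenge, userInput: string): boolean {
  // Trim and parse so "  2 " is accepted; reject anything non-numeric.
  const parsed = Number(userInput.trim());
  return Number.isFinite(parsed) && parsed === challenge.answer;
}
```

The key accessibility property is that the challenge is plain text: it works with screen readers, Braille displays, and captions alike, and requires neither hearing nor distinguishing distorted letters.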
As a UX designer, if I could only do one thing for you, what would it be?
Stop making phone numbers mandatory. If you ask for it, I want to know why. Is it for a call? Because I won’t understand, I’ll have to ask people to repeat multiple times, especially if they're not making any effort to be understood. It’s exhausting. Most of the time, I just enter a fake number anyway. Unless it’s for an official administrative form, of course. But psychologically, I feel frustrated when I’m forced to provide a phone number.
That’s also why I need user journeys to be clear. If I can’t fill in a form on my own, calling customer service isn’t a solution for me. Right now, I’m looking for information on entrepreneurship. The forms are incomprehensible. I have to ask my friends for help. This lack of clarity makes me completely dependent on others.
When I need to contact customer service, the first thing I do is look for the contact form via email. If I don’t get a response to my first email, I try again. And if I still get nothing, I check the directory of the French Federation of Accessibility to see if they have a transcription, LSF, or CS service. But it slows down everything I do…
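The point about mandatory phone fields translates directly into form design: make the phone field optional and let users choose how they want to be contacted. A minimal sketch, with illustrative field names of my own choosing:

```typescript
// Illustrative sketch: a contact form where phone is optional and the
// user's preferred contact method drives which fields are required.

type ContactMethod = "email" | "sms" | "phone";

interface ContactForm {
  email?: string;
  phone?: string;
  preferred: ContactMethod;
}

function validate(form: ContactForm): string[] {
  const errors: string[] = [];
  // Email is the baseline channel for everyone.
  if (form.preferred === "email" && !form.email) {
    errors.push("An email address is required for email contact.");
  }
  // A phone number is only required if the user explicitly
  // chose to be called or texted.
  if ((form.preferred === "phone" || form.preferred === "sms") && !form.phone) {
    errors.push("A phone number is required for phone or SMS contact.");
  }
  return errors;
}
```

Storing the preference alongside the contact details also gives customer service a machine-readable signal not to call, rather than burying it in a free-text field that agents may never read.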
When booking an appointment online, there’s sometimes a field to share information with customer service. Do you use it to specify that they shouldn’t call you?
Of course, but it doesn’t make any difference! I may write “I’m deaf, please send me a text message or an email,” but people still call me every time. It’s unbearable! We need to keep raising awareness on those issues…
Accessibility and inclusion in today's world
If I understand correctly, accessibility should be part of the company’s strategy and involve every role in the business, depending on their responsibilities.
Absolutely. The worst is when I get a voicemail. I can’t listen to it! That’s when it becomes essential to have an iPhone rather than an Android. The accessibility features aren’t the same. The voicemail transcript is available on iPhone, although it doesn’t always work well. It also depends on the phone you have... There are also features that only work in English at the moment.
I imagine there’s a financial aspect to consider. Social security covers your implants, but not your phone, right?
No, of course not. And actually, my implant brand, Cochlear, has made a deal with Apple to develop features compatible with implants on iPhone. It works on Android too, but with fewer features. So, I have little choice but to buy an iPhone. And not everyone can afford it.
Shouldn't we consider smartphones as assistive devices?
That's a very good question. It's true that Apple has done a lot for accessibility. Android too, but it's different. I can't do without my phone. So, when people talk to me about sustainability, about not buying a new phone if the old one still works... I have other concerns being deaf... We could consider offering a discount for Disabled people who buy these phones out of necessity.
It feels like our conversation is taking a political turn. Given that Trump has just been re-elected, how do you see the future of accessibility?
Now more than ever, it's essential to keep fighting. Earlier, I was giving a webinar on AI and accessibility. And at the end, I told the people attending that, regardless of who you are – a man, a woman, a disabled person, a queer person – you absolutely belong! But it’s true that I said that on the day Trump had been re-elected, and it felt strange. Like a different dimension entirely. But I still believe in it. We need to make more efforts. I think about the associations, our work in accessibility – it’s even more necessary now. We need to stay united, resist together, and keep pushing things forward. Sure, one small step at a time, but we must keep moving forward.