Tag Archives: audiology

Hearing and understanding are not the same

– Sarah Sparks

Hearing and understanding are not the same. Hearing sound is not just about the ear. An image of a white woman with black pulled-back hair holding a hand behind her ear.

Deaf and hard of hearing professionals who use hearing technologies sometimes find ourselves in challenging situations. Whether at conferences, in daily work environments, or in other professional settings, we encounter frequent misunderstandings about our hearing abilities. At times, hearing colleagues mistakenly believe that we understand everything that is said so long as we have heard it. We might need to advocate extra strongly for ourselves when a conference organizer declines to provide American Sign Language interpreters, CART services, FM systems, or other accommodations because we have access to sound. Colleagues who see us every day might assume that we have access to information when we do not: that we understood an oral exchange simply because we heard it.

The following is a repost from Sarah’s blog, and you can see Sarah sign her blog in ASL at this link.


Family members of deaf and hard of hearing (DHH) people sometimes ask me questions like this one: “My deaf child seemed not to hear when I asked if she finished her math homework, but she understood when I said, ‘Dinner is ready!’ Why did she hear one but not the other?”

Like many DHH people, I have experienced hearing without understanding. As an audiologist and cochlear implant user, I know to expect this from myself. But awkward situations do occur sometimes. I might find out that a neighbor spoke to me in the hallway and now believes that I’m rude because I didn’t answer. Or a friend thinks that I don’t care about their problem because they mentioned it while I wasn’t looking and I didn’t catch all of what was said. Or a family member is confused because I understood, “What movie do you want to watch?” but missed, “Could you do the laundry tonight?”

Explaining, “I heard it, but I didn’t understand it,” can be a challenge. Most people, including our hearing friends and families, have never had to think about the differences between hearing and understanding. What are those differences, anyway? To answer that question, we need to consider the following:

Hearing devices do not provide “normal” hearing. Hearing aids, cochlear implants, and other devices are great technologies. Many DHH people use them for access to sound, but they do not provide “normal” hearing. DHH people have more access to sound with hearing devices than we have without them, but these technologies do not work like eyeglasses that correct to 20/20 vision.

Hearing is not just about the ear. We hear with our brains, not just our ears. Hearing aids and cochlear implants do not repair damage to the tiny nerve cells in the inner ear, the bones in the middle ear, and other parts of the ear’s anatomy that may be affected. Because of this, DHH people who listen through hearing devices do not necessarily receive the same sound input to the brain as hearing people.

Hearing that sound is happening is not the same as processing sound. Determining where a sound is, how far away it is, what kind of sound it is, and whether it is different from other sounds: all of these are possible because of auditory processing in the brain. When a hearing person is listening, they have access to subtle auditory cues. These are variations in sounds that we need for differentiating one sound from another. They play an important role in auditory processing. Even with hearing devices, most DHH people will miss some of these cues. In some situations, these cues are missing for hearing people too. Have you ever struggled to understand someone speaking through a megaphone, intercom system, or out-of-tune radio? Hearing a spoken message does not necessarily mean that all of its information was accessible.

What might happen if a DHH person heard the message, but some of the information in it was not accessible? A few examples:

  • Misunderstanding words and sentences: the DHH person heard, “The samurai” instead of “The sand is dry.”
  • Misunderstanding the tone of the message: the speaker was excited, but the DHH person heard their tone of voice as angry.
  • Difficulty hearing in background noise: the speaker’s voice seemed distorted by the noises in a restaurant or at a party, and the DHH person did not hear the words clearly.
  • Perceiving a sound as far away when it is nearby: the speaker was near the DHH person in the hallway, but the sound of their voice seemed farther away. The DHH person did not know that the speaker wanted their attention.
  • Perceiving two similar but different sounds as the same: the DHH person consistently hears /m/ and /n/ as the same, so words like “moo” and “new” also sound the same.

Listening for understanding requires cognitive effort. Auditory processing isn’t the only thing that the brain does with sound. Language processing is a whole other topic for another day (and spoken language is not the only kind of language!). For now: making sense of sounds and understanding their meanings within a spoken language requires effort and energy from our brains. That effort is greater for DHH people who use hearing devices because the auditory input that we receive is not the same as hearing people receive. Noise in the background means that even more cognitive effort is required for listening. When a person has to use more cognitive resources to listen, their ability to comprehend and remember auditory information decreases.

Think of it like the gas tank in a car: when the road is clear and you’re driving at a steady speed with no delays, you will use less gasoline than you would when driving the same distance in a rush hour traffic jam. For most hearing people, daily listening involves clear roads and steady speeds with a few pockets of occasional traffic. The day ends, and a new day begins with a full tank of gas. But for DHH people, there are fewer clear roads. The day is full of traffic jams and roadblocks like background noise, lack of access to visual cues, and complex listening situations where auditory information is missed. The day ends, but our gas tanks never get refilled completely. Our hearing coworkers and classmates might be ready for a nap by the end of a long day, while we are exhausted and in need of a listening break (and maybe a nap too!) by noon.

Why did the DHH person in your life hear what you said just now but not seem to hear you five minutes ago? Maybe they heard you talking five minutes ago but didn’t know that you were talking to them. Maybe, because they thought you were talking to someone else, they opted to save some of their listening energy for later. Maybe they heard what you said five minutes ago, but another sound was happening at the same time and their brain prioritized that sound instead. Maybe the speech sounds of what you said five minutes ago were more challenging to understand than the speech sounds of what you said just now. Maybe the DHH person is exhausted from a long day of nonstop auditory input, and what you said five minutes ago required more listening effort than what you said just now. Or maybe you spoke more softly five minutes ago and they didn’t hear you at all. There are many possible reasons that a DHH person might not have understood a spoken message.


A smiling, young white woman with glasses wears her dark hair back to show her cochlear implant.

Dr. Sparks holds a clinical Doctorate in Audiology (Au.D.) from Gallaudet University. She is the founder of Audiology Outside the Box, an audiologic counseling and aural (re)habilitation-focused telepractice. She also works part-time at another clinic, providing cochlear implant, hearing aid, and diagnostic testing services. Currently, she is studying at Gallaudet for a Ph.D. in Hearing, Speech, and Language Sciences. Her clinical and research interests include pediatrics, vestibular assessment and rehabilitation, cochlear implants, the audiologist’s role in counseling and self-advocacy skill development, and audiology services provided in American Sign Language. Her Ph.D. dissertation research will focus on vestibular dysfunction and its impact on deaf/HoH children.