Family Ties

By Ava Finnerty

Ava’s mother, Sonia, had a hearing loss but never disclosed it to anyone.

My mother Sonia, born and raised in Wales, was the first person I knew with a hearing loss. She concealed it for many years. In my adolescence and young adulthood, I came to learn of her hearing loss, my grandmother’s, and, eventually, my own.

During World War II, my mother served in the Women’s Air Corps in Britain, where it was her duty to (wo)man the barrage balloons on the White Cliffs of Dover. It was there she met and married my father, John Jessen, a U.S. Army Sergeant preparing for D-Day. During the war she gave birth to my oldest brother, and then they both emigrated to the U.S. in 1946 to reunite with my father.

My parents moved to a veterans housing project in Bayonne, New Jersey, to raise our family. My mother was a very private person who largely refrained from sharing her medical issues with my two brothers, my sister, and me. I have a vague memory of her having some kind of ear surgery in the early 1950s, when I was 5 or 6 years old, but I did not receive an explanation. 

Every time we went swimming, my mother plugged her left ear with a large wad of cotton and covered her head with a bathing cap. She told us she had a “hole” in her ear that needed to be protected from water. Incidentally, my mother helped tend bar at my father’s parents’ bar, The Viking, before becoming pregnant with me, but I later learned for certain her hearing loss was not caused by noise.

A Family Inheritance

A strict parent, my mother believed “children should be seen and not heard,” so I assumed she often stayed silent in response to my questions on purpose, not because she literally could not hear me. It was only when I was a teenager that my mother told me the truth about her hearing: she had a severe hearing loss, but she did not treat it. Her small circle of friends likely gave her some support as she lived with the untreated condition.

My mother inherited her hearing loss from her mother, Bessie, who was profoundly deaf. Grannie still lived in the small Welsh village of Pontypool, where I visited her occasionally, first when I was 20, before my own hearing loss had been identified.

Shown with extended family, Ava (second from right) traces her roots to Wales through her mother Sonia.

Grannie was a prolific writer—I suppose by necessity, because she did not wear hearing aids. She was keenly in touch with her surroundings, able to sense vibrations and read lips adeptly. Relying heavily on her vision, she was more cognizant of others’ facial expressions and body language than most with typical hearing.

At my wedding Grannie impressed me with her grace as a dancer, using the feelings of the bass and drums to move rhythmically. She was a strong and confident woman who’d grown resilient living as a mother and grandmother with a hearing loss during World War II. 

A Gradual Process

My own difficulty hearing came on so gradually it was hard to notice. But I do remember vividly the day I realized the difference between my left and right ears. I was then a parent of three young children, living in Bayonne in a two-family house with my mother. I was cooking while cradling the phone between my right shoulder and right ear. 

At one point in the conversation I switched the phone to my left ear and realized I could not hear what was being said. Despite this realization, I compensated for some time, relying on my “good” ear for conversation. It is truly amazing what a person can get accustomed to not having!

Around this time I could tell that my hearing loss was affecting my work. I was well into my career as a high school English teacher. At first, I attributed my inability to understand my students to their mumbling or mouth-covering. But, as the problem worsened, I knew it was me, not them. Only later did I learn my colleagues thought I was aloof because I would not acknowledge their greetings!

I developed a meaningful relationship with my mother, incidentally, during the onset of my own hearing loss. She and I cared for my father, helping him with home kidney dialysis every other day, and formed a very close bond. After his death, we spent many hours talking together, and I told her about the difficulty I had hearing my students.

Even though I knew of my mother’s and my grandmother’s hearing loss, I had concluded I simply had a buildup of earwax in my left ear. My husband Joseph, then the chief echocardiography technologist for New York Hospital, referred me to an audiologist at New York Weill Cornell Hospital.

There I learned I had almost no earwax buildup—but I did have a significant hearing loss. I was diagnosed with a 78 percent loss in my left ear and a loss of close to 30 percent in my right. 

Successful Surgery

My left ear’s hearing loss was due to otosclerosis, an abnormal growth of bone in the middle ear. Otosclerosis is commonly thought to be inherited but its causes remain unclear. Scientists cite measles infections, stress fractures to tissue surrounding the inner ear, and immune disorders as possible causes. My doctor noted my otosclerosis was accelerated by my pregnancies, and research has since suggested this is possible.

I had a successful stapedectomy on my left ear, a surgical procedure that replaces the stapes bone with a prosthetic device so the bones in the middle ear can again vibrate in response to sound, restoring hearing. The procedure was minimally uncomfortable but did cause severe vertigo, which I was able to control with medication.

In the late 1980s, my mother finally chose to pursue hearing aids but wore them rarely because they emitted a very high-pitched sound. Later in her life, she stopped wearing them completely. Since we shared the two-family home, my family and I always knew what Grandma was playing on her television or radio upstairs at maximum volume. And we lost count of the number of times she shouted “whadjasay?!” to my father.

Mom became increasingly withdrawn. She never wanted to go out on dinner dates or socialize with friends. Only in recent years, after her passing, have I come to understand this preference for isolation.

Over the decades that followed, the hearing in my right ear slowly diminished and I found it increasingly difficult to manage at social events. I wanted to undergo a second stapedectomy, but the audiologist told me this wasn’t recommended. 

I was fitted for hearing aids instead. The audiogram showed a moderate hearing loss in my left ear and a severe loss in my right with difficulty hearing low frequencies in both. No wonder I could not hear the deep-voiced young men speaking in class!

The audiologist asked if I wanted access to sounds at 180 or 360 degrees. I said 360 because I wanted to hear what my students were saying behind my back. I always told my students that although I wore hearing aids, they needed to speak clearly and be aware that I sometimes surprised myself by what I was able to hear. I specifically told my students to never say “never mind” if I asked them to repeat themselves or speak up, but to repeat and rephrase what they said.

Vigilant About Hearing Well

Ava (middle) and her two daughters.

This was in 2011, when I was 62 years old, and I’ve vigilantly worn my hearing aids since. The devices have, for certain, added to my quality of life. They are not perfect, but I consider them an absolute necessity if I want to hear my grandchildren and other family members. I am a music lover, play-goer, and movie fan. And had I not begun wearing them, I surely would have retired from my teaching career earlier than I wanted to. 

I supplement my hearing aids with simple requests and tools. I have no problem telling someone, “I don’t hear as well as I would like to. Could you say that again?” I retired in 2014, after 42 years of teaching high school English, and then was elected to be a Bayonne Board of Education trustee in 2015. During our meetings I prefer to sit at or near the head of the table to read the lips of the person speaking. 

I use closed captions at home watching television. When I babysit, I often go to my grandchildren’s bedroom doors to check on them because I am not sure if they are crying. I love baby monitors that not only light up but also have video for me to easily check. 

Both my daughter and daughter-in-law are aware of the genetic predisposition for otosclerosis. In fact, my daughter thinks that her 16-year-old daughter may have some hearing loss. My advice to her was to pay attention—but also that there is a great distinction between “hearing” and “listening,” especially when it comes to adolescents! 

Ava Finnerty lives with her husband Joseph in New Jersey. A retired English teacher, she serves on the Bayonne Board of Education as a trustee. Their adult children are Kristen, also an English teacher; Jill, a music teacher; and Sean, a U.S. Navy veteran who served in Iraq. This article originally appeared in the Summer 2019 issue of Hearing Health magazine.

Restoring Teachable Moments

By Neyeah Watson

Effective listening is fundamental to being a teacher. Terry Harris, who lives with a severe-profound sensorineural hearing loss, teaches special education in Glenview, IL. His life and profession changed dramatically when he experienced three months of total deafness — prompting him at age 40 to undergo cochlear implant (CI) surgery to restore his access to sound.  

Harris was diagnosed at age 4 with a profound hearing loss in his left ear and a severe-profound loss in his right. He suffered chronic ear infections and was thought to have contracted the mumps, which his doctors at the time believed caused his hearing loss. Harris’s current ENT suspects the cause is genetic, as his great aunt was deaf and his son recently developed a mild hearing loss.

Terry Harris leads a presentation. Credit: Brian O'Mahoney, Pioneer Press

Despite his bilateral loss, Harris was fitted with a hearing aid in only his right ear after his diagnosis. He attended an oral program for deaf and hard-of-hearing students until third grade before transitioning to mainstream education. In both schools, he received speech-language therapy and used lipreading to supplement the amplification from his hearing aid. His individualized education plan (IEP) focused primarily on vocabulary development, speech-language development, compensatory skills, and accommodations.

Though his IEP continued through his high school graduation, Harris struggled to follow noisy discussions in the classroom. Academics were challenging, but he earned average marks or better thanks to his phenomenal teachers and hearing itinerant (aide). Meanwhile, Harris developed a love for sports, which became a growing focus and priority for him in high school.

Harris brought his passion for football and baseball with him to Elmhurst College in Elmhurst, IL, where he studied special education. Although Harris opted not to receive a 504 plan — a formal plan that ensures a student with a disability has access to the accommodations needed to succeed — his academic experience at Elmhurst was positive and accessible. He appreciated, for example, that he was able to take American Sign Language courses to fulfill his foreign language requirement.

In 1999, Harris became a special education teacher, fulfilling a dream he’d had since eighth grade. Harris’s love for teaching derives from the support he received from his own educators. “I teach because of the teachers and coaches who influenced my childhood,” he explains. “I attribute my success to them. They never let me use my hearing loss as an excuse for failure or an excuse not to try something.”

Terry Harris writes on the whiteboard in social studies class. Credit: Brian O'Mahoney, Pioneer Press

Harris did not consider CIs until 2014, when a sudden loss of the remaining hearing in his right ear left him completely deaf. “I struggled knowing if I had missed any teachable moments as a result of not hearing everything,” Harris recalls.

CI surgery had not been considered for Harris during his childhood, when the procedure was still viewed skeptically. But when he experienced total deafness, he viewed CIs as his only option. While aware of the intense aural rehabilitation that would follow, Harris was fully committed to the process of getting the hearing he needed and deserved.

Before the procedure, Harris taught for an entire month while completely deaf, relying solely on lipreading and the assistance of a few teacher’s assistants. His three months of deafness showed him just how much CIs could improve his quality of life, and throughout that time he remained excited about restored access to sound.

Harris took a four-month medical leave of absence for rehabilitation after the surgery. Although he did not want to be away from his students for long, he knew the time was necessary to develop into the teacher he believed his students deserved. Now, for the first time in his life, Harris is able to hear in the normal range, as well as localize sound.

Not getting the surgery sooner was Harris’s only regret. “I am much more confident in the classroom and other areas of the school building. The cafeteria, the auditorium, and even the gymnasium are no longer ‘problem’ areas for me.”

Harris makes it a priority to incorporate his hearing loss story into his lessons, and begins each school year with a presentation about how CIs work. Given that Harris teaches children with special needs—including three students with hearing loss to date—he believes these lessons inspire self-determination, compensatory skills, and self-advocacy. He is proud to share his own experiences to let his students know they can achieve fulfillment living with a hearing loss or other perceived limitations.

I Hear a Symphony: For U-M Violin Student, Hearing Loss is Not a Disability

Violin teacher Danielle Belen uses a lot of gestures and hand signals but not many words. Her student, Abigel Szilagyi, relies on vibrations, muscle memory and instincts.

Learning to play an instrument can be difficult for anyone, but Szilagyi must work through her own challenges: She was born with just 50% of her hearing.

When Belen, an associate professor of violin at the University of Michigan School of Music, Theatre & Dance, and Szilagyi, a violin soloist and chamber musician, started working together in California seven years ago, it took nearly two months before Belen learned her then-14-year-old student had only half of her hearing. Even then, Belen didn’t notice it herself — the musician’s mom broke the news about her daughter’s unique circumstance.

Credit: Szilagyi personal archive

“Her mother asked me if I had noticed anything different about Abi,” Belen said. “I quickly answered, yes, she is very talented and I am totally drawn by her passion.”

The answer was not wrong, but incomplete.

“Her mom looked directly into my eyes and said, ‘Abi is partially deaf!'” Belen said.

Belen had never worked with a student who was hard-of-hearing, so it was difficult to imagine how this would work.

“But something immediately in her personality showed me that it would be possible,” she said. “I was quite impressed with her ability for her age.”

With passion and extra discipline, Szilagyi has never stopped playing the violin. In 2016, she was accepted to the U-M School of Music, Theatre & Dance and moved to Ann Arbor, where Belen has taught since 2014.

Szilagyi, who will be a junior at U-M this fall, said she had many instructors before Belen, but there was no connection.

Credit: Szilagyi personal archive

“I wanted someone who believed in me and who saw my hearing problem not as an inability, but an ability,” she said. “I wanted someone who understood my hearing disability was as much a part of me as being a musician because I always wanted to connect these two parts of me.”

When Szilagyi was 4, doctors discovered she was born with a 50% sensorineural hearing loss in both ears. A couple of months later, she got her first set of hearing aids, and on the way home heard a “lovely and vibrant singing sound.”

“I then heard the birds chirping for the first time in my life,” Szilagyi said. “I was so drawn to how they could make such beautiful music. This sparked my desire to become a musician.

“It is amazing how your body can adapt. I am very observant. I learned lip reading, body movements, facial expressions and other types of communication that really help me to play well and find the correct tune.”

Belen explained that because Szilagyi’s baseline hearing is diminished, her other senses step in, especially her sense of touch.

“She can imitate sounds and mannerisms and has a remarkable skill of imitation,” Belen said. “Her visual cues are very sophisticated. She is like clay—moldable, flexible—yet she has her own identity as well. She is truly the ideal young artist.”

Hearing aids, ear plugs and no ear plugs

During her freshman year at U-M, Szilagyi suffered some serious ear infections that led to further damage to her fragile hearing. During the entire fall semester, she could not wear hearing aids because of sharp pain.

Again, no quitting, just a break and new adaptations. Now, the only way Szilagyi can play and tolerate the sounds is to wear ear plugs.

“It is a hard thing to explain,” she said. “While I struggle to hear ordinary sounds and conversations at a normal volume, my ears are extremely sensitive to loud sounds and pressure and it causes sharp pain in my ears.”

Her professor works closely with her, adapting to whatever Szilagyi needs.

“Her lessons are a bit crazy to watch,” Belen said. “She has her hearing aids nearby and they go in and out. When I need to talk to her, they go in, and when she is playing, they go out. It is challenging, but somehow, we are managing it.

“Instead of being frustrated, she laughs. As serious as the situation is, she is able to look at the big picture and realize that this is all joy. She has an amazing attitude. I know there are tears, sacrifices, pain and frustration, but there is also gratitude. She always rises from the challenges, and I am sure she will have a unique and important career as a violinist.”

This article was repurposed with permission from Michigan News, University of Michigan.

A Newly Identified Neuron in a Brain Region Tied to Hearing

By Michael T. Roberts, Ph.D.

Most of our auditory experience requires extensive and precise computations in the brain. While the neural circuitry underlying these computations has become increasingly clear over the past several decades, there has remained a big gap in our understanding of the neural circuitry in an important brain region called the inferior colliculus (IC).

Located in the midbrain, the IC is the hub of the brain’s auditory pathway. Like an airport hub that processes travelers moving among far-flung airports, the IC receives and processes most of the output of lower auditory centers and provides the major source of auditory input to higher brain centers.

Although the IC plays important roles in most auditory functions, including speech processing and sound localization, it has proven difficult to identify the types of neurons (nerve cells) that make up the IC. This has hampered progress because the ability to identify neuron types is a prerequisite for determining how specific neurons interconnect and function within the broader auditory circuitry.

Recently, my lab at the University of Michigan tackled this long-standing problem and identified a novel neuron type called VIP neurons, which make a small protein called vasoactive intestinal peptide. Despite the protein’s name, previous studies have shown that VIP is made by specific types of neurons in several other brain regions.

Sections of the inferior colliculus, the hub of the brain’s auditory pathway. A newly identified neuron type called VIP neurons, which make a small protein called vasoactive intestinal peptide, have been dyed magenta.

Our team, led by postdoctoral fellow David Goyer, Ph.D., hypothesized that VIP is a marker for a class of neurons in the IC. To test this hypothesis, we used a genetically engineered mouse to label VIP neurons with a red fluorescent protein. This made it possible to use fluorescence microscopy to target experiments to VIP neurons in the IC.

These experiments revealed that VIP neurons in the IC have internally consistent anatomical and physiological features, supporting the conclusion that IC VIP neurons constitute a distinct neuron type. Examination of the neuronal processes of VIP neurons further revealed that individual VIP neurons likely receive input from a range of sound frequencies. Work by collaborators in the Schofield Lab at Northeast Ohio Medical University showed that VIP neurons also send output to several brain regions, including to higher and lower auditory centers and to a brain region involved in visual processing.

In another set of experiments, we combined electrical recordings from VIP neurons with a technique called optogenetics, which allows scientists to stimulate specific populations of neurons using brief flashes of light. These experiments revealed that VIP neurons receive input from the dorsal cochlear nucleus, one of the first brain regions in the auditory pathway. The path from the cochlea to VIP neurons is therefore quite short, passing through only three synapses.

This study, which combined both sets of experiments and was published in eLife on April 18, 2019, showed that VIP neurons are a distinct and readily identifiable class of IC neurons. Based on their features, we hypothesize that VIP neurons play a broadly influential role in sound processing. We and the Schofield lab are currently testing this hypothesis, with a particular emphasis on determining how VIP neurons contribute to speech processing in the IC. 

A 2017 Emerging Research Grants scientist, Michael T. Roberts, Ph.D., heads the Roberts Laboratory and is an assistant professor at the Kresge Hearing Research Institute, University of Michigan.

Which Restaurants Are Way Too Loud (or Not)? Get Real Data and Share It!

By Kathi Mestayer


Recently, I found myself in a restaurant that was so noisy, the waitress leaned over and told us, “I can’t hear in here, either!” So, it’s not just me. In fact, a 2015 Zagat survey found that noise was diners’ top complaint about restaurants.

One of the more satisfying things I do in that situation is to get out the decibel app on my smartphone and take a measurement. Is it really that loud? The answer is usually yes! I’ve gotten decibel readings as high as 95 dBA (“dBA” refers to decibels adjusted for human hearing). So, I gripe politely to the wait staff or manager, and consider adding it to my “never again” restaurant list. Or I visit during off hours, at 3 p.m.

Then I discovered that there are decibel apps that let you share your data on how loud (or quiet!) a restaurant is. One is SoundPrint, which I have been using for a couple of years with great success (and whose founder wrote in the Spring 2019 issue of Hearing Health about the genesis of the app).

Here is how SoundPrint works:

1. Download the SoundPrint app.

2. When you want to take a decibel reading, take out your iPhone, open the app, and touch the “Start” button. Record the dBA level for at least 15 seconds.


3. Then, hit “Stop.” 

4. To share the sound level at the restaurant/bar/coffeeshop, hit the “Submit” button. 


5. That will take you to the “Your Location” screen, which will give you its best guess as to where you are. You can also enter the name of the venue into the field near the top. (It will be easier to find the venue if you have the “Locations” setting activated on your iPhone. You can turn it off again immediately, if you’re as paranoid as I am.)


6. Select the venue and hit “Submit.” Your data will be on the SoundPrint site, without your name or any identification, for the rest of us to see. I’ve submitted data on places that are way too loud or nice and quiet. 

I just took a look, using the Search icon at the lower left of the iPhone screen, at Richmond, Virginia, where I live, and got a few hits! The red ones are way too noisy, orange is pretty noisy, yellow is a little noisy, and green is… quiet! The brown ones are venues that don’t have any data yet.


Clicking “View details” took me to the venue’s address and phone number and gave me the option of leaving a comment. Now, that said, if you go there and it’s loud, you can take another measurement and submit it, too. And you can add a comment for others to see.

If SoundPrint users continue to add to the database, for places all around the country, and especially when places are quiet(ish), it will become a wonderful shared resource! My favorite memory is of the time I was taking a decibel reading and the waitress was looking over my shoulder, very curious about what I was up to. I showed it to her, and I hope she shared it with the manager.

Staff writer Kathi Mestayer serves on advisory boards for the Virginia Department for the Deaf and Hard of Hearing and the Greater Richmond, Virginia, chapter of the Hearing Loss Association of America.

Amplifying the Home: A Technology Guide

By Neyeah Watson


Living independently may seem challenging, or even daunting, to someone who has recently been diagnosed with a hearing loss. Fortunately, innovations in technology can vastly improve life and safety in the home. Functions like answering visitors at the door, waking up with an alarm clock, and responding to an emergency can be simplified with various tools. 

Below we review devices and applications that can help you or your loved one with hearing loss perform everyday tasks and live safely.

Waking Up
A specialized alarm clock with a round, vibrotactile device attached can be placed under one’s mattress or sheets. Instead of making sounds like beeps or music, the vibrotactile device wakes the sleeper through movement. A vibrating watch worn to sleep can be used instead of, or in addition to, an alarm clock with a shaker device. Like an alarm clock, these watches use vibrations and visual representations to wake sleepers.


Responding to Danger
A multi-part device that includes a bed-shaker can be connected to a smoke detector to notify the resident of danger. One part is a flat, round, vibrotactile device placed under the mattress or other furniture that responds with movement when the smoke detector identifies a fire. The other part of the device mimics the design of an alarm clock. When activated by the smoke detector, strobe lights and/or the word “FIRE” display on the screen. Carbon monoxide devices for residents with hearing loss are designed similarly. If carbon monoxide is detected, the strobe lights and vibrating device are triggered.

Landline Phone Conversations
Captioned telephones help those who struggle to hear on a landline phone. These phones translate spoken conversation into visual text and look like standard phones with large screens attached. Most transcribe only what the person on the other end is saying, not the entire conversation. Captioned phones are available for free to individuals with hearing loss who have documentation from a professional such as an audiologist or medical doctor.

Smartphone Use
For those who use their smartphone as a means of communication at home, smartphone applications can make conversations easier by captioning the call in real time. Speech-to-text apps, the majority of which are free, use a computer voice recognition system to provide captions. Other apps transcribe in-person conversations picked up by smartphones’ microphones. 

Greeting Visitors
Door signalers notify residents of the arrival of visitors and can take different forms. Devices with screens—to show who is at the door or to indicate that someone is present—can be placed around the home. Other versions are connected to the doorbell; when the bell is detected, signal lights in front of the door will flash.

Acknowledging Natural Disasters
Weather alert machines come in the form of receivers that are connected to weather stations. When an emergency occurs, the receiver will turn on and issue a response, usually in the form of vibrations or an extremely loud alarm. Then the warning light will appear with a short message such as “TORNADO” on the display. External devices, such as strobe lights, sirens, and vibrating devices, will also be activated.

CT Imaging as a Diagnostic Tool for Ménière’s Disease

By Ngoc-Nhi Luu, M.D., Dr. med.

Ménière’s disease is an inner ear condition with symptoms including vertigo, hearing loss, and tinnitus, and may be associated with an accumulation of fluid in the inner ear, termed endolymphatic hydrops. Diagnosis of Ménière’s is entirely based on clinical characteristics, and to date, no classification has been established that can predict the onset or course of the disease. Patients with Ménière’s can have varying degrees of symptoms, so defining subtypes within the Ménière’s population may help establish a classification to improve diagnoses and treatments.

Our previous analysis of cadaveric ears of patients with Ménière’s revealed striking differences within the endolymphatic sac in the inner ear, which regulates endolymph fluid. We had found two different aberrations of the endolymphatic sac—its underdevelopment or its degeneration—among Ménière’s patients, suggesting that the loss of endolymphatic sac cell function and the possible impairment of endolymphatic fluid regulation may lead to Ménière’s. In addition, these two pathologies may be associated with differing clinical traits of the disease.

In our prior work, we examined sections of human cadaveric inner ears with Ménière’s and found differences in the angular trajectory of the vestibular aqueduct (ATVA), the bony canal in which the endolymphatic sac is located. These differences resembled either the trajectory of typical adult vestibular aqueducts or that of early developmental, fetal vestibular aqueducts. An ATVA similar to that of adults without Ménière’s was associated with late onset of the condition, whereas Ménière’s patients with a “fetal” ATVA experienced early onset.

A 3D reconstruction of the endolymphatic space of a typical human adult inner ear. In Ménière’s disease patients, the anatomy of the endolymphatic sac differs, suggesting that the impairment of the sac’s function to regulate fluid may lead to Ménière’s. (LSC, lateral semicircular canal; PSC, posterior semicircular canal; SCC, superior semicircular canal.)

For our paper published in the journal Otology & Neurotology in April 2019, we hypothesized that this difference could be detected with computed tomography (CT) scans in patients with early or late onset Ménière’s.

We used a custom-made, open-source web application for angle measurements and applied this technique to high-resolution CT imaging of patients with Ménière’s. Comparing the angle measurements of the ATVA, we confirmed the results of the cadaveric study: late onset Ménière’s correlated strongly with a typical “adult” course of the vestibular aqueduct, while early onset Ménière’s was associated with a straighter, “fetal” course.

As such, our study aims to develop a radiographic screening tool, such as a CT scan of the inner ear, to classify different Ménière’s subtypes. It appears that early onset Ménière’s patients have a different anatomy of the vestibular aqueduct compared with late onset Ménière’s patients.

We want to better understand if these findings also correlate with additional clinical factors, such as specific symptoms or a positive family history for Ménière’s. Ultimately, this may help to further characterize different Ménière’s subtypes in order to better diagnose, predict the course of, and treat the condition.


Ngoc-Nhi Luu, M.D., Dr. med., is a postdoctoral fellow at Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Harvard Medical School, and an ENT resident at University Hospital Zurich. Luu’s 2017 ERG grant was generously funded by The Estate of Howard F. Schum. Coauthors on the paper include Judith Kempfle, M.D. (a 2010 ERG scientist), Steven Rauch, M.D. (1990 ERG), and Joseph Nadol, M.D. (1976–77 ERG).


Close-Minded Captioning

By Amber Gordon

Sound can provide remarkable connections to the world around us. As a Longwood University communication sciences and disorders student, I’ve come to better understand how people with hearing loss experience sound, and that improvements to accessibility are urgently needed.

I have typical hearing, but know from Longwood professor Mani Aguilar, Au.D., that insufficient access to auditory information can have negative emotional and social consequences in many areas of life, including entertainment. Watching a TV show with a friend with typical hearing and not understanding why they are laughing is bound to make one feel left out.

While hearing aids and cochlear implants are extraordinarily beneficial to communication, many people with hearing loss rely on captioning to fully access audiovisual media. Because captioning is so essential, the Americans with Disabilities Act (ADA) requires closed captioning for video transcripts by state and local government entities and “places of public accommodation” (including universities, libraries, and hotels). Sections 504 and 508 of the Rehabilitation Act require the electronic communications of U.S. federal offices and federally funded organizations to be accessible and captioned.


For TV programs, the Federal Communications Commission (FCC) requires TV captions to be “accurate, synchronous, complete, and properly placed.” The 21st Century Communications and Video Accessibility Act calls for “video programming that is closed captioned on TV to be closed captioned when distributed on the Internet.”

But no laws currently address captioning for most online video. This came to light when the National Association of the Deaf sued Netflix over the lack of closed captioning on videos on its site. The district judge ruled in favor of closed captioning on streaming services; however, because it was not a Supreme Court ruling, the case did not establish a national standard for how the ADA applies to online services and businesses.

Many streaming services do include closed captions within their video services, but with no stipulations for quality. As noted in HuffPost, the Netflix series Queer Eye had inaccurate captions that censored profanity and altered words in multiple instances. A Reddit user notes that shows on Netflix and Amazon Prime generally do not indicate who is talking when they are off-screen, creating confusion as to which character is saying what.

Meanwhile, platforms like YouTube and Facebook remain unregulated. Enabling auto-captioning on videos is merely an option for video creators, and, in many cases, this auto-generated captioning is not accurate. For precise captions, video creators must make manual edits, which can be time-consuming or expensive. 


Consider also that tone and verbal inflection can change the entire meaning of a sentence. Spoken words are just one piece of the puzzle for those who rely on captions. According to The Atlantic, machine translation “can’t register sarcasm, context, or word emphasis. It can’t capture the cacophonous sounds of multiple voices speaking at once, essential for understanding the voice of an angry crowd of protestors or a cheering crowd. It just types what it registers.”

We already have requirements for government programming and news alert systems. We have accessibility laws for television and even for some online content. But as entertainment becomes increasingly digital, these regulations must be transferable.

Otherwise, information remains lost in translation because captioning laws apply only in some circumstances. Isn’t access for everyone, regardless of hearing ability, enough reason to advocate for expanded captioning? Why must those with hearing loss be held back by where we’ve drawn the line on accessibility?

If you are a hearing individual, I encourage you to place yourself in the shoes of someone with hearing loss. Mute your TV for a day. Mute the sound on your device playing YouTube or Facebook and enable closed captioning. How long does it take until you get annoyed? Frustrated? I’m willing to bet not very long. 

It is undeniable that closed captions have contributed greatly to the advancement of accessibility for people with hearing loss, but much work remains. We have to recognize the urgency of reliable captioning in online media.

What can we do? If you’re in a restaurant and notice that there are TVs playing without captions, politely request them. If you run a business where there are waiting rooms and lounges with televisions, please turn on captions. If you watch YouTube and notice that one of your favorite creators does not caption their videos, leave comments or write emails to encourage them. Hold streaming services like Netflix and Amazon Prime accountable by letting them know when captions are inaccurate or poorly transcribed. Lastly, if you’re watching television or your favorite show and you notice poor closed captioning, file a complaint with the Federal Communications Commission under the “Access for People with Disabilities” section of its Consumer Complaint Center.


Slowly but surely, if we continue to think of others who are unlike ourselves, strive for empathy, and advocate for equal accessibility for all, a change can and will be made.

Amber Gordon is an aspiring speech-language pathologist who lives in Virginia.


Lasting Effects From Head and Brain Injury

By Elliott Kozin, M.D.

Traumatic brain injury (TBI) is a major public health issue and contributes to injury-related morbidity and mortality worldwide. The economic cost of TBI is estimated to exceed $76 billion per year in the United States. Unfortunately, the health effects of TBI are profound. TBI can lead to chronic and debilitating physical and psychosocial symptoms, such as loss of cognitive, sensory, and psychological function. Auditory and vestibular dysfunction has long been recognized as a consequence of head injury, including TBI.

In our research “Patient‐Reported Auditory Handicap Measures Following Mild Traumatic Brain Injury,” published in The Laryngoscope, we examined auditory complaints following traumatic brain injury, as well as changes that occur to the peripheral vestibular system in the postmortem setting. In patients with mild traumatic brain injury (mTBI), we used patient-reported outcome measures to assess auditory complaints. We found that auditory symptoms and associated handicap were common in patients with non-blast mTBI.


For another paper in The Laryngoscope, “Peripheral Vestibular Organ Degeneration After Temporal Bone Fracture: A Human Otopathology Study,” we evaluated postmortem specimens from patients with head injury in the National Temporal Bone Pathology Registry. In a cohort of patients with temporal bone fractures, there were distinct peripheral vestibular changes. Collectively, these findings have implications for the pathophysiology and management of symptoms in this patient population.


Elliott Kozin, M.D., is a neurotology fellow at Eaton Peabody Laboratories, Massachusetts Eye and Ear/Harvard Medical School, and a 2018 Emerging Research Grants recipient generously funded by the General Grand Chapter Royal Arch Masons International.


Everything Sounds

By Caryl Wiebe

Sometime in grade school, my parents noticed I favored my right ear because I turned it toward people during conversations. Concerned about my hearing, they took me to an ear, nose, and throat doctor who put drops in my ears for my eustachian tubes, the passageways that connect the throat to the middle ear. This provided very little improvement, but I didn’t worry. I felt I could hear the important things in my world and maintain my ability to sing a cappella with my sisters in grade school and then in choirs in high school and college.

At 18, I got married and had three children in the eight years that followed. Over time I noticed my hearing was declining considerably in my left ear, even though we were able to tour as a singing family for eight years to churches in Oklahoma and California, and even sang on the radio. I was always able to hear my family, but my husband and I noticed that it was hard for me to keep up when we were in church or in a group.

With his support, I decided to see a well-respected ear surgeon, Gunner Proud, M.D., at the University of Kansas Medical Center. Dr. Proud determined that my stapes had a calcium overgrowth that prevented its movement (otosclerosis). He had a strong reputation as a surgeon, so I was comfortable undergoing a stapedectomy, a middle ear procedure to restore hearing with the insertion of a prosthetic device.

I was dizzy after the surgery, but within three or four days it was deemed a success and I was pleased by what I was able to hear again. “I can hear the tires,” I announced to my husband. He was amused—he didn’t know what it was like to live without life’s most ordinary sounds.

I was thrilled until my hearing began to deteriorate in my left ear again. Disappointed, I returned to the medical center. Dr. Proud explained that calcium had started to grow around the plastic prosthetic “hammer” that he had inserted into my left ear. Concerned another surgery would eventually lead to the same result, he suggested a hearing aid for my remaining good ear, my right ear. I was hesitant, but I was now 30 and eagerly wanted to hear. I purchased my first of many hearing aids.


I’ll never forget the first time I had my hearing aid on while giving my children a bath in our cramped little bathroom. I thought the loud noise from their splashing and kicking and laughing would drive me crazy with my aid in my ear. But I decided that if I removed it, I’d fall into the habit of removing my hearing aid in every noisy situation.

That bath was over 52 years ago, and to this day, I maintain the importance of keeping it on, especially when giving advice to older folks. Many complain that “everything sounds different with a hearing aid,” which is true—but at least you can hear! 

So this is my story, no cochlear implant or anything else. I get along very well with my hearing aid and at the age of 82 I don’t want to try anything different.

Caryl Wiebe lives in Kansas.
