Stability in an Unstable World

By Timothy S. Balmer, Ph.D., and Laurence O. Trussell, Ph.D.

Balmer & Trussell traced the direct and indirect pathways that carry vestibular information to the cerebellum for controlling balance and posture. Shown here is a primary afferent axon (green) expressing the light-gated ion channel, Channelrhodopsin. Postsynaptic cells, in this case a unipolar brush cell (magenta), were recorded from during stimulation of the input axons by light flashes. This technique was used to discover how direct and indirect vestibular pathways are processed in the cerebellum.

Mice are helping scientists understand how the world around us continues to look stable even as we move.

While out jogging, you have no trouble keeping your eyes fixed on objects in the distance even though your head and eyes are moving with every step. Humans owe this stability of the visual world partly to a region of the brain called the vestibular cerebellum. From its position underneath the rest of the brain, the vestibular cerebellum detects head motion and then triggers compensatory movements to stabilize the head, body and eyes.

The vestibular cerebellum receives sensory input from the body via direct and indirect routes. The direct input comes from five structures within the inner ear, each of which detects movement of the head in one particular direction. The indirect input travels to the cerebellum via the brainstem, which connects the brain with the spinal cord. The indirect input contains information on head movements in multiple directions combined with input from other senses such as vision.

Balmer & Trussell traced the direct and indirect pathways that carry vestibular information to the cerebellum for controlling balance and posture. Direct projections from the vestibular inner ear (green) and indirect projections from the brainstem (magenta) were shown to target different populations of neurons in the cerebellum.

By studying the mouse brain, Balmer and Trussell have now mapped the direct and indirect circuits that carry sensory information to the vestibular cerebellum. Both types of input activate cells within the vestibular cerebellum called unipolar brush cells (UBCs). There are two types of UBCs: ON and OFF. Direct sensory input from the inner ear activates only ON UBCs. These cells respond to the arrival of sensory input by increasing their activity. Indirect input from the brainstem activates both ON UBCs and OFF UBCs. The latter respond to the input by decreasing their activity.

The vestibular cerebellum thus processes direct and indirect inputs via segregated pathways containing different types of UBCs. The next step in understanding how the cerebellum maintains a stable visual world is to identify the circuitry beyond the UBCs. Understanding these circuits will ultimately provide insights into balance disorders, such as vertigo.

A 2017 Emerging Research Grants (ERG) scientist who received the Les Paul Foundation Award for Tinnitus Research, Timothy Balmer, Ph.D., is a postdoctoral fellow at the Oregon Hearing Research Center at Oregon Health & Science University (OHSU). Laurence Trussell, Ph.D., a 1991 ERG recipient, is a professor of otolaryngology–head and neck surgery at OHSU.

This research summary was repurposed from eLife with permission.

Flying My Way

By Ryan Vlazny

Airplanes and learning about their mechanisms have always made me feel alive. My longtime fascination with all things aerospace inspired my desire to work with computers for a living. But, at times, my hearing and vision loss caused some turbulence.

I was born profoundly deaf and, at age 8, was diagnosed with Usher syndrome―which combines deafness, retinitis pigmentosa (progressive vision loss), and problems with balance.

Lucky for me, Usher lets me enjoy roller coaster rides with a perspective different from that of people with typical hearing and vision. I can more acutely feel the car’s ascent up the hill, the hang time at the top, the speed on the drops, the toggling back and forth on the track, and all the loops and twists in between. These sensations are most fun when I ride an inverted coaster―like my first “serious” ride in Oslo, Norway―with the track above me and my feet hanging in the air. I feel like I am flying.

My parents, heavily involved in the Deaf community, decided I’d learn Signing Exact English (SEE)―a manual communication system that, unlike ASL, matches English grammar and vocabulary―in place of spoken language. By the time I was in the eighth grade, I was fully immersed in mainstream classes, thanks to my parents’ commitment to my language development, and had undergone cochlear implantation. While I cannot understand spoken language with my cochlear implants (CIs), they allow me to hear laughter, birds, music, and the roar of a roller coaster.

A few years after my CI surgery, my passion for roller coasters gave way to airplanes. For my 17th birthday, I had the thrill of riding in a Pitts aerobatic airplane at the airport in Pompano Beach. The 20-minute charter ride felt like being on a roller coaster with 4,000-foot drops above the Everglades. The pilot, Jim, did tricks that felt similar to the vertical loops on a roller coaster.

My mom and I took an (ordinary) airplane ride to Tallahassee when it was time for me to take the Florida Comprehensive Assessment Test (FCAT), then a requirement to graduate high school in the state. There we spoke with government officials about making the test optional for students with hearing loss, and we were successful. Still, I passed the FCAT after three tries, even though the requirement had been eliminated.

For the remainder of high school I stayed on track, taking advantage of computer-related courses like web design and engineering. I was accepted to the Pre-Baccalaureate Engineering Program at the National Technical Institute for the Deaf at Rochester Institute of Technology (RIT), where I enrolled as a mechanical engineering major with a concentration in aerospace. Some math classes, especially differential equations, were too difficult, and with the support of my advisor, I changed my major to information technology (IT). Unlike with engineering, I felt able to fully understand and apply the concepts of IT.

As an IT student, I created a greeting card in Adobe Flash, a multimedia software program, welcoming a new student aboard my make-believe RIT World Airlines. The greeting card was even commended by the university president, Dr. William Destler, in a one-on-one meeting.

Few college experiences compare with my opportunity to build my own airplane game in an application development class, though. The game simulated landing a plane, which other students found fun to play. Even though I wasn’t an aerospace student, I still got to enjoy some exciting plane rides at RIT.

Today I work as a Java developer for a financial technology firm, where I couldn’t be happier. I’m proud to be the pilot of my own career.

BIO: Ryan W. Vlazny lives in Pennsylvania.

Improving Diagnostic Test for Ménière’s Disease

By Wafaa Kaf, Ph.D., and Carol Stoll

Electrocochleography (ECochG) is a commonly used assessment of the auditory system, specifically the inner ear and the hearing nerve. ECochG is most often elicited by a brief acoustic stimulus, known as a “click,” presented at a relatively low repetition rate. It measures two key responses, the summating potential (SP) and the action potential (AP), which assist in the diagnosis of Ménière’s disease, an inner ear and balance disorder. Previous research has established that individuals with Ménière’s disease are likely to have abnormally large SPs and a large SP/AP ratio. Though click ECochG has great potential to detect Ménière’s disease, it lacks sensitivity, the ability to correctly identify those with the disease: only 69% of those with Ménière’s disease are correctly diagnosed, while the other 31% have normal ECochG results. This lack of accuracy prevents its use as a definitive diagnostic tool. Hearing Health Foundation 2015 Emerging Research Grants recipient Wafaa Kaf, Ph.D., is researching the use of a novel analysis technique called Continuous Loop Averaging Deconvolution (CLAD) to improve the sensitivity of ECochG at high click rates for diagnosing Ménière’s disease. Findings were recently published in Ear and Hearing in 2017.
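To make the 69% figure concrete: sensitivity is simply the fraction of people who truly have the disease that the test correctly flags. A minimal sketch in Python, using hypothetical patient counts chosen only to match the percentage cited above:

```python
def sensitivity(true_positives: int, false_negatives: int) -> float:
    """Proportion of diseased individuals the test correctly identifies."""
    return true_positives / (true_positives + false_negatives)

# Hypothetical cohort of 100 patients with confirmed Ménière's disease,
# scaled to the published 69% sensitivity of click ECochG:
tp = 69  # abnormal ECochG result: disease correctly flagged
fn = 31  # normal ECochG result: disease missed

print(f"Sensitivity: {sensitivity(tp, fn):.0%}")  # prints "Sensitivity: 69%"
```

Improving sensitivity, as Kaf’s CLAD work aims to do, means shrinking the false-negative group so fewer true cases are missed.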

In a paper recently published in Frontiers in Neuroscience, Kaf’s research team shares its findings on how altering the parameters of the acoustic stimulus affects ECochG responses, quantifying the effects of stimulus rate and stimulus duration. Kaf and her research team obtained SP measurements to 500 Hz and 2000 Hz tone bursts that varied in duration and repetition rate from 20 adult females with normal hearing. CLAD was used to interpret the tracings elicited by the differing tone burst stimuli.

They found that SP amplitude was significantly larger when using the highest stimulus repetition rate. High stimulus repetition rates are believed to minimize the neural contributions and mostly reflect hair cell responses, the target of ECochG. In addition, longer duration stimuli are believed to better reflect hair cell involvement, while shorter stimuli may be useful in eliciting responses reflective of neural contributions. Lastly, 2000 Hz tone bursts produced larger SP amplitudes than 500 Hz tone bursts. Therefore, 2000 Hz tone bursts with a high repetition rate and long duration can be used to minimize neural contributions to SP measures, whereas short duration stimuli can be used to assess neural activity.

The data that Kaf’s team published are a critical initial advance toward understanding the SP measurement in diseased ears. Their findings not only provide normative data for tone burst ECochG across stimulus frequencies, rates, and durations, but also help others better understand how to improve the sensitivity of ECochG for early diagnosis of Ménière’s disease.

Wafaa Kaf, Ph.D., is a 2015 Emerging Research Grants recipient. Her grant was generously funded by The Estate of Howard F. Schum.

WE NEED YOUR HELP IN FUNDING THE EXCITING WORK OF HEARING AND BALANCE SCIENTISTS. DONATE TODAY TO HEARING HEALTH FOUNDATION AND SUPPORT GROUNDBREAKING RESEARCH: HHF.ORG/DONATE.
