The exact purpose of these ear squeaks is unclear, but Groh’s initial hunch is that they might help sharpen people’s perception.
“We think this is part of a system for allowing the brain to match up where sights and sounds are located, even though our eyes can move when our head and ears do not,” Groh said.
Understanding the relationship between subtle ear sounds and vision might lead to the development of new clinical tests for hearing.
“If each part of the ear contributes individual rules for the eardrum signal, then they could be used as a type of clinical tool to assess which part of the anatomy in the ear is malfunctioning,” said Stephanie Lovich, one of the lead authors of the paper and a graduate student in psychology & neuroscience at Duke.
Just as the eye’s pupils constrict or dilate like a camera’s aperture to adjust how much light gets in, the ears too have their own way to regulate hearing. Scientists long thought that these sound-regulating mechanisms only helped to amplify soft sounds or dampen loud ones. But in 2018, Groh and her team discovered that these same sound-regulating mechanisms were also activated by eye movements, suggesting that the brain informs the ears about the eye’s movements.
In their latest study, the research team followed up on their initial discovery and investigated whether the faint auditory signals contained detailed information about the eye movements.
To decode people’s ear sounds, Groh’s team at Duke, together with Professor Christopher Shera, Ph.D., of the University of Southern California, recruited 16 adults with unimpaired vision and hearing to Groh’s lab in Durham to take a fairly simple eye test.
Participants looked at a static green dot on a computer screen, then, without moving their heads, tracked the dot with their eyes as it disappeared and then reappeared either up, down, left, right, or diagonally from the starting point. This gave Groh’s team a wide range of auditory signals generated as the eyes moved horizontally, vertically, or diagonally.
An eye tracker recorded where participants’ pupils were darting to compare against the ear sounds, which were captured using a microphone-embedded pair of earbuds.
The research team analyzed the ear sounds and found unique signatures for different directions of movement. This enabled them to crack the ear sounds’ code and calculate where people were looking just by scrutinizing a soundwave.
“Since a diagonal eye movement is just a horizontal component and vertical component, my labmate and co-author David Murphy realized you can take those two components and guess what they would be if you put them together,” Lovich said. “Then you can go in the opposite direction and look at an oscillation to predict that someone was looking 30 degrees to the left.”
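To make that decomposition idea concrete, here is a rough sketch in Python. Everything in it — the oscillation templates, the sample rate, and the simple linear model — is an illustrative assumption, not the measurements or analysis pipeline from the study: it just shows how an ear signal modeled as a weighted sum of a horizontal and a vertical component can be inverted to estimate where the eyes went.

```python
import numpy as np

# Hedged sketch only: templates and parameters are made up for illustration,
# not taken from the PNAS paper.
fs = 2000                       # assumed sample rate (Hz)
t = np.arange(0, 0.1, 1 / fs)   # 100 ms window around a saccade

# Hypothetical eardrum-oscillation templates for purely horizontal and
# purely vertical eye movements (decaying sinusoids chosen for illustration).
h_template = np.sin(2 * np.pi * 30 * t) * np.exp(-t / 0.03)
v_template = np.cos(2 * np.pi * 30 * t) * np.exp(-t / 0.03)

def predict_waveform(dx_deg, dy_deg):
    """Combine the components: a diagonal saccade's signal is modeled as the
    horizontal template scaled by its horizontal size plus the vertical
    template scaled by its vertical size."""
    return dx_deg * h_template + dy_deg * v_template

def estimate_gaze_shift(waveform):
    """Go the opposite direction: least-squares fit the two templates to an
    observed oscillation to recover how far the eyes moved along each axis."""
    basis = np.column_stack([h_template, v_template])
    (dx_deg, dy_deg), *_ = np.linalg.lstsq(basis, waveform, rcond=None)
    return dx_deg, dy_deg

# Example: a saccade 30 degrees to the left and 15 degrees up (signs arbitrary).
observed = predict_waveform(-30, 15)
print(estimate_gaze_shift(observed))   # approximately (-30.0, 15.0)
```

The point of the toy model is only to show the geometry Lovich describes; the actual study worked from microphone recordings in the ear canal across many trials rather than idealized templates.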
Groh is now starting to examine whether these ear sounds play a role in perception.
One set of projects is focused on how eye-movement ear sounds may be different in people with hearing or vision loss.
Groh is also testing whether people who don’t have hearing or vision loss will generate ear signals that can predict how well they do on a sound localization task, like spotting where an ambulance is while driving, which relies on mapping auditory information onto a visual scene.
“Some folks have a really reproducible signal day-to-day, and you can measure it quickly,” Groh said. “You might expect those folks to be really good at a visual-auditory task compared to other folks, where it’s more variable.”
Groh’s research was supported by a grant from the National Institutes of Health (NIDCD DC017532).
CITATION: “Parametric Information About Eye Movements is Sent to the Ears,” Stephanie N. Lovich, Cynthia D. King, David L.K. Murphy, Rachel Landrum, Christopher A. Shera, Jennifer M. Groh. Proceedings of the National Academy of Sciences, Nov. 21, 2023. DOI: 10.1073/pnas.2303562120
Source: Duke University