News from
PRINCETON UNIVERSITY
Office of Communications
Stanhope Hall
Princeton, New Jersey 08544-5264
Telephone 609-258-3601; Fax 609-258-1301

Contact: Steven Schultz 609-258-5729
Date: February 4, 1999
 

Study Shows How the Brain Senses the Location of Nearby Sounds

PRINCETON, N.J. -- Princeton University scientists have identified a portion of the brain that controls our finely tuned ability to judge the distance of sounds that are very close to our heads. The research offers an insight into the complex processes that connect seeing, hearing and touching with doing; for example, how a person hears a sound and then ducks his head to avoid a nearby obstacle.

The work could eventually have applications in designing better prosthetic devices, helping to diagnose stroke victims, or even in designing the computer programs that control robots.

The researchers focused on an area of the brain called the premotor cortex, which plays a role in perception as well as in preparing the commands that result in physical movement. That is, it is a region that links input with output. They discovered that some cells, or neurons, in this area are involved with three functions at once: seeing, touching and hearing. And some of these "trimodal" neurons have the uncanny ability to respond according to how far the ear is from the source of the sound, regardless of how loud or soft the sound is.

Until recently, scientists could observe only the sensory cues that go into the brain and the behavior that comes out; the process in between had been a mystery, a "black box," said Michael Graziano, lead author of the work published in the February 4 issue of the journal Nature. In the last decade, however, new brain imaging techniques have led to precise mapping of the neurons that process sensory inputs. Most of that work has been in studying vision. Some research has been done on hearing, but most of that has focused on how we perceive the direction of sounds, a relatively simple process having to do with sound reaching one ear sooner than the other. How we perceive the distance of sounds has baffled scientists.
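The direction-finding mechanism mentioned above rests on a simple physical fact: a sound off to one side travels a slightly longer path to the far ear, so it arrives there a fraction of a millisecond later. A rough sketch of that timing difference, using an assumed ear separation of about 20 cm and the speed of sound in air, looks like this (the path-length model here is a simplification for illustration, not the brain's actual computation):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, in air at roughly room temperature
EAR_SEPARATION = 0.2    # m, an assumed approximate distance between the ears

def interaural_time_difference(azimuth_deg):
    """Approximate arrival-time difference between the two ears.

    Simple path-length model: the extra distance to the far ear is
    about d * sin(azimuth), where d is the ear separation.
    0 degrees = straight ahead, 90 degrees = directly to one side.
    """
    extra_path = EAR_SEPARATION * math.sin(math.radians(azimuth_deg))
    return extra_path / SPEED_OF_SOUND  # seconds

# A sound directly to one side arrives about 0.58 ms sooner at the near ear
print(round(interaural_time_difference(90) * 1000, 2))  # milliseconds
```

Even this tiny delay, well under a millisecond, is enough for the auditory system to localize a sound's direction; distance, as the article notes, offers no comparably simple cue.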

Graziano, a research staff member in the laboratory of psychology professor Charles Gross, tackled the problem by studying neurons in the premotor cortex of macaque monkeys. The scientists found that certain neurons respond only to sounds within 30 cm (about a foot) of the head. One interesting part of the discovery was that each of these distance-sensing neurons was responsible for perceiving sounds that come from one very specific direction relative to the animal’s head. One cell may respond to sounds directly in front of the nose, while another may be triggered by a sound a few inches off to one side.

The study did not reveal what cues tip these cells off to the distance of a sound. Graziano suspects that it may have something to do with subtle variations in the timbre of a sound; a distant sound has a complex quality that comes from tiny echoes and distortions, while a nearby noise sounds cleaner and crisper.

The Princeton study adds to a view that the premotor cortex has the critical function of guiding a person’s movements toward and around objects close to the body. The region’s highly tuned vision, touch and sound perceptions work together to give us an exquisite level of control in reaching out to touch objects or moving to avoid them.

Understanding this ability could lead to applications for Graziano’s work in diverse areas. Stroke victims, for example, often lose their ability to judge the proximity of objects. These patients may see, hear or touch an object, but if it is in the "bad" or "neglected" part of space near the body, the patient ignores or fails to react to it. Knowing how neurons in the premotor cortex evaluate nearby objects could lead to better ways to diagnose and treat stroke.

The question is also important for making artificial limbs that closely mimic the function of natural ones, and for making better hearing or visual aids that create more lifelike perceptions.

Lastly, Graziano said that NASA has supported some of the work in his lab because scientists who design robots want to know how a computer like the brain translates inputs to outputs.

Graziano's research is currently funded by the National Institutes of Health and the McDonnell-Pew Program in Cognitive Neuroscience.