Friday, May 27, 2011

Human echolocation activates visual parts of the brain

From Neurophilosophy:

We all know that bats and dolphins use echolocation to navigate, by producing high frequency bursts of clicks and interpreting the sound waves that bounce off objects in their surroundings. Less well known is that humans can also learn to echolocate. With enough training, people can use this ability to do extraordinary things. Teenager Ben Underwood, who died of cancer in 2009, was one of a small number of blind people to master it. As the clip below shows, he could use echolocation not only to navigate and avoid obstacles, but also to identify objects, rollerskate and even play video games.

Very little research has been done on human echolocation, and nothing is known about the underlying brain mechanisms. In the first study of its kind, Canadian researchers used functional magnetic resonance imaging (fMRI) to monitor the brain activity of two blind echolocation experts. Their findings, published today in the open access journal PLoS ONE, show that echolocation engages regions of the brain that normally process vision.

Psychologist Lore Thaler of the University of Western Ontario and her colleagues recruited two expert echolocators for the new study. One, a 43-year-old man referred to as EB, was born with retinoblastoma - a form of cancer that affects cells in the retina - and had both eyes removed at 13 months of age. The other, a 27-year-old man known as LB, lost his vision at the age of 14, following degeneration of the optic nerve, which carries visual information from the eye to the brain. Both trained themselves to be expert echolocators, and both use click-based echolocation on a daily basis to navigate their home cities, explore unfamiliar ones, go hiking and play basketball.

The researchers seated their participants in a sealed room, placed various objects in front of them, and asked them to produce echolocation clicks. As they did so, the sounds they produced - and the faint echoes - were recorded with high quality stereo equipment. They also asked the participants to do the same thing in an outdoor courtyard surrounded by buildings, and made more recordings. Some of these contained echoes produced by a tree, car or lamp-post, while others did not.

EB and LB could accurately determine the size, shape, position and movements of objects in both situations. Crucially, they could do the same from the sound recordings when they were played back later. EB, for example, could distinguish a 3° difference in the position of a pole in the sealed room, as well as from the pre-recorded sounds. LB was slightly less accurate, distinguishing 9° differences in the pole's position while in the room and 22° differences from the recordings.

Thaler and her colleagues then scanned the blind participants' brains, and those of two sighted controls of the same age and sex, while they listened to the pre-recorded sounds through earphones. They found that the recordings activated the auditory cortex, which processes sounds, in all four participants. The sounds also activated parts of the visual cortex in the blind participants, but this activity was completely absent in the sighted controls. EB exhibited greater visual cortical activation than LB, possibly reflecting his greater experience with echolocation.

The researchers observed another difference when they compared the brain activity evoked by outdoor recordings with and without echoes. The recordings without echoes produced the same pattern of activity as those used in the first experiment. Remarkably, though, the extra activity evoked by the echo-containing recordings appeared in the visual cortex of the blind participants, but not in their auditory cortex.
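The logic of that comparison is essentially a subtraction: take the response to recordings that contain echoes and subtract the response to matched recordings that do not, leaving only the echo-related activity. The short Python sketch below illustrates such an echo-versus-no-echo contrast using simulated numbers rather than real fMRI data; the voxel labels and effect sizes are purely hypothetical and are not taken from the study.

    import numpy as np

    # Toy illustration (not the authors' analysis pipeline): contrast the mean
    # response of a voxel between recordings with echoes and recordings without.
    # All numbers here are simulated and the voxel labels are hypothetical.

    rng = np.random.default_rng(0)
    n_trials = 40  # simulated trials per condition

    # A "visual cortex" voxel that responds more when echoes are present, and an
    # "auditory cortex" voxel that responds equally to both kinds of recording,
    # mirroring the pattern reported for the blind participants.
    visual_echo      = rng.normal(1.0, 0.3, n_trials)
    visual_no_echo   = rng.normal(0.2, 0.3, n_trials)
    auditory_echo    = rng.normal(1.0, 0.3, n_trials)
    auditory_no_echo = rng.normal(1.0, 0.3, n_trials)

    def contrast(echo, no_echo):
        """Difference of condition means, i.e. the echo-related response."""
        return echo.mean() - no_echo.mean()

    print(f"visual cortex   echo - no-echo: {contrast(visual_echo, visual_no_echo):+.2f}")
    print(f"auditory cortex echo - no-echo: {contrast(auditory_echo, auditory_no_echo):+.2f}")

In this simulated output the "visual cortex" voxel shows a large positive contrast while the "auditory cortex" voxel's contrast sits near zero, which is the shape of the result described above.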

Although it is somewhat limited by the small number of participants, this study suggests that EB and LB both use echolocation in a way that is very similar to vision. The exact role of the visual cortex in human echolocation is unclear, but Thaler and her colleagues suggest that it might be processing spatial information contained in the echolocation clicks.

The researchers are cautious in their interpretation of the findings. They note numerous studies which show that blindness can lead to extensive brain re-organization. Such changes can produce cross-modal activation, whereby sensations activate brain regions that would not normally process them. But the observation that the echoes in the outdoor recordings activated visual but not auditory cortices in the blind participants supports the researchers' conclusion.

The use of pre-recorded sounds overcomes a number of difficulties in scanning the brains of echolocating people, and could stimulate further neuroimaging experiments on the phenomenon. Future studies of blind echolocators may confirm these new findings, and comparisons with sighted people who have been trained to echolocate, and with blind non-echolocators who have heightened sensitivity to echoes, could provide further insights into the underlying neural mechanisms.