Illustration of the acoustic pattern of mouth clicks used in human echolocation.
Credit: Thaler et al.; CC-BY

Like some bats and marine mammals, people can develop expert echolocation skills, in which they produce a clicking sound with their mouths and listen to the reflected sound waves to “see” their surroundings. A new study published in PLOS Computational Biology provides the first in-depth analysis of the mouth clicks used in human echolocation.

The research, performed by Lore Thaler of Durham University, U.K., Galen Reich and Michail Antoniou of the University of Birmingham, U.K., and colleagues, focuses on three blind adults who have been expertly trained in echolocation. All three have used echolocation in their daily lives since the age of 15 or younger, for activities such as hiking, visiting unfamiliar cities, and riding bicycles.

While the existence of human echolocation is well documented, the details of the underlying acoustic mechanisms have been unclear. In the new study, the researchers set out to provide physical descriptions of the mouth clicks used by each of the three participants during echolocation. They recorded and analyzed the acoustic properties of several thousand clicks made in an acoustically controlled room, including the spatial distribution of the emitted sound waves.

Analysis of the recordings revealed that the participants' clicks had a distinct acoustic pattern, more focused in its direction than that of human speech. The clicks were brief, around three milliseconds long, and their strongest frequencies lay between two and four kilohertz, with some additional energy around 10 kilohertz.
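
To make these figures concrete, here is a minimal sketch (in Python, with NumPy) that builds a stand-in click with roughly the reported properties and checks its spectrum. The sampling rate, the damped-sinusoid shape, and the decay constants are illustrative assumptions, not values taken from the study.

```python
import numpy as np

# All numbers below are illustrative assumptions, not the paper's data.
fs = 48_000                            # assumed sampling rate (Hz)
t = np.arange(int(0.003 * fs)) / fs    # ~3 ms: the reported click duration

# Stand-in click: two damped sinusoids near the reported energy peaks
# (a primary component in the 2-4 kHz band, a weaker one near 10 kHz).
click = (np.exp(-t / 0.0007) * np.sin(2 * np.pi * 3_000 * t)
         + 0.3 * np.exp(-t / 0.0004) * np.sin(2 * np.pi * 10_000 * t))

# Zero-padded magnitude spectrum; peaks should sit near 3 kHz and 10 kHz.
spectrum = np.abs(np.fft.rfft(click, n=8192))
freqs = np.fft.rfftfreq(8192, d=1 / fs)
print(f"strongest component: {freqs[np.argmax(spectrum)]:.0f} Hz")
```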

The researchers also used the recordings to propose a mathematical model for synthesizing the mouth clicks made during human echolocation. They plan to use synthetic clicks to investigate how these sounds can reveal the physical features of objects; the number of measurements such studies require would be impractical to ask of human volunteers.
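
The paper's actual model is not reproduced here, but the general idea of model-based synthesis can be sketched as follows: wrap a click generator in a function with free parameters, then draw as many randomized clicks as an experiment needs. The function name, the parameter ranges, and the damped-sinusoid form are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
FS = 48_000  # assumed sampling rate (Hz)

def synthesize_click(f_main, f_high, decay_s, dur_s=0.003, fs=FS):
    """Hypothetical click generator built from two damped sinusoids.

    A stand-in for the paper's model: the functional form, the 0.3
    relative level of the high component, and the parameter ranges
    are illustrative assumptions.
    """
    t = np.arange(int(dur_s * fs)) / fs
    return (np.exp(-t / decay_s) * np.sin(2 * np.pi * f_main * t)
            + 0.3 * np.exp(-t / decay_s) * np.sin(2 * np.pi * f_high * t))

# A virtual echolocator can emit thousands of parameter-varied clicks --
# far more measurements than one could ask of a human volunteer.
clicks = [synthesize_click(f_main=rng.uniform(2_000, 4_000),
                           f_high=rng.uniform(9_000, 11_000),
                           decay_s=rng.uniform(0.0004, 0.0008))
          for _ in range(10_000)]
print(len(clicks), "synthetic clicks,", clicks[0].size, "samples each")
```

Generating ten thousand clicks this way takes a fraction of a second, which is the point of a virtual echolocator: it can supply measurement counts no human participant could.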

“The results allow us to create virtual human echolocators,” Thaler says. “This allows us to embark on an exciting new journey in human echolocation research.”


Story Source: Materials provided by PLOS. Original written by Whitney Clavin. Note: Content may be edited for style and length.
Journal Reference:
Lore Thaler, Galen M. Reich, Xinyu Zhang, Dinghe Wang, Graeme E. Smith, Zeng Tao, Raja Syamsul Azmir Bin Raja Abdullah, Mikhail Cherniakov, Christopher J. Baker, Daniel Kish, Michail Antoniou. Mouth-clicks used by blind expert human echolocators – signal description and model based signal synthesis. PLOS Computational Biology, 2017; 13 (8): e1005670. DOI: 10.1371/journal.pcbi.1005670