Echolocating bats accumulate information from acoustic snapshots to predict auditory object motion
Edited by Ranulfo Romo, National Autonomous University of Mexico, Mexico City, D.F., Mexico, and approved October 1, 2020 (received for review June 8, 2020)

Significance
Research on visual tracking of moving stimuli has contributed to our understanding of sensory-guided behaviors; however, the processes that support auditory object tracking in natural three-dimensional environments remain largely unknown. This is important not only to diverse groups of animals but also to humans who rely on hearing to track objects in their environment. For visually impaired individuals, hearing is paramount for auditory object tracking and navigation, and in recent years, mobility training programs for the blind have included instruction on echolocation using tongue clicks. In this work, we provide a conclusive demonstration that echolocating bats use predictive strategies to track moving auditory objects, which can inform future comparative work on auditory motion processing.
Abstract
Unlike other predators that use vision as their primary sensory system, bats compute the three-dimensional (3D) position of flying insects from discrete echo snapshots, which raises questions about the strategies they employ to track and intercept erratically moving prey from interrupted sensory information. Here, we devised an ethologically inspired behavioral paradigm to directly test the hypothesis that echolocating bats build internal prediction models from dynamic acoustic stimuli to anticipate the future location of moving auditory targets. We quantified the direction of the bat’s head/sonar beam aim and echolocation call rate as it tracked a target that moved across its sonar field and applied mathematical models to differentiate between nonpredictive and predictive tracking behaviors. We discovered that big brown bats accumulate information across echo sequences to anticipate an auditory target’s future position. Further, when a moving target is hidden from view by an occluder during a portion of its trajectory, the bat continues to track its position using an internal model of the target’s motion path. Our findings also reveal that the bat increases sonar call rate when its prediction of target trajectory is violated by a sudden change in target velocity. This shows that the bat rapidly adapts its sonar behavior to update internal models of auditory target trajectories, which would enable tracking of evasive prey. Collectively, these results demonstrate that the echolocating big brown bat integrates acoustic snapshots over time to build prediction models of a moving auditory target’s trajectory and enable prey capture under conditions of uncertainty.
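The abstract describes applying mathematical models to the bat's head/sonar beam aim to distinguish nonpredictive tracking (aiming at a delayed or current target position) from predictive tracking (aiming ahead of the target along its extrapolated path). The paper's actual models are not reproduced here, but the logic of that comparison can be sketched as follows; the function names, the straight-line target, and the lag/lead parameter `tau` are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def simulate_target(n=200, dt=0.01, speed=1.0):
    """Straight-line target trajectory along one axis (arbitrary units)."""
    t = np.arange(n) * dt
    return t, speed * t

def model_aim(target_pos, t, tau, mode):
    """Predicted aim direction at each time step under one tracking model.

    mode='lag'     : aim at where the target was tau seconds ago
                     (nonpredictive tracking with sensorimotor delay).
    mode='predict' : aim at where the target will be tau seconds ahead,
                     extrapolated along its trajectory (predictive tracking).
    """
    shift = tau if mode == "predict" else -tau
    return np.interp(t + shift, t, target_pos)

# Synthetic "observed" aim that leads the target, as a predictive
# tracker would (stand-in for measured head-aim data):
t, pos = simulate_target()
tau = 0.1
observed_aim = np.interp(t + tau, t, pos)

# Compare mean squared error of each model against the observed aim;
# the better-fitting model classifies the tracking behavior.
err_lag = np.mean((observed_aim - model_aim(pos, t, tau, "lag")) ** 2)
err_pred = np.mean((observed_aim - model_aim(pos, t, tau, "predict")) ** 2)
best = "predict" if err_pred < err_lag else "lag"
```

With lead-biased aim data, the predictive model fits with lower error, so `best` comes out `"predict"`; lag-biased data would favor the nonpredictive model instead.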
Footnotes
- 1To whom correspondence may be addressed. Email: angiesalles@gmail.com.
- 2A.S. and C.A.D. contributed equally to this work.
Author contributions: A.S. and C.F.M. designed research; A.S. and C.A.D. performed research; A.S., C.A.D., and C.F.M. contributed new reagents/analytic tools; A.S. and C.A.D. analyzed data; and A.S., C.A.D., and C.F.M. wrote the paper.
The authors declare no competing interest.
This article is a PNAS Direct Submission.
This article contains supporting information online at https://www.pnas.org/lookup/suppl/doi:10.1073/pnas.2011719117/-/DCSupplemental.
Data Availability.
MATLAB files have been deposited in GitHub at https://github.com/angiesalles/batTargetPrediction.
Published under the PNAS license.
Article Classifications
- Biological Sciences
- Psychological and Cognitive Sciences