Multimodal perception and navigation in bats

Anand Krishnan

My studies investigate the relative roles of multisensory input in flight control and navigation, using the echolocating bat as a model system. While the use of acoustic information by bats for spatial perception and navigation has been studied extensively, relatively little is known about how bats use visual information. My work addresses both how visual cues guide flight and navigation, and how bats respond to sensory conflict, that is, a mismatch in the information provided by different senses. In an ongoing set of experiments, we are equipping Egyptian fruit bats with displacing prisms over the eyes to create conflicting visual and acoustic cues. We hypothesize that if bats rely on visual information, we should see a corresponding displacement of their flight trajectories, whereas if they can compensate for the visual displacement using echolocation, we should see changes over time both in their flight behavior and in their use of biosonar.
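The two predicted outcomes above can be sketched as simple quantitative models. This is a hypothetical illustration, not the lab's analysis: the prism displacement angle, trial count, and the exponential recalibration time constant are all invented for the example. A vision-dominant bat would show a lateral flight error equal to the full prism shift on every trial, while an echolocation-compensating bat would show an error that decays toward zero across trials.

```python
import math

# Hypothetical parameters (not from the study): prism displacement
# in degrees and number of flight trials recorded.
PRISM_SHIFT_DEG = 10.0
N_TRIALS = 20

def vision_dominant_offset(trial: int) -> float:
    """If flight is guided purely by the displaced visual scene,
    the lateral trajectory error stays at the full prism shift."""
    return PRISM_SHIFT_DEG

def echolocation_compensated_offset(trial: int, tau: float = 5.0) -> float:
    """If echolocation recalibrates the displaced visual estimate,
    the error decays toward zero over trials (exponential model;
    tau is an assumed recalibration time constant in trials)."""
    return PRISM_SHIFT_DEG * math.exp(-trial / tau)

# Predicted lateral error (degrees) per trial under each hypothesis.
vision = [vision_dominant_offset(t) for t in range(N_TRIALS)]
compensated = [echolocation_compensated_offset(t) for t in range(N_TRIALS)]
```

Comparing the observed trial-by-trial error against curves like these (constant vs. decaying) is one simple way to distinguish the two hypotheses.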

Echolocation and vision: Sensory congruence & conflict

Copyright © 2017 Batlab, Johns Hopkins University
Questions and comments to wxian1@jhu.edu