Audio cues not only help us recognize objects more quickly but can even alter our visual perception. That is, pair birdsong with a bird and we see a bird — but replace that birdsong with a squirrel’s chatter, and we’re not quite so sure what we’re looking at.
“Your brain spends a significant amount of energy to process the sensory information in the world and to give you that feeling of a full and seamless perception,” said lead author Jamal R. Williams (University of California, San Diego) in an interview. “One way that it does this is by making inferences about what sorts of information should be expected.”
Although these “informed guesses” can help us to process information more quickly, they can also lead us astray when what we’re hearing doesn’t match up with what we expect to see.
“Even when people are confident in their perception, sounds reliably altered them away from the true visual features that were shown,” the researchers reported.
“When sounds are related to pertinent visual features, those visual features are prioritized and processed more quickly compared to when sounds are unrelated to the visual features. So, if you heard the sound of a birdsong, anything bird-like is given prioritized access to visual perception,” Williams explained. “We found that this prioritization is not purely facilitatory and that your perception of the visual object is actually more bird-like than if you had heard the sound of an airplane flying overhead.”
Taken together, these findings suggest that sounds alter visual perception only when audio and visual input occur at the same time, the researchers concluded.
“This process of recognizing objects in the world feels effortless and fast, but in reality it’s a very computationally intensive process,” Williams said. “To alleviate some of this burden, your brain is going to evaluate information from other senses.” Williams and colleagues would like to build on these findings by exploring how sounds may influence our ability to locate objects, how visual input may influence our perception of sounds, and whether audiovisual integration is an innate or learned process.
Complete article at Science Daily.
Researchers acknowledge, “This process of recognizing objects in the world feels effortless and fast, but in reality it’s a very computationally intensive process.” So many seemingly simple aspects of our existence turn out to exhibit layered complexity. What is evolution’s explanation for these marvels of functional complexity? Can undirected processes bring about the successful, “computationally intensive process” of object recognition?