Using advances in neuroimaging, scientists have for the first time shown how the brain uses objects to recognize a person’s surroundings.
For the study, Boston College neuroscientist Sean MacEvoy and colleague Russell Epstein of the University of Pennsylvania used functional magnetic resonance imaging (fMRI) to help them identify how the brain figures out where it is in the world (scene recognition).
Study participants had their brains scanned while they looked at photos of four types of scenes: kitchens, bathrooms, intersections and playgrounds. Separately, the researchers took brain scans while the subjects looked at photos of individual objects particular to those scenes (e.g., refrigerators, bathtubs, cars, and slides).
Brain patterns evoked by whole scenes in one half of the scans were then compared with predictor patterns built from the object-evoked patterns in the other half.
MacEvoy and Epstein found that they could use the brain patterns produced by objects as keys to decipher the brain patterns produced by scenes, and could “read out” what type of scene a participant was seeing at a given point in time.
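The decoding idea described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' actual analysis pipeline: it assumes synthetic voxel data, builds a predictor pattern for each scene category by averaging that category's object-evoked patterns, and then labels a scene-evoked pattern with the category whose predictor correlates with it best.

```python
import numpy as np

rng = np.random.default_rng(0)
n_voxels = 200
categories = ["kitchen", "bathroom", "intersection", "playground"]

# Hypothetical data: one underlying voxel pattern per scene category.
true_patterns = {c: rng.normal(size=n_voxels) for c in categories}

# Object-evoked patterns: noisy copies of the category pattern
# (e.g., refrigerator and stove both resembling "kitchen").
object_patterns = {
    c: [true_patterns[c] + rng.normal(scale=0.5, size=n_voxels)
        for _ in range(2)]
    for c in categories
}

# Predictor pattern per category = average of its object-evoked patterns.
predictors = {c: np.mean(object_patterns[c], axis=0) for c in categories}

def decode(scene_pattern):
    """Return the category whose predictor correlates best with the scene."""
    scores = {c: np.corrcoef(scene_pattern, p)[0, 1]
              for c, p in predictors.items()}
    return max(scores, key=scores.get)

# A scene-evoked "kitchen" pattern with independent noise, as if measured
# in a separate half of the scanning session.
scene = true_patterns["kitchen"] + rng.normal(scale=0.5, size=n_voxels)
print(decode(scene))
```

With the signal-to-noise level chosen here, the correct category wins by a wide correlation margin; the real study faced far noisier data and averaged over many trials.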
“While previous research on scene recognition has emphasized the role of the three-dimensional layout of scenes in this process, our results suggest a separate system that utilizes information about the objects in scenes to piece together where we are,” said MacEvoy.
“While that’s a strategy that many of us think we might use, here we have evidence of a brain area that could be responsible for it,” he added.
The study has been published in the latest issue of Nature Neuroscience.