The human brain processes images of animals and other objects we see differently from the way it processes our own faces, a new study suggests.
When we see a picture of ourselves taken with a cell phone camera, the brain does not have the same access to that image as it does to our own face, the scientists report in the Journal of Neuroscience.
The findings may be important for understanding how our brains process images and the neural pathways that allow us to remember them, the researchers say.
They found that when a picture shows a person, the brain begins processing it as a picture.
When the picture does not show a human, the image becomes part of the environment, and the brain tries to make sense of it as such.
But the brain can only do that when there is a clear contrast between the image and its surroundings.
This “clear contrast” means that when we look at a person in a different context, we don’t see what is in front of or behind them, the researchers say.
“We think this is because we are not used to seeing a human face in a photo, but instead in a new environment,” said lead author Rolf Nieder, a neuroscientist at the Max Planck Institute for Evolutionary Anthropology.
“If we see something that is really different, we are more likely to think of that as something different,” he said.
The researchers found that people who were exposed to a computer screen showed differences in activity in the neural network that processes images.
They also found that some of the brain areas involved in learning to recognize faces also responded differently to images of human faces.
“When we look for faces in our environment, the cortex does not work as well,” Nieder said.
“This may be because we have a different perspective on the world and don’t get used to the difference between a human and a dog.”
To find out, the team created a computerized model of the human visual system, the part of our brains that processes vision and also helps us tell people apart.
They tested it against a computer model of a face.
The model performed very well.
But when it was tested in a more realistic environment, it was less accurate than the human model.
A brain imaging study produced similar results: the human models were more accurate than the computer models.
“The brain in the human visual cortex works differently than it does in the computer model,” Nieder said, adding that the researchers plan to study whether other parts of the visual cortex differ as well.
What’s the connection?
The researchers said their work raises interesting questions about the role of the visual system in human cognition.
“It suggests that the brain may be able to detect faces and understand them in different ways than a computer would,” Nieder said.
“And that may have implications for understanding our own brains and our own ability to recognize and remember objects.”