New Study Finds Eerie Similarity Between Brain, Computer Vision


Artificial neural networks are designed to mimic human brains, but they might just be too similar.

October 26, 2020


A recent study from researchers at Johns Hopkins University suggests artificial neural networks can "perceive" 3D objects in the same "first-glance" way our brains do. The findings were published Thursday in the journal Current Biology.

Artificial neural networks are modeled on organic brains, so we shouldn't be surprised when they mimic them.

That is, until the similarity becomes frightening.


Human brain versus artificial neural network

The human brain processes visual information through several sections of the brain, with each combining different aspects of a perceived object to form a full image.

From a scientific standpoint, this is how we come to perceive the external world.

The new study, however, shows that neurons in area V4, the first stage of the brain's object-vision pathway, represent 3D shape fragments rather than only 2D shapes, a finding that runs contrary to the consensus held over roughly the last 40 years, reports TechXplore.

Yet scientists were astonished to find the same patterns the brain uses to see in an artificial neural network after studying AlexNet, an advanced computer vision network, Futurism reports.
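For readers curious what a network like AlexNet actually looks like under the hood, the sketch below (not taken from the study, and assuming a recent torchvision install) loads the pretrained AlexNet and passes a stand-in image through its convolutional stages one at a time, printing how the representation changes as each stage builds on the last, loosely mirroring the staged processing in the visual cortex described above.

    import torch
    from torchvision import models

    # Load torchvision's pretrained AlexNet (weights download on first use).
    model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
    model.eval()

    # Walk a stand-in "photograph" through the convolutional stages one at a
    # time; each stage operates on the output of the one before it.
    x = torch.randn(1, 3, 224, 224)  # random stand-in for a 224x224 RGB image
    for idx, layer in enumerate(model.features):
        x = layer(x)
        print(f"after stage {idx} ({layer.__class__.__name__}): {tuple(x.shape)}")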

Johns Hopkins University neuroscientist and study author Ed Connor and his team applied the same image-response tests to natural and artificial neurons and discovered spookily similar response patterns in the brain's V4 area and AlexNet's layer 3.
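The team's actual analysis isn't reproduced here, but the sketch below illustrates one common way to make that kind of comparison, again assuming torchvision's pretrained AlexNet. The choice of features[6] as "layer 3", the simulated V4 responses, and the representational-similarity-style correlation are placeholders for illustration only.

    import numpy as np
    import torch
    from torchvision import models

    model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1).eval()

    # Capture activations from an intermediate convolutional layer with a
    # forward hook. features[6] is the third conv layer in torchvision's
    # indexing; treating it as the paper's "layer 3" is an assumption.
    captured = {}
    model.features[6].register_forward_hook(
        lambda module, inp, out: captured.update(layer3=out.detach())
    )

    images = torch.randn(20, 3, 224, 224)    # stand-ins for 20 test images
    with torch.no_grad():
        model(images)
    unit_responses = captured["layer3"].flatten(1).numpy()  # (images, units)

    # Hypothetical placeholder for recorded V4 firing rates to the same images.
    v4_responses = np.random.rand(20, 50)    # (images, neurons)

    # Compare the two systems by correlating their image-by-image similarity
    # structures (a representational-similarity-style comparison).
    def image_similarity(responses):
        return np.corrcoef(responses)        # (images, images)

    score = np.corrcoef(
        image_similarity(unit_responses).ravel(),
        image_similarity(v4_responses).ravel(),
    )[0, 1]
    print(f"pattern similarity, AlexNet layer vs. simulated V4: {score:.2f}")

A score near 1 would mean the two systems treat the same pairs of images as similar, which is the sense in which natural and artificial response patterns can be compared at all.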

“I was surprised to see strong, clear signals for 3D shape as early as V4,” said Connor in a press release about the surprising findings. “I never would have guessed in a million years that you would see the same thing happening in AlexNet, which is only trained to translate 2D photographs into object labels.”

Learning more than just vision from computers 

“Artificial networks are the most promising current models for understanding the brain,” added Connor. “Conversely, the brain is the best source of strategies for bringing artificial intelligence closer to natural intelligence.”

This latest study could signify the coming of a new norm for artificial neural networks — where instead of using what we know about the brain to build computers, we look to computers to learn how our brains work. It's a paradoxical time to be alive, whether we're looking at the world with a human brain or a computer's neural network.
