AI falls behind human brain in key area


Ever looked at a photo of a river and instinctively known you would be able to swim in it? Or seen a trail on your drive to work and thought, “I could walk there”? It’s a neat trick our brains do without us consciously asking them to — and one that AI struggles to emulate.

A new study from researchers in Amsterdam shows that our brains automatically recognize "affordances," a term for the opportunities for action a scene offers, when we look at the world around us. Common examples include knowing you could climb a set of stairs or cycle down a path, or recognizing when you should not move forward.

Using MRI scans, researchers tracked brain activity while participants viewed photos of various settings, like mountain trails and city streets. Participants then pressed buttons to indicate how they thought they could move through each scene. Even without explicit instruction to think about movement, participants' brains lit up in ways that suggested they were evaluating different possibilities for action.

Specifically, parts of the visual cortex activated based not only on what the participant saw, but what they could envision doing. So, while you’re looking at that forest trail, your brain is already considering your next step — literally.

The kicker? Researchers also tested how well AI models performed on the same task. The result? Not great. While AI could label objects, it struggled to "see" what humans see: action, movement, potential.

Why does this matter? Well, as we build AI for real-world tasks, like disaster-response robots or self-driving cars, understanding how we move through space could make those systems more efficient … and more humanlike in how they behave.
