Have you ever watched a baseball player make a seemingly impossible catch? Snatched a tumbling toddler from mid-air before they hit the floor? Intercepted a wily dog attempting to squirt through an open gate to freedom?
If you have, you likely have a sense of what scientists know: that highly precise interactions are going on between our visual and motor systems.
Neuroscientists at the University of Rochester say there’s something else at work during such feats. And that’s the ability to visually predict movement.
The researchers set up an experiment using marmosets, small, long-tailed South American monkeys with cute tufts on the sides of their heads. They recorded the animals with high-speed cameras and analyzed the footage with DeepLabCut, an artificial intelligence tool that measures and tracks arm and hand movements in video.
They measured an 80-millisecond delay in the primates' visuomotor behavior — that is, the lag between the moment the monkey sees its target and the moment that visual information starts steering its hand toward the object, which in this experiment was a cricket. Marmosets find crickets to be a yummy snack.
Despite the delay, the marmosets were still able to grab the crickets. That means the monkeys had to be able to predict the crickets' movement.
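The logic of that inference can be sketched in code: if a reach launched now is guided by visual information that is roughly 80 milliseconds old, the brain must aim at where the target will be, not where it was last seen. The following is a minimal illustrative sketch of that kind of forward extrapolation; the function, numbers, and units are our own assumptions, not the researchers' actual model.

```python
# Illustrative sketch of delay compensation by linear extrapolation.
# The 80 ms figure comes from the study; everything else is hypothetical.

SENSORIMOTOR_DELAY = 0.080  # seconds of lag between seeing and moving


def predict_position(pos, velocity, delay=SENSORIMOTOR_DELAY):
    """Extrapolate a moving target's 2-D position forward by the delay.

    pos      -- (x, y): last observed position, in meters
    velocity -- (vx, vy): estimated velocity, in meters per second
    """
    x, y = pos
    vx, vy = velocity
    # Aim where the target will be after the sensorimotor delay elapses.
    return (x + vx * delay, y + vy * delay)


# A cricket last seen at (0.10, 0.20) m, hopping at 0.5 m/s along x:
aim_point = predict_position((0.10, 0.20), (0.5, 0.0))
```

Here the predicted aim point lands 4 centimeters ahead of the cricket's last seen position along its direction of travel, which is exactly the kind of lead a successful grab would require.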
Perhaps surprisingly, the scientists’ goal in building a detailed model of vision-guided reaching behavior has nothing to do with improving humans’ reflexes or athletic ability.
Instead, they said they hope a better understanding of this behavior will give scientists insight into what goes wrong when neurological disorders interrupt humans' ability to perform such tasks.