The location of the ‘physics engine’ in the brain is highlighted in color in this illustration. Credit: Jason Fischer/JHU

Whether or not they aced the subject in high school, human beings are physics masters when it comes to understanding and predicting how objects in the world will behave. A Johns Hopkins University cognitive scientist has found the source of that intuition, the brain’s “physics engine.”

This engine, which comes alive when people watch physical events unfold, is not in the brain’s vision center but in a set of regions devoted to planning actions. That suggests the brain performs constant, real-time physics calculations so people are ready to catch, dodge, hoist, or take any other necessary action on the fly. The findings, which could help in designing more nimble robots, are set to be published in the journal Proceedings of the National Academy of Sciences.

“We run physics simulations all the time to prepare us for when we need to act in the world,” said lead author Jason Fischer, an assistant professor of psychological and brain sciences in the university’s Krieger School of Arts and Sciences. “It is among the most important aspects of cognition for survival. But there has been almost no work done to identify and study the brain regions involved in this capability.”

Fischer, along with researchers at the Massachusetts Institute of Technology, conducted a series of experiments to find the parts of the brain involved in physical inference. First, they had 12 subjects look at videos of Jenga-style block towers. While monitoring the subjects’ brain activity, the team asked them either to predict where the blocks would land if the tower toppled or to guess whether the tower had more blue or yellow blocks. Predicting the direction of the falling blocks required physics intuition, while the color question was purely visual.

Next, the team had other subjects watch a video of two dots bouncing around a screen and asked them to predict which direction the dots would head next, based either on physical or social reasoning.


With both the blocks and the dots, the team found that when subjects attempted to predict physical outcomes, the most responsive brain regions included the premotor cortex and the supplementary motor area, the brain’s action-planning regions.

“Our findings suggest that physical intuition and action planning are intimately linked in the brain,” Fischer said. “We believe this might be because infants learn physics models of the world as they hone their motor skills, handling objects to learn how they behave. Also, to reach out and grab something in the right place with the right amount of force, we need real-time physical understanding.”

In the last part of the experiment, the team asked subjects to look at short movie clips — just to look; they received no other instructions — while having their brain activity monitored. Some of the clips had a lot of physics content, others very little. The team found that the more physical content in a clip, the more the key brain regions activated.

“The brain activity reflected the amount of physical content in a movie, even if people weren’t consciously paying attention to it,” Fischer said. “This suggests that we are making physical inferences all the time, even when we’re not even thinking about it.”

The findings could also offer insight into movement disorders such as apraxia. It is quite possible, Fischer notes, that people with damage to the brain’s motor areas have what he calls “a hidden impairment”: trouble making physical judgments.

A better understanding of how the brain runs physics calculations could also enrich robot design. A robot built with an internal physics model, running constantly almost like a video-game engine, could navigate the world more fluidly.
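Purely as an illustration of that idea, and not anything described in the study, here is a minimal Python sketch of what such an always-running forward simulation might look like for a robot trying to catch a falling object. The ballistic model, the function names, and all the parameters are hypothetical assumptions chosen for the example.

```python
# Hypothetical sketch: an always-on "physics engine" loop for a robot.
# Given an object's latest observed position and velocity, forward-simulate a
# simple 2-D ballistic model to predict where it will land, then plan a move
# to intercept. Everything here is illustrative, not taken from the study.

from dataclasses import dataclass

G = 9.81   # gravitational acceleration (m/s^2)
DT = 0.01  # simulation time step (s)

@dataclass
class State:
    x: float   # horizontal position (m)
    y: float   # height above ground (m)
    vx: float  # horizontal velocity (m/s)
    vy: float  # vertical velocity (m/s)

def simulate_landing(state: State) -> float:
    """Step the ballistic model forward until the object reaches the ground,
    returning the predicted landing x-coordinate."""
    x, y, vx, vy = state.x, state.y, state.vx, state.vy
    while y > 0.0:
        x += vx * DT
        vy -= G * DT
        y += vy * DT
    return x

def plan_intercept(robot_x: float, predicted_x: float, max_step: float = 0.05) -> float:
    """Return the robot's next horizontal move toward the predicted landing
    point, capped at max_step per control tick."""
    error = predicted_x - robot_x
    return max(-max_step, min(max_step, error))

if __name__ == "__main__":
    # Example: a ball observed 2 m up, moving 1.5 m/s horizontally, 0.5 m/s upward.
    observed = State(x=0.0, y=2.0, vx=1.5, vy=0.5)
    robot_x = 0.0
    landing_x = simulate_landing(observed)   # predict before the ball falls
    print(f"Predicted landing at x = {landing_x:.2f} m")
    for _ in range(100):                     # re-plan on every control tick
        robot_x += plan_intercept(robot_x, landing_x)
    print(f"Robot position after planning: {robot_x:.2f} m")
```

The point of the sketch is the loop structure: the prediction is computed ahead of time from a cheap internal model and then continually used to drive action, rather than waiting to react after the physical event has already played out.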

Source: Johns Hopkins University

Research Reference:

Jason Fischer, John G. Mikhael, Joshua B. Tenenbaum, Nancy Kanwisher. Functional neuroanatomy of intuitive physical inference. Proceedings of the National Academy of Sciences, 2016; 201610344. DOI: 10.1073/pnas.1610344113