The fNIR sensor, which is placed around the forehead like a headband, uses LEDs that shine near-infrared light to sample from 16 brain areas. Researchers use the technology to monitor oxygenation in the brain while the subject performs cognitive tasks. Credit: Drexel University

“Smart” eyewear that can overlay augmented reality on your own view of the world, feed you live information about your surroundings and even be used in the operating room is no longer the stuff of science fiction.

Wearable displays also have the potential to enhance cognitive ergonomics or, more simply, to make certain tasks less mentally taxing to complete. But before technologies like Google Glass become a part of daily life, engineers need a way to monitor exactly how they affect the brain in everyday situations.

At Drexel University, researchers have developed a portable system that can do just that. The system uses functional near-infrared spectroscopy, or fNIRS, to measure a person’s brain activity.

The applications for fNIRS are seemingly endless: from training air traffic controllers and drone operators, to studying how students with disabilities learn best, to understanding why different people are more receptive to certain Super Bowl commercials.

“This is a new trend called neuroergonomics. It’s the study of the brain at work — cognitive neuroscience plus human factors,” said Hasan Ayaz, PhD, associate research professor in the School of Biomedical Engineering, Science and Health Systems and a member of Drexel’s CONQUER Collaborative. The term was coined by the late Raja Parasuraman, a former professor at George Mason University and a co-author of the study.


Until now, most studies involving fNIRS have taken place indoors. Though participants wearing the system could move around freely while being monitored, they were still observed within the confines of a laboratory.

Drexel biomedical engineers, in collaboration with researchers at George Mason University, have now brought their portable fNIRS system “into the wild.” In their study, published this summer in Frontiers in Human Neuroscience, the researchers successfully measured the brain activity of participants navigating a college campus outdoors.

The researchers wanted to compare one group of participants navigating campus with Google Glass to another group using Google Maps on an iPhone. Their goal was to measure mental workload (how hard the brain is working) and situation awareness (the perception of environmental elements), in order to see which device was less mentally taxing.

They found that, overall, Google Glass users had higher situation awareness and a lower mental workload than their peers navigating with an iPhone.

However, the researchers also found that users wearing Google Glass fell victim to “cognitive tunneling”: they focused so much of their attention on the display itself that they tended to overlook other aspects of their surroundings.

“What we were able to see were the strengths and weaknesses of both. Now that we know we are able to capture that, we can now improve their design,” said Ayaz, the study’s principal investigator. “This opens up all new areas of applications. We will be able to analyze how the brain is functioning during all of these natural activities that you cannot replicate in artificial lab settings.”

fNIRS is a way to measure oxygenation levels in the prefrontal cortex, the part of the brain responsible for complex behaviors like decision making, cognitive expression and personality development. Greater activity in this area signals that a person is a novice at an activity and therefore must work harder at it. As someone masters a skill, the processing of information shifts toward the back regions of the brain.
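
Concretely, fNIRS infers those oxygenation changes from how much near-infrared light is absorbed at two or more wavelengths, using the modified Beer-Lambert law. The sketch below is only illustrative: the wavelengths, extinction coefficients, source-detector separation and pathlength factors are assumed values for demonstration, not the calibration of the Drexel sensor.

```python
import numpy as np

# Illustrative extinction coefficients for oxy- (HbO) and deoxy-hemoglobin (HbR)
# at two near-infrared wavelengths. Assumed values for this sketch only.
EPSILON = np.array([
    # [epsilon_HbO, epsilon_HbR]
    [390.0, 1102.0],   # ~730 nm
    [1058.0, 691.0],   # ~850 nm
])

def oxygenation_change(intensity, baseline, separation_cm=2.5, dpf=(6.0, 6.0)):
    """Estimate relative changes in HbO and HbR concentration from detected
    light intensities at two wavelengths (modified Beer-Lambert law)."""
    intensity = np.asarray(intensity, dtype=float)
    baseline = np.asarray(baseline, dtype=float)
    dpf = np.asarray(dpf, dtype=float)

    # Change in optical density relative to a resting baseline, per wavelength.
    delta_od = -np.log10(intensity / baseline)

    # Effective optical path length in tissue for each wavelength.
    path_cm = separation_cm * dpf

    # Solve the 2x2 linear system: delta_od = (EPSILON scaled by path) @ [dHbO, dHbR].
    d_hbo, d_hbr = np.linalg.solve(EPSILON * path_cm[:, None], delta_od)
    return {"dHbO": d_hbo, "dHbR": d_hbr}

# Example: more light absorbed at ~850 nm and slightly less at ~730 nm than at
# rest; with these illustrative coefficients the solver attributes that to a
# rise in HbO and a drop in HbR, a typical signature of increased activity.
print(oxygenation_change(intensity=[1.01, 0.95], baseline=[1.0, 1.0]))
```

A sustained rise in oxygenated hemoglobin over the prefrontal cortex, tracked across the headband's 16 measurement sites, is the kind of signal the researchers read as the brain working harder.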

In the past, researchers had to use secondary tasks to measure the “user-friendliness” of an augmented reality product, like Google Glass. For instance, while a person was navigating with a maps application, they would be asked to recall a series of sounds played to them through headphones. If their responses were inaccurate, this implied that their brain had to work harder to pay attention to the primary task at hand.
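
As a rough illustration of that dual-task logic, recall accuracy on the secondary task serves as an inverse proxy for how hard the primary task is working the brain. The task content and scoring rule below are assumptions for the example, not the protocol used in the study.

```python
# Minimal sketch of dual-task scoring: the worse a participant recalls the
# secondary-task sounds, the more of their attention the primary (navigation)
# task is presumed to have consumed. Sequences and scoring are illustrative.

def secondary_task_accuracy(played, recalled):
    """Fraction of sounds recalled in the correct position."""
    hits = sum(p == r for p, r in zip(played, recalled))
    return hits / len(played)

played_tones = ["high", "low", "low", "high", "low"]
recalled_tones = ["high", "low", "high", "high", "low"]  # one error

accuracy = secondary_task_accuracy(played_tones, recalled_tones)
print(f"secondary-task accuracy: {accuracy:.0%}")  # lower => higher inferred workload
```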

For comparison, the Drexel researchers also used secondary tasks to measure mental workload and situation awareness. However, they found that these tasks were intrusive and ultimately unnecessary: the fNIRS system alone was able to accurately assess brain activity during the task and capture the differences between a hand-held display and a wearable display.

“We observed greater mental capacity reserves for head-mounted display users during ambulatory navigation based on behavioral and neuro-metabolic evidence. However, we also observed evidence that some of the advantages of head-mounted displays are overshadowed by their suboptimal display symbology, which can be overly attention grabbing,” said Ryan McKendrick, PhD, the study’s lead author and now a cognitive scientist at Northrop Grumman Corporation.

Since the research team found that Google Glass users experienced some cognitive tunneling while navigating, they suggest that future studies identify other brain biomarkers induced by this “blindness” to the outside world. By identifying cognitive tunneling biomarkers, engineers could “greatly advance display design for navigation, training and other tasks” that wearable displays are expected to enhance.


Source: Drexel University

Research Reference:

1. Ryan McKendrick, Raja Parasuraman, Rabia Murtza, Alice Formwalt, Wendy Baccus, Martin Paczynski, Hasan Ayaz. Into the Wild: Neuroergonomic Differentiation of Hand-Held and Augmented Reality Wearable Displays during Outdoor Navigation with Functional Near Infrared Spectroscopy. Frontiers in Human Neuroscience, 2016; 10. DOI: 10.3389/fnhum.2016.00216