[Image: fNIRS sensor worn with and without a VR headset]
In the first study to compare brain activity during visuospatial problem-solving across immersive virtual reality (VR), 2-D computer screens and physical environments, researchers from Drexel’s School of Biomedical Engineering found a surprising result: VR-based learning showed the highest neural efficiency, a measure of how much brain activity is required to complete a unit of task performance. The finding, published in the journal Sensors, suggests that virtual reality may foster more efficient learning than real-world environments.
During the study, 30 young adults spent approximately 60 minutes on visuospatial tasks, which test a person’s perception of the spatial relationships among the objects in front of them, solving 3D geometric puzzles while their prefrontal cortex activity was monitored with a wearable neuroimaging sensor called functional near-infrared spectroscopy (fNIRS). This optical brain imaging tool tracks changes in cortical oxygenation in specific brain regions that correspond to changes in neuronal activation. The fNIRS sensor recorded participants’ brain activity continuously throughout the session as they worked through a carefully designed protocol of tasks across the three presentation media.
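For readers curious about what the sensor actually measures: fNIRS converts small changes in near-infrared light detected at the scalp into changes in oxygenated and deoxygenated hemoglobin using the modified Beer-Lambert law. The sketch below is a generic, simplified illustration of that conversion, not the study’s actual processing pipeline; the extinction coefficients, pathlength factors and optical-density values are placeholder numbers chosen only for demonstration.

```python
# Generic sketch of the modified Beer-Lambert conversion used in fNIRS.
# Not the authors' pipeline; all numbers below are illustrative placeholders.
import numpy as np

def hemoglobin_changes(delta_od, ext_coeffs, distance_cm, dpf):
    """Solve for [delta HbO, delta HbR] from optical-density changes measured
    at two wavelengths: delta_od = (ext_coeffs @ [dHbO, dHbR]) * distance * dpf."""
    effective_path = ext_coeffs * distance_cm * dpf[:, None]  # per-wavelength scaling
    return np.linalg.solve(effective_path, delta_od)

# Placeholder extinction coefficients (rows: ~760 nm, ~850 nm; columns: HbO, HbR).
ext = np.array([[0.15, 0.38],
                [0.25, 0.16]])
delta_od = np.array([0.010, 0.018])   # made-up optical-density changes
dpf = np.array([6.0, 5.5])            # assumed differential pathlength factors
d_hbo, d_hbr = hemoglobin_changes(delta_od, ext, distance_cm=3.0, dpf=dpf)
print(d_hbo, d_hbr)  # a rise in HbO alongside a drop in HbR indicates activation
```

A rise in oxygenated hemoglobin paired with a drop in deoxygenated hemoglobin over a region is the signature of increased neuronal activation that the sensor tracked in the prefrontal cortex.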
“The combined analysis of task-related brain activity and behavioral performance provides a more nuanced assessment, and here, results suggest that for this cognitive task, VR reduced the mental load needed to complete tasks,” said senior author Hasan Ayaz, PhD, an associate professor in the School of Biomedical Engineering, Science and Health Systems. “This implies that VR furnishes more intelligible 3D visual cues, facilitating better problem inspection and solution evaluation.”
Participants solved the puzzles faster and more accurately in VR than in the real-world or computer-screen environments, while expending comparable mental effort. The authors suggest this benefit might come from the augmented feedback VR provides in the form of audio and visual cues. Participants made more errors and spent more time rotating the puzzle while solving problems in the real-world environment, which the authors suggest led to lower neural efficiency.
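Neural efficiency combines exactly these two ingredients: behavioral performance and the brain activity expended to achieve it. The snippet below is a minimal sketch of one common way to compute such a score, standardizing both measures and taking their difference; the formula and the per-condition numbers are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of a standardized efficiency score: higher performance with
# lower (or equal) brain activation yields a higher value. Illustrative only;
# not necessarily the exact formulation used in the Sensors paper.
import numpy as np

def zscore(x):
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std(ddof=1)

def neural_efficiency(performance, activation):
    """Signed distance from the 'performance equals effort' diagonal."""
    return (zscore(performance) - zscore(activation)) / np.sqrt(2)

# Hypothetical per-condition group means: VR, 2-D screen, physical environment.
accuracy   = [0.92, 0.85, 0.80]   # made-up task accuracy
activation = [0.50, 0.52, 0.55]   # made-up fNIRS-derived prefrontal activation
print(neural_efficiency(accuracy, activation))
# With comparable effort across conditions, the VR entry scores highest
# because it pairs the best performance with no extra brain activity.
```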
“Although interaction is key to learning, taking more time than needed trying to solve a challenge can be exhausting and may encourage someone to give up before it’s completed,” said Ayaz. “VR’s capacity to create immersive spaces, complete with better mental imagery and visual cues for learning, may in some cases be a better option than traditional real-world or computer screen-based environments.”
The authors argue that such neuroergonomic evaluation, which uses real-time brain activity from the prefrontal cortex during cognitive tasks, is a valuable approach to studying complex human-machine systems, and that these findings may help experts develop VR-based STEM learning and other training materials. Neural efficiency could also serve as a way to evaluate instructional materials and teaching approaches, and potentially to personalize information delivery for each student’s success, according to the study authors. Previous studies show that developing spatial skills, which VR can support, can improve performance in science, technology, engineering and math.
The researchers say that real-world and computer-screen applications can still be useful for spatial learning, particularly for learners who do not need visual aids.
These findings contribute to a burgeoning body of research on neuroergonomic professional training, including applications in mission-critical domains such as surgical procedures conducted in VR environments, as well as aviation training for pilots and air traffic controllers. Ayaz’s research lab specializes in neuroergonomics, researching brain health and performance optimization. A prior study by Ayaz and colleagues, focusing on flight training simulations, demonstrated that tailoring training based on individual performance and fNIRS-based cognitive load measures yielded superior outcomes compared to traditional training methods.
In addition to Ayaz, contributors to this work included graduate students Raimundo da Silva Soares Jr., of Drexel and Universidade Federal do ABC, Kevin L. Ramirez-Chavez and Altona Tufanoglu, and postdoctoral fellow Candida Barreto, all in the School of Biomedical Engineering, Science and Health Systems, as well as João Ricardo Sato of Universidade Federal do ABC.
Editor’s note: The authors note that fNIR Devices, LLC manufactures the optical brain imaging instrument and has licensed IP and know-how from Drexel University. Ayaz helped develop the technology and holds a minor share in the firm. The research was supported by the Fulbright United States-Brazil program, the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior and the São Paulo Research Foundation.