Brain-Computer Interface Enables Johns Hopkins Study Participant to Touch and Feel Holographic Objects

As part of a larger study exploring neural multiplexing and new modes of perception enabled by brain-computer interface (BCI), Johns Hopkins researchers have demonstrated that a study participant can “feel” virtual objects by integrating neural stimulation into a mixed-reality environment.

The participant in the study, Robert “Buz” Chmielewski — who previously demonstrated simultaneous control of two of the world’s most advanced prosthetic limbs through a brain-machine interface and used brain signals to feed himself with those limbs — has now demonstrated virtual tactile perception.

“All organisms rely exclusively on their sensory organs to perceive information about the world around them,” explained Mike Wolmetz, who manages the Human and Machine Intelligence program at the Johns Hopkins Applied Physics Laboratory (APL) in Laurel, Maryland. “BCI creates a new pathway to perceive information directly, in ways that are not constrained by, filtered through or aligned with our specific sensory organs.

“This demonstration gives us a very early indication of how neural interfaces may fundamentally change the way we interact with technology and perceive our natural and digital environments in the not-too-distant future.”

The research is part of the Neurally Enhanced Operations (NEO) project, funded by the Defense Advanced Research Projects Agency to investigate neural multiplexing: to what extent the brain can accomplish typical perception and control through the senses and muscles at the same time as perception and control through a BCI. Can an operator use their own hands and senses to interact with their computer while simultaneously using their brain to perceive and control other channels or dimensions?

“Right now, we’re focusing on the extreme version of neural multiplexing, to see if it is possible to have different signals either going to or coming from the same regions of the brain and still be able to interpret and act on all that information in a useful way,” said APL’s Luke Osborn, a neuroengineering researcher on the project. “Here we’re looking at how sensorimotor regions of the brain can successfully work in mixed reality when neural stimulation is part of that mix.”

Chmielewski, the study participant, is particularly well suited to help. He suffered a spinal cord injury at the age of 17 that resulted in a diagnosis of incomplete quadriplegia; he retains some motor function and sensation in his arms and hands. In 2019, he underwent a 12-hour brain surgery at the Johns Hopkins Hospital to become the first research participant with chronic microelectrode arrays implanted in both hemispheres of the brain.

“Buz has the unique combination of state-of-the-art neural implants and substantial spared sensation and control in his arms and hands,” said Pablo Celnik, director of the Physical Medicine and Rehabilitation Department at Johns Hopkins Hospital, and NEO co-investigator.

The team has assembled what it likes to call “Buz’s playground” — complete with HoloLens and tablets — where he comes up with new tasks and concepts for things to try. “We’ve been able to quickly implement and test these ideas with him,” said Francesco Tenore, APL’s NEO principal investigator, “and as a result, the team is making incidental discoveries on a regular basis, uncovering the first hints of what new modes of perception may be possible through neural stimulation.”

The first of those new modes is what the NEO team calls “virtual perception.” “Here, as one part in a series of multiplexing and virtual perception studies, Buz is actually able to touch and feel different holographic objects — virtual objects seen and manipulated through a HoloLens — in task-relevant ways,” said Matthew Fifer, a neuroprosthetics researcher and APL’s NEO technical lead.

The team has been focusing on augmenting tactile perception in its work with Buz, but Wolmetz noted the possibilities likely go well beyond touch. “Just in terms of the current paradigms, augmented vision and hearing are somewhat accessible — we can all see and hear in mixed reality. Touch has some very imperfect approaches [such as haptic gloves] that we’re trying to fundamentally improve on. We currently have no access to smell, taste or proprioception, for example. Direct neural input could change all that.

“The point,” he continued, “is that having direct access to the mind and brain could change everything, and this is just the tip of the iceberg.”

Chmielewski will be discussing his adventures in BCI on Friday, June 25, at the joint North American Neuromodulation Society and Neural Interfaces Conference 2021. The team plans to publish on its discoveries — expected, incidental and serendipitous — in the near future.

In addition to NEO, APL works across the principal components of neural interface research to find new ways to restore lost function, augment natural abilities, integrate biological and artificial intelligence, and make neurotechnologies increasingly accessible through noninvasive approaches.

The Applied Physics Laboratory, a not-for-profit division of The Johns Hopkins University, meets critical national challenges through the innovative application of science and technology. For more information, visit www.jhuapl.edu.
