A team of scientists and engineers at the University of Minnesota is giving new meaning to the old adage "mind over matter." Led by Bin He, Ph.D., director of the Biomedical Functional Imaging and Neuroengineering Laboratory, the team has created a non-invasive brain-computer interface (BCI) that could one day restore mobility and independence to individuals with amputated limbs, paralysis, and other impairments that prevent or limit normal movement. With the help of this interface, volunteers have been able to precisely control the flight of both simulated and small model helicopters using only their minds.
While mind-reading sounds more like science fiction than science fact, researchers have been pursuing this type of technology for several decades. Recent advances have allowed quadriplegic patients to control a wheelchair, eat chocolate, and drink coffee, all without lifting a finger. The most successful BCIs developed so far are those that rely on electrodes surgically implanted in the brain. This is because the electrical activity generated by a single thought is extremely weak: the farther a sensor sits from the source of that signal, the more likely the signal is to be drowned out by the brain's steady hum of background activity.
The drawback is that this approach is incredibly invasive. The medical risks associated with brain surgery and chronic brain implants are not insignificant, particularly for individuals whose health is already compromised by an injury or paralysis. Now imagine if a non-invasive approach could yield the same crisp signal, making it possible for individuals to navigate and interact with the environment without undergoing brain surgery. For He and his colleagues, that is the ultimate goal.
In 2011, He’s team showed that volunteers outfitted with a specially designed cap containing electroencephalography (EEG) sensors could fly a virtual helicopter in real time using only their minds.
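To make the idea of EEG-based control concrete, here is a minimal illustrative sketch of one common approach in non-invasive BCIs: measuring power in the sensorimotor "mu" rhythm (roughly 8–12 Hz) over the left and right motor cortex, and turning the left/right asymmetry into a steering command. The channel names (C3, C4), sampling rate, band limits, threshold, and command mapping below are all assumptions chosen for illustration, not details of He's actual system.

```python
import numpy as np

FS = 250          # sampling rate in Hz (assumed; typical for research EEG amplifiers)
MU_BAND = (8, 12) # sensorimotor "mu" rhythm band in Hz (illustrative choice)

def band_power(signal, fs, band):
    """Average spectral power of `signal` within `band` (Hz), via the FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def steer_command(c3, c4, threshold=0.2):
    """Map mu-band lateralization over motor cortex (C3 vs. C4) to a command.

    Imagining right-hand movement suppresses mu power over the left
    hemisphere (electrode C3), and vice versa, so the sign of the
    asymmetry picks a direction. The threshold and the command names
    are illustrative only.
    """
    p3 = band_power(c3, FS, MU_BAND)
    p4 = band_power(c4, FS, MU_BAND)
    asym = (p4 - p3) / (p4 + p3)  # > 0: mu suppressed at C3 -> right-hand imagery
    if asym > threshold:
        return "RIGHT"
    if asym < -threshold:
        return "LEFT"
    return "HOVER"

# Synthetic demo: one second of noisy signal, with a strong 10 Hz rhythm on C4
# and a suppressed one on C3, mimicking right-hand motor imagery.
rng = np.random.default_rng(0)
t = np.arange(FS) / FS
c3 = 0.2 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.1, FS)
c4 = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.1, FS)
print(steer_command(c3, c4))  # strong C4 mu, weak C3 mu -> "RIGHT"
```

In a real system, commands like these would be recomputed many times per second from a sliding window of EEG samples, which is what allows the kind of continuous, real-time control described here.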
For its most recent experiment, He’s team has upped the ante, replacing the computer-simulated helicopters with small, remote-controlled AR.Drone quadcopters. In this experiment, volunteers were required to fly quickly and continuously through two suspended foam rings as many times as possible within four minutes. The volunteers guided the quadcopter based on video feedback from a forward-facing camera mounted on its hull. “We wanted to show that it was possible to control an actual device, moving in real time and space,” says He.

Edited from Mind-controlled devices reveal future possibilities.