Mind control. Responses to it run the gamut, from visions of dystopian, Matrix-like mental straitjackets to the superhuman psychic abilities of Marvel Girl, Jean Grey. Using the power of the mind to manipulate reality has a definite edge to it. Researchers have now pushed the field much farther than was ever thought possible. The power to use thought to alter reality is no longer a dream of the future: medical knowledge and technology have merged to develop toys, create objects with a 3D printer, and steer a wheelchair. There’s even an open-source interface called OpenBCI that lets you control…well…whatever you want (one team of tinkerers used it to control someone else’s arms). While the technology is still in its early stages, it is well beyond the dream and hypothesis stages. Mind control is an incredible power that we will need to understand—and harness.
Tapping into the Power of the Mind
While man has long sought to use his mind to control things in the real world, the ability to manipulate the world started off quite small, with the very simplest of discoveries. In 1924, Hans Berger made the first recording of human brain activity with the electroencephalogram (EEG), laying the foundation for brain-computer interfaces, or BCIs (not to be confused with neuroprosthetics, which connect the nervous system directly to a device—think cochlear implants for the hearing impaired and artificial limbs).
The EEG captures brain activity (also called brain waves or neural oscillations) by monitoring the voltage fluctuations produced by neurons, a process that has typically involved placing sensors all over an individual’s scalp (some researchers have developed special headsets for the purpose). An EEG cannot isolate the signal of just one or two neurons; only when many neurons fire at the same time does the combined signal become strong enough for the computer to pick up. This type of brain-computer interface is considered non-invasive (if not particularly flattering to the wearer of the sensors), since there is no need for surgery or an implant. However, BCIs come in other flavors: partially invasive (like electrocochleography, a test of a patient’s response to sound in which a device is placed inside the ear) and invasive (such as when surgery is performed on the brain to insert an implant).
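To get a feel for what "picking up" a brain wave means in practice, here is a minimal sketch of the kind of signal processing an EEG pipeline performs. The data is entirely synthetic (a 10 Hz alpha rhythm buried in noise), and the sampling rate and filter settings are illustrative assumptions, not the parameters of any particular device.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(signal, low_hz, high_hz, fs):
    """Butterworth band-pass filter, applied forward and backward (zero phase)."""
    nyquist = fs / 2.0
    b, a = butter(4, [low_hz / nyquist, high_hz / nyquist], btype="band")
    return filtfilt(b, a, signal)

fs = 256  # samples per second, a common EEG sampling rate
t = np.arange(fs) / fs  # one second of time stamps

# Synthetic "recording": a 10 Hz alpha rhythm drowned in sensor noise.
rng = np.random.default_rng(0)
raw = np.sin(2 * np.pi * 10 * t) + rng.normal(scale=2.0, size=fs)

# Isolate the 8-12 Hz alpha band, where relaxation-related rhythms live.
alpha = bandpass(raw, 8.0, 12.0, fs)
```

Filtering out everything except a narrow frequency band is how a computer recovers a rhythm that is invisible in the raw, noisy trace, which is why synchronized activity from many neurons is detectable while a single neuron's signal is not.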
Invasive BCIs have produced better results than their more palatable counterparts, but they have still failed to deliver on the promise of swift, accurate mental control. People with advanced ALS have been able to take advantage of implanted sensors, although the technology is still in its infancy. One BCI product allows ALS sufferers to select one letter or word at a time by focusing on it—an extremely tedious operation that demands enormous patience and remains extremely limiting. For example, when Intel’s team tried to help Stephen Hawking compensate for his declining ability to communicate, a BCI was one of the early trials that ended up in the landfill: while the link worked fine for the researchers, it could not establish a strong enough signal from Hawking to be practical. (Incidentally, Hawking uses an infrared switch that reads his cheek movements to produce text on his computer.)
However, the quest to connect mind and matter has taken a different route, one that began back in the 1800s. It was then that Emil du Bois-Reymond (the father of electrophysiology, most widely remembered as the man with the twitching frog experiment) realized that the skin was electrically active. A number of researchers, including the founder of analytical psychology, Carl Jung, took advantage of this fact to further their research. The umbrella term electrodermal activity, or EDA, covers the active and passive electrical properties of the skin and is commonly associated with measuring the skin’s conductance response and sweat levels. Researchers have found that the skin provides a window into an individual’s psychological or physiological state. EDA is a common component of polygraph tests, which, incidentally, are still being used by some organizations.
This brings us to the research area of electrodermal potential (EDP), a subset of EDA. Researchers have recently shown that “EDP can be used to monitor human brain activities”: in this study, the researchers compared scalp-acquired EEG readings (as in the earlier BCIs) with EDP readings taken from the body in identifying states of relaxation and attention. The EDP measurements achieved a classification accuracy of 84%—surprisingly close to the scalp-acquired EEG’s range of 84%–89%. This means that measuring cognitive activity does not require a powerful headset, tons of sensors, or expensive clinical tools to acquire information about brain processing—it can be measured with a simple device that rests against the skin.
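What does a "classification accuracy" figure like 84% actually mean here? A rough sketch: extract a feature from the signal, pick a decision threshold, and count how often the prediction matches the true mental state. Everything below is a toy illustration with synthetic numbers; the feature (an alpha/beta band-power ratio, which tends to be higher during relaxation than focused attention) and the class statistics are assumptions, not data from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical feature values for 100 trials of each mental state.
# These are synthetic stand-ins, not real EEG or EDP measurements.
relaxed = rng.normal(loc=1.8, scale=0.4, size=100)    # label 1: relaxation
attentive = rng.normal(loc=1.0, scale=0.4, size=100)  # label 0: attention

features = np.concatenate([relaxed, attentive])
labels = np.concatenate([np.ones(100), np.zeros(100)])

# A one-parameter classifier: predict "relaxed" when the feature
# exceeds a threshold halfway between the two class means.
threshold = 1.4
predictions = (features > threshold).astype(float)

# Accuracy = fraction of trials where the prediction matches the label.
accuracy = np.mean(predictions == labels)
print(f"classification accuracy: {accuracy:.0%}")
```

Real studies use richer features and cross-validated classifiers, but the accounting is the same: accuracy is simply the fraction of trials labeled correctly, so 84% means the device guessed the wearer's state right about five times out of six.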
This is exactly what Freer Logic has done with its BodyWave technology, which monitors brain waves through the body. A dry-sensor device placed on the body captures brain wave patterns through the skin and then transmits the data for signal processing. The company’s current algorithms can monitor for drowsiness, attention, stress, anxiety, and peak performance.
So where can this new technology take us? New forms of human-computer interface in voice- and brain-controlled devices will become dominant as the sheer number of smart objects and apps grows into the billions. This is where the Internet of Things (IoT) becomes more than just an aggregate of connected sensors and devices. You will be able to adjust your IoT thermostat, check that you locked your car, and even place an order for more milk while resting comfortably in your favorite chair. Does that still seem like too much work for one person? The advent of AI assistants like Siri, Viv, and Google Assistant has shown that people can successfully interact with computerized intelligence and gain significant benefits.
Eventually, we will have our own ambient AI assistants, much like Iron Man’s Jarvis, interfacing with our devices, apps, and other AI agents. We will be able to communicate with our assistants through speech, if we choose, or, as the technology advances, through brain-computer interfaces, which will likely be much faster. The future of humanity is transhumanism: we will augment ourselves to obtain superhuman abilities. Our intellect, our senses, and our physical capabilities can all be vastly improved. Technologies that link our thoughts directly to the outside world will allow us to control the environment around us, including autonomous cars, robots, lights, and even heavy machinery.
The Mind, the Final Frontier
Enhancing the human experience is the ultimate goal of BCI, but it is not without pitfalls, including social, ethical, and privacy concerns. After all, scientists agree that the brain is tremendously complex. It is also the last aspect of ourselves that remains truly private. As we go about our days, we are endlessly watched by cameras, our social lives are examined by media companies, and our purchases are sifted through for insights. Computerized brain interfaces can not only identify what we are focusing on but also read our responses to situations. If these personal responses—our thoughts, if you will—can be read by computers, then questions of privacy, personhood, and free will naturally arise. There is potential for abuse, not only by individuals and businesses but also by governments, especially those that already keep a tight rein on their citizens. Like all technologies, a BCI that taps into our cognitive power is a tool. It can be used for benevolent purposes or for nefarious ones. How will you wield it?