Science fiction has provided early warning of many impending technologies – Star Trek communicators resemble today’s PDAs, and talking computers are becoming commonplace. Now one of science fiction’s more outlandish predictions – that we will communicate with computers through direct brain control – may soon become reality as well.
The electromagnetic field generated by the brain has been studied for generations, and functional magnetic resonance imaging (fMRI) has allowed researchers to understand, to some extent, how brain activity correlates with motor control, emotions, and even thoughts. However, while massive – and massively expensive – fMRI rigs can record an individual’s brain activity in detail, a modern consumer device needs to be far less obtrusive. Such a device would have to work from a relatively small number of sensors, and require neither conductive gel nor a head-shave!
Brain-Computer Interfaces Show Promise
A couple of pioneers have been working on BCI devices that show real promise. Tan Le, founder of Emotiv, developed sophisticated algorithms to extract signal from the chaotic patterns picked up by a relatively small and light 6-electrode headset. A user can train the headset to recognize the distinct electrical signatures accompanying particular thoughts, which might then correspond to moving a cursor or issuing some similar command.
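To make the training idea concrete, here is a toy sketch of how per-command recognition could work: record a few feature vectors (say, band-power readings from each of six electrodes) while the user thinks each command, average them into per-command “centroids,” and later classify a new reading by its nearest centroid. This is an illustrative assumption for exposition only – the function names, feature layout, and classifier are hypothetical and not Emotiv’s actual algorithm.

```python
import math

def centroid(samples):
    """Average a list of equal-length feature vectors."""
    n = len(samples)
    return [sum(col) / n for col in zip(*samples)]

def train(labelled_samples):
    """labelled_samples: {command: [feature_vector, ...]} -> per-command centroids."""
    return {cmd: centroid(vecs) for cmd, vecs in labelled_samples.items()}

def classify(model, features):
    """Return the command whose centroid is closest (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(model, key=lambda cmd: dist(model[cmd], features))

# Toy training data: two trials per command, six readings (one per electrode).
training = {
    "cursor_left":  [[0.9, 0.1, 0.4, 0.2, 0.8, 0.1],
                     [0.8, 0.2, 0.5, 0.1, 0.9, 0.2]],
    "cursor_right": [[0.1, 0.9, 0.2, 0.8, 0.1, 0.7],
                     [0.2, 0.8, 0.1, 0.9, 0.2, 0.8]],
}
model = train(training)
print(classify(model, [0.85, 0.15, 0.45, 0.15, 0.85, 0.15]))  # cursor_left
```

The sketch also illustrates why training is unavoidable for motor-style commands: because these patterns differ between individuals, the centroids must be learned fresh for every user.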
The need for training, however, creates a barrier to adoption for devices such as the Emotiv. Although training is relatively simple for primitive operations, the headset is unlikely to work reliably for non-trivial use without significant effort.
Such significant efforts are worthwhile in some circumstances – to serve the severely disabled, for instance. Last year, researchers at the University of Pittsburgh demonstrated “Hector,” a robotic arm under the control of a quadriplegic woman who was able to eat and drink using the arm. Most impressively, the woman reported that it was no longer necessary for her to think specific “commands” to move the arm – the arm moved in response to the same sort of natural impulses that we use to move our organic limbs.
This sort of fine motor control requires some invasiveness: in the Hector apparatus, an array of 96 pin electrodes was inserted into the motor cortex of the woman’s brain. Despite the willingness of today’s consumers to undergo body modifications, it’s unlikely that this will become a mainstream consumer option anytime soon. For the disabled, however, this technology will be life-transforming.
Emotional Interfaces May Be Closer Than BCI Devices
Consumer BCI devices that provide sophisticated robotic control may be a ways off, but there are some indications that a more emotional interface may be closer. Although the brain waves generated for motor control vary widely from person to person, gross emotional states and high-level states of consciousness (sleeping, waking, dreaming, etc.) generate patterns that are broadly consistent across individuals. A device that aspires only to read our emotional state and alertness can therefore operate without per-user training.
Such a device – the “Muze” – is due to be made widely available in early 2014. While Muze can be used to perform device control, its primary ambition is to monitor high-level brain states, such as focus or emotion, and react to them. This has some pretty intriguing applications – for example, it could be used as a biofeedback device to achieve greater levels of calm or concentration. Your music player could detect your mental state and generate a playlist to help you cheer up (or, perhaps, wallow in your misery).
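The “react to your mental state” idea reduces to a simple mapping from a couple of high-level readings to an action. The sketch below imagines a headset reporting `focus` and `calm` scores in the range 0 to 1 and picks a playlist mood accordingly; the state names, thresholds, and playlist labels are all illustrative assumptions, not part of any real device API.

```python
# Hypothetical emotion-aware playlist picker. Inputs are assumed
# headset readings, each normalized to the range 0..1.

def pick_playlist(focus, calm):
    if focus > 0.7:
        return "deep-work instrumental"   # sustain concentration
    if calm < 0.3:
        return "soothing ambient"         # biofeedback: nudge toward calm
    if calm > 0.7 and focus < 0.3:
        return "upbeat pop"               # relaxed but drifting: add energy
    return "neutral mix"

print(pick_playlist(focus=0.8, calm=0.5))  # deep-work instrumental
print(pick_playlist(focus=0.2, calm=0.2))  # soothing ambient
```

In a real biofeedback loop this decision would run continuously, so the interesting engineering problem is smoothing the noisy readings over time rather than the mapping itself.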
These devices promise to open up ways for us to improve our mental functioning and perhaps to further revolutionize social networking and big data. A world in which Facebook “likes” are generated automatically might not be far off, and mining the big data generated from our own brains has some amazing – though sometimes creepy – implications.