Neural Interfaces to the Internet of Things

My previous post brought us to the frontiers of medical technology, which is fast blurring the border between digital technology and the human body. As that technological boundary blurs, so does the boundary between medical applications of digital technology and applications of the same technology for the able-bodied. In this post we look at developments in assistive technology, prosthetics and the neural interfaces to prosthetics, and at how they may lead back to consumer applications.

One of my favorite experiences at CableLabs was getting to know Tom Wlodkowski, vice president of accessibility and multicultural technology and product at Comcast. Tom spearheads Comcast’s efforts to comply with ADA (Americans with Disabilities Act) requirements, such as video description services, but has gone beyond that into a voice-enabled navigation environment. Tom hit the nail on the head when he said, “The talking guide is as much about usability as it is about accessibility.” As I like to say, navigation is a design problem.

Vision Assistance

Similar options for sighted but paralyzed users have emerged in the use of eye-tracking technology. Industry leader Tobii has products for the disabled community, but also has diversified into products for hands-free professionals and for gaming applications.

The Smith-Kettlewell Eye Research Institute operates a bewildering array of research projects, ranging from video description of media to acoustic cues for sight-impaired wayfinding. Even the title of one project hints at the breadth of capabilities involved. Known as COVA (Choreographed and Orchestrated Video Annotation), it involves “different aspects of a coordinated media presentation coming from different networked devices, and presenting adaptive strategies depending on which abilities a user can leverage to overcome a particular disability.”

And independent streams of research are driving down the cost of solutions. One example is a solo researcher’s eye-gaze-controlled wheelchair; another is a Kickstarter-funded product called Taptool, an inexpensive but clever fingertip cap that improves accessibility through better accuracy on touch screens.

Elsewhere in the field of eyesight, some artificial or bionic eyes are coming into clinical trials. From Monash University, “Patients who have lost their sight will have tiny ‘ceramic tiles’ implanted into their brain’s visual cortex. The device bypasses the normal visual pathway, unlike the other bionic eyes in development, which rely on an implant in the retina.”

The Johns Hopkins University Applied Physics Laboratory (APL) is developing a next-generation retinal prosthesis system. It includes glasses with embedded vision and eye tracking sensors, enabling navigation around obstacles or finding objects. “This information will be distilled into a format that can be projected into the retinal prosthesis, bypassing the damaged rods and cones in the retina,” APL says.

The EagleEyes project at Boston College is tackling the tough category of individuals who have limited or no intentional muscular control (due to stroke, spinal muscular atrophy and severe cerebral palsy, for example) but do have control of their eye movement. With a set of five electrodes placed around the eyes, the project literature says, “In many cases using EagleEyes is the first time these individuals have been able to establish cause and effect, communicate and act independently of anyone assisting them.”
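
EagleEyes’ own signal processing isn’t detailed here, but the underlying idea of turning electrode voltages around the eyes (an electrooculogram, or EOG) into a screen cursor can be sketched roughly as below. The function names and calibration values are illustrative assumptions, not the project’s actual implementation.

```python
def read_electrodes():
    """Placeholder for a driver call returning microvolt readings from the
    electrodes placed around the eyes: 'left', 'right', 'above', 'below'."""
    raise NotImplementedError

def eog_to_cursor(sample, gain_x=4.0, gain_y=4.0, screen=(1920, 1080)):
    # Horizontal gaze shows up as a voltage difference between the electrodes
    # beside the eyes; vertical gaze between those above and below.
    dx = (sample["right"] - sample["left"]) * gain_x
    dy = (sample["above"] - sample["below"]) * gain_y
    # Treat the screen center as the neutral gaze point and clamp to bounds.
    x = min(max(screen[0] / 2 + dx, 0), screen[0] - 1)
    y = min(max(screen[1] / 2 - dy, 0), screen[1] - 1)
    return int(x), int(y)
```

With a cursor driven this way, dwelling on a target for a second or two typically stands in for a mouse click, which is how cause-and-effect games and simple communication boards become reachable.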

Expanding BCI Technology

More generally, the field of artificial limbs and organs seems to have accelerated in the early 21st century. Some of the increased research was spurred by a spike in injuries to soldiers and civilians due to improvised explosive devices in Iraq and elsewhere.

Researchers at Caltech and the Keck institute have implanted a neuroprosthetic in a part of the brain that controls the intent to move, with the goal of producing more natural and fluid motions. In the research, a quadriplegic patient with the implant was able to perform a handshake gesture and play “rock, paper, scissors” using a wirelessly connected robotic arm.

Prosthetic device company Ossur has developed a sensor implant that allows amputees to move prosthetic limbs with their thoughts. Sensors are surgically placed in residual muscle tissue, and prosthetic movement is triggered via a receiver. A key digital feature of the limbs is enough onboard processing power for real-time learning, adjusting to the user’s speed and the terrain. Users are said to require less conscious attention to each movement over time.

And while past military actions may have spurred the need for better prosthetics, military research in the same sensor and brainwave territory is pushing the technology boundaries in new directions for the able-bodied war fighter.

The Cognitive Technology Threat Warning System (CT2WS) project of the Defense Advanced Research Projects Agency is intended to improve the US Army’s threat detection capabilities. The DARPA system combines EEG brainwave scanners, 120-megapixel cameras and multiple computers running cognitive visual processing algorithms. The camera scans the battlefield, and the images are run through visual processing algorithms to detect possible threats. Potential threats are then presented to the soldier for assessment or action; but the long-range plan seems to be to shortcut the decision-making and automate responses.
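
Public descriptions of CT2WS stop at the block-diagram level, but the flow they describe can be sketched as below. The detect_candidates(), present_to_soldier() and brain_flags_threat() functions are placeholders standing in for the camera analytics, the display and the EEG stage respectively; this is an illustration of the architecture, not DARPA’s code.

```python
def detect_candidates(frame):
    """Placeholder for the cognitive visual processing algorithms that scan a
    120-megapixel frame and return image regions that might contain threats."""
    raise NotImplementedError

def present_to_soldier(region):
    """Placeholder: flash the candidate region on the soldier's display."""
    raise NotImplementedError

def brain_flags_threat(eeg_response):
    """Placeholder: decide from the soldier's EEG response whether the brain
    registered the candidate as a threat (see the P300 discussion below)."""
    raise NotImplementedError

def ct2ws_style_pipeline(frames, read_eeg_response):
    # Camera -> algorithmic filtering -> human, EEG-assisted confirmation.
    alerts = []
    for frame in frames:
        for region in detect_candidates(frame):
            present_to_soldier(region)
            if brain_flags_threat(read_eeg_response()):
                alerts.append(region)
    return alerts
```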

Such automation would leverage a brainwave characteristic called the P300, activity that signals recognition of a visual object some 300 milliseconds after the stimulus, far faster than the several seconds a conscious response requires. DARPA apparently has a BCI (brain-computer interface) project to leverage the P300 to speed intelligence analysts’ ability to sort through satellite imagery.
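
As a rough illustration of how a P300-based triage system might separate images the brain flagged from images it ignored, the EEG following each stimulus can be cut into epochs and the average amplitude in a window around 300 milliseconds compared across conditions. The sketch below assumes a single, already-filtered EEG channel and a 250 Hz sampling rate; it is a textbook-style example, not DARPA’s pipeline.

```python
import numpy as np

def p300_score(eeg, stim_samples, fs=250, window=(0.25, 0.45)):
    """Average post-stimulus amplitude in an assumed P300 window (250-450 ms).

    eeg:          1-D array holding one preprocessed EEG channel (microvolts)
    stim_samples: sample indices at which each image was flashed
    fs:           sampling rate in Hz
    """
    start, stop = int(window[0] * fs), int(window[1] * fs)
    epochs = [eeg[s + start:s + stop] for s in stim_samples
              if s + stop <= len(eeg)]
    # Averaging across repeated presentations suppresses background EEG and
    # leaves the event-related potential, including any P300 peak.
    return float(np.mean(epochs, axis=0).mean())
```

A higher score for one set of images than another suggests the viewer’s brain registered those images as significant, hundreds of milliseconds before any conscious report could be made.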

The brain’s interface to these systems leverages the rapidly advancing technology of electroencephalogram (EEG) helmets. Guger Technologies (or call them by the cooler g.tec) offers consumer-grade products with some interesting features. The company’s intendiX system comes with a set of “user-ready, BCI applications” such as the SPELLER application, which enables a paralyzed user to concentrate on characters from an on-screen keyboard in order to spell out words. The company claims users learn within 10 minutes how to spell 5 or 10 characters per minute, and then continue to improve.
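
Whether intendiX works this way internally isn’t stated, but the classic row-and-column P300 speller is easy to sketch: the rows and columns of the on-screen keyboard flash in sequence, and the target character sits at the intersection of the row and column that evoked the strongest P300-like responses (scored, for instance, with the illustrative p300_score() above).

```python
import numpy as np

# A generic 6x6 speller matrix; the layout is an assumption for illustration.
MATRIX = [list("ABCDEF"), list("GHIJKL"), list("MNOPQR"),
          list("STUVWX"), list("YZ1234"), list("56789_")]

def pick_character(row_scores, col_scores):
    """row_scores / col_scores: one P300 score per flashed row and column.
    The spelled character is where the strongest responses intersect."""
    r = int(np.argmax(row_scores))
    c = int(np.argmax(col_scores))
    return MATRIX[r][c]

# Example: if row 2 and column 4 produced the largest responses,
# pick_character() returns MATRIX[2][4] == "Q".
```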

The intendiX Painting application enables a user to draw onscreen, a feature intended to draw in younger users. And another Guger application, recoveriX, uses the same technology to have users imagine certain kinds of movements, which activates measurable brain regions. It turns out that imagining the movement triggers greater plasticity in the brain and leads to faster recovery from, for example, stroke-caused impairments.

Greater granularity in measurement leads to greater granularity in tracking. The Enobio EEG platform comes in models with 8, 20 or 32 channels, each of which may suit different problems. The low-end model has limited functionality but is useful for collecting and uploading data from patients who are more comfortable outside a lab setting, while the high end gets into measuring “research-class brain signals and triaxial accelerometry.”

If you prefer your brand names warmer and fuzzier than g.tec, consider the company’s mindBEAGLE system, comprising an EEG cap with active electrodes, in-ear headphones for auditory stimulation and vibrotactile stimulators attached to the patient’s body.

The mindBEAGLE uses P300 brainwaves as well, and one use is consciousness assessment and communication for patients with DOC (disorders of consciousness). The company claims that evidence indicates that “40% of patients diagnosed as vegetative are reclassified as minimally conscious when assessed by an expert team.” The mindBEAGLE’s auditory and skin sensors enable “patients who have enough cognitive functions to understand spoken messages, to use certain different mental strategies…to provide simple yes/no answers.” This is not just technology evolution; this is deeply existential, life-and-death stuff.
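
The company’s materials describe the interaction at a high level only; a rough sketch of how a vibrotactile yes/no protocol could be scored might look like the following. It reuses the illustrative p300_score() above and is not mindBEAGLE’s actual algorithm.

```python
def answer_yes_no(eeg, yes_stims, no_stims, fs=250):
    """Compare responses to the stimulator the patient attends for 'yes'
    (say, on the right wrist) against the one attended for 'no' (left wrist).

    yes_stims / no_stims: sample indices of the vibrotactile pulses from
    each stimulator, scored with the earlier p300_score() sketch.
    """
    yes_score = p300_score(eeg, yes_stims, fs=fs)
    no_score = p300_score(eeg, no_stims, fs=fs)
    # Attended stimuli evoke a larger P300 than ignored ones, so the stronger
    # averaged response is taken as the patient's answer.
    return "yes" if yes_score > no_score else "no"
```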

And so it is that you can look into not just mind-to-machine interfaces but mind-to-mind. Two years ago, researchers claimed to have demonstrated one researcher controlling the hand movements of another to fire a gun in a video game. The sender’s message triggered transcranial magnetic stimulation (TMS) that caused neurons in a specific area of the receiver’s brain to fire; that area had been identified through trial-and-error training. So there you go: the Internet of Brains?

Moving Beyond Medical Applications

The applications for this technology have moved rapidly outside the assistive technology world. Dell Computer was reported to be working on “mood-reading software,” based on technologies originally designed for users in wheelchairs. And, indeed, the migration to consumer price points opens the way for neural technologies to alter the experience of consumer software, media navigation and consumer control of the Internet of Things.

There are already DIY hardware kits for health-monitoring and EEG apps, which illustrates how many of the systems described here could migrate to consumer electronics price points before long.

Interaxon was probably the first company to show up at the Consumer Electronics Show with a brain-sensing headband. Rarely did a CES tour group go by without someone posing for a picture wearing it, accompanied by the requisite jokes about hypnosis and mind control. But meditation is its killer app.

NeuroSky provides “world class biosensors for both the mind and body,” offering a cross-platform EEG headset for as little as $79.99, along with consumer packages for meditation and brain training, component products and a developer SDK. BrainLink is another headset product, with applications ported to iOS, Android, Windows and Mac OS. MeloMind is another device, aimed at relaxation.
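
To make concrete how such a headset might plug into media navigation, here is a toy sketch of attention-gated playback. The read_attention() call and the thresholds are hypothetical stand-ins for whatever a vendor SDK actually exposes; this is not NeuroSky’s or any other company’s API.

```python
import time

def read_attention():
    """Hypothetical stand-in for a vendor SDK call returning a 0-100
    'attention' value from a consumer EEG headset."""
    raise NotImplementedError

def pause_video():
    print("pausing playback")   # placeholder for a real player control

def resume_video():
    print("resuming playback")  # placeholder for a real player control

def attention_gated_playback(pause_below=30, resume_above=60, poll_s=1.0):
    # Pause when the viewer's attention drifts, resume when it returns.
    # The gap between the two thresholds avoids flapping around one value.
    playing = True
    while True:
        level = read_attention()
        if playing and level < pause_below:
            pause_video()
            playing = False
        elif not playing and level > resume_above:
            resume_video()
            playing = True
        time.sleep(poll_s)
```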

Earlier this year, British company Flexctrl launched a BCI crowdfunding campaign for a very visually appealing headset. Extra features include a built-in gyro sensor to detect head movement, Bluetooth and Micro USB data connections and even Qi wireless charging. Alas, the Indiegogo campaign topped out at only a couple of thousand dollars.

Perhaps a better leading indicator of trends in the BCI area is the type of company applying for patents in brain technology. A market research firm recently looked at patents for neurotechnology, which are being filed at four times the rate they were during the 2000s; 1,600 were filed last year alone. One expects to find companies like Medtronic PLC patenting ways to use EEG to measure the severity of a brain lesion, or St. Jude Medical patenting methods of using brain activity to improve vision. But the explosion is driven by non-medical uses, including controlling video games with brain waves, using brainwaves to decide what music selections to play or, as evidenced in a Microsoft patent, determining whether a user is in the mood to see an advertisement.

This last reminded me that our friends at Nielsen have a whole research program called consumer neuroscience. Imagine them measuring your viewing reactions directly; that should shake up the CPM pricing structure of today’s video advertising world.

It’s easy within the media industry to make jokes about it, but there is no question that the collision of medical and consumer technology will raise important issues in both business and regulatory models. And it’s been my contention that some of these models will favor established service providers over the classic start-up.

Medical technology demands not only safety research, but also an infrastructure that complies with HIPAA privacy requirements and presents a clear case for insurance coverage of those medical uses. Therapists who use assistive technology are acutely aware of the gap between their often lower-income patients and the high costs of the technology.

One association tracking Medicare reimbursement of assistive technology commented recently, “Although Medicare did not address ‘capped rental’ or ‘eye tracking accessories’ in the draft revised NCD (national coverage determination), it is essential that both topics be resolved as soon as possible…We must tell Medicare that we want the draft to be made a final guideline and we want it done as soon as possible. The changes that were made to Medicare SGD (speech-generating device) coverage in 2014 are causing harm and will continue to do so until a new NCD is effective.” These comments reflect the confluence of government, insurance and health providers in creating a complex framework for a far-reaching set of technologies.

It may be that entertainment or productivity applications will not rise to the level of these medical frameworks, but there’s still a set of business constraints, whether privacy or liability or others, that will have to be addressed. Just as touchscreens have changed the way consumers navigate and play back video services, these new brain interfaces may very well launch another wave of innovation in the media and entertainment business.

Anish Koirala
Anish is a gaming writer and tech expert, specializing in the intersection of gaming culture and cutting-edge technology. With a degree in Information Management, Anish offers insightful analysis and reviews on gaming hardware, software and industry trends.