Of the many frustrations that come with a serious loss of motor function, the difficulty of communicating must surely be among the worst. The tech world has not offered much to people affected by conditions like locked-in syndrome, ALS and severe stroke, but the startup Cognixion is aiming to change that with a new form of brain monitoring paired with a modern interface, one that lets users converse and interact simply and quickly.
The company’s ONE headset tracks brain activity closely enough that the wearer can direct a cursor, shown on a visor-like display, in multiple directions, or select from menus and options. No physical movement is required, and with the help of modern voice interfaces such as Alexa, the user can not only communicate efficiently but freely access the kinds of information and content most people take for granted.
But this is not a miracle machine, and it is not a silver bullet. Here’s how it started.
Decades-old brain technology
Everyone with motor impairments has different needs and capabilities, and a variety of assistive technologies exist to meet them. But many of these devices and interfaces are years or decades old: medical equipment that hasn’t been updated for the era of smartphones and high-speed mobile connections.
Some of the most dated interfaces, unfortunately, are used by those with the most serious limitations: people whose movement is restricted to their head, face, eyes, or even just a single eyelid, like Jean-Dominique Bauby, the famous author of “The Diving Bell and the Butterfly.”
One tool in the toolbox is the electroencephalogram, or EEG, which detects brain activity by recording electrical signals through patches on the scalp. But while EEGs are useful in medicine and research in many ways, they are noisy and imprecise: better at showing which broad areas of the brain are active than at resolving, say, a particular sub-region of the sensory cortex. And of course you have to wear a shower cap wired with electrodes (often slick with conductive gel), not the kind of thing anyone wants to wear for more than an hour, let alone every day.
Yet among those with the most serious physical disabilities, cognition is often unimpaired, as EEG studies have indeed helped demonstrate. That made Cognixion co-founder and CEO Andreas Forsland curious about further possibilities for the venerable technology: “Could a brain-computer interface using EEG be a viable communication system?”
He first used EEG for assistive purposes in a research study some years ago. The team was exploring alternative ways for a person to control an on-screen cursor, among them an accelerometer for detecting head movements, and tried integrating EEG readings as another signal. But it was far from a success.
He ran into difficulties: “The EEGs used today are no good for a read-only system; consumer headsets have slow sample rates and aren’t accurate enough for a real-time interface. The best BCIs are in a lab, hooked up to wet electrodes; it’s messy, it’s really a non-starter. So how do we replicate that with dry, inert electrodes? We’re trying to solve some very hard engineering problems here.”
The limitations, Forsland and his colleagues found, lay not so much with the EEG itself as with the way it is used. This type of brain monitoring is meant for diagnosis and study, not real-time feedback; it would be like entering a tractor in a drag race. Not only do EEGs often take slow, thorough readings of many brain regions that can last several seconds, but the signals they produce are analyzed with dated statistical methods. So Cognixion began by questioning both practices.
Improving the scan rate is more complicated than simply overclocking a sensor. Activity in the brain must be inferred from a certain amount of collected data, and that data is normally gathered passively. So Forsland’s team added an active element: a rhythmic stimulus that is echoed back in the brain’s activity, but changed slightly depending on the brain’s state, a little like echolocation.
They detect these signals with a custom set of six EEG channels over the visual cortex (at the back of the head), and use machine learning models to interpret the incoming data. By running a compact neural network locally on an iPhone, something that wasn’t really possible a few years ago, the system can not only process the signal in short order but make accurate predictions, allowing faster, smoother conversations.
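The article doesn’t spell out Cognixion’s algorithm, but the rhythmic-stimulus approach described above resembles classic steady-state evoked-potential detection: each on-screen option pulses at its own rate, and the rate the user attends to dominates the visual-cortex signal. Here is a minimal sketch of that idea on synthetic data; the channel count, sampling rate, and candidate frequencies are illustrative assumptions, not Cognixion’s values.

```python
import math
import random

def detect_attended_frequency(eeg, fs, candidate_freqs):
    """Pick the candidate stimulus frequency with the most power in the
    averaged EEG channels, via a sine/cosine correlation at each rate.

    eeg: list of channels, each a list of samples
    fs: sampling rate in Hz
    """
    n = len(eeg[0])
    # Average the channels so a response shared across them stands out
    # while uncorrelated noise partially cancels.
    mixed = [sum(ch[i] for ch in eeg) / len(eeg) for i in range(n)]

    def power_at(f):
        s = sum(x * math.sin(2 * math.pi * f * i / fs) for i, x in enumerate(mixed))
        c = sum(x * math.cos(2 * math.pi * f * i / fs) for i, x in enumerate(mixed))
        return s * s + c * c

    return max(candidate_freqs, key=power_at)

# Synthetic demo: six noisy channels, one second at 256 Hz, with a shared
# 12 Hz response mixed in. All numbers are made up for illustration.
random.seed(0)
fs, n = 256, 256
eeg = [[random.gauss(0, 1) + 2.0 * math.sin(2 * math.pi * 12 * i / fs)
        for i in range(n)] for _ in range(6)]
print(detect_attended_frequency(eeg, fs, [8.0, 10.0, 12.0, 15.0]))  # → 12.0
```

A real system would of course classify much shorter windows with a trained model rather than a raw correlation, which is where the on-device neural network comes in.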
The result is sub-second latency with 95–100 percent accuracy, in a wireless headset powered by a mobile phone. “The speed, accuracy and reliability are reaching commercial scale; we can match the best in class among current EEG paradigms,” Forsland said.
Dr. William Goldie, a clinical neurologist who has used and studied EEG and other brain monitoring techniques for decades (and who is voluntarily helping Cognixion develop and test the headset), offered a positive assessment of the technology.
“There’s absolutely evidence that brainwave activity responds to thought patterns in predictable ways,” he said, noting that this type of stimulation and response was studied years ago. “It was fascinating, but back then it was in the realm of mystery and magic. Now it’s being revisited with these specialized technologies and modern computing. To me, this is an area that’s opening up in a way I think could be clinically effective.”
BCI, meet UI
“We’re a UI company first,” Forsland told me. And indeed, even such a step forward in neural interfaces, as he later explained, means little if it can’t be applied to the problem at hand: helping people with serious motor impairment express themselves quickly and easily.
Sadly, the “competition” is not hard to improve on: think of sip-and-puff tubes and switches that let users laboriously move a cursor right, up a little, up a little more, then click: a letter! Gaze tracking is certainly a major improvement over this, but it isn’t always an option (the eyes don’t always work as well as one would like), and the best eye-tracking solutions (such as a Tobii Dynavox tablet) aren’t portable.
Why shouldn’t these interfaces be as modern and fluid as any other? The team set out to build a UI with exactly that, and their next-generation EEG capability, in mind.
Their solution takes bits from the old paradigm and combines them with modern virtual assistants and a radial design that prioritizes quick responses and common needs. It all runs in an app on an iPhone, whose display is reflected in the visor, acting as both a heads-up display for the wearer and an outward-facing display for others.
Within easy reach of, if not a single thought, then at least a moment of concentration or a tilt of the head, are everyday questions and responses: yes, no, thank you and so on. Then there are slots for prepared speech: names, menu commands and the like. And then there’s a keyboard with word- and sentence-level prediction, which lets common words be selected without spelling them out.
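The article doesn’t describe how Cognixion’s prediction works, but word-level prediction in its simplest form is just ranking a vocabulary by frequency among words that match the typed prefix, so one selection replaces many keystrokes. A toy sketch of that idea; the vocabulary and frequencies here are invented for the demo.

```python
from collections import Counter

class PrefixPredictor:
    """Suggest completions for a typed prefix, most frequent words first."""

    def __init__(self, corpus_words):
        # Count how often each word appears in some sample text.
        self.freq = Counter(corpus_words)

    def suggest(self, prefix, k=3):
        matches = [w for w in self.freq if w.startswith(prefix)]
        # Rank by descending frequency; ties keep first-seen order.
        return sorted(matches, key=lambda w: -self.freq[w])[:k]

# Invented mini-corpus standing in for a user's message history.
corpus = "thank you thanks the the the that music please yes no".split()
p = PrefixPredictor(corpus)
print(p.suggest("th"))  # → ['the', 'thank', 'thanks']
```

A production keyboard would layer sentence-level context (e.g. an n-gram or neural language model) on top of this, but the interaction pattern, type a little and pick from ranked candidates, is the same.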
“We’ve tested the system with people who rely on switches, for whom it can take 30 minutes to make two selections. We put the headset on a person with cerebral palsy, and he typed his name and played a game within two minutes,” Forsland said. “It was ridiculous; everybody was crying.”
There is something of a learning curve, Goldie noted. “When I put it on, I found it recognizes patterns and moves through them, but it’s also teaching the patterns to me. You’re training the system, and it’s training you; it’s a feedback loop.”
“I can be the loudest person in the room”
One person who has found it extremely useful is Chris Benedict, a DJ, public speaker and disability advocate who himself has dyskinetic cerebral palsy. The condition limits his movements and ability to speak, but it doesn’t stop him from spinning (digital) records at various engagements, or from explaining his experience with Cognixion’s ONE headset over email. (And you can see him using it in person in the video above.)
“Even though it’s not a tool I need all the time, it definitely helps to support my communication,” he told me. “Especially when I need to respond quickly, or am somewhere noisy, which happens a lot when you’re a DJ. If I pair it with a Bluetooth speaker, I can be the loudest person in the room.” (He always has a speaker on hand, because “you never know when you might need some music.”)
The benefits the headset offers give some idea of what’s missing from existing assistive technology (and what many people take for granted).
“I can use it to communicate, but at the same time I can make eye contact with the person I’m talking to, because of the visor. I don’t have to stare at a screen that sits between me and someone else. It really helps me connect with people,” Benedict explained.
“Because it’s a headset, I don’t have to worry about getting it in and out of places, and there’s no extra bulk on my chair that could be damaged in a doorway. The headset is balanced too, so it doesn’t tip my head backward or forward or strain my neck,” he continued. “When I first set it up, it calibrated for me, measuring my personal range of motion so that the keyboard and options fit the screen specifically for me. It can be recalibrated at any time, which is important because my movement isn’t the same every day.”
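The calibration Benedict describes, measuring a personal range of motion so the interface fits it, amounts to remapping whatever motion a user has onto the full control surface. A minimal sketch of such a remapping; the angle values are hypothetical, since the device’s actual method isn’t described.

```python
def make_calibration(min_angle, max_angle):
    """Map a user's measured head-yaw range (in degrees) to a 0..1 cursor
    position, so the whole keyboard is reachable within *their* motion.
    The range would be measured during setup; these are assumed numbers."""
    span = max_angle - min_angle

    def to_cursor(angle):
        x = (angle - min_angle) / span
        return min(1.0, max(0.0, x))  # clamp motion outside the calibrated range

    return to_cursor

cursor = make_calibration(-10.0, 20.0)  # a narrow, asymmetric range
print(cursor(5.0))   # → 0.5 (middle of this user's range)
print(cursor(40.0))  # → 1.0 (clamped at the edge)
```

Because the mapping is rebuilt from fresh measurements, recalibrating on a day with less mobility just produces a new, narrower `make_calibration` without changing anything else in the UI.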
Alexa, which has been enormously helpful to many kinds of people with disabilities thanks to its low cost and wide range of compatible devices, is also part of the Cognixion interface, something Benedict appreciates, having himself adopted the assistant for smart home and other purposes. “With other systems that’s not something you can do, or if it is an option, it’s really complicated,” he said.
As Benedict demonstrates, there are people for whom a device like Cognixion’s matters a great deal, and the hope is that it will be embraced as part of the necessarily diverse ecosystem of assistive technology.
Forsland said the company is working closely with the community, from users to clinical experts like Goldie and speech therapists, to make the ONE headset as good as it can be. But the hurdle, as with many devices in this class, is actually getting it onto people’s heads, financially and logistically speaking.
The company is applying for FDA clearance so that the cost of the headset (which, being driven by a phone, won’t be as high as one with an integrated screen and processor) can be covered by insurance. In the meantime, it is working with clinical and corporate labs doing neurological and psychological research; in other words, the places where you’d otherwise find a bulky, cumbersome EEG setup.
The company has raised funding and is looking for more (hardware development and medical work aren’t cheap), and has also collected several grants.
The ONE headset may still be a few years from widespread use (the FDA is never in a hurry), but that gives the company time to refine the device and incorporate new advances. Unlike many other assistive devices, a switch or joystick for instance, this one is largely software-limited, meaning it will improve substantially with better algorithms and UI work. While many wait for companies like Neuralink to bring a brain-computer interface into the modern era, Cognixion has already built one for the group of people who stand to benefit from it most.
You can learn more about the Cognixion ONE headset and sign up for updates on the company’s site.