Cortical control of a tablet computer by people with paralysis (plos.org)
54 points by mdturnerphys on Nov 26, 2018 | 9 comments



This is great work and the authors should be really proud.

The hard thing about BCI research is that it has to compete with operating a joystick with your lips. Quadriplegics get incredibly good at this, to the point that they can play RPGs:

https://www.youtube.com/watch?v=qj6tFWs1bhg

Perhaps BCIs will eventually get so good that they will be superior to these joysticks. Then quads will be able to play first-person shooters :)


The problem with BCI, eye-gaze, or speech control systems is how slow they are. There can be no delay in a fast-paced game: https://www.youtube.com/watch?v=qTfa0gh2WVg


Recent New Yorker article about controlling actual robotic arms via implanted “Utah” arrays:

https://www.newyorker.com/magazine/2018/11/26/how-to-control...


I thought that implanted devices worked great at first, and then slowly stopped responding. Is this one different?


The signals do degrade with time. They appear to be using Utah arrays, which have been around for a while, so the old problems likely still exist. A second problem is that the signals themselves, assuming they are good, change as well. They're doing ~15-minute calibration sessions followed by ~40-minute activity sessions.

I think ultimately, for these things to really work, there has to be a shift in the HCI paradigm. Specifically, the interaction context needs to be structured so that the subject's behavior is highly predictable; with that in place, it becomes easier to continuously retrain the decoder online in the background without the subject being aware of it.
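
To make that concrete, here's a rough sketch (my own illustration, not the paper's algorithm) of background recalibration of a linear decoder, assuming the UI is structured enough that the intended cursor direction can be inferred at every time bin (e.g., the vector from cursor to the highlighted target). All the names and numbers are placeholders:

    # Rough sketch: silently recalibrating a linear decoder during use.
    # Assumes each time bin yields a (neural features, inferred intent) pair.
    import numpy as np

    class OnlineLinearDecoder:
        def __init__(self, n_features, n_outputs, forgetting=0.995, ridge=1.0):
            self.W = np.zeros((n_features, n_outputs))  # decoder weights
            self.P = np.eye(n_features) / ridge         # inverse feature covariance
            self.lam = forgetting                       # <1 discounts old data as signals drift

        def predict(self, x):
            return x @ self.W                           # decoded intent, e.g. cursor velocity

        def update(self, x, y_inferred):
            # One recursive-least-squares step: fold in a single training pair.
            x = x.reshape(-1, 1)
            k = self.P @ x / (self.lam + x.T @ self.P @ x)   # gain vector
            err = y_inferred - (x.T @ self.W).ravel()        # prediction error
            self.W += k @ err.reshape(1, -1)                 # nudge weights toward the inferred intent
            self.P = (self.P - k @ x.T @ self.P) / self.lam  # update inverse covariance

    # Toy usage: 96-channel features (Utah-array-sized), 2-D cursor intent.
    rng = np.random.default_rng(0)
    dec = OnlineLinearDecoder(n_features=96, n_outputs=2)
    for _ in range(1000):
        x = rng.normal(size=96)      # neural features for this time bin (placeholder)
        y = rng.normal(size=2)       # intent inferred from the structured task (placeholder)
        dec.update(x, y)             # retrain continuously; no explicit calibration block

The forgetting factor is the point: it lets the decoder track slow drift in the recorded signals without ever pausing the user for a dedicated calibration run.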

That said, they got it to work! Always nice to see hard projects come to fruition!


This is fantastic! The surgery to implant the type of intracortical device used in this study would probably be too dangerous for my friend with advanced muscular dystrophy to undergo - has anyone read about similar results with external EEG? She's a nontechnical PC/tablet user, but I could take a week or two to work on getting a device set up for her.


There are some devices like a P300 speller: http://openvibe.inria.fr/p300-speller-xdawn/

But the big trouble is that external EEG signals must pass through the skull and scalp, which adds a non-trivial amount of noise. So the style of fine-grained readings in the original link can be difficult to achieve with high accuracy.
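
To give a feel for why P300 spellers average over many flashes to beat that noise, here's a toy sketch (mine, not the OpenViBE pipeline; sampling rate, epoch length, and template shape are all made up for illustration):

    # Toy sketch: scoring a flashed row/column in a P300 speller.
    # Averaging repeated, time-locked epochs suppresses EEG noise (~sqrt(n) SNR gain);
    # the row/column whose average best matches a P300 template picks the letter.
    import numpy as np

    def p300_score(epochs, template):
        """epochs: (n_flashes, n_samples) EEG segments time-locked to one row/column."""
        avg = epochs.mean(axis=0)                    # averaging beats single-trial noise
        avg = (avg - avg.mean()) / (avg.std() + 1e-12)
        return float(np.dot(avg, template) / len(template))

    rng = np.random.default_rng(1)
    fs, n_samples, n_flashes = 250, 200, 15          # 800 ms epochs, 15 flashes each
    t = np.arange(n_samples) / fs
    template = np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))  # idealized bump ~300 ms post-flash
    template = (template - template.mean()) / template.std()

    target = rng.normal(0, 2.0, (n_flashes, n_samples)) + template  # flashes on the attended letter
    nontarget = rng.normal(0, 2.0, (n_flashes, n_samples))          # flashes elsewhere

    print("target score    ", p300_score(target, template))   # noticeably higher
    print("non-target score", p300_score(nontarget, template))

The need to repeat each flash many times before the scores separate is exactly the slowness people complain about with scalp EEG.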


Can she move her eyes? A hybrid BCI (EEG plus eye tracking, EMG, etc.) rather than a pure EEG-based BCI can be a faster and less taxing solution for her.


Yes, she can, as well as make a “normal” range of facial gestures. She can still speak, but her voice keeps getting weaker, and mass-market solutions like Alexa don’t work very well for her.

Thanks for the idea!



