I am fascinated by attempts to link the brain, computers and mechanical limbs — and in general the science of brain-computer interfaces, or BCI. The Wired article picked up on work published by researchers at the Long Beach Veterans Affairs Medical Center and the University of California, Irvine, in the open-access research archive arXiv.
The California team’s device, and the field it represents, have a long way to go before such technology can help paralyzed people walk again, but the progress is heartening, says Cali Fidopiastis, Ph.D., director of UAB’s Interactive Simulation (iSim) Lab, which is part of the Department of Physical Therapy in the UAB School of Health Professions.
Fidopiastis has some neat work under way that involves navigating computer programs (e.g. opening folders, starting apps) with thoughts or eye blinks, and developing devices that would allow soldiers to alert their teams to danger — or to pilot drone aircraft — simply by thinking about it.
While progress is being made toward applying thought-pattern control to prosthetics, Fidopiastis says, research needs to proceed carefully because the faulty translation of such patterns into movements could be very dangerous for a disabled person walking through his or her neighborhood with the help of such a device.
The robotic braces in the above video are controlled by electrical impulses fired along nerve pathways in the brain and captured by electroencephalogram, or EEG, electrodes placed on the scalp. Programs called “classifiers” identify the user’s intention to move from the EEG signal, but they are not yet nearly sensitive enough, says Fidopiastis.
Classifiers are learning programs that recognize patterns and make decisions. If they recognize the wrong thing, however, they can generate unintended movement in an attached prosthetic. Furthermore, a classifier must be trained by the person using it. This is done by repeating over and over the movement that the prosthetic should make until the program can accurately spot the accompanying EEG signature.
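To make that training process concrete, here is a minimal sketch of the kind of classifier described above, assuming windowed multi-channel EEG recordings, simple band-power features, and a linear discriminant model from scikit-learn. The sampling rate, data shapes, and feature choice are illustrative assumptions for the example, not the setup used by the California team or in Fidopiastis’s lab.

```python
# Illustrative sketch only: a toy classifier of the kind described above,
# trained to spot the EEG signature that accompanies an intended movement.
# Data shapes, the band-power features, and the LDA model are assumptions
# made for this example, not the approach used in the study discussed here.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 256  # assumed sampling rate in Hz

def band_power_features(windows, fs=FS, band=(8, 30)):
    """Average 8-30 Hz power per channel for each EEG window.

    windows: array of shape (n_windows, n_channels, n_samples)
    returns: array of shape (n_windows, n_channels)
    """
    freqs, psd = welch(windows, fs=fs, axis=-1)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[..., mask].mean(axis=-1)

# Pretend training data: repeated trials of "intend to step" vs. "rest" --
# the same kind of repetition the training process above describes.
rng = np.random.default_rng(0)
X_windows = rng.standard_normal((200, 8, FS * 2))  # 200 two-second, 8-channel windows
y = rng.integers(0, 2, size=200)                   # 1 = intended movement, 0 = rest

clf = LinearDiscriminantAnalysis()
clf.fit(band_power_features(X_windows), y)

# At run time, each new window is classified; a "move" decision would then
# be passed along to the prosthetic's controller.
new_window = rng.standard_normal((1, 8, FS * 2))
print("move" if clf.predict(band_power_features(new_window))[0] == 1 else "rest")
```

In a real system the training windows would of course come from the user’s own EEG, recorded while they repeat the target movement, rather than from random numbers as in this toy example.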
Unfortunately, today's classifiers cannot reliably transfer that training from day to day or task to task. How useful is a prosthetic if it takes hours of training before it can turn a corner, walk up a ramp and climb some stairs on command, and if it must be retrained every time you use it?
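To see why that day-to-day transfer matters, here is another hedged sketch: a toy experiment in which a classifier trained on one session’s (simulated) feature data is tested on the next session’s, where the signal has drifted. The amount of drift and all the numbers are made up purely for illustration; they stand in for the electrode-placement and signal changes that real systems face between sessions.

```python
# Illustrative sketch only: checking whether a classifier trained in one
# session still works in the next. The simulated "drift" between sessions
# stands in for real electrode and signal changes; none of this is data
# from the study discussed in the post.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)

def make_session(offset, n=300):
    """Toy two-class feature data; `offset` shifts the whole distribution."""
    X = rng.standard_normal((n, 8)) + offset
    y = rng.integers(0, 2, size=n)
    X[y == 1] += 1.0  # separable signal for the "intend to move" class
    return X, y

X_day1, y_day1 = make_session(offset=0.0)
X_day2, y_day2 = make_session(offset=2.5)  # simulated between-session drift

clf = LinearDiscriminantAnalysis().fit(X_day1, y_day1)
print("same-day accuracy:", clf.score(X_day1, y_day1))
print("next-day accuracy:", clf.score(X_day2, y_day2))  # degrades without recalibration
```

In this toy setup the next-day score collapses toward chance, which is exactly the kind of failure that forces users to retrain or recalibrate before every session.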
The real leap in the new study, says Fidopiastis, may be in taking brain-computer interfaces that were developed to improve communication and adapting them to enable movement. Traditionally, brain-computer interfaces have been used to let patients with motor neuron diseases such as Lou Gehrig’s disease (amyotrophic lateral sclerosis) type a message on a keyboard by thinking about each key. The new work applies those methods, tools and techniques to mobility.
Motion-enabling prosthetics may also be useful in the emerging field of “virtual rehabilitation.” Past studies have argued that when a therapist moves a patient’s paralyzed limb, the movement may help to re-wire the brain for the possibility of self-directed movement. Edward Taub, Ph.D., in the UAB Department of Psychology, did some of the early work along these lines, showing that movement therapy applies to patients with multiple sclerosis as well as to stroke survivors. Perhaps it applies to any patient whose disease or injury has compromised the ability to move; future studies will tell.
Fidopiastis recommends that those with an interest in this field take a look at The Wadsworth Brain-Computer Interface System or the mixed-reality rehabilitation projects under way at the E2i studio and the University of California at San Francisco.
About the blogger
Greg Williams (@gregscience, @themixuab) is research editor in Media Relations at the University of Alabama at Birmingham.