Researchers at the University of California, San Francisco, have enabled a paralyzed man to reliably control a robotic arm using signals from his brain, relayed through a computer.
He was able to grasp, move and release objects simply by imagining himself performing the actions. The device, known as a brain-computer interface (BCI), worked successfully for a record seven months without needing any adjustment.
Until now, such devices had worked for only a day or two.
The BCI relies on an artificial intelligence (AI) model that adapts to small changes in brain activity as a person repeatedly imagines a movement, gradually improving in accuracy.
"This blending of learning between humans and AI is the next phase for these brain-computer interfaces," said Professor Karunesh Ganguly, a neurologist at the UCSF Weill Institute for Neurosciences. "It's what we need to achieve sophisticated, lifelike function."
The study, funded by the US National Institutes of Health, was published on March 6 in the journal Cell.
The study participant, who lost the ability to move and speak after a stroke years ago, can now control the robotic arm by imagining specific movements.
The key advance involved understanding how brain activity shifts from day to day as the participant repeatedly imagines making these movements.
Once the system was trained to account for these changes, it maintained performance for months.
Professor Ganguly had previously studied patterns of brain activity in animals and observed that these patterns evolved as the animals learned new movements.
He suspected the same process was happening in people, which would explain why earlier BCIs quickly lost their ability to interpret brain signals.
Ganguly and Dr. Nikhilesh Natraj, a neurology researcher, worked with a participant who had been paralyzed by a stroke and was unable to speak or move.
The participant had small sensors implanted on the surface of his brain that could detect neural activity when he imagined moving.
To investigate whether these brain patterns changed over time, the participant was asked to imagine moving different parts of his body, such as his hands, feet and head.
Although he could not physically move, his brain still generated signals corresponding to the imagined movements.
The BCI recorded these signals, and the researchers found that while the overall patterns stayed the same, their precise locations in the brain shifted slightly from day to day.
The researchers then asked the participant to imagine making simple movements with his fingers, hands and thumbs over the course of two weeks while the system learned to interpret his brain activity. At first, the robotic arm's movements were imprecise.
To improve accuracy, the participant practiced with a virtual robotic arm that gave him feedback on how closely his imagined movements matched the intended actions.
Eventually, he was able to get the virtual arm to perform the desired tasks. Once the participant began practicing with the real robotic arm, only a few practice sessions were needed to transfer his skills to the real world. He was able to use the robotic arm to pick up blocks, turn them and move them to new positions.
He was even able to open a cabinet, take out a cup and hold it under a water dispenser. Months later, he could still control the robotic arm after a short, 15-minute "tune-up" to adjust for the ways his brain activity had drifted over time.
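To give a rough sense of the idea behind that kind of brief recalibration, here is a minimal Python sketch. It is not the UCSF team's actual method: the array sizes, the simulated channel drift, and the use of ordinary least squares are all illustrative assumptions. It shows a linear decoder trained on one day's (simulated) neural activity, its error growing when the activity patterns shift, and performance recovering after fitting a small realignment map from a short block of new data.

```python
# Illustrative sketch only: a toy example of re-aligning drifted neural data
# so a previously trained decoder keeps working. Not the study's pipeline.
import numpy as np
from numpy.linalg import lstsq

rng = np.random.default_rng(0)
n_channels, n_samples = 64, 500

# Day 1: simulated neural activity X1 and decoding targets y
# (e.g., a 2-D imagined-movement velocity).
X1 = rng.normal(size=(n_samples, n_channels))
true_decoder = rng.normal(size=(n_channels, 2))
y = X1 @ true_decoder + 0.1 * rng.normal(size=(n_samples, 2))

# Fit a simple linear decoder on day-1 data.
W, *_ = lstsq(X1, y, rcond=None)

# Day 2: the same underlying patterns, but slightly mixed across channels (drift).
drift = np.eye(n_channels) + 0.05 * rng.normal(size=(n_channels, n_channels))
X2 = X1 @ drift

# Without adaptation, decoding error grows.
err_stale = np.mean((X2 @ W - y) ** 2)

# Brief "tune-up": use a short calibration block of day-2 data to learn a
# linear map A that realigns the new activity back into day-1 space.
n_calib = 100
A, *_ = lstsq(X2[:n_calib], X1[:n_calib], rcond=None)
err_tuned = np.mean(((X2 @ A) @ W - y) ** 2)

print(f"error without adaptation:      {err_stale:.4f}")
print(f"error after brief realignment: {err_tuned:.4f}")
```

In this toy setup, the short calibration block plays the role the article ascribes to the 15-minute tune-up; a real system would use far richer, continuously adapting models rather than a single linear map.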
Ganguly and his team are now working to refine the AI model so the robotic arm moves faster and more smoothly. They also plan to test the system in a home environment. For people with paralysis, the ability to perform simple tasks such as feeding themselves or getting a drink of water could be life-changing.
"I'm very confident that we've learned how to build the system now, and that we can make this work," Ganguly said.