AI-powered Brain Device Allows Paralysed Man To Control Robotic Arm

Summary: A new brain-computer interface (BCI) has enabled a paralyzed man to control a robotic arm simply by imagining movements. Unlike previous BCIs, which lasted only a few days, this AI-powered device worked reliably for seven months. The AI model adapts to natural changes in brain activity, maintaining its accuracy over time.

After training with the virtual arm, the participant was able to successfully grasp, move, and manipulate real objects. This technology represents a significant step in restoring mobility for paralyzed individuals. Researchers are optimizing the system for smooth operation and testing its use at home.

Important facts:

  • Extended performance: The AI‑powered brain‑computer interface (BCI) remained reliable for seven consecutive months, far surpassing earlier models.
  • Smart adaptability: It continuously adjusted to daily shifts in brain activity while preserving precise control.
  • Real‑world function: Users operated a robotic arm to grasp objects and dispense water, demonstrating practical everyday use.

Source: UCSF

At the University of California, San Francisco, scientists have developed a device that translates brain signals into computer commands, enabling a paralyzed man to control a robotic arm.

He can grab, move, and release objects by imagining himself performing these actions. 

The device, called a brain-computer interface (BCI), worked for seven full months without any adjustments. Previously, such devices only worked for a day or two. 

The BCI relies on an AI model that can adapt to the small changes in brain activity that occur as someone repeats a movement (or, in this case, an imagined movement), and learn to perform it better.

“This blending of learning between humans and artificial intelligence (AI) is the next phase for these brain-computer interfaces,” said neurologist Dr. Karunesh Ganguly, professor of neurology and member of the Weill Institute for Neurosciences at UCSF. “This is what we need to achieve sophisticated, lifelike function.”

The research, funded by the National Institutes of Health, appears in Cell on March 6. 

The breakthrough came from tracking how the participant’s brain activity shifted each day as they repeatedly imagined specific movements. By training the AI to adapt to these daily variations, the system maintained stable performance for months.
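
The article does not describe the model itself, but the core idea, training a decoder on imagined-movement data pooled across days so that everyday drift is already reflected in its decision boundaries, can be sketched in a few lines. The example below is only a minimal illustration under that assumption; the array shapes, the simulated drift, and the scikit-learn classifier are stand-ins, not the study's actual pipeline.

```python
# Minimal sketch (not the study's pipeline): fit an imagined-movement decoder on
# neural features pooled across several days, so day-to-day drift is part of the
# training distribution. Shapes, the toy data, and the classifier are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def fit_drift_tolerant_decoder(daily_features, daily_labels):
    """daily_features: list of (n_trials, n_channels) arrays, one per recording day.
    daily_labels: list of (n_trials,) arrays of imagined-movement class IDs."""
    X = np.vstack(daily_features)              # pool every day's trials together
    y = np.concatenate(daily_labels)
    decoder = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    decoder.fit(X, y)                          # boundaries now span cross-day variability
    return decoder

# Toy usage: three simulated days with fixed per-class patterns plus a small daily shift.
rng = np.random.default_rng(0)
class_means = rng.standard_normal((4, 16))     # 4 imagined movements, 16 channels
days_X, days_y = [], []
for day in range(3):
    shift = 0.3 * rng.standard_normal(16)      # that day's drift in absolute location
    labels = rng.integers(0, 4, size=40)
    days_X.append(class_means[labels] + shift + 0.5 * rng.standard_normal((40, 16)))
    days_y.append(labels)

decoder = fit_drift_tolerant_decoder(days_X, days_y)
print("training accuracy:", decoder.score(np.vstack(days_X), np.concatenate(days_y)))
```

Pooling days like this is the simplest way to expose a decoder to drift; an online variant would instead update the model incrementally as each new day's data arrives.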

Location, location, location

Ganguly and neurologist Nikhilesh Natraj, PhD, worked with a participant who had been paralyzed for years by a stroke and was unable to move. Tiny sensors placed on the surface of his brain recorded neural activity as he imagined performing movements, allowing those signals to be translated into control commands for a robotic arm.

Over the course of the study, the participant imagined moving different parts of his body, such as his hands, feet, or head. Although he could no longer move, his brain still produced the signals for these movements whenever he imagined performing them, and the sensors recorded them.

Ganguly's team found that the shape of these movement representations in the brain stayed the same, but their locations shifted slightly from day to day.

From virtual to reality

To begin, the participant imagined making simple movements with his fingers, hands, or thumbs while the sensors recorded his brain activity to train the AI.

Then, Ganguly had the participant practice with a virtual robotic arm that gave him feedback on the accuracy of his imagined movements. Eventually, he was able to get the virtual arm to do what he wanted.

When the participant began practicing with the real robotic arm, it took only a few sessions to transfer his skills from the virtual practice to the real world.

He could make the robotic arm pick up blocks, turn them, and move them to new locations. He was even able to open a cabinet, take out a cup, and hold it up to a water dispenser.

"An AI-powered brain device lets a paralyzed man control a robotic arm—marking a breakthrough in neurotechnology and mobility restoration. Credit: StackZone Neuro
“An AI-powered brain device lets a paralyzed man control a robotic arm—marking a breakthrough in neurotechnology and mobility restoration. Credit: StackZone Neuro

Months later, the participant could still control the robotic arm after a brief 15-minute “tune-up” to adjust for how his movement representations had drifted since he started using the device.
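
The article does not say how this tune-up works internally. One plausible reading, offered here only as an illustrative assumption, is a brief supervised recalibration: a few fresh trials of each imagined movement are used to estimate how the day's activity patterns have shifted, and a simple linear map realigns them with the space the decoder was originally trained in. All names and shapes below are hypothetical.

```python
# Hedged sketch of one way a short recalibration could work (the article does not
# specify the actual procedure). A least-squares map A realigns today's per-class
# mean activity with the reference day's, and is then applied to incoming features
# before they reach the original decoder. Everything here is an illustrative assumption.
import numpy as np

def fit_alignment(todays_feats, todays_labels, reference_class_means):
    """Find A such that todays_class_means @ A approximately equals reference_class_means."""
    classes = np.unique(todays_labels)
    todays_means = np.stack([todays_feats[todays_labels == c].mean(axis=0)
                             for c in classes])
    A, *_ = np.linalg.lstsq(todays_means, reference_class_means[classes], rcond=None)
    return A

def realign(features, A):
    """Map today's features back into the decoder's original training space."""
    return features @ A
```

With only a handful of calibration trials the map is under-determined, so the minimum-norm least-squares solution (or a regularized variant) is what keeps a quick tune-up like this well behaved.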

Ganguly is now refining the AI models to make the robotic arm move faster and more smoothly, and plans to test the BCI in a home environment. For people with paralysis, the ability to feed themselves or get a drink of water would be life-changing. Ganguly believes this is within reach.

“It’s really important to make sure you can build the system and it will work,” he said.

Authors: In addition to the lead researchers, the study team included Sarah Seko and Adelyn Tu‑Chan from UCSF, and Reza Abiri from the University of Rhode Island.

Funding: The project was funded by the National Institutes of Health (grant 1 DP2 HD087955) and the UCSF Weill Institute for Neurosciences.

Abstract

A long-term study of how the brain’s representations of simple imagined movements adapt across days shows that this “representational plasticity” can support sustained neuroprosthetic control.

The nervous system must balance the stability of these neural patterns with their capacity to change. Such stable yet adaptable representations provide an effective framework for learning, particularly in humans, making them well suited for applying learned skills to new contexts.

Using an electrocorticography‑based brain–computer interface (BCI), we observed that the low‑dimensional manifold structure and representation distances associated with a set of simple imagined movements remained remarkably stable over time. This stability suggests that the brain maintains consistent neural patterns for these motor imagery tasks, providing a reliable foundation for long‑term neuroprosthetic control.
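
Both stability claims lend themselves to a short analysis sketch. The functions below, which are illustrative and not the paper's code, compare two days' low-dimensional manifolds via principal angles between their PCA subspaces and check whether the pattern of pairwise distances between imagined-movement representations is preserved; all shapes and parameter choices are assumptions.

```python
# Illustrative sketch (assumed analysis, not the paper's code) of two stability checks:
# (1) compare each day's low-dimensional manifold via principal angles between PCA
#     subspaces; (2) compare the pattern of pairwise distances between per-class
#     representations across days.
import numpy as np
from scipy.linalg import subspace_angles
from scipy.spatial.distance import pdist
from scipy.stats import pearsonr
from sklearn.decomposition import PCA

def daily_subspace(features, n_components=5):
    """Top principal components of one day's (n_trials, n_channels) feature matrix."""
    return PCA(n_components=n_components).fit(features).components_.T  # (channels, k)

def manifold_similarity(day_a_feats, day_b_feats):
    """Smaller principal angles (in degrees) -> more similar low-dimensional manifolds."""
    angles = subspace_angles(daily_subspace(day_a_feats), daily_subspace(day_b_feats))
    return np.degrees(angles)

def representational_distances(features, labels):
    """Pairwise Euclidean distances between per-class mean activity patterns."""
    means = np.stack([features[labels == c].mean(axis=0) for c in np.unique(labels)])
    return pdist(means)

def distance_stability(day_a, day_b):
    """Correlation of the two days' distance patterns (near 1 = stable geometry)."""
    r, _ = pearsonr(representational_distances(*day_a), representational_distances(*day_b))
    return r
```

Small principal angles and a high distance correlation across days would correspond to the stability the abstract describes, even while the manifold's absolute position drifts.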

The manifold’s absolute location, however, drifted within a constrained range from day to day. Strikingly, neural statistics, especially variance, could be flexibly regulated to increase the distances between representations during BCI control, without changes to their somatotopic organization.

This increased separability was specific to BCI control, demonstrating contextual specificity.

Finally, sampling representational plasticity and drift across days yielded a meta-representational structure with a repertoire of generalizable decision boundaries, enabling long-term neuroprosthetic control of a robotic arm and hand for reaching and grasping.
