AI-Powered Brain Computer Interface Co-pilot Offers New Autonomy for People with Paralysis

Scientists at the University of California, Los Angeles (UCLA) have developed an AI-powered “co-pilot” that dramatically improves assistive devices for people with paralysis. The research, conducted in the Neural Engineering and Computation Lab led by Professor Jonathan Kao with student developer Sangjoon Lee, tackles a major issue with non-invasive, wearable brain-computer interfaces (BCIs): “noisy” signals. The specific brain command (the “signal”) is very faint and gets drowned out by all the other electrical brain activity (the “noise”), much like trying to hear a whisper in a loud, crowded room. This low signal-to-noise ratio has made it difficult for users to control devices with precision.
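To make the signal-to-noise problem concrete, here is a small illustrative sketch (not from the UCLA study) showing why a faint, repeating “command” buried in strong random noise is hard to read from a single recording, and how averaging many repetitions raises the signal-to-noise ratio (SNR):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 256                                    # hypothetical sample rate (Hz)
t = np.arange(0, 1, 1 / fs)
command = 0.5 * np.sin(2 * np.pi * 10 * t)  # faint 10 Hz "brain command"

def snr_db(clean, noisy):
    """SNR in decibels: power of the clean signal vs. power of the noise."""
    noise = noisy - clean
    return 10 * np.log10(np.mean(clean**2) / np.mean(noise**2))

# 50 repeated trials, each swamped by strong background "brain noise"
trials = command + rng.normal(0, 2.0, size=(50, t.size))

snr_single = snr_db(command, trials[0])         # one noisy recording
snr_avg = snr_db(command, trials.mean(axis=0))  # averaging cancels noise
print(f"single trial: {snr_single:.1f} dB, averaged: {snr_avg:.1f} dB")
```

A single trial comes out well below 0 dB (noise dominates), while the 50-trial average is far cleaner, which is why older BCIs needed slow, repetitive paradigms to extract intent.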

The team’s breakthrough solution is a concept called shared autonomy. Instead of only trying to decipher the user’s “noisy” brain signals, the AI co-pilot also acts as an intelligent partner by analyzing the environment, using data like a video feed of the robotic arm. By combining the user’s likely intent with this real-world context, the system can make a highly accurate prediction of the desired action. This allows the AI to help complete the movement, effectively filtering through the background noise that limited older systems.
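The idea of shared autonomy is often implemented as a blend of the user's decoded command with an autonomous policy's suggestion. The sketch below is a generic, hypothetical illustration of that blending step, not code from the UCLA system; the function name, the 2-D cursor velocities, and the fixed confidence weight are all assumptions for the example:

```python
import numpy as np

def shared_autonomy_step(decoded_vel, ai_vel, confidence):
    """Blend the noisy BCI-decoded velocity with the AI co-pilot's
    suggested velocity, weighted by the AI's confidence that it has
    correctly inferred the user's goal (0 = pure user, 1 = pure AI)."""
    return (1 - confidence) * decoded_vel + confidence * ai_vel

# Hypothetical 2-D cursor step:
decoded = np.array([0.9, -0.4])  # noisy command from the BMI decoder
ai = np.array([0.6, 0.1])        # AI policy step toward its inferred target
blended = shared_autonomy_step(decoded, ai, confidence=0.7)
```

In practice the confidence weight would vary over time as the AI's computer-vision and task-context evidence accumulates, so the user keeps fine control when the goal is ambiguous and gets more assistance as intent becomes clear.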

Image description: A side-by-side diagram contrasting two approaches to BCI control. On the left, titled “Prior studies,” a seated person with electrodes on their head sends neural signals to a “BMI decoder,” which directly controls a robotic arm; the person receives visual feedback from a monitor showing the arm's movement. On the right, titled “This study, with an AI copilot + BMI (AI-BMI),” neural signals still go to the BMI decoder, but its output now feeds an “AI-BMI control” pathway that also receives input from an “AI Agent.” The AI Agent is depicted as an AI policy drawing on computer vision (a camera pointed at the robotic arm and task), task priors and information, and historical movements. The combined AI-BMI control directs the robotic arm, and the person again receives visual feedback from the monitor.

The results of this new approach are remarkable. In lab tests, participants using the AI co-pilot to control a computer cursor and a robotic arm saw their performance improve by nearly fourfold. This significant leap forward has the potential to restore a new level of independence for individuals with paralysis. By making wearable BCI technology far more reliable and intuitive, it could empower users to perform complex daily tasks on their own, reducing their reliance on caregivers.

Source: University of California, Los Angeles (UCLA)

The post AI-Powered Brain Computer Interface Co-pilot Offers New Autonomy for People with Paralysis appeared first on Assistive Technology Blog.
