A 69-year-old paralysed man has successfully piloted a virtual drone using a brain-computer interface (BCI) that interprets his neural signals. By imagining specific finger movements, he navigated a video-game obstacle course. The device, which translates brain activity into real-time control, points to applications that could help people with mobility impairments perform intricate tasks, and marks significant progress in using BCIs to restore motor function.

Breakthrough Detailed in Nature Medicine

According to a study published in Nature Medicine, the man, who had been paralysed in all four limbs following a spinal cord injury, controlled the virtual drone using neural signals linked to imagined movements of specific finger groups. The research relied on electrodes implanted in the participant’s left motor cortex, which had been placed during a prior operation in 2016. Algorithms were trained to decode the brain’s signals when he visualised moving his right thumb, different finger pairs, or combinations of them.
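The decoding stage described above can be illustrated with a toy model. The sketch below is purely hypothetical: it mimics classifying imagined finger-group movements from motor-cortex activity using a nearest-centroid decoder over synthetic "firing-rate" feature vectors. The class names, channel count, and feature statistics are illustrative assumptions, not details from the study.

```python
import random
from statistics import fmean

random.seed(0)
# Illustrative finger groups loosely echoing the study's setup.
CLASSES = ["thumb", "index_middle", "ring_pinky", "rest"]
N_CHANNELS = 16  # pretend electrode channels

def synth_trial(cls):
    """Generate a noisy feature vector whose mean depends on the class."""
    base = CLASSES.index(cls)
    return [base * 2.0 + random.gauss(0, 0.5) for _ in range(N_CHANNELS)]

# "Train": average many trials per class to obtain one centroid each.
centroids = {
    cls: [fmean(col) for col in zip(*(synth_trial(cls) for _ in range(50)))]
    for cls in CLASSES
}

def decode(features):
    """Return the class whose centroid is nearest (squared Euclidean)."""
    def dist(cls):
        return sum((f - m) ** 2 for f, m in zip(features, centroids[cls]))
    return min(CLASSES, key=dist)

# Decode a fresh trial of an imagined thumb movement.
print(decode(synth_trial("thumb")))
```

Real BCI decoders are far more sophisticated (the study trained algorithms on recorded cortical signals), but the shape of the problem is the same: map a noisy neural feature vector to one of a small set of intended movements.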

The researchers reported that the participant first practised synchronising imagined movements with a virtual hand displayed on a screen, reaching a high degree of accuracy of up to 76 target hits per minute. The decoded signals were then connected to the drone’s navigation system, allowing him to steer it precisely through rings on a virtual basketball court.
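The control stage, connecting decoded states to the drone's navigation, can be sketched as a simple mapping from each finger-group state to a velocity command, much like buttons on a game controller. The mapping and gains below are illustrative assumptions; the study's actual control axes are not detailed in this summary.

```python
# Hypothetical mapping: decoded finger-group state -> (vx, vy, vz) command.
COMMANDS = {
    "thumb":        (0.0, 0.0, 1.0),   # climb
    "index_middle": (1.0, 0.0, 0.0),   # forward
    "ring_pinky":   (0.0, 1.0, 0.0),   # strafe right
    "rest":         (0.0, 0.0, 0.0),   # hover
}

def step(position, decoded_state, dt=0.1):
    """Integrate one control tick: position += velocity * dt."""
    vx, vy, vz = COMMANDS[decoded_state]
    x, y, z = position
    return (x + vx * dt, y + vy * dt, z + vz * dt)

# Fly "forward" for five ticks, then climb for two.
pos = (0.0, 0.0, 0.0)
for state in ["index_middle"] * 5 + ["thumb"] * 2:
    pos = step(pos, state)
print(tuple(round(c, 6) for c in pos))
```

In the actual system each decoder output would arrive many times per second, so sustained imagined movement produces smooth, continuous flight rather than discrete jumps.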

Expert Insights on Potential Applications

Matthew Willsey, a neurosurgeon at the University of Michigan and a co-author of the study, told Nature Medicine that the participant likened the experience to playing a musical instrument, requiring delicate adjustments to maintain control. Willsey noted that the research seeks to enable control of multiple movements simultaneously, potentially assisting activities such as typing or playing musical instruments.

John Downey, a BCI researcher from the University of Chicago, described the work as an important initial step in understanding hand control mechanisms. He highlighted the potential of this technology as a versatile tool for individuals with limited mobility. Researchers aim to enhance the system to decode signals for all ten fingers.
