A team of researchers at the University of Chicago has received a $3.4 million grant from the National Institutes of Health. The funding will help the team develop robotic arms that patients can control with their minds while receiving sensory feedback through the attached prosthetic hands.
The new grant is part of a combined $7 million awarded to UChicago, the University of Pittsburgh (Pitt) and the University of Pittsburgh Medical Center (UPMC) to continue their collaboration developing prosthetics with a brain-computer interface (BCI) for paralyzed patients. In 2016, the team demonstrated how a clinical trial participant was able to control a robotic arm with his mind and regain the sense of touch through its hand.
The new grant will expand the clinical trial to UChicago, where the project will be led by Sliman Bensmaia, PhD, who studies the sense of touch, and Nicho Hatsopoulos, PhD, who researches how the brain directs movement in the limbs. John Downey, PhD, a staff scientist in Bensmaia's lab who formerly worked with the Pitt team, will coordinate research activities. Neurosurgeon Peter Warnke, MD, will perform surgical procedures to implant the devices, and Raymond Lee, MD, a physical rehabilitation specialist from Schwab Rehabilitation Hospital in Chicago, will recruit subjects and provide guidance on the patient population involved.
"Our goal is to create a prosthesis that has the same dexterity and functionality as the natural human hand," Bensmaia said. "UChicago has the benefit of years of experience with both motor neuroscience and somatosensory research, and we look forward to continuing that work with our partners at Pitt and UPMC."
The research team at Pitt and UPMC is led by Michael Boninger, MD, and includes Jennifer Collinger, PhD, Robert Gaunt, PhD, and Elizabeth Tyler-Kabara, MD, PhD. That team has worked with two Pittsburgh-area clinical trial participants since 2012, both of whom had paralysis of their arms and hands. The new project will recruit two more such patients at each site.
The robotic neuroprosthetic system works by implanting arrays of electrodes in the areas of the brain that control movement and that process the sense of touch from a natural limb. The electrodes pick up neural activity as the patient thinks about moving their own arm, and that activity is decoded to direct the robotic arm accordingly. The prosthetic hand is fitted with sensors that detect touch, such as pressure on individual fingertips; those sensor readings in turn generate electrical signals that stimulate the appropriate areas of the brain.
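To make the two directions of that loop concrete, here is a minimal, purely illustrative sketch in Python of one decode-and-stimulate cycle, assuming a simple linear decoder and a direct pressure-to-stimulation mapping; the names, array sizes and constants are hypothetical placeholders and do not reflect the team's actual software.

```python
import numpy as np

# Hypothetical closed-loop cycle for an intracortical BCI prosthesis.
# This only illustrates the decode-movement / encode-touch loop described above.

N_CHANNELS = 96        # electrodes in a typical recording array (assumed)
N_FINGERTIPS = 5

rng = np.random.default_rng(0)
decoder_weights = rng.normal(size=(3, N_CHANNELS))   # maps firing rates -> 3D hand velocity

def decode_velocity(firing_rates: np.ndarray) -> np.ndarray:
    """Linear decoder: translate recorded firing rates into an arm velocity command."""
    return decoder_weights @ firing_rates

def encode_touch(pressures: np.ndarray) -> np.ndarray:
    """Map fingertip pressures to stimulation amplitudes for the touch-area electrodes."""
    return np.clip(pressures, 0.0, 1.0) * 80.0        # e.g. microamps, capped at a safe level

# One cycle of the loop, with placeholder data standing in for hardware I/O:
firing_rates = rng.poisson(lam=5.0, size=N_CHANNELS).astype(float)
velocity_command = decode_velocity(firing_rates)      # -> sent to the robotic arm
pressures = rng.random(N_FINGERTIPS)                  # -> read from the hand's sensors
stimulation = encode_touch(pressures)                 # -> delivered to the brain
```

In a real system each cycle would run continuously on live recordings and hardware, and the decoder would be trained on data from the participant rather than initialized at random.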
The prosthetics will incorporate years of research by Bensmaia and Hatsopoulos on how the nervous system interprets sensory feedback, directs limbs to move and perceives them in space. Bensmaia's lab has developed software algorithms to recreate the sense of touch with the BCI using a "biomimetic" approach that mimics the way someone's natural nervous system would communicate signals from the hand to the brain. Hatsopoulos studies motor control and how brain cells work together to coordinate and learn complex movements of the arm and hand.
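As a rough illustration of the biomimetic idea, the sketch below maps a pressure trace to a stimulation profile under the assumption, loosely in the spirit of how skin afferents respond, that touch signals emphasize both sustained pressure and rapid changes at contact onset and release; the function name, weighting and units are invented for illustration only.

```python
import numpy as np

def biomimetic_stimulation(pressure: np.ndarray, dt: float = 0.01) -> np.ndarray:
    """Toy biomimetic encoder: stimulation combines sustained pressure with the
    absolute rate of change, so contact onset and release add brief transients
    on top of a steady component (hypothetical weighting and units)."""
    sustained = pressure
    transient = np.abs(np.gradient(pressure, dt))
    return 50.0 * sustained + 5.0 * transient

# Example: a grasp that presses, holds briefly, then releases over two seconds.
t = np.arange(0.0, 2.0, 0.01)
pressure = np.clip(np.sin(np.pi * t / 2.0), 0.0, None)
stim = biomimetic_stimulation(pressure)
```

The point of shaping stimulation this way, rather than passing raw sensor readings straight through, is to produce patterns closer to what the brain would receive from an intact hand.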
"BCI control in the past has focused on moving the limb in free space. Our project will attempt to solve the difficult challenge of controlling the hand when it comes in contact with and manipulates objects," Hatsopoulos said.
The research team at UChicago will continue refining this work with their partners in Pittsburgh, building greater dexterity and more precise movements into the prosthetics.
"Having a human patient lets us do all kinds of things we couldn't do before," Bensmaia said. "You can probe the quality of sensations being invoked by asking them what they feel. You can sculpt movements to make more natural and precise. This opens a new world for us here at the University of Chicago."