
Robotic Glove Lends a ‘Hand’ to Relearn Playing Piano After a Stroke


The soft robotic glove integrates five actuators into a single wearable device that conforms to the user's hand. (Photo by Alex Dolce)


By Gisele Galoustian | 6/29/2023

For people who have suffered neurotrauma such as a stroke, everyday tasks can be extremely challenging because of decreased coordination and strength in one or both upper limbs. These problems have spurred the development of robotic devices to help enhance their abilities. However, the rigid nature of these assistive devices can be problematic, especially for more complex tasks like playing a musical instrument.

A first-of-its-kind robotic glove is lending a “hand” and providing hope to piano players who have suffered a disabling stroke. Developed by researchers from Florida Atlantic University’s College of Engineering and Computer Science, the soft robotic hand exoskeleton uses artificial intelligence to improve hand dexterity.

Combining flexible tactile sensors, soft actuators and AI, this robotic glove is the first to “feel” the difference between correct and incorrect versions of the same song and to combine these features into a single hand exoskeleton.

“Playing the piano requires complex and highly skilled movements, and relearning tasks involves the restoration and retraining of specific movements or skills,” said Erik Engeberg, Ph.D., senior author, a professor in FAU’s Department of Ocean and Mechanical Engineering within the College of Engineering and Computer Science, and a member of the FAU Stiles-Nicholson Brain Institute. “Our robotic glove is composed of soft, flexible materials and sensors that provide gentle support and assistance to individuals to relearn and regain their motor abilities.”

Researchers integrated special sensor arrays into each fingertip of the robotic glove. Unlike prior exoskeletons, this new technology provides precise force and guidance in recovering the fine finger movements required for piano playing. By monitoring and responding to users’ movements, the robotic glove offers real-time feedback and adjustments, making it easier for them to grasp the correct movement techniques.
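
As a rough illustration of this idea, the Python sketch below shows a minimal proportional feedback loop for a single finger. The sensor and actuator interfaces, gain and target force are hypothetical placeholders invented for illustration, not the team's hardware API.

```python
# A minimal, hypothetical sketch of a real-time feedback loop for one finger.
# The sensor and actuator interfaces below are invented placeholders.
import random

TARGET_FORCE_N = 1.5   # assumed desired fingertip force during a key press
GAIN = 0.8             # assumed proportional gain from force error to pressure change

def read_fingertip_force() -> float:
    """Placeholder for a flexible tactile sensor reading, in newtons (simulated here)."""
    return random.uniform(0.5, 2.0)

def set_actuator_pressure(pressure_kpa: float) -> None:
    """Placeholder for commanding the soft actuator; here we just log the value."""
    print(f"actuator pressure -> {pressure_kpa:.2f} kPa")

def assist_step(current_pressure_kpa: float) -> float:
    """One control-loop iteration: nudge actuator pressure toward the target force."""
    error = TARGET_FORCE_N - read_fingertip_force()
    new_pressure = max(0.0, current_pressure_kpa + GAIN * error)
    set_actuator_pressure(new_pressure)
    return new_pressure

pressure = 0.0
for _ in range(5):     # a few simulated iterations of the loop
    pressure = assist_step(pressure)
```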

To demonstrate the robotic glove’s capabilities, researchers programmed it to feel the difference between correct and incorrect versions of the well-known tune, "Mary Had a Little Lamb," played on the piano. To introduce variations in the performance, they created a pool of 12 different types of errors that could occur at the beginning or end of a note, that were either premature or delayed, and that persisted for 0.1, 0.2 or 0.3 seconds. The 10 song variations consisted of three groups of three error variations each, plus the correct song played with no errors.
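
One way the count of 12 can arise is from combining two note boundaries (beginning or end), two timing directions (premature or delayed) and three durations (0.1, 0.2 or 0.3 seconds): 2 x 2 x 3 = 12. The sketch below enumerates such a pool; the song representation and function names are illustrative assumptions, not the study's code.

```python
# Enumerate 12 hypothetical error types: (boundary, direction, duration).
from itertools import product

BOUNDARIES = ("onset", "offset")   # beginning or end of a note
DIRECTIONS = (-1, +1)              # premature (-) or delayed (+)
DURATIONS_S = (0.1, 0.2, 0.3)      # how long the error persists

ERROR_TYPES = list(product(BOUNDARIES, DIRECTIONS, DURATIONS_S))  # 2 * 2 * 3 = 12 types

def apply_error(note, boundary, direction, shift_s):
    """Return a copy of (onset, duration) with one note boundary shifted by shift_s."""
    onset, duration = note
    if boundary == "onset":
        return (onset + direction * shift_s, duration - direction * shift_s)
    return (onset, duration + direction * shift_s)

# Example: corrupt the first note of a toy three-note song with the first error type.
song = [(0.0, 0.5), (0.5, 0.5), (1.0, 0.5)]   # (onset, duration) pairs in seconds
boundary, direction, shift = ERROR_TYPES[0]
corrupted = [apply_error(song[0], boundary, direction, shift)] + song[1:]
print(len(ERROR_TYPES), corrupted)
```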

To classify the song variations, Random Forest (RF), K-Nearest Neighbor (KNN) and Artificial Neural Network (ANN) algorithms were trained with data from the tactile sensors in the fingertips. The glove “felt” the differences between correct and incorrect versions of the song both on its own and while worn by a person, and the classification accuracy of the three algorithms was compared across the two conditions.
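
A hedged sketch of this kind of classifier comparison, using scikit-learn with simulated stand-in data in place of the study's tactile-sensor recordings (the array shapes and labels below are assumptions), might look like this:

```python
# Compare RF, KNN and ANN classifiers on placeholder tactile-sensor features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))       # placeholder fingertip-sensor feature vectors
y = rng.integers(0, 10, size=200)    # placeholder labels for the 10 song variations

models = {
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "ANN": MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=1000, random_state=0),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)   # mean cross-validated accuracy
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```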

Results of the study, published in a peer-reviewed journal, demonstrated that the ANN algorithm had the highest classification accuracy: 97.13 percent with the human subject and 94.60 percent without the human subject. The algorithm successfully determined the percentage error of a given song and identified key presses that were out of time. These findings highlight the potential of the smart robotic glove to help individuals with disabilities relearn dexterous tasks like playing musical instruments.
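
As an illustration of how per-note timing errors and an overall percentage error could be derived once each key press has been detected, here is a small hypothetical helper; the tolerance value and example onsets are assumptions, not figures from the study.

```python
# Flag key presses whose onsets deviate from the reference timing and
# summarize them as a percentage error for the whole song.
def timing_report(reference_onsets, played_onsets, tolerance_s=0.05):
    """Return (percent_error, list of (note_index, deviation_s)) for out-of-time presses."""
    out_of_time = [
        (i, played - ref)
        for i, (ref, played) in enumerate(zip(reference_onsets, played_onsets))
        if abs(played - ref) > tolerance_s
    ]
    percent_error = 100.0 * len(out_of_time) / len(reference_onsets)
    return percent_error, out_of_time

# Example with a hypothetical seven-note excerpt: the third note is 0.2 s late.
reference = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
played    = [0.0, 0.5, 1.2, 1.5, 2.0, 2.55, 3.0]
print(timing_report(reference, played))   # roughly (14.3, [(2, 0.2)])
```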

Researchers designed the robotic glove using 3D-printed polyvinyl alcohol stents and hydrogel casting to integrate five actuators into a single wearable device that conforms to the user's hand. The fabrication process is new, and the form factor could be customized to the unique anatomy of individual patients with the use of 3D scanning technology or CT scans.

“Our design is significantly simpler than most designs as all the actuators and sensors are combined into a single molding process,” said Engeberg. “Importantly, although this study’s application was for playing a song, the approach could be applied to myriad tasks of daily life and the device could facilitate intricate rehabilitation programs customized for each patient.” 

Clinicians could use the data to develop personalized action plans that pinpoint patient weaknesses, such as sections of the song that are consistently played erroneously, and to determine which motor functions require improvement. As patients progress, the rehabilitation team could prescribe more challenging songs in a game-like progression, providing a customizable path to improvement.
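
A hypothetical sketch of how such an analysis might flag consistently erroneous sections across repeated attempts; the data structures and threshold are illustrative assumptions, not part of the study.

```python
# Report note positions that are flagged as erroneous in at least half of the attempts.
from collections import Counter

def consistent_weak_spots(attempts, threshold=0.5):
    """attempts: list of sets of note indices flagged as erroneous in each attempt."""
    counts = Counter(idx for attempt in attempts for idx in attempt)
    n = len(attempts)
    return sorted(idx for idx, c in counts.items() if c / n >= threshold)

# Example: notes 3 and 7 fail in at least half of four attempts, note 5 does not.
attempts = [{3, 7}, {3}, {5}, {3, 7}]
print(consistent_weak_spots(attempts))    # -> [3, 7]
```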

“The technology developed by Professor Engeberg and the research team is truly a game-changer for individuals with neuromuscular disorders and reduced limb functionality,” said Stella Batalama, Ph.D., dean of the FAU College of Engineering and Computer Science. “Although other soft robotic actuators have been used to play the piano, our robotic glove is the only one that has demonstrated the capability to ‘feel’ the difference between correct and incorrect versions of the same song.”

Study co-authors are Maohua Lin, Ph.D., first author and a post-doctoral scholar, Rudy Paul, a graduate student, and Moaed Abd, Ph.D., a recent graduate, all from the FAU College of Engineering and Computer Science; James Jones, Boise State University; Darryl Dieujuste, a graduate research assistant at the FAU College of Engineering and Computer Science; and Harvey Chim, M.D., a professor in the Division of Plastic and Reconstructive Surgery at the University of Florida.

This research was supported by the National Institute of Biomedical Imaging and Bioengineering of the National Institutes of Health (NIH), the National Institute on Aging of the NIH and the National Science Foundation. It was supported in part by a seed grant from the FAU College of Engineering and Computer Science and the FAU Institute for Sensing and Embedded Network Systems Engineering (I-SENSE).

The glove was programmed to feel the difference between correct and incorrect versions of the well-known tune, "Mary Had a Little Lamb," played on the piano.

-FAU-