
Rana Soltani and Amin Zeiaee, both second-year Ph.D. students in the Department of Mechanical Engineering at Texas A&M University, are developing an intelligent rehabilitation device that will provide automated therapy to stroke patients to expedite their recovery. Sungtae Shin, a fifth-year Ph.D. candidate in the same research group, is working on using biosignals to recognize human hand gestures. This technology has various applications in entertainment systems, robotics and rehabilitation devices.

Soltani and Zeiaee recognize that treatment for stroke victims and others with limited mobility is expensive and inadequate because of the limited number of therapists and short therapy sessions. Treatment is also available only outside the home, at a medical or rehabilitation facility.

“Therapy must be very repetitive and intensive to be effective and we want to develop a robot that can improve the quality of training for patients and assist the therapists,” said Zeiaee.

The device would train the arm for activities of daily living, such as brushing one’s teeth, eating, brushing hair, cleaning a surface and more. 

“As motor abilities of a patient improve, the robot adaptively adjusts the provided support,” said Soltani. “It will be a robot collaborating with a human.” 
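
The adaptive behavior Soltani describes is often pictured as an "assist-as-needed" scheme, in which the robot's level of support shrinks while the patient tracks the therapy target well and grows again when tracking degrades. The sketch below is only an illustration of that idea; the function, thresholds and constants are assumptions, not the team's actual controller.

```python
# Illustrative "assist-as-needed" gain update; all names and values are
# assumptions for the purpose of this sketch, not the team's controller.
def update_assist_gain(gain: float, tracking_error: float,
                       error_threshold: float = 0.05,
                       decay: float = 0.95, boost: float = 1.05,
                       min_gain: float = 0.0, max_gain: float = 1.0) -> float:
    """Lower robot assistance when the patient tracks the target well,
    and raise it again when tracking error grows."""
    if abs(tracking_error) < error_threshold:
        gain *= decay   # patient is doing well: gradually withdraw support
    else:
        gain *= boost   # patient is struggling: restore support
    return max(min_gain, min(max_gain, gain))
```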

The two also intend to incorporate virtual reality and gaming into patients' use of the device, making therapy a more engaging and pleasant activity.

Early project goals for the coming year include fabricating the device, obtaining patents and testing the device on stroke patients in clinics in Qatar.

The device's ergonomics and intelligence are expected to rival those of any competing system. With the ultimate goal of an affordable in-home system, the team is searching for ways to make the device more compact and lightweight.

A future goal of the team is to incorporate intention-detection methods into the device, and Shin's current research on processing biosignals is one possible route. To demonstrate the viability of using biosignals to control robots, Shin has developed a demonstration system.

“Collected biosignals and orientation information of the forearm — from the Myo™ armband — are used to manipulate the robot arm. Moreover, hand gestures such as finger snapping can be recognized by interpreting the biosignals and can then be used to control a robot arm,” said Shin, while performing a demonstration with the apparatus he has developed.
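
The pipeline Shin describes can be pictured as windowing the raw muscle (EMG) signal, extracting simple per-channel features, classifying the gesture, and mapping it to a robot-arm command. The snippet below is a minimal sketch of that idea; the feature set, classifier choice and gesture-to-command mapping are hypothetical and are not the lab's actual system.

```python
# Minimal illustration of gesture recognition from armband EMG windows.
# Assumes windows of raw samples with shape (samples, channels); the
# classifier and command mapping below are assumptions for this sketch.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def emg_features(window: np.ndarray) -> np.ndarray:
    """Simple per-channel features: mean absolute value and RMS."""
    mav = np.mean(np.abs(window), axis=0)
    rms = np.sqrt(np.mean(window ** 2, axis=0))
    return np.concatenate([mav, rms])

def train_classifier(windows, labels) -> KNeighborsClassifier:
    """Fit a k-nearest-neighbors classifier on labeled gesture windows."""
    X = np.stack([emg_features(w) for w in windows])
    clf = KNeighborsClassifier(n_neighbors=3)
    clf.fit(X, labels)
    return clf

# Hypothetical mapping from recognized gestures to robot-arm commands.
GESTURE_TO_COMMAND = {
    "fist": "close_gripper",
    "wave_out": "move_right",
    "finger_snap": "start_motion",
}

def command_from_window(clf: KNeighborsClassifier, window: np.ndarray):
    """Classify one EMG window and return the mapped command, if any."""
    gesture = clf.predict(emg_features(window)[None, :])[0]
    return GESTURE_TO_COMMAND.get(gesture)
```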

Dr. Reza Langari, professor of mechanical engineering and head of the Department of Engineering Technology & Industrial Distribution, leads the research group.

Soltani and Zeiaee’s work is sponsored by the Qatar National Research Fund and includes collaborators from Texas A&M University-Qatar (Dr. Reza Tafreshi), the Department of Health and Kinesiology at Texas A&M (Dr. John Buchanan), California State University at Fullerton/UC Irvine (Dr. Nina Robson) and Hamad Medical Center in Doha, Qatar (Drs. Al-Yazidi and Loganathan).