03/25/2025
By Danielle Fretwell
The Francis College of Engineering, Department of Electrical and Computer Engineering, invites you to attend a Master's Thesis defense by Gayathri Boopathy on "Hand Tracking and Gesture Classification Using Augmented Reality Technology and Machine Learning Algorithms."
Date: Friday, March 28, 2025
Time: 2 - 3 p.m.
Location: Falmouth 203 - CACT Lab
Committee:
Advisor: Dr. Kavitha Chandra, Eng.D., Associate Dean for Undergraduate Affairs and Professor of Electrical and Computer Engineering, UMass Lowell
Committee Members:
Charles Thompson, Ph.D., Professor of Electrical and Computer Engineering, UMass Lowell
Erika Lewis, Ph.D., Associate Professor of Physical Therapy and Kinesiology, UMass Lowell
Orlando Arias, Ph.D., Assistant Professor of Electrical and Computer Engineering, UMass Lowell
Abstract:
This thesis investigates the application of augmented reality (AR) technology as a digital health solution for physical therapy and rehabilitation of hand mobility. Physical therapy relies on tracking joint flexibility, range of motion, and neuromuscular coordination, and quantifying how far the finger bone joints can extend is an important consideration in the progression of therapy. The Magic Leap 2 AR device and its hand-tracking software were used to capture the three-dimensional positions of the bone joints of each finger as the user executes different types of gestures. The time series of joint positions was processed to estimate the dynamics of the angles that the bone joints traverse during a gesture. AR-based tracking can enhance static measurements made with instruments such as goniometers by providing a dynamic measure of the joint angles.
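As an illustrative sketch only (not the exact pipeline used in the thesis), a joint angle can be estimated from three consecutive tracked 3D joint positions by taking the angle between the two adjacent bone segments. The function name and sample coordinates below are hypothetical placeholders for positions reported by the hand-tracking software.

```python
import numpy as np

def joint_angle(p_prev, p_joint, p_next):
    """Angle (degrees) at p_joint formed by the two adjacent bone segments.

    Each argument is a 3D position such as a tracked finger joint;
    the names and coordinates here are illustrative only.
    """
    v1 = np.asarray(p_prev) - np.asarray(p_joint)   # proximal segment
    v2 = np.asarray(p_next) - np.asarray(p_joint)   # distal segment
    cos_theta = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Example: interior angle at an index-finger joint from three tracked positions
mcp = [0.00, 0.00, 0.00]   # metacarpophalangeal joint (hypothetical coordinates)
pip = [0.00, 0.04, 0.00]   # proximal interphalangeal joint
dip = [0.00, 0.06, 0.02]   # distal interphalangeal joint
print(round(joint_angle(mcp, pip, dip), 1))  # prints 135.0
```

Applying such a calculation frame by frame to the tracked positions yields the time series of joint angles described above.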
The key contribution of this work is the application of machine learning algorithms to classify hand movements using time-series data captured from the AR device. The feature analysis incorporates the 3D positions of the bone joints, inter-joint distances, and joint angles for movement classification. Each gesture was classified into a sequence of states that capture the hand movement during open, extension, flexion, and close actions. The Random Forest algorithm demonstrated the highest accuracy in classifying these states. The time series of angle dynamics was further applied to distinguish among levels of flexion and extension, such as hypermobility, hyperextension, extension, mild flexion, moderate flexion, deep flexion, full flexion, and maximum flexion.
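For illustration, a Random Forest state classifier of this kind could be set up with scikit-learn as in the minimal sketch below. The feature matrix and labels are synthetic placeholders standing in for the per-frame joint positions, inter-joint distances, and joint angles described above; the actual feature set, labels, and training procedure are those of the thesis.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in data: each row is one time step of hand-derived features,
# each label one movement state (open, extension, flexion, close).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 30))      # 30 hypothetical features per frame
y = rng.integers(0, 4, size=1000)    # 4 placeholder movement states

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("state-classification accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```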
This interdisciplinary study, combining AR, machine learning, and the biomechanics of the hand, has the potential to advance point-of-care digital health solutions for the millions of people recovering from strokes and injuries. Future work will focus on enabling real-time visual feedback and artificial intelligence-based systems that motivate the AR user toward daily therapeutic practice. The integration of AI to automate gesture recognition and provide real-time guidance to the user will also be explored.