Hand Gesture Classification using Sensors from a Commercial Smartwatch
The motivation for this idea stemmed from prior work: the Gait project and the sEMG Gesture Classification project. I was keen to explore further health applications with wearable technologies. Wearables lower the barrier to adoption and would enable more effective out-of-clinic patient monitoring. With that idea in mind, I opted to use a commercial smartwatch for this project.
A wrist-worn wearable can be used for wrist, hand, and finger gesture recognition. The literature has shown that motion sensors (accelerometers and gyroscopes, in this case) can be used to measure changes in orientation during gesture performance. I proposed a multi-modal signal combination of acceleration, angular velocity, and the novel addition of blood volume measurements from the photoplethysmography (PPG) sensor to develop a CNN-based classification model.
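A minimal sketch of what such a multi-modal CNN classifier might look like in PyTorch, assuming the three sensor streams are stacked as input channels (3-axis accelerometer + 3-axis gyroscope + 1 PPG channel = 7 channels) over fixed-length windows. The channel count, window length, layer sizes, and number of gesture classes here are illustrative placeholders, not the actual architecture used in the project.

```python
import torch
import torch.nn as nn

class GestureCNN(nn.Module):
    """1D CNN over windowed multi-sensor time series.

    Input shape: (batch, channels, timesteps), where the channels
    stack accelerometer (3), gyroscope (3), and PPG (1) signals.
    All hyperparameters are illustrative assumptions.
    """
    def __init__(self, n_channels=7, n_classes=5, window=128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),           # 128 -> 64 timesteps
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),           # 64 -> 32 timesteps
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * (window // 4), 64),
            nn.ReLU(),
            nn.Linear(64, n_classes),  # one logit per gesture class
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Example: a batch of 8 windows, 7 sensor channels, 128 samples each
model = GestureCNN()
logits = model(torch.randn(8, 7, 128))
print(logits.shape)  # torch.Size([8, 5])
```

Stacking modalities as input channels lets the early convolutions learn cross-sensor features directly; an alternative design would process each modality in its own branch and fuse later.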
Pairing a generalizable gesture recognition application with a commercial, off-the-shelf smartwatch would hopefully provide an accessible, comfortable, and practical method for monitoring mobility-impaired patients during their recovery at home. Furthermore, this initiative would give physicians valuable, previously inaccessible data to facilitate more patient-targeted rehabilitation programs.