Motion Trajectory Estimation for Hand Grasping States Using a Deep Learning Approach

Authors

  • Erdem Erdemir College of Engineering, Tennessee State University, USA
  • Erkan Kaplanoglu Engineering Management & Technology, Mechatronics, College of Engineering and Computer Science (CECS), University of Tennessee at Chattanooga, USA
  • Cihan Uyanik Department of Health Technology, Technical University of Denmark, Denmark
  • Gazi Akgun Engineering Management & Technology, Mechatronics, College of Engineering and Computer Science (CECS), University of Tennessee at Chattanooga, USA

DOI:

https://doi.org/10.47852/bonviewSWT52025659

Keywords:

trajectory estimation, deep learning, prosthesis

Abstract

Predicting the final grasp tendency at the start of movement in prosthetic hands is crucial for improved control. Researchers have used biological data, such as 3D movement and muscle activity, to predict the final grasp. Early prediction of the intended grasp allows the prosthetic device to initiate control actions before the motion is complete, resulting in faster and more intuitive responses. Most machine learning algorithms are trained to predict the gesture of the final grasp. The aim of this study is to accurately estimate the final grasp state using inertial measurement unit (IMU) data. This estimation, based on movement trajectories, allows prosthetic devices to respond more quickly to user actions. A deep learning model was trained using movement data collected from a prosthetic hand that executed predefined gesture trajectories without any human involvement. Acceleration, angular velocity, and orientation data were gathered through IMU sensors to create 3D orientation matrices representing the movement process. A deep convolutional neural network was trained on these data, labeled by the final grasp states, and successfully predicted the final hand motion with 93% accuracy. The trained model enables the generation of smooth supervisory trajectories, facilitating faster and more accurate control of the prosthesis. The proposed model demonstrates significant potential for improving prosthetic hand control by predicting the final hand movement at an early stage of motion, contributing to more responsive and effective prosthetic devices.
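
To make the pipeline described in the abstract concrete, the sketch below shows one plausible way to classify a final grasp state from a window of IMU-derived 3x3 orientation matrices with a 1-D convolutional network. This is a minimal illustration, not the authors' code: the window length (T), the number of grasp classes, and all layer sizes are assumptions, and the random tensor stands in for real IMU data.

```python
# Minimal sketch (assumed, not the authors' implementation):
# classify the final grasp state from a window of T flattened 3x3
# orientation matrices using a small 1-D CNN in PyTorch.
import torch
import torch.nn as nn

T = 100          # assumed number of IMU samples per movement window
N_CLASSES = 6    # assumed number of final grasp states

class GraspCNN(nn.Module):
    def __init__(self, n_classes: int = N_CLASSES):
        super().__init__()
        # Each time step contributes a flattened 3x3 orientation matrix,
        # so the input has 9 channels over T time steps: (batch, 9, T).
        self.features = nn.Sequential(
            nn.Conv1d(9, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).squeeze(-1))

# Usage: a batch of 8 movement windows, each a sequence of T rotation matrices.
rotations = torch.randn(8, T, 3, 3)             # stand-in for real IMU data
x = rotations.reshape(8, T, 9).transpose(1, 2)  # -> (batch, 9, T)
logits = GraspCNN()(x)                          # (8, N_CLASSES)
print(logits.argmax(dim=1))                     # predicted grasp state per window
```

Feeding the network only a truncated early portion of each window (rather than the full trajectory) is one way such a model could be evaluated for the early prediction the paper targets.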

 

Received: 12 March 2025 | Revised: 24 April 2025 | Accepted: 12 May 2025

 

Conflicts of Interest

The authors declare that they have no conflicts of interest regarding this work.

 

Data Availability Statement

The data that support the findings of this study are openly available in the GitHub repository at https://github.com/BioAstLab/MotionTrajectoryEstimation.

 

Author Contribution Statement

Erdem Erdemir: Conceptualization, Methodology, Software, Investigation, Resources, Writing – review & editing, Visualization, Supervision, Project administration. Erkan Kaplanoglu: Conceptualization, Methodology, Validation, Formal analysis, Investigation, Data curation, Writing – original draft, Visualization, Project administration. Cihan Uyanik: Software, Validation, Formal analysis, Resources, Data curation, Writing – review & editing. Gazi Akgun: Validation, Data curation, Writing – original draft.

Published

2025-05-27

Section

Research Article

How to Cite

Erdemir, E., Kaplanoglu, E., Uyanik, C., & Akgun, G. (2025). Motion Trajectory Estimation for Hand Grasping States Using a Deep Learning Approach. Smart Wearable Technology. https://doi.org/10.47852/bonviewSWT52025659