
In Journal of Biomechanical Engineering; h5-index 32.0

Exoskeletons have decreased physical effort and increased comfort in activities of daily living (ADL) such as walking, squatting, and running. However, this assistance is often activity-specific and does not accommodate a wide variety of different activities. To overcome this limitation and increase the scope of exoskeleton application, an automatic Human Activity Recognition (HAR) system is necessary. We developed two deep-learning models for HAR: a 1D convolutional neural network (CNN) and a hybrid model combining CNNs with Long Short-Term Memory (LSTM) layers. We trained both models on data collected from a single 3-axis accelerometer placed on the chest of ten subjects. We were able to classify five different activities (standing, walking on level ground, walking on an incline, running, and squatting) with accuracies of 98.1% and 97.8%, respectively. A two-subject real-time validation trial was also conducted to verify the real-time applicability of the system. The real-time accuracy was measured at 96.6% and 97.2% for the CNN and the hybrid model, respectively. The high classification accuracy in both the test and real-time evaluations suggests that a single sensor can distinguish human activities using machine-learning-based models.
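As a rough illustration of the two model families described in the abstract, the sketch below outlines a 1D-CNN classifier and a CNN-LSTM hybrid operating on windowed 3-axis accelerometer data. This is a minimal sketch, not the authors' implementation: the framework (Keras here), window length, and all layer sizes are assumptions, since the abstract does not specify them.

```python
# Illustrative sketch only: window length, layer widths, and framework
# are assumptions; the paper does not report these details in the abstract.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 5    # standing, level walking, incline walking, running, squatting
WINDOW_LEN = 128   # assumed number of accelerometer samples per window
NUM_CHANNELS = 3   # single 3-axis chest accelerometer

def build_cnn():
    """1D-CNN classifier over fixed-length accelerometer windows."""
    return models.Sequential([
        layers.Input(shape=(WINDOW_LEN, NUM_CHANNELS)),
        layers.Conv1D(64, kernel_size=5, activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        layers.Conv1D(64, kernel_size=5, activation="relu"),
        layers.GlobalAveragePooling1D(),
        layers.Dense(64, activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])

def build_cnn_lstm():
    """Hybrid model: convolutional feature extractor followed by an LSTM layer."""
    return models.Sequential([
        layers.Input(shape=(WINDOW_LEN, NUM_CHANNELS)),
        layers.Conv1D(64, kernel_size=5, activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        layers.LSTM(64),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])

model = build_cnn_lstm()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(X_windows, y_labels, ...) would then be called on windowed,
# labeled chest-accelerometer segments.
```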

Vakacherla Sai Siddarth, Kantharaju Prakyath, Mevada Meet, Kim Myunghee

2023-Jan-01