In Journal of Biomechanics
The difficulty of estimating joint kinematics remains a critical barrier to widespread use of inertial measurement units in biomechanics. Traditional sensor-fusion filters rely largely on magnetometer readings, which may be disturbed in uncontrolled environments. Careful sensor-to-segment alignment and calibration strategies are also necessary, which may burden users and lead to further error in uncontrolled settings. We introduce a new framework that combines deep learning and top-down optimization to accurately predict lower extremity joint angles directly from inertial data, without relying on magnetometer readings. We trained deep neural networks on a large set of synthetic inertial data derived from a clinical marker-based motion-tracking database of hundreds of subjects. We used data augmentation techniques and an automated calibration approach to reduce error due to variability in sensor placement and limb alignment. On left-out subjects, lower extremity kinematics could be predicted with a mean (±SD) root mean squared error of less than 1.27° (±0.38°) in flexion/extension, less than 2.52° (±0.98°) in ad/abduction, and less than 3.34° (±1.02°) in internal/external rotation, across walking and running trials. Errors decreased exponentially with the amount of training data, confirming the need for large datasets when training deep neural networks. While this framework remains to be validated with true inertial measurement unit data, the results presented here are a promising advance toward convenient estimation of gait kinematics in natural environments. Progress in this direction could enable large-scale studies and offer new perspectives on disease progression, patient recovery, and sports biomechanics.
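The reported errors are root mean squared differences between predicted and reference joint angles, computed per anatomical axis. As a minimal illustration of that metric (the function name and array layout here are assumptions for the sketch, not the authors' code):

```python
# Hypothetical sketch: per-axis RMSE between predicted and reference joint angles.
import numpy as np

def joint_angle_rmse(pred, ref):
    """RMSE in degrees for each joint-angle axis over a trial.

    pred, ref: arrays of shape (n_frames, 3), with columns assumed to hold
    flexion/extension, ad/abduction, and internal/external rotation in degrees.
    """
    pred = np.asarray(pred, dtype=float)
    ref = np.asarray(ref, dtype=float)
    return np.sqrt(np.mean((pred - ref) ** 2, axis=0))

# Toy example with made-up angles (not data from the study):
ref = np.zeros((100, 3))            # reference angles, all zero
pred = np.full((100, 3), 1.0)       # predictions with a constant 1-degree offset
print(joint_angle_rmse(pred, ref))  # → [1. 1. 1.]
```

Per-subject RMSE values like these would then be averaged across left-out subjects to give the mean (±SD) figures quoted above.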
Eric Rapp, Soyong Shin, Wolf Thomsen, Reed Ferber, Eni Halilaj
Gait, Inertial measurement units, Kinematics, Neural networks, Wearable sensors