Document Type

Journal Article

Publication Title

Sensors

Publisher

MDPI

School

Western Australian Academy of Performing Arts

RAS ID

30575

Comments

Hendry, D., Leadbetter, R., McKee, K., Hopper, L., Wild, C., O'Sullivan, P., ... & Campbell, A. (2020). An exploration of machine-learning estimation of ground reaction force from wearable sensor data. Sensors, 20(3), Article 740. https://doi.org/10.3390/s20030740

Abstract

This study aimed to develop a wearable sensor system, using machine-learning models, capable of accurately estimating peak ground reaction force (GRF) during ballet jumps in the field. Female dancers (n = 30) performed a series of bilateral and unilateral ballet jumps. Dancers wore six ActiGraph Link wearable sensors (100 Hz). Data were collected simultaneously from two AMTI force platforms and synchronised with the ActiGraph data. Due to sensor hardware malfunctions and synchronisation issues, a multistage approach to model development, using a reduced data set, was taken. Using data from the 14 dancers with complete multi-sensor synchronised data, the best single sensor was determined. Subsequently, the best single sensor model was refined and validated using all available data for that sensor (23 dancers). Root mean square error (RMSE) in body weight (BW) and correlation coefficients (r) were used to assess the GRF profile, and Bland–Altman plots were used to assess model peak GRF accuracy. The model based on sacrum data was the most accurate single sensor model (unilateral landings: RMSE = 0.24 BW, r = 0.95; bilateral landings: RMSE = 0.21 BW, r = 0.98), with the refined model still showing good accuracy (unilateral: RMSE = 0.42 BW, r = 0.80; bilateral: RMSE = 0.39 BW, r = 0.92). Machine-learning models applied to wearable sensor data can provide a field-based system for GRF estimation during ballet jumps.
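The abstract's accuracy metrics (RMSE expressed in body weights, Pearson's r between GRF profiles, and Bland–Altman agreement for peak GRF) can be sketched as follows. This is an illustrative implementation only, not the authors' code; the function name `evaluate_grf` and its inputs are hypothetical.

```python
import numpy as np

def evaluate_grf(measured_bw, estimated_bw):
    """Compare an estimated GRF profile against the force-platform
    measurement; both arrays are expressed in body weights (BW)."""
    measured_bw = np.asarray(measured_bw, dtype=float)
    estimated_bw = np.asarray(estimated_bw, dtype=float)

    # Root mean square error in BW, as reported in the abstract.
    rmse = np.sqrt(np.mean((estimated_bw - measured_bw) ** 2))

    # Pearson correlation coefficient between the two profiles.
    r = np.corrcoef(measured_bw, estimated_bw)[0, 1]

    # Bland-Altman statistics for agreement: mean difference (bias)
    # and the 95% limits of agreement (bias +/- 1.96 * SD of differences).
    diff = estimated_bw - measured_bw
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return rmse, r, (bias - half_width, bias + half_width)
```

In the study these metrics were computed per landing type (unilateral vs. bilateral); the sketch above shows only the per-profile arithmetic.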

DOI

10.3390/s20030740

Creative Commons License

Creative Commons Attribution 4.0 License
This work is licensed under a Creative Commons Attribution 4.0 License.
