Review of Optical and Inertial Technologies for Lower Body Motion Capture
Abstract
Motion capture systems play a vital role in diverse fields, impacting multimedia content creation, sports, and assistive technologies in the health sciences. This article offers an overview of human motion capture technologies with a focus on lower-body analysis. Recent advances are presented by compiling papers selected through the PRISMA method. The data were searched and extracted from databases such as MDPI, IEEE Xplore, and Scopus, with the observation window restricted to 2018-2022; 24 articles form the core of the review. The results indicate that 41% of the papers deal with technologies based on optical sensors, 24% with technologies based on MEMS sensors, 14% with technologies based on radar sensors, and 21% report the combined use of MEMS and optical sensor technologies. The technologies were analyzed in terms of their characteristics, their motion-information processing techniques, the type of results each system derives, and their final applications. This review makes evident the need for robust wearable systems that fuse different types of sensors to compensate for the weaknesses of each individual motion capture technology. In conclusion, the current state of motion capture technology emphasizes the development of health and rehabilitation applications.
Keywords: artificial vision, Doppler, gait analysis, inertial systems, microelectromechanical sensors, motion capture systems, optical systems.
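The abstract's closing argument, that fusing sensor modalities can compensate for each technology's weaknesses, is the classic motivation for complementary filtering in MEMS-based motion capture: gyroscope integration is smooth but drifts, while accelerometer-derived tilt is drift-free but noisy. The sketch below illustrates that idea only; it is not drawn from any of the reviewed systems, and the function name, signal model, and parameter value are illustrative assumptions.

```python
def complementary_filter(gyro_rates, accel_angles, dt, alpha=0.98):
    """Fuse gyroscope angular rates (rad/s) with accelerometer-derived
    tilt angles (rad) into a single joint-angle estimate.

    Integrating the gyroscope is accurate over short intervals but
    accumulates bias drift; the accelerometer angle is drift-free but
    noisy. Blending them with weight `alpha` keeps the short-term
    smoothness of the gyro while the accelerometer bounds the drift.
    """
    angle = accel_angles[0]  # initialize from the drift-free source
    estimates = [angle]
    for rate, acc_angle in zip(gyro_rates[1:], accel_angles[1:]):
        gyro_angle = angle + rate * dt  # short term: integrate the gyro
        # long term: pull the estimate toward the accelerometer angle
        angle = alpha * gyro_angle + (1.0 - alpha) * acc_angle
        estimates.append(angle)
    return estimates


# Illustration: a stationary joint at 0.5 rad, a gyroscope with a
# constant 0.01 rad/s bias, and a noise-free accelerometer reading.
dt = 0.01
gyro = [0.01] * 1001          # biased gyro reports motion that is not there
accel = [0.5] * 1001          # accelerometer keeps reporting the true angle
fused = complementary_filter(gyro, accel, dt)
drift_only = 0.5 + 0.01 * dt * 1000  # pure gyro integration drifts to 0.6
```

Raising `alpha` trusts the gyroscope more (smoother output, slower drift correction); lowering it trusts the accelerometer more (faster correction, noisier output). Production systems in the reviewed literature typically use richer estimators such as Kalman filters, but the compensation principle is the same.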
References
CALLEJAS-CUERVO M, VÉLEZ-GUERRERO M A, and ALARCÓN-ALDANA A C. Proposal for Gait Analysis Using Fusion of Inertial-Magnetic and Optical Sensors. Revista EIA, 2020, 17(34): 1-11, https://doi.org/10.24050/reia.v17i34.1472.
AZHAND A, RABE S, MÜLLER S, SATTLER I, and HEIMANN-STEINERT A. Algorithm based on one monocular video delivers highly valid and reliable gait parameters. Scientific Reports, 2021, 11(1): 1-10, https://doi.org/10.1038/s41598-021-93530-z.
FLERON M K, UBBESEN N C H, BATTISTELLA F, et al. Accuracy between optical and inertial motion capture systems for assessing trunk speed during preferred gait and transition periods. Sports Biomechanics, 2019, 18(4): 366-377, https://doi.org/10.1080/14763141.2017.1409259.
GUTTA V, BADDOUR N, FALLAVOLLITA P, and LEMAIRE E. Multiple depth sensor setup and synchronization for marker-less 3D human foot tracking in a hallway. Proceedings of the 2019 IEEE/ACM 1st International Workshop on Software Engineering for Healthcare (SEH 2019), 2019: 77-80, https://doi.org/10.1109/SEH.2019.00021.
ESTEVA A, et al. Deep learning-enabled medical computer vision. NPJ Digital Medicine, 2021, 4(1): 1-9, https://doi.org/10.1038/s41746-020-00376-2.
Physiopedia. The emerging role of Microsoft Kinect in physiotherapy rehabilitation for stroke patients. https://www.physio-pedia.com/The_emerging_role_of_Microsoft_Kinect_in_physiotherapy_rehabilitation_for_stroke_patients (accessed Jun. 15, 2022).
MIRON A, SADAWI N, ISMAIL W, et al. IntelliRehabDS (IRDS) - a dataset of physical rehabilitation movements. Data (Basel), 2021, 6(5): 1-13, https://doi.org/10.3390/DATA6050046.
CHIANG A T, CHEN Q, WANG Y, and FU M R. Kinect-Based In-Home Exercise System for Lymphatic Health and Lymphedema Intervention. IEEE Journal of Translational Engineering in Health and Medicine, 2018, 6: 1-13, https://doi.org/10.1109/JTEHM.2018.2859992.
ARMITANO-LAGO C, WILLOUGHBY D, and KIEFER A W. A SWOT Analysis of Portable and Low-Cost Markerless Motion Capture Systems to Assess Lower-Limb Musculoskeletal Kinematics in Sport. Frontiers in Sports and Active Living, 2022, 3: 1-14, https://doi.org/10.3389/fspor.2021.809898.
YADAV S K, SINGH A, GUPTA A, and RAHEJA J L. Real-time Yoga recognition using deep learning. Neural Computing and Applications, 2019, 31(12): 9349-9361, https://doi.org/10.1007/s00521-019-04232-7.
BLANCHARD N, SKINNER K, KEMP A, et al. 'Keep me in, Coach!': A computer vision perspective on assessing ACL injury risk in female athletes. Proceedings of the 2019 IEEE Winter Conference on Applications of Computer Vision (WACV 2019), 2019: 1366-1374, https://doi.org/10.1109/WACV.2019.00150.
VALENCIA-MARIN C K, PULGARIN-GIRALDO J D, VELASQUEZ-MARTINEZ L F, et al. An enhanced joint Hilbert embedding-based metric to support mocap data classification with preserved interpretability. Sensors, 2021, 21(13): 1-17, https://doi.org/10.3390/s21134443.
CLARK R A, MENTIPLAY B F, HOUGH E, and PUA Y H. Three-dimensional cameras and skeleton pose tracking for physical function assessment: A review of uses, validity, current developments and Kinect alternatives. Gait Posture, 2019, 68: 193-200, https://doi.org/10.1016/j.gaitpost.2018.11.029.
JI X, ZHAO Q, CHENG J, and MA C. Exploiting spatio-temporal representation for 3D human action recognition from depth map sequences. Knowledge-Based Systems, 2021, 227: 107040, https://doi.org/10.1016/j.knosys.2021.107040.
ZHANG L, XIA H, and QIAO Y. Texture Synthesis Repair of RealSense D435i Depth Images with Object-Oriented RGB Image Segmentation. Sensors, 2020, 20(23): 6725, https://doi.org/10.3390/s20236725.
URRUTIA G, and BONFILL X. PRISMA declaration: A proposal to improve the publication of systematic reviews and meta-analyses. Medicina Clínica, 2010, 135(11): 507-511, https://doi.org/10.1016/j.medcli.2010.01.015.
KIM D, KIM D H, and KWAK K C. Classification of K-pop dance movements based on skeleton information obtained by a Kinect sensor. Sensors (Switzerland), 2017, 17(6): 1261, https://doi.org/10.3390/s17061261.
CHATZITOFIS A, ZARPALAS D, KOLLIAS S, and DARAS P. Deepmocap: Deep optical motion capture using multiple depth sensors and retro-reflectors. Sensors (Switzerland), 2019, 19(2): 1-26, https://doi.org/10.3390/s19020282.
LEE J N, BYEON Y H, and KWAK K C. Design of ensemble stacked auto-encoder for classification of horse gaits with MEMS inertial sensor technology. Micromachines (Basel), 2018, 9(8): 411, https://doi.org/10.3390/mi9080411.
KICO I, GRAMMALIDIS N, CHRISTIDIS Y, and LIAROKAPIS F. Digitization and visualization of folk dances in cultural heritage: A review. Inventions, 2018, 3(4): 72, https://doi.org/10.3390/inventions3040072.
FAISAL A I, MAJUMDER S, MONDAL T, COWAN D, and NASEH S. Monitoring Methods of Human Body Joints. Sensors (Switzerland), 2019, 19(11): 2629, https://doi.org/10.3390/s19112629.
YANG C, et al. Physical extraction and feature fusion for multi-mode signals in a measurement system for patients in rehabilitation exoskeleton. Sensors (Switzerland), 2018, 18(8): 2588, https://doi.org/10.3390/s18082588.
KAICHI T, MARUYAMA T, TADA M, and SAITO H. Resolving position ambiguity of imu-based human pose with a single RGB camera. Sensors (Switzerland), 2020, 20(19): 5453, https://doi.org/10.3390/s20195453.
CIMOLIN V, et al. Computation of Gait Parameters in Post Stroke and Parkinson’s Disease: A Comparative Study Using RGB‐D Sensors and Optoelectronic Systems. Sensors, 2022, 22(3): 824, https://doi.org/10.3390/s22030824.
PHAM T T, and SUH Y S. Spline function simulation data generation for walking motion using foot-mounted inertial sensors. Electronics (Switzerland), 2019, 8(1): 18, https://doi.org/10.3390/electronics8010018.
ANCANS A, GREITANS M, CACURS R, et al. Wearable sensor clothing for body movement measurement during physical activities in healthcare. Sensors, 2021, 21(6): 2068, https://doi.org/10.3390/s21062068.
SOLTANINEJAD S, CHENG I, and BASU A. Kin-FOG: Automatic simulated freezing of gait (FOG) assessment system for Parkinson’s disease. Sensors (Switzerland), 2019, 19(10): 2416, https://doi.org/10.3390/s19102416.
EHATISHAM-UL-HAQ M, et al. Robust Human Activity Recognition Using Multimodal Feature-Level Fusion. IEEE Access, 2019, 7: 60736-60751, https://doi.org/10.1109/ACCESS.2019.2913393.
PONCE H, MARTÍNEZ-VILLASEÑOR L, and NUÑEZ-MARTÍNEZ J. Sensor location analysis and minimal deployment for fall detection system. IEEE Access, 2020, 8: 166678-166691, https://doi.org/10.1109/ACCESS.2020.3022971.
QIU S, WANG Z, ZHAO H, LIU L, and JIANG Y. Using Body-Worn Sensors for Preliminary Rehabilitation Assessment in Stroke Victims with Gait Impairment. IEEE Access, 2018, 6: 31249-31258, https://doi.org/10.1109/ACCESS.2018.2816816.
LI W, TAN B, and PIECHOCKI R. Passive Radar for Opportunistic Monitoring in E-Health Applications. IEEE Journal of Translational Engineering in Health and Medicine, 2018, 6: 1-10, https://doi.org/10.1109/JTEHM.2018.2791609.
LI H, SHRESTHA A, HEIDARI H, et al. Magnetic and Radar Sensing for Multimodal Remote Health Monitoring. IEEE Sensors Journal, 2019, 19(20): 8979-8989, https://doi.org/10.1109/JSEN.2018.2872894.
CLIMENT-PÉREZ P, SPINSANTE S, MIHAILIDIS A, and FLOREZ-REVUELTA F. A review on video-based active and assisted living technologies for automated lifelogging. Expert Systems with Applications, 2020, 139: 112847, https://doi.org/10.1016/j.eswa.2019.112847.
DEBNATH B, O’BRIEN M, YAMAGUCHI M, and BEHERA A. A review of computer vision-based approaches for physical rehabilitation and assessment. Multimedia Systems, 2022, 28: 209-239, https://doi.org/10.1007/s00530-021-00815-4.
COLYER S L, EVANS M, COSKER D P, and SALO A I T. A Review of the Evolution of Vision-Based Motion Analysis and the Integration of Advanced Computer Vision Methods towards Developing a Markerless System. Sports Medicine – Open, 2018, 4: 24, https://doi.org/10.1186/s40798-018-0139-y.
MORO M, MARCHESI G, HESSE F, et al. Markerless vs. Marker-Based Gait Analysis: A Proof of Concept Study. Sensors, 2022, 22(5): 2011, https://doi.org/10.3390/s22052011.
SZCZĘSNA A, BŁASZCZYSZYN M, and PAWLYTA M. Optical motion capture dataset of selected techniques in beginner and advanced Kyokushin karate athletes. Scientific Data, 2021, 8(1): 2-8, https://doi.org/10.1038/s41597-021-00801-5.
GHADI Y, AKHTER I, ALARFAJ M, et al. Syntactic model-based human body 3D reconstruction and event classification via association based features mining and deep learning. PeerJ Computer Science, 2021, 7: 1-36, https://doi.org/10.7717/PEERJ-CS.764.
ECHEVERRIA J, and SANTOS O C. Toward modeling psychomotor performance in karate combats using computer vision pose estimation. Sensors, 2021, 21(24): 1-27, https://doi.org/10.3390/s21248378.
CAMOMILLA V, BERGAMINI E, FANTOZZI S, and VANNOZZI G. Trends supporting the in-field use of wearable inertial sensors for sport performance evaluation: A systematic review. Sensors (Switzerland), 2018, 18(3): 873, https://doi.org/10.3390/s18030873.
YUAN Q, ASADI E, LU Q, et al. Uncertainty-Based IMU Orientation Tracking Algorithm for Dynamic Motions. IEEE/ASME Transactions on Mechatronics, 2019, 24(2): 872-882, https://doi.org/10.1109/TMECH.2019.2892069.
KOK M, HOL J D, and SCHÖN T B. Using inertial sensors for position and orientation estimation. Foundations and Trends in Signal Processing, 2017, 11(1-2): 1-153, https://doi.org/10.1561/2000000094.
AHUJA K, JIANG Y, GOEL M, and HARRISON C. Vid2Doppler: Synthesizing Doppler radar data from videos for training privacy-preserving activity recognition. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 2021, https://doi.org/10.1145/3411764.3445138.
SONG Y, JIN T, DAI Y, et al. Through-wall human pose reconstruction via UWB mimo radar and 3D CNN. Remote Sensing (Basel), 2021, 13(2): 1-22, https://doi.org/10.3390/rs13020241.
ELAOUD A, BARHOUMI W, DRIRA H, and ZAGROUBA E. Weighted linear combination of distances within two manifolds for 3D human action recognition. VISIGRAPP 2019 - Proceedings of the 14th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications, 2019, 5: 693-703, https://doi.org/10.5220/0007369006930703.