Extraction of Parameters for 90-degree Turn Prediction Using the IMU-based Motion Capture System

Ami Ogawa, Kanako Takeda, Akira Mita

Abstract. In response to the increasing number of single-person households, we have been proposing the “Biofied Building,” which provides a safe, secure, and comfortable living space for the resident by means of a small home robot. The robot can sense the resident’s position and behavior in real time. However, further uses of the robot, such as choosing a path that does not disturb the resident, require a phase that predicts the resident’s behavior. Walking, one of the most basic activities of daily living, is a frequent target of motion-prediction studies. Most of these, however, deal with steady walking, even though walking in daily life also includes unsteady motions such as turning. The purpose of this study was therefore to extract prediction parameters for constructing a prediction method for the unsteady 90-degree turn. We explored effective prediction parameters for 90-degree turns from data measured with an inertial measurement unit (IMU) based motion capture system, aiming to introduce the prediction of unsteady walking into the “Biofied Building”.

Motion Prediction, 90-degree Turn, IMU, Motion Capture System

Published online 2/20/2021, 8 pages
Copyright © 2021 by the author(s)
Published under license by Materials Research Forum LLC., Millersville PA, USA

Citation: Ami Ogawa, Kanako Takeda, Akira Mita, Extraction of Parameters for 90-degree Turn Prediction Using the IMU-based Motion Capture System, Materials Research Proceedings, Vol. 18, pp 241-248, 2021

DOI: https://doi.org/10.21741/9781644901311-29

The article was published as article 29 of the book Structural Health Monitoring

Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
