TY - GEN
T1 - Dynamic Motion Tracking Based on Point Cloud Matching with Personalized Body Segmentation
AU - Ono, Tomoko
AU - Eguchi, Ryo
AU - Takahashi, Masaki
N1 - Funding Information:
This work was supported by JSPS KAKENHI Grant Number JP16H04290.
Publisher Copyright:
© 2020 IEEE.
PY - 2020/11
Y1 - 2020/11
N2 - Falls are a serious problem among the growing elderly population, and clinical institutions have therefore implemented motor function assessment programs. In particular, the Timed Up and Go (TUG) test is the clinical test most frequently applied in clinical institutions and communities to assess walking ability in the elderly. In this study, we propose a gait measurement system that can evaluate motor function in dynamic gait tests, such as the TUG test, using point clouds from depth sensors (Kinect). The TUG test is a dynamic task that includes 3 m of walking and turning motions, which makes it difficult to estimate joint positions with conventional methods based on the Kinect skeleton function or point clouds. To solve this problem, we propose a method that moves the segment model to a pre-estimated position before applying the iterative closest point algorithm and then performs matching. In accuracy verification experiments with several young participants, the average error of each joint position was less than approximately 0.03 m, and the average error of the knee angle was approximately 4.54 to 5.13 degrees. These results indicate that the values estimated by the proposed method are useful for evaluating clinical tasks.
AB - Falls are a serious problem among the growing elderly population, and clinical institutions have therefore implemented motor function assessment programs. In particular, the Timed Up and Go (TUG) test is the clinical test most frequently applied in clinical institutions and communities to assess walking ability in the elderly. In this study, we propose a gait measurement system that can evaluate motor function in dynamic gait tests, such as the TUG test, using point clouds from depth sensors (Kinect). The TUG test is a dynamic task that includes 3 m of walking and turning motions, which makes it difficult to estimate joint positions with conventional methods based on the Kinect skeleton function or point clouds. To solve this problem, we propose a method that moves the segment model to a pre-estimated position before applying the iterative closest point algorithm and then performs matching. In accuracy verification experiments with several young participants, the average error of each joint position was less than approximately 0.03 m, and the average error of the knee angle was approximately 4.54 to 5.13 degrees. These results indicate that the values estimated by the proposed method are useful for evaluating clinical tasks.
UR - http://www.scopus.com/inward/record.url?scp=85095602129&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85095602129&partnerID=8YFLogxK
U2 - 10.1109/BioRob49111.2020.9224438
DO - 10.1109/BioRob49111.2020.9224438
M3 - Conference contribution
AN - SCOPUS:85095602129
T3 - Proceedings of the IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics
SP - 61
EP - 67
BT - 2020 8th IEEE RAS/EMBS International Conference for Biomedical Robotics and Biomechatronics, BioRob 2020
PB - IEEE Computer Society
T2 - 8th IEEE RAS/EMBS International Conference for Biomedical Robotics and Biomechatronics, BioRob 2020
Y2 - 29 November 2020 through 1 December 2020
ER -