TY - GEN
T1 - Software to Support Layout and Data Collection for Machine-Learning-Based Real-World Sensors
AU - Saito, Ayane
AU - Kawai, Wataru
AU - Sugiura, Yuta
N1 - Funding Information:
This work was supported by JST AIP-PRISM JPMJCR18Y2 and JST PRESTO JPMJPR17J4.
Publisher Copyright:
© Springer Nature Switzerland AG 2019.
PY - 2019
Y1 - 2019
N2 - There have been many studies of gesture recognition and posture estimation that combine real-world sensors and machine learning. In such work, it is important to consider the sensor layout, because the measurement results vary depending on the layout and number of sensors as well as the motion to be measured. However, prototyping devices multiple times in order to find a sensor layout with high identification accuracy takes time and effort. Moreover, although training data must be acquired to recognize gestures, collecting such data is time-consuming whenever the user changes the sensor layout. In this study, we developed software for arranging real-world sensors. Currently, the software handles distance-measuring sensors as real-world sensors. The user places these sensors freely in the software, which measures the distance between the sensors and a mesh created from measurements of real-world deformation recorded by a Kinect. A classifier is generated using the time-series distance data recorded by the software. In addition, we created a physical device with the same sensor layout as the one designed in the software. We experimentally confirmed that the generated classifier could recognize gestures performed on the physical device.
AB - There have been many studies of gesture recognition and posture estimation that combine real-world sensors and machine learning. In such work, it is important to consider the sensor layout, because the measurement results vary depending on the layout and number of sensors as well as the motion to be measured. However, prototyping devices multiple times in order to find a sensor layout with high identification accuracy takes time and effort. Moreover, although training data must be acquired to recognize gestures, collecting such data is time-consuming whenever the user changes the sensor layout. In this study, we developed software for arranging real-world sensors. Currently, the software handles distance-measuring sensors as real-world sensors. The user places these sensors freely in the software, which measures the distance between the sensors and a mesh created from measurements of real-world deformation recorded by a Kinect. A classifier is generated using the time-series distance data recorded by the software. In addition, we created a physical device with the same sensor layout as the one designed in the software. We experimentally confirmed that the generated classifier could recognize gestures performed on the physical device.
KW - Distance-measuring sensor
KW - Machine learning
KW - Sensor layout
UR - http://www.scopus.com/inward/record.url?scp=85069731452&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85069731452&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-23528-4_28
DO - 10.1007/978-3-030-23528-4_28
M3 - Conference contribution
AN - SCOPUS:85069731452
SN - 9783030235277
T3 - Communications in Computer and Information Science
SP - 198
EP - 205
BT - HCI International 2019 - Posters - 21st International Conference, HCII 2019, Proceedings
A2 - Stephanidis, Constantine
PB - Springer Verlag
T2 - 21st International Conference on Human-Computer Interaction, HCI International 2019
Y2 - 26 July 2019 through 31 July 2019
ER -