TY - JOUR
T1 - Using the virtual data-driven measurement to support the prototyping of hand gesture recognition interface with distance sensor
AU - Xia, Chengshuo
AU - Saito, Ayane
AU - Sugiura, Yuta
N1 - Funding Information:
This work was supported by Japan Science and Technology Agency (JST) PRESTO Grant number JPMJPR2134. We would like to thank Kosuke Kikui for support with the transfer learning design.
Publisher Copyright:
© 2022 The Authors
PY - 2022/5/1
Y1 - 2022/5/1
N2 - The performance of a fixed-sensor-based hand gesture recognition system is typically influenced by the position and number of sensors. The traditional development approach for hand gesture recognition systems follows a process of sensor pre-deployment, data collection, and model training, which makes it time-consuming and expensive to determine how many sensors to place and where, and leaves the system with little flexibility for further development. In this paper, we present a new development flow to assist in prototyping distance sensor-based gesture recognition interfaces. The designed system simulates the position and number of sensors used to recognize gestures. Using reconstructed hand motion, the virtual distance sensors generate simulated signals that are used to train a convolutional neural network model. In a real-world setting, the sensor system only needs to be adapted by transfer learning to recognize gestures with the same sensor layout. The proposed method indicates the sensor configuration and provides a classifier trained on virtual distance data, which effectively reduces the development cost. We evaluated two prototype interfaces built with distance sensors and demonstrated that the system provides useful deployment recommendations and that models trained on virtual measurement data can recognize real gestures.
AB - The performance of a fixed-sensor-based hand gesture recognition system is typically influenced by the position and number of sensors. The traditional development approach for hand gesture recognition systems follows a process of sensor pre-deployment, data collection, and model training, which makes it time-consuming and expensive to determine how many sensors to place and where, and leaves the system with little flexibility for further development. In this paper, we present a new development flow to assist in prototyping distance sensor-based gesture recognition interfaces. The designed system simulates the position and number of sensors used to recognize gestures. Using reconstructed hand motion, the virtual distance sensors generate simulated signals that are used to train a convolutional neural network model. In a real-world setting, the sensor system only needs to be adapted by transfer learning to recognize gestures with the same sensor layout. The proposed method indicates the sensor configuration and provides a classifier trained on virtual distance data, which effectively reduces the development cost. We evaluated two prototype interfaces built with distance sensors and demonstrated that the system provides useful deployment recommendations and that models trained on virtual measurement data can recognize real gestures.
KW - Hand gesture recognition
KW - Human-computer interface
KW - Infrared distance sensor
KW - Interface prototyping
KW - Virtual sensor
UR - http://www.scopus.com/inward/record.url?scp=85125494049&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85125494049&partnerID=8YFLogxK
U2 - 10.1016/j.sna.2022.113463
DO - 10.1016/j.sna.2022.113463
M3 - Article
AN - SCOPUS:85125494049
SN - 0924-4247
VL - 338
JO - Sensors and Actuators A: Physical
JF - Sensors and Actuators A: Physical
M1 - 113463
ER -