TY - GEN
T1 - Multi-touch steering wheel for in-car tertiary applications using infrared sensors
AU - Koyama, Shunsuke
AU - Sugiura, Yuta
AU - Ogata, Masa
AU - Withana, Anusha
AU - Uema, Yuji
AU - Honda, Makoto
AU - Yoshizu, Sayaka
AU - Sannomiya, Chihiro
AU - Nawa, Kazunari
AU - Inami, Masahiko
N1 - Copyright:
Copyright 2014 Elsevier B.V., All rights reserved.
PY - 2014
Y1 - 2014
N2 - This paper proposes a multi-touch steering wheel for in-car tertiary applications. Existing interfaces for in-car applications such as buttons and touch displays have several operating problems. For example, drivers have to consciously move their hands to the interfaces, as the interfaces are fixed at specific positions. Therefore, we developed a steering wheel where touch positions can correspond to different operating positions. This system can recognize hand gestures at any position on the steering wheel by utilizing 120 infrared (IR) sensors embedded in it. The sensors are lined up in an array surrounding the whole wheel. A Support Vector Machine (SVM) algorithm is used to learn and recognize the different gestures from the data obtained from the sensors. The gestures recognized are flick, click, tap, stroke, and twist. Additionally, we implemented a navigation application and an audio application that utilize the torus shape of the steering wheel. We conducted an experiment to evaluate the ability of our proposed system to recognize flick gestures at three positions. Results show that an average of 92% of flick gestures were recognized.
AB - This paper proposes a multi-touch steering wheel for in-car tertiary applications. Existing interfaces for in-car applications such as buttons and touch displays have several operating problems. For example, drivers have to consciously move their hands to the interfaces, as the interfaces are fixed at specific positions. Therefore, we developed a steering wheel where touch positions can correspond to different operating positions. This system can recognize hand gestures at any position on the steering wheel by utilizing 120 infrared (IR) sensors embedded in it. The sensors are lined up in an array surrounding the whole wheel. A Support Vector Machine (SVM) algorithm is used to learn and recognize the different gestures from the data obtained from the sensors. The gestures recognized are flick, click, tap, stroke, and twist. Additionally, we implemented a navigation application and an audio application that utilize the torus shape of the steering wheel. We conducted an experiment to evaluate the ability of our proposed system to recognize flick gestures at three positions. Results show that an average of 92% of flick gestures were recognized.
KW - Automobile
KW - Gesture Recognition
KW - Infrared Sensor
KW - Interaction Design
KW - Multi-touch
KW - Torus Interface
UR - http://www.scopus.com/inward/record.url?scp=84899785830&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84899785830&partnerID=8YFLogxK
U2 - 10.1145/2582051.2582056
DO - 10.1145/2582051.2582056
M3 - Conference contribution
AN - SCOPUS:84899785830
SN - 9781450327619
T3 - ACM International Conference Proceeding Series
BT - Proceedings of the 5th Augmented Human International Conference, AH 2014
PB - Association for Computing Machinery
T2 - 5th Augmented Human International Conference, AH 2014
Y2 - 7 March 2014 through 8 March 2014
ER -