TY - GEN
T1 - Experimental verification for motion control of a powered wheelchair using a gazing feature in an environment
AU - Ishizuka, Airi
AU - Yorozu, Ayanori
AU - Takahashi, Masaki
N1 - Funding Information:
This study has been supported by "A Framework PRINTEPS to Develop Practical Artificial Intelligence" of the Core Research for Evolutional Science and Technology (CREST) of the Japan Science and Technology Agency (JST).
Publisher Copyright:
© 2016 ACM.
PY - 2016/12/7
Y1 - 2016/12/7
N2 - This paper describes a motion control system for a powered wheelchair that uses the passenger's gaze in an unknown environment. Recently, new Human-Computer Interfaces (HCIs) that replace joysticks have been developed for people with upper-body disabilities. In this paper, eye movement is used as the HCI. The proposed wheelchair control system aims to let a passenger operate the wheelchair simply by gazing in the direction he or she wants to move in the unknown environment. The gazing feature of the passenger in the 3D environment is acquired in real time, and the wheelchair is controlled accordingly. The environmental features include entrances to passage areas, and the gazing feature is obtained by combining these features with the passenger's gazing point. The acquired information about the direction in which the passenger wants to move serves as the operation input to the wheelchair, which is then controlled using this input together with information about the environment. The conventional motion control system achieves safe and smooth movement by avoiding obstacles. The effectiveness of the proposed system is demonstrated through experiments in a real environment with three participants.
AB - This paper describes a motion control system for a powered wheelchair that uses the passenger's gaze in an unknown environment. Recently, new Human-Computer Interfaces (HCIs) that replace joysticks have been developed for people with upper-body disabilities. In this paper, eye movement is used as the HCI. The proposed wheelchair control system aims to let a passenger operate the wheelchair simply by gazing in the direction he or she wants to move in the unknown environment. The gazing feature of the passenger in the 3D environment is acquired in real time, and the wheelchair is controlled accordingly. The environmental features include entrances to passage areas, and the gazing feature is obtained by combining these features with the passenger's gazing point. The acquired information about the direction in which the passenger wants to move serves as the operation input to the wheelchair, which is then controlled using this input together with information about the environment. The conventional motion control system achieves safe and smooth movement by avoiding obstacles. The effectiveness of the proposed system is demonstrated through experiments in a real environment with three participants.
KW - Control System
KW - Eye Gaze Tracking
KW - Human-Computer Interface
KW - Powered Wheelchair
UR - http://www.scopus.com/inward/record.url?scp=85016401010&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85016401010&partnerID=8YFLogxK
U2 - 10.1145/3029610.3029614
DO - 10.1145/3029610.3029614
M3 - Conference contribution
AN - SCOPUS:85016401010
T3 - ACM International Conference Proceeding Series
SP - 147
EP - 151
BT - Proceedings of the 4th International Conference on Control, Mechatronics and Automation, ICCMA 2016
PB - Association for Computing Machinery
T2 - 4th International Conference on Control, Mechatronics and Automation, ICCMA 2016
Y2 - 7 December 2016 through 11 December 2016
ER -