User-defined gestures for controlling primitive motions of an end effector

Mahisorn Wongphati, Hirotaka Osawa, Michita Imai

Research output: Contribution to journal › Article › peer-review

4 Citations (Scopus)


In designing and developing a gesture recognition system, it is crucial to know the characteristics of the gestures selected to control, for example, the end effector of a robot arm. We conducted an experiment to collect a set of user-defined gestures and to investigate their characteristics for controlling primitive motions of an end effector in human-robot collaboration. We recorded 152 gestures from 19 volunteers by presenting virtual robotic arm movements to the participants and then asking them to think about and perform gestures that would cause those motions. We found that the hands were the parts of the body used most often for gesture articulation, even when the participants were holding tools and objects with both hands. In addition, a number of participants used one- and two-handed gestures interchangeably; gestures were performed consistently across all pairs of reversible motions; and participants expected better recognition performance for gestures that were easy to think of and perform. These findings are expected to serve as guidelines for creating gesture sets for controlling robotic arms in accordance with natural user behaviors.

Original language: English
Pages (from-to): 225-238
Number of pages: 14
Journal: Advanced Robotics
Issue number: 4
Publication status: Published - 2015 Feb 16

Keywords

  • end effector
  • helping hand robot
  • human-robot interaction
  • manual control
  • user-defined gesture

ASJC Scopus subject areas

  • Software
  • Human-Computer Interaction
  • Control and Systems Engineering
  • Hardware and Architecture
  • Computer Science Applications
