Abstract
To realize a vision system for an autonomous mobile robot operating in a human living environment, the robot must observe human behavior and react to it. In this paper, we propose a real-time, vision-based human tracking method for an autonomous mobile robot. First, the system detects body parts as moving areas in the scene, and a face region, or another region specific to a human, is extracted within the detected area using color. Next, facial and head gestures are recognized. We implement the vision system on a mobile robot and show experimentally that it can detect and track a human and his face in real time.
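The abstract outlines a two-stage pipeline: motion detection to find body parts, then color-based extraction of a face region inside each moving area. The sketch below illustrates that idea under explicit assumptions; the paper does not specify the algorithms, so frame differencing, an HSV skin-color range, and all threshold values here are illustrative choices, not the authors' implementation.

```python
# Hypothetical sketch of the pipeline described in the abstract:
# (1) frame differencing to find moving regions, (2) skin-color
# thresholding inside each region to locate candidate face areas.
# The HSV skin range and area/intensity thresholds are assumed values.
import cv2
import numpy as np

SKIN_LOW = np.array([0, 40, 60], dtype=np.uint8)     # assumed HSV lower bound
SKIN_HIGH = np.array([25, 180, 255], dtype=np.uint8)  # assumed HSV upper bound

def moving_regions(prev_gray, gray, min_area=500):
    """Return bounding boxes of areas that changed between two grayscale frames."""
    diff = cv2.absdiff(prev_gray, gray)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    mask = cv2.dilate(mask, None, iterations=2)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]

def face_candidates(frame_bgr, boxes, min_area=200):
    """Within each moving box, keep sub-regions whose color falls in the skin range."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    faces = []
    for (x, y, w, h) in boxes:
        roi = hsv[y:y + h, x:x + w]
        skin = cv2.inRange(roi, SKIN_LOW, SKIN_HIGH)
        cnts, _ = cv2.findContours(skin, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        for c in cnts:
            if cv2.contourArea(c) >= min_area:
                fx, fy, fw, fh = cv2.boundingRect(c)
                faces.append((x + fx, y + fy, fw, fh))  # face box in full-frame coords
    return faces
```

In a tracking loop, `moving_regions` would be called on consecutive frames and `face_candidates` on the current color frame; the resulting face box could then drive the gesture-recognition stage mentioned in the abstract.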
Original language | English |
---|---|
Pages | 442-449 |
Number of pages | 8 |
Publication status | Published - 2001 Dec 1 |
Externally published | Yes |
Event | 10th IEEE International Workshop on Robot and Human Communication, Bordeaux-Paris, France. Duration: 2001 Sept 18 → 2001 Sept 21 |
Other
Other | 10th IEEE International Workshop on Robot and Human Communication |
---|---|
Country/Territory | France |
City | Bordeaux-Paris |
Period | 01/9/18 → 01/9/21 |
ASJC Scopus subject areas
- Hardware and Architecture
- Software