Real-time vision system for autonomous mobile robot

Masataka Doi, Manabu Nakakita, Yoshimitsu Aoki, Shuji Hashimoto

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Citations (Scopus)

Abstract

In order to realize a vision system for an autonomous mobile robot operating in a human living environment, it is necessary to observe human behaviors and react to those actions. In this paper, we propose a real-time human tracking method based on a vision system for an autonomous mobile robot. First, the system detects body parts as moving areas in the scene, and a face region or other human-specific region is extracted from the detected area using color information. Next, facial and head gestures are recognized. We implement the vision system on a mobile robot and experimentally show that the system can detect and track a human and their face in real time.
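
The abstract outlines a two-stage pipeline: moving-area detection followed by color-based face extraction within the moving region. The paper's exact motion and color models are not reproduced on this page, so the following is only a minimal sketch of that idea, assuming OpenCV frame differencing and a hypothetical HSV skin-color threshold; the function name, thresholds, and webcam source are illustrative assumptions, not the authors' implementation.

import cv2
import numpy as np

# Hypothetical skin-color range in HSV; the paper's actual color model is not given here.
SKIN_LOWER = np.array([0, 40, 60], dtype=np.uint8)
SKIN_UPPER = np.array([25, 180, 255], dtype=np.uint8)

def detect_moving_face(prev_gray, frame):
    """Return (gray, face_box) for one frame.

    Step 1: detect moving areas by frame differencing.
    Step 2: within those areas, extract a skin-colored (face-like) region.
    The returned grayscale image is reused as the previous frame next iteration.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(prev_gray, gray)
    _, moving_mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    moving_mask = cv2.dilate(moving_mask, None, iterations=2)

    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    skin_mask = cv2.inRange(hsv, SKIN_LOWER, SKIN_UPPER)
    candidate = cv2.bitwise_and(skin_mask, moving_mask)

    contours, _ = cv2.findContours(candidate, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    face_box = None
    if contours:
        largest = max(contours, key=cv2.contourArea)
        if cv2.contourArea(largest) > 500:  # ignore small noise blobs
            face_box = cv2.boundingRect(largest)
    return gray, face_box

cap = cv2.VideoCapture(0)
ok, first = cap.read()
prev_gray = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    prev_gray, face_box = detect_moving_face(prev_gray, frame)
    if face_box is not None:
        x, y, w, h = face_box
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()

The sketch stops at face localization; the gesture-recognition stage described in the abstract would consume the tracked face region and is not modeled here.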

Original language: English
Title of host publication: Proceedings - 10th IEEE International Workshop on Robot and Human Interactive Communication, ROMAN 2001
Pages: 442-449
Number of pages: 8
DOIs
Publication status: Published - 2001
Externally published: Yes
Event: 10th IEEE International Workshop on Robot and Human Interactive Communication, ROMAN 2001 - Bordeaux and Paris, France
Duration: 2001 Sept 18 - 2001 Sept 21

Publication series

Name: Proceedings - IEEE International Workshop on Robot and Human Interactive Communication

Other

Other: 10th IEEE International Workshop on Robot and Human Interactive Communication, ROMAN 2001
Country/Territory: France
City: Bordeaux and Paris
Period: 01/9/18 - 01/9/21

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
  • Human-Computer Interaction
