Aligning the real space with the model on see-through typed HMD for mixed reality

Taiki Fuji, Yasue Mitsukura, Takanari Tanabata, Nobutaka Kimura, Toshio Moriya

Research output: Contribution to conference › Paper › peer-review

Abstract

In this paper, we propose an approach for aligning the real space with a three-dimensional (3D) model in an outdoor wearable mixed reality (MR) system. Our approach uses a monocular see-through type head-mounted display (ST-HMD) and a virtual reality (VR) sensor that measures the viewer's pose in six degrees of freedom (6DOF). In the default setting, manipulating the 3D model with an air mouse is difficult and burdensome for the user, so we reduce this burden by automating the default setting. Moreover, when the user changes viewpoint, the computer graphics (CG) model shown on the ST-HMD must be updated so that it remains consistent with the real objects seen through the display. We obtain translation and rotation data from the VR sensor and apply them to the CG model, and on this basis we construct the alignment system. Finally, to evaluate the proposed alignment of the 3D model, we present results obtained with the wearable alignment system.
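The abstract describes obtaining translation and rotation data from the VR sensor and applying them to the CG model as the viewpoint changes. The following is a minimal sketch of that step only, not the authors' implementation: the sensor interface, the Euler-angle convention, and the initial_alignment transform (standing in for the automated default setting) are all assumptions introduced for illustration.

```python
# Minimal sketch (assumed, not the paper's code): re-express a CG model in the
# current viewer frame using a 6DOF pose reported by a head-tracking sensor.
import numpy as np

def rotation_from_euler(yaw, pitch, roll):
    """Build a 3x3 rotation matrix from yaw (Z), pitch (Y), roll (X) in radians."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

def pose_to_matrix(translation, euler_angles):
    """Pack a 6DOF pose (translation + rotation) into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = rotation_from_euler(*euler_angles)
    T[:3, 3] = translation
    return T

def update_model_vertices(vertices, sensor_pose, initial_alignment):
    """Transform model vertices into the current viewer frame.

    vertices          : (N, 3) model points in world coordinates
    sensor_pose       : 4x4 world-to-head transform from the VR sensor
    initial_alignment : 4x4 transform found during the (automated) default setting
    """
    model_to_view = np.linalg.inv(sensor_pose) @ initial_alignment
    homo = np.hstack([vertices, np.ones((len(vertices), 1))])
    return (model_to_view @ homo.T).T[:, :3]

# Example: the viewer translates 0.5 m along x and yaws 10 degrees.
pose = pose_to_matrix([0.5, 0.0, 0.0], np.radians([10.0, 0.0, 0.0]))
verts = np.array([[0.0, 0.0, 2.0], [1.0, 0.0, 2.0]])
print(update_model_vertices(verts, pose, np.eye(4)))
```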

Original language: English
Pages: 4233-4238
Number of pages: 6
DOIs
Publication status: Published - 2009 Dec 1
Externally published: Yes
Event: 35th Annual Conference of the IEEE Industrial Electronics Society, IECON 2009 - Porto, Portugal
Duration: 2009 Nov 3 - 2009 Nov 5

Other

Other: 35th Annual Conference of the IEEE Industrial Electronics Society, IECON 2009
Country/Territory: Portugal
City: Porto
Period: 09/11/3 - 09/11/5

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Electrical and Electronic Engineering
