GazeSphere: Navigating 360-degree-video environments in VR using head rotation and eye gaze

Yun Suen Pai, Benjamin I. Outram, Benjamin Tag, Megumi Isogai, Daisuke Ochi, Kai Kunze

Research output: Conference contribution

14 Citations (Scopus)

Abstract

Viewing 360-degree images and videos through head-mounted displays (HMDs) currently lacks a compelling interface to transition between them. We propose GazeSphere, a navigation system that provides a seamless transition between 360-degree-video environment locations through the use of orbit-like motion, via head rotation and eye gaze tracking. The significance of this approach is threefold: 1) It allows navigation and transition through spatially continuous 360-degree-video environments, 2) It leverages the human proprioceptive sense of rotation for locomotion that is intuitive and negates motion sickness, and 3) It uses eye tracking for a completely seamless, hands-free, and unobtrusive interaction. The proposed method uses an orbital motion technique for navigation in virtual space, which we demonstrate in applications such as navigation and interaction in computer-aided design (CAD), data visualization, game mechanics, and virtual tours.
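To make the orbital-motion idea concrete, the sketch below illustrates one plausible reading of the technique: the gaze-fixated location defines a pivot point, and the change in head yaw orbits the viewpoint around that pivot. This is a minimal illustration under assumed conventions (function name, world-up axis, and parameters are not taken from the paper), not the authors' implementation.

```python
import numpy as np

def orbit_viewpoint(camera_pos, pivot, head_yaw_delta):
    """Rotate the camera position around a gaze-selected pivot point
    about the vertical axis by the change in head yaw (radians).

    Illustrative sketch only; names and conventions are assumptions,
    not GazeSphere's actual implementation.
    """
    c, s = np.cos(head_yaw_delta), np.sin(head_yaw_delta)
    # Rotation about the world up (y) axis, centred on the pivot.
    rot = np.array([[  c, 0.0,   s],
                    [0.0, 1.0, 0.0],
                    [ -s, 0.0,   c]])
    return pivot + rot @ (camera_pos - pivot)

# Example: a 90-degree head yaw orbits the viewer around a pivot
# located two metres in front of the starting position.
new_pos = orbit_viewpoint(np.array([0.0, 1.6, 0.0]),
                          np.array([0.0, 1.6, 2.0]),
                          np.pi / 2)
print(new_pos)
```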

Original language: English
Host publication title: ACM SIGGRAPH 2017 Posters, SIGGRAPH 2017
Publisher: Association for Computing Machinery, Inc
ISBN (electronic): 9781450350150
DOI
Publication status: Published - 30 Jul 2017
Event: 44th International Conference on Computer Graphics and Interactive Techniques, ACM SIGGRAPH 2017 - Los Angeles, United States
Duration: 30 Jul 2017 - 3 Aug 2017

Publication series

Name: ACM SIGGRAPH 2017 Posters, SIGGRAPH 2017

Other

Other: 44th International Conference on Computer Graphics and Interactive Techniques, ACM SIGGRAPH 2017
Country/Territory: United States
City: Los Angeles
Period: 17/7/30 - 17/8/3

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
  • Software
  • Computer Graphics and Computer-Aided Design
