3DTV view generation using uncalibrated pure rotating and zooming cameras

Songkran Jarusirisawad, Hideo Saito

Research output: Contribution to journal › Article › peer-review

4 Citations (Scopus)

Abstract

This paper proposes a novel method for synthesizing free viewpoint video captured by uncalibrated pure rotating and zooming cameras. Neither intrinsic nor extrinsic parameters of our cameras are known. Projective grid space (PGS), which is the 3D space defined by the epipolar geometry of two basis cameras, is employed for weak camera calibration. Trifocal tensors are used to relate non-basis cameras to PGS. Given trifocal tensors in the initial frame, our method automatically computes trifocal tensors in the other frames. Scale invariant feature transform (SIFT) is used for finding corresponding points in a natural scene between the initial frame and the other frames. Finally, free viewpoint video is synthesized based on the reconstructed visual hull. In the experimental results, free viewpoint video captured by uncalibrated hand-held cameras is successfully synthesized using the proposed method.
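The abstract outlines the correspondence step only at a high level, so the following is a minimal sketch (not the authors' implementation) of how SIFT matches between the initial frame and a later frame could be found and used to relate the two frames. It assumes OpenCV's SIFT and RANSAC homography estimation; for a purely rotating and zooming camera the inter-frame mapping is a 3x3 homography, whereas the paper itself propagates trifocal tensors through projective grid space, which this sketch does not reproduce.

    # Sketch only: SIFT correspondences between the initial frame and a later
    # frame, plus a RANSAC homography. The paper's pipeline instead updates
    # trifocal tensors relating each camera to the projective grid space.
    import cv2
    import numpy as np

    def match_to_initial_frame(initial_gray, current_gray, ratio=0.75):
        """Return matched point arrays (N x 2) between the two frames."""
        sift = cv2.SIFT_create()
        kp0, des0 = sift.detectAndCompute(initial_gray, None)
        kp1, des1 = sift.detectAndCompute(current_gray, None)

        matcher = cv2.BFMatcher(cv2.NORM_L2)
        knn = matcher.knnMatch(des0, des1, k=2)

        # Lowe's ratio test keeps only distinctive matches.
        good = [m for m, n in knn if m.distance < ratio * n.distance]
        pts0 = np.float32([kp0[m.queryIdx].pt for m in good])
        pts1 = np.float32([kp1[m.trainIdx].pt for m in good])
        return pts0, pts1

    def homography_to_initial_frame(pts0, pts1):
        """Pure rotation/zoom means the frames are related by a homography."""
        H, inliers = cv2.findHomography(pts1, pts0, cv2.RANSAC, 3.0)
        return H, inliers

The function and variable names above are illustrative; in the paper the recovered correspondences serve to recompute the trifocal tensors for each new frame automatically, given only the tensors of the initial frame.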

Original language: English
Pages (from-to): 17-30
Number of pages: 14
Journal: Signal Processing: Image Communication
Volume: 24
Issue number: 1-2
DOIs
Publication status: Published - 2009 Jan

Keywords

  • Free viewpoint video
  • Projective grid space
  • Trifocal tensor
  • View interpolation

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Electrical and Electronic Engineering
