Yasuyuki Yanagida*1, Shintaro Saito*2, Seiichiro Yano,
Taro Maeda, and Susumu Tachi
School of Engineering, The University of Tokyo
*1 currently at ATR Media Information Science Laboratories
*2 currently at IBM Japan
Proceedings of the ICAT 2001 (The 11th International Conference on Artificial Reality and Telexistence), Tokyo, Japan, pp. 42-47, December 2001.
Abstract
Visual display systems using fixed screens, including Immersive Projection Technology (IPT) displays, have the merit of providing a world that remains stable against the user's head motion. In spite of this merit, such display systems have not been used for "telexistence in real worlds", which requires an accurate stereoscopic view of live video images. We have proposed a method to realize a live-video-based, real-time telexistence visual system with a fixed screen: keep the orientation of the camera constant while following the user's eye position, and control the position and size of the video image for each eye on the screen in real time. We have also designed the technical elements that compose the system, i.e., a constant-orientation camera system and a real-time 2D image manipulation (shifting and resizing) subsystem. In this paper, we describe the design and implementation of the entire system that realizes fixed-screen-based telexistence, which inherently has the ability to show a stable remote world.
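As a rough illustration of the image-placement idea summarized above (not the authors' implementation), the following Python sketch computes where, and at what size, the camera image would be drawn on a fixed screen for a given eye position. It assumes the camera axis is held perpendicular to the screen plane and a pinhole camera with a known field of view; all names and values are illustrative.

import math

def image_placement_on_screen(eye_pos, half_fov_h, half_fov_v):
    # Compute the center and size of the camera image on a fixed screen so
    # that an eye at eye_pos sees the remote scene at the correct angular size.
    # Assumptions (illustrative only): the screen lies in the z = 0 plane, the
    # camera axis is held perpendicular to the screen, and the camera is a
    # pinhole with half fields of view half_fov_h, half_fov_v (radians).
    x, y, d = eye_pos  # d > 0 is the eye-to-screen distance
    # The fixed-orientation camera axis passes through the point on the screen
    # directly in front of the eye, so the image is centered there.
    center = (x, y)
    # The image is scaled so its angular extent at the eye equals the camera FOV.
    width = 2.0 * d * math.tan(half_fov_h)
    height = 2.0 * d * math.tan(half_fov_v)
    return center, width, height

# Example: eye 0.8 m from the screen, 0.1 m right of its center,
# camera FOV of 60 x 45 degrees; repeated per eye for stereoscopic viewing.
center, w, h = image_placement_on_screen((0.1, 0.0, 0.8),
                                         math.radians(30.0), math.radians(22.5))
print(center, w, h)

In the system described in this paper, such a computation would be driven by head-tracking measurements for each eye at video rate; the sketch only captures the underlying geometry of shifting and resizing the on-screen image.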