Real-time Virtual View Synthesis using Virtual Viewpoint Disparity Estimation and Convergence Check 


Vol. 37, No. 1, pp. 57-63, Jan. 2012


Abstract

In this paper, we propose a real-time view interpolation method using virtual viewpoint disparity estimation and a convergence check. For real-time processing, we estimate a disparity map at the virtual viewpoint directly from the stereo images using the belief propagation method. This approach requires only one disparity map, whereas conventional methods need two. In the view synthesis step, we warp pixels from the reference images to the virtual viewpoint image using the disparity map at the virtual viewpoint. For further acceleration, we exploit high-speed GPU parallel programming with CUDA. As a result, we can interpolate virtual viewpoint images in real time.
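The warping step described in the abstract can be illustrated with a small sketch. The code below is a hypothetical, simplified CPU version (the paper's actual implementation runs as CUDA kernels): given a disparity map already estimated at the virtual viewpoint, each virtual pixel fetches its correspondences in the left and right reference images and blends them; pixels visible in only one view fall back to that view. The function name, the blending weights, and the rounding policy are illustrative assumptions, not the authors' exact formulation.

```python
def synthesize_virtual_view(left, right, disparity, alpha=0.5):
    """Warp pixels from two reference images into a virtual view.

    left, right: greyscale reference images as 2-D lists of values.
    disparity:   disparity map estimated AT the virtual viewpoint
                 (the key point of the paper: only this one map is needed).
    alpha:       position of the virtual camera between left (0) and right (1).
    Note: illustrative sketch, not the paper's CUDA implementation.
    """
    h, w = len(left), len(left[0])
    virtual = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            d = disparity[y][x]
            # Corresponding columns in the reference views.
            xl = x + round(alpha * d)          # sample position in left image
            xr = x - round((1 - alpha) * d)    # sample position in right image
            pl = left[y][xl] if 0 <= xl < w else None
            pr = right[y][xr] if 0 <= xr < w else None
            if pl is not None and pr is not None:
                # Blend the two references, weighted by camera distance.
                virtual[y][x] = round((1 - alpha) * pl + alpha * pr)
            elif pl is not None:
                virtual[y][x] = pl          # occluded in right view
            elif pr is not None:
                virtual[y][x] = pr          # occluded in left view
    return virtual
```

Because the disparity map lives at the virtual viewpoint, each output pixel is computed independently, which is what makes the per-pixel loop map naturally onto a GPU thread grid.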


  Cite this article

[IEEE Style]

I. Shin and Y. Ho, "Real-time Virtual View Synthesis using Virtual Viewpoint Disparity Estimation and Convergence Check," The Journal of Korean Institute of Communications and Information Sciences, vol. 37, no. 1, pp. 57-63, 2012.

[ACM Style]

In-Yong Shin and Yo-Sung Ho. 2012. Real-time Virtual View Synthesis using Virtual Viewpoint Disparity Estimation and Convergence Check. The Journal of Korean Institute of Communications and Information Sciences, 37, 1, (2012), 57-63.

[KICS Style]

In-Yong Shin and Yo-Sung Ho, "Real-time Virtual View Synthesis using Virtual Viewpoint Disparity Estimation and Convergence Check," The Journal of Korean Institute of Communications and Information Sciences, vol. 37, no. 1, pp. 57-63, Jan. 2012.