Multi-View Video Generation from 2 Dimensional Video 


Vol. 33,  No. 1, pp. 53-61, Jan.  2008


  Abstract

In this paper, we propose an algorithm for generating multi-view video from conventional 2-dimensional video. Color and motion information of an object are used for segmentation, and multi-view video is generated from the segmented objects. In particular, color information is used to extract object boundaries that are difficult to extract from motion information alone. To classify homogeneous color regions, both luminance and chrominance components are used. Pixel-based motion estimation with a measurement window is also performed to obtain motion information. We then combine the results of motion estimation and color segmentation, and obtain depth information by assigning a motion intensity value to each segmented region. Finally, we generate multi-view video by applying a rotation transformation to the 2-dimensional input images using the depth information obtained for each object. The experimental results show that the proposed algorithm outperforms conventional conversion methods.
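The pipeline the abstract describes, which includes pixel-based motion estimation with a measurement window, per-region depth from motion intensity, and view synthesis from the depth map, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the color segmentation is assumed to be given as a label map, the motion search uses a simple SAD criterion, and the rotation transformation is approximated here by a horizontal parallax shift proportional to depth.

```python
import numpy as np

def motion_magnitude(prev, curr, search=2, win=3):
    """Pixel-based motion estimation with a measurement window:
    for each pixel of `curr`, find the displacement within +/-`search`
    minimizing SAD over a `win` x `win` window against `prev`, and
    return the displacement magnitude (the 'motion intensity')."""
    h, w = prev.shape
    r = win // 2
    pad = search + r
    p = np.pad(prev.astype(float), pad, mode="edge")
    c = np.pad(curr.astype(float), pad, mode="edge")
    mag = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            cy, cx = y + pad, x + pad
            ref = c[cy - r:cy + r + 1, cx - r:cx + r + 1]
            # start from zero displacement so static pixels keep magnitude 0
            best = np.abs(ref - p[cy - r:cy + r + 1, cx - r:cx + r + 1]).sum()
            best_d = (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    cand = p[cy + dy - r:cy + dy + r + 1,
                             cx + dx - r:cx + dx + r + 1]
                    sad = np.abs(ref - cand).sum()
                    if sad < best:
                        best, best_d = sad, (dy, dx)
            mag[y, x] = np.hypot(*best_d)
    return mag

def depth_from_segments(mag, labels):
    """Assign each color-segmented region (given as an integer label
    map, assumed precomputed) one depth value: the mean motion
    magnitude of its pixels (larger motion -> nearer object)."""
    depth = np.zeros_like(mag)
    for lab in np.unique(labels):
        m = labels == lab
        depth[m] = mag[m].mean()
    return depth

def synthesize_view(img, depth, gain=1.0):
    """Render one additional view by shifting each pixel horizontally
    in proportion to its depth -- a plain parallax shift standing in
    for the paper's rotation transformation."""
    h, w = img.shape
    out = img.copy()
    shift = np.rint(gain * depth).astype(int)
    for y in range(h):
        for x in range(w):
            nx = x + shift[y, x]
            if 0 <= nx < w:
                out[y, nx] = img[y, x]
    return out
```

On two consecutive grayscale frames, a region that moved between frames receives a larger depth value than the static background, and that depth drives the horizontal disparity of the synthesized view.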



  Cite this article

[IEEE Style]

Y. Baek, M. Choi, S. Park, J. Yoo, "Multi-View Video Generation from 2 Dimensional Video," The Journal of Korean Institute of Communications and Information Sciences, vol. 33, no. 1, pp. 53-61, 2008.

[ACM Style]

Yun-ki Baek, Mi-nam Choi, Se-whan Park, and Ji-sang Yoo. 2008. Multi-View Video Generation from 2 Dimensional Video. The Journal of Korean Institute of Communications and Information Sciences, 33, 1, (2008), 53-61.

[KICS Style]

Yun-ki Baek, Mi-nam Choi, Se-whan Park, Ji-sang Yoo, "Multi-View Video Generation from 2 Dimensional Video," The Journal of Korean Institute of Communications and Information Sciences, vol. 33, no. 1, pp. 53-61, 1. 2008.