Stereo Object Tracking and Multiview Image Reconstruction System Using Disparity Motion Vector


Vol. 31,  No. 2, pp. 166-174, Feb.  2006


  Abstract

In this paper, a new stereo object tracking system using the disparity motion vector is proposed. In the proposed method, time-sequential disparity motion vectors are estimated from the disparity vectors extracted from the sequence of stereo input image pairs. Using these disparity motion vectors, the area where the target object is located and its location coordinates are detected in the input stereo image. Based on this location data, the pan/tilt mechanism embedded in the stereo camera system can be controlled, making stereo tracking of the target object possible. Experiments with two frames of stereo image pairs of 256×256 pixels show that the proposed stereo tracking system can adaptively track the target object with a low error ratio of about 3.05 % on average between the detected and actual location coordinates of the target object.
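The pipeline described in the abstract can be sketched in a few steps: estimate a disparity map for each stereo pair, take the difference of consecutive disparity maps as the disparity motion vector field, and locate the target as the region where that field is non-zero. The following Python sketch illustrates this idea under stated assumptions; the SAD block matching, the motion threshold, and the centroid-based localization are simplifications for illustration, not the paper's exact algorithm.

```python
import numpy as np

def block_disparity(left, right, block=8, max_disp=16):
    """Estimate a per-block horizontal disparity map by SAD block matching.

    Assumption: rectified stereo, so a left block at column x matches a
    right block at column x - d for some disparity d >= 0.
    """
    h, w = left.shape
    disp = np.zeros((h // block, w // block))
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            ref = left[y:y + block, x:x + block].astype(np.int32)
            best_sad, best_d = None, 0
            for d in range(min(max_disp, x) + 1):
                cand = right[y:y + block, x - d:x - d + block].astype(np.int32)
                sad = int(np.abs(ref - cand).sum())
                if best_sad is None or sad < best_sad:
                    best_sad, best_d = sad, d
            disp[by, bx] = best_d
    return disp

def track_target(disp_prev, disp_curr, thresh=1.0):
    """Locate the target from the disparity motion vector field.

    Blocks whose disparity changed between consecutive frames are assumed
    to belong to the moving target; return the centroid of those blocks
    (in block coordinates, row then column), or None if nothing moved.
    Multiplying by the block size gives pixel coordinates, from which a
    pan/tilt correction toward the image center could be derived.
    """
    motion = np.abs(disp_curr - disp_prev)
    ys, xs = np.nonzero(motion >= thresh)
    if len(ys) == 0:
        return None
    return float(ys.mean()), float(xs.mean())
```

For example, a bright square whose disparity grows from 4 to 8 pixels between two synthetic stereo pairs is localized at the blocks where the disparity map changed, which is the detection step the pan/tilt controller would act on.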



  Cite this article

[IEEE Style]

J. Ko and E. Kim, "Stereo Object Tracking and Multiview Image Reconstruction System Using Disparity Motion Vector," The Journal of Korean Institute of Communications and Information Sciences, vol. 31, no. 2, pp. 166-174, 2006.

[ACM Style]

Jung-Hwan Ko and Eun-Soo Kim. 2006. Stereo Object Tracking and Multiview Image Reconstruction System Using Disparity Motion Vector. The Journal of Korean Institute of Communications and Information Sciences, 31, 2, (2006), 166-174.

[KICS Style]

Jung-Hwan Ko and Eun-Soo Kim, "Stereo Object Tracking and Multiview Image Reconstruction System Using Disparity Motion Vector," The Journal of Korean Institute of Communications and Information Sciences, vol. 31, no. 2, pp. 166-174, 2. 2006.