Feature Tracking and Synchronous Scene Generation with a Single Camera

Authors: Zheng Chai, Takafumi Matsumaru

Journal: International Journal of Image, Graphics and Signal Processing (IJIGSP)

Issue: Vol. 8, No. 6, 2016.

Free access

This paper presents a method of tracking feature points to update the camera pose and generating a synchronous map for an AR (Augmented Reality) system. First, we use the ORB (Oriented FAST and Rotated BRIEF) [1] detector to extract feature points that carry depth information and serve as markers, and we track four of them with the LK (Lucas-Kanade) optical flow algorithm [2]. Next, we compute the rotation and translation of the moving camera from the relationship matrix between the 2D image coordinates and the 3D world coordinates, and we update the camera pose accordingly. Finally, we generate the map and draw AR objects on it. If the tracked feature points are lost, tracking can be recovered by computing the same world coordinates as before the loss from new corresponding 2D/3D feature points and the camera pose at that time. This study offers three novelties: an improved ORB detection that obtains depth information, a rapid camera-pose update, and tracking recovery. Following PTAM (Parallel Tracking and Mapping) [3], we divide the process into two parallel threads: one detects and tracks the feature points (recovering them when necessary) and updates the camera pose; the other generates the map and draws the objects. This parallel design saves time and allows the AR system to run in real time.
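To make the detect-track-update loop concrete, below is a minimal OpenCV sketch of the pipeline the abstract describes: ORB detection, LK optical flow tracking of four markers, and a pose update from 2D/3D correspondences. It is an illustration under stated assumptions, not the authors' implementation: the intrinsics `K`, the distortion vector, and the markers' 3D world coordinates `object_pts` are hypothetical placeholders (in the paper they come from calibration and the depth-augmented ORB step), picking the four strongest responses is an arbitrary selection rule, and `cv2.solvePnP` stands in for the paper's relationship-matrix computation of rotation and translation.

```python
import cv2
import numpy as np

# Hypothetical camera intrinsics; in the paper these come from calibration.
K = np.array([[700.0, 0.0, 320.0],
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume negligible lens distortion

orb = cv2.ORB_create(nfeatures=500)

def detect_markers(gray):
    """Detect ORB keypoints and keep the four strongest as markers.
    The paper additionally attaches depth to each point; here the
    corresponding 3D coordinates are assumed to be supplied separately."""
    kps = sorted(orb.detect(gray, None), key=lambda k: -k.response)[:4]
    return np.float32([k.pt for k in kps]).reshape(-1, 1, 2)

def track_and_update_pose(prev_gray, gray, prev_pts, object_pts):
    """One tracking step: LK optical flow, then pose from 2D/3D points."""
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray,
                                                   prev_pts, None)
    if int(status.sum()) < 4:
        return None  # markers lost: fall back to the recovery step
    # Camera rotation (rvec) and translation (tvec) from the four
    # 3D world points and their tracked 2D image positions.
    ok, rvec, tvec = cv2.solvePnP(object_pts.astype(np.float32),
                                  next_pts.reshape(-1, 2), K, dist,
                                  flags=cv2.SOLVEPNP_P3P)
    return (rvec, tvec, next_pts) if ok else None
```

In the PTAM-style split described above, a loop like this would run in the tracking thread, while a second thread consumes the resulting (rvec, tvec) poses to build the map and render the AR objects.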


Keywords: Tracking, Synchronous map, Camera pose update, Parallel, Tracking recovery

Short address: https://sciup.org/15013982

IDR: 15013982

References

  • Rublee, Ethan, et al. "ORB: an efficient alternative to SIFT or SURF." Computer Vision (ICCV), 2011 IEEE International Conference on. IEEE, 2011.
  • Bouguet, Jean-Yves. "Pyramidal implementation of the affine Lucas-Kanade feature tracker: description of the algorithm." Intel Corporation 5 (2001): 1-10.
  • Klein, Georg, and David Murray. "Parallel tracking and mapping for small AR workspaces." Mixed and Augmented Reality, 2007. ISMAR 2007. 6th IEEE and ACM International Symposium on. IEEE, 2007.
  • Muja, Marius, and David G. Lowe. "Fast approximate nearest neighbors with automatic algorithm configuration." International Conference on Computer Vision Theory and Applications (VISAPP'09), 2009.
  • Zhang, Zhengyou. "Flexible camera calibration by viewing a plane from unknown orientations." Computer Vision, 1999. The Proceedings of the Seventh IEEE International Conference on. Vol. 1. IEEE, 1999.
  • Zhang, Zhengyou. "A flexible new technique for camera calibration." Pattern Analysis and Machine Intelligence, IEEE Transactions on 22.11 (2000): 1330-1334.
  • Davison, Andrew J., et al. "MonoSLAM: Real-time single camera SLAM." Pattern Analysis and Machine Intelligence, IEEE Transactions on 29.6 (2007): 1052-1067.
  • Davison, Andrew J., Walterio W. Mayol, and David W. Murray. "Real-time localization and mapping with wearable active vision." Mixed and Augmented Reality, 2003. Proceedings. The Second IEEE and ACM International Symposium on. IEEE, 2003.
  • Bailey, Tim, et al. "Consistency of the EKF-SLAM algorithm." Intelligent Robots and Systems, 2006 IEEE/RSJ International Conference on. IEEE, 2006.
  • Chekhlov, Denis, et al. "Real-time and robust monocular SLAM using predictive multi-resolution descriptors." Advances in Visual Computing. Springer Berlin Heidelberg, 2006. 276-285.
  • Davison, Andrew J. "Real-time simultaneous localisation and mapping with a single camera." Computer Vision, 2003. Proceedings. Ninth IEEE International Conference on. IEEE, 2003.
  • Rao, G. Mallikarjuna, and Ch Satyanarayana. "Visual object target tracking using particle filter: a survey." International Journal of Image, Graphics and Signal Processing 5.6 (2013): 1250.
  • Lowe, David G. "Object recognition from local scale-invariant features." Proceedings of the Seventh IEEE International Conference on Computer Vision. IEEE, 1999. 1150-1157.
  • Rosten, Edward, and Tom Drummond. "Fusing points and lines for high performance tracking." Computer Vision, 2005. ICCV 2005. Tenth IEEE International Conference on. Vol. 2. IEEE, 2005.
  • Rosten, Edward, and Tom Drummond. "Machine learning for high-speed corner detection." Computer Vision–ECCV 2006. Springer Berlin Heidelberg, 2006. 430-443.
  • Castle, Robert O., Georg Klein, and David W. Murray. "Wide-area augmented reality using camera tracking and mapping in multiple regions." Computer Vision and Image Understanding 115.6 (2011): 854-867.
  • Castle, Robert O., and David W. Murray. "Keyframe-based recognition and localization during video-rate parallel tracking and mapping." Image and Vision Computing 29.8 (2011): 524-532.
  • Harris, Chris, and Mike Stephens. "A combined corner and edge detector." Alvey Vision Conference. Vol. 15. 1988.
  • Bay, Herbert, Tinne Tuytelaars, and Luc Van Gool. "SURF: Speeded up robust features." Computer Vision–ECCV 2006. Springer Berlin Heidelberg, 2006. 404-417.
  • Calonder, Michael, Vincent Lepetit, Christoph Strecha, and Pascal Fua. "BRIEF: Binary robust independent elementary features." European Conference on Computer Vision. Springer, 2010.
  • Rosin, Paul L. "Measuring corner properties." Computer Vision and Image Understanding 73.2 (1999): 291-307.
  • Lucas, Bruce D., and Takeo Kanade. "An iterative image registration technique with an application to stereo vision." International Joint Conference on Artificial Intelligence. 1981. 674-679.
  • Horn, Berthold K. P., and Brian G. Schunck. "Determining optical flow." 1981 Technical Symposium East. International Society for Optics and Photonics, 1981. 319-331.
  • Brandt, Jonathan W. "Improved accuracy in gradient-based optical flow estimation." International Journal of Computer Vision 25.1 (1997): 5-22.
  • Cramer, Gabriel (1750). "Introduction à l'Analyse des lignes Courbes algébriques" (in French). Geneva: Europeana. pp. 656–659. Retrieved 2012-05-18.
Scientific article