Overview of Hybrid Tracking in Augmented Reality (增强现实混合跟踪技术综述)

LUO Bin, WANG Yong-Tian, SHEN Hao, WU Zhi-Jie, LIU Yue

Citation: LUO Bin, WANG Yong-Tian, SHEN Hao, WU Zhi-Jie, LIU Yue. Overview of Hybrid Tracking in Augmented Reality. ACTA AUTOMATICA SINICA, 2013, 39(8): 1185-1201. doi: 10.3724/SP.J.1004.2013.01185

doi: 10.3724/SP.J.1004.2013.01185
Funds: Supported by the National Natural Science Foundation of China (61072096), the National Science and Technology Major Project (2012ZX03002004), the Science and Technology Development Foundation of China Academy of Engineering Physics (2010B0203023, 2012B0403068), and a project of national ministries and commissions

Article information
    Biography of authors:

    LUO Bin  Senior engineer at the Institute of Computer Application, China Academy of Engineering Physics, Ph.D. He received his bachelor degree from the Department of Opto-electronic Engineering, Beijing Institute of Technology in 1997, his master degree from China Academy of Engineering Physics in 2000, and his Ph.D. degree from the School of Optoelectronics, Beijing Institute of Technology in 2010. His research interest covers virtual reality and augmented reality. E-mail: luobin1827@bit.edu.cn

Abstract: Hybrid tracking is a key technology in augmented reality (AR) that has developed rapidly over the last two decades, and it is an effective way for an AR pose tracking system to achieve high accuracy and strong robustness at the same time. This paper gives a comprehensive review of hybrid tracking technology for augmented reality. It elaborates on the important methods involved in a hybrid tracker, including hybrid tracking, calibration and temporal synchronization, and surveys the current applications of different types of hybrid trackers. It then discusses the development trends of hybrid tracking and the open problems that remain, and finally looks ahead to its application prospects.
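The fusion idea the abstract summarizes, combining a high-rate but drifting inertial estimate with a low-rate but drift-free visual measurement, can be shown with a minimal sketch. The Python snippet below is not taken from the paper: the complementary-filter form, the function name fuse_orientation, the 100 Hz/10 Hz sensor rates and the blend gain alpha are all assumptions made purely for illustration; the systems surveyed in the paper typically use Kalman-type filters instead.

```python
def fuse_orientation(theta_prev, gyro_rate, dt, theta_vision=None, alpha=0.98):
    """One complementary-filter step fusing gyro and vision yaw (illustrative only).

    theta_prev   : previous fused yaw angle (rad)
    gyro_rate    : angular rate from the gyroscope (rad/s), high rate but drifting
    dt           : time step (s)
    theta_vision : absolute yaw from vision tracking (rad); None when no new
                   camera measurement is available at this step
    alpha        : blend gain; closer to 1 trusts the gyro prediction more
    """
    # Predict by integrating the gyro rate over one time step.
    theta_pred = theta_prev + gyro_rate * dt
    if theta_vision is None:
        return theta_pred
    # Correct the drifting prediction with the absolute vision measurement.
    return alpha * theta_pred + (1.0 - alpha) * theta_vision


# Hypothetical run: 100 Hz gyro updates, a vision correction every 10th sample.
theta = 0.0
for k in range(100):
    vision = 0.05 * (k * 0.01) if k % 10 == 0 else None  # synthetic measurement
    theta = fuse_orientation(theta, gyro_rate=0.05, dt=0.01, theta_vision=vision)
print(f"fused yaw after 1 s: {theta:.4f} rad")
```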
Publication history
  • Received:  2011-11-22
  • Revised:  2012-10-25
  • Published:  2013-08-20
