System ID: U0026-1508201413584300
Title (Chinese): 雙像視覺里程計輔助低成本INS/GPS整合系統之效益分析
Title (English): The Performance Analysis of Stereo Visual Odometry Assisted Low-Cost INS/GPS Integration System
University: National Cheng Kung University (成功大學)
Department (Chinese): 測量及空間資訊學系
Department (English): Department of Geomatics
Academic Year: 102
Semester: 2
Publication Year: 103 (ROC calendar)
Author (Chinese): 陳麗虹
Author (English): Li-Hung Chen
Student ID: P66014133
Degree: Master's
Language: English
Pages: 90
Thesis Committee: Advisor - 江凱偉
Committee Member - 陳國華
Committee Member - 張智安
Keywords (Chinese): 視覺里程計, 慣性導航系統, 全球定位系統, 整合系統
Keywords (English): Visual Odometry, INS, GPS, Integration System
Subject Classification:
Abstract (Chinese): In recent years, many aiding sensors have been developed for the Inertial Navigation System (INS), including the Global Positioning System (GPS), odometers, heading rate sensors, and so on. For land vehicle navigation applications, the most commonly used configuration is an INS/GPS integrated system aided by an odometer. This study proposes using Visual Odometry (VO) in place of the conventional odometer to aid a low-cost INS/GPS integrated system. When the GPS signal is interrupted, a stand-alone INS positioning solution drifts over time and accumulates errors; in such periods, VO can provide estimates of the vehicle's motion, including position and velocity, to aid the navigation system and reduce these errors.

This study takes the tactical-grade INS/GPS navigation solution as the reference. Besides comparing the similarity between the VO-estimated trajectory and the true trajectory, the ratio of the end-point error to the distance traveled (DT) is also computed. In addition, the performance of monocular and stereo VO systems is compared in order to verify the claim made in previous research that a stereo system outperforms a monocular one.

The experiments designed to test the system's performance cover different sampling rates, test scenes, and driving dynamics. Preliminary results show that when the GPS outage lasts longer than two minutes, the VO-aided low-cost INS/GPS system improves the overall positioning accuracy by more than 40%. The experimental results also indicate that stereo VO performs well and that the proposed VO-aiding system performs well in areas with poor GPS signal quality, such as under viaducts and next to tall buildings.
Abstract (English): In recent years, there have been increasing applications of the Inertial Navigation System (INS) with different aiding sensors such as the Global Positioning System (GPS), odometers, heading rate sensors, and so on. An INS/GPS integrated system aided by an odometer is the configuration most commonly used for land vehicle navigation. In this research, a novel method is proposed that aids a low-cost INS/GPS integrated system with a Visual Odometry (VO) system in place of the above sensors. VO estimates the motion of the vehicle, so when the GPS signal is lost it can serve as a backup for GPS: it provides velocity information and updates the estimated position of the vehicle to reduce the accumulated INS errors and thus improve the positioning accuracy.
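As a rough illustration of the aiding scheme described above, the sketch below applies a standard Kalman measurement update in which VO-derived position and velocity take the place of a GPS fix during an outage. It is a minimal sketch: the six-element state layout, the identity design matrix, and all noise values are illustrative assumptions, not the filter design used in the thesis.

    # Minimal sketch of a Kalman measurement update with VO-derived position and
    # velocity standing in for GPS during an outage. State layout, matrices, and
    # noise values are illustrative placeholders, not the thesis's actual filter.
    import numpy as np

    def vo_measurement_update(x, P, z_vo, R_vo):
        """Correct a [position(3), velocity(3)] state with one VO observation."""
        H = np.eye(6)                      # VO observes position and velocity directly
        S = H @ P @ H.T + R_vo             # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
        x = x + K @ (z_vo - H @ x)         # corrected state
        P = (np.eye(6) - K @ H) @ P        # corrected covariance
        return x, P

    # Example: the INS-only prediction has drifted; a VO fix pulls it back.
    x_ins = np.array([10.2, 5.1, 0.3, 1.0, 0.5, 0.0])   # drifting INS prediction
    P_ins = np.diag([4.0, 4.0, 1.0, 0.5, 0.5, 0.1])     # grown state uncertainty
    z_vo  = np.array([9.5, 4.8, 0.1, 0.9, 0.45, 0.0])   # VO position/velocity measurement
    R_vo  = np.diag([1.0, 1.0, 0.5, 0.1, 0.1, 0.05])    # assumed VO measurement noise
    x_upd, P_upd = vo_measurement_update(x_ins, P_ins, z_vo, R_vo)
    print(x_upd)

In practice such an update would run inside the loosely or tightly coupled integration filter whenever a VO solution is available; the sketch only shows the correction step.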

The tactical-grade INS/GNSS (Global Navigation Satellite System) trajectory, processed with the RTS (Rauch-Tung-Striebel) smoother in tightly coupled mode, is used as the reference trajectory. The ratio of the end-point error to the distance traveled (DT) is calculated for the low-cost INS/GPS and VO solutions, and trajectory comparisons are conducted as well. In addition, this study investigates the accuracy of monocular and stereo VO systems in order to verify the argument presented in related work that the stereo system outperforms the monocular one.
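The DT figure of merit can be computed directly from an estimated trajectory and the reference. The sketch below, using hypothetical array names and synthetic coordinates rather than the thesis data, divides the end-point error by the total distance traveled along the reference.

    # Sketch of the end-point-error to distance-traveled (DT) ratio; the
    # trajectories below are synthetic and only illustrate the computation.
    import numpy as np

    def dt_ratio(estimated_xy, reference_xy):
        """End-point error divided by the total distance traveled along the reference."""
        end_error = np.linalg.norm(estimated_xy[-1] - reference_xy[-1])
        distance = np.sum(np.linalg.norm(np.diff(reference_xy, axis=0), axis=1))
        return end_error / distance

    ref = np.array([[0.0, 0.0], [10.0, 0.0], [20.0, 5.0], [30.0, 5.0]])  # reference (m)
    est = np.array([[0.0, 0.0], [10.1, 0.2], [20.3, 5.4], [30.8, 5.9]])  # estimated (m)
    print(f"DT ratio: {dt_ratio(est, ref):.3%}")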

The experimental settings include different image sampling rates, scenes, and driving dynamics. According to the preliminary results, when the GPS outage lasts longer than two minutes, the position errors in all three directions are reduced, and the 3D position RMSE improves by more than 40% when the VO-derived position and velocity are used as aiding information. Based on these preliminary results, stereo VO performs well and the proposed system performs well in GPS-hostile environments.
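The reported 40% figure is a relative improvement in 3D position RMSE with respect to the reference trajectory. A minimal sketch of that computation is given below; the error samples are randomly generated for illustration and are not the experimental data.

    # Sketch of the 3D position RMSE and its percentage improvement between the
    # unaided and VO-aided solutions; the error samples are made up.
    import numpy as np

    def rmse_3d(errors_xyz):
        """Root-mean-square of 3D position errors (N x 3 array, metres)."""
        return np.sqrt(np.mean(np.sum(errors_xyz ** 2, axis=1)))

    rng = np.random.default_rng(0)
    err_unaided  = rng.normal(scale=8.0, size=(120, 3))  # low-cost INS/GPS errors during outage
    err_vo_aided = rng.normal(scale=4.0, size=(120, 3))  # errors with VO position/velocity aiding

    rmse_before = rmse_3d(err_unaided)
    rmse_after  = rmse_3d(err_vo_aided)
    improvement = (rmse_before - rmse_after) / rmse_before * 100.0
    print(f"3D RMSE: {rmse_before:.2f} m -> {rmse_after:.2f} m ({improvement:.1f}% improvement)")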
Table of Contents
Abstract (Chinese) I
Abstract II
Acknowledgements III
Contents IV
List of Tables VII
List of Figures VIII
Chapter 1 Introduction 1
1-1 Background 1
1-2 Motivation and Purpose 4
1-3 Thesis Outline 6
Chapter 2 INS/GPS Integration System 7
2-1 Coordinate Frames and Transformations 7
2-1-1 Earth-Centered Inertial Frame 7
2-1-2 Earth-Centered Earth-Fixed Frame 8
2-1-3 Local Level Navigation Frame 8
2-1-4 Body Frame 9
2-1-5 Transformation between Frames 9
2-2 Inertial Navigation System 12
2-2-1 Introduction of INS 12
2-2-2 INS Navigation Equations 15
2-3 Global Positioning System 19
2-3-1 Scheme of GPS 19
2-3-2 Principle of GPS 21
2-3-3 Error Source of GPS 24
2-4 INS/GPS Integration System 25
2-4-1 Kalman Filter 26
2-4-2 Smoothing 28
2-4-3 Integration System Scheme 29
2-4-4 Additional Aiding Sources 31
Chapter 3 Visual Odometry Assisted INS/GPS Integration System 34
3-1 Visual Odometry System 34
3-1-1 Feature Extraction and Matching 35
3-1-2 Bucketing 36
3-1-3 RANSAC 37
3-1-4 Ego-motion Estimation 38
3-2 Integration System 39
3-2-1 The Processing of VO Output 39
3-2-2 The Scheme of Proposed System 44
Chapter 4 System Calibration 46
4-1 Camera Calibration 46
4-1-1 Camera Calibration Method 46
4-1-2 Camera Calibration Field 48
4-1-3 Image Rectification 49
4-2 IMU Calibration 51
Chapter 5 Experiments and Analysis 54
5-1 Experimental Settings 54
5-1-1 Equipment 54
5-1-2 Experiments 56
5-1-3 Software Procedure 57
5-2 Experimental Results 60
5-2-1 Visual Odometry 60
5-2-2 Proposed System 74
Chapter 6 Conclusions and Future Work 82
6-1 Conclusions 82
6-2 Future Work 85
References 86
References
Abeles P., (2014): “BoofCV (v0.17),” http://boofcv.org
Aggarwal P., Syed Z., Niu X., and El-Sheimy N., (2008): “A Standard Testing and Calibration Procedure for Low Cost MEMS Inertial Sensors and Units,” Journal of Navigation, vol.61, pp. 323-336.
Bosse M., Karl W.C., Castanon D., and Debitetto P., (1997): “A Vision Augmented Navigation System,” IEEE Intelligent Transportation Systems, pp. 1028-1033.
Badino H., (2004): “A Robust Approach for Ego-Motion Estimation Using a Mobile Stereo Platform,” IWCM, Lecture Notes in Computer Science, vol. 3417, pp. 198-208.
Bota S. and Nedevschi S., (2008): “Camera Motion Estimation Using Monocular and Stereo-Vision,” IEEE Intelligent Computer Communication and Processing, pp. 275-278.
Chiang K.W., (2004): “INS/GPS Integration Using Neural Networks for Land Vehicular Navigation Application,” Dept. of Geomatics, University of Calgary, Calgary, Canada.
Chang H.W., (2009): “The development of self-growing Neural Network Embedded POS Determination Scheme for MEMS INS/GPS Integrated Systems,” MSc Thesis, Dept. of Geomatics, National Cheng Kung University, Tainan, Taiwan.
Civera J., Grasa Ó.G., Davison A.J., and Montiel J. M. M., (2010): “1-Point RANSAC for EKF Filtering: Application to Real-Time Structure from Motion and Visual Odometry,” Journal of Field Robotics, vol. 27(5), pp. 609-631.
Dissanayake G., Sukkarieh S., Nebot E., and Durrant-Whyte H., (2001): “The aiding of a low-cost, strapdown inertial unit using modeling constraints in land vehicle applications,” IEEE Trans. on Robotics and Automation, vol. 17, pp. 731-747.
Duong T.T., (2014): “Integration Strategies and Estimation Algorithms to Improve the Navigation Accuracy of Land-Based Mobile Mapping System,” PhD Thesis, Dept. of Geomatics, National Cheng Kung University, Tainan, Taiwan.
El-Sheimy N., (2002): “Introduction to Inertial Navigation,” ENGO 699.71 Lecture Notes, Dept. of Geomatics, University of Calgary, Calgary, Canada.
El-Sheimy N., (2003): “Inertial Techniques and INS/DGPS Integration,” Lecture Notes ENGO 623, Dept. of Geomatics, University of Calgary, Calgary, Canada.
El-Sheimy N., Chiang K.W., and Noureldin A., (2006): “The utilization of artificial neural networks for multisensor system integration in navigation and positioning instruments,” IEEE Transactions on Instrumentation and Measurement, vol. 55(5), pp. 1606-1615.
Fischler M.A. and Bolles R.C., (1981): “Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography,” Communications of the ACM, vol. 24, pp. 381-395.
Fraser C.S., (1997): “Digital Camera Self-calibration,” ISPRS Journal of Photogrammetry & Remote Sensing, vol. 52, pp. 149-159.
Farrell J.A. and Barth M., (1998): “The Global Positioning System & Inertial Navigation,” McGraw-Hill Professional.
Gelb A., (1974): “Applied Optimal Estimation,” MIT Press, Cambridge, Massachusetts, and London, England.
Grewal M.S., Weill L.R., and Andrews A.P., (2001): “Global Positioning Systems, Inertial Navigation, and Integration,” John Wiley & Sons, Inc., New York.
Godha S., (2006): “Performance Evaluation of Low Cost MEMS-Based IMU Integrated With GPS for Land Vehicle Navigation Application,” MSc Thesis, Dept. of Geomatics Engineering, University of Calgary, Calgary, Canada.
Geiger A., Ziegler J., and Stiller C., (2011): “StereoScan: Dense 3d Reconstruction in Real-time,” IEEE Intelligent Vehicles Symposium, vol. 4, pp. 963-968.
Harris C. and Stephens M., (1988): “A combined corner and edge detector,” in Proceedings of the Alvey Vision Conference, pp. 147-151.
Helmick D.M., Cheng Y., Clouse D.S., Matthies L.H., and Roumeliotis S.I., (2004): “Path following using visual odometry for a Mars rover in high-slip environments,” IEEE Aerospace Conference, vol. 2, pp. 772-789.
Howard A., (2008): “Real-Time Stereo Visual Odometry for Autonomous Ground Vehicles,” IEEE/RSJ Intelligent Robots and Systems, pp. 3946-3952.
Kalman R.E., (1960): “A new approach to linear filtering and prediction problems,” Journal of Basic Engineering, vol. 82, pp. 35-45.
Kennedy S., Hamilton J., and Martell H., (2006): “Architecture and system performance of SPAN–NovAtel’s GPS/INS solution,” Proceedings of IEEE/ION PLANS 2006, pp. 23-25.
Kitt B., Geiger A., and Lategahn H., (2010): “Visual Odometry based on Stereo Image Sequences with RANSAC-based Outlier Rejection Scheme,” IEEE Intelligent Vehicles Symposium, vol. 4, pp. 486-492.
Kang H., Stoykova E., Park J., Hong S., and Kim Y., (2013): “Holographic Printing of White-Light Viewable Holograms and Stereograms,” InTech.
Lidholm J., Spampinato G., and Asplund L., (2009): “Validation of Stereo Matching for Robot Navigation,” IEEE Emerging Technologies & Factory Automation, pp. 1-8.
Li M. and Mourikis A.I., (2012): “Improving the accuracy of EKF-Based Visual-Inertial Odometry,” IEEE Robotics and Automation, pp. 828-835.
Moravec H.P., (1980): “Obstacle avoidance and navigation in the real world by a seeing robot rover,” PhD thesis, Stanford University, Stanford, California.
Matthies L. and Shaffer S.A., (1987): “Error Modeling in Stereo Navigation,” IEEE Journal of Robotics and Automation, vol. 3, pp. 239-248.
Misra P. and Enge P., (2001): “Global Positioning System: Signals, Measurements and Performance,” Ganga-Jamuna Press.
Nister D., Naroditsky O., and Bergen J., (2004): “Visual odometry,” Computer Vision and Pattern Recognition, vol. 1, pp. I-652 - I-659.
Neubeck A. and Gool L.V., (2006): “Efficient non-maximum suppression,” International Conference on Pattern Recognition (ICPR), pp. 850-855.
Nassar S., Niu X., and El-sheimy N., (2007): “Land Vehicle INS/GPS Accurate Positioning during GPS signal Blockage Periods,” Journal of Surveying Engineering. vol. 133, pp. 134-143.
Netramai C., Roth H., and Sachenko A., (2011): “High Accuracy Visual Odometry Using Multi-Camera System,” IEEE Intelligent Data Acquisition and Advanced Computing Systems, vol. 1, pp. 263-268.
Parra I., Sotelo M.A., Llorca D.F., Fernandez C., Llamazares A., Hernandez N., and Garcia I., (2011): “Visual odometry and map fusion for GPS navigation assistance,” IEEE Industrial Electronics, pp. 832-837.
Peng K.Y., (2012): “The Performance Analysis of an AKF Based Tightly Coupled INS/GNSS Sensor Fusion Scheme with Non-holonomic Constraints,” MSc Thesis, Dept. of Geomatics, National Cheng Kung University, Tainan, Taiwan.
Rauch H., Tung F., and Striebel C., (1965): “Maximum likelihood estimates of linear dynamic systems,” AIAA J, vol. 3, pp. 1445-1450.
Rau J.Y. and Yeh P.C., (2012): “A Semi-Automatic Image-Based Close Range 3D Modeling Pipeline Using a Multi-Camera Configuration,” Sensors, vol. 12(8), pp. 11271-11293.
Schwarz K.-P., (1999): “Fundamentals of Geodesy: Lecture Notes ENGO 421,” Dept. of Geomatics, University of Calgary, Calgary, Canada.
Schwarz K.-P. and Wei M., (2000): “INS/GPS Integration for Geodetic Applications: Lecture Notes ENGO 623,” Dept. of Geomatics, University of Calgary, Calgary, Canada.
Seeber G., (2003): “Satellite Geodesy,” Walter de Gruyter, Berlin, New York, USA.
Titterton D.H. and Weston J.L., (1997): “Strapdown Inertial Navigation Technology,” Peter Peregrinus Ltd, United Kingdom.
Timár G., Aunap R., and Molnár G., (2004): “Datum transformation parameters between the historical and modern Estonian geodetic networks,” Estonia – Geographical Studies, vol. 9, pp. 99-106.
Tardif J.-P., George M., Laverne M., Kelly A., and Stentz A., (2010): “A New Approach to Vision-Aided Inertial Navigation,” IEEE Intelligent Robots and Systems, pp. 4161-4168.
Webster A.A., Jones C.T., Pinson M.H., Voran S.D., and Wolf S., (1993): “Objective video quality assessment system based on human perception,” IS&T/SPIE’s Symposium on Electronic Imaging: Science and Technology, pp. 15-26.
Wendel J. and Trommer G.F., (2004): “Tightly coupled GPS/INS integration for missile applications,” Aerospace Science and Technology, vol. 8, pp. 627–634.
Wolf P.R. and Dewitt B.A., (2004): “Elements of Photogrammetry with Applications in GIS,” 3rd edition, McGraw-Hill, Asia.
Yu Q. F. and Shang Y., (1991): “Video Measurement Principles and Applications,” Science, China.
Zhang Z., Deriche R., Faugeras O., and Luong Q.T., (1995): “A robust technique for matching two uncalibrated images through the recovery of the unknown epipolar geometry,” Artificial intelligence, vol. 78(1), pp. 87-119.
Full-Text Access Rights
  • On-campus viewing/printing of the electronic full text is authorized; open to the public from 2019-09-02.
  • Off-campus viewing/printing of the electronic full text is authorized; open to the public from 2019-09-02.

