System ID	U0026-3108201412255300
Title (Chinese)	視覺導航研究-結合共面與共線條件式於自我移動估計
Title (English)	Study on Vision-Based Navigation: Integration of Coplanarity and Collinearity Conditions for Ego-Motion Estimation
Institution	National Cheng Kung University
Department (Chinese)	測量及空間資訊學系
Department (English)	Department of Geomatics
Academic Year	102 (2013–2014)
Semester	2
Year of Publication	103 (2014)
Author (Chinese)	司元榮
Author (English)	Yuan-Rong Sih
Student ID	p66011062
Degree	Master's
Language	English
Pages	69
Advisor	朱宏杰
Committee Members	曾義星, 韓仁毓, 趙鍵哲
Keywords (Chinese)	共線式, 共面式, 相對方位, 絕對方位, 移動估計, 攝影測量
Keywords (English)	collinearity condition, coplanarity condition, relative orientation, absolute orientation, ego-motion estimation, photogrammetry
Subject Classification	(not specified)
Abstract (Chinese, translated)	A common approach in photogrammetry is to use the collinearity condition to relate object points to image points and solve for the camera's absolute orientation parameters. Because the collinearity condition equations are nonlinear, however, initial values must be assigned to the unknown parameters before they can be solved by iterative least squares, and the choice of initial values is critical to the quality of the solution. This study therefore proposes solving for the relative orientation parameters with the coplanarity condition and supplying them as initial values for the collinearity-based solution of the absolute orientation parameters, yielding a practical strategy for recovering a camera's absolute orientation.
This study focuses on motion estimation for a single camera. A feature-point detection and matching algorithm automatically extracts conjugate image points from the overlap of sequential images; these points serve as observations for the coplanarity condition, from which the relative orientation parameters are solved. The same conjugate points and the relative orientation parameters then provide, respectively, the observations and the initial values for the collinearity condition, and together with ground-control-point observations derived from known conditions, the camera's absolute orientation parameters are solved rigorously. In addition to the exterior orientation parameters, the solution yields the object-space coordinates of the automatically matched conjugate points, which then serve as new ground control points for the subsequent images. Repeating this procedure recovers the camera's trajectory. Experiments in both indoor and outdoor scenes show that the proposed orientation procedure is effective and feasible.
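For reference, the coplanarity and collinearity conditions on which this strategy rests take the following standard textbook forms (standard photogrammetric notation; not reproduced from the thesis itself):

```latex
% Collinearity: image point (x, y), principal point (x_0, y_0), focal length f,
% rotation matrix M = [m_{ij}], exposure station (X_L, Y_L, Z_L), object point (X, Y, Z)
\begin{align}
x &= x_0 - f\,\frac{m_{11}(X - X_L) + m_{12}(Y - Y_L) + m_{13}(Z - Z_L)}
                   {m_{31}(X - X_L) + m_{32}(Y - Y_L) + m_{33}(Z - Z_L)} \\
y &= y_0 - f\,\frac{m_{21}(X - X_L) + m_{22}(Y - Y_L) + m_{23}(Z - Z_L)}
                   {m_{31}(X - X_L) + m_{32}(Y - Y_L) + m_{33}(Z - Z_L)}
\end{align}
% Coplanarity: the baseline b between the two exposure stations and the two
% image rays a_1, a_2 (rotated into a common frame) lie in one epipolar plane
\begin{equation}
\mathbf{b} \cdot \left( \mathbf{a}_1 \times \mathbf{a}_2 \right) = 0
\end{equation}
```

The quotient terms make the collinearity equations nonlinear in the unknowns, which is why the iterative least-squares solution needs the initial values that the coplanarity-based relative orientation supplies.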
Abstract (English)	Photogrammetry research generally uses the collinearity condition to establish relations between object and image points and to calculate absolute orientation parameters. Because the collinearity condition is nonlinear, appropriate initial values for the unknown absolute orientation parameters must first be set in the iterative least-squares solution. This research proposes using the coplanarity condition to solve the relative orientation parameters and then using these parameters as initial values for solving the absolute orientation parameters with the collinearity condition. The proposed method provides a strategy for solving the absolute orientation parameters of a camera.
This research focuses on the motion estimation of a single camera. First, a feature-point detection and matching algorithm is applied to automatically acquire conjugate image points between sequential images. These conjugate points provide the observations for solving the relative orientation parameters with the coplanarity condition. Second, the results of the previous step supply the initial values and observations, and the absolute orientation parameters of the camera are solved by iterative least squares with the aid of ground control points. Aside from the absolute orientation parameters, the object-space coordinates of the conjugate image points are also obtained, so these points can serve as new ground control points for subsequent image pairs. Finally, the camera trajectory is obtained by repeating the procedure. Experiments were conducted in indoor and outdoor environments, and the results show that the proposed procedure is effective and feasible.
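The second step of the pipeline, rigorous solution of the six absolute (exterior) orientation parameters by iterative least squares on the collinearity equations, can be sketched numerically. This is a minimal illustration, not the thesis's implementation: the function names, the Gauss-Newton update with a numerical Jacobian, and the synthetic control points in the usage note are all assumptions made for the example.

```python
import numpy as np

def rotation_matrix(omega, phi, kappa):
    """Photogrammetric omega-phi-kappa rotation: M = R_kappa @ R_phi @ R_omega."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    rx = np.array([[1, 0, 0], [0, co, so], [0, -so, co]])
    ry = np.array([[cp, 0, -sp], [0, 1, 0], [sp, 0, cp]])
    rz = np.array([[ck, sk, 0], [-sk, ck, 0], [0, 0, 1]])
    return rz @ ry @ rx

def project(params, points, f=1.0):
    """Collinearity equations: image coordinates of object points (N, 3)."""
    omega, phi, kappa, xc, yc, zc = params
    m = rotation_matrix(omega, phi, kappa)
    d = (points - np.array([xc, yc, zc])) @ m.T   # camera-frame coordinates
    return -f * np.column_stack([d[:, 0] / d[:, 2], d[:, 1] / d[:, 2]])

def solve_absolute_orientation(x_obs, points, params0, f=1.0, iters=20):
    """Gauss-Newton refinement of (omega, phi, kappa, Xc, Yc, Zc).

    x_obs   : (N, 2) observed image coordinates of ground control points
    points  : (N, 3) object-space coordinates of those points
    params0 : initial values -- the quantity the thesis obtains from
              coplanarity-based relative orientation
    """
    p = np.asarray(params0, dtype=float)
    for _ in range(iters):
        residual = (x_obs - project(p, points, f)).ravel()
        # numerical Jacobian of the projection w.r.t. the six parameters
        jac = np.zeros((residual.size, 6))
        for j in range(6):
            step = np.zeros(6)
            step[j] = 1e-6
            jac[:, j] = ((project(p + step, points, f) -
                          project(p - step, points, f)) / 2e-6).ravel()
        p = p + np.linalg.lstsq(jac, residual, rcond=None)[0]
    return p
```

With noise-free synthetic observations and initial values close to the truth, the iteration converges in a few steps; with poor initial values it can stall or diverge, which is precisely why the procedure described above seeds it with the relative orientation parameters obtained from the coplanarity condition.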
Table of Contents
Chinese Abstract I
Abstract II
Acknowledgement IV
Table of Contents V
List of Tables VIII
List of Figures IX
CHAPTER 1 Introduction 1
1.1 Background 1
1.2 Motivation 2
1.3 Literature Review 3
1.4 Structure of Thesis 6
CHAPTER 2 Principles of Close-Range Photogrammetry 8
2.1 Background 8
2.2 Coordinate Systems 9
2.2.1 The Camera Frame 9
2.2.2 The Mapping Frame 11
2.3 Camera Calibration 12
2.4 Coordinate Transformations 15
2.5 Collinearity Condition 17
2.6 Coplanarity Condition 19
CHAPTER 3 Localization and Orientation of Sequential Images 22
3.1 Image Feature Points Detection and Matching 23
3.1.1 Speeded-Up Robust Features (SURF) 24
3.1.1.1 Feature Detector 24
3.1.1.2 Feature Descriptor 27
3.1.1.3 Feature Matching 30
3.2 Relative Orientation Parameters 30
3.3 Absolute Orientation Parameters 32
3.3.1 Initialization with First Two Images 32
3.3.2 Automatic Calculation for Sequential Images 38
CHAPTER 4 Case Studies and Analysis 40
4.1 Consumer-Grade Digital Camera 40
4.2 Local Reference System 41
4.3 Experimental Results 43
4.3.1 Case I: Straight Line 44
4.3.2 Case II: Loop 49
4.3.3 Case III: Different Shooting Direction 55
4.3.4 Case IV: Rotation Examination 57
4.3.5 Case V: Simulated Moving Camera 60
CHAPTER 5 Conclusions and Future Work 64
5.1 Conclusions 64
5.2 Future Work 65
References 67
Full-Text Access Rights
  • On-campus browsing/printing of the electronic full text authorized, available from 2014-09-09.
  • Off-campus browsing/printing of the electronic full text authorized, available from 2019-09-09.

