System ID: U0026-2308201816165800
Title (Chinese): 擴增實境於具手眼配置微組裝系統之實現
Title (English): Implementation of Augmented Reality in Eye-In-Hand Micro-Assembly Systems
Institution: National Cheng Kung University
Department (Chinese): 機械工程學系
Department (English): Department of Mechanical Engineering
Academic year: 106
Semester: 2
Year of publication: 107 (2018)
Author (Chinese): 簡威任
Author (English): Wei-Jen Chien
Student ID: N16051613
Degree: Master's
Language: Chinese
Pages: 110
Committee: Advisor: 張仁宗
Committee member: 田思齊
Committee member: 林穎裕
Keywords (Chinese): eye-to-hand configuration; eye-in-hand configuration; camera calibration; augmented reality; visual servoing; micro-assembly system
Keywords (English): eye-in-hand & eye-to-hand; virtual camera calibration; augmented reality; visual servo; micro-assembly system
Subject classification: (not specified)
Abstract (Chinese): In this work, cameras are mounted in both eye-to-hand and eye-in-hand configurations to assist, at different stages, the task of installing a micro-gripper. For the augmented-reality and visual-servoing parts, calibration and tracking are performed through a virtual camera viewing a 3-D model. First, the eye-to-hand camera roughly locates the object and the assembly part and brings them close together. During assembly, the eye-in-hand camera mounting makes the hole in the assembly part easy to observe; the virtual camera is first calibrated with a nonlinear model so that the virtual image overlaps the real image. Visual servoing then exploits the relationship between the object and camera velocities and the velocities of their projections in image coordinates, so that the virtual model tracks the motion of the real object in the camera view. With this augmented-reality assistance, the world coordinates of the assembly hole are known accurately at each instant, which raises the chance of a successful micro-assembly.
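The nonlinear virtual-camera calibration described in the abstract amounts to adjusting virtual camera parameters until the reprojected model points coincide with the feature points detected in the real image. The following is a minimal sketch of that idea, refining only a focal length and an optical-axis offset by Gauss-Newton on synthetic correspondences; the two-parameter model and all numerical values are illustrative assumptions, not the thesis's actual parameter set.

```python
import numpy as np

def project(points_3d, f, tz):
    # Pinhole projection of camera-frame points, shifted by tz along the optical axis.
    P = points_3d + np.array([0.0, 0.0, tz])
    return f * P[:, :2] / P[:, 2:3]

# Synthetic "real camera" observations (hypothetical geometry and parameters).
pts = np.array([[0.01, 0.02, 0.5], [-0.02, 0.01, 0.55],
                [0.03, -0.01, 0.6], [-0.01, -0.02, 0.52]])
f_true, tz_true = 800.0, 0.10
obs = project(pts, f_true, tz_true)

# Gauss-Newton refinement of (f, tz) minimizing the reprojection error,
# with a forward-difference Jacobian.
theta = np.array([600.0, 0.0])  # deliberately wrong initial guess
for _ in range(50):
    r = (project(pts, *theta) - obs).ravel()
    J = np.empty((r.size, 2))
    eps = 1e-6
    for j in range(2):
        d = np.zeros(2)
        d[j] = eps
        J[:, j] = ((project(pts, *(theta + d)) - obs).ravel() - r) / eps
    theta -= np.linalg.solve(J.T @ J, J.T @ r)

print(theta)  # should converge toward the true values (800.0, 0.10)
```

When virtual and real parameters agree, the residual vanishes and the rendered model overlays the real image, which is the overlap criterion the abstract describes.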
Abstract (English): We operate the micro-assembly system in different stages with either an eye-to-hand or an eye-in-hand camera. In the assembly stage, the task is completed with augmented reality (AR): a 3-D virtual model is overlaid on the object in the image through virtual camera calibration and a visual-servoing method. First, the pin is gripped under the eye-to-hand camera, which determines the distance from the pin to the object; once this distance is small, the system switches to the eye-in-hand camera. Next, the virtual camera is calibrated by estimating the intrinsic parameters with a linear model followed by a nonlinear model: when the virtual camera parameters equal the real camera parameters, the images of the object captured by the two cameras coincide. Finally, in the dynamic assembly stage, the 3-D model tracks the real object in the image by visual servoing. The relationship between the object velocity and the velocity of its reprojected points on the image plane defines the servo model; the control law adjusts the virtual object or camera pose to maintain tracking. During assembly of the pin and object, the object hole is easily observed with the eye-in-hand camera, and overlaying the virtual object on the real image always yields a clear edge of the hole. This series of methods assists the operator and increases the success rate of the assembly task.
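The servo model in the abstract, which relates feature-point velocity on the image plane to camera motion and applies a control law with gain λ, can be sketched with the standard point-feature interaction matrix of image-based visual servoing. The feature coordinates, depths, and gain below are illustrative values, not the thesis's experimental data.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    # Interaction matrix of a normalized image point (x, y) at depth Z:
    # maps the camera twist (v, w) to the point's image-plane velocity.
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def ibvs_control(features, targets, depths, lam=0.5):
    # Control law v = -lam * pinv(L) @ e over the stacked point features.
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    e = (np.asarray(features, float) - np.asarray(targets, float)).ravel()
    return -lam * np.linalg.pinv(L) @ e

# Three illustrative feature points at 0.5 m depth, servoed toward the image center.
cam_twist = ibvs_control([(0.1, 0.0), (0.0, 0.1), (-0.1, -0.1)],
                         [(0.0, 0.0)] * 3, [0.5] * 3)
print(cam_twist)  # 6-vector: translational and angular camera velocity
```

When the estimated interaction matrix matches the true one, this law drives the feature error as ė = -λe, i.e. exponential decay, which is why the choice of λ (Section 4-3 of the table of contents) governs the convergence speed.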
Table of Contents
Abstract I
List of Tables IX
List of Figures X
Chapter 1 Introduction 1
1-1 Preface 1
1-2 Research Motivation 1
1-3 Literature Review 2
1-3-1 Eye-in-Hand and Eye-to-Hand Configurations 2
1-3-2 Camera Calibration 4
1-3-3 Visual Servoing 7
1-4 Research Objectives and Methods 12
1-5 Thesis Organization 13
Chapter 2 Mathematical Foundations of Virtual Reality 14
2-1 Introduction to the Virtual Model 14
2-1-1 Construction of the 3-D Model 14
2-1-2 Surface Rendering 15
2-2 Coordinate Transformations for Virtual Imaging 16
2-2-1 Model Matrix 17
2-2-2 View Matrix 19
2-2-3 Projection Matrix 21
2-2-4 Normalized Device Coordinates 24
2-2-5 Viewport Matrix 26
2-3 Comparison of Real and Virtual Camera Models 27
2-3-1 Real Camera Extrinsic Matrix 27
2-3-2 Real Camera Intrinsic Matrix 28
2-4 Virtual Micro-Assembly System Coordinates 30
2-5 Chapter Summary 32
Chapter 3 Camera Calibration 33
3-1 Feature Point Selection 33
3-1-1 Design of the Calibration Block 33
3-1-2 Image Feature Recognition 35
3-1-3 Selection of Virtual 3-D Feature Points 40
3-1-4 Preparations for Camera Transformation 40
3-2 Camera Intrinsic Parameter Estimation 44
3-2-1 Measuring the Camera FOV 44
3-2-2 Real Camera Calibration 47
3-3 Virtual Camera Calibration 50
3-3-1 Linear Estimation 50
3-3-2 Nonlinear Estimation 53
3-4 Gripper Position Estimation 57
3-5 Calibration Procedure 60
3-6 Chapter Summary 61
Chapter 4 Visual Servoing with Augmented Reality 62
4-1 Visual Servoing 62
4-1-1 Image-Based Visual Servoing (IBVS) 63
4-1-2 Interaction Matrix 66
4-1-3 Camera Degeneracy 69
4-2 IBVS Model for the Micro-Assembly System 70
4-3 Simulation Results 75
A. Choosing the Form of the Interaction Matrix 77
B. Choosing the Convergence Gain λ 82
4-4 Chapter Summary 85
Chapter 5 Implementation of the Eye-in-Hand Micro-Assembly System 86
5-1 System Overview 86
5-1-1 Hardware Specifications 86
5-1-2 Software Interface 90
5-2 Experimental Data 94
5-2-1 Static Calibration 95
5-2-2 Visual Servoing 103
5-3 Experimental Results 104
5-4 Chapter Summary 105
Chapter 6 Conclusions and Future Work 106
6-1 Conclusions 106
6-2 Future Work 107
References 108

References
[1] D. Kragic and H. I. Christensen, “Survey on Visual Servoing for Manipulation,” 2002.
[2] G. Flandin, F. Chaumette and E. Marchand, “Eye-in-hand/Eye-to-hand Cooperation for Visual Servoing,” IEEE International Conference on Robotics and Automation, San Francisco, CA, April 2000.
[3] R. Tsai, “A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses,” IEEE Journal of Robotics and Automation, vol. RA-3, no. 4, August 1987, pp. 323-344.
[4] J. Weng, P. Cohen, and M. Herniou, “Camera Calibration with Distortion Models and Accuracy Evaluation,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, no. 10, October 1992, pp. 965-980.
[5] Z. Zhang, “A Flexible New Technique for Camera Calibration,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, November 2000, pp. 1330-1334.
[6] W. S. Kim, “Computer Vision Assisted Virtual Reality Calibration,” IEEE Robotic and Automation, June 1999, pp. 450-464.
[7] A. C. Sanderson and L. E. Weiss, “Image-based visual servo control using relational graph error signals,” Proceedings of the IEEE International Conference on Cybernetics and Society, 1980, pp. 1074-1077.
[8] L. E. Weiss, A. C. Sanderson and C. P. Neuman, “Dynamic Sensor-Based Control of Robots with Visual Feedback,” IEEE Journal of Robotics & Automation, vol. RA-3, no. 5, October 1987, pp. 404-417.
[9] S. Hutchinson, G. D. Hager and P. I. Corke, “A Tutorial on Visual Servo Control,” IEEE Transaction on Robotics & Automation, vol. 12, no. 5, October 1996, pp. 651-670.
[10] W. J. Wilson, C. C. W. Hulls, and G. S. Bell, “Relative End-Effector Control Using Cartesian position Based Visual Servoing,” IEEE Transactions on Robotics and Automation, vol. 12, no. 5, October 1996, pp. 684-696.
[11] K. Hashimoto, T. Kimoto, T. Ebine and H. Kimura, “Manipulator Control with Image-Based Visual Servo,” Proceedings of the 1991 IEEE International Conference on Robotics and Automation Sacramento, California, April 1991.
[12] E. Malis, F. Chaumette and S. Boudet, “2½D visual servoing,” IEEE Transactions on Robotics and Automation, vol. 15, no. 2, April 1999, pp. 238-250.
[13] S. Hutchinson, G. D. Hager and P. I. Corke, “A Tutorial on Visual Servo Control,” IEEE Transaction on Robotics & Automation, vol. 12, no. 5, October 1996, pp. 651-670.
[14] “OpenGL Transformation”, http://www.songho.ca/opengl/gl_transform.html, 2018/03/19.
[15] W. H. Besant, “Conic Sections Treated Geometrically,” Cambridge : Deighton, Bell; London : G. Bell and sons, 1890.
[16] J. He, R. Zhou and Z. Hong, “Modified fast climbing search auto-focus algorithm with adaptive step size searching technique for digital camera,” IEEE Transactions on Consumer Electronics, June 2003, pp. 257-262.
[17] R. J. Chang, J. C. Jau, “Augmented Reality in Peg-in-Hole Microassembly Operations,” Int. J. of Automation Technology vol. 10 no. 3, 2016, pp. 438-446.
[18] R. J. Chang, J. C. Jau, “Error Measurement and Compensation in Developing Virtual-Reality-Assisted Microassembly System,” Int. J. Auto. Tech., vol. 9, no. 6, 2015, pp. 619-628.
[19] Z. Y. Zhang, “Flexible Camera Calibration by Viewing a Plane from Unknown Orientations,” IEEE Int. Conf. on Computer Vision, September 1999, pp. 666-673.
[20] Y. I. Abdel-Aziz and H. M. Karara, “Direct linear transformation from comparator coordinates into object space coordinates in close-range photogrammetry,” Photogrammetric Engineering & Remote Sensing, vol. 81, no. 2, February 2015, pp. 103-107.
[21] F. Chaumette and S. Hutchinson, “Visual servo control. I. Basic approaches,” IEEE Robotics & Automation Magazine, vol. 13, no. 4, December 2006, pp. 82-90.
[22] F. Chaumette and S. Hutchinson, “Visual servo control. II. Advanced approaches,” IEEE Robotics & Automation Magazine, vol. 14, no. 1, March 2007, pp. 109-118.
[23] B. Espiau, F. Chaumette, and P. Rives, “A new approach to visual servoing in robotics,” IEEE Trans. Robotics and Automation, vol. 8, no.3, June 1992, pp. 313-326.
[24] S. Hutchinson, G. Hager, and P. Corke, “A tutorial on visual servo control,” IEEE Trans. Robot. Automat., vol. 12, no. 5, October 1996, pp. 651-670.
[25] E. Malis, “Improving vision-based control using efficient second-order minimization techniques,” IEEE International Conference on Robotics and Automation, April 2004, pp. 1843–1848.
[26] 劉俊甫, “3-D Model-Based Augmented Reality and Its Application to Micro-Assembly,” Master's thesis, Department of Mechanical Engineering, National Cheng Kung University, 2017.
Full-Text Access Rights
  • On-campus browsing/printing of the electronic full text is authorized, available from 2018-08-30.
  • Off-campus browsing/printing of the electronic full text is authorized, available from 2018-08-30.

