System ID: U0026-0812200911441695
Title (Chinese): 具眼在手視覺系統之輪式移動型機械手臂之導航與避障
Title (English): Navigation and Obstacle Avoidance of Wheeled Mobile Manipulators with an Eye-in-Hand Vision System
University: National Cheng Kung University (成功大學)
Department (Chinese): 機械工程學系碩博士班
Department (English): Department of Mechanical Engineering
Academic Year: 93 (ROC calendar, 2004-2005)
Semester: 2
Year of Publication: 94 (ROC calendar, 2005)
Author (Chinese): 蕭義麟
Author (English): Yi-Lin Hsiao
Student ID: n1691464
Degree: Master's
Language: English
Number of Pages: 91
Committee: Committee Member - 孫永年
Committee Member - 林錫寬
Advisor - 蔡清元
Keywords (Chinese): 單眼視覺 (monocular vision), 導航 (navigation), 循跡 (path following), 避障 (obstacle avoidance), 定位 (positioning), 移動式機器人 (mobile robot)
Keywords (English): positioning, obstacle avoidance, path following, navigation, mobile robot, monocular vision
Subject Classification:
Abstract (Chinese):  In automated production, intense market competition and rising labor costs mean that the traditional approach to material transport, based on manual handling and fixed conveyor lines, not only occupies floor space and makes the production line hard to reconfigure, but also requires human supervision, which further raises production costs; it can no longer satisfy the needs of production automation. To follow the market trend toward small-volume, wide-variety products, transport routes must be changed quickly to fit new factory layouts. An automated material handling system that combines a mobile platform with a robot manipulator offers excellent mobility, meets the transport requirements of automated production, and is therefore widely used in material handling tasks.

  This thesis integrates machine vision into the motion control of the material handling system; compared with infrared and ultrasonic sensors, vision offers better resolution and suffers less from noise. In the vision subsystem, a camera mounted on the mobile platform captures information about the surrounding environment, from which a computer estimates the distance between the mobile platform and the workstation and detects obstacles for path planning. Compared with overhead fixed cameras, this arrangement greatly reduces the number of cameras required. This study uses monocular vision to guide the handling system, replacing the reflective tapes and magnetic sensors used previously and reducing the difficulty of changing transport routes. Color segmentation and geometric feature analysis extract and recognize floor landmarks that guide the handling system, and a position-based look-then-move control architecture drives it through path following and positioning at a station. Because the camera has a wide field of view, it is also used to detect obstacles on the transport path; the system steers around obstacles while moving and, once they are fully cleared, corrects its heading and returns to the original path. (A landmark-detection sketch follows below.)
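
  To make the landmark-extraction step concrete, the following is a minimal Python/OpenCV sketch of color segmentation followed by a simple geometric test (area and circularity), in the spirit of the pipeline described above. The HSV thresholds, blob-size limit, shape test and the function name detect_landmark are illustrative assumptions, not the thesis's actual parameters or code.

```python
# Minimal sketch (OpenCV >= 4): color segmentation plus a simple geometric
# feature check for a colored floor landmark. All thresholds are assumed.
import cv2
import numpy as np

# Hypothetical HSV range for the landmark color (assumed blue-ish marker).
LANDMARK_HSV_LOW = np.array([100, 120, 60])
LANDMARK_HSV_HIGH = np.array([130, 255, 255])

def detect_landmark(bgr_frame):
    """Return the image centroid (u, v) of the largest landmark-colored blob
    that passes a circularity test, or None if nothing plausible is found."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LANDMARK_HSV_LOW, LANDMARK_HSV_HIGH)

    # Morphological opening/closing to suppress noise
    # (cf. the morphological processing step, Sec. 3.1.3 of the thesis).
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    best = None
    for c in contours:
        area = cv2.contourArea(c)
        if area < 400:                      # assumed minimum blob size (px)
            continue
        perimeter = cv2.arcLength(c, True)
        circularity = 4.0 * np.pi * area / (perimeter * perimeter + 1e-9)
        if circularity < 0.6:               # assumed shape test for the marker
            continue
        if best is None or area > best[0]:
            m = cv2.moments(c)
            best = (area, (m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return None if best is None else best[1]
```

  In a position-based look-then-move loop, the returned image centroid would be converted to a ground-plane position before the steering command is computed; that conversion is not shown here.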

  Finally, experiments are conducted in which image processing and machine vision techniques capture environmental information ahead of the mobile platform, and the implemented navigation controller and obstacle-avoidance path planner carry out path following, obstacle avoidance and positioning control of the mobile platform. The test results verify that the proposed methods are effective and feasible.
Abstract (English):  Facing strong business competition and rising labor costs, companies pursue flexibility and quick response to meet the demand for short-life-cycle, small-volume, wide-variety products in their production lines. To support such diverse and flexible production patterns, material transportation routes must be adjusted rapidly for new production lines. Mobile manipulators, which comprise a mobile base and a robot manipulator equipped with a vision system, are well suited to flexible manufacturing systems (FMS) in automated manufacturing processes. Such material handling systems transfer materials between stations efficiently and flexibly.

 This study adopts a single CCD camera for environmental sensing owing to its large detection range and better resolution than ultrasonic or infrared sensors. The camera is mounted on the end-effector of the manipulator and captures the forward scene. The vision system provides distance information from the mobile base to a landmark, station or obstacle. This work aims to advance the method used to position the developed vision-guided material handling system. Compared with the overhead-camera configuration, in which several cameras are distributed at equal intervals throughout the workspace, the eye-in-hand configuration greatly reduces the number of cameras required. Fast landmark recognition and obstacle detection based on color segmentation are proposed for path following, obstacle avoidance and mobile base positioning. Using machine vision, a vision-based vector field histogram (VFH) method is derived by modifying the classical VFH and applied to guide the mobile manipulator around obstacles. Based on the detected landmarks, the mobile base plans trajectories for path following, positions itself accurately beside a station, and determines the steering angle and forward velocity for obstacle avoidance. (An illustrative VFH-style sketch follows below.)
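
 As an illustration of the histogram-based steering idea referred to above, here is a small Python sketch of a classical VFH-style decision: obstacle samples, for example obstacle pixels mapped to ground-plane angles and distances (an assumed interface, not the thesis's exact formulation), are binned into angular sectors, and the free sector closest to the target heading is chosen. The sector count, density threshold and distance weighting are assumed values; the thesis's modified, vision-based VFH is not reproduced here.

```python
# Illustrative VFH-style steering decision; all parameters are assumed.
import numpy as np

N_SECTORS = 36                 # 10-degree angular sectors (assumed)
DENSITY_THRESHOLD = 0.3        # sectors below this count as free (assumed)

def vfh_steering(obstacle_points, target_angle_deg):
    """obstacle_points: iterable of (angle_deg, distance_m) obstacle samples
    in the robot frame. Returns the steering angle (deg) at the center of the
    free sector closest to the target heading, or None if all sectors are
    blocked (caller should stop)."""
    hist = np.zeros(N_SECTORS)
    for angle, dist in obstacle_points:
        sector = int((angle % 360.0) / (360.0 / N_SECTORS))
        hist[sector] += 1.0 / max(dist, 0.1)   # nearer obstacles weigh more

    if hist.max() > 0:
        hist /= hist.max()                     # normalize densities to [0, 1]

    free = np.flatnonzero(hist < DENSITY_THRESHOLD)
    if free.size == 0:
        return None

    # Sector centers in degrees, then pick the one angularly closest
    # to the desired heading (wrap-around handled via modular arithmetic).
    centers = free * (360.0 / N_SECTORS) + (180.0 / N_SECTORS)
    diff = (centers - target_angle_deg + 180.0) % 360.0 - 180.0
    return float(centers[np.argmin(np.abs(diff))])
```

 In a full controller, the chosen heading would feed the steering and velocity loop, and a smooth transition path (e.g. the cubic Bézier generator of Sec. 4.2.5) would bring the base back to the original route once the obstacle is cleared.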

 Finally, the proposed guidance algorithms, including path following, obstacle avoidance and positioning beside a station, are assessed on the mobile manipulator. The experimental results validate the proposed approach for visually navigating a mobile manipulator.
Table of Contents
Abstract i
Table of Contents ii
List of Tables v
List of Figures vi

1 Introduction 1
1.1 Preface 1
1.2 Motivation and Objective 2
1.3 Literature Survey 3
1.4 Contribution 6
1.5 Thesis Organization 7
2 Background 8
2.1 Brief Introduction to Mobile Manipulator 8
2.2 Mobile Manipulator Architecture 9
2.2.1 Mobile Base 9
2.2.2 Robot Manipulator 10
2.2.3 Vision Subsystem 10
2.3 Mobile Manipulator Communication 11
2.4 Mobile Robot Kinematics 12
2.4.1 Locomotion 13
2.4.2 Localization 13
2.4.3 Path Tracking 14
3 Image Processing and Machine Vision 22
3.1 Image Preprocessing 22
3.1.1 Color Space Conversion 22
3.1.2 Filtering 23
3.1.3 Morphological Processing 25
3.2 Landmark Detection 26
3.2.1 Color Segmentation 26
3.2.2 Geometric Analysis 27
3.3 Obstacle Detection 29
3.3.1 Background Model 29
3.3.2 Background Segmentation 31
3.3.3 Post Processing 32
3.4 Machine Vision 32
3.4.1 Camera Projection Model 33
3.4.2 Monocular Distance Perception 34
3.4.3 Distance Estimation 34
3.4.4 Modified Distance Estimation 35
4 Mobile Base Guidance 46
4.1 Path Following 46
4.1.1 Determining the Deviation 47
4.1.2 Steering and Velocity Control 48
4.2 Obstacle Avoidance 49
4.2.1 Wall Following Method 50
4.2.2 Potential Field Method 50
4.2.3 Vector Field Histogram Method 51
4.2.4 Vision Based Vector Field Histogram Method 52
4.2.5 Cubic Bézier Path Generator 54
4.3 Positioning 55
4.3.1 Accurate Alignment 56
4.4 Decision-Making Procedures 57
5 Experimentation 65
5.1 Experimental Setup 65
5.1.1 Camera Calibration 66
5.1.2 Monocular Distance Perception 66
5.2 Path Following 67
5.3 Obstacle Avoidance 69
5.4 Positioning 70
6 Conclusion 85
6.1 Summary 85
6.2 Future Improvements 86
Bibliography 88
Full-Text Use Authorization
  • On-campus browsing/printing of the electronic full text is authorized; available to the public from 2008-09-06.
  • Off-campus browsing/printing of the electronic full text is authorized; available to the public from 2008-09-06.

