System ID U0026-0812200914372706
Title (Chinese) 移動式機器人即時影像目標物追蹤與避障系統之設計與實現
Title (English) Real-Time Image Processing Target Tracking and Obstacle Avoidance for Mobile Robot
University National Cheng Kung University
Department (Chinese) 電機工程學系碩博士班
Department (English) Department of Electrical Engineering
Academic Year 96 (ROC calendar)
Semester 2
Year of Publication 97 (ROC calendar; 2008)
Author (Chinese) 薛柏彥
Author (English) Bo-Yan Hsueh
E-mail n2695124@mail.ncku.edu.tw
Student ID n2695124
Degree Master
Language English
Pages 87
Oral Defense Committee Committee member: 呂虹慶
Convener: 孔蕃鉅
Committee member: 陳大道
Committee member: 郭逸平
Advisor: 李祖聖
Keywords (Chinese) 目標物追蹤, 立體視覺, 移動式機器人, 避障
Keywords (English) Mobile robot, stereo vision, obstacle avoidance, target tracking
Subject Classification
Chinese Abstract (translated) This thesis presents the design of a stereo-vision image processing system for real-time target tracking and obstacle avoidance by a mobile robot in an indoor environment. The vision system combines several image processing techniques, including color segmentation, averaging filtering, erosion and dilation, and edge detection, to locate the actual target and obstacles in the environment. It computes the angular positions of the target and obstacles relative to the mobile robot, and uses stereo vision to obtain their actual distances from the robot. Based on the proximity of the target and obstacles to the robot, the system selects among different behavior modes to compute a tracking path or an obstacle avoidance path, and drives the mobile robot to avoid obstacles during its patrol, preventing collisions while successfully tracking the target. Finally, experimental results verify the effectiveness and applicability of the proposed method.
English Abstract This thesis proposes an image processing approach for real-time target tracking and obstacle avoidance for mobile robot navigation in an indoor environment using a stereo vision sensor. Several image processing techniques, including the averaging filter, edge detection, erosion and dilation, and color segmentation, are combined to find the target and obstacles. The angular positions of the detected target and obstacles relative to the mobile robot in the corridor are then computed, and stereo vision is utilized to calculate their distances from the mobile robot. From these distances, the spatial relationship of the target and obstacles to the mobile robot is determined, and the best target tracking path or obstacle avoidance path is selected through different behavior modes. The mobile robot thus plans a collision-free path and successfully tracks the target to complete its patrol routine. Finally, practical experiments demonstrate the feasibility and effectiveness of the proposed schemes.
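As a rough illustration of the two computations the abstract describes, the sketch below shows the classic depth-from-disparity relation (Z = f·B/d) used by stereo vision and a minimal obstacle-detection preprocessing chain (averaging filter, edge strength, binarization). This is not taken from the thesis: the focal length, baseline, kernel size, and threshold are assumed values, and the thesis's actual algorithms may differ.

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Pinhole stereo relation Z = f * B / d.
    disparity_px: horizontal pixel shift of a matched point between
    the left and right images (must be positive)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

def preprocess(gray):
    """Averaging filter -> gradient magnitude -> binarization,
    mirroring the obstacle-detection steps listed in the abstract.
    Kernel size (3x3) and threshold (20) are illustrative choices."""
    k = np.ones((3, 3)) / 9.0
    h, w = gray.shape
    # 3x3 mean filter over the valid interior region
    smoothed = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            smoothed[i, j] = np.sum(gray[i:i + 3, j:j + 3] * k)
    # simple first-difference edge strength in x and y
    gx = np.abs(np.diff(smoothed, axis=1))[:-1, :]
    gy = np.abs(np.diff(smoothed, axis=0))[:, :-1]
    edges = gx + gy
    # fixed-threshold binarization
    return (edges > 20).astype(np.uint8)

# usage: assumed focal length 500 px and 10 cm stereo baseline
z = depth_from_disparity(disparity_px=25, focal_px=500.0, baseline_m=0.10)
print(round(z, 2))  # 2.0 (meters)
```

Note how depth falls off as 1/d: nearby obstacles produce large disparities and are measured more precisely, which suits the close-range avoidance decisions described here.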
Table of Contents Abstract I
Acknowledgment III
Contents IV
List of Figures VII
List of Tables XI

Chapter 1. Introduction 1
1.1 Motivation 1
1.2 Thesis Organization 3
Chapter 2. Overview of the Surveillance and Security Robot 4
2.1 Introduction 4
2.2 Mobile Robot System 5
2.3 Hardware Architecture of the Mobile Robot 6
2.3.1 The Vision Module 6
2.3.2 The Driver and DC Motor Module 7
2.3.3 Central Processor Units: NB 10
2.3.4 Battery Module and Voltage Regulator Module 11
2.4 Hardware Configuration of the Mobile Robot 11
Chapter 3. Stereo Vision 13
3.1 Introduction 13
3.2 Perspective Geometry 14
3.3 The Depth of Stereo Image 15
3.4 Summary 18
Chapter 4. Vision System 19
4.1 Introduction 19
4.2 Obstacle Detection 21
4.2.1 Image Pre-processing 21
4.2.2 Grayscale Manipulation 23
4.2.3 Averaging Filter 24
4.2.4 Edge Detection Approach 26
4.2.5 Binarization 28
4.2.6 Connected Component Labeling 28
4.3 Target Detection 34
4.3.1 Color Segmentation 35
4.3.2 Mathematical Morphology 36
4.3.3 Differentiate between Obstacles and Target 41
4.4 Stereo Matching 42
4.4.1 Area-Based Technique 42
4.5 Summary 49
Chapter 5. Control Strategy System 50
5.1 Introduction 50
5.2 The Coordinate System of the Mobile Robot 53
5.3 Fuzzy Logic Controller of the Mobile Robot for Target Tracking 56
5.3.1 Fuzzification Interface (FI) 56
5.3.2 Decision Making Logic (DML) 58
5.3.3 Knowledge Base (KB) 59
5.3.4 Defuzzification Interface (DFI) 60
5.4 The Control Strategy for Target Detection 62
5.5 The Control Strategy for Obstacle Avoidance 64
5.6 Summary 68
Chapter 6. Experimental Results 69
6.1 Introduction 69
6.2 Experimental Results of Strategy for Target Tracking 70
6.3 Experimental Results of Strategy for Obstacle Avoidance 73
6.3.1 Obstacle Avoidance Strategy with One Obstacle 73
6.3.2 Obstacle Avoidance Strategy with Multiple Obstacles 76
Chapter 7. Conclusions and Future Work 81
7.1 Conclusions 81
7.2 Future Work 82
References 83
Biography 87
References [1] A. Elfes, “Using occupancy grid for mobile robot perception and navigation,” IEEE Computer Magazine, Vol. 22, No. 6, pp. 46-57, Jun. 1989.
[2] H. Li and S. X. Yang, “Ultrasonic Sensor based Fuzzy Obstacle Avoidance Behaviors,” in Proc. IEEE Int. Conf. on System, Man and Cybernetics, Vol. 2, pp. 644-649, 2002.
[3] J. Hancock, M. Hebert and C. Thorpe, “Laser intensity-based obstacle detection Intelligent Robots and Systems,” in Proc. IEEE/RSJ Int. Conf. on Intelligent Robotic Systems, Vol. 3, pp. 1541-1546, 1998.
[4] E. Menegatti, A. Pretto, A. Scarpa and E. Pagello, “Omnidirectional Vision Scan Matching for Robot Localization in Dynamic Environments,” IEEE Trans. on Robotics and Automation, Vol. 22, pp. 523-535, 2006.
[5] E. Elkonyaly, F. Areed, Y. Enab and F. Zada, “Range sensory based robot navigation in unknown terrains,” in Proc. SPIE on the International Society for Optical Engineering, Vol. 2591, pp. 76-85, 1995.
[6] C. Lin and R. L. Tummala, “Adaptive Sensor Integration for Mobile Robot Navigation,” in Proc. IEEE Int. Conf. on Multisensor Fusion and Integration for Intelligent System, pp. 85-91, Oct. 1994.
[7] T. H. S. Li and S. J. Chang, “Fuzzy target tracking control of autonomous mobile robots by using infrared sensors,” IEEE Trans. on Fuzzy Syst., Vol. 12, No. 4, pp. 491-501, 2004.
[8] K. Sugihara, “Some location problems for robot navigation using a single,” Computer Vision, Graphics and Image Processing, Vol. 42, No. 1, pp. 112-129, Apr. 1988.
[9] T. Tsubouchi and S. Yuta, “Map assisted vision system of mobile robots for reckoning in a building environment,” in Proc. IEEE Int. Conf. on Robotics Automat., pp. 1978-1984, Mar./Apr. 1987.
[10] Y. Yagi, S. Kawato and S. Tsuji, “Collision avoidance using omnidirectional image sensor (COPIS),” in Proc. IEEE Int. Conf. on Robotics and Automation, Vol. 1, pp. 910-915, Apr. 1991.
[11] Y. Sun, Q. Cao and W. Chen, “An object tracking and global localization method using omnidirectional vision system,” in Proc. 5th World Congress on Intelligent Control and Automation, Vol. 6, pp. 4730-4735, Jun. 2004.
[12] D. J. Kriegman, E. Triendl and T. O. Binford, “Stereo vision and navigation in buildings for mobile robots,” IEEE Trans. on Robotics Automat., Vol. 5, No. 6, pp. 792-803, Dec. 1989.
[13] C. Caraffi, S. Cattani and P. Grisleri, “Off-Road Path and Obstacle Detection Using Decision Networks and Stereo Vision,” IEEE Trans. on Intelligent Transportation Systems, Vol. 8, pp. 607-618, Dec. 2007.
[14] N. Ayache and F. Lustman, “Trinocular Stereo Vision for Robotics,” IEEE Trans. on Pattern Analysis and Machine Intelligence, Vol. 13, No. 1, pp. 73-85, 1991.
[15] S. O. Lee, Y. J. Cho, M. H. Bo, B. J. You and S. R. Oh, “A stable target tracking control for unicycle mobile robots,” in Proc. Int. Conf. on Intelligent Robots and Systems, Vol. 3, pp. 1822-1827, 2000.
[16] T. Darrell, G. Gordon, M. Harville and J. Woodfill, “Integrated person tracking using stereo, color and pattern detection,” in Proc. IEEE Computer Society Conf. on Computer Vision and Pattern Recognition, pp. 601-608, Jun. 1998.
[17] J. Orwell, P. Remagnino and G. A. Jones, “Multi-camera colour tracking,” in IEEE Workshop on Visual Surveillance, pp. 14-21, Jul. 1999.
[18] T. H. S. Li, S. J. Chang and W. Tong, “Fuzzy target tracking control of autonomous mobile robots by using infrared sensors,” IEEE Trans. on Fuzzy Syst., Vol. 12, No. 4, pp. 491-501, 2004.
[19] R. C. Luo and T. M. Chean, “Autonomous mobile target tracking system based on gray-fuzzy control algorithm,” IEEE Trans. on Ind. Electron., Vol. 47, No. 4, pp. 920-931, 2000.
[20] F. J. Montecillo-Puente, V. Ayala-Ramirez, A. Perez-Garcia and R. E. Sanchez-Yanez, “Fuzzy color tracking for robotic tasks,” in Proc. IEEE Int. Conf. on Systems, Man and Cybernetics, Vol. 3, pp. 2769-2773, 2003.
[21] M. Sonka, V. Hlavac and R. Boyle, Image Processing, Analysis, and Machine Vision, Prentice-Hall, New Jersey, 1998.
[22] R. Jain, R. Kasturi and B. G. Schunck, Machine Vision, Prentice-Hall, New Jersey, 1995.
[23] http://www.logitech.com/index.cfm/webcam_communications/webcams/devices/245&cl=tw,zh
[24] Instruction Manual, Series MCDC 2805, Motion Controller for DC-Micromotors, Faulhaber Co.
[25] Instruction Manual, RE-36 and HEDL 5540, http://www.maxonmotor.com/, Maxon Motor Ag.
[26] http://www-307.ibm.com/pc/support/site.wss/document.do?sitestyle=lenovo&lndocid=MIGR-59144
[27] J. Lu, H. Cai, J. G. Lou and J. Li, “An Epipolar Geometry-Based Fast Disparity Estimation Algorithm for Multiview Image and Video Coding,” IEEE Trans. on Circuits and Systems for Video Technology, Vol. 17, No. 6, pp. 737-750, 2007.
[28] J. Woodfill and B. V. Herzen, “Real-time Stereo Vision on the PARTS Reconfigurable Computer,” in Proc. The 5th Annual IEEE Symposium on FPGAs for Custom Machines, pp. 201-210, Apr. 1997.
[29] G. Balakrishnan, G. Sainarayanan, R. Navigation and S. Yaacob, “Stereopsis method for visually impaired to identify obstacles based on distance,” in Proc. IEEE Int. Conf. on Image and Graphics, pp. 580-583, Dec. 2004.
[30] J. Huang and Y. Wang, “Compression of Color Facial Images Using Feature Correction Two-Stage Vector Quantization,” IEEE Trans. on Image Processing, Vol. 8, No. 1, pp. 102-109, Jan. 1999.
[31] ITU-R Recommendation BT.601-5: Studio encoding parameters of digital television for standard 4:3 and wide-screen 16:9 aspect ratios.
[32] Y. Yang, P. Yuhua and L. Zhaoquang, “A Fast Algorithm for YCbCr to RGB Conversion,” IEEE Trans. on Consumer Electronics, Vol. 53, No. 4, pp. 1490-1493, Nov. 2007.
[33] S. Suthaharan, “Image and edge detail detection algorithm for object-based coding,” Pattern Recognition Letters, Vol. 21, Nos. 6-7, pp. 549-557, Jun. 2000.
[34] S. T. Acton and D. P. Mukherjee, “Area operators for edge detection,” Pattern Recognition Letters, Vol. 21, No. 9, pp. 771-777, Jul. 2000.
[35] F. Russo and A. Lazzari, “Color edge detection in presence of Gaussian noise using nonlinear prefiltering,” IEEE Trans. on Instrumentation and Measurement, Vol. 54, No. 1, pp. 352-358, Feb. 2005.
[36] B. Xiangzhi and Z. Fugen, “Edge Detection Based on Mathematical Morphology and Iterative Thresholding,” in Proc. IEEE Int. Conf. on Computational Intelligence and Security, Vol. 2, pp. 1849-1852, Nov. 2006.
[37] M. F. Ercan and Y. F. Fung, “Connected component labeling on a one dimensional DSP array,” in Proc. IEEE Region 10 Conf., Vol. 2, pp. 1299-1302, Sep. 1999.
[38] X. D. Tian, H. Y. Li, X. F. Li and L. P. Zhang, “Research on symbol recognition for mathematical expressions,” in Proc. IEEE Int. Conf. on Innovative Computing, Information and Control, Vol. 3, pp. 357-360, Aug. 2006.
[39] A. Fusiello, V. Roberto and E. Trucco, “Efficient stereo with multiple windowing,” in Proc. IEEE Computer Society Conf. on Computer Vision and Pattern Recognition, pp. 858-863, Jun. 1997.
[40] I. Ohya , A. Kosaka and A. Kak, “Vision-based navigation by a mobile robot with obstacle avoidance using single-camera vision and ultrasonic sensing,” IEEE Trans. on Robotics and Automation, Vol. 14, No. 6, pp. 969-978, Dec. 1998.
[41] P. Veelaert and W. Bogaerts, “Ultrasonic potential field sensor for obstacle avoidance,” IEEE Trans. on Robotics and Automation, Vol. 15, pp. 774-779, Aug. 1999.
[42] J. Borenstein and Y. Koren, “Real-time obstacle avoidance for fast mobile robots,” IEEE Trans. on Systems, Man, and Cybernetics, Vol. 19, No. 5, pp. 1179-1187, Sept./Oct. 1989.
[43] Z. Qu, J. Wang and C. E. Plaisted, “A new analytical solution to mobile robot trajectory generation in the presence of moving obstacles,” IEEE Trans. on Robotics, Vol. 20, No. 6, pp. 978-993, Dec. 2004.
[44] C. Marques and P. Lima, “A localization method for a soccer robot using a vision-based omni-directional sensor,” in Proc. of RoboCup Workshop, Melbourne, Australia, 2000.
[45] G. Adorni, L. Bolognini, S. Cagnoni and M. Mordonini, “Stereo Obstacle Detection Method for Hybrid Omni-directional/Pin-Hole Vision System,” in RoboCup-2001: Robot Soccer World Cup V, pp. 244-250.
Full-Text Availability
  • Authorized for on-campus browsing/printing of the electronic full text, available from 2011-09-08.
  • Authorized for off-campus browsing/printing of the electronic full text, available from 2013-09-08.

