System ID: U0026-0908201016515000
Title (Chinese): 人形足球機器人運球避障策略之設計與實現
Title (English): Design and Implementation of Obstacle Avoidance and Dribbling Strategy for Humanoid Soccer Robot
University: National Cheng Kung University
Department (Chinese): 電機工程學系碩博士班
Department (English): Department of Electrical Engineering
Academic Year: 98
Semester: 2
Year of Publication: 99 (2010)
Author (Chinese): 孫志傑
Author (English): Chih-Chieh Sun
Student ID: n2697414
Degree: Master
Language: English
Pages: 69
Oral Defense Committee: Advisor - 李祖聖; Members - 孫育義, 呂虹慶, 郭逸平, 王明賢
Keywords (Chinese): 避障策略, 人形機器人
Keywords (English): obstacle avoidance, humanoid
Subject Classification:
Abstract (Chinese): This thesis addresses the humanoid soccer robot and investigates how to use monocular vision and the objects on the competition field to determine the robot's own position precisely, and then to recognize obstacles so that the robot can dribble the ball while avoiding them. For image processing, the YUV color space is adopted because it reduces the influence of lighting changes and thus makes object recognition more accurate. The conversion from RGB to YUV is carried out with integer arithmetic, removing floating-point operations to raise computational efficiency, and a YUV look-up table (LUT) is built to make the overall algorithm even more efficient. For visual localization, the camera's intrinsic parameter matrix, extrinsic parameter matrix, and a coordinate transformation are first used to compute the coordinates of each object relative to the humanoid robot, with the origin placed at the center between the robot's two soles. Then, from the relative relation between the objects and the robot, an improved particle filter estimates the robot's position in the world coordinate frame. The thesis describes the image-processing steps in detail, including the algorithms for object recognition and for computing relative coordinates, proposes an improvement rule for the particle filter, and designs a realizable control strategy for the dribbling and obstacle-avoidance system. Finally, real experiments compare the distances obtained before and after calibration to verify the accuracy and robustness of the localization system, and to demonstrate the applicability and effectiveness of the overall dribbling and obstacle-avoidance strategy on this small humanoid robot.
Abstract (English): This thesis investigates an obstacle-avoidance and dribbling strategy for a humanoid soccer robot that uses only monocular vision, in which objects on the field are used both for positioning and for recognizing obstacles. For image processing, the YUV color space is adopted to make object recognition more accurate, because the effect of lighting changes is reduced in YUV space. To improve computational efficiency, the RGB-to-YUV conversion is performed with integer arithmetic instead of floating-point operations, and a YUV look-up table (LUT) further speeds up the whole algorithm. With the center between the soles of the robot's feet taken as the origin, the coordinates of an object relative to the robot are calculated from the camera's intrinsic parameter matrix, extrinsic parameter matrix, and a coordinate transformation. From the relation between the objects and the robot, the robot's position in the world coordinate frame is then estimated with an improved particle filter, and applying this information makes the obstacle-avoidance and dribbling strategy more robust and effective. The thesis focuses on the vision system, including the object-recognition algorithm and the transformation between image and world coordinates. An enhancement scheme for the particle filter is also proposed, and a real-time control strategy is developed for dribbling the ball while avoiding obstacles. Finally, experiments compare the distances measured with and without calibration, verify the accuracy and robustness of the local positioning system, and show the feasibility and validity of the proposed obstacle-avoidance and dribbling strategy for humanoid soccer robots.
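
The integer-only RGB-to-YUV conversion and the YUV look-up table mentioned in the abstract can be illustrated with a short sketch. This is not the thesis code: the fixed-point coefficients below are the common BT.601 full-range approximation scaled by 256, and the per-channel bitmask LUT with placeholder colour ranges only stands in for the table actually built in the thesis.

    # Sketch only: integer RGB -> YUV plus per-channel bitmask LUTs for colour
    # segmentation. Coefficients are a BT.601 full-range fixed-point
    # approximation (x256); the colour ranges are illustrative placeholders.

    BALL, FIELD, GOAL = 0x01, 0x02, 0x04      # bit flags for colour classes

    def rgb_to_yuv_int(r, g, b):
        """8-bit RGB -> 8-bit YUV using integer arithmetic only (no floats)."""
        y = ( 77 * r + 150 * g +  29 * b) >> 8
        u = ((-43 * r -  85 * g + 128 * b) >> 8) + 128
        v = ((128 * r - 107 * g -  21 * b) >> 8) + 128
        return y, u, v

    def build_channel_lut(ranges):
        """256-entry table; each entry is the OR of all classes covering that value."""
        lut = [0] * 256
        for flag, (lo, hi) in ranges.items():
            for i in range(lo, hi + 1):
                lut[i] |= flag
        return lut

    # Placeholder ranges per channel for each colour class (tuned on a real field).
    y_lut = build_channel_lut({BALL: (60, 220),  FIELD: (40, 180),  GOAL: (80, 230)})
    u_lut = build_channel_lut({BALL: (0, 110),   FIELD: (100, 140), GOAL: (130, 255)})
    v_lut = build_channel_lut({BALL: (150, 255), FIELD: (100, 140), GOAL: (90, 140)})

    def classify(r, g, b):
        """A pixel belongs to a class only if all three channels agree (bitwise AND)."""
        y, u, v = rgb_to_yuv_int(r, g, b)
        return y_lut[y] & u_lut[u] & v_lut[v]

With three 256-entry tables, classification of a pixel costs one conversion, three table reads, and two AND operations, which is why the LUT form is faster than re-evaluating colour thresholds per pixel.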
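
The relative-coordinate computation (from a pixel to a position on the floor in the feet-centred robot frame) can likewise be sketched as back-projecting the pixel through the intrinsic matrix, rotating the resulting ray with the camera's extrinsic pose, and intersecting it with the ground plane. The intrinsic matrix, tilt angle, and camera height below are hypothetical placeholders, not the calibrated values from the thesis.

    # Sketch only: recover the floor position of an object seen at pixel (u, v).
    # Back-project through K, rotate the ray with the camera pose (R, t) taken
    # from the robot's kinematics, and intersect with the ground plane z = 0.

    import numpy as np

    def pixel_to_robot_frame(u, v, K, R, t):
        """Return (x, y) of the object on the floor in the feet-centred robot frame."""
        ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])   # ray in camera coords
        ray_robot = R @ ray_cam                              # ray in robot coords
        s = -t[2] / ray_robot[2]                             # scale where the ray meets z = 0
        p = t + s * ray_robot
        return p[0], p[1]

    K = np.array([[600.0,   0.0, 320.0],      # focal lengths and principal point
                  [  0.0, 600.0, 240.0],
                  [  0.0,   0.0,   1.0]])
    theta = np.deg2rad(50.0)                  # camera pitched down toward the floor
    R_axes = np.array([[ 0.0,  0.0, 1.0],     # camera axes (x right, y down, z forward)
                       [-1.0,  0.0, 0.0],     # expressed in the robot frame
                       [ 0.0, -1.0, 0.0]])    # (x forward, y left, z up)
    R_tilt = np.array([[ np.cos(theta), 0.0, np.sin(theta)],
                       [ 0.0,           1.0, 0.0          ],
                       [-np.sin(theta), 0.0, np.cos(theta)]])
    R = R_tilt @ R_axes                       # full camera-to-robot rotation
    t = np.array([0.05, 0.0, 0.45])           # camera position over the feet centre (m)

    print(pixel_to_robot_frame(320, 400, K, R, t))   # e.g. the ball seen low in the image

The intersection step assumes the observed object (ball, line, obstacle base) lies on the field plane, which is the usual monocular trick for recovering range without stereo.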
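
Finally, a generic Monte Carlo localization cycle shows the kind of particle filter the abstract refers to: predict each particle with noisy odometry, weight it by a fitness comparing expected and measured landmark distances, and resample only when the effective sample size falls below a threshold. The motion model, fitness function, and landmark reading are simple stand-ins, not the improved hierarchical scheme proposed in the thesis.

    # Sketch only: basic Monte Carlo localization with an effective-sample-size
    # test before resampling. Noise levels and the observation are placeholders.

    import numpy as np

    N = 200
    particles = np.zeros((N, 3))                          # (x, y, heading) on the field
    particles[:, 0] = np.random.uniform(-3.0, 3.0, N)
    particles[:, 1] = np.random.uniform(-2.0, 2.0, N)
    particles[:, 2] = np.random.uniform(-np.pi, np.pi, N)
    weights = np.full(N, 1.0 / N)

    def predict(particles, step, turn):
        """Move every particle by the commanded step and turn plus Gaussian noise."""
        d = step + np.random.normal(0.0, 0.02, len(particles))
        particles[:, 0] += d * np.cos(particles[:, 2])
        particles[:, 1] += d * np.sin(particles[:, 2])
        particles[:, 2] += turn + np.random.normal(0.0, 0.05, len(particles))

    def update(particles, weights, landmark_xy, measured_dist, sigma=0.3):
        """Fitness: how well each particle's expected landmark distance matches vision."""
        expected = np.linalg.norm(particles[:, :2] - landmark_xy, axis=1)
        weights *= np.exp(-0.5 * ((expected - measured_dist) / sigma) ** 2) + 1e-12
        weights /= weights.sum()

    def maybe_resample(particles, weights, threshold=0.5):
        """Resample (multinomial) only when the effective sample size drops too low."""
        n_eff = 1.0 / np.sum(weights ** 2)
        if n_eff < threshold * len(weights):
            idx = np.random.choice(len(weights), len(weights), p=weights)
            particles[:] = particles[idx]
            weights[:] = 1.0 / len(weights)

    # One cycle: walk 5 cm forward, then observe a goal post 2.1 m away (hypothetical).
    predict(particles, 0.05, 0.0)
    update(particles, weights, np.array([4.5, 0.0]), 2.1)
    maybe_resample(particles, weights)
    estimate = np.average(particles[:, :2], axis=0, weights=weights)

Gating the resampling step on the effective sample size keeps particle diversity when observations are uninformative, which matters on a walking robot whose camera shakes and whose odometry drifts.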
Table of Contents:
Abstract I
Acknowledgment III
Contents IV
List of Figures VII
List of Tables X
Chapter 1. Introduction 1
1.1 Motivation 1
1.2 Thesis Organization 2
Chapter 2. Hardware of the Humanoid Robot 4
2.1 Introduction 4
2.2 Mechanism Design 6
2.3 Central Process Unit 7
2.4 Vision System 9
2.4.1 Image Sensor 9
2.4.2 Human Machine Interface 11
2.5 Actuators 11
2.6 Power System 14
2.7 Summary 16
Chapter 3. Vision System 17
3.1 Introduction 17
3.2 Object Recognition 18
3.2.1 Ball Recognition 18
3.2.2 Recognition of Interval between Obstacles 21
3.2.3 Goal Recognition 23
3.3 Target Position Derivation 25
3.3.1 Camera Calibration 25
3.3.2 Target Position Derivation 27
3.4 Summary 32
Chapter 4. Control Strategy 34
4.1 Introduction 34
4.2 The Algorithm for Localization 35
4.2.1 Grid Map 36
4.2.2 Line Contribution 37
4.2.3 Fitness Function 38
4.2.4 Threshold for Resample 41
4.2.5 Hierarchy Particle Filter 42
4.3 The Control Strategy for Common Behaviors 43
4.4 The Control Strategy for Obstacle Avoidance and Dribbling 44
4.4.1 Stable Motions 45
4.4.2 The Direction of the Forward 45
4.4.3 Obstacle Avoidance and Dribbling 45
4.4.4 The Strategy for Obstacle Avoidance and Dribbling 46
4.5 Summary 48
Chapter 5. Experimental Results 49
5.1 Introduction 49
5.2 Experimental Results of Calculating Target Information 50
5.3 Experimental Results of Strategy for Localization 53
5.4 Experimental Results of Strategy for Obstacle Avoidance and Dribbling 57
Chapter 6. Conclusions and Future Works 62
6.1 Conclusions 62
6.2 Future Works 63
References 65
Biography 68
Full-Text Availability:
  • On-campus browsing/printing of the electronic full text authorized from 2013-08-17.
  • Off-campus browsing/printing of the electronic full text authorized from 2015-08-17.