Mobile Robots: Perception & Navigation
Edited by Sascha Kolski

Published by the Advanced Robotic Systems International and pro literatur Verlag
pro literatur Verlag
Robert Mayer-Scholz
Mammendorf, Germany

Abstracting and non-profit use of the material is permitted with credit to the source. Statements and opinions expressed in the chapters are those of the individual contributors and not necessarily those of the editors or publisher. No responsibility is accepted for the accuracy of information contained in the published articles. The publisher assumes no responsibility or liability for any damage or injury to persons or property arising out of the use of any materials, instructions, methods or ideas contained inside. After this work has been published by the Advanced Robotic Systems International, authors have the right to republish it, in whole or in part, in any publication of which they are an author or editor, and to make other personal use of the work.

© 2007 Advanced Robotic Systems International
www.ars-journal.com
Additional copies can be obtained from: publication@ars-journal.com

First published March 2007
Printed in Croatia

A catalog record for this book is available from the German Library.
Mobile Robots, Perception & Navigation, Edited by Sascha Kolski
p. cm.
ISBN 3-86611-283-1
1. Mobile Robotics. 2. Perception. 3. Sascha Kolski

Preface

Mobile robotics is an active research area in which researchers from all over the world develop new technologies to improve the intelligence of mobile robots and to broaden their areas of application. Today robots show their ability to navigate autonomously in office environments as well as outdoors. Beside the mechanical and electronic challenges of building mobile platforms, perceiving the environment and deciding how to act in a given situation are crucial problems. In this book we focus on these two areas of mobile robotics: perception and navigation.

Perception includes all means of collecting information about the robot itself and its environment. To make robots move in their surroundings and interact with their environment in a reasonable way, it is crucial to understand the actual situation the robot faces. Robots use sensors to measure properties of the environment and interpret these measurements to gather the knowledge needed for safe interaction. The sensors used in the work described in this book include computer vision, range finders, sonars and tactile sensors, and the chapters show how these sensors allow a robot to perceive its environment and to accomplish its task safely. A number of contributions also show how measurements from different sensors can be combined to obtain more reliable and accurate information than a single sensor could provide; this is especially effective when the sensors are complementary in their strengths and weaknesses.

As mobility is an important issue for many robot tasks, robots have to navigate their environments in a safe and reasonable way. In the field of mobile robotics, navigation describes techniques that allow a robot to use the information it has gathered about the environment to reach, in an effective and efficient way, goals that are given a priori or derived from a higher-level task description. The main question of navigation is how to get from where we are to where we want to be.
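The chapters collected here answer this question in many different settings. As a purely illustrative sketch, not taken from any chapter of this book, the following Python fragment reduces the question to its simplest form: a breadth-first search for a collision-free path on a small occupancy grid. The grid, the start and goal cells, and the function name plan_path are example inputs chosen here only for illustration.

    from collections import deque

    def plan_path(grid, start, goal):
        """Breadth-first search on a 2D occupancy grid.
        grid[r][c] == 0 means free, 1 means occupied.
        Returns a list of (row, col) cells from start to goal, or None."""
        rows, cols = len(grid), len(grid[0])
        frontier = deque([start])
        parent = {start: None}          # remembers how each cell was reached
        while frontier:
            cell = frontier.popleft()
            if cell == goal:
                # Reconstruct the path by walking back through the parents.
                path = []
                while cell is not None:
                    path.append(cell)
                    cell = parent[cell]
                return path[::-1]
            r, c = cell
            for step in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                nr, nc = step
                if (0 <= nr < rows and 0 <= nc < cols
                        and grid[nr][nc] == 0 and step not in parent):
                    parent[step] = cell
                    frontier.append(step)
        return None  # goal unreachable

    # Hypothetical example map: 0 = free, 1 = obstacle.
    grid = [[0, 0, 0, 1],
            [1, 1, 0, 1],
            [0, 0, 0, 0]]
    print(plan_path(grid, (0, 0), (2, 3)))

Real robots replace this toy search with the far richer planners discussed in the navigation part of this book, but the underlying question remains the same.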
Researchers have worked on that question since the early days of mobile robotics and have developed many solutions to the problem for different robot environments, including indoor environments, much larger-scale outdoor environments, and underwater navigation.

Beside the question of global navigation, how to get from A to B, navigation in mobile robotics has local aspects. Depending on the architecture of a mobile robot (differential drive, car-like, submarine, plane, etc.), the robot's possible actions are constrained not only by the robot's environment but also by its dynamics. Robot motion planning takes these dynamics into account to choose feasible actions and thus ensure safe motion.

This book gives a wide overview of different navigation techniques, describing both techniques dealing with the local and control aspects of navigation and those handling the global navigation of a single robot and even of a group of robots.

As not only this book shows, mobile robotics is a living and exciting field of research combining many different ideas and approaches to build mechatronic systems able to interact with their environment.

Editor
Sascha Kolski

Contents

Perception

1. Robot Egomotion from the Deformation of Active Contours ... 1
   Guillem Alenya and Carme Torras
2. Visually Guided Robotics Using Conformal Geometric Computing ... 19
   Eduardo Bayro-Corrochano, Luis Eduardo Falcon-Morales and Julio Zamora-Esquivel
3. One Approach to the Fusion of Inertial Navigation and Dynamic Vision ... 45
   Stevica Graovac
4. Sonar Sensor Interpretation and Infrared Image Fusion for Mobile Robotics ... 69
   Mark Hinders, Wen Gao and William Fehlman
5. Obstacle Detection Based on Fusion Between Stereovision and 2D Laser Scanner ... 91
   Raphaël Labayrade, Dominique Gruyer, Cyril Royere, Mathias Perrollaz and Didier Aubert
6. Optical Three-Axis Tactile Sensor ... 111
   Ohka, M.
7. Vision Based Tactile Sensor Using Transparent Elastic Fingertip for Dexterous Handling ... 137
   Goro Obinata, Dutta Ashish, Norinao Watanabe and Nobuhiko Moriyama
8. Accurate Color Classification and Segmentation for Mobile Robots ... 149
   Raziel Alvarez, Erik Millán, Alejandro Aceves-López and Ricardo Swain-Oropeza
9. Intelligent Global Vision for Teams of Mobile Robots ... 165
   Jacky Baltes and John Anderson
10. Contour Extraction and Compression - Selected Topics ... 187
    Andrzej Dziech

Navigation

11. Comparative Analysis of Mobile Robot Localization Methods Based on Proprioceptive and Exteroceptive Sensors ... 215
    Gianluca Ippoliti, Leopoldo Jetto, Sauro Longhi and Andrea Monteriù
12. Composite Models for Mobile Robot Offline Path Planning ... 237
    Ellips Masehian and M. R. Amin-Naseri
13. Global Navigation of Assistant Robots Using Partially Observable Markov Decision Processes ... 263
    María Elena López, Rafael Barea, Luis Miguel Bergasa, Manuel Ocaña and María Soledad Escudero
14. Robust Autonomous Navigation and World Representation in Outdoor Environments ... 299
    Favio Masson, Juan Nieto, José Guivant and Eduardo Nebot
15. Unified Dynamics-based Motion Planning Algorithm for Autonomous Underwater Vehicle-Manipulator Systems (UVMS) ... 321
    Tarun K. Podder and Nilanjan Sarkar
16. Optimal Velocity Planning of Wheeled Mobile Robots on Specific Paths in Static and Dynamic Environments ... 357
    María Prado
17. Autonomous Navigation of Indoor Mobile Robot Using Global Ultrasonic System ... 383
    Soo-Yeong Yi and Byoung-Wook Choi
18. Distance Feedback Travel Aid Haptic Display Design ... 395
    Hideyasu Sumiya
19. Efficient Data Association Approach to Simultaneous Localization and Map Building ... 413
    Sen Zhang, Lihua Xie and Martin David Adams
20. A Generalized Robot Path Planning Approach Without the Cspace Calculation ... 433
    Yongji Wang, Matthew Cartmell, Qing Wang and Qiuming Tao
21. A Pursuit-Rendezvous Approach for Robotic Tracking ... 461
    Fethi Belkhouche and Boumediene Belkhouche
22. Sensor-based Global Planning for Mobile Manipulators Navigation Using Voronoi Diagram and Fast Marching ... 479
    S. Garrido, D. Blanco, M. L. Munoz, L. Moreno and M. Abderrahim
23. Effective Method for Autonomous Simultaneous Localization and Map Building in Unknown Indoor Environments ... 497
    Y. L. Ip, A. B. Rad and Y. K. Wong
24. Motion Planning and Reconfiguration for Systems of Multiple Objects ... 523
    Adrian Dumitrescu
25. Symbolic Trajectory Description in Mobile Robotics ... 543
    Pradel Gilbert and Caleanu Catalin-Daniel
26. Robot Mapping and Navigation by Fusing Sensory Information ... 571
    Maki K. Habib
27. Intelligent Control of AC Induction Motors ... 595
    Hosein Marzi
28. Optimal Path Planning of Multiple Mobile Robots for Sample Collection on a Planetary Surface ... 605
    J. C. Cardema and P. K. C. Wang
29. Multi Robotic Conflict Resolution by Cooperative Velocity and Direction Control ... 637
    Satish Pedduri and K. Madhava Krishna
30. Robot Collaboration for Simultaneous Map Building and Localization ... 667
    M. Oussalah and D. Wilson