Appearance-based mobile robot localization and map-building in unstructured environments


APPEARANCE-BASED MOBILE ROBOT LOCALIZATION AND MAP-BUILDING IN UNSTRUCTURED ENVIRONMENTS

MANA SAEDAN
M.Eng (NUS)

A THESIS SUBMITTED FOR THE DEGREE OF DOCTOR OF PHILOSOPHY
DEPARTMENT OF MECHANICAL ENGINEERING
NATIONAL UNIVERSITY OF SINGAPORE
2007

ACKNOWLEDGMENTS

Over four great years in the PhD program, I have been indebted to many people for their kindness. First of all, I would not be here today without the guidance of my supervisor, Dr. Marcelo H. Ang, Jr. His inspiration, encouragement, and trust carried me through all hurdles.

I would especially like to thank my wife, Janjarut; without her I would not have come this far. Her support and the many sleepless nights she spent accompanying me are invaluable.

Throughout the program I met many inspiring people whose views and experiences have shaped my perspective on life. I would like to thank Professor Alain Berthoz for the opportunity to work in the Laboratoire de Physiologie de la Perception et de l'Action (LPPA), Collège de France, Paris. Many thanks to Panagiota Panagiotagi for her French-to-English translation and for our discussions about our brains, and to Julien Diard for his assistance and advice, which spared me any difficulty while in France.

My colleagues and friends are no less important. I thank Koh Niak Wu, Tirthankar Bandyopadhyay, Lee Gim Hee, and Li Yuanping for sharing and discussing the work, and Lim Chee Wang for his full support during the period I worked at the Singapore Institute of Manufacturing Technology (SIMTech). I sincerely appreciate the support of the technicians and laboratory officers of the Control and Mechatronics Lab.

Lastly, I dedicate this work to my parents, Mr. Thongdee and Ms. Ampai; they are truly my angels.

SUMMARY

Machine vision has gained popularity as a main sensor modality in navigation research because of its relatively low cost compared to other sensors. Furthermore, vision provides rich information about an environment. We develop a vision-based navigation system inspired by the human capability to navigate by sight in daily life. The contributions of this Ph.D. work are as follows:

• Direct image processing of a (circular) image from a catadioptric (omnidirectional) camera system.
• Localization using vision and robot odometry with a hybrid representation of the world using a topological/metric map.
• Integration of feature map building and localization to incrementally build a map and solve the "loop closing" problem, making the system applicable in unknown, dynamically changing environments with good performance in large indoor areas.
• Integration of appearance-based simultaneous localization and map-building with navigation to a goal.

TABLE OF CONTENTS

Acknowledgments
Summary
List of Figures
List of Tables
Nomenclature

Chapters:

1. Introduction
   1.1 State-of-the-Art Localization and SLAM Algorithms
   1.2 Vision-Based Navigation
   1.3 Summary of Contributions
2. Probabilistic Algorithms in Vision-Based Robot Navigation
   2.1 Kalman Filter
   2.2 Mixture of Gaussians (MOG)
   2.3 Particle Filter
   2.4 Summary

3. Image Processing for Vision-Based Navigation
   3.1 Feature Extraction
       3.1.1 Salient Location Identification
       3.1.2 Neighborhood Pixel Extraction
       3.1.3 Feature Description
   3.2 Image Matching
       3.2.1 Global Matching
       3.2.2 Local Matching
   3.3 Experimental Results
   3.4 Summary

4. Appearance-Based Localization in an Indoor Environment
   4.1 System Modeling
       4.1.1 State Model of a Robot
       4.1.2 Measurement (Observation) Model
   4.2 Disbelief Algorithm
   4.3 Self-Localization Experiment
   4.4 Summary

5. Appearance-Based Simultaneous Localization and Map-Building (SLAM)
   5.1 Appearance-Based SLAM
       5.1.1 Map Representation and Particle Sampling
       5.1.2 Particle Weight Update
       5.1.3 Tracking Window to Estimate Robot Position
   5.2 Appearance-Based Map-Building
       5.2.1 Oracle Model
   5.3 Map Management
       5.3.1 Update Absolute Positions of Map Nodes
   5.4 Experimental Setup and Results
   5.5 Integration of SLAM and Navigation to a Goal Tasks
   5.6 Summary

6. Conclusions and Future Work
   6.1 Future Work
       6.1.1 Optimum Area Coverage of Map Nodes
       6.1.2 Appearance Parameters Adjusting and Learning
       6.1.3 Fusion with Range Sensors
       6.1.4 Combination with Metrical Map

Appendices:

A. Algorithms for Appearance-Based Robot Navigation
   A.1 Algorithms for Image Matching (Chapter 3)
   A.2 Algorithms for Localization (Chapter 4)
   A.3 Algorithms for SLAM (Chapter 5)

Bibliography

LIST OF FIGURES

1.1 Main concept of appearance-based navigation
2.1 Overview of the particle filter algorithm
3.1 The coordinate system of an image in our application. The origin of the image is at the bottom-left corner. In this example, the image is 4×4 pixels.
3.2 An overview of feature extraction on an omnidirectional image.
3.3 Example of wavelet decomposition on an original image of 224×224 pixels. The wavelet decomposition is performed four times recursively; each matrix in the last level of decomposition has size 14×14 pixels.

[...]

    par_1 ∈ gP(gTimeStep); CumSum_1 = par_1.weight; u_1 = Rand(0, 1);
    for i ← 2 to cP do
        par_i ∈ gP(gTimeStep); CumSum_i = CumSum_{i−1} + par_i.weight; u_i = Rand(0, 1);
    end
    u = AscendingSort(u);
    i = 1; j = 1; PNew = Null;
    while i ≤ cP do
        if u_i ≤ CumSum_j then
            Add a copy of par_j to PNew; i = i + 1;
        else
            j = j + 1;
        end
    end
    gP(gTimeStep) = PNew;
Algorithm 12: ParticlesReSampling()

    if gSc > gTd then
        TotalRelocatePar = 0;
        foreach map node n in gM do
            ParticlesAtNode = GetNumberParticles(n);
            if ParticlesAtNode > 0.3 cP then
                /* Determine the number of particles to relocate */
                RelocateParticles = 0.4 ParticlesAtNode;
                TotalRelocatePar = TotalRelocatePar + RelocateParticles;
                /* Randomly generate the indices of the particles to be removed */
                Index = GenerateRandomIndex(RelocateParticles);
                /* Remove the particles from the current particle set */
                foreach index in Index do
                    Remove par at index from gP(gTimeStep);
                end
            end
        end
        /* Generate new particles at each candidate node */
        ParticlesAtNode = TotalRelocatePar / TotalCandidates;
        foreach candidate node in gDc do
            node = ExtractNode(gDc);
            PNew = GenerateRandomParticle(node, ParticlesAtNode);
            Add PNew to gP(gTimeStep);
        end
    end
Algorithm 13: Disbelief()

    gTimeStep = 0;
    ParticlesInitialization();
    while the robot is running do
        gTimeStep = gTimeStep + 1;
        gQ = FeatureExtraction(gIq);
        ParticlesSampling();        /* Algorithm 10 */
        ParticlesWeightUpdate();    /* Algorithm 11 */
        ParticlesReSampling();      /* Algorithm 12 */
        Disbelief();                /* Algorithm 13 */
        Determine the node with the maximum number of particles as the localized node;
    end
Algorithm 14: Localization()
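As a concrete companion to the resampling pseudocode above, the following Python sketch implements the same cumulative-sum-versus-sorted-uniforms scheme. The particle representation (a list of objects with a weight attribute) is an assumption for illustration only; the thesis pseudocode does not fix one.

```python
import random

def resample(particles):
    # Cumulative sum of the particle weights (need not be pre-normalized).
    cum_sum = []
    total = 0.0
    for p in particles:
        total += p.weight
        cum_sum.append(total)

    # One sorted uniform draw per output particle, scaled to the total weight.
    n = len(particles)
    u = sorted(random.random() * total for _ in range(n))

    # Sweep both sorted sequences once: particle j is copied for as long as
    # u[i] falls below its cumulative weight.
    new_particles = []
    i, j = 0, 0
    while i < n:
        if u[i] <= cum_sum[j]:
            new_particles.append(particles[j])
            i += 1
        else:
            j += 1
    return new_particles
```

In the localization loop of Algorithm 14, this call sits between the weight update and the disbelief check.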
A.3 Algorithms for SLAM (Chapter 5)

The SLAM system is described in Algorithm 15. It utilizes the map building in Algorithm 16 to add a new node to the map and to determine the absolute location of the new node from the previously localized node. The window tracking in Algorithm 18 tracks the major group of particles; the estimated robot location is obtained by averaging the poses of the particles inside the window. The oracle model is implemented as shown in Algorithm 21. The map management in Algorithm 22 maintains the quality of the map.

    gTimeStep = 0; cP = 1000;
    while the robot is running do
        gTimeStep = gTimeStep + 1;
        gQ = FeatureExtraction(gIq);
        if there is no map then
            BuildMap();
            Get the first node m_1;
            gP(gTimeStep) = GenerateRandomParticle(m_1(gTimeStep), cP);
            go_0 = go_1;
            return;
        end
        G(gQ, gD);
        WindowPosPrediction();
        ParticleSampling();
        ReviseParticleWeightUpdate();
        ParticleReSampling();
        TrackingWindow();
        if gNw < 0.4 cP then
            SearchNewWindowLocation(gP(gTimeStep));
        end
        d = √(x_W² + y_W²);
        if d > cRMap and gSc < exp(0.015 d) gTMap then
            BuildNode(gQ);
        else
            n_0 = ExtractNodeID(gξ_W0);
            n_1 = ExtractNodeID(gξ_W);
            if n_0 ≠ n_1 then
                RecordLocalizeData();
            end
        end
        Oracle();
        MapManagement();
        go_0 = go_1;
    end
Algorithm 15: SLAM()

    PrevNodeID = ExtractNodeID(gξ_W);
    NewNodeID = AddNode(Q);
    UpdateTree();
    if PrevNodeID ≠ Null then
        apos = CalculateAbsolutePos(FindMap(PrevNodeID), gξ_W);
        SetAbsolutePos(NewNodeID, apos);
        if FindMap(PrevNodeID) is the last node of the map fragment then
            AddEdge(PrevNodeID, NewNodeID);
        else
            AddFragment(NewNodeID);
        end
        The last node of the current map fragment is the new node;
    else
        SetAbsolutePos(FindMap(NewNodeID), 0);
    end
    SetWindowNode(gξ_W, NewNodeID);
    SetWindowPos(gξ_W, 0);
    /* Make sure no node switching occurs when adding a new node */
    gξ_W0 = gξ_W;
    N = number of map nodes;
    gTd = … + 10 exp(−0.05 N);
    gTMap = … + 10 exp(−0.025 N);
Algorithm 16: BuildMap()

    for i ← 1 to cP do
        if the node has never been matched then
            L(gQ, gD_n);
        end
        d_n = √(x_n² + y_n²);
        Determine h(φ_n);
        Obtain S(d_n), σ_S²(d_n), σ_θ²;
        /* Compute the weight of a particle. The small constant is added to make
           the weight of a particle resistant to "bad" matching. */
        w_i = 0.25 + exp(−[(S_n − S(d_n))²/σ_S²(d_n) + (θ_i − h(φ_n))²/σ_θ²]);
    end
    Normalize the weights of all particles so that Σ w_i = 1;
Algorithm 17: ReviseParticlesWeightUpdate()

    CurrentNodeID = ExtractNodeID(gξ_W);
    WindowSize = 0.3;
    while WindowSize has not reached its maximum do
        gNw = 0; s_x = 0; s_y = 0; s_θ = 0;
        for i ← 1 to cP do
            par_i ∈ gP(gTimeStep);
            if par_i is in the window then
                Transform the particle reference frame to the window frame;
                s_x = s_x + x_i; s_y = s_y + y_i; s_θ = s_θ + θ_i;
                gNw = gNw + 1;
            end
        end
        if gNw < 0.4 cP then
            WindowSize = WindowSize + 0.1;
        else
            End loop;
        end
    end
    gξ_W = [CurrentNodeID  s_x/gNw  s_y/gNw  s_θ/gNw]ᵀ;
Algorithm 18: TrackingWindow()

    MaxParticles = 0; MaxNodeID = 0;
    foreach m_i ∈ gM do
        NodeID = ExtractNodeID(m_i);
        NumParticles = CountParticle(gP, NodeID);
        if NumParticles > MaxParticles then
            MaxNodeID = NodeID; MaxParticles = NumParticles;
        end
    end
    gNw = MaxParticles;
    s_x = 0; s_y = 0; s_θ = 0;
    foreach par_i ∈ gP do
        NodeID = ExtractNodeID(par_i);
        if NodeID = MaxNodeID then
            s_x = s_x + x_i; s_y = s_y + y_i; s_θ = s_θ + θ_i;
        end
    end
    gξ_W = [MaxNodeID  s_x/gNw  s_y/gNw  s_θ/gNw]ᵀ;
Algorithm 19: SearchNewWindowLocation()

    gNumberOfRecords = gNumberOfRecords + 1;
    gRecord.PrevWindow = gξ_W0;
    gRecord.CurrentWindow = gξ_W;
    gRecord.Motion = (Odometry − PrevOdometry);
    gRecord.Time = gTimeStep;
Algorithm 20: RecordLocalizeData()

    if a new node is being built then
        for i ← 1 to cP do
            par_i ∈ gP(gTimeStep);
            d_i = √(x_i² + y_i²);
            if d_i > 0.7 then
                RelocateParticle(par_i, WindowNode);
            end
        end
        return;
    end
    if gSc > gTd then
        for i ← 1 to 0.15 gNw do
            par_i = RandomRemoveParticlesInWindow();
            Remove par_i from gP(gTimeStep);
        end
        n = 0.15 gNw / NumberOfCandidates;
        foreach CandidateNode ∈ gDc do
            m = GetMapNode(CandidateNodeID);
            for i ← 1 to n do
                par = GenerateRandomParticle(m, 1);
                Add par to gP(gTimeStep);
            end
        end
    end
Algorithm 21: Oracle()
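Reading the weight formula in Algorithm 17 as a Gaussian-style score over a similarity error and a heading error, floored at 0.25 so that a single bad match cannot annihilate a particle, gives a sketch like the following. The argument names are illustrative assumptions rather than the thesis's actual interface.

```python
import math

def particle_weight(s_n, s_expected, var_s, theta_i, h_phi, var_theta):
    # Error between the measured similarity S_n and its distance-dependent
    # expectation S(d_n), and between the particle heading and h(phi_n).
    e_s = s_n - s_expected
    e_theta = theta_i - h_phi
    # The 0.25 floor keeps the particle alive under "bad" matching.
    return 0.25 + math.exp(-(e_s ** 2 / var_s + e_theta ** 2 / var_theta))

def normalize(weights):
    # Algorithm 17 ends by normalizing so the weights sum to one.
    total = sum(weights)
    return [w / total for w in weights]
```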
    record = ExtractLastRecord(gRecord);
    if gTimeStep − record.Time > 50 then
        DeleteAllRecords(gRecord);
        gNumberOfRecords = 0;
        return;
    end
    if gNumberOfRecords ≥ … then
        foreach record in gRecord do
            record = ExtractOneRecord(gRecord);
            CurrentNodeID = ExtractNodeID(record.CurrentWindow);
            PrevNodeID = ExtractNodeID(record.PrevWindow);
            /* Determine the relative location of the previous node and the
               current node from the localization record */
            rp = DetermineRelativeLocation(record);
            switch NodesRelationship(record) do
                case the nodes are in different map fragments:
                    UpdateMapFragment(PrevNodeID, CurrentNodeID, rp);
                case the nodes are connected:
                    UpdateLocation(PrevNodeID, CurrentNodeID, rp);
                case the nodes are not connected but in the same map fragment:
                    if ShortestPath(PrevNodeID, CurrentNodeID) > … nodes then
                        /* The starting node is the re-observed node. */
                        UpdateAbsoluteLocation(PrevNodeID, CurrentNodeID, rp);
                    end
            end
        end
        MergeNode();
    end
Algorithm 22: MapManagement()

    MergeMapFragment(NodeID1, NodeID2);
    if NodeID1 > NodeID2 then
        UpdateLocation(NodeID2, NodeID1, rp);
    else
        UpdateLocation(NodeID1, NodeID2, −rp);
    end
    AddEdge(NodeID1, NodeID2);
Algorithm 23: UpdateFragment(NodeID1, NodeID2, rp)

    /* Remove the edge before updating */
    RemoveEdge(BaseNodeID, UpdatedNodeID);
    /* Update all map nodes on the route between the base node and the updated node */
    UpdateAbsoluteLocation(BaseNodeID, UpdatedNodeID, rp);
Algorithm 24: UpdateLocation(BaseNodeID, UpdatedNodeID, rp)

    BPath = Null;
    FPath = ShortestPath(ReObserveNodeID, EndNodeID);
    N = the number of nodes in the path;
    FPath.Node_1.Pose = GetAbsolutePose(ReObserveNodeID);
    BPath.Node_1 = FPath.Node_1;
    for i ← 2 to N do
        BPath.Node_i = FPath.Node_i;
        /* Obtain the relative pose from node i − 1 to node i */
        pp = GetRelativePose(FPath.Node_{i−1}, FPath.Node_i);
        FPath.Node_i.Pose = FPath.Node_{i−1}.Pose + pp;
    end
    /* Set the position of the last node in the backward path. */
    BPath.Node_N.Pose = GetAbsolutePose(ReObserveNodeID) − rp;
    for i ← N − 1 to 1 do
        /* Obtain the relative pose from node i to node i + 1 */
        pp = GetRelativePose(BPath.Node_{i+1}, BPath.Node_i);
        BPath.Node_i.Pose = BPath.Node_{i+1}.Pose + pp;
    end
    for i ← 1 to N do
        ap = [(N − i + 1) FPath.Node_i.Pose + (i − 1) BPath.Node_i.Pose] / N;
        NodeID = ExtractNodeID(FPath.Node_i);
        UpdateAbsolutePose(NodeID, ap);
        foreach neighborhood node of FPath.Node_i do
            NNodeID = ExtractNodeID(NeighborhoodNode);
            pp = GetRelativePose(NodeID, NNodeID);
            ap_n = ap + pp;
            SetAbsolutePose(NNodeID, ap_n);
        end
    end
    AddEdge(EndNodeID, ReObserveNodeID);
Algorithm 25: UpdateAbsoluteLocation(EndNodeID, ReObserveNodeID, rp)

BIBLIOGRAPHY

[1] P. Zingaretti and E. Frontoni, "Appearance-based robotics," IEEE Robotics and Automation Magazine, vol. 13, pp. 59–68, 2006.
[2] A. Steinhage and G. Schöner, "Self-calibration based on invariant view recognition: Dynamic approach to navigation," Robotics and Autonomous Systems, vol. 20, pp. 133–156, 1997.
[3] M. O. Franz, B. Schölkopf, H. A. Mallot, and H. H. Bülthoff, "Where did I take that snapshot? Scene-based homing by image matching," Biological Cybernetics, vol. 79, pp. 191–202, 1998.
[4] Y. Matsumoto, K. Ikeda, M. Inaba, and H. Inoue, "Exploration and navigation in corridor environment based on omni-view sequence," in Proceedings of the International Conference on Intelligent Robots and Systems, 2000, pp. 1505–1510.
[5] H.-M. Gross, A. Koenig, C. Schroeter, and H.-J. Boehme, "Omnivision-based probabilistic self-localization for a mobile shopping assistant continued," in Proceedings of the International Conference on Intelligent Robots and Systems, 2003, pp. 1505–1511.
[6] E. Menegatti, M. Zoccarato, E. Pagello, and H. Ishiguro, "Image-based Monte Carlo localisation with omnidirectional images," Robotics and Autonomous Systems, vol. 48, no. 1, pp. 17–30, 2004.
[7] H. Andreasson, A. Treptow, and T. Duckett, "Localization for mobile robots using panoramic vision, local features and particle filter," in Proceedings of the International Conference on Robotics and Automation, 2005.
[8] M. Saedan, M. H. Ang, and C. W. Lim, "Matching panoramic image for vision-based robot localization," in Proceedings of the Conference on Mechatronics and Machine Vision in Practice (M2VIP 2005), 2005.
[9] M. Saedan, C. W. Lim, and M. H. Ang, "The omnidirectional image matching for vision-based robot localization," in Proceedings of the International Conference on Mechatronics, 2006, pp. 17–22.
[10] R. Smith and P. Cheeseman, "On the representation and estimation of spatial uncertainty," The International Journal of Robotics Research, vol. 5, no. 4, 1986.
[11] G. Dissanayake, H. Durrant-Whyte, and T. Bailey, "A computationally efficient solution to the simultaneous localisation and map building (SLAM) problem," in Proceedings of the International Conference on Robotics and Automation, 2000.
[12] S. Thrun, D. Koller, Z. Ghahramani, H. Durrant-Whyte, and A. Y. Ng, "Simultaneous mapping and localization with sparse extended information filters," in Proceedings of the Fifth International Workshop on Algorithmic Foundations of Robotics, 2002.
[13] S. Thrun, M. Montemerlo, D. Koller, B. Wegbreit, J. Nieto, and E. Nebot, "FastSLAM: An efficient solution to the simultaneous localization and mapping problem with unknown data association," Journal of Machine Learning Research, 2004, to appear.
[14] Y. Matsumoto, K. Ikeda, M. Inaba, and H. Inoue, "Exploration and navigation in corridor environment based on omni-view sequence," in Proceedings of the International Conference on Intelligent Robots and Systems, 2000, pp. 1505–1510.
[15] E. Menegatti, M. Zoccarato, E. Pagello, and H. Ishiguro, "Image-based Monte Carlo localisation with omnidirectional images," submitted to Elsevier Science, 2004.
[16] N. Winters and J. Santos-Victor, "Information sampling for vision-based robot navigation," Robotics and Autonomous Systems, vol. 41, 2002.
[17] D. G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, vol. 60, no. 2, 2004.
[18] S. Se, D. Lowe, and J. Little, "Vision-based mobile robot localization and mapping using scale-invariant features," in Proceedings of the International Conference on Robotics and Automation, 2001.
[19] S. Se, D. Lowe, and J. Little, "Vision-based mapping with backward correction," in Proceedings of the International Conference on Intelligent Robots and Systems, 2002.
[20] J. Valls Miro, W. Zhou, and G. Dissanayake, "Towards vision based navigation in large indoor environment," in Proceedings of the International Conference on Intelligent Robots and Systems, 2006, pp. 2096–2102.
[21] W. Kasprzak, W. Szynkiewicz, and M. Karolczak, "Global color image features for discrete self-localization of indoor vehicle," in Proceedings of the International Conference on Computer Analysis of Images and Patterns, 2005, pp. 620–627.
[22] A. J. Davison and N. Kita, "Sequential localization and map-building for real-time computer vision and robotics," Robotics and Autonomous Systems, vol. 36, 2001.
[23] A. J. Davison, "Real-time simultaneous localisation and mapping with a single camera," in Proceedings of the International Conference on Computer Vision, 2003.
[24] W. Y. Jeong and K. M. Lee, "CV-SLAM: A new ceiling vision-based SLAM technique," in Proceedings of the International Conference on Intelligent Robots and Systems, 2005, pp. 3070–3075.
[25] S. Panzieri, F. Pascucci, and G. Ulivi, "Vision based navigation using Kalman approach for SLAM," in Proceedings of the International Conference on Advanced Robotics, 2003, pp. 834–839.
[26] J. V. Miro, G. Dissanayake, and W. Zhou, "Vision-based SLAM using natural features in indoor environment," in Proceedings of the International Conference on Intelligent Sensors, Sensor Networks and Information Processing, 2005, pp. 151–156.
[27] J. Wolf, W. Burgard, and H. Burkhardt, "Robust vision-based localization by combining an image retrieval system with Monte Carlo localization," IEEE Transactions on Robotics, vol. 21, no. 2, pp. 208–216, 2005.
[28] F. Dellaert, W. Burgard, D. Fox, and S. Thrun, "Using the CONDENSATION algorithm for robust, vision-based mobile robot localization," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 1999.
[29] J. Wolf, W. Burgard, and H. Burkhardt, "Robust vision-based localization by combining image retrieval system with Monte Carlo localization," IEEE Transactions on Robotics, vol. 21, no. 2, 2005.
[30] J. M. Porta and B. J. A. Kröse, "Appearance-based concurrent map building and localization," Robotics and Autonomous Systems, vol. 54, no. 2, 2005.
[31] A. Koenig, S. Mueller, and H.-M. Gross, "Appearance-based CML approach for a home store environment," in Proceedings of the 2nd European Conference on Mobile Robots, 2005.
[32] M. Saedan, C. W. Lim, and M. H. Ang, "Vision-based localization using a central catadioptric vision system," in Proceedings of the International Symposium on Experimental Robotics, 2006.
[33] M. Saedan, C. W. Lim, and M. H. Ang, "Appearance-based SLAM with map loop closing using omnidirectional camera," in Proceedings of the International Conference on Advanced Intelligent Mechatronics, 2007.
[34] J. Leonard and H. F. Durrant-Whyte, "Mobile robot localization by tracking geometric beacons," IEEE Transactions on Robotics and Automation, vol. 7, no. 3, pp. 376–382, 1991.
[35] H. Durrant-Whyte, S. Majumder, S. Thrun, M. de Battista, and S. Scheding, "A Bayesian algorithm for simultaneous localization and map building," in Proceedings of the International Symposium of Robotics Research, 2001.
[36] G. Bishop and G. Welch, "An introduction to the Kalman filter," 2001.
[37] J. Uhlmann, S. Julier, and M. Csorba, "Nondivergent simultaneous map building and localization using covariance intersection," in Proceedings of the SPIE Aerospace Conference, 1997.
[38] S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, "A tutorial on particle filters for on-line non-linear/non-Gaussian Bayesian tracking," IEEE Transactions on Signal Processing, vol. 50, no. 2, pp. 174–188, Feb. 2002.
[39] C. Schmid and R. Mohr, "Local grayvalue invariants for image retrieval," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, no. 5, pp. 530–535, 1997.
[40] S. Bres and J.-M. Jolion, "Multiresolution contrast based detection of interest points," Tech. Rep., Laboratoire Reconnaissance de Formes et Vision, 1998.
[41] E. Loupias, N. Sebe, S. Bres, and J.-M. Jolion, "Wavelet-based salient points for image retrieval," in Proceedings of the International Conference on Image Processing, 2000.
[42] E. Stollnitz, T. DeRose, and D. Salesin, "Wavelets for computer graphics: A primer, part 1," IEEE Computer Graphics and Applications, vol. 15, no. 3, pp. 76–84, 1995.
[43] J. S. Beis and D. G. Lowe, "Shape indexing using approximate nearest-neighbour search in high-dimensional spaces," in Proceedings of the Conference on Computer Vision and Pattern Recognition, 1997, pp. 1000–1006.
[44] J. L. Bentley, "Multidimensional binary search trees used for associative searching," Communications of the ACM, vol. 18, no. 9, 1975.
[45] A. Doucet, N. de Freitas, and N. Gordon, Eds., Sequential Monte Carlo Methods in Practice, Statistics for Engineering and Information Science, Springer-Verlag, 2001.
[46] M. W. M. G. Dissanayake, P. M. Newman, H. F. Durrant-Whyte, S. Clark, and M. Csorba, "An experimental and theoretical investigation into simultaneous localization and map building," in Proceedings of the International Symposium on Experimental Robotics, 1999.
[47] M. Montemerlo, S. Thrun, D. Koller, and B. Wegbreit, "FastSLAM: A factored solution to the simultaneous localization and mapping problem," in Proceedings of the AAAI National Conference on Artificial Intelligence, 2002.
[48] M. Bosse, P. M. Newman, J. Leonard, and S. Teller, "SLAM in large-scale cyclic environments using the Atlas framework," International Journal of Robotics Research, vol. 23, no. 12, pp. 1113–1139, 2004.
[49] F. Mustiere, M. Bolic, and M. Bouchard, "Rao-Blackwellised particle filters: Examples of applications," in Proceedings of the IEEE Canadian Conference on Electrical and Computer Engineering, 2006.
[50] P. Elinas, R. Sim, and J. J. Little, "σSLAM: Stereo vision SLAM using the Rao-Blackwellised particle filter and a novel mixture proposal distribution," in Proceedings of the International Conference on Robotics and Automation, 2006.
[51] S. Hagen and B. Kröse, "Trajectory reconstruction for self-localization and map building," in Proceedings of the International Conference on Robotics and Automation, 2002, pp. 1796–1801.
[52] E. W. Dijkstra, "A note on two problems in connexion with graphs," Numerische Mathematik, vol. 1, pp. 269–271, 1959.

[...] navigation, especially localization and simultaneous localization and map-building (SLAM) algorithms. In mobile robot localization, the state of interest is usually the pose of a robot (position and orientation) with respect to the world. The measurements may include surrounding environment data and the robot's status, such as range measurements, camera images, and odometry. Bayesian filtering determines the posterior [...]

[...] performs localization and/or simultaneous localization and map-building (SLAM) in a realistic indoor environment.
• To initiate a framework for integrating SLAM with other navigation tasks.

1.1 State-of-the-Art Localization and SLAM Algorithms

The SLAM algorithm deals with uncertainties of a model of an environment (map) and of a robot location estimated from the same external sensor source(s). Although initial reports [...]

[...] the existing map ξ; Oracle(ξt): the oracle model assisting in the relocation of particles. [...]

CHAPTER 1

INTRODUCTION

The ability of human beings to use vision to navigate in daily lives has inspired us to investigate vision as a primary external sensor for robot localization and simultaneous localization and map-building (SLAM). We can remember and relate places without explicitly knowing their [...]
matching with a Bayesian filter, i.e., a particle filter. A distinctive element of this work compared to others is that it provides an integrated system that is able to build a map, detect and close the map loop, and maintain the quality of the map through map management. This is a novel approach to tackling two "hard" problems with a single estimation system, i.e., building a map from scratch and detecting [...]

[...] the extension of localization to operate in an unknown environment. This approach is generally known as concurrent map-building and localization (CML) or simultaneous localization and map-building (SLAM). Furthermore, we demonstrate the integration of our appearance-based SLAM with other navigation tasks. We design a navigation system that has our SLAM algorithm running in the background. Finally, chapter [...]

[...] work, and also proposes directions for future work in order to bring an autonomous mobile robot into our daily lives.

CHAPTER 2

PROBABILISTIC ALGORITHMS IN VISION-BASED ROBOT NAVIGATION

Three questions that arise in the general problem of mobile robot navigation are "Where am I?", "Where am I going?", and "How should I get there?" [34]. This chapter provides a review of algorithms in vision-based robot [...]

[...] and map-building in an unknown indoor environment. Our approach aims to solve the problem of localization and/or simultaneous localization and map-building (SLAM) in large indoor areas. Our work has been reported in [8, 9, 32, 33]. The contributions of our work can be summarized as follows:

1. An algorithm to process and utilize an image directly from a catadioptric vision system (circular image).
2. An appearance-based localization system using a particle filter that is able to operate on a sparse map (a map with a smaller number of reference images).
3. A complete appearance-based simultaneous localization and map-building (SLAM) system that is able to detect the map loop and to update and maintain the quality of a map. The SLAM system allows the user to specify an inter-node distance that directly [...]

[...] modeling it. Much work in this field, such as [2, 3], was inspired by biological counterparts. Early implementations of appearance-based localization [2, 4] involved quite complicated image processing and matching techniques. The ultimate aim of those methods is to obtain the location of a robot by finding the image stored in the map that best matches a query image. Their work was either tested in [...]

[...] merged with the existing map fragment.
5.19 Mapping result with the inter-node distance RMap = 100 cm. The cross-marks are the locations of the map nodes obtained from robot odometry.
5.20 Mapping result with the inter-node distance RMap = 150 cm. The cross-marks are the locations of the map nodes obtained from robot odometry.
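The excerpts above repeatedly lean on the particle filter as the estimation engine. As a minimal sketch of one Monte Carlo localization step (predict with odometry, weight by appearance matching, resample), consider the following Python, where motion_model and appearance_likelihood are hypothetical stand-ins for the thesis's sampling and image-matching models:

```python
import random

def mcl_step(particles, odometry, query_image, motion_model, appearance_likelihood):
    # Predict: propagate each particle pose through the noisy motion model.
    predicted = [motion_model(p, odometry) for p in particles]

    # Weight: score each predicted pose against the current query image.
    weights = [appearance_likelihood(query_image, p) for p in predicted]
    total = sum(weights)
    weights = [w / total for w in weights]

    # Resample: draw the next particle set in proportion to the weights.
    return random.choices(predicted, weights=weights, k=len(predicted))
```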
