
Field and Service Robotics - Corke P. and Sukkarieh S. (Eds) - Part 9




DOCUMENT INFORMATION

Basic information

Format
Number of pages: 24
File size: 8.28 MB

Contents

Figure 3 shows a typical image obtained from the downward-pointing camera on the helicopter. The target is marked with an H and moves in the x-direction for a distance of 12 m. Figure 5 shows three curves. The solid line is ground truth: the location of the target as measured on the ground by the odometry of the target robot. The dotted line is the location as measured by the camera without filtering. This is obtained by first detecting the target in the image, finding its image coordinates and then transforming them to the helicopter's reference plane as shown in Section 4. The dashed line is the location of the target deduced by the Kalman filter as explained in Section 5.

Fig. 5. Kalman filter tracking performance (position in meters versus time in seconds; actual, estimated and measured target position).

Figure 6 shows the simulated trajectory followed by the helicopter while tracking the target. The plot shows the trajectory of the target (solid) and the trajectory of the helicopter (dashed). As can be seen, the helicopter is able to track the target quite well. Figure 7 shows the height of the helicopter with respect to time while tracking.

Fig. 6. The helicopter trajectory in the x-direction (dashed) and the target trajectory in the x-direction (solid) while tracking.

During simulation it was found that updating the trajectory of the helicopter every time step the Kalman filter made a new prediction was not necessary. Only a discrete number of modifications were made to the trajectory, and a cubic spline trajectory as described in Section 6 was used. It may be noted that during the actual flight tests we will probably use a straight-line interpolation of the cubic spline, which is also shown in Figure 7.

Fig. 7. Height trajectory commanded for tracking and landing (linear and cubic trajectories).

9 Conclusions

This paper describes the design of an algorithm for landing on a moving target using an autonomous helicopter. We use a Kalman filter as an estimator to predict the position of the target and plan the trajectory of the helicopter such that it can land on it. Given conditions about the distance at which the helicopter should land on the target, we perform discrete updates of the trajectory so that we can track and land on the target. We have performed experiments in manual control mode where a human pilot was controlling the helicopter and was holding it in hover while we collected data of a ground target moving in the field of view of the helicopter. The estimator and the trajectory planner were run offline to test the validity of the algorithm. Figures 5, 6 and 7 show the results obtained by using the Kalman filter in conjunction with an object recognition algorithm.
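As a rough illustration of the estimator described above, the sketch below implements a one-dimensional constant-velocity Kalman filter of the kind used to track the target position. The sample period, noise covariances and simulated target motion are assumptions made for illustration only; they are not the values or code used by the authors.

```python
import numpy as np

# Minimal 1-D constant-velocity Kalman filter sketch (illustrative values only).
# State x = [position, velocity]; measurement z = observed target position.

dt = 0.1                                   # assumed sample period (s)
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition (constant velocity)
H = np.array([[1.0, 0.0]])                 # only position is measured
Q = np.diag([1e-3, 1e-2])                  # assumed process noise covariance
R = np.array([[0.25]])                     # assumed measurement noise covariance

def kf_step(x, P, z):
    """One predict/update cycle; returns the new state estimate and covariance."""
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    y = z - H @ x_pred                     # innovation
    S = H @ P_pred @ H.T + R               # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

# Example: filter noisy position measurements of a target moving at 0.2 m/s.
x = np.array([0.0, 0.0])
P = np.eye(2)
for k in range(600):
    truth = 0.2 * k * dt
    z = np.array([truth + np.random.normal(0.0, 0.5)])
    x, P = kf_step(x, P, z)
print("estimated position/velocity:", x)
```

A trajectory planner in this scheme would query the filter's predicted target position at a discrete set of times and re-fit the spline only when the prediction changes appreciably, in the spirit of the discrete trajectory updates described above.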
9.1 Limitations, Discussion and Future Work

In the future we plan to test the algorithm on our autonomous helicopter. Several limitations exist with the current algorithm:

• We assume that the helicopter is in hover (zero roll, pitch and yaw values and zero movement in the northing and easting directions). This is almost impossible to achieve in an aerial vehicle. We plan to integrate the errors in GPS coordinates and attitude estimates into the Kalman filter so that we are able to track the target more precisely.
• The current estimator is able to track the target only in a single dimension. We will extend it so that it is able to track the full pose of the target and verify the validity of the algorithm.
• Currently we use an algorithm based on the intensity of the image for object detection. We also assume that the object is planar. This is quite restrictive in nature. We plan to integrate our algorithm with a better object detector and a camera calibration routine so that the coordinates of the tracked object obtained during the vision processing stage are much more accurate.
• We will extend the algorithm in the future such that we are able to track multiple targets and then land on any one of them. Also we intend to investigate algorithms to be able to pursue and land on evasive targets.

Acknowledgment

This work is supported in part by NASA under JPL/Caltech contract 1231521, by DARPA under grant DABT63-99-1-0015 as part of the Mobile Autonomous Robot Software (MARS) program, and by ONR grant N00014-00-1-0638 under the DURIP program. We thank Doug Wilson for support with flight trials.

References

1. Office of the Under Secretary of Defense, "Unmanned aerial vehicles annual report," Tech. Rep., Defense Airborne Reconnaissance Office, Pentagon, Washington DC, July 1998.
2. S. Saripalli, J. F. Montgomery, and G. S. Sukhatme, "Visually-guided landing of an unmanned aerial vehicle," IEEE Transactions on Robotics and Autonomous Systems, 2003 (to appear).
3. B. K. P. Horn and B. G. Schunck, "Determining optical flow," Artificial Intelligence, vol. 17, pp. 185–203, 1981.
4. C. Tomasi and J. Shi, "Direction of heading from image deformations," in IEEE Conference on Computer Vision and Pattern Recognition, June 1993, pp. 4326–4333.
5. J. Shi and C. Tomasi, "Good features to track," in IEEE Conference on Computer Vision and Pattern Recognition, June 1994, pp. 4326–4333.
6. S. Birchfield, "An elliptical head tracker," in 31st Asilomar Conference on Signals, Systems and Computers, November 1997, pp. 4326–4333.
7. M. Turk and A. Pentland, "Eigenfaces for recognition," Journal of Cognitive Neuroscience, vol. 3, 1995, pp. 4326–4333.
8. D. Schulz, W. Burgard, D. Fox, and A. B. Cremers, "Tracking multiple moving targets with a mobile robot using particle filters and statistical data association," in IEEE International Conference on Robotics and Automation, 2001, pp. 1165–1170.
9. Y. Bar-Shalom and W. D. Blair, Multitarget-Multisensor Tracking: Applications and Advances, vol. 3, Artech House, 1992.
10. O. Shakernia, Y. Ma, T. J. Koo, and S. S. Sastry, "Landing an unmanned air vehicle: vision based motion estimation and non-linear control," Asian Journal of Control, vol. 1, September 1999, pp. 128–145.
11. K. Ogata, Modern Control Engineering, Prentice Hall, 2nd ed., 1990.
12. M. K. Hu, "Visual pattern recognition by moment invariants," IRE Transactions on Information Theory, vol. IT-8, 1962, pp. 179–187.
13. J. F. Montgomery, Learning Helicopter Control through 'Teaching by Showing', Ph.D. thesis, University of Southern California, May 1999.

Scan Alignment and 3-D Surface Modeling with a Helicopter Platform

S. Thrun, M. Diel, and D. Hähnel

Abstract.
1 Introduction

... simultaneous localization and mapping ...

2 3-D Modeling Approach

2.1 Vehicle Model

Here x_t denotes the vehicle pose at time t and y_t the corresponding pose measurement; the measurement model p(y_t | x_t) is taken to be Gaussian:
p(y_t \mid x_t) \propto \exp\!\big(-\tfrac{1}{2}\,(y_t - x_t)^T A^{-1} (y_t - x_t)\big)

where A is the covariance of the systematic pose measurement error. Relative pose measurements Δy_t = y_t − y_{t−1} are modeled through p(Δy_t | x_t, x_{t−1}):

p(\Delta y_t \mid x_t, x_{t-1}) \propto \exp\!\big(-\tfrac{1}{2}\,(\Delta y_t - \delta(x_t, x_{t-1}))^T D^{-1} (\Delta y_t - \delta(x_t, x_{t-1}))\big)

where δ(x_t, x_{t−1}) is the pose change implied by x_t and x_{t−1}, and D is the corresponding covariance (analogous to A above).

2.2 Range Sensor Model

A conventional range sensor model p(z_t | m, x_t) relates a measurement z_t to a map m and the pose x_t. Since no map m is available, consecutive scans are related directly through p(z_t | x_t, x_{t−1}, z_{t−1}), which matches the current scan z_t against the previous scan z_{t−1} given the poses x_t and x_{t−1}:

p(z_t \mid x_t, x_{t-1}, z_{t-1}) \propto \prod_i \exp\!\Big(-\tfrac{1}{2}\,\min\big\{\alpha,\ \min_j\,(z_t^i - f(z_{t-1}^j, x_{t-1}, x_t))^T B^{-1} (z_t^i - f(z_{t-1}^j, x_{t-1}, x_t))\big\}\Big)

Here i ranges over the individual measurements in z_t, and f projects the j-th point of the previous scan z_{t−1} into the frame of the i-th measurement of z_t given the two poses. The constant α bounds the penalty assigned to measurements with no good correspondence, and B^{-1} is the inverse covariance of the point-matching error.

2.3 Optimization

The pose is recovered by maximizing the product p(y_t | x_t) p(Δy_t | x_t, x_{t−1}) p(z_t | x_t, x_{t−1}, z_{t−1}), or equivalently by minimizing its negative logarithm:

\mathrm{const.} + \tfrac{1}{2}\Big[(y_t - x_t)^T A^{-1} (y_t - x_t) + (\Delta y_t - \delta(x_t, x_{t-1}))^T D^{-1} (\Delta y_t - \delta(x_t, x_{t-1})) + \sum_i \min\big\{\alpha,\ \min_j\,(z_t^i - f(z_{t-1}^j, x_{t-1}, x_t))^T B^{-1} (z_t^i - f(z_{t-1}^j, x_{t-1}, x_t))\big\}\Big]

This expression is minimized with respect to the pose x_t, re-computing the correspondences between the points z_{t−1}^j and z_t^i as x_t changes, so that the scans taken at times t and t−1 become aligned.

3 Results

[...]

Fig. 7. Snapshots of the 3-D map of the building shown in Figure 4. The map is represented as a VRML file and can be displayed using standard visualization tools.

Fig. 8. Multi-polygonal 3-D models of a different urban site with a smaller building, and a cliff at the Pacifi...
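As a concrete illustration of the optimization above, the following sketch evaluates the negative log-likelihood for a candidate pose in a simplified 2-D setting (pose = x, y, heading). The projection function, the use of a plain pose difference in place of δ(x_t, x_{t−1}), the covariances and the outlier bound α are all assumptions made for illustration; this is not the authors' implementation, which operates on 3-D scans acquired from the helicopter.

```python
import numpy as np

def project(points_prev, pose_prev, pose_curr):
    """Map 2-D points observed at pose_prev into the frame of pose_curr (pose = [x, y, theta])."""
    def rot(theta):
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s], [s, c]])
    world = (rot(pose_prev[2]) @ points_prev.T).T + pose_prev[:2]   # previous frame -> world
    return (rot(pose_curr[2]).T @ (world - pose_curr[:2]).T).T      # world -> current frame

def neg_log_likelihood(x_t, x_prev, y_t, dy_t, z_t, z_prev,
                       A_inv, D_inv, B_inv, alpha):
    """Negative log-likelihood (up to a constant) of the combined model for a candidate pose x_t."""
    # Pose measurement term: (y_t - x_t)^T A^{-1} (y_t - x_t)
    r = y_t - x_t
    cost = r @ A_inv @ r
    # Relative pose term; delta() is simplified here to the plain pose difference
    # (angle wrapping is ignored for brevity).
    d = dy_t - (x_t - x_prev)
    cost += d @ D_inv @ d
    # Scan-matching term: each current point matched to its closest predecessor, capped at alpha.
    proj = project(z_prev, x_prev, x_t)
    for zi in z_t:
        diffs = zi - proj
        m = np.min(np.einsum('ij,jk,ik->i', diffs, B_inv, diffs))
        cost += min(alpha, m)
    return 0.5 * cost
```

In practice this objective would be handed to a generic nonlinear minimizer over x_t, with the nested point correspondences recomputed at each evaluation.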
... are mission-specific nodes and perform multi-target tracking, target registration and decentralised data fusion. The air-to-air/air-to-ground communication links are established for the decentralised fusion purposes, remote-control operation, differential GPS data uplink, and telemetry data. The ground station consists of a DGPS base station, weather station, hand-held controller, and monitoring computer ...

3. P. Besl and N. McKay. A method for registration of 3-D shapes. IEEE Transactions on Pattern Analysis and Machine Intelligence, 14(2):239–256, 1992.
4. D. Drinka, D. Coleman, and D. Creel. Implementation of an autonomous aerial reconnaissance system. Georgia Tech entry into the 2002 International Aerial Robotics Competition, 2002.
5. P. E. Debevec, C. J. Taylor, and J. Malik. Modeling and rendering architecture ... In Proc. of the 23rd International Conference on Computer Graphics and Interactive Techniques (SIGGRAPH), 1996.
6. G. Dissanayake, P. Newman, S. Clark, H. F. Durrant-Whyte, and M. Csorba. A solution to the simultaneous localisation and map building (SLAM) problem. IEEE Transactions on Robotics and Automation, 2001. In press.
7. J. Dittrich and E. N. Johnson. Multi-sensor navigation system for an autonomous helicopter. In Proceedings ...
... acquiring multi-planar maps of indoor environments with mobile robots. IEEE Transactions on Robotics, 2003. To appear.

Real-Time Navigation, Guidance, and Control of a UAV Using Low-Cost Sensors

Jong-Hyuk Kim¹, Salah Sukkarieh¹, and Stuart Wishart²

¹ Australian Centre for Field Robotics, School of Aerospace, Mechanical and Mechatronic Engineering, University of Sydney, NSW 2006, Australia, {jhkim, salah}@acfr.usyd.edu.au ...

... which the designer has to face. The main restrictions are the stability of the Inertial Navigation System (INS) degraded ...

(Block diagram: Waypoint, IMU, GPS and Baro feed the GPS/INS Navigation Loop; the Navigation Solution feeds the Guidance Loop; Air Data and RPM feed the Flight Control Loop, which outputs Throttle and is linked to the Ground ...)

References

1. J. Bagnell and J. Schneider. Autonomous helicopter control using reinforcement learning policy search methods. In Proceedings of the International Conference on Robotics and Automation 2001. IEEE, May 2001.
2. S. Becker and M. Bove. Semiautomatic 3-D model extraction from uncalibrated 2-D camera views. In Proc. of the SPIE Symposium on Electronic Imaging, San Jose, 1995.

... mode. It computes guidance demands to force the vehicle to follow the desired way-point. The flight ...

(Block diagram (a): the Strapdown INS integrates the IMU angular rates and accelerations into position, velocity and attitude; the difference between the INS-predicted and GPS-measured position and velocity forms the measurement z1 of the Fusion Kalman Filter, which outputs the estimated errors in position, velocity and attitude.)
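The indirect GPS/INS fusion suggested by the block diagram above can be pictured with a small sketch. The following is a minimal, hypothetical error-state filter reduced to one dimension: the strapdown INS integrates the accelerometer, the difference between the INS-predicted and GPS-measured position and velocity forms the measurement z1, and the filter's estimated errors correct the navigation solution. The state layout, noise values and function names are illustrative assumptions, not the implementation described in the paper.

```python
import numpy as np

dt = 0.01                                  # assumed INS update period (s)

def ins_predict(p, v, a_meas):
    """Strapdown INS mechanization, reduced to 1-D position/velocity for illustration."""
    p_new = p + v * dt + 0.5 * a_meas * dt**2
    v_new = v + a_meas * dt
    return p_new, v_new

# Error-state filter over [position error, velocity error]
F = np.array([[1.0, dt], [0.0, 1.0]])      # error-state dynamics
H = np.eye(2)                              # GPS observes position and velocity errors
Q = np.diag([1e-4, 1e-3])                  # assumed process noise
R = np.diag([2.0**2, 0.2**2])              # assumed GPS noise: 2 m, 0.2 m/s (1-sigma)

def fuse_gps(dx, P, p_ins, v_ins, p_gps, v_gps):
    """One GPS update: z1 = INS prediction minus GPS measurement, then correct the INS output."""
    dx = F @ dx
    P = F @ P @ F.T + Q
    z1 = np.array([p_ins - p_gps, v_ins - v_gps])   # measured navigation error
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    dx = dx + K @ (z1 - H @ dx)
    P = (np.eye(2) - K @ H) @ P
    # Feed the estimated errors back to correct the navigation solution
    # (a full implementation would also reset dx after the feedback).
    return dx, P, p_ins - dx[0], v_ins - dx[1]
```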
... Pittsburgh, PA, 2002. Technical Report CMU-RI-TR-02-07.
13. R. Miller and O. Amidi. 3-D site mapping with the CMU autonomous helicopter. In Proceedings of the 5th International Conference on Intelligent Autonomous Systems, Sapporo, Japan, 1998.
14. D. Murray and J. Little. Interpreting stereo vision for a mobile robot. Autonomous Robots, 2001. To appear.
15. A. Y. Ng, J. Kim, M. I. Jordan, and S. Sastry. Autonomous helicopter fl...

... roll and pitch uncertainties are maintained under 0.4° and the heading is under 0.8°.

Fig. 7. (a) The evolution of the vehicle's uncertainty in position and (b) that of attitude.

7 Conclusions

This paper presents the real-time implementation and results of the navigation, guidance, and control of a UAV using low-cost sensors. A low tactical grade IMU is used for the inertial navigation and two low-cost ...

Date published: 10/08/2014, 02:20