
Bayesian recursive algorithms for estimating free space and user intentions in a semi-autonomous wheelchair with a stereoscopic camera system


by Thanh Hai Nguyen

Submitted to the Faculty of Engineering in partial fulfilment of the requirements for the degree of Doctor of Philosophy at the University of Technology, Sydney.

Sydney, August 2009

Acknowledgements

I would like to express my sincere gratitude and appreciation to my supervisor, Professor Hung Tan Nguyen, for providing me with the valuable opportunity to carry out this research, which ran from July 2006 to August 2009 in the Faculty of Engineering, University of Technology, Sydney. I am grateful not only for his invaluable encouragement, enthusiastic support and professional guidance, but also for his expert knowledge in the area of Mechatronics and Intelligent Systems. I would also like to thank my co-supervisor, Doctor Steven Su, for his assistance, for sharing his research experience, and for his constructive comments on the research. I greatly appreciate Gunasmin's and Phyllis's generous help during my time working at the faculty. Many thanks to Greg for helping me with experiments using a head-movement sensor, and to Michelle for helping me edit the research. I also want to thank my best friends and the students in my research group for helping me with experiments; I wish them all the best in their studies. Finally, I would like to show my deep gratitude to my family, who have always supported and encouraged me, especially during the period of working at UTS. I hope to dedicate my knowledge and research experience to society.

Abstract

Assistive technologies have been proposed in recent years for installation in mobile wheelchairs, providing severely disabled people with enhanced access to independent activities in their daily lives. In practice, ultrasound and laser sensors have been developed for obstacle avoidance, but they only provide two-dimensional (2D) grids. The main contributions of this thesis lie in the exploitation of three-dimensional (3D) information from a stereoscopic camera for the estimation of free space and of the user's intention. This is achieved using Bayesian Recursive (BR) algorithms, conditioned on measurements, control data and conditional probabilities, within a semi-autonomous wheelchair control system.

In order to provide 3D information for detecting free spaces and obstacles, a "Bumblebee" stereoscopic camera system has been mounted on a powered wheelchair. The Sum of Absolute Differences (SAD) algorithm is then used to construct an optimal disparity map. In particular, the color intensity functions of the images have been applied to obtain this optimal disparity map; moreover, the mask size and the disparity boundaries are tuned to increase the optimality of the disparity map. Given the optimal disparity map, both a 3D point map and a 2D distance map are produced for controlling the autonomous wheelchair. In particular, the height and width of a free space are computed to decide whether the wheelchair can pass through. Experimental results have shown the effectiveness of the SAD approach using the color intensity function, and the benefits of computing the 3D point map and the 2D distance map for wheelchair control.

The stereoscopic camera system can provide 3D information about free spaces in the environment. However, this free space information can be uncertain, especially when the free space height and/or width are close to the safe height and/or diameter of the wheelchair; it is then difficult for the wheelchair to estimate the height and/or width for moving through the free space. To combat this, the BR algorithm is applied to estimate free space. In order to apply a Bayesian decision for the wheelchair to pass autonomously through a free space, the average optimal probability values are determined. Experimental results for estimating various free spaces show that the proposed BR approach is effective. A semi-autonomous
wheelchair control strategy combines the user's intention with an autonomous mode. In the autonomous mode, a dynamic freespace is estimated using the advanced BR algorithm conditioned on the "obstacle" distance; this estimate can change when a moving obstacle is in front of the mobile wheelchair. User intentions are often uncertain due to noise from the head-movement sensor, and it is therefore difficult for the mobile wheelchair to determine them. Hence, the advanced BR algorithm, conditioned on the dynamic freespace estimation, is utilized to determine the user's intention. In conclusion, the experimental results detailed in the thesis serve to illustrate the effectiveness of the approaches.

Contents

Nomenclature v
List of Figures viii
List of Tables xxi
Abbreviations xxii
Abstract xxiv

Chapter 1 Introduction 1
1.1 Motivation
1.2 Thesis Contribution
1.3 Publication
1.4 Thesis Outline

Chapter 2 Literature Review 11
2.1 Introduction 11
2.2 Sensor-based Controls 12
2.3 Camera-based Wheelchairs 18
2.4 Stereo Vision Problems 22
2.5 Control Problem 37
2.6 Discussion 38

Chapter 3 Obstacle and Freespace Detection using a Stereoscopic Camera System 41
3.1 Introduction 41
3.2 Stereoscopic Vision 44
3.2.1 Camera Model 44
3.2.2 Epipolar Geometry and the Fundamental Matrix 47
3.2.3 Image Pair Rectification and Edge Detection 50
3.2.4 Block Matching 55
3.2.5 Disparity Estimation 59
3.3 Obstacle and Freespace Detection 62
3.3.1 Distance Perception 63
3.3.2 Computation of 3D Point Map 63
3.3.3 Computation of 2D Distance Map 67
3.3.4 Obstacle and Freespace Detection 70
3.4 Comparison of 2D Maps using a 'Bumblebee' Stereoscopic Camera and 'URG' Laser Systems 80
3.4.1 2D Map using the 'URG' Laser System 80
3.4.2 2D Distance Map using the 'Bumblebee' Stereoscopic Camera System 81
3.5 Discussion 83

Chapter 4 Bayesian Recursive Algorithm for Freespace Estimation 85
4.1 Introduction 85
4.2 Freespace Estimation Algorithm 86
4.2.1 Bayesian Recursive Algorithm 87
4.2.2 Bayesian Decision 91
4.3 Experiments of Freespace Estimation 93
4.3.1 Experiment 1: Estimation of the width of a freespace 96
4.3.2 Experiment 2: Estimation of the width of two freespaces 103
4.3.3 Experiment 3: Estimation of the height and width of one freespace 111
4.3.4 Experiment 4: Estimation of the height and width of two freespaces 119
4.4 Discussion 133

Chapter 5 Advanced Bayesian Estimation in Semi-autonomous Wheelchair Control 136
5.1 Introduction 136
5.2 Semi-autonomous Wheelchair Control Strategy 138
5.2.1 Representation of User Commands 138
5.2.2 Autonomous Mode 145
5.2.2.1 Dynamic Freespace Estimation 146
5.2.2.2 Computation of Controls 149
5.2.3 Semi-autonomous Wheelchair Control 154
5.2.3.1 Bayesian Estimation for Intention of the User 155
5.2.3.2 Bayesian Decision for Control 159
5.3 Experiments of Bayesian Estimation for Semi-autonomous Wheelchair Control 160
5.3.1 Experiment 1: Bayesian estimation for a freespace with a moving obstacle 161
5.3.2 Experiment 2: Autonomous mode for passing through a freespace 176
5.3.3 Experiment 3: User intention for passing through a freespace 183
5.3.3.1 Intention Estimation 183
5.3.3.2 Bayesian Decision for Control 187
5.4 Discussion 189

Chapter 6 Conclusion and Future Work 192
6.1 Conclusion 192
6.2 Future Work 196

Appendix A Wheelchair Hardware Description 197
A.1 Overview of Power Wheelchairs 197
A.2 Description of the "Bumblebee" Stereoscopic Camera System 199
A.3 Description of the "URG" Laser System 201
A.4 Description of the Head-movement Sensor 204
A.5 Description of the National Instruments USB-6008 Multifunction Data Acquisition 209

Appendix B Implementations of the Semi-autonomous Wheelchair Control System with a Stereoscopic Camera System 212
B.1 Freespace and Obstacle Detection using the Stereoscopic Camera System (C++) 212
B.2 BR Algorithms for Estimating Freespace and User Intention in Semi-autonomous Wheelchair Control (LabVIEW) 227

Appendix C Publications Related to the Thesis 232

Bibliography

Nomenclature

θ
- Degree dependent on the head tilt of the user
φ - Tilt angle
η - Normalization coefficient
λ ∈ R+ - Arbitrary positive scalar
ωa - Steering velocity
(i, j) - Pixel coordinates
(Q, Q') - Two centres of the two cameras
ωu - Steering velocity
(Xi, Yj, Zk) - 3D coordinate system
(Xi, Zimin) - 2D distance map corresponding to Zimin
(Xi, Zk) - Horizontal 2D plane
(Yj, Zk) - Vertical 2D plane
A - Intrinsic parameter of the stereo cameras
B - Baseline of the two camera centres (in metres)
C - User intention state in the BR algorithm
CL - Left camera centre
CR - Right camera centre
Ct-1 - Previous state of the user intention
D - Extrinsic parameter of the stereo cameras
SADcolor - Disparity function
df - Width of the freespace
dmax - Maximum disparity
dmin - Minimum disparity
ds - Safe diameter
eL - Left epipole
eR - Right epipole
f - Focal length
F - Fundamental matrix
FT - Fundamental matrix of the two cameras (Q, Q') in the opposite order
h - Height of the wheelchair
h1 - Height of the first freespace
hc - Camera position on the wheelchair
hl - Height of the freespace
hobs - "Obstacle" distance value from obstacles to the wheelchair
IL - Left image plane
IR - Right image plane
ks - Safe distance from obstacles to the wheelchair
lR - Right epipolar line
M - Mask size
OH - Maximum distance on the Z-axis
P - Scene point
P(C0) - Equal prior probabilities of the user intention
P(Ct = uauto) - Probability of the autonomous mode
P(Ct = uuser) - Probability of the user intention
P(Wd0) - Equal prior probability of the dynamic freespace
P(x(t-1)) - Previous probability over the state (the height or width)
Pav(x(t)) - Average value of the optimal probabilities
pfs(Wd) - Dynamic freespace probability
pfs1:t-1 - Past dynamic freespace estimates
pL - Left image point
Ppo(Ct) - Optimal probability of the user intention
Ppr(Ct) - Posterior probability of the user intention
Ppr(Wdt) - Posterior probability of the dynamic freespace
Ppr(xt) - Posterior probability of the freespace
pR - Right image point
R - Rotation matrix
T - Translation vector
u1:t - Past controls of the freespace
uauto(va, ωa) - Autonomous mode
ud1:t - All past controls in estimating dynamic freespaces

Appendix B.1: Freespace and Obstacle Detection using the Stereoscopic Camera System (C++)

de = digiclopsExtractTriclopsInput( digiclops, STEREO_IMAGE, &stereoData );
// grab the color image data
de = digiclopsExtractTriclopsInput( digiclops, RIGHT_IMAGE, &colorData );
// preprocess the images
te = triclopsPreprocess( triclops, &stereoData );
// stereo processing
te = triclopsStereo( triclops );
// retrieve the interpolated depth image from the context
te = triclopsGetImage16( triclops, TriImg16_DISPARITY, TriCam_REFERENCE, &depthImage16 );
te = triclopsRectifyColorImage( triclops, TriCam_REFERENCE, &colorData, &colorImage );

// declare linked arrays of x, y, z
int const ELEMENT_COUNT = 1024;
float *x_cur, *y_cur, *z_cur;
x_cur = new float[ELEMENT_COUNT];
y_cur = new float[ELEMENT_COUNT];
z_cur = new float[ELEMENT_COUNT];
int array_count = 1;
float * x_arrays[100];
float * y_arrays[100];
float * z_arrays[100];
x_arrays[0] = x_cur;
y_arrays[0] = y_cur;
z_arrays[0] = z_cur;
int cur_pos = 0;

// determine the number of pixels spacing per row
pixelinc = depthImage16.rowinc/2;
for ( i = 0, k = 0; i < depthImage16.nrows; i++ )
{
    row = depthImage16.data + i * pixelinc;
    for ( j = 0; j < depthImage16.ncols; j++, k++ )
    {
        disparity = row[j];
        // filter invalid points
        if ( disparity < 0xFF00 )
        {
            // convert the 16-bit disparity value to floating-point x, y, z
            triclopsRCD16ToXYZ( triclops, i, j, disparity, &x, &y, &z );
            // look at points within a range
            if (( z_final[i]) {
                min_z_idx = i;
            }
        } else {
            //fprintf( filePointsX, "%f\n", x_final[min_z_idx]);
            xoutarray[outputcounter] = x_final[min_z_idx];
            //youtarray[outputcounter] = y_final[min_z_idx];
            zoutarray[outputcounter] = z_final[min_z_idx];
            min_z_idx = i;
            outputcounter++;
        }
    }
    //fclose(filePointsX);
    fclose(filePointsY);
    fclose(filePointsZ);
    delete [] x_final;
    delete [] y_final;
    delete [] z_final;
    /*printf("End printing to files %d\n", GetTickCount());*/
    digiclopsStop( digiclops );
    digiclopsDestroyContext( digiclops );
    triclopsDestroyContext( triclops );
    //return 0;
}

*** CameraDLL.def

LIBRARY CameraDLL
DESCRIPTION 'A C++ dll that can be called from VB'
EXPORTS
    TestDLL2 @1

*** stdafx.cpp : source file that includes just the standard includes
// CameraDLL.pch will be the pre-compiled header
// stdafx.obj will contain the pre-compiled type information
#include "stdafx.h"

*** CameraDLL.h

#ifndef CAMERADLL_H
#define CAMERADLL_H
// This function is exported from the CAMERADLL.dll
void __declspec(dllexport) TestDLL2(float *xoutarray, float *zoutarray);
#endif

*** digiclops.h

#ifndef DIGICLOPS_H
#define DIGICLOPS_H

#ifdef WIN32
#ifdef DIGICLOPSLIB_EXPORTS
#define DIGICLOPSLIB_API __declspec( dllexport )
#else
#define DIGICLOPSLIB_API __declspec( dllimport )
#endif
#else
#define DIGICLOPSLIB_API
#endif

// PGR Includes

#ifdef __cplusplus
extern "C"
{
#endif

// Macro Definitions
// Group = Macro Definitions
// The version of the library
#define DIGICLOPS_VERSION 2339

#define TOP_IMAGE    0x1  // 32bpp RGBU packed image from the top camera
#define LEFT_IMAGE   0x2  // 32bpp RGBU packed image from the left camera
#define RIGHT_IMAGE  0x4  // 32bpp RGBU packed image from the right camera
#define STEREO_IMAGE 0x8  // 24bpp unpacked image from all cameras
#define ALL_IMAGES ( TOP_IMAGE | LEFT_IMAGE | RIGHT_IMAGE | STEREO_IMAGE )
#define DEFAULT_IMAGE_BUFFER NULL

// Type Definitions and Enumerations
// Group = Type Definitions
typedef void* DigiclopsContext;

// LEFT_IMAGE, RIGHT_IMAGE, and STEREO_IMAGE
typedef unsigned long DigiclopsImageType;

// Description: the error codes returned by the functions in this library
typedef enum DigiclopsError
{
   DIGICLOPS_ok,                           // Function completed successfully
   DIGICLOPS_ALREADY_INITIALIZED,          // Device already initialized
   DIGICLOPS_ALREADY_STARTED,              // Grabbing has already been started
   DIGICLOPS_CALLBACK_NOT_REGISTERED,      // Callback is not registered
   DIGICLOPS_CALLBACK_ALREADY_REGISTERED,  // Callback is already registered
   DIGICLOPS_CAMERACONTROL_PROBLEM,        // Problem controlling camera
   DIGICLOPS_COULD_NOT_OPEN_FILE,          // Failed to open file
   DIGICLOPS_FAILED,                       // General failure
   DIGICLOPS_INVALID_ARGUMENT,             // Invalid argument passed
   DIGICLOPS_INVALID_CONTEXT,              // Invalid context passed
   DIGICLOPS_INVALID_IMAGE_TYPE,           // Invalid image type passed
   DIGICLOPS_ISOCH_GRAB_ERROR,             // Isoch grab error occurred
   DIGICLOPS_ISOCH_GRAB_NOT_STARTED,       // Isoch grab not started
   DIGICLOPS_ISOCH_GRAB_SHUTDOWN_ERROR,    // Error shutting down isoch grab
   DIGICLOPS_MEMORY_ALLOC_ERROR,           // Memory allocation error
   DIGICLOPS_NO_IMAGE,                     // DigiclopsGrabImage() not called
   DIGICLOPS_NO_TRICLOPS_CONTEXT_FOUND,    // No Triclops context found on camera
   DIGICLOPS_NOT_IMPLEMENTED,              // Function not implemented
   DIGICLOPS_NOT_INITIALIZED,              // Device not initialized
   DIGICLOPS_NOT_STARTED,                  // DigiclopsStart() not called
   DIGICLOPS_MAX_BANDWIDTH_EXCEEDED,       // Request would exceed maximum bandwidth
   DIGICLOPS_NON_PGR_CAMERA,               // Attempt to use driver on non-PGR camera
   DIGICLOPS_INVALID_MODE,                 // Invalid mode or framerate set or retrieved
   DIGICLOPS_ERROR_UNKNOWN,                // Unknown error
} DigiclopsError;

// DigiclopsBusNotificationCallback
#define DIGICLOPS_BUS_INVALID
#define DIGICLOPS_BUS_VALID
// FLYCAPTURE_BUS_INVALID or FLYCAPTURE_BUS_VALID
typedef void DigiclopsBusNotificationCallback( int iBusNotificationMsg );

// The type used to store the serial number uniquely identifying a Digiclops
typedef unsigned long DigiclopsSerialNumber;

// An enumeration used to identify the different image resolutions that are
// supported by the library
typedef enum DigiclopsImageResolution
{
   DIGICLOPS_160x120,   // 160 x 120 resolution
   DIGICLOPS_320x240,   // 320 x 240 resolution
   DIGICLOPS_640x480,   // 640 x 480 resolution
   DIGICLOPS_1024x768,  // 1024 x 768 resolution
} DigiclopsImageResolution;

// An enumeration used to identify the settings for the size of the output
// image relative to the camera sensor resolution
typedef enum DigiclopsOutputImageResolution
{
   DIGICLOPS_FULL,  // full resolution images
   DIGICLOPS_HALF,  // half resolution images
} DigiclopsOutputImageResolution;

// An enumeration used to describe the different camera color configurations
typedef enum DigiclopsCameraType
{
   DIGICLOPS_BLACK_AND_WHITE,  // black and white system
   DIGICLOPS_COLOR             // color system
} DigiclopsCameraType;

// An enumeration used to describe the camera device currently being controlled
typedef enum DigiclopsCameraDevice
{
   DIGICLOPS_DEVICE_DIGICLOPS,  // Digiclops 3-camera system
   DIGICLOPS_DEVICE_BUMBLEBEE,  // Bumblebee 2-camera system
} DigiclopsCameraDevice;

// An enumeration used to describe the different color processing methods
typedef enum DigiclopsColorMethod
{
   DIGICLOPS_DISABLE_COLOUR_PROCESSING,  // no color processing
   DIGICLOPS_NEAREST_NEIGHBOR,           // nearest neighbor de-mosaicing
   DIGICLOPS_EDGE_SENSING                // edge sensing de-mosaicing
} DigiclopsColorMethod;

// A record used in querying the Digiclops properties
typedef struct DigiclopsInfo
{
   DigiclopsSerialNumber SerialNumber;  // Camera serial number
   DigiclopsImageResolution ImageSize;  // CCD resolution
   DigiclopsCameraType CameraType;      // Type of CCD (color or b&w)
   DigiclopsCameraDevice CameraDevice;  // Type of device
} DigiclopsInfo;

// The low-level 1394 bus timestamp. Note that this is not an absolute
// epoch-based timestamp; it is a relative timestamp that wraps around
// every 128 seconds
typedef struct Digiclops1394Timestamp
{
   unsigned long ulCycleSeconds;  // The current seconds value of the 1394 bus time (0-127)
   unsigned long ulCycleCount;    // The current cycle count value of the 1394 bus time (0-7999)
} Digiclops1394Timestamp;

// A wrapper for a TriclopsInput that includes a Digiclops1394Timestamp
typedef struct DigiclopsImage
{
   Digiclops1394Timestamp digiclopsTimestamp;  // The low-level timestamp value for this DigiclopsImage
   TriclopsInput triclopsInput;                // The TriclopsInput structure contained in this DigiclopsImage
} DigiclopsImage;

// An enumeration of the different camera properties that can be set via the
// programming interface
typedef enum DigiclopsCameraProperty

TriclopsBool
ppmWriteFromTriclopsInputWithComment( const char*    filename,
                                      const char*    comment,
                                      TriclopsInput* input );

#ifdef __cplusplus
}
#endif

#endif //#ifndef _PNMUTILS_H_

Appendix B.2: BR Algorithms for Estimating Freespace and User Intention in Semi-autonomous Wheelchair Control (LabVIEW)

The LabVIEW software was used to program the wheelchair control. First, the LabVIEW program calls the DLL files programmed in C++, as shown in Figure B.1. In addition, the program displays freespaces and obstacles. Bayesian Recursive (BR) algorithms were designed to estimate freespaces for the wheelchair decisions in various environments.

Figure B.1: Reading DLL files

The block diagram shown in Figure B.2 initializes the 2D array at a resolution of 0.01 m; the X-array spans -1.8 m to 1.7 m and the Z-array extends to 5 m. The computation of the 2D distance map, repeated ten times, is shown in Figure B.3.

Figure B.2: Initialization of array X

Figure B.3: Computation of the 2D distance map

Figure B.4: Wheelchair control system based on freespace values and camera calibration

The wheelchair control system was computed based on the Z-X coordinates of the 2D distance map, as shown in Figure B.4. The wheelchair speed is computed depending on several parameters, such as the freespace values, the "obstacle" distance and the installed diameter value of the wheelchair. The block diagrams programmed to estimate freespace and user intention are shown in Figures B.5 to B.7. The block diagram of the system and its interface are shown in Figures B.8 and B.9.

Figure B.5: Computation of the BR algorithm

Figure B.6: Loops of the probabilistic computation

Figure B.7: Probabilistic computation for ten times

CERTIFICATE OF AUTHORSHIP/ORIGINALITY

I certify that the work in this thesis has not previously been submitted for a degree, nor has it been submitted as part of the requirements for a degree, except as fully acknowledged within the text. I also certify that the thesis has been written by me. Any help that I have received in my research work and in the preparation of the thesis itself has been acknowledged. In addition, I certify that all information sources and literature used are indicated in the thesis.

Signature of Candidate
