
Omnidirectional Robot Design Report

172 1 0

Đang tải... (xem toàn văn)

Tài liệu hạn chế xem trước, để xem đầy đủ mời bạn chọn Tải xuống

THÔNG TIN TÀI LIỆU

Thông tin cơ bản

Định dạng
Số trang 172
Dung lượng 5,18 MB

Nội dung

May 08-35

Client & Faculty Advisor: Dr. Nicola Elia
Date: December 11, 2007

Members:
Alex Baumgarten (CprE)
Mark Rabe (CprE)
Jon Dashner (CprE)
Chris Tott (EE)
Shane Griffith (CprE)
Jon Watson (EE)
Kyle Miller (EE)
Joshua Watt (CprE)

DISCLAIMER: This document was developed as part of the requirements of an electrical and computer engineering course at Iowa State University, Ames, Iowa. The document does not constitute a professional engineering design or a professional land surveying document. Although the information is intended to be accurate, the associated students, faculty, and Iowa State University make no claims, promises, or guarantees about the accuracy, completeness, quality, or adequacy of the information. Document users shall ensure that any such use does not violate any laws with regard to professional licensing and certification requirements. Such use includes any work resulting from this student-prepared document that is required to be under the responsible charge of a licensed engineer or surveyor. This document is copyrighted by the students who produced the document and the associated faculty advisors. No part may be reproduced without the written permission of the senior design course coordinator.

Table of Contents

Introduction
  1.1 Summary
  1.2 Acknowledgements
  1.3 Problem Statement
    1.3.1 Statement
    1.3.2 Approach
  1.4 Strategy
    1.4.1 Concept Sketch
    1.4.2 System Block Diagram
    1.4.3 System Description
    1.4.4 Operating Environment
    1.4.5 User Interface Description
    1.4.6 Market Survey
  1.5 Scope
    1.5.1 Functional Requirements
    1.5.2 Non-Functional Requirements
  1.6 Project Plan
    1.6.1 Resource Requirements
    1.6.2 Project Schedule
Design Approach and Product Design
  2.1 Approach
  2.2 Design Objectives
  2.3 System Decomposition
    2.3.1 Robot System
      2.3.1.1 Functional Requirements
      2.3.1.2 Hardware Specifications
      2.3.1.3 Software Specifications
      2.3.1.4 Input Output Specifications
      2.3.1.5 Test Specifications
    2.3.2 Localization System
      2.3.2.1 Camera
      2.3.2.2 Localization System Options
      2.3.2.3 Design with Camera Implementation
      2.3.2.4 Camera Requirements
      2.3.2.5 Camera Technologies
      2.3.2.6 Camera Specifications
      2.3.2.7 Vision Builder Integration
      2.3.2.8 Localization Requirements
      2.3.2.9 Localization Specifications
      2.3.2.10 Determining the Required Speeds of the Camera & NI Vision Software
      2.3.2.11 Localization System Class Breakdown
      2.3.2.12 Localization Testing
    2.3.3 Logging System
      2.3.3.1 Functional Requirements
      2.3.3.2 Software Specifications
      2.3.3.3 Intermodule Dependencies
      2.3.3.4 User Interface Specifications
    2.3.4 IO Server System
      2.3.4.1 IO Server Requirements
      2.3.4.2 IO Server Specifications
      2.3.4.3 IO Server Module Dependencies
      2.3.4.4 IO Server Inputs/Outputs
      2.3.4.5 IO Server System Functional Breakdown
      2.3.4.6 IO Server Packets
      2.3.4.7 IO Server Data Objects
      2.3.4.8 IO Server Testing
      2.3.4.9 Database Testing
      2.3.4.10 Database Abstraction Functional Requirements
Milestones
  3.1 Power/Telemetry Board
  3.2 Motor Controller Drivers
  3.3 Sensory Controller Drivers
  3.4 I2C Bus Library
  3.5 Artificial Intelligence
  3.6 Main Workstation
  3.7 Localization System
  3.8 Logging System
  3.9 I/O Server
  3.10 Total System Testing
IRP Feedback
Closing Remarks
Team Info
Signatures

List of Figures

Figure 1-1 – Concept Sketch
Figure 1-2 – System Block Diagram
Figure 1-3 – Voltage Regulation and Telemetry System PCB
Figure 1-4 – Group Resources
Figure 1-5 – Project Schedule, Page 1
Figure 1-6 – Project Schedule, Page 2
Figure 1-7 – Project Schedule, Page 3
Figure 1-8 – Project Schedule, Page 4
Figure 1-9 – Project Schedule, Page 5
Figure 1-10 – Project Schedule, Page 6
Figure 2-1 – Main Systems Working Together
Figure 2-2 – Bottom of KoreBot Board
Figure 2-3 – Mechanical Specifications of all K-Team Boards
Figure 2-4 – Power Connector for KoreBot
Figure 2-5 – How the Stack Connectors Work
Figure 2-6 – KB250 Stack Connectors
Figure 2-7 – Location and Pin-out of KoreConnect Connector
Figure 2-8 – Location and Pin-out of JTAG to USB Connector
Figure 2-9 – Supplying +5V to KoreIO LE
Figure 2-10 – Overview of KoreIO LE Hardware
Figure 2-11 – Power Output Diagram
Figure 2-12 – Power Output Supply Connection
Figure 2-13 – Top of KoreMotor Board / Overview of Connectors
Figure 2-14 – Bottom of KoreMotor Board / Overview of Switches
Figure 2-15 – I2C Address Settings of DIP Switch
Figure 2-16 – Pin-out for Motor Controller
Figure 2-17 – Power Connector for KoreMotor
Figure 2-18 – 2232 006SR Physical Dimensions
Figure 2-19 – Power/Telemetry Board Schematic
Figure 2-20 – Top Layer of Power/Telemetry Layout
Figure 2-21 – Bottom Layer of the Power/Telemetry Layout
Figure 2-22 – Schematic of the Voltage Regulator Layout
Figure 2-23 – Software Motor Controller Sequence Diagram
Figure 2-24 – Software Motor Controller Class Diagram
Figure 2-25 – Sensor Processing Class Diagram
Figure 2-26 – Sensor Processing Sequence Diagram
Figure 2-27 – Camera Vision Field
Figure 2-28 – Worst Case Robot Orientation Error
Figure 2-29 – Position Error Due to Robot Orientation
Figure 2-30 – Deceleration Required to Avoid Collision at 2 m/sec
Figure 2-31 – Framerate Required vs Speed
Figure 2-32 – Processing Time vs Robot Speed
Figure 2-33 – Distance Traveled vs Robot Speed
Figure 2-34 – Pipeline for Localization System
Figure 2-35 – Logging System State Diagram
Figure 2-36 – Logging System Class Diagram
Figure 2-37 – Setting Communication Options for Logging System
Figure 2-38 – Stop Logging Option for Logging System
Figure 2-39 – Monitor Sensor Data for Logging System
Figure 2-40 – Main Logging GUI
Figure 2-41 – Option Logging GUI
Figure 2-42 – Variable Logging GUI
Figure 2-43 – IO Server Module Diagram
Figure 2-44 – IO Server Module Dependencies
Figure 2-45 – IO Server Robot Connection System Dependencies
Figure 2-46 – Robot Processor Dependencies
Figure 2-47 – Logging Processor Dependencies
Figure 2-48 – Database Handler Dependencies
Figure 2-49 – Velocity and Acceleration Computer Dependencies
Figure 2-50 – Database Abstraction
Figure 2-51 – Database

List of Tables

Table 1 – KoreBot LE Hardware Specifications
Table 2 – Pin-out for the J700 KB250 Connector
Table 3 – Pin-out for the J701 KB250 Connector
Table 4 – KoreIO LE Hardware Specifications
Table 5 – Input/Output Functions for KoreIO LE
Table 6 – KoreIO LE Power Output Limitations
Table 7 – KoreMotor Hardware Specifications
Table 8 – 2232 006SR Motor Specifications
Table 9 – Rechargeable Battery Comparison
Table 10 – ThunderpowerRC Batteries Specifications
Table 11 – ADXL311 Accelerometer Hardware Specifications
Table 12 – CRS03 Gyroscope Hardware Specifications
Table 13 – Voltage Regulator Comparison
Table 14 – 1394 Camera Comparison
Table 15 – USB Camera Comparison
Table 16 – Computer Vision Localization Research
Table 17 – NI Benchmark for Vision Software

Introduction

The materials found in this introduction section include a summary of the project, acknowledgements, the problem statement along with the approach to a solution, the operating environment, and the end product deliverables.

1.1 SUMMARY

The project client and advisor is Dr. Nicola Elia, a professor in Control Systems at Iowa State University. During the summer of 2006, Dr. Elia and the project team leader, Kyle Miller, started building an omnidirectional robot. With the addition of seven new team members to the project, Dr. Elia would like to see the robot finished, along with an intelligent method of travel and a user interface system.

The group must complete the project plan and design report during the fall 2007 semester. The project plan and design will be carried out during the spring 2008 semester as the implementation of the robot takes place and a prototype is built. Tests on the prototype will then be performed and documented. To finish the semester, a users' manual and a final report will be created.

1.2 ACKNOWLEDGEMENTS

We would like to give special thanks to Lee Harker, Jason Boyd, Steve Kovarik, and Greg Smith for their help and support.

1.3 PROBLEM STATEMENT

This section includes the problem/need statement and the approach to solving this problem.

1.3.1 Statement

The world of robotics is growing, and many people are interested in learning how to design versatile robots. Many limitations exist in robot applications, and engineers around the world must share their ideas and experiences with each other concerning methods to overcome these obstacles. Dr. Nicola Elia will advise the building and testing of an omnidirectional robot. The robot must have the ability to know its surroundings, as well as its heading, speed, acceleration, and rotation. Detailed tests will be performed to characterize the performance of the end product. As problems arise during the next year, they are to be documented, and a detailed description of each solution will be provided so other engineers can learn from this experience.

1.3.2 Approach

An overhead vision-based localization system will be used to capture images of the robot in its surroundings. These images will be sent to a nearby PC that runs image processing software. After these images have been filtered and processed by the PC, information concerning acceptable headings will be sent wirelessly to the robot. The robot will be running intelligent software that uses this information to decide where to move.

A telemetry system, complete with an accelerometer to measure x- and y-acceleration and a gyroscope to measure the robot's rotational rate, will be implemented in the robot system. Using the output of the accelerometer, the speed of the robot can be computed by integrating the acceleration over time.
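As a rough sketch of this computation (not taken from the report; the class and method names below are illustrative assumptions), the velocity estimate can be accumulated from discrete accelerometer samples with trapezoidal integration:

    // Illustrative sketch (assumed names, not from the report): estimates planar
    // velocity by numerically integrating x/y accelerometer samples over time
    // using the trapezoidal rule.
    public class VelocityEstimator {
        private double vx, vy;          // accumulated velocity (m/s)
        private double prevAx, prevAy;  // previous acceleration sample (m/s^2)
        private long prevTimeNs = -1;   // timestamp of the previous sample

        // Feed one accelerometer sample; timestamps are in nanoseconds.
        public void addSample(double ax, double ay, long timeNs) {
            if (prevTimeNs >= 0) {
                double dt = (timeNs - prevTimeNs) / 1e9;  // seconds
                vx += 0.5 * (prevAx + ax) * dt;           // trapezoidal step
                vy += 0.5 * (prevAy + ay) * dt;
            }
            prevAx = ax;
            prevAy = ay;
            prevTimeNs = timeNs;
        }

        // Scalar speed of the robot in m/s.
        public double speed() {
            return Math.sqrt(vx * vx + vy * vy);
        }
    }

An estimate built this way drifts as accelerometer bias accumulates, which is one reason the design pairs the onboard telemetry with the overhead localization system.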
A logging system will be created to track the parameters of the robot. The user interface of the logging system will allow a user to interact with the robot in a number of ways. The user will have the ability to change which robot parameters are tracked during operation. The user can also specify how often the robot parameters are polled.

1.4 STRATEGY

Numerous methods exist for solving this problem; this section describes a general solution. The system layout and the interaction of the system components with one another are described in this section.

1.4.1 Concept Sketch

The concept sketch in Figure 1-1 below portrays the basic concept of the project. The robot must be able to make smart decisions when navigating its area. Using a vision-based localization system, an overhead camera will send images to the CPU, where image processing will be done to determine obstacle and robot locations. The operating plane of the robot is a three meter by two and a quarter meter carpeted surface that the robot is free to roam.

Figure 1-1 – Concept Sketch

1.4.2 System Block Diagram

The robot is equipped with a wireless card and is connected to the server using a wireless network. Information concerning the robot location, robot orientation, and ball location will be sent to the robot using the wireless connection. This information will be processed by the robot, and the robot's main development controller board will send movement commands to the motor controller. With telemetry feedback sent to the development controller board and the server, the localization system will determine the robot's location and the robot will decide which direction it should travel. The CPU will interpret the telemetry feedback to determine how fast the robot is traveling and accelerating, and in what direction. The overall system is portrayed in the system block diagram in Figure 1-2 below.

2.3.4.6 IO Server Packets

DatabaseInformationReply
  Sensor Data List {
    long – Timestamp
    bytes – Robot Identifier
    String – Sensor
    short – sensor information
  }
LoggingCancelToIO
  Robot Identifier List {
    bytes – Robot Identifier
  }
LoggingCancelToRobot
  no information needed
NewLoggingConnection
  ArrayList logList

The LoggingPacket is used to encapsulate the packets generated by the logging system. The LoggingPacket encapsulates six packet types in a payload and uses a header with the following values to identify the encapsulated data (a sketch of one possible representation follows the list):

  RequestForRobotData
  DatabaseRequest
  DatabaseAddition
  DatabaseInformationReply
  LoggingCancelToIO
  LoggingCancelToRobot
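As a sketch of how such a container might look in Java (the report names the header types but this preview does not show their numeric values, so the codes below and the class layout are assumptions):

    // Illustrative container only: the numeric header codes and field layout are
    // assumptions; the report names the header types but not their values.
    public final class LoggingPacket {
        public static final byte REQUEST_FOR_ROBOT_DATA     = 0;
        public static final byte DATABASE_REQUEST           = 1;
        public static final byte DATABASE_ADDITION          = 2;
        public static final byte DATABASE_INFORMATION_REPLY = 3;
        public static final byte LOGGING_CANCEL_TO_IO       = 4;
        public static final byte LOGGING_CANCEL_TO_ROBOT    = 5;

        private final byte header;     // identifies which encapsulated packet follows
        private final byte[] payload;  // serialized encapsulated packet (may be empty)

        public LoggingPacket(byte header, byte[] payload) {
            this.header = header;
            this.payload = payload;
        }

        public byte getHeader()    { return header; }
        public byte[] getPayload() { return payload; }
    }

A receiver such as the IO Server would dispatch on the header value to decide whether to forward the payload to a robot, query the database, or cancel logging, as described for each packet type below.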
The RequestForRobotData packet is generated by the logging system and gets forwarded to the proper robot via the IO Server. The IO Server uses the robot identifier in the packet to determine the connection to forward the data over. If the robot is not connected to the IO Server, a RobotNotConnected ErrorPacket is generated and sent to the logging system. If the robot no longer has the desired sensor onboard, a SensorNotConnected ErrorPacket is generated and sent to the logging system. The robot receiving the RequestForRobotData packet satisfies the logging request by recording the desired sensor data over the specified time interval.

The DatabaseRequest packet is generated by the logging system and tells the IO Server to query the database for the desired information. One timestamp identifies the beginning time of the data to return, and a second timestamp specifies the end time of the data to return. The first timestamp corresponds to the time the RequestForRobotData packets were generated; the second timestamp corresponds to the time the LoggingCancelToIO packets were generated. The sensor string names the desired sensor data, and the robot identifier identifies the robot of the desired data.

The DatabaseAddition packet is generated either by a robot when it is recording logging data or by the IO Server when a new localization packet arrives. The sensor data list length specifies the number of sensor data objects in the sensor data list.

The DatabaseInformationReply packet is generated by the IO Server when the database returns the data from the logging system's DatabaseRequest query. The DatabaseInformationReply first holds an int identifying the number of returned objects in the data list. The sensor data list holds information for each data entry returned.

The LoggingCancelToIO packet is generated by the logging system to cancel logging on the specified robots. The robot identifier list is iterated by the IO Server, and for each robot in the list, the IO Server generates a LoggingCancelToRobot packet. The LoggingCancelToRobot packet is simply a LoggingPacket with the LoggingCancelToRobot header value; no additional payload data is required.

The NewLoggingConnection packet is sent to the logging system when the logging system connects to the IO Server. The packet is generated by the RobotConnectionSystem when the ConnectionManager calls the NewLoggingConnection method of the RobotConnectionSystem. The packet encapsulates the entire robot and sensor data as a RobotSensorsAndID object ArrayList for the logging system.

OperationPacket
  byte – Header
  bytes – robot identifier
  Payload { }

NewRobot {
  byte – Number of sensor objects
  List of Sensor Objects {
    byte – Sensor name length
    String – Sensor
  }
}
RobotDisconnected
  no extra data needed
SensorChange
  String – Sensor

The OperationPacket is used to encapsulate various robot and sensor information packets. The OperationPacket encapsulates four packets in a payload and uses a header with the following values to identify the encapsulated data:

  NewRobot
  RobotDisconnected
  SensorDisconnected
  NewSensor

The OperationPacket also contains a robot identifier, which identifies the robot with the desired changes.

The NewRobot packet is sent to the IO Server by the robot once a connection is established between the two. The IO Server will not forward localization information or logging requests to the robot until the robot delivers the NewRobot packet. The NewRobot packet identifies the sensors available for logging on the robot and the identifier that corresponds to the robot.

The RobotDisconnected packet is generated by the IO Server and sent to the logging system to inform the logging system that the robot has disconnected. No OperationPacket payload data is needed for a RobotDisconnected packet.

The SensorDisconnected and NewSensor packets both only identify the sensor string. Therefore, they use the same encapsulated packet data: a SensorChange payload. The different header values are used to distinguish between the two. If a sensor disconnects from a robot, the robot will send a SensorDisconnected packet to the IO Server, which in turn forwards the packet to the logging system. Both systems will remove the sensor from their corresponding robot information. The NewSensor packet is generated by the robot, sent to the IO Server, and forwarded to the logging system by the IO Server. The NewSensor packet allows the two systems to update their list of available sensors on the corresponding robot.
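To make the NewRobot payload layout above concrete, the sketch below serializes a sensor list with a one-byte count and one-byte length-prefixed names, mirroring the field listing; the helper class name and the US-ASCII encoding are assumptions rather than the team's actual implementation:

    import java.io.ByteArrayOutputStream;
    import java.nio.charset.StandardCharsets;
    import java.util.List;

    // Hypothetical serializer for the NewRobot payload shown above: a one-byte
    // sensor count followed by a length-prefixed name per sensor. The single-byte
    // fields limit a robot to 255 sensors and 255-character sensor names.
    public final class NewRobotPayload {
        public static byte[] encode(List<String> sensorNames) {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            out.write(sensorNames.size());           // byte – number of sensor objects
            for (String name : sensorNames) {
                byte[] bytes = name.getBytes(StandardCharsets.US_ASCII);
                out.write(bytes.length);              // byte – sensor name length
                out.write(bytes, 0, bytes.length);    // sensor name characters
            }
            return out.toByteArray();
        }
    }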
2.3.4.7 IO Server Data Objects

RobotConnections
  ArrayList robots

RobotData
  RobotProcessor connectionThread
  RobotSensorsAndID robotInfo

RobotSensorsAndID
  String RobotIdentifier
  ArrayList SensorData

LocalizationHistory
  ArrayList SecondOfHistory

2.3.4.8 IO Server Testing

Accuracy Testing

The accuracy of the IO Server will be tested with two different tests. First, the accuracy of data transmission is tested by sending a packet with predetermined data from the IO Server to each system. The packet will be echoed back and compared with the original data. The test is a success if the data does not change during the transmissions between any of the systems. Second, the accuracy of the VelocityAndAccelerationComputer module is determined through comparisons with measured velocity and acceleration values.

Efficiency Testing

Localization accuracy depends on how efficiently the IO Server forwards the localization data. The efficiency of the IO Server is tested by checking the time for localization packet delivery. Successful tests will show the IO Server giving precedence to LocalizationPacket data over every other packet. A second efficiency test determines the IO Server's ability to route logging requests to specific robots (as opposed to broadcasting requests). Successful tests will show the IO Server sending logging requests only to the specified robot.

2.3.4.9 Database Testing

The database will store all of the data from the IO Server. The database is tested by comparing static values sent to the database with what the database returns. Successful tests will show the database returning the exact data sent to it.

2.3.4.10 Database Abstraction Functional Requirements

Database Abstraction shall provide an interface to store information in the SQL database.
Database Abstraction shall provide an interface to get all information stored in the SQL database.
Database Abstraction shall provide an interface to get all information stored in the SQL database based on a specific Robot ID.
Database Abstraction shall provide an interface to get all information stored in the SQL database based on a specific sensor name.
Database Abstraction shall provide an interface to get all information stored in the SQL database based on a specific Robot ID and sensor name.
Database Abstraction shall provide an interface to flush all information out of the SQL database.

Figure 2-50 – Database Abstraction

Figure 2-51 – Database
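One way to read the six Database Abstraction requirements in Section 2.3.4.10 is as a single interface with one method per requirement; the sketch below is illustrative only, and the method names and the SensorRecord row type are assumptions:

    import java.util.List;

    // Illustrative mapping of the Database Abstraction requirements onto a single
    // interface; the method names and the SensorRecord row type are assumptions.
    public interface DatabaseAbstraction {
        void store(SensorRecord record);                        // store information
        List<SensorRecord> getAll();                            // all stored information
        List<SensorRecord> getByRobot(String robotId);          // filter by Robot ID
        List<SensorRecord> getBySensor(String sensorName);      // filter by sensor name
        List<SensorRecord> getByRobotAndSensor(String robotId,  // filter by both
                                               String sensorName);
        void flush();                                           // remove all information

        // Hypothetical row type: one logged sensor sample, mirroring the sensor
        // data fields used by the DatabaseAddition packet.
        final class SensorRecord {
            public final long timestamp;
            public final String robotId;
            public final String sensorName;
            public final short value;

            public SensorRecord(long timestamp, String robotId,
                                String sensorName, short value) {
                this.timestamp = timestamp;
                this.robotId = robotId;
                this.sensorName = sensorName;
                this.value = value;
            }
        }
    }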
Milestones

Based on the deliverables stated before, this section outlines the completed milestones which have been achieved and the uncompleted milestones which the group must achieve.

3.1 POWER/TELEMETRY BOARD

Completed
  Schematic
    Completed Component Research
    Completed Cadence Research
    Completed Part Library
    Added Components
    Laid out connections
    Verified with Lee Harker and Dr. Smith
  Layout
    Completed Cadence Layout Program Research
    Completed Component Specification
    Created Component Footprints
    Verified with Lee Harker and Dr. Smith
    Received Completed Layout from Lee Harker
  Mock Up Board
    Completed Component to Board Soldering
    Completed Board Mounting
    Complete Testing

Uncompleted
  Schematic
    Develop Battery Over-Discharge Protection
  Layout
    Layout Construction
    Component Library Updates
  Mock Up Board
    Completing Test Plan
    Completing Noise Measurements
    Completing Accuracy Testing

3.2 MOTOR CONTROLLER DRIVERS

Completed
  Completed Planning
  Completed Design
  Controlled Open and Closed Loop Motor System
  Coordinated Static Commands Motor Control
  Experimented with motor controller settings

Uncompleted
  Accept and execute movement commands
  Determine optimum PID values
  Determine and implement efficient patterns
  Implement safety mechanism

3.3 SENSORY CONTROLLER DRIVERS

Completed
  Completed Planning
  Completed Design
  Read Sensor Data

Uncompleted
  Read and process data from sensors

3.4 I2C BUS LIBRARY

Completed
  Completed Planning
  Completed Design

Uncompleted
  Characterize limitations of the I2C bus
  Regulate access to the I2C bus

3.5 ARTIFICIAL INTELLIGENCE

Completed
  Completed Planning
  Completed Software Design

Uncompleted
  Move robot in static pattern
  Communicate with I/O Server
  Move KoreBot remotely
  Make decisions based on data sent from Localization System
  Make decisions based on Sensory Controller Drivers
  Utilize all systems to coordinate movement
  Follow an object as it moves

3.6 MAIN WORKSTATION

Completed
  Set up Network File System (NFS)

Uncompleted
  Set up Database

3.7 LOCALIZATION SYSTEM

Completed
  Completed Localization Research
  Received National Instruments LabVIEW software
  Developed Experimental Object Detection Programs
  Completed Initial Camera Testing
  Completed Localization Planning
  Completed Localization Design
  Completed Camera Specifications

Uncompleted
  Purchase Camera
  Complete Object Detection
  Complete Pattern Detection Module
  Complete Communication Module
  Complete Functionality Testing
  Complete Performance Testing

3.8 LOGGING SYSTEM

Completed
  Completed Planning
  Completed Design
  Completed Graphical User Interface

Uncompleted
  Complete Data Module
  Complete GUI Modules
  Complete Communication Module
  Complete Functionality Testing

3.9 I/O SERVER

Completed
  Completed Planning
  Completed Design

Uncompleted
  Complete KoreBot Communication Module
  Complete Logging System Communication Module
  Complete Localization System Communication Module
  Complete Database Abstraction Layer
  Set up Database
  Complete Velocity/Acceleration Computation Module
  Complete Functionality Testing
  Complete Performance Testing

3.10 TOTAL SYSTEM TESTING

Completed
  Created Main Testing Maneuvering Field

Uncompleted
  System Latency
  System Performance
  Robot Position, Speed, and Acceleration

IRP Feedback

The group made changes to the project design based on the feedback of the Industry Review Panel. Each statement made by the IRP was noted and is listed below, along with the corresponding adjustment the group made to the project design.

Statement – Current sense must be implemented on the motors.
Adjustment – The KoreMotor II board allows for current sensing.

Statement – Research issues the Cornell team had with their design and make adjustments to make the design more similar to theirs.
Adjustment – The Cornell design has been researched consistently throughout the project.

Statement – Run a risk analysis and adjust the project schedule to allow for possible setbacks.
Adjustment – The team has run a risk analysis and adjusted the project schedule to lengthen the amount of time spent on the most risk-prone milestones.

Statement – State how much time is going to be spent on each individual task.
Adjustment – The time allotted for each task is provided in the project schedule.

Statement – Make the design more hierarchical.
Adjustment – A milestones section has been added to the design document with a top-down approach.

Closing Remarks

Designing and building a robot with the ability to travel in any direction at any point in time is a difficult task; however, providing detailed documentation concerning this experience can benefit many other engineers as they look to build similar devices. The level of knowledge of robotics in the local community can also be raised through demonstrations and explanations of the robot system.
Through the collective efforts of the team, the design of the robot system will be laid out and then implemented as a fully functional robot. All of the robot's parameters will then be tested and a User's Manual will be created. Finally, a complete project report will be produced.

Team Info

The contact information for each team member is provided below.

Dr. Nicola Elia – Professor at Iowa State University
  E-mail: nelia@iastate.edu
  Phone: 515-294-3579

Alex Baumgarten – Computer Engineering
  E-mail: abaumgar@iastate.edu
  Phone: 563-650-4864

Jon Dashner – Computer Engineering
  E-mail: jdashner@iastate.edu
  Phone: 712-527-0156

Shane Griffith – Computer Engineering
  E-mail: shaneg@iastate.edu
  Phone: 319-213-2214

Kyle Miller – Electrical Engineering
  E-mail: milleky@iastate.edu
  Phone: 641-420-3678

Mark Rabe – Computer Engineering
  E-mail: mrabe@iastate.edu
  Phone: 319-330-6449

Chris Tott – Electrical Engineering
  E-mail: totty@iastate.edu
  Phone: 712-898-3105

Jon Watson – Electrical Engineering
  E-mail: jonwat@iastate.edu
  Phone: 515-520-9569

Joshua Watt – Computer Engineering
  E-mail: jpew@iastate.edu
  Phone: 515-451-8200

Signatures

The following signatures indicate the approval of the statements and conclusions provided in this document.

Client/Advisor: Dr. Nicola Elia

Alex Baumgarten (CprE)
Jon Dashner (CprE)
Shane Griffith (CprE)
Kyle Miller (EE)
Mark Rabe (CprE)
Chris Tott (EE)
Jon Watson (EE)
Joshua Watt (CprE)
