SERVO Magazine
Vol. 6 No. 11, November 2008. U.S. $5.50, Canada $7.00.
On the cover: Self-Reassembling Robots, Propeller, Universal Motor, Robot Puppet.

SparkFun Electronics advertisement: Let your geek shine. Meet Pete Lewis, lead vocalist for the band Storytyme. Pete recently created the RS1000, a new personal monitor system for performing musicians. It was SparkFun's tutorials, products, and PCB service that enabled him to take his idea to market in less than a year. The tools are out there. Find the resources you need to let your geek shine too. Hear music from Storytyme at www.storytymeband.com, or check out Pete's RS1000 at www.rockonaudio.com. Sharing Ingenuity. www.sparkfun.com. ©2008 SparkFun Electronics, Inc. All rights reserved.

THE COMBAT ZONE
26 Build Report: 30 Pound Combat Robot — Mitch
28 Manufacturing: Even More Things to Consider When Building a Fighting Robot
33 Parts is Parts: Chain Length Calculator and Chain Path Visualizer
35 A Brief History of WAR Events
30 Results and Upcoming Competitions
31 Event Report: Robot Battles 2008
34 Robot Profile: Limblifter

SERVO Magazine (ISSN 1546-0592/CDN Pub Agree #40702530) is published monthly for $24.95 per year by T & L Publications, Inc., 430 Princeland Court, Corona, CA 92879. Periodicals postage paid at Corona, CA and at additional entry mailing offices. POSTMASTER: Send address changes to SERVO Magazine, P.O. Box 15277, North Hollywood, CA 91615 or Station A, P.O. Box 54, Windsor ON N9A 6J5; cpcreturns@servomagazine.com.

Departments
06 Mind/Iron
18 New Products
20 Events Calendar
72 SERVO Webstore
75 Robotics Showcase
81 Robo-Links
81 Advertiser's Index

Columns
08 Robytes by Jeff Eckert: Stimulating Robot Tidbits
10 GeerHead by David Geer: Self-Reassembling Robot
14 Ask Mr. Roboto by Dennis Clark: Your Problems Solved Here
21 Twin Tweaks by Bryce and Evan Woolley: Surveyor's Travels
68 Robotics Resources by Gordon McComb: Hand Tools for Robot Construction
76 Appetizer by R. Steven Rainwater: Why Just Build a Robot? Be a Robot!
78 Then and Now by Tom Carroll: Robot Competitions and Contests

Features & Projects
36 A Robotic Puppet by John Blankenship and Samuel Mishal. See how to implement computer control to give your robots the illusion of life.
40 The Universal Motor by Fred Eady. This circuit gives you effective control of the AC power that is being applied to your robot's motor without having to pamper the microcontroller.
48 Getting Control With the Propeller, Part 3 by Kevin McCullough. Stepper motors.
52 The Pico ITX Johnny 5 Project by Andrew Alter. Part 3 shows the advantages of having an onboard PC.
56 When LEGO Meets Sumo by Phil Malone. See how combining two styles of robots adds up to a unique competition.
60 Build a GPS Smart Logger by Michael Simpson. You'll want to get started building this device, as it will be incorporated into the Ultimate Robot Build series that picks up next month.

Published monthly by T & L Publications, Inc., 430 Princeland Ct., Corona, CA 92879-1300. Phone (951) 371-8497; fax (951) 371-3052; webstore only 1-800-783-4624; www.servomagazine.com. Subscriptions: toll free 1-877-525-2539; outside US 1-818-487-4545; P.O. Box 15277, N. Hollywood, CA 91615.
PUBLISHER: Larry Lemieux, publisher@servomagazine.com
ASSOCIATE PUBLISHER / VP OF SALES/MARKETING: Robin Lemieux, display@servomagazine.com
EDITOR: Bryan Bergeron, techedit-servo@yahoo.com
TECHNICAL EDITOR: Dan Danknick, dan@teamdelta.com
CONTRIBUTING EDITORS: Jeff Eckert, Tom Carroll, Gordon McComb, David Geer, Dennis Clark, R. Steven Rainwater, Fred Eady, Kevin Berry, Bryce Woolley, Evan Woolley, Andrew Alter, Phil Malone, Michael Simpson, Samuel Mishal, John Blankenship, Kevin McCullough, Ray Billings, Mike Jeffries, Charles Guan, Robert Farrow
CIRCULATION DIRECTOR: Tracy Kerley, subscribe@servomagazine.com
MARKETING COORDINATOR, WEBSTORE: Brian Kirkpatrick, sales@servomagazine.com
WEB CONTENT: Michael Kaudze, website@servomagazine.com
PRODUCTION/GRAPHICS: Shannon Lemieux, Joe Keungmanivong
ADMINISTRATIVE ASSISTANT: Debbie Stauffacher

Copyright 2008 by T & L Publications, Inc. All rights reserved. All advertising is subject to publisher's approval. We are not responsible for mistakes, misprints, or typographical errors. SERVO Magazine assumes no responsibility for the availability or condition of advertised items or for the honesty of the advertiser. The publisher makes no claims for the legality of any item advertised in SERVO; this is the sole responsibility of the advertiser. Advertisers and their agencies agree to indemnify and protect the publisher from any and all claims, action, or expense arising from advertising placed in SERVO. Please send all editorial correspondence, UPS, overnight mail, and artwork to: 430 Princeland Court, Corona, CA 92879.

Mind/Iron
by Bryan Bergeron, Editor

I just finished listening to book one of Kevin Anderson's Saga of Seven Suns, in which robots play a central role. In the story, the Klikiss robots are highly intelligent, multi-limbed, bug-like creatures that communicate with other robots using digital data streams and with humans via speech. The tale reminded me that at least one perception of intelligent robots revolves around the power of speech.

Unfortunately, progress in robotic speech is relatively stagnant. Speech synthesis has been a mature technology for decades, and advances in large-vocabulary, continuous speech recognition seem to have hit a wall in the late 1990s. This is in part because the projected multi-billion dollar market for PC-based speech recognition document processing products never materialized. Today, few people even take notice of the speech recognition software available for the PC and Mac, and most hate the speech recognition systems used by the automated attendants of the airlines and credit card industries.

Despite the mystique of "AI" surrounding speech recognition, the software you can purchase for your PC or Mac works by simply matching spectral templates of sounds and using tables of likely word sequences to build sentences. For example, if you say "ball," the speech recognition software identifies likely candidates such as "ball," "fall," and "gall." Now, if the previous three words are "Johnny hit the," the algorithm will rank "ball" as the most probable word. Accuracy currently tops out at about 97%, even with individual training, and it isn't improved by adding processing power or memory.

The obvious limitation of current speech recognition software is that it's simply a replacement for the keyboard and video display. There is no underlying intelligence or reasoning capability.
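The "table of likely word sequences" step is easy to picture in code. What follows is a minimal sketch of that kind of re-ranking, not the algorithm inside any particular recognizer: the tiny bigram table and the rank_candidates function are invented purely for illustration, and a real system would use smoothed probabilities over far more context.

# Hypothetical counts of how often each word follows "the" in some corpus.
bigram_counts = {
    ("the", "ball"): 120,
    ("the", "fall"): 15,
    ("the", "gall"): 1,
}

def rank_candidates(previous_word, candidates):
    """Order acoustically similar candidates by how common each
    previous-word/candidate pair is in the counts table."""
    return sorted(
        candidates,
        key=lambda word: bigram_counts.get((previous_word, word), 0),
        reverse=True,
    )

# "Johnny hit the ..." -- the acoustic front end hears something close
# to "ball," "fall," or "gall"; the sequence table breaks the tie.
print(rank_candidates("the", ["gall", "fall", "ball"]))
# -> ['ball', 'fall', 'gall']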
Of course, prototype systems capable of reasoning have been developed in academia, but these demonstration projects have been limited to highly constrained domains. What we need in robotics is a system that not only recognizes the phrase "Johnny hit the ball," but can also infer what he hit it with. If Johnny is playing soccer, we might infer he hit the ball with his head. If the sport is baseball, then we might infer he used a bat.

Back to our needs in robotics: the owner of a service bot should be able to say, "Please bring me the paper," and the robot should be able to infer that the owner is referring to the newspaper. There are also issues of image recognition, mobility, and grasping the paper, but they all depend on the robot understanding the need of the owner. The limitation of speech recognition in robotics, then, isn't in the ability to transform utterances into machine-readable form, but in how the computational elements of the robot should turn the machine-readable words and phrases into actionable commands.

So, how do you go about accomplishing this? It's a non-trivial task, as a search of the IEEE literature on natural language processing will illustrate. The traditional techniques, such as hidden Markov modeling, might be a bit intimidating if you don't have a degree in computer science. However, you can get a feel for the tools used to map out the contextual meanings of words and phrases by working with Personal Brain. You can download the free, fully functional personal version at www.thebrain.com.

You can use the Brain to build context maps that show, for example, inheritance and the relationships between various objects in your home (see Figure 1). For your robot to bring you the newspaper, it would have to first locate the paper, and it would help to know the possible locations where the paper might be found in the home. It would be inefficient, for example, if the robot began digging through your clothes closet in search of the newspaper instead of checking the table in your kitchen.

Once you get used to working with Personal Brain, you might want to explore other uses in robotics. For example, I keep track of my various robotic projects (parts, suppliers, references, etc.) by creating networks with the program. In fact, the best way to build context maps is to create explicit, detailed maps that actually help you in everyday tasks. SV
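The kind of context map described above can also be sketched as a plain data structure. The snippet below is a minimal illustration only, unrelated to Personal Brain's file format or to any particular robot; the object-to-location table and the search_order function are invented for the example.

# A hypothetical context map: for each object, the places it is likely
# to be found, ordered from most to least likely. In a real system this
# might come from a knowledge base or from logged observations.
context_map = {
    "newspaper": ["kitchen table", "front porch", "recycling bin"],
    "tv remote": ["couch cushions", "coffee table"],
    "car keys": ["key hook", "kitchen counter", "coat pocket"],
}

def search_order(requested_object):
    """Return the locations a robot should check, most likely first."""
    return context_map.get(requested_object, [])

# "Please bring me the paper" -- once the utterance is mapped to the
# object "newspaper," the map keeps the robot out of the clothes closet
# and sends it to the kitchen table first.
for location in search_order("newspaper"):
    print("check:", location)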
Robytes
by Jeff Eckert

Bot Gets Bio Brain

Placing a functioning human brain into a robot is still well within the realm of science fiction, but some folks at the University of Reading (www.reading.ac.uk) have created a biological brain of sorts and hooked it up as a robot controller. It has been known for some time that cultured neurons are somewhat like ants scattered away from the anthill, in that they can no longer function as a single unit. However, when interconnected in a culture dish, such neurons form simple networks that display spontaneous electrical activity and can function as memories; i.e., they can "learn" things.

In this application, Prof. Kevin Warwick and associates placed the neurons on a multielectrode array, a dish that employs 60 electrodes to pick up the cells' signals. This activity is then used to control the robot's movement. When the robot approaches an obstacle, signals are sent to the "brain," and its responses are used to drive the wheels left or right to avoid hitting the object.

The research is not aimed at creating biomechanical robots of the future, however. Rather, according to Warwick, "The key aim is that eventually this will lead to a better understanding of development and of diseases and disorders which affect the brain such as Alzheimer's disease, Parkinson's disease, stroke, and brain injury. This research will move our understanding forward of how brains work, and could have a profound effect on many areas of science and medicine."

This small mobile robot sports a biological controller based on cultured neurons. Courtesy of the University of Reading.

Give Us Some Skin

There's a basic problem with creating a layer of skin for a robot. For the skin to provide tactile feedback, it must be able to conduct signals back to the "brain." And if the skin is pliable enough to bend with the bot's movements, it has to be made of something flexible, like rubber. The snag is that rubber is a terrible conductor. But now researchers at the University of Tokyo (www.u-tokyo.ac.jp) say they have developed a new, highly conductive rubber, paving the way for robots with stretchable "e-skin." The trick was to grind up some carbon nanotubes, mix them with an ionic liquid, and blend the result into the rubber. The resulting material flexes like ordinary rubber but offers conductivity about 570 times higher. Apparently, one can use it to create elastic ICs that can be mounted on curved surfaces and stretched up to 1.7 times their original size with no mechanical damage or significant change in conductivity. (You can stretch the stuff more, but conductivity drops by about 50 percent by the time you get to 2.3 times the original size.) With further development of the material, bots of tomorrow may be able to feel temperature and pressure like we do.

Flexible ICs may give robots a human-like sense of touch. Courtesy of the University of Tokyo.

Must Be Nuts

It isn't immediately apparent how students at Troy High School (www.troyhigh.com) became concerned about the well-being of the world's professional coconut pickers, but they are. It seems that gathering nuts from the "tree of life" requires harvesters to climb 100 ft trees and chop the coconuts down with machetes, which is both dangerous and inefficient. Hence, the "robotic tree climber," which the students developed for the 2008 Lemelson-MIT InvenTeams event. The remarkable feature of the remote-controlled device is that it can accommodate changing tree diameters, thanks to its DryLin® QuadroSlide linear guide system, donated by igus, Inc. (www.igus.com), a manufacturer of various motion-related components and machinery. The developers of the climber tell us that it will allow pickers to scale more than 40 trees per day, as opposed to the present five to 10. Will the resulting glut of coconuts cause a precipitous drop in the price of coconut cream pie? Only time and the commodity markets will tell.

This climber bot could boost coconut pickers' productivity by 800 percent.

Heli See, Heli Do

In the past, programming robotic helicopters has been something of a pain, given that they must perform some fairly complex maneuvers and (unlike fixed-wing vehicles) are inherently unstable. But computer scientists at Stanford University (www.stanford.edu), tired of laboriously pecking out source code from scratch, have developed AI algorithms that allow their four-foot autonomous helicopter fleet to teach itself to fly.
The process involves both ground-based and 'copter-mounted instruments, including accelerometers, gyros, magnetometers, GPS receivers, and cameras. It begins with a human using a remote control to put a vehicle through a series of stunts, repeating them several times. The instruments record the flight data, which becomes the basis of the control program. The AI system then monitors the resulting autonomous flight data, crunches the numbers, and relays program modifications back to the helicopter 20 times per second, allowing the vehicle to learn from its mistakes and actually perform better than it did under remote control. In the real world, such improved autonomous performance could enable these choppers to be used in mission-critical operations such as monitoring wildfires in real time and searching for land mines in war zones.

Stanford's AI system allows helicopters to learn aerobatic maneuvers by "watching" others. Courtesy of Stanford University.

Bots For Art's Sake

According to Oscar Wilde, "Life imitates art far more than art imitates life," but sometimes art imitates imitations of life, and a couple of interesting works were on display this year.

Perhaps the biggest spectacle centered around La Princesse, a 50 ft (15 m) mechanical spider created by the French performance art company La Machine. The spider was showcased in Liverpool, England, back in September as part of the 2008 European Capital of Culture celebrations. In the photo, we see it clinging to the side of Concourse House, a derelict tower block that was scheduled for later demolition. The spider was built in Nantes, France, using steel, poplar wood, and complex hydraulics, and took an entire year to construct. Operated by up to 12 people strapped to its body, it weighs 37 tonnes, has 50 axes of movement, and offers seven different special effects: rain, flame, smoke, wind, snow, light, and sound. The project cost British taxpayers £1.5 million ($2.6 million), plus the cost of treating unhinged arachnophobia sufferers, but at least admission to the celebration was free.

Less spectacular but (literally) creepy is Miyata Jiro, a crawling humanoid robot created by Japanese-born artist Momoyo Torimitsu, who now resides in New York. Miyata is a detailed and lifelike model of a Japanese "salaryman" who basically crawls around on his elbows like a soldier in the field. He has performed in New York, London, Paris, Amsterdam, Sydney, and Rio de Janeiro so far, evoking responses ranging from laughter to anger. According to Torimitsu, "When Japan entered its high growth period in the 1960s, Japanese society was transformed into a 'businessman culture' characterized by entertainment, movies, karaoke, TV, compartmentalized housing, bars, and even a sex industry that catered to them. This artwork reflects my impression of this particular culture." Miyata can be seen at www.youtube.com/watch?v=glUnzzoFUxg.

You may mistake the performance for just an amusing little parody, but thankfully we have critics to set us straight. According to zingmagazine.com's Rainer Ganahl, "The power and success of this life-sized crawling doll lies in the dramatic representation of a businessman in its most humiliating position: crawling in the street in a suit. This is a strong linguistic metaphor, as well as a psychoanalytical and a pathological one." So there's your enlightenment for the month. SV

Robotics continues to be a popular medium for artists, as demonstrated by Miyata Jiro, the robotic Japanese businessman.
La Princesse, a giant mechanical spider.

GeerHead
by David Geer
A Self-Reassembling Robot

Contact the author at geercom@alltel.net

Ever seen a robot torn apart only to put itself back together? Jimmy Sastra, a student in the Modular Robotics Lab at the University of Pennsylvania, has. He helped create it. As with most scientific endeavors, the Robotic Self-Reassembly After Explosion (SAE) project was a solution to a problem: how to get a robot to reassemble itself after "disassembly" by "explosion" ("Towards Robotic Self-Reassembly After Explosion," the Modular Robotics Lab, University of Pennsylvania, Mark Yim, et al.). Sastra, a named author on the paper and a research student at the university, calls an explosion "the rapid randomized disassembly of a system from a high-energy event." As shown in the video linked herewith, the explosion is the separation of the robot as students kick it apart, splitting it into three parts.

This cluster of five modules shows the camera module attached, top-side.
Close-up of cluster with camera module.

The self-reassembling robot is a precursor to the modular, self-configuring robots of the future, which are envisioned as having many thousands of parts and modules that configure themselves for varying applications or, as in this case, reassemble all their parts after separation by explosion. In this experiment, the goals of the robot are to perform a task, suffer an explosion, reassemble itself, and continue the original task from where it left off.

The robot is designed to disassemble along specific, preselected lines, the weakest links between the modules, in a structured fashion. By ensuring that the robot separates at these "bonds" between the modules, the robot absorbs the shock and disassembles at points where it is capable of reassembling. The self-assembly of the robot is part of a larger plan for self-repair. This type of self-repair involves diagnosis of the problem/break points, a plan for reassembly, and an execution of that plan, according to Yim.

Diagnosis

The robot uses sensors to determine that it is no longer connected to itself. The robot consists of clusters of modules. According to Jimmy Sastra, the clusters are connected to each other at certain modules using magnets. Each module face that is connected to another module face has two IR