ARTICLE
International Journal of Advanced Robotic Systems

Verification of a Program for the Control of a Robotic Workcell with the Use of AR

Regular Paper

Jozef Novak-Marcincin 1,*, Miroslav Janak 1, Jozef Barna 1, Jozef Torok 1, Ludmila Novakova-Marcincinova 1 and Veronika Fecova 1

1 Department of Manufacturing Technologies, Faculty of Manufacturing Technologies, Presov, Slovakia
* Corresponding author E-mail: jozef.marcincin@tuke.sk

Received 01 May 2012; Accepted 21 Jun 2012

DOI: 10.5772/50978

© 2012 Marcincin et al.; licensee InTech. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract This paper provides, in the form of a theoretical discussion and the presentation of a practical example, information about the possibilities of using elements of augmented reality to create programs for the control of a robotic workplace and to verify them in simulation. It first gives an overview of the current state of robotic systems that make use of virtual objects and describes existing and anticipated approaches. The next part describes an experimental robotic workplace. The paper then explains the realization of a new way of verifying the control program of the robotic workplace and outlines the possibilities for further development of the concepts created.

Keywords robotic workcell, control, augmented reality

1. Introduction

Current manufacturing industries experience dynamic innovation. Product life cycles are shortening and product ranges are becoming more diversified, all within the frame of progressive globalization; at the same time there is a shortage of skilled workers, who moreover represent high costs. A natural solution for achieving both productivity and flexibility is automation based on industrial robots. However, the creation of a control program for an industrial robotic system for a specific application is still very difficult, time-consuming and expensive, and small enterprises can have enormous difficulties taking advantage of robotic automation.

In practice today there are two main categories of robotic programming methods: online and offline programming. In online programming, a teach pendant is usually used for manual movements of the effector at each stage of the realized task. The robot controller records the configurations and a program is written that includes all the paths, postures and actions. This is only suitable for simpler processes and geometries, and the quality of the program naturally corresponds to the skill of the operator. Despite these facts, this intuitive and rather cheap solution is widely used.

In the field of offline programming some new methods have been proposed. For example, the OLP method uses a complete 3D model of the robotic workcell and transfers the tasks of the robot operator to the software engineer. In comparison to the online programming method it provides increased flexibility, but it usually requires additional setting procedures and calibration [1, 2].
The programming and verification method proposed in this paper does not require large capital investment and tries to combine the advantages of both basic methods. It is a solution for a robotic workcell in which elements of augmented reality serve as the bridge between programming and its simulated verification.

2. Methods of Creating Programs for Robot Control

2.1 Online Programming

Online programming is usually carried out by skilled robot operators. They guide the robot along the required trajectory using a teach pendant; this is called the lead-through method. While jogging the robot through the desired path, the robot controller records the specific points and uses them for the creation of motion commands according to the path definition. Although this method is simple and widely used, it has several disadvantages. The operator must always keep track of the coordinate frame of the current jogging action, which can be quite complicated. Once the program is written it requires a lot of testing to assure reliability, accuracy and operational safety. Moreover, the program itself is not very flexible with respect to the need to adapt to different conditions (workpiece, robot position). With online programming the robot is also excluded from the production cycle while it is being programmed. In spite of all these facts, online programming is still the usual method in small companies (Figure 1).

Figure 1. Conventional online programming training in the company KMT Robotic Solutions, MI, USA

Techniques of online programming have been improved using different sensors for the detection of forces and positions, and possibly beam sensors and cameras. In some cases these enhancements even remove the necessity of jogging, as the robot is able to determine (physically or visually check) the required path itself. Some authors state that the accuracy of the final program then need not rely on the skill of the robot operator and that a 3D robot path with higher accuracy can be generated automatically. This would present a significant advantage, especially for applications where the process tools are in contact with the workpiece or a surface (machining, etc.) [1].

2.2 Offline Programming

Offline programming methods have been developed to avoid some of the disadvantages of the online form. Their characteristic feature is a PC-based offline programming interface which is connected to the robot controller. Among the known and common techniques we can mention so-called graphical programming. This is based on the acquisition of the 3D geometrical data of the workpiece, the robotic device and its environment (machines, fixtures, other objects), that is, everything that makes up the workcell. The data of the robot and other workcell equipment are usually present in the form of a CAD model, and the workpiece entities can alternatively be obtained from a coordinate measuring machine or from a 3D scanning process. The entire program package of the robotic device, including its paths and actions, is then prepared in the offline mode of the software environment, while the robot concerned can be used for other tasks. The offline method allows the implementation of computation processes and thus provides tools for path optimization. Having the program created in a graphical software environment also enables launching simulations and visualizing the future robot performance [3].
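To illustrate the data flow that both approaches share, from recorded or CAD-derived target points to executable motion statements, the following minimal sketch turns a list of Cartesian points into linear move commands. It is purely illustrative: the point values, speed and zone data are assumptions, and the emitted text is only RAPID-like pseudocode rather than a complete, loadable robot program.

```python
# Illustrative only: converts recorded or CAD-derived Cartesian points
# into simple RAPID-like linear motion statements. The point values,
# speed ("v100") and zone ("z10") data are assumed for the example.

from typing import List, Tuple

Point = Tuple[float, float, float]  # x, y, z in millimetres


def points_to_move_commands(points: List[Point],
                            speed: str = "v100",
                            zone: str = "z10",
                            tool: str = "tool0") -> List[str]:
    """Emit one linear move statement per target point."""
    commands = []
    for x, y, z in points:
        # Orientation and configuration data are fixed here for brevity;
        # a real program carries a quaternion and an axis configuration
        # for every target.
        commands.append(
            f"MoveL [[{x:.1f},{y:.1f},{z:.1f}],[1,0,0,0],[0,0,0,0],"
            f"[9E9,9E9,9E9,9E9,9E9,9E9]], {speed}, {zone}, {tool};")
    return commands


if __name__ == "__main__":
    recorded_path = [(400.0, 0.0, 300.0), (400.0, 150.0, 300.0),
                     (400.0, 150.0, 150.0)]
    for line in points_to_move_commands(recorded_path):
        print(line)
```

In online programming such points come from jogging and are stored by the controller itself; in graphical offline programming they are generated from the CAD geometry, and this kind of translation is handled by the programming environment.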
2.3 Robot Programming with the Use of Augmented Reality

Besides online and offline programming, there are other possibilities for making robot programming more visual and effective. A team of researchers at the Mechanical Engineering Department of the Faculty of Engineering, National University of Singapore, has developed a system for programming robots using elements of augmented reality. This can be understood as a form of offline programming, but the ideas behind it are advanced enough that it can be considered to go beyond conventional programming methods.

The system, called RPAR-II (Figure 2), includes a manipulator arm, an electrical gripper, a robot controller, a desktop PC, a display unit, a stereo camera and a hand-held device with a marker. In this solution the kinematics and dynamics of the robot are taken into account while the real environment is augmented with the virtual robot. An interaction device is used to guide the virtual robot along the desired path. The system includes the definition of initial and final points together with complex mathematical computation for the optimization of robot paths. This means that once the geometric path is obtained, the trajectory planning process deals with the kinematic and dynamic constraints of the robot. Both planned and simulated paths can be displayed simultaneously in the real working environment, so the difference between them can be seen and evaluated. The implementation of elements of augmented reality in programming processes is interesting mainly because it opens up the future possibility of considering additional constraints (velocity, acceleration) and increasing the level of human-robot interactivity. The main remaining issue with this method is low accuracy, as the dimensional data of objects and spatial entities depend on the tracking system [5].

Figure 2. Unconventional programming with the use of elements of augmented reality: the RPAR-II system, Singapore [4]

3. Controlling an Experimental Robotic Workcell with an ABB Robotic Device

3.1 Hardware Characteristics of the Workcell

The robotic device used at the experimental workplace designed at the Faculty of Manufacturing Technologies (FMT) in Presov is the compact robot IRB 140 from the ABB company (Figure 3). It is a machine with 6 degrees of freedom and a unique combination of high acceleration, working radius and payload. It is the fastest robot in its class, with good repeatability of position and very good trajectory accuracy (±0.03 mm). With a load capacity of 6 kg it can manipulate objects at distances of up to 810 mm. It can be installed on the floor or on the wall. Currently it is situated on a floor stand, with the intention of realizing a sliding mechanism for easy changing of position, or possibly a table that would be freely movable around the room.

Figure 3. Compact robotic device IRB 140 produced by ABB

As for the application area, the robot in this laboratory is used as a manipulator between different machining sequences. It can also be used for welding, assembly, cutting of material, packaging tasks, batching, machine servicing, etc. The initial position of the device is the place from which it can reach the working areas of both machining devices. These are didactic EMCO manufacturing devices intended for basic milling and turning operations. In relation to the programming method and the verification of programming results, we have models of all the objects present: the model of the robot is available in STL form on the Internet, and the models of the mill and the lathe were created in the CAD module of the ProEngineer system.
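A simple way to keep such a workcell description together for later loading into the visualization environment of Section 4 is a small scene list that pairs each model file with its placement. The sketch below is only an assumption about how this could be organized; the file names and coordinates are invented placeholders, not the real FMT data.

```python
# Illustrative workcell description for loading the object models into a
# visualization environment such as a Blender scene. File names and poses
# are invented placeholders, not the real data of the FMT workcell.

from dataclasses import dataclass
from typing import Tuple


@dataclass
class WorkcellObject:
    name: str
    model_file: str                          # STL or exported CAD geometry
    position_mm: Tuple[float, float, float]  # placement in the cell frame
    rotation_deg: Tuple[float, float, float]


WORKCELL = [
    WorkcellObject("IRB 140 robot", "irb140.stl", (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)),
    WorkcellObject("EMCO mill", "emco_mill.stl", (900.0, 400.0, 0.0), (0.0, 0.0, 90.0)),
    WorkcellObject("EMCO lathe", "emco_lathe.stl", (900.0, -400.0, 0.0), (0.0, 0.0, -90.0)),
]

for obj in WORKCELL:
    print(f"{obj.name}: {obj.model_file} at {obj.position_mm}")
```

Entries of this kind would drive the import of the STL and CAD geometry into the Blender scene described in Section 4.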
3.2 Software Characteristics of the Workplace

From the software point of view, the robot is delivered with an application called RobotStudio, which is a typical offline programming tool with integrated models of all the virtual machines and devices from the ABB company. After arranging all the inserted objects and harmonizing their coordinate systems, the programmer defines the key positions of the robot effector, which serve as the input for the creation of paths for the individual moves. The determined actions (types of movements) and operations are then translated by RobotStudio into program syntax suitable for robot control.

Programs for the EMCO manufacturing devices are created in the typical way: the machining sequences prepared in ProEngineer are translated by a postprocessor into the final NC machining programs, which run the machines under the Fanuc 0 control system. These machines are meant especially for educational purposes and, as the smallest in their class, they do not have their own control unit. Their control is simulated in a regular Windows interface by an application called WinNC. This actually presents an advantage from the viewpoint of easier communication and data interchange between the controls of the robot, the mill and the lathe.

Figure 4. Offline programming in the software environment RobotStudio

4. Experimental Robotic Workcell Utilizing Elements of AR

4.1 Visualization Features

In many manufacturing activities, elements of AR are applied by software overlay of the geometries of virtual models onto the real environment recorded with camera sensors [6]. This method is effective, but it requires watching a monitor that lies outside the normal working area, which sometimes leads to problems with synchronizing working activities and movements. To address this issue a new visualization unit was created at FMT in Presov. Its philosophy lies in the creation of a new mixed working environment. Thanks to the use of a half-silvered surface it provides better interactivity of the application and increases user comfort directly in the active working environment of the programmer [7-9].

The surface of the glass is either half-silvered or covered with a semi-transparent foil that creates a reflection and at the same time allows a view into the working environment with no obstruction or decrease in view quality. This commonly available kind of mirror is often used in gaming, medicine or business presentations. By optically combining two seemingly different views it creates an ideal platform for a realistic spatial effect (Figure 5). The image is displayed by projecting it, mirrored, onto the reflective surface; this is done with an LCD monitor placed above the working area, out of the viewing angle of the worker (programmer). A disadvantage of this visualization variant is that the quality and character of the created view depend on a fixed viewing point. This drawback was addressed by a combined view running in the open-source system Blender, in which a script was activated for tracking the user's face.

Figure 5. Two different positions of the robotic device: the combination of real and virtual images displayed using the half-silvered mirror
4.2 Face Tracking

Face tracking uses libraries and program elements from the freely accessible OpenCV library. This is a specialized library for the creation of computer vision applications, with the possibility of freely activating partial visualization scripts. It can be used on different platforms (Windows, Linux, MacOS, even iPhone). OpenCV was originally developed by Intel in 1999 for solving tasks based on complicated algorithms and logical operations in the areas of computer vision and artificial intelligence (AI). A solution using the face-tracking technique is well suited to cases that require the coordination of a displayed view with the motion of the face (or body).

The monitoring process starts with the activation of the face-tracking script and the launching of the stream of video images recorded in real time by the web camera. These images are processed by the script, which automatically identifies and selects the user's face in the observed area (using a face pattern). The script places a rectangle over the detected face and uses it to determine the geometrical centre of the face (the intersection of the diagonals defines the virtual coordinate system of the user). In Blender, the numerical values of this point are connected to the attributes of the imaging section, and the script positions the displayed image according to them (Figure 7). A minimal code sketch of this detection step is given at the end of this section.

Figure 6. The position of the Blender camera adjusts according to the user's face

Figure 7. The principle of face tracking applied to the working area

4.3 Additional Inputs and Outputs

Another way to increase the quality of the implementation of elements of AR in the real working space (robotic workcell) is to use audio inputs and outputs. The programmer of the robotic device can receive audio instructions and information, for example about threats of collisions detected on virtual objects, about the violation of a safety zone, about the start and end of the motion or activity of a real or virtual robot, or about reaching or recording the desired position. In addition to receiving information, the programmer can also give voice commands. By simply activating a microphone on a regular PC (the audio input can be linked to the Blender application), his voice becomes an interactive part of the work that can be used for the immediate and more comfortable execution of partial programming functions.

Together with the audio there is the possibility of direct text output of information in the view displayed on the half-silvered glass plate. Different text items (coordinates of the required point, position and state of the effector, collisions, important positions, warnings) can simply be written directly into the programmer's field of view, in the desired form and in real time, according to the connections defined in Blender.
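The detection step described in Section 4.2 can be sketched roughly as follows. This is a minimal, illustrative example rather than the actual FMT script: it assumes the OpenCV Python bindings with the bundled Haar frontal-face cascade and the default web camera, and it only prints the normalized face centre that the Blender side would map to a camera offset.

```python
# Minimal, illustrative sketch of the face-tracking step from Section 4.2.
# Assumes the OpenCV Python bindings, the bundled Haar frontal-face cascade
# and camera index 0; in the real application the normalized face centre is
# passed on to the Blender script that shifts the virtual camera/view.

import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
capture = cv2.VideoCapture(0)

while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    if len(faces) > 0:
        # Take the largest detected rectangle as the user's face and mark it.
        x, y, w, h = max(faces, key=lambda r: r[2] * r[3])
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        # Geometric centre of the rectangle, normalized to the frame size;
        # this point acts as the "virtual coordinate system of the user".
        cx = (x + w / 2) / frame.shape[1]
        cy = (y + h / 2) / frame.shape[0]
        print(f"face centre: ({cx:.2f}, {cy:.2f})")
    cv2.imshow("face tracking", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc ends the loop
        break

capture.release()
cv2.destroyAllWindows()
```

In the workcell application these values are read by the Blender script, which offsets the virtual camera so that the reflected image stays aligned with the programmer's viewpoint (Figure 6).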
5. Programming and Visual Verification of the Control Program of a Robot with the Use of Elements of AR

The concept of programming with the described method is based on the creation of a displaying unit and on the interconnection of several software environments. The displaying unit includes the construction (static frame), the half-silvered glass (reflection and transmission), the LCD display (projection of the image onto the glass from a point outside the user's field of view), the camera (detection of the motion of the programmer's face) and the PC (synchronization of image capture and display, and running of the Blender application itself) [10].

The programmatic interconnection of the individual software environments is so far only partially realized. For full functionality, and thus for real online programming from behind the imaging glass with the creation of augmented reality, some additional programming work is required. This is based on the principle of the mutual exchange of data coming from the different applications. Data from the RobotStudio application must be available to the main imaging and computation application running in Blender. The Blender script, in turn, has to generate (for example through RobotStudio) output in the form of a program in robot syntax. A suitable improvement would also make the data from the control systems of the machining devices available for the calculations of the Blender application; this is supported by the simulated control of the mill and the lathe in the Windows environment. The concept of the overall combined environment of the robotic workcell and the composition of its particular components, as developed at FMT in Presov, is presented in Figure 8.

Figure 8. Schematic view of the overall application concept

Thanks to the combination of real and virtual data, the programmer has in his field of view an image combined from real objects, such as the devices, lathe, mill, etc., and from virtually inserted models, for example a robot, a group of robots or another machine [11]. The advantage of such imaging also lies in the possibility of using it for the design and layout of the robotic workplace: the designer or constructor can visually check, in real time, the suitability of his proposal, the placement of the machines and robots, the working radii, etc. In Figure 9 another production device is virtually inserted into the workcell. The programmer can use the virtual space of the Blender application even for verifications in which potential problems are signalled by different colours or combined with an audio signal.

Figure 9. Testing of the robot reach area with respect to a virtually inserted lathe

Figure 9 shows the verification of the working range of the robot with respect to another machining device that is not currently installed.
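A first, purely geometric version of such a working-range check can be sketched as follows. The 810 mm radius is the catalogue value quoted in Section 3.1, while the base and target coordinates are invented for the example; a real verification must also respect the joint limits, the unreachable zone close to the robot base and the required tool orientation.

```python
# Rough, illustrative reach check for a virtually inserted machine: tests
# whether a candidate handling point lies inside the quoted 810 mm working
# radius of the IRB 140. The coordinates are invented placeholders; a real
# check must also consider joint limits, the zone close to the base that
# cannot be reached and the required tool orientation.

import math

WORK_RADIUS_MM = 810.0  # catalogue working radius of the IRB 140 (Section 3.1)


def within_reach(base, target, radius=WORK_RADIUS_MM):
    """True if the target point lies inside a sphere of the given radius
    centred on the robot base."""
    return math.dist(base, target) <= radius


robot_base = (0.0, 0.0, 0.0)          # assumed base reference point (mm)
lathe_point = (600.0, 250.0, 300.0)   # candidate pick point at the virtual lathe (mm)

print(within_reach(robot_base, lathe_point))  # True for these example values
```

In the combined view, a failed check of this kind is exactly the sort of problem that can be signalled to the programmer by colour or by an audio signal, as described above.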
The creation and simulation of control programs is thus possible even for a workplace that has not yet been built, or in cases where the layout is to be changed [12].

6. Conclusion

This research focuses on the improvement of important features of robot control, concerning both programming and simulation. Details of the research and the related concept are explained using the example of the experimental robotic workcell situated at the Department of Manufacturing Technologies, Faculty of Manufacturing Technologies in Presov, Slovakia [13]. The idea is based on the utilization of a newly created displaying unit built around half-silvered glass, fixed in a frame situated between the programmer and the workcell, which reflects and simultaneously transmits light. Looking into the workplace through this glass, the programmer can see the real objects behind it in combination with the virtual ones inserted by the software application created in Blender. This can be considered a new approach among the current methods of robot programming and simulation, as it stands on the border between online and offline programming (the programmer is physically in the workcell, but the programming tasks are realized mostly virtually) and tries to use the advantages of both. It is a way of making robot programming more comfortable, more visual and easier. Future improvements, in the form of better inter-software communication and solutions for improving accuracy, could bring very successful results.

7. Acknowledgments

This work was supported by the Ministry of Education, Science, Research and Sport of the Slovak Republic under contract VEGA 1/0032/12, grant KEGA No. 002TUKE-4/2012 and ITMS project 26220220125.

8. References

[1] P. Zengxi, Recent Progress on Programming Methods for Industrial Robots, Robotics and Computer-Integrated Manufacturing, Vol. 28, No. 2, 2012, pp. 87-94, ISSN 0736-5845.
[2] V. Bottazzi, Off-line Programming Industrial Robots Based in the Information Extracted From Neutral Files Generated by the Commercial CAD Tools (Industrial Robotics: Programming, Simulation and Applications), Pro Literatur Verlag, 2006, pp. 349-364, ISBN 3-86611-286-6.
[3] N. R. Cazarez-Castro, L. T. Aguilar, O. Castillo, Fuzzy logic control with genetic membership function parameters optimization for the output regulation of a servomechanism with nonlinear backlash, Expert Systems with Applications, Vol. 37, No. 6, 2010, pp. 4368-4378, ISSN 0957-4174.
[4] S. K. Ong, Interactive Robot Trajectory Planning and Simulation Using Augmented Reality, Robotics and Computer-Integrated Manufacturing, Vol. 28, No. 2, 2012, pp. 227-237, ISSN 0736-5845.
[5] S. K. Ong, A novel AR-based robot programming and path planning methodology, Robotics and Computer-Integrated Manufacturing, Vol. 26, No. 3, 2010, pp. 227-237, ISSN 0736-5845.
[6] S. L. Cardenas-Maciel, O. Castillo, L. T. Aguilar, Generation of walking periodic motions for a biped robot via genetic algorithms, Applied Soft Computing, Vol. 11, No. 8, 2011, pp. 5306-5314, ISSN 1568-4946.
[7] O. Castillo, R. Martinez-Marroquin, P. Melin, F. Valdez, J. Soria, Comparative study of bio-inspired algorithms applied to the optimization of type-1 and type-2 fuzzy controllers for an autonomous mobile robot, Information Sciences, Vol. 192, 2012, pp. 19-38, ISSN 0020-0255.
[8] J. N. Marcincin, Application of the Virtual Reality Modeling Language for Design of Automated Workplaces, Proceedings of World Academy of Science, Engineering and Technology, Vol. 25, 2007, pp. 160-163.
[9] J. N. Marcincin, M. Doliak, S. Hloch, et al., Application of the Virtual Reality Modelling Language to Computer Aided Robot Control System ROANS, Strojarstvo, Vol. 52, No. 2, 2010, pp. 227-232.
[10] J. N. Marcincin, P. Brazda, M. Janak, et al., Application of Virtual Reality Technology in Simulation of Automated Workplaces, Technical Gazette, Vol. 18, No. 4, 2011, pp. 577-580.
[11] J. N. Marcincin, J. Barna, M. Janak, L. N. Marcincinova, V. Fecova, Utilization of Open Source tools in assembling process with application of elements of augmented reality, Proceedings of VRCAI 2011: ACM SIGGRAPH Conference on Virtual-Reality Continuum and its Applications to Industry, Hong Kong, 2011, pp. 427-430.
[12] J. N. Marcincin, J. Barna, M. Janak, V. Fecova, L. N. Marcincinova, Composite lay-up process with application of elements of augmented reality, The Engineering Reality of Virtual Reality, Vol. 8289, 2012, pp. 1-6, ISSN 0277-786X.
[13] J. N. Marcincin, J. Barna, Augmented virtual reality applications, Proceedings in Manufacturing Systems, Vol. 6, No. 2, 2011, pp. 101-104, ISSN 2067-9238.