Draw Your Task: Robot Programming by Demonstration
The goal of this project is to develop a prototype for robot programming by demonstration and to evaluate its potential for Swiss industry. The aim is to simplify robot programming and thus enable intuitive and fast re-tasking of robots.
Factsheet
- Schools involved School of Engineering and Computer Science
- Institute(s) Institute for Human Centered Engineering (HUCE)
- Research unit(s) HUCE / Laboratory for Computer Perception and Virtual Reality
- Funding organisation Others
- Duration (planned) 01.08.2021 - 31.01.2022
- Head of project Prof. Dr. Sarah Dégallier Rochat
- Project staff Sylvain Barthe, Marino von Wattenwyl, Christian Wyss, Charly Charles Eric Blanc, Lucas Manuel Renfer
- Partner microtech booster
- Keywords HMI, Tangible Programming, No-Code Programming, Learning by Demonstration, Augmented Reality, Collaborative Robotics
Situation
Some manual tasks are particularly difficult to automate because they require a high level of dexterity and task expertise. These skills are often difficult to translate mathematically into robot trajectories. This is particularly problematic for applications with high product diversity and small production volumes, where the programming effort is too high for automation to be profitable. A new approach to robot programming, which we call tangible programming, eases the process by allowing trajectories to be defined through direct demonstration in the workspace. A 3D-tracked handheld device collects information about the demonstrated movement, and the resulting trajectory is displayed using augmented reality.
Course of action
In this project, we propose to develop a setup that allows a robot or a machine to be programmed by interacting directly with the workspace (tangible programming). More precisely, a trajectory can be defined simply by drawing it in the workspace and by interacting directly with the objects. The goal is to develop a system that can be interfaced with any existing machine. The setup consists of a 3D-tracked handheld device, an augmented reality (AR) system (for instance, glasses) and a tablet. With this setup, movements can be programmed by demonstration and the step of translating them into machine trajectories is no longer necessary, which saves time and simplifies the process. It also means that no coding is required. Typical industrial applications include welding, gluing/dispensing, cutting, screwing/riveting, engraving, grinding, spray-painting, inspection, and pick-and-place/palletizing.
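The project does not publish implementation details, but the recording step can be illustrated with a minimal sketch: the handheld device is sampled at a fixed rate while the operator draws the path, and the samples become the waypoints that are later visualized and edited. The `read_tracker_pose` callback and the `Waypoint` structure below are hypothetical placeholders; the real interface depends on the tracking system used.

```python
import time
from dataclasses import dataclass


@dataclass
class Waypoint:
    """One recorded pose of the handheld device (position in metres, orientation as a quaternion)."""
    t: float                                  # seconds since the start of the demonstration
    xyz: tuple[float, float, float]           # position in the tracker frame
    quat: tuple[float, float, float, float]   # orientation as (w, x, y, z)


def record_demonstration(read_tracker_pose, duration_s: float = 5.0,
                         rate_hz: float = 60.0) -> list[Waypoint]:
    """Sample the tracked device at a fixed rate while the operator draws the path.

    `read_tracker_pose` is a hypothetical callback returning (xyz, quat) from the
    3D-tracking system; it stands in for whatever SDK the tracker provides.
    """
    waypoints: list[Waypoint] = []
    period = 1.0 / rate_hz
    start = time.monotonic()
    while (now := time.monotonic()) - start < duration_s:
        xyz, quat = read_tracker_pose()
        waypoints.append(Waypoint(t=now - start, xyz=xyz, quat=quat))
        time.sleep(period)
    return waypoints
```

The recorded list of waypoints is the only programming artifact the operator produces; everything downstream (AR visualization, editing, conversion to machine trajectories) operates on this data rather than on code.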
Result
A prototype was developed to better understand the feasibility, desirability and viability of the proposed solution. Three different tracking devices were integrated and tested. In the end, a solution based on the tracking system from Attracsys was selected because of the high precision requirements. The drawn trajectory can be visualized and edited using augmented reality on two different types of devices: an Android tablet and a Microsoft HoloLens. The trajectory can then be executed with a virtual robot (FANUC CRX10iA or UR5). Execution on the real robot is in progress.
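Since execution on the real robot was still in progress, the following is not the project's implementation. It is only a sketch of one common way to replay such a trajectory on a UR5: stream it as a URScript program to the controller's secondary client interface (port 30002). The helper assumes the poses have already been transformed from the tracker frame into the robot base frame; the FANUC CRX10iA would require a different interface.

```python
import socket


def send_waypoints_ur(host: str, poses: list[tuple[float, ...]],
                      a: float = 0.5, v: float = 0.1, blend_r: float = 0.002) -> None:
    """Stream a drawn trajectory to a UR controller as a single URScript program.

    `poses` are (x, y, z, rx, ry, rz) tool poses in metres and axis-angle radians,
    expressed in the robot base frame (the tracker-to-robot calibration that
    produces them is outside the scope of this sketch).
    """
    lines = ["def drawn_path():"]
    for x, y, z, rx, ry, rz in poses:
        # movel performs a linear move in Cartesian space; r blends between waypoints.
        lines.append(f"  movel(p[{x:.4f},{y:.4f},{z:.4f},{rx:.4f},{ry:.4f},{rz:.4f}],"
                     f" a={a}, v={v}, r={blend_r})")
    lines.append("end")
    program = "\n".join(lines) + "\n"
    # Port 30002 is the UR secondary client interface, which accepts URScript programs.
    with socket.create_connection((host, 30002), timeout=2.0) as s:
        s.sendall(program.encode("ascii"))
```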
Looking ahead
During the project, three design thinking sessions were organized to discuss the relevance of the approach for industry and to identify the most promising applications. Several use cases were defined, such as soldering or gluing of complex surfaces, watch decoration, as well as simpler tasks such as sewing or precise positioning. Several follow-up projects are planned to further develop the proposed solution.