How Augmented Reality and Cobots Drive the Next Wave of Automation
Augmented reality (AR) technology, usually delivered through special headsets, eyewear, or projections, superimposes data or graphics onto real-world images and uses sensors and cameras to capture the operator's motions for feedback and control. Until now, AR's primary application has been gaming. But as the technology has become commoditized, it is finding a surprising new role in robotics research, and it may soon have a major impact on manufacturing and logistics automation, and eventually even on home and service robots.
Robot programming was traditionally done by writing code, which was time-consuming and expensive. That meant that robots were programmed for a single specific task that they did over and over. Cobots made programming much easier by letting even untrained operators simply move the robot arm as desired and use a teach pendant to set waypoints and actions. This makes programming more intuitive and flexible, so the robot can be quickly reprogrammed for new tasks. But while the robot’s movements are precise and consistent, they’re generally not as smooth or as fast as human movements, and the robot still only knows how to perform the exact tasks it’s been programmed for.
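The waypoint-style programming described above can be sketched in a few lines. This is an illustrative toy, not a real cobot API: the operator hand-guides the arm to a pose, "teaches" it, and the program simply replays the same fixed sequence every cycle.

```python
# Toy sketch of teach-and-replay cobot programming (hypothetical names;
# not an actual robot controller API).

waypoints = []

def teach(pose):
    """Record a pose (e.g. six joint angles) as the operator guides the arm."""
    waypoints.append(list(pose))

def replay():
    """Return the taught program: the same fixed sequence, every cycle."""
    return [list(p) for p in waypoints]

teach([0.0, -1.2, 1.2, 0.0, 1.5, 0.0])   # pick position
teach([0.4, -1.0, 1.0, 0.0, 1.5, 0.0])   # place position
program = replay()
```

The limitation the article points to is visible here: the robot only ever knows the exact poses it was taught, and re-teaching is the only way to handle a new task.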
AR changes the game
AR allows a human operator to get inside the robot's head, so to speak. The operator uses AR to control the robot with natural, smooth movements, giving the robot precise instructions simply by performing the tasks he or she wants the robot to emulate. This approach is ideal for cobots, which let human operators work directly with the robot arm without the interference of safety cages or fencing.
Embodied Intelligence uses AR to help robots learn complex tasks
A new startup called Embodied Intelligence is banking on AR as a way to support machine learning that will greatly expand robots' abilities. In a New York Times article, Embodied founder Pieter Abbeel explains that robot hardware is already nimble enough to mimic human behavior; the challenge is creating software to guide the hardware. The Embodied team has been using a UR3 cobot in its research on tele-operation (using AR to control the robot) to help robots learn new skills without direct programming. Tele-operation gives the robot far more data, far more quickly, than any other process. The robot can apply this data to imitate, develop, and reinforce skills such as picking complex shapes out of bins, performing inconsistent tasks such as plugging in wires or cables, and manipulating deformable and unpredictable objects. Abbeel told IEEE Spectrum that "we've reached a point where we really believe that the time is right to start putting this into practice, not necessarily for a home robot, which needs to deal with an enormous amount of variation, but in manufacturing and logistics."
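The core idea of learning from tele-operation can be sketched as behavioral cloning: demonstrations yield (observation, action) pairs, and a supervised model then imitates the demonstrated mapping. The sketch below is a minimal toy under that assumption, not Embodied Intelligence's actual pipeline; the "policy" is just a linear least-squares fit to simulated demonstration data.

```python
import numpy as np

def collect_demonstrations(n=200, seed=0):
    """Simulate tele-op data: observations x and demonstrated actions y."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1, 1, size=(n, 3))        # e.g. object-pose features
    true_w = np.array([[0.5], [-1.0], [2.0]])  # unknown demonstrator mapping
    y = x @ true_w + rng.normal(0, 0.01, size=(n, 1))  # noisy human actions
    return x, y

def fit_policy(x, y):
    """Least-squares 'policy': action = observation @ w."""
    w, *_ = np.linalg.lstsq(x, y, rcond=None)
    return w

x, y = collect_demonstrations()
w = fit_policy(x, y)
# The fitted policy now reproduces the demonstrated behavior on new inputs.
```

Real systems replace the linear fit with deep networks and add reinforcement learning on top, but the data flow is the same: the more demonstrations tele-operation provides, the better the imitation.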
Embodied Intelligence has already landed $7 million in venture funding, and the race is definitely on. As MIT Technology Review reported, other researchers at Google DeepMind and Kindred AI are making progress in this area as well. And outside the lab, manufacturers and integrators are also exploring AR.
Modern Machine Shop produced this video on ITAMCO's AR solution with UR cobots.
ITAMCO demonstrates prototyping and collaboration applications with cobot and Microsoft HoloLens
ITAMCO, an Indiana-based manufacturer of precision-machined components, recently demonstrated an augmented reality application using a UR robot and a Microsoft HoloLens headset, a wearable computer that builds on depth-sensing technology from the Xbox Kinect, best known from gaming. The HoloLens projects information on top of the robot's real-world actions and allows the operator to control the robot using hand movements. Because the HoloLens has a camera, it can record both real-life and virtual images to share with other individuals; for example, an engineer or operator could demonstrate a robot setup to someone in another plant or department.
Joel Neidig, Business Development & Technology Manager at ITAMCO, says, "I think it's going to bring a lot of collaboration between operators and engineers, even going out to the point-of-use on the manufacturing floor, where the UR robot is being used every day. You can capture workflows and the motion of the robot, and people can record their setups and display some of the virtual models inside the machine before they actually manufacture it."
Using a virtual environment prior to manufacturing could be especially valuable to experiment with setup for expensive parts, or to plan for parts that haven’t been manufactured yet. The AR system can show the user how the part will be loaded in the machine without having the actual parts on-hand. “It’s really important to have more prototype tools like this throughout the industry, and being able to rapidly prototype and test your design,” Neidig explains. This type of system will allow engineers, manufacturers, and operators to collaborate and to make changes so that when parts go to production, the processes are as efficient as possible.
Example of 3D graphics seen through ITAMCO's HoloLens.
Joel Neidig, Business Development & Technology Manager at ITAMCO, commanding the UR5 cobot through HoloLens.
The operator continues to follow the instructions shown by the Light Guide system, interacting with it by swiping a hand over virtual “buttons.”
Tianhao Zhang, research scientist at Embodied Intelligence, with a UR3 cobot and Robotiq gripper.
At this year’s Automate show in Chicago, Kubica Corporation let attendees assemble the inside panel of a car door in an AR-assisted process.
Neidig looked specifically for a collaborative robot so that people using the AR system could safely stand next to the robot while it was in action, without being separated by a safety cage. The robot needed to be lightweight enough to be easily moved, and ease of integration was also key. Neidig says, “We chose the UR robot for this application because it’s an open platform. We can communicate with Python scripts and secure sockets, and it’s got a nice Ethernet port that’s already set up. UR brings it all together, and just being intuitive, it’s very easy to maneuver around, and we like the platform as a whole.”
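The "Python scripts and sockets" access Neidig mentions can be sketched as follows. UR controllers accept URScript commands as newline-terminated strings over a TCP socket (the secondary client interface, conventionally port 30002); the IP address below is a placeholder for your own robot, and this is a minimal sketch rather than ITAMCO's actual integration code.

```python
import socket

ROBOT_IP = "192.168.0.10"   # placeholder; set to your robot's address
URSCRIPT_PORT = 30002       # UR secondary client interface

def movej_command(joints_rad, a=1.2, v=0.25):
    """Format a URScript movej command for six joint angles (radians)."""
    joints = ", ".join(f"{q:.4f}" for q in joints_rad)
    return f"movej([{joints}], a={a}, v={v})\n"

def send_command(cmd, ip=ROBOT_IP, port=URSCRIPT_PORT):
    """Open a socket to the controller and send one URScript line."""
    with socket.create_connection((ip, port), timeout=5) as s:
        s.sendall(cmd.encode("ascii"))

cmd = movej_command([0.0, -1.57, 1.57, 0.0, 0.0, 0.0])
# send_command(cmd)  # uncomment with a reachable robot on the network
```

This openness is what makes it straightforward to bolt an AR front end onto the robot: the HoloLens application only has to translate hand gestures into script strings like the one above.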
Kubica integrates Light Guide Systems projection for assembly assistance and QA
The combination of AR technology and cobots can also bring a whole new level of collaboration to the table. Kubica Corporation, a Michigan-based engineering firm, recently demonstrated an AR automobile door panel assembly program using a UR10 robot and a Light Guide Systems projection system. Carol Choma, Operations & New Business Development Manager at Kubica, explains the application, saying, “This is a great example of a true collaborative cell work environment where the UR10 robot from Universal Robots is working with the operator at an assembly cell for the automotive industry.”
In this application, the Light Guide system projects assembly instructions directly onto the work environment. The operator swipes his hand over the virtual “start button” projection and watches for additional directions as the robot begins its process. The projection highlights the robot’s actions and prompts the operator for his tasks. The projection guides the proper assembly process by lighting up and color-coding the path for the operator to install a wire harness accurately while the robot is working on another part of the assembly.
The operator can respond to quality control messages, such as a missing pin, and use the projection system’s virtual controls to instruct the robot on additional processes. Once all processes are complete, the projection system takes a picture of the assembly for traceability purposes.
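The cell workflow just described can be modeled as a simple step sequence: the projection prompts the operator, the operator confirms each step (the "swipe"), a quality check can halt the process, and a traceability photo closes it out. Step names and structure below are illustrative, not Kubica's actual software.

```python
# Hypothetical sketch of the projection-guided assembly cell flow.

STEPS = [
    "project_start_button",
    "robot_begins_process",
    "guide_wire_harness_install",
    "quality_check",               # e.g. flag a missing pin for the operator
    "capture_traceability_photo",
]

def run_cell(confirm):
    """Run each step; confirm(step) models the operator's swipe/response."""
    log = []
    for step in STEPS:
        if not confirm(step):
            log.append(f"{step}: operator flagged an issue")
            break
        log.append(f"{step}: done")
    return log

log = run_cell(lambda step: True)  # all steps confirmed
```

The value of the projection system is that this sequencing lives in the light guidance itself, so the operator never has to consult a separate screen or manual.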
AR research is still in its early stages but promises to expand the use of robots into more complex applications, improve quality and consistency, and increase opportunities for collaboration with human workers.
Ready to get your team onboard the Automation Train? Find out how in our free ebook.