VR for Teleoperating Robots

By CIOReview | Monday, November 13, 2017

Usually, manufacturing jobs require a physical presence to operate machinery.

Recently, researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) demonstrated a virtual reality (VR) system that lets users teleoperate a robot through an Oculus Rift headset. The system places the user in a VR control room with multiple sensor displays, making them feel as if they are inside the robot’s head. Using hand controllers, users match their movements to the robot’s movements to complete various tasks. The team used the Baxter humanoid robot from Rethink Robotics, and the system is compatible with the HTC Vive headset as well.

There are two main approaches to using VR for teleoperation. In the first, the direct model, the user’s view is coupled directly to the robot’s state; a delayed signal can cause nausea and headaches, and the user’s viewpoint is limited to a single perspective. In the second, the cyber-physical model, the user is kept separate from the robot and interacts with a virtual copy of the robot and its environment. CSAIL’s system addresses the delay problem because the user continuously receives visual feedback from the virtual world, and it also avoids the cyber-physical model’s sense of being disconnected from the robot.
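To make the latency point concrete, here is a minimal, hypothetical sketch (not the CSAIL implementation) of the idea that a VR view rendered from a local virtual copy of the robot stays smooth even when feedback from the real robot arrives late; all names, rates, and data structures here are assumptions for illustration.

```python
import queue
import time

# Hypothetical sketch: the VR view is always rendered from a local virtual
# copy of the robot, so a delayed network update only makes that copy slightly
# stale; it never stalls the frames the user actually sees.

state_updates = queue.Queue()                        # delayed joint states from the real robot
virtual_robot = {"joint_angles": [0.0] * 7, "stamp": 0.0}

def render_virtual_scene(robot_copy):
    # Placeholder for drawing the control room and robot model in the headset.
    pass

def control_loop(duration_s=5.0, frame_hz=90.0):
    """Render at headset rate regardless of how late robot feedback arrives."""
    frame_dt = 1.0 / frame_hz
    end = time.time() + duration_s
    while time.time() < end:
        # Apply whatever robot feedback has arrived, however delayed.
        while not state_updates.empty():
            update = state_updates.get_nowait()
            if update["stamp"] > virtual_robot["stamp"]:
                virtual_robot.update(update)
        # The user always sees a smooth, locally rendered virtual world.
        render_virtual_scene(virtual_robot)
        time.sleep(frame_dt)
```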

Using the VR hand controllers, users operate controls in the virtual space to open and close the robot’s grippers and to pick up, move, and retrieve items. While watching a live display of the arm, a user can plan movements based on the distance between the arm’s location marker and their own hand. The human’s space is mapped into the virtual space, and the virtual space is in turn mapped into the robot’s space, giving the user a sense of co-location; a simplified sketch of that mapping follows.
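The following is a minimal, assumed sketch of such a chain of frame mappings using homogeneous transforms; the specific rotations and offsets are illustrative placeholders, not values from the CSAIL system.

```python
import numpy as np

# Hypothetical sketch of the frame mappings described above: a point measured
# in the human's space is mapped into the virtual control room, and from there
# into the robot's workspace.

def make_transform(rotation_z_rad, translation_xyz):
    """Build a 4x4 homogeneous transform: rotation about z, then translation."""
    c, s = np.cos(rotation_z_rad), np.sin(rotation_z_rad)
    T = np.eye(4)
    T[:3, :3] = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
    T[:3, 3] = translation_xyz
    return T

# Assumed, illustrative calibrations (not the published values):
T_virtual_from_human = make_transform(0.0, [0.0, 0.0, 1.2])        # seat the user in the control room
T_robot_from_virtual = make_transform(np.pi / 2, [0.5, 0.0, 0.0])  # place the room at the robot's torso

def human_hand_to_robot(point_in_human_frame):
    """Map a hand position through human -> virtual -> robot space."""
    p = np.append(point_in_human_frame, 1.0)        # homogeneous coordinates
    p_virtual = T_virtual_from_human @ p
    p_robot = T_robot_from_virtual @ p_virtual
    return p_robot[:3]

# Example: where a hand held 40 cm in front of the user ends up in robot space.
print(human_hand_to_robot(np.array([0.4, 0.0, 0.0])))
```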

To test the system, the team teleoperated Baxter through simple tasks such as picking up screws and stapling wires. Users completed these tasks at a much higher success rate than with the direct model.