NASA engineers have developed a system to control a robotic arm remotely using Kinect 2 and Oculus Rift.
NASA's Jet Propulsion Laboratory (JPL) was looking for a natural way to control a robot in space. After several tests, NASA engineers may have found the ideal solution: Kinect 2. The new sensor, sold together with the Xbox One, allows operators to control the robotic arm remotely and with great accuracy.
JPL had already used the sensor manufactured by Microsoft, so it knows the device's characteristics well. When the Redmond company opened the Kinect for Windows program to developers, NASA engineers signed up to get the second version of the sensor. The dev kit was delivered in November and, after a few days of study, the team created a system that connects the Kinect 2 to the Oculus Rift, the well-known head-mounted display for virtual reality.
The JPL scientists believe the Kinect 2 and Oculus Rift combination is the most immersive interface they have ever built. The second-generation Kinect offers precision far superior to that of the first model: it can track the open and closed states of the hands and the rotation of the wrists. Thanks to the numerous tracking points and the additional rotational degrees of freedom, the robot arm can be controlled remotely with great accuracy.
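The article does not describe JPL's software, but the core idea, mapping the tracked hand state and wrist rotation to robot-arm commands, can be sketched roughly as follows. All names, fields, and thresholds here are hypothetical; a real system would read this data from the Kinect SDK and feed a proper inverse-kinematics solver.

```python
from dataclasses import dataclass

@dataclass
class HandFrame:
    """Hypothetical per-frame tracking data a Kinect-2-style sensor provides."""
    hand_closed: bool      # open/closed state of the hand
    wrist_roll_deg: float  # rotation of the wrist around the forearm axis
    wrist_pos: tuple       # (x, y, z) wrist position in metres

def frame_to_command(frame: HandFrame) -> dict:
    """Map one tracking frame to a simple robot-arm command.

    Closed hand -> close the gripper; wrist roll -> end-effector roll
    (clamped to a plausible joint limit); wrist position -> target
    position for the arm's motion planner.
    """
    return {
        "gripper": "close" if frame.hand_closed else "open",
        "roll_deg": max(-180.0, min(180.0, frame.wrist_roll_deg)),
        "target_xyz": frame.wrist_pos,
    }

# A closed hand with an out-of-range wrist roll produces a clamped command.
cmd = frame_to_command(HandFrame(True, 200.0, (0.3, 0.1, 0.5)))
```

The clamping step stands in for the joint limits any physical arm imposes; the interesting part of the real system is that hand open/close and wrist roll are extra channels the first-generation Kinect could not track at all.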
Using the Oculus Rift, the operator can "dive" into the environment and observe the scene with stereoscopic vision. The robot thus appears as an extension of the human body, making it easier to locate objects in the real world. Remote control does introduce latency, a delay between sending a command and its execution, but this small delay can be compensated for with a graphical indication on the screen.
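The article only says that a graphical indication compensates for the control latency. One common approach, shown here as a minimal sketch with a hypothetical fixed round-trip delay, is to render the commanded pose immediately as a "ghost" while the confirmed pose from the robot arrives a few frames later:

```python
from collections import deque

class LatencyIndicator:
    """Track both the commanded pose (drawn immediately as a 'ghost')
    and the confirmed pose, which arrives after a fixed round-trip delay.

    The fixed delay and the pose format are assumptions for this sketch;
    a real teleoperation link would have variable, measured latency.
    """

    def __init__(self, delay_frames: int):
        # Pipeline of commands still "in transit" to the remote robot.
        self._pipeline = deque([None] * delay_frames)

    def step(self, commanded_pose):
        # The ghost pose is what the operator just commanded; the
        # confirmed pose is what the robot reports, delay_frames later.
        self._pipeline.append(commanded_pose)
        confirmed_pose = self._pipeline.popleft()
        return commanded_pose, confirmed_pose

ind = LatencyIndicator(delay_frames=2)
ind.step((0, 0))  # ghost shown at (0, 0); nothing confirmed yet
ind.step((1, 0))
ind.step((2, 0))  # confirmed pose (0, 0) now trails the ghost by 2 frames
```

Drawing both poses lets the operator keep moving at full speed while still seeing exactly how far behind the robot is, which is the compensation the on-screen indication provides.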
NASA engineers plan to use the same technology to control Robonaut 2, aboard the International Space Station. The humanoid robot would then receive remote commands to perform tasks that are too dangerous for astronauts.