Virtual Reality (VR) could be one of the most overhyped technologies since the introduction of 3D televisions. However, Juliano Pinto would be the first to rebut that criticism. Pinto, a paraplegic, is part of a new brain-machine interface study that reteaches the brain to walk using virtual reality and robotic exoskeletons. He is one of eight participants in a study by the non-profit Walk Again Project, and remarkably, each participant has regained partial sensation and muscle control in their lower limbs.
If the combination of VR and robotics can retrain a neural network as complex as the human brain, what could its impact be on future robotic control systems? This week alone, I was pitched two applications that pair VR headsets with robots; if two crossed my desk in a single week, there are likely hundreds more under development. While this article highlights a few of the more prominent research areas at the convergence of these two emerging technologies, the video below by 219 Design, a Silicon Valley product design group, does an excellent job of explaining the opportunity:
VR offers human controllers the opportunity to deploy semi-autonomous robots in places that are simply too dangerous for organic life, such as power plants, space, and large industrial complexes. At Johns Hopkins’ Computational and Interactive Robotics Laboratory, researchers are developing an Immersive Virtual Robotics Environment (IVRE) that “enables a user to instruct, collaborate and otherwise interact with a robotic system either in simulation or in real-time via a virtual proxy.” IVRE’s first application targets industrial robots, which are often caged off from humans for safety reasons. These machines rely on a sophisticated network of hardware, software, and sensors to ensure they do not kill a human co-worker in close proximity. Using VR, a human operator could run the robot on the assembly line, repair faults, or manage operations from a safe distance.
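The virtual-proxy idea can be sketched in a few lines of code: the operator manipulates a stand-in object in VR, and the same commands drive whichever backend is attached, so a task can be rehearsed in simulation before touching real hardware. This is a hypothetical illustration of the concept only; the class and method names are mine, not IVRE's:

```python
class SimulatedArm:
    """A stand-in robot backend used for rehearsal in simulation."""
    def __init__(self):
        self.pose = (0.0, 0.0, 0.0)

    def move_to(self, pose):
        self.pose = pose
        return self.pose


class VirtualProxy:
    """Relay operator commands to either a simulated or a real robot.

    The VR user manipulates the proxy; the proxy logs each command
    and forwards it unchanged to the attached backend, so swapping
    simulation for hardware requires no change on the operator's side.
    """
    def __init__(self, backend):
        self.backend = backend  # anything exposing move_to(pose)
        self.command_log = []

    def move_to(self, pose):
        self.command_log.append(pose)
        return self.backend.move_to(pose)


# The operator "moves" the proxy in VR; the simulated arm follows.
proxy = VirtualProxy(SimulatedArm())
proxy.move_to((0.1, 0.2, 0.3))
```

Because the proxy only depends on a `move_to` interface, the same session could later be replayed against real hardware, which is the appeal of rehearsing dangerous interventions virtually first.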
The biggest game changer for IVRE’s technology could be disaster recovery, as in the 2011 Fukushima meltdown. Robots were deployed inside the Fukushima Nuclear Power Plant during the meltdown, but they were driven by remote controls and cameras and proved less than effective at shutting off critical valves. VR controls could have let engineers open doors and turn off valves as if they were standing inside the plant. Unfortunately, these technologies were not available in 2011; the Oculus Rift and Microsoft’s HoloLens had yet to be invented. Instead, a number of the plant’s workers were injured in the disaster, including engineers exposed to dangerously high doses of radiation.
This past month, researchers at Georgia Tech developed a 3D system that requires no lenses to overcome one of the biggest “grasp challenges” in robotics – picking up a pen. Unlike the Hopkins examples above, the Georgia Tech team, led by Professor Sonia Chernova, built a cloud-based point-and-click web interface. Chernova’s method, “constrained positioning,” “intelligently limits the amount of degrees of freedom that a user needs to position.” The user selects only a grasp point, an approach angle, and a grasp depth; the system automatically calculates the rest. By taking the pain of manually specifying every degree of movement out of the equation, robotics can expand into new business applications faster, without requiring a roboticist at the controls.
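To see why constraining the interface to three inputs is powerful, consider a minimal sketch of how software could derive a full gripper pose from just a grasp point, an approach angle, and a grasp depth. This is my own simplified 2D-approach-plane illustration, with hypothetical standoff values, not the Georgia Tech implementation:

```python
import math

def gripper_pose(grasp_point, approach_angle_deg, grasp_depth):
    """Derive a full gripper pose from three user-selected values.

    The user picks only a grasp point on the object, an approach
    angle, and how deep the gripper should plunge; the remaining
    degrees of freedom are computed automatically. The motion is
    modeled in a vertical plane through the grasp point.
    """
    gx, gy, gz = grasp_point
    theta = math.radians(approach_angle_deg)
    standoff = 0.10  # metres; hypothetical pre-grasp clearance
    # Back the gripper off along the approach direction, then
    # subtract the requested plunge depth.
    travel = standoff - grasp_depth
    x = gx - travel * math.cos(theta)
    z = gz + travel * math.sin(theta)
    return {"position": (round(x, 3), gy, round(z, 3)),
            "approach_rad": round(theta, 3)}

# Three clicks' worth of input replaces manual joint-by-joint control.
pose = gripper_pose((0.50, 0.20, 0.05), 45.0, 0.02)
```

The point of the sketch is the interface shape, not the geometry: the user supplies three intuitive quantities, and everything else becomes the system's problem.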
An example of how robots are being deployed more creatively with these virtual-reality-like tools is a robotic drumming appendage. Last February at Georgia Tech’s Center for Music Technology, researchers showcased their latest concept in robot-human mimicking: a synchronized drumming performance between a human and an attached two-foot-long robot arm that didn’t miss a beat. The arm responded to the drummer’s gestures: when he reached for the hi-hat cymbal, the robot arm shifted over to the ride cymbal; when he went for the snare, the arm took its talents to the tom, and so on. It’s robot VR without any headsets!
“We believe if you augment humans with technology, humans will be able to do much more. We thought that music was a great medium to try that,” said Gil Weinberg, director of the Center for Music Technology.
The idea of feeling what the robot feels is not new. The German Aerospace Center (Deutsches Zentrum für Luft- und Raumfahrt, or DLR), a leader in telerobotics, has been incorporating off-the-shelf VR tools into mechatronics for years. DLR’s newest effort incorporates “haptic feedback” into its platforms, enabling the human operator to precisely feel and control the interaction forces between the remote robot and its environment for natural manipulation. The haptic feedback supports a wide range of manipulation tasks, from forcefully throwing levers to precisely turning screws or even handling fragile electronic parts. According to DLR, the researchers originally developed the technology for their On-Orbit Servicing group but quickly found a number of terrestrial applications for it as well.
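The core of such a system can be sketched as one cycle of a bilateral teleoperation loop: the operator's position is forwarded to the remote robot as a command, and the contact force the robot measures is reflected back to the operator's haptic device. The gains and scaling below are illustrative placeholders, not DLR's actual control scheme:

```python
def teleop_step(master_pos, slave_pos, env_force, kp=50.0, scale=1.0):
    """One cycle of a position-forward / force-feedback loop.

    master_pos : operator's hand position (from the haptic device)
    slave_pos  : current position of the remote robot
    env_force  : contact force measured at the remote robot
    Returns the command sent to the remote robot and the force
    rendered back onto the operator's hand.
    """
    # Command: drive the remote robot toward the operator's position.
    slave_cmd = kp * (master_pos - slave_pos)
    # Feedback: reflect the measured contact force to the operator,
    # opposing the motion so a stiff wall actually feels stiff.
    operator_force = -scale * env_force
    return slave_cmd, operator_force

# Operator leads by 0.5 m while the robot presses with 4 N of contact force.
cmd, felt = teleop_step(1.0, 0.5, 4.0)
```

Run at hundreds of cycles per second over a low-latency link, this kind of loop is what lets an operator distinguish turning a screw from throwing a lever; latency and stability are the hard parts that real systems spend their engineering effort on.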
Now, DLR’s remotely commanded robots are being tested to repair satellites, dive into deep-sea mining, operate nuclear power plants, and even lead robotic surgery. The premise is that a human operator can command such a robotic system from a safe and comfortable place. According to the DLR website, “robots can be built to be more robust to hazardous environments making it: less risky for the expert; cheaper to get on site in most cases; and even more reliable and more precise.”
DLR is not the only space program experimenting with VR. At CES earlier this year, NASA used VR to promote its Mars rover program. The booth was staffed by Maria Bualat, a robotics engineer from NASA. Bualat and her team demonstrated for CES attendees the navigation software tools that NASA’s autonomous robots use in space, including a HoloLens and VR headsets for Earth-based controllers. Putting on the VR gear, Bualat said, “you can actually walk around the robot and see exactly what the robot saw.” Bualat’s enthusiasm for VR is now feeding into future Mars missions that pair humanoid robots, like Robonaut, with human commanders wearing headsets back in Houston.