Researchers Command Adult-Size Robot Using Vive Controllers

Vive trackers plus a rescue robot equals the coolest puppet in existence.

There are few terms more closely associated with science fiction than “VR” and “robots.” The two technologies have long been considered the future of entertainment and convenience. So when a project comes along that manages to merge the two fields, you can be sure futurists will take notice.

We’ve seen several instances of this in the past, from a motorized bomb-defusal robot that lets police deactivate explosives remotely using a VR headset and motion controllers, to a company using VR to train production machines. However, it may be the University of Tokyo that has just taken the cake for the coolest collaboration between robots and VR.

In work presented last month at the 2017 IROS conference in Vancouver, Canada, scientists at the University of Tokyo demonstrated a method for full-body control of an adult-sized humanoid robot using simple VR motion controls. The operator holds the HTC Vive’s controllers and wears three Vive Trackers: one strapped to each foot and one around the waist. The Vive lighthouse sensors track the user’s entire body movement, which the system translates into instructions for the robot.
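To make the setup concrete, here is a minimal sketch of the tracking side of such a system. All names, coordinates, and the scaling step are illustrative assumptions, not the researchers’ actual code: it simply gathers the five tracked points described above into one whole-body target set for the robot.

```python
# Illustrative sketch (not the actual University of Tokyo software):
# combining the five tracked points into a whole-body target for the robot.

# Hypothetical 3D positions reported by the lighthouse system, in meters.
tracked = {
    "left_hand":  (0.3, 1.0, 0.2),    # Vive controller
    "right_hand": (-0.3, 1.0, 0.2),   # Vive controller
    "left_foot":  (0.1, 0.0, 0.0),    # Vive Tracker
    "right_foot": (-0.1, 0.0, 0.0),   # Vive Tracker
    "waist":      (0.0, 0.9, 0.0),    # Vive Tracker
}

def to_robot_targets(tracked, scale=1.1):
    """Scale human end-effector positions to the robot's proportions
    (assumed here to be slightly larger than the operator)."""
    return {name: tuple(scale * c for c in pos)
            for name, pos in tracked.items()}

targets = to_robot_targets(tracked)
```

In a real pipeline these targets would feed an inverse-kinematics solver on the robot side; the sketch stops at the retargeting step.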

However, it’s the software behind this impressive control method that’s the real magic. While researchers are making great leaps in the physicality of their robots and the movements those robots can achieve, current technology still keeps them from perfectly mimicking a human being. A specialized program therefore serves as a mediator between the user and the robot: the user’s movements are recorded and then altered to match the actual capabilities of the machine.

“For example, stepping at a walking speed is allowed, but running and jumping are forbidden,” researcher Ishiguro Yasushiro told The Verge in a brief interview. “We force the robot to keep its gait always safe.”
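The mediator idea in that quote can be sketched in a few lines: tracked human motion is clamped into the robot’s safe envelope before being passed on. The limit values and function names below are illustrative assumptions, not the actual research software.

```python
# Hypothetical sketch of the "mediator" described above: human-tracked
# commands are clamped to the robot's safe gait capabilities.
# The specific limits are made-up examples, not JAXON's real parameters.

from dataclasses import dataclass

@dataclass
class GaitLimits:
    max_speed: float = 1.4       # m/s, roughly walking pace; forbids running
    max_foot_lift: float = 0.15  # m; forbids jumping-height foot lifts

def mediate(commanded_speed, foot_lift, limits):
    """Clamp a tracked command into the robot's safe envelope."""
    safe_speed = max(-limits.max_speed, min(commanded_speed, limits.max_speed))
    safe_lift = max(0.0, min(foot_lift, limits.max_foot_lift))
    return safe_speed, safe_lift

# A running-speed command (3.0 m/s) gets reduced to walking pace:
print(mediate(3.0, 0.3, GaitLimits()))  # (1.4, 0.15)
```

The point is that the operator can move however they like; the robot only ever receives commands it is physically able to execute safely.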

The robot used in these Tokyo experiments, JAXON, is a veteran of DARPA’s Robotics Challenge, a prize competition sponsored by the US Defense Advanced Research Projects Agency that ran from 2012 to 2015. The contest aimed to promote the development of semi-autonomous ground robots that could complete “complex tasks in dangerous, degraded, human-engineered environments.”

And while it is an impressive machine, JAXON suffers from the same limitations as other current-gen rescue robots: a lack of flexibility and articulation in its mechanical limbs. This is exactly where VR control could change the game. By following the exact, natural movements of an actual human being, these machines can reach a level of movement that would normally be unattainable, and the intermediary software can smooth out the rough edges as the motion data is translated, resulting in a fluid, human-like response.

Although the team has so far only achieved natural bipedal walking, Yasushiro hopes to eventually tackle more complex movements such as running, jumping, and climbing stairs. He even teases the idea of using VR headsets and haptic feedback suits to truly step into the shoes of our mechanical counterparts. Oh man, I’m getting some serious Avatar vibes right now…

Image Credit: University of Tokyo

Former Writer (Kyle Melnick)

Send this to a friend