VRoxy Elevates Telepresence Beyond Visuals and Words

Mose Sakashita, a Cornell University doctoral student in information science, with the VRoxy system (Image: Sreang Hok/Cornell University)

Many telepresence robots are, in essence, little more than remote-controlled tablets maneuvered around a room. The VRoxy system distinguishes itself by mirroring its remote user’s movements and autonomously navigating to various spots within a designated area.

The system is being developed by a team of researchers from Cornell and Brown universities.

Design of the Current VRoxy Robot Prototype

The current working prototype of the VRoxy robot features a tubular plastic truss body with motorized omnidirectional wheels at its base and a video screen on top. It is also equipped with a robotic pointer finger and a Ricoh Theta V 360-degree camera.

Notably, the remotely located user needs only to wear a Meta Quest Pro VR headset in their office, home, or virtually anywhere else. That simplicity sets VRoxy apart from other gesture-replicating telepresence systems, which typically demand elaborate setups at both the user’s and the viewer’s locations.

Seamless Transition between VRoxy Views and Autonomous Navigation

Through the headset, the user can switch between an immersive live feed from the robot’s 360-degree camera and a 3D map view of the entire space the robot occupies. After the user selects a destination on the map, the robot navigates to it on its own; once it arrives, the headset switches back to the robot’s first-person camera view.

This approach not only eliminates the need for the user to steer the robot manually, but also spares them the motion sickness that could result from watching the robot’s live video stream while it is in motion.
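To make that flow concrete, here is a minimal sketch in Python. The ViewMode enum and the headset/robot method names are hypothetical stand-ins for illustration; the actual VRoxy interfaces may differ.

```python
# Minimal sketch of the view-switching flow described above. The
# ViewMode enum and the headset/robot interfaces are hypothetical
# stand-ins for illustration, not the actual VRoxy APIs.
from enum import Enum, auto


class ViewMode(Enum):
    LIVE_360 = auto()  # immersive feed from the robot's 360-degree camera
    MAP_3D = auto()    # static 3D map of the space the robot occupies


def go_to(robot, headset, destination):
    """Drive the robot to a map-selected destination autonomously."""
    # Show the static map view so the user never watches a live stream
    # while the robot is moving, which is what would cause motion sickness.
    headset.set_view(ViewMode.MAP_3D)

    # The robot plans its own route; the user does not steer.
    robot.navigate_to(destination)  # assumed to block until arrival

    # On arrival, return to the first-person 360-degree camera view.
    headset.set_view(ViewMode.LIVE_360)
```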

Cornell’s Prof. François Guimbretière, working on the VRoxy system (Image: Sreang Hok/Cornell University)

The VR headset tracks the user’s facial expressions and eye movements, replicating them in real time on the user’s avatar, which is displayed on the robot’s screen. The headset also detects head movements, and the robot adjusts the screen’s orientation by panning or tilting it using an articulated mount.
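The head-tracking half of that behavior could reduce to a simple mapping from headset orientation to pan/tilt commands. The sketch below is an illustration only: the mount interface and the clamp limits are invented for the example, not taken from the paper.

```python
# Hedged sketch of mirroring head movement onto the screen's articulated
# mount. The mount.set_pan/set_tilt calls and the clamp limits are
# illustrative assumptions, not details from the VRoxy paper.
from dataclasses import dataclass

PAN_LIMIT_DEG = 90.0   # assumed mechanical pan limit of the mount
TILT_LIMIT_DEG = 30.0  # assumed mechanical tilt limit of the mount


@dataclass
class HeadPose:
    yaw_deg: float    # left/right head turn reported by the headset
    pitch_deg: float  # up/down head tilt reported by the headset


def clamp(value: float, limit: float) -> float:
    return max(-limit, min(limit, value))


def mirror_head_pose(pose: HeadPose, mount) -> None:
    """Pan and tilt the robot's screen to match the remote user's head."""
    mount.set_pan(clamp(pose.yaw_deg, PAN_LIMIT_DEG))
    mount.set_tilt(clamp(pose.pitch_deg, TILT_LIMIT_DEG))
```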

Controlling the Robot’s Movements

Additionally, when the user points their finger at something within their headset’s view, the robot’s pointer finger moves in the same direction in the physical world. The researchers aim to equip the robot with two user-controlled arms in the future.
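At its simplest, that gesture mirroring amounts to re-expressing the user's pointing ray in the robot's frame. The sketch below illustrates the idea under a deliberately simplified assumption (a single fixed yaw offset between the two frames); the finger.aim interface is hypothetical.

```python
# Hedged sketch of replicating a pointing gesture: the pointing ray
# captured in the headset's frame is re-expressed in the robot's frame
# and sent to the pointer finger. The fixed yaw offset between frames
# and the finger.aim call are assumptions made for this illustration.
def point_like_user(ray_yaw_deg: float, ray_pitch_deg: float,
                    headset_to_robot_yaw_deg: float, finger) -> None:
    """Aim the robot's pointer finger along the user's pointing ray."""
    # Because the user sees the scene through the robot's own camera,
    # this sketch assumes the two frames differ only by a fixed yaw offset.
    robot_yaw = ray_yaw_deg + headset_to_robot_yaw_deg

    # Wrap the result into [-180, 180) degrees for the actuator.
    robot_yaw = (robot_yaw + 180.0) % 360.0 - 180.0

    finger.aim(yaw_deg=robot_yaw, pitch_deg=ray_pitch_deg)
```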

In a test of the current system, the team had VRoxy travel between a lab and an office, allowing a remote user to collaborate with different people on various tasks.

This research is led by Mose Sakashita, Hyunju Kim, Ruidong Zhang, and François Guimbretière from Cornell University, along with Brandon Woodard from Brown University. Their work is detailed in a paper presented at the ACM Symposium on User Interface Software and Technology in San Francisco.


Read the original article on: New Atlas

Read more: Fourier and Tesla Showcase Impressive Strides in Humanoid Robotics
