This is an example of remote teleoperation of the KIT Gammabot robot using VR.
Normally, teleoperation is implemented with a binocular camera, showing the user a separate video feed for each eye. However, this often causes motion sickness, because the motion of the user's head does not match the camera's position.
Instead, here we take the raw ROS sensor streams and reconstruct them in 3D in the VR environment. This allows the user to move in any way they want, and the VR camera will always follow their head at 75 fps, even if the connection stutters and the world state updates at a much slower and less reliable rate.
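A minimal sketch of this decoupling, assuming a rospy node (the topic names and the `draw_frame` call are illustrative placeholders, not the project's actual code): ROS callbacks only cache the most recent sensor messages, while an independent loop renders from that cache at the headset rate, so head tracking never waits on the network.

```python
import threading
import rospy
from sensor_msgs.msg import PointCloud2
from nav_msgs.msg import OccupancyGrid

latest = {"cloud": None, "grid": None}   # most recent sensor snapshots
lock = threading.Lock()

def cloud_cb(msg: PointCloud2):
    with lock:
        latest["cloud"] = msg            # may arrive late or irregularly

def grid_cb(msg: OccupancyGrid):
    with lock:
        latest["grid"] = msg

def render_loop(draw_frame, rate_hz=75.0):
    """Render at the headset rate using whatever data is cached."""
    rate = rospy.Rate(rate_hz)
    while not rospy.is_shutdown():
        with lock:
            cloud, grid = latest["cloud"], latest["grid"]
        draw_frame(cloud, grid)          # hypothetical VR draw call
        rate.sleep()

if __name__ == "__main__":
    rospy.init_node("vr_teleop_viewer")
    rospy.Subscriber("/velodyne_points", PointCloud2, cloud_cb, queue_size=1)
    rospy.Subscriber("/map", OccupancyGrid, grid_cb, queue_size=1)
    render_loop(lambda cloud, grid: None)  # replace with a real renderer
```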
The grey cubes are an occupancy grid, where each occupied cell is projected upwards. The point cloud originates from a LIDAR scanner. Thus, even if the robot has no RGB cameras, the contents of the world (including obstacles such as tables and people) are still clearly identifiable in real time, without any changes to the robot system.
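As a sketch of how occupied cells can be turned into cube positions (the occupancy threshold is an assumed value, and the actual renderer is not shown here), the standard `nav_msgs/OccupancyGrid` fields are enough to place each cube in the world frame:

```python
from nav_msgs.msg import OccupancyGrid

def occupied_cell_centers(grid: OccupancyGrid, threshold: int = 50):
    """Yield world-frame (x, y) centres of cells marked as occupied."""
    res = grid.info.resolution
    ox = grid.info.origin.position.x
    oy = grid.info.origin.position.y
    for i, value in enumerate(grid.data):      # row-major values: -1 unknown, 0..100 occupancy
        if value >= threshold:                 # treat as an obstacle (threshold is an assumption)
            col = i % grid.info.width
            row = i // grid.info.width
            yield (ox + (col + 0.5) * res,     # cell centre in metres
                   oy + (row + 0.5) * res)

# Each returned (x, y) would then be drawn as a grey cube extruded
# upwards from the floor in the VR scene.
```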
More videos!