Video: Google Robot Plays Table Tennis with Humans
Google used extensive data to train a table-tennis-playing robot to compete against humans and improve over time. The results are impressive, showcasing significant advances in robotic speed and dexterity, and the robot’s performance is entertaining to watch.
“Reaching human-level speed and performance in real-world tasks is a key goal for the robotics research community.” So begins the paper by the team of Google scientists who developed, trained, and tested the table-tennis robot.
There has been significant progress in robotics, with humanoid machines now capable of performing various real-world tasks, from chopping ingredients to working in BMW factories. However, as the Google team’s quote indicates, adding speed to this precision has been slower to develop.
Table-Tennis Robot Wins 45% of Matches Against Humans
This makes the new table-tennis-playing robot particularly impressive. In the video, the robot competes well against human players, although it’s not yet at an Olympic level. It won 13 of its 29 matches, a 45% win rate.
While that is better than many New Atlas writers would fare, the robot excelled only against beginner-to-intermediate players and lost every match against advanced competitors. It was also unable to serve the ball.
“Just a few months ago, we realistically thought the robot might struggle against unfamiliar opponents,” Pannag Sanketi told MIT Technology Review. “However, the robot far exceeded our expectations. Its ability to outmaneuver even skilled players was astonishing.” Sanketi, who led the project, is a senior staff software engineer at Google DeepMind, the company’s AI division. This research focused as much on data sets and decision-making as it did on the robot’s performance.
Data Collection and Simulation to Prepare Robot for Real-Time Adaptation
To train the robot, researchers collected extensive data on various table tennis ball characteristics, such as spin, speed, and position. They then used simulated matches to teach the robot the fundamentals of the game. This preparation allowed it to compete against human players. During actual matches, the robot utilized cameras to respond to opponents based on its training and continued to learn and adapt its strategies in real time.
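For readers curious about what that two-stage approach can look like in practice, here is a minimal, purely illustrative sketch: a toy simulator stands in for the collected ball data, a simple bandit-style policy learns which return stroke to favor during simulated rallies, and the same statistics keep updating during "real" play. Every name here (simulate_rally, choose_stroke, STROKES) is an assumption made for this example, not DeepMind's actual system.

```python
# Illustrative sketch only -- not DeepMind's code. It mimics the two-stage idea
# described above: learn stroke selection from simulated rallies, then keep
# updating the same statistics during live play. The simulator is a toy.
import random

STROKES = ["forehand_drive", "backhand_push", "topspin_loop"]

def sample_ball():
    # Stand-in for the measured ball characteristics (speed, spin, position).
    return {"speed": random.random(), "spin": random.uniform(-1, 1)}

def simulate_rally(stroke, ball):
    # Toy simulator: each stroke has a base success chance, reduced by how
    # fast and how heavily spun the incoming ball is.
    base = {"forehand_drive": 0.6, "backhand_push": 0.5, "topspin_loop": 0.55}[stroke]
    difficulty = 0.3 * ball["speed"] + 0.2 * abs(ball["spin"])
    return random.random() < max(0.05, base - difficulty)

# Stage 1: offline training in simulation -- estimate each stroke's success rate.
wins = {s: 1 for s in STROKES}   # Laplace-smoothed counts
plays = {s: 2 for s in STROKES}
for _ in range(5000):
    ball = sample_ball()
    stroke = random.choice(STROKES)
    plays[stroke] += 1
    wins[stroke] += simulate_rally(stroke, ball)

def choose_stroke():
    # Mostly greedy choice with a little exploration, so the policy keeps adapting.
    if random.random() < 0.1:
        return random.choice(STROKES)
    return max(STROKES, key=lambda s: wins[s] / plays[s])

# Stage 2: "real" matches -- the same counters are updated from live outcomes,
# standing in for the robot refining its strategy against human opponents.
for _ in range(200):
    ball = sample_ball()
    stroke = choose_stroke()
    returned = simulate_rally(stroke, ball)  # would be an observed outcome in reality
    plays[stroke] += 1
    wins[stroke] += returned

print({s: round(wins[s] / plays[s], 2) for s in STROKES})
```

The real system is of course far more sophisticated, tracking the ball with cameras and selecting among learned low-level skills, but the sketch captures the broad pattern the researchers describe: train on simulated play first, then keep adapting from real matches.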
“I’m a huge advocate for robots interacting with real humans, and this is a great example of that,” Sanketi told MIT. “Although it may not be a top player yet, it has the potential to improve and eventually reach that level.”
The video below provides further insights into the robot’s training and the diverse skills it was able to develop.
Read the original article on: New Atlas
Read more: A Soft Robotic Glove Restores Grip Strength to Weakened Hands