Sony AI has published research showing that its Project Ace robot can outplay elite and professional human table tennis players. The result, which landed on the cover of Nature, marks the first time an autonomous machine has reached expert-level competitive play in a commonly played physical sport.
The paper, titled "Outplaying Elite Table Tennis Players with an Autonomous Robot," describes a system that fuses event-based vision, model-free reinforcement learning, and high-speed robotics to read spin, predict ball trajectories, and return shots with the agility of a trained athlete.
How Ace plays
The perception stack uses nine active-pixel cameras built around Sony's IMX273 sensors to localize the ball in 3D, alongside three gaze-control units with IMX636 event-based vision sensors paired with pan-tilt mirrors and tunable lenses. That setup lets the robot measure angular velocity and spin in real time — the variables that make table tennis notoriously hard for machines.
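The paper doesn't spell out the estimation pipeline, but the basic idea behind turning high-rate 3D ball fixes into a trajectory can be sketched with simple finite differences. This is an illustrative toy, not Sony's method; the function name and sample numbers are hypothetical:

```python
import numpy as np

def estimate_velocity(positions, timestamps):
    """Finite-difference velocity estimates from timestamped 3D ball fixes.

    positions:  (N, 3) array of triangulated ball positions in metres.
    timestamps: (N,) array of capture times in seconds.
    Returns an (N-1, 3) array of velocity vectors in m/s.
    """
    positions = np.asarray(positions, dtype=float)
    timestamps = np.asarray(timestamps, dtype=float)
    dp = np.diff(positions, axis=0)    # displacement between consecutive fixes
    dt = np.diff(timestamps)[:, None]  # elapsed time per fix pair
    return dp / dt

# A ball crossing the table at roughly 10 m/s, sampled at 250 fps:
pos = [[0.00, 0.0, 0.200],
       [0.04, 0.0, 0.210],
       [0.08, 0.0, 0.215]]
ts = [0.000, 0.004, 0.008]
v = estimate_velocity(pos, ts)
```

The point of the sketch is the sampling-rate dependence: the shorter the interval between fixes, the less the differencing lags the true state, which is why high-speed, low-latency sensors matter for a 10 m/s ball.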
On top of that perception stack sits a model-free reinforcement learning policy that drives a high-speed robotic arm. According to Sony AI, the system achieved a return rate of more than 75% against spins of up to 450 rad/s, and scored 16 "aces" to its opponents' 8 across the documented matches.
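To put the 450 rad/s figure in more familiar units, a quick conversion (using the standard 40 mm ball diameter, which the paper does not state explicitly) works out to roughly 4,300 rpm, with the ball's surface moving at about 9 m/s:

```python
import math

OMEGA = 450.0    # reported spin ceiling, rad/s
RADIUS = 0.02    # standard 40 mm table tennis ball, metres

# rad/s -> revolutions per minute: one revolution is 2*pi radians
rpm = OMEGA * 60 / (2 * math.pi)

# Equatorial surface speed of the spinning ball, m/s
surface_speed = OMEGA * RADIUS

print(f"{rpm:.0f} rpm, {surface_speed:.1f} m/s at the ball's surface")
```

That surface speed is comparable to the ball's own flight speed, which is what makes the bounce and the racket contact so hard to model.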
The match results
Sony AI tested Ace against five elite players — each with more than ten years of active experience and roughly 20 hours of weekly training — and two professional players, Minami Ando and Kakeru Sone. In the initial round, Ace won three of five matches against elite opponents and won one game against a professional, though it lost both pro matches overall.
The more striking result came in follow-up sessions. In December, Ace defeated both of the newly tested elite players and one of the new professional opponents. In matches in March 2026 against three additional professional players, Ace beat all three at least once.
Why it matters
The project lands at a moment when "physical AI" — robotics powered by foundation-model-style training — is becoming a defining frontier for the industry, with NVIDIA, Figure, Physical Intelligence, BMW, and others pushing humanoid and embodied platforms toward production.
Table tennis is a useful proving ground because it compresses many of physical AI's hardest problems into a few square meters: sub-millisecond reaction times, partial information, deceptive spin, and an opponent who adapts in real time. Solving it under tournament conditions, against players who train for a living, is a different bar than scripted lab demos.
Peter Dürr, Director of Sony AI in Zürich, framed the achievement in those terms: "This research has shown that an autonomous robot can, in fact, win at a competitive sport, matching or exceeding the reaction time and decision making of humans in a physical space."
For Sony, Project Ace also serves as a showcase for its semiconductor business, with the company emphasizing that the same image and event-based sensors used in Ace are already shipping in industrial and automotive products. For the broader field, the result is a benchmark — one that suggests reinforcement learning paired with high-bandwidth sensing can finally close the gap between simulated agility and the real, messy, fast-moving physical world.