Introducing Gran Turismo Sophy: a Superhuman Racing AI Agent Trained Through Deep Reinforcement Learning

Gran Turismo Sophy, a revolutionary new racing AI agent, was unveiled on February 9. The new technology was developed through a collaboration between Polyphony Digital Inc. (PDI), Sony AI and Sony Interactive Entertainment (SIE).

Gran Turismo Sophy was created using state-of-the-art deep reinforcement learning techniques developed by Sony AI, combined with large-scale training on SIE’s cloud gaming infrastructure. Gran Turismo Sophy takes machine learning to the next level by placing an AI agent in a hyper-realistic racing simulation that demands real-time decisions, continuously, for the duration of a race.

Michael Spranger, Sony AI COO, describes Gran Turismo Sophy as “an AI agent that learned to drive by itself at a very competitive level and is able to compete with the best drivers in the world.”

Gran Turismo Sophy began as a blank slate and evolved from an AI that could barely maintain a straight line on a track into a race driver that can compete against the best Gran Turismo Sport drivers in the world.

Gran Turismo Sophy opens up new possibilities for gaming and entertainment. Below we explain how this exciting project came to life.

The First True Test

Gran Turismo Sophy’s training began in April 2020 with the formation of Sony AI. From this point onward, the Sony AI team worked closely with Polyphony Digital to develop and improve the agent’s capabilities. The first "Race Together" event took place on July 2, 2021, where Gran Turismo Sophy competed for the first time against a team of four seasoned GT drivers, led by 2020 FIA Gran Turismo Championships triple champion, Takuma Miyazono.

In solo time trials, Gran Turismo Sophy showed superhuman speed, recording faster lap times than its human counterparts. Wheel-to-wheel competition against humans, however, proved to be an entirely different challenge.

“I think we all underestimated how hard it would be to get the sportsmanship side of it right, and learn to do that without being overly aggressive or overly timid in the face of competitors,” said Peter Wurman, Sony AI Director and Project Lead.

Racing Done Right

An AI agent's performance is bounded by the complexity of the challenges it faces, and Gran Turismo presents a formidable challenge for an AI system: it faithfully captures the dynamics and physics of motorsport, where similar games get racing physics only partially right.

“I wanted to recreate cars and the entire culture around cars in a video game,” said Kazunori Yamauchi, President of Polyphony Digital.

Charles Ferreira, a PDI engineer, added: “The realism of Gran Turismo comes from the detail that we put into the game, from the engine, tires, and suspension to the tracks and car models.”

It's this realism that makes Gran Turismo a unique AI challenge, one that pushed the Sony AI team and Gran Turismo Sophy to new heights.

Training with Mass-Scale Infrastructure

Using a technique called reinforcement learning, Gran Turismo Sophy learned to drive through positive and negative feedback, acting on inputs such as its speed, the angle of its wheels, and the curvature of the track. And much as humans need an estimated 10,000 or more hours of practice to master a skill, Gran Turismo Sophy duplicated itself and trained on many different scenarios simultaneously. This required a great deal of computing power, which was provided by Sony Interactive Entertainment.
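To make the feedback loop concrete, here is a deliberately tiny sketch of the reinforcement-learning idea: an agent observes its position relative to the racing line, acts, and receives negative reward for drifting away. The environment, observation, and "training by searching over a steering gain" are illustrative stand-ins, not Sony AI's or Polyphony Digital's actual algorithms or interfaces.

```python
import random

random.seed(0)  # reproducible toy run

class ToyTrackEnv:
    """A one-dimensional 'track': the agent steers to stay on the racing line."""
    def __init__(self):
        self.offset = 0.0  # signed distance from the racing line

    def reset(self):
        self.offset = random.uniform(-1.0, 1.0)
        return self.offset

    def step(self, action):
        # The action nudges the car; random drift stands in for track dynamics.
        self.offset += action + random.uniform(-0.1, 0.1)
        reward = -abs(self.offset)  # negative feedback for leaving the line
        return self.offset, reward

def policy(offset, gain):
    """Steer back toward the line in proportion to the error."""
    return -gain * offset

def average_reward(gain, episodes=50, steps=20):
    """Score a candidate policy by the feedback it accumulates."""
    env = ToyTrackEnv()
    total = 0.0
    for _ in range(episodes):
        obs = env.reset()
        for _ in range(steps):
            obs, reward = env.step(policy(obs, gain))
            total += reward
    return total / episodes

# 'Training': keep the steering gain that earns the most reward.
best_gain = max((g / 10 for g in range(11)), key=average_reward)
```

The real agent, of course, learns a deep neural-network policy over far richer observations; the point here is only the trial-feedback-update cycle described above.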

“With a standard AI simulation, a model is created and then run. Analysis occurs and updates are then added to that simulation and run again. This process can sometimes become extremely time-consuming,” said Justin Beltran, Sr. Director, Future Technology Group, Sony Interactive Entertainment. “However, leveraging SIE’s worldwide mass-scale cloud gaming infrastructure, Gran Turismo Sophy was powered to deploy state-of-the-art learning algorithms and training scenarios and successfully run tens of thousands of simultaneous simulations in this cutting-edge environment that supported this revolutionary technology.”
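The quote above describes fanning training out across many game instances at once and pooling their experience. As a rough illustration of that pattern only, with Python's standard thread pool standing in for SIE's cloud fleet and a toy episode in place of the actual game, the fan-out/gather shape looks like:

```python
import random
from concurrent.futures import ThreadPoolExecutor

def run_episode(seed, steps=10):
    """Simulate one toy episode; return its (action, reward) experience log."""
    rng = random.Random(seed)
    transitions = []
    offset = rng.uniform(-1.0, 1.0)  # distance from the racing line
    for _ in range(steps):
        action = -0.5 * offset                      # fixed toy policy
        offset = offset + action + rng.uniform(-0.1, 0.1)
        transitions.append((action, -abs(offset)))  # reward: stay on line
    return transitions

def collect_experience(episodes=100, workers=8):
    """Fan episodes out across workers and pool the resulting experience."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        batches = pool.map(run_episode, range(episodes))
        return [t for batch in batches for t in batch]

experience = collect_experience()  # 100 episodes' worth of pooled transitions
```

In a production setting the workers would be separate machines each running a game instance, with the pooled experience feeding the learning algorithm's updates; the names and structure here are purely illustrative.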

Returning to the Track

The next race session occurred on October 21, 2021, with the hope that Gran Turismo Sophy would win every competition that day, including the team race.

Not only did it dominate across the board, but the team also watched it adapt to adversity: after a wipeout early in the third race, it recovered and still went on to win. Gran Turismo Sophy took 1st and 2nd place in all three races and beat the human team's score by a double-digit margin.

While GT Sophy proved its capabilities by outracing the human drivers, the intention of this project is not to replace or diminish human interaction. Instead, it is about expanding and enriching the gaming experience for all players. Sony AI is calling this "AI for Gamers."

“We’re going to create artificial intelligence that will unleash the power of human creativity and imagination,” says Hiroaki Kitano, Sony AI CEO.

Ace-driver Miyazono added: “I want to race with GT Sophy more in the future. I really learned a lot from the AI agent."

"The goal with Gran Turismo Sophy is ultimately to entertain people,” said Yamauchi.

“We envision a future where AI agents could introduce developers and creators to new levels of innovation and unlock doors to unimagined opportunities,” explained Ueli Gallizzi, SVP of Future Technology Group, Sony Interactive Entertainment. “We could see new levels of user engagement, better gaming experiences, and the entry of a whole new generation into the world of gaming.”

We can’t wait to see what lies ahead as the worlds of artificial intelligence and interactive entertainment are bridged, and Gran Turismo Sophy represents the next step in this exciting journey.