DeepMind, Blizzard Invite Researchers To Build AI Agents That Can Master 'StarCraft II'

AI agents playing StarCraft II mini-games

DeepMind and Blizzard announced that StarCraft II is now an open artificial intelligence research environment where people can develop and train their own AI agents to beat others at the game. Unlike other game bots, the new AI agents will learn and see the game the same way human players do, with no programming shortcuts to give them unfair advantages.

DeepMind AI

DeepMind has tried from the beginning to build AI agents that can solve complex problems. Its AI agents started with playing Atari games and then moved on to beat the world’s top Go grandmasters, a feat that AI experts didn’t think was possible for at least another decade. In the meantime, DeepMind’s AI has also been used for real-world applications such as cutting Google’s data center cooling costs by 40%.


DeepMind’s latest project is to conquer the much more complex game of StarCraft II. If its AI agents can learn to regularly beat top human players in a real-time strategy game such as StarCraft II, with its 3D worlds, the AI could then be used for even more advanced real-world applications.

Why Conquering "StarCraft II" Will Be Difficult

There are multiple reasons why a game such as StarCraft II is actually much more difficult for an AI agent to learn than the game of Go. One is that StarCraft II is significantly more open-ended than Go. It also has more gameplay rules, making it a much more complex game.

Another reason is that in Go, both players can see exactly what’s happening on the board at any point in time, whereas in StarCraft II, the fog of war clouds what each player knows about their rival throughout the game. This should make StarCraft II quite unpredictable for the AI, although it’s possible the AI will eventually figure out the most common strategies employed by human players at any given point in a game.

The more game replays there are, the better the StarCraft II AI agents should become. StarCraft II is quite a popular game that's played competitively online, so there should be plenty of game replays that can be used as a data source.

StarCraft II also has 300 basic actions a player can take, compared with Atari games, which offer only ten (e.g., up, down, left, right).
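To get a feel for the scale of that action space, DeepMind’s open-source pysc2 toolset (described in the tools list below) exposes the full catalog of action functions programmatically. The snippet below is a rough, illustrative sketch rather than code from the announcement; the exact module layout and counts can vary between pysc2 releases.

```python
# Illustrative only: enumerate StarCraft II's action functions via DeepMind's
# open-source pysc2 library (not code from the article; counts and module
# layout may differ between pysc2 versions).
from pysc2.lib import actions

# actions.FUNCTIONS is the registry of "function" actions an agent can issue:
# no_op, select_point, Move_screen, Attack_screen, build orders, and so on.
print("Number of action functions:", len(actions.FUNCTIONS))

# Each function also declares typed arguments (screen coordinates, a queued
# flag, ...), so the effective action space is far larger than Atari's.
for func in list(actions.FUNCTIONS)[:5]:
    print(func.id, func.name, [arg.name for arg in func.args])
```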

As seen in the video below, on the left side, an early-stage training agent is failing to even keep its workers mining, a task that’s trivial for human players. On the right side of the video, a trained agent can perform more meaningful actions, but according to DeepMind, it still fails to beat even the easiest built-in StarCraft II AI.

Trained and untrained agents play StarCraft II 'mini-game'

The DeepMind team seems to be well aware that training the AI agents to play StarCraft II well is not a simple task. Therefore, they’re encouraging everyone to take advantage of both DeepMind’s and Blizzard’s tools to create their own agents, so that everyone can learn from the process.

The tools include:

  • A machine learning API developed by Blizzard that gives researchers hooks into the game
  • 65,000 game replays (growing to more than half a million in the coming weeks)
  • An open-source release of DeepMind’s PySC2 toolset, which makes it easier for developers to use Blizzard’s feature-level API (a minimal usage sketch follows this list)
  • A set of mini-games in which researchers can test their AI agents
  • A research paper written by DeepMind and Blizzard that reports initial results for the test AI agents against the built-in StarCraft II AI
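As a rough illustration of how these pieces fit together, here is a minimal scripted agent written against the open-source pysc2 toolset. This is a hedged sketch rather than anything from the announcement: the class name is made up for the example, and pysc2’s exact API details have shifted between releases, so treat the names and signatures as approximate.

```python
# Minimal sketch of a pysc2 agent (illustrative; pysc2's API has changed
# across releases, so treat names and signatures as approximate).
from pysc2.agents import base_agent
from pysc2.lib import actions

_NO_OP = actions.FUNCTIONS.no_op.id  # the "do nothing" action function

class IdleAgent(base_agent.BaseAgent):
    """Agent that issues a no-op on every game step."""

    def step(self, obs):
        super(IdleAgent, self).step(obs)
        # obs.observation holds the feature-layer view of the game (screen,
        # minimap, available actions) -- the same human-like view the
        # article describes, with no hidden information.
        return actions.FunctionCall(_NO_OP, [])
```

Once StarCraft II and the map packs are installed, an agent like this can be pointed at one of the bundled mini-games from the command line, for example with something along the lines of python -m pysc2.bin.agent --map MoveToBeacon --agent mymodule.IdleAgent (flag and map names as documented by pysc2; mymodule is a placeholder for wherever the class is saved).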