Now that Google has proven that its DeepMind artificial intelligence can play the notoriously complicated board game Go at a world-class level, what's next?
"'StarCraft,' I think, is our likely next target," Google Senior Fellow Jeff Dean said at today's Structure Data event in San Francisco.
Dean is referring to the 1998 smash-hit PC game "StarCraft," which casts players as supreme commanders in an interstellar conflict between humans, the bug-like Zerg, and the psychic warrior Protoss. Players compete to gather resources, build their forces, and outmaneuver their opponents.
Board games like Go and chess are what researchers call "perfect information" games, where both players have total awareness of everything that happens on the board at all times.
"The thing about Go is obviously you can see everything on the board, so that makes it slightly easier for computers," Google DeepMind founder Demis Hassabis told The Verge recently.
Meanwhile, games like "StarCraft" and its sequel keep your opponents' moves largely hidden, at least until you come into conflict — skilled players watch closely for clues as to their opponents' strategy and try to anticipate their next move.
Of course, when Google is ready to test DeepMind's "StarCraft" skills, there is no shortage of skilled players who might be willing to step up: "StarCraft" birthed an intensely competitive professional gaming scene and became a major spectator sport in South Korea.
"You have to keep track of things happening off the screen," Dean says.
This means that Google's DeepMind would face a brand-new challenge: trying to outguess its opponent, and reacting if and when that opponent comes up with something totally unexpected. It would test a new set of skills for artificial intelligence.
"Ultimately we want to apply this to big real-world problems," Hassabis told The Verge.