DeepMind AI to play video game to learn about the world

Google’s DeepMind is teaming up with the makers of the StarCraft video game to train its artificial intelligence systems. The AI systems “playing” the game will need to learn strategies similar to those that humans need in the real world, DeepMind said.

Its ultimate aim is to develop artificial intelligence that could solve any problem. It has previously taught algorithms to play a range of Atari computer games.


StarCraft II, made by developer Blizzard, is a real-time strategy game in which players control one of three warring factions: the human Terrans, the insectoid Zerg, or the alien Protoss. Players' actions are constrained by the in-game economy: minerals and gas must be gathered in order to produce new buildings and units.

Each player can see only the parts of the map within range of their own units and must send units to scout unseen areas in order to gain information about their opponents.

DeepMind famously developed AlphaGo, an algorithm that plays the complex board game Go and has beaten one of the world's best players.

The game will be opened up to other AI researchers next year. DeepMind said it hoped the new environment would be "widely used to advance the state of the art" in artificial intelligence.
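
For researchers, the appeal is that StarCraft II can be driven by software agents through the familiar observe-act-reward loop used in reinforcement learning. The sketch below is purely illustrative: the DummyEnv, ScoutingAgent and step/reset names are hypothetical stand-ins under assumed interfaces, not DeepMind's or Blizzard's actual API.

```python
# Hypothetical sketch of the agent-environment loop such a research platform
# might expose. All names here are illustrative, not the real interface.
import random


class DummyEnv:
    """Stand-in environment: the real game would supply observations and rewards."""

    def __init__(self, episode_length=100):
        self.episode_length = episode_length
        self.t = 0

    def reset(self):
        self.t = 0
        return {"visible_map": []}          # only the area near the player's own units

    def step(self, action):
        self.t += 1
        reward = 1.0 if action == "gather_minerals" else 0.0
        done = self.t >= self.episode_length
        return {"visible_map": []}, reward, done


class ScoutingAgent:
    """Toy policy: mostly grow the economy, occasionally scout unseen areas."""

    def act(self, observation):
        if random.random() < 0.1:
            return "scout_unseen_area"       # gain information about the opponent
        return "gather_minerals"             # feed the in-game economy


def run_episode(env, agent):
    """Standard observe-act-reward loop used in reinforcement-learning research."""
    observation = env.reset()
    total_reward = 0.0
    done = False
    while not done:
        action = agent.act(observation)
        observation, reward, done = env.step(action)
        total_reward += reward
    return total_reward


if __name__ == "__main__":
    print(run_episode(DummyEnv(), ScoutingAgent()))
```

A real agent would have to learn such trade-offs (economy versus scouting) from partial observations, which is what makes the game an interesting testbed.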
