
The world’s best gamers may one day compete against the smartest computers

Google cut power usage in its data centers by several percentage points earlier this year by trusting artificially intelligent software originally designed to beat 1980s-era Atari video games.

And in the years to come, the Internet giant could not only save much more electricity but also solve far larger problems by taking on a much more complex video game.

Research scientists at Google’s DeepMind unit announced Friday they are developing a computer program that reads data about Blizzard Entertainment’s “StarCraft II” games and learns how to play on its own. The software would have to figure out how to split its attention between micromanagement and long-term strategic decisions. It’s that maneuvering that could deliver big breakthroughs.

DeepMind sees endless possibilities to apply development lessons to programs in the energy, healthcare and technology sectors.

The Irvine video game company imagines voice-synthesized broadcasters, coaching bots and virtual players whose strength is tailored to their human counterparts. Games could become more intricate and personalized.

Along the way, e-sports battles could reach a new level with machine vs. human and machine vs. machine competitions.

“I can’t wait to see the day where Google’s AI or someone else’s is playing with the best ‘StarCraft’ players in an e-sports arena,” said Gio Hunt, Blizzard’s executive vice president for corporate operations. “You could absolutely see a world with Google’s AI vs. Facebook’s AI vs. IBM’s AI.”

Video games and artificial intelligence have long gone together. But the computer opponents in video games stick to a script: they look for certain triggers and respond based on written commands. The new wave of AI that Google, Amazon, Apple, Facebook and many others are pursuing relies on rule-less programs that analyze troves of data to discover the most appropriate responses on their own.
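The scripted approach the article contrasts against can be sketched in a few lines: a fixed table of triggers mapped to fixed responses, with no learning anywhere. The trigger and command names below are purely hypothetical, chosen only to illustrate the idea.

```python
# Illustrative sketch of a scripted game opponent: fixed triggers map to
# fixed responses, with no learning involved. All names are hypothetical.

SCRIPT = {
    "player_builds_air_units": "build_anti_air",
    "player_expands_base": "send_raid_party",
    "player_attacks": "defend_main_base",
}

def scripted_response(observed_trigger, default="gather_resources"):
    """Return the pre-written command for a trigger, or a default action."""
    return SCRIPT.get(observed_trigger, default)
```

However many entries such a table holds, the opponent can never respond to a situation its authors did not anticipate, which is exactly the limitation data-driven approaches aim to remove.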

In the case of video games, it's similar to how beginning players watch tutorial videos on YouTube, study how others play and put in a ton of practice themselves. Over time, they pick up on the best tactics for a given scenario. Looking at a wide spectrum of matches, the computer can determine whether a specific move at a given point increased the chances of winning.
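The simplest version of "did this move increase the chances of winning" is an empirical win rate: compare outcomes of matches where a move was used against matches where it wasn't. This toy sketch uses made-up match records and move names; real systems estimate far subtler, context-dependent values.

```python
# Hypothetical match records: each lists the moves used and the outcome.
matches = [
    {"moves": {"early_expand", "rush"}, "won": True},
    {"moves": {"early_expand"}, "won": True},
    {"moves": {"rush"}, "won": False},
    {"moves": {"turtle"}, "won": False},
    {"moves": {"early_expand", "turtle"}, "won": True},
    {"moves": {"rush", "turtle"}, "won": False},
]

def win_rate(matches, move, present=True):
    """Fraction of matches won among those where `move` was (or wasn't) used."""
    relevant = [m for m in matches if (move in m["moves"]) == present]
    if not relevant:
        return None
    return sum(m["won"] for m in relevant) / len(relevant)
```

Comparing `win_rate(matches, "early_expand")` with `win_rate(matches, "early_expand", present=False)` gives a crude signal of whether the move correlates with winning across the observed games.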

“You have an agent and an environment,” DeepMind research scientist Oriol Vinyals said. “The agent observes and acts upon it. There’s several ways to act on an environment, and the agent has to do what’s best.”
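The agent-and-environment loop Vinyals describes can be sketched as a minimal toy: an environment that rewards one hidden action, and an agent that tracks each action's average reward and mostly picks the best one. This is an illustrative exercise only, far removed from the scale of the actual research.

```python
import random

class GuessEnvironment:
    """Toy environment: reward 1 when the agent's action matches a hidden target."""
    def __init__(self, target=2, rounds=500, seed=0):
        self.target = target
        self.rounds = rounds
        random.seed(seed)

    def step(self, action):
        return 1.0 if action == self.target else 0.0

class AveragingAgent:
    """Tracks the average reward of each action and usually picks the best one."""
    def __init__(self, n_actions=4, epsilon=0.1):
        self.totals = [0.0] * n_actions
        self.counts = [0] * n_actions
        self.epsilon = epsilon  # fraction of the time spent exploring at random

    def act(self):
        if random.random() < self.epsilon or not any(self.counts):
            return random.randrange(len(self.totals))
        averages = [t / c if c else 0.0 for t, c in zip(self.totals, self.counts)]
        return averages.index(max(averages))

    def observe(self, action, reward):
        self.totals[action] += reward
        self.counts[action] += 1

env = GuessEnvironment()
agent = AveragingAgent()
for _ in range(env.rounds):
    action = agent.act()          # the agent acts on the environment...
    reward = env.step(action)     # ...the environment responds...
    agent.observe(action, reward) # ...and the agent updates from the outcome
```

After enough rounds, the agent's reward averages single out the hidden target action, which mirrors the score-driven feedback the Atari experiments used, where the on-screen score served as the reward signal.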

For 49 Atari games, DeepMind fed the computer data about what was on the screen during play, including the score. Whether thrown into a boxing ring or onto a racing course, the machine took over from there. It usually got close to or exceeded human scores. But Atari games are regarded as simple, with few levers and buttons to control.

“StarCraft II” promises to advance the way the programs learn for several reasons. Players in the sci-fi strategy game can’t see the entire playing surface, meaning bots that play the game will function more like the ones that play poker and less like those that have conquered Go and chess. Users have to guess or spy on what their opponents are doing. It’s messy, like typical scenarios in reality.

DeepMind wants to limit the AI to what humans would encounter in another way too. The game isn’t as simple as moving pieces around a board. Players must mine resources, deploy soldiers and launch attacks — and the order in which they act matters. The delicate balancing act is something computers don’t do well yet, and DeepMind isn’t going to let its software issue commands any faster than a human could.
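One way to picture the constraint of not issuing commands faster than a human could is a simple rate limiter that drops commands arriving too close together. The cap value and the mechanics below are illustrative assumptions, not DeepMind's actual mechanism.

```python
# Minimal sketch of capping a bot's command rate at human-like speed.
# The cap and the explicit clock are illustrative assumptions.

class RateLimitedIssuer:
    """Drop commands that arrive faster than `max_per_second` allows."""
    def __init__(self, max_per_second=5.0):
        self.min_interval = 1.0 / max_per_second
        self.last_issued = None
        self.issued = []

    def try_issue(self, command, now):
        """Issue `command` at time `now` (seconds) unless it comes too soon."""
        if self.last_issued is not None and now - self.last_issued < self.min_interval:
            return False  # too fast for a human; the command is dropped
        self.last_issued = now
        self.issued.append(command)
        return True
```

Under such a cap, a bot can no longer win by sheer mechanical speed; it has to win on the quality of its decisions, which is the point of the handicap.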

The intention is to generate new ideas about how to develop the training programs for computers and help them measure the value of different options. For example, in the data centers, Google’s systems might not know what caused a huge power spike — but they can take some guesses and act accordingly.

“Ideally, it will translate to a better cooling system,” Vinyals said, noting that power usage control is just one of endless possibilities to bring lessons out of virtual reality.

Though DeepMind is spearheading the initial effort on “StarCraft II,” Blizzard Entertainment plans to enable anyone to tap into its systems for similar AI research starting early next year. The public should be able to access replays of online matches as well as the data and inputs needed for bots to compete in games.

Blizzard executives said the company is just scratching the surface in terms of making use of its petabytes of data. It’s now in the hunt for someone to be the “database and analytics expert for the company,” according to a job posting.

Sigaty noted for example how his team has discussed making statistics from “StarCraft II” matches publicly available for other companies to use in their apps. But it hasn’t made a call on whether to do so.

Working with DeepMind is the bridge, he said, to start understanding how machine analysis can make its data “even more impactful.”

Source: LA Times