
'Machine learning' is a revolution as big as the internet or personal computers



We're in the middle of a historic moment. 

It used to be the case that you had to program a computer so that it knew how to do things. Now computers can learn from experience.

The breakthrough is called "machine learning." It's unimaginably important for understanding where technology is going, and where society is going with it. 

Netflix's movie recommendations, Amazon's product recommendations, Facebook's ability to spot your friends' faces, dating apps matching you with potential dates — these are all early examples of machine learning. 

And Google's self-driving car is becoming the classic case study. 

"A self-driving car is not programmed to drive itself," says University of Washington computer scientist Pedro Domingos, author of "The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World."

"Nobody actually knows how to program a car to drive," he says. "We know how to drive, but we can’t even explain it to ourselves. The Google car learned by driving millions of miles, and observing people driving." 

That's the key: machine learning allows algorithms to learn through experience, and do things we don't know how to make programs for. 
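That idea can be made concrete with a toy sketch (the data and the overheating scenario here are hypothetical, invented purely for illustration): rather than a programmer hand-coding a rule like "flag anything above 85 degrees," the program derives the cutoff itself from labeled examples.

```python
# A minimal sketch of "learning from experience": instead of hand-coding
# a rule, the program fits a decision threshold to labeled examples.

def learn_threshold(examples):
    """Pick the cutoff that best separates the two labels.

    examples: list of (value, label) pairs, where label is 0 or 1.
    """
    values = sorted(v for v, _ in examples)
    # Candidate cutoffs: midpoints between consecutive observed values.
    candidates = [(a + b) / 2 for a, b in zip(values, values[1:])]

    def accuracy(cutoff):
        # Fraction of examples correctly classified by "value >= cutoff".
        hits = sum((v >= cutoff) == bool(label) for v, label in examples)
        return hits / len(examples)

    return max(candidates, key=accuracy)

# Hypothetical "experience": engine temperatures labeled 1 if overheating.
data = [(60, 0), (65, 0), (70, 0), (90, 1), (95, 1), (100, 1)]
print(learn_threshold(data))  # → 80.0, a cutoff learned from the data
```

The rule the program ends up with was never written by a human; it falls out of the examples. Real machine-learning systems do the same thing with millions of parameters instead of one threshold.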

Machine learning had a major public breakthrough in March, when Google made artificial intelligence history by creating an algorithm that mastered Go, the ancient Chinese game with more possible board configurations than there are atoms in the universe. Google's AlphaGo program beat Lee Sedol, perhaps the greatest human Go player alive.  

But Google couldn't program an algorithm to conquer Go. It had to create a sophisticated algorithm that could process 80 years' worth of publicly available Go games, and learn what good moves look like from studying them. 

To Domingos, machine learning is as big a breakthrough as personal computers, the internet, or electricity itself. 

"There were two stages to the information age," Domingos says. "One stage is where we had to program computers, and the second stage, which is now beginning, is where computers can program themselves by looking at data." 

Perhaps that's why Google's Eric Schmidt says that every big startup over the next five years will have one thing in common: machine learning.


