- The biggest improvement coming to Google's search engine in the next 10 years is likely to be the ability to understand context and the intention behind your query, said Daniel M. Russell, a senior research scientist who has been with Google for 14 years.
- That's important for Google, considering most people type roughly two words when performing a search, which doesn't give the search engine much context to work with.
- Google recently took another step in this direction by applying a natural-language-processing model to its search tool that enables it to understand the context of a word by looking at the words that surround it in a phrase or sentence.
Think about what it was like to do a Google search 10 years ago, in 2009.
It probably would have involved opening your laptop or booting up your desktop computer, typing in a few keywords, and hitting the search button. That may not sound very different from how you would look up the answer to a question through Google today, but chances are what happens after you hit that "search" button has changed drastically.
What was once a simple list of blue links has evolved into a richer stream of information that includes all types of content, from sports scores to facts about prominent public figures and more.
Over the next 10 years, the biggest change to Google will be less about how information is presented and more about how Google figures out the answers you're looking for.
In other words, Google is going to get much better at understanding what you're hoping to find when you enter a search query, according to a Google employee of 14 years.
"The general direction, I think, is going to be a deeper analysis of language," Daniel M. Russell, a senior research scientist for search quality and user happiness at Google, recently told Business Insider when asked how search would change over the next decade.
"Basically, we're going to get better at understanding the contents of webpages, and we're going to get better at understanding what you mean in your query," Russell said.
Google is already taking important steps in this direction. One of the biggest upgrades to Google in recent months, for example, has been the application of the company's BERT technology to search results.
BERT, which stands for "Bidirectional Encoder Representations from Transformers," is a natural-language-processing technique that enables Google to better understand the context of a word by looking at the words that come before and after it. That makes it easier for the search engine to understand the reason behind your search so that it can pull up the desired answer.
Such improvements are increasingly important for Google, considering the average search query is only about two words long, which leaves the search engine with very little context to work with.
For example, if you type in the word "jaguar," you could be looking for the animal, the car, or the Mac operating system from 2002. But if you add another search term like "South America," Google understands you're probably looking for more information about the animal.
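To get a rough feel for how that kind of context-reading works, here is a minimal sketch. It is not Google's search system; it simply uses the open-source Hugging Face transformers library and a publicly released BERT model to guess a hidden word from the words on either side of it.

```python
# A rough illustration, not Google's production search stack: load a
# publicly released BERT model via the open-source Hugging Face
# "transformers" library and predict a masked word from its surroundings.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The same blank gets very different guesses depending on the words
# around it, because the model reads the whole sentence in both
# directions rather than looking at one keyword in isolation.
for sentence in [
    "The jaguar prowled through the [MASK] hunting for prey.",
    "The jaguar sped down the [MASK] at 150 miles per hour.",
]:
    top_guess = fill_mask(sentence)[0]
    print(sentence, "->", top_guess["token_str"])
```

The point is the mechanism, not the particular outputs: the model's guess for the blank depends on every word around it, which is the same kind of clue that lets a system like BERT tell the animal apart from the car.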
Google's goal for search in the long run is to get even better at making those kinds of connections, even with shorter queries.
"We're getting very good at guessing," Russell said. "But we're guessing."