Google Search Improves How it Understands Natural Language with BERT
Date: 28 October 2019
Google announced that it is improving the way Search understands natural language by applying BERT models to search queries, helping the search engine understand the context of the words in a query and deliver more helpful results.
BERT (Bidirectional Encoder Representations from Transformers) is an open-source, neural network-based technique for natural language processing (NLP) pre-training. It allows Google to consider the full context of a word by looking at the words that come before and after it in a search query.
Google is applying the BERT model to ranking results and featured snippets in Search, helping it better understand one in 10 English-language searches in the U.S. Google is expected to extend BERT to other languages over time, since one of the characteristics of the system is that it can take the learnings from one language and apply them to others.
Using BERT, Google can now better understand the nuances and context of words in a search query and deliver more relevant results. Below are some of the examples Google provided of how results improve with BERT.
When searching for “2019 brazil traveler to usa need a visa”, the relationship of the word “to” with the other words in the query is important to understanding its meaning: the user is looking for information about a Brazilian traveling to the U.S., not about a U.S. citizen traveling to Brazil.
In another example, when searching “Can you get medicine for someone pharmacy”, without BERT Google didn’t grasp the importance of the phrase “for someone” and returned results about filling prescriptions in general. With BERT, it understands that the user is asking whether someone other than the patient can pick up their prescription.
As a final example, when searching for “Math practice books for adults”, the results without BERT included a book in the “Young Adult” category; with BERT, Google understands that “adult” is being matched out of context and provides a better result.
So, is it possible to optimize your content for BERT? Not really. As with RankBrain (the learning-based search algorithm introduced about five years ago that helps Google process search queries and deliver relevant results), there is no way to optimize for it. Instead, focus on improving your content and how you organize it, and remember that the content you write is for users, not for an algorithm.