In November 2019, Google released a major update to its search algorithm built on BERT. With the inclusion of BERT, the intent behind user search queries should be better understood, and the results returned should be more relevant.
All about the BERT Algorithm
Tech giant Google recently rolled out a major algorithm change built around BERT. If there's one thing Google has learned about Search over the past 15 years, it's that people's curiosity is endless. Every day, we see billions of searches, and 15 percent of those queries are ones we haven't seen before, so we've built ways to return results for queries we can't anticipate. The Google BERT algorithm update is the latest of these.
When people like you or me come to Search, we aren't always sure of the best way to formulate a query. We may not know the right words to use, or how to spell something, because we often come to Search to learn; we don't necessarily have the knowledge to begin with.
At its core, Search is about understanding language. It's our job to figure out what you're looking for and to surface helpful information from the web, no matter how you spell or combine the words in your query.
While we have continued to improve our ability to understand language over the years, we still don't always get it right, particularly with complex or conversational queries. In fact, that's one of the reasons people often use "keyword-ese," typing strings of words they think we'll understand rather than phrasing the question the way they'd naturally ask it.
With our research team’s latest advances in language understanding — made possible by machine learning — we are making a significant improvement in how we understand queries, representing the biggest leap forward in the past five years, and one of the biggest leaps forward in Search history.
Last year, we introduced and open-sourced a neural network-based technique for natural language processing (NLP) pre-training called Bidirectional Encoder Representations from Transformers, or BERT for short. This technology enables anyone to train their own state-of-the-art question answering system.
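To make that concrete, here is a minimal sketch of a BERT-based question answering system. It assumes the Hugging Face transformers library and a publicly available BERT checkpoint fine-tuned on the SQuAD dataset; neither is part of Google's announcement, they are simply a convenient way to try the technique.

```python
# A minimal sketch of BERT question answering, assuming the Hugging Face
# `transformers` library is installed (pip install transformers torch).
# The model name below is a public BERT checkpoint fine-tuned on SQuAD;
# a custom fine-tuned model could be swapped in instead.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

result = qa(
    question="What does BERT stand for?",
    context=(
        "BERT, or Bidirectional Encoder Representations from Transformers, "
        "is a neural network-based technique for natural language "
        "processing pre-training that Google open-sourced in 2018."
    ),
)
# Prints the extracted answer span and the model's confidence score.
print(result["answer"], result["score"])
```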
This breakthrough was the result of Google's research on Transformers: models that process words in relation to all the other words in a sentence, rather than one by one in order. BERT models can therefore consider the full context of a word by looking at the words that come before and after it, which is particularly useful for understanding the intent behind search queries.
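The bidirectional part is easiest to see through BERT's pre-training task, masked language modeling, where the model fills in a hidden word using the words on both sides of it. Below is a small sketch, again assuming the Hugging Face transformers library and the public bert-base-uncased checkpoint.

```python
# A small illustration of BERT's bidirectional context, assuming the
# Hugging Face `transformers` library. Masked language modeling is the
# task BERT is pre-trained on: the model predicts a hidden word by
# looking at the words on BOTH sides of it, not just the words before it.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

# In both sentences only the surrounding context tells the model what
# the masked word should be; a left-to-right model reading just
# "He sat on the ..." would have far less to go on.
for text in [
    "He sat on the [MASK] of the river and watched the water.",
    "She went to the [MASK] to deposit her paycheck.",
]:
    top = fill(text)[0]  # highest-scoring prediction
    print(f"{text} -> {top['token_str']} ({top['score']:.2f})")
```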