The Google BERT algorithm (Bidirectional Encoder Representations from Transformers)
The Google BERT algorithm (Bidirectional Encoder Representations from Transformers) is a deep learning model for natural language processing (NLP). It helps Google understand natural language better, particularly in conversational search; with the rise of voice search, this is a major upgrade. It is anticipated that this will have an impact on rankings.
What Google said
“These improvements are oriented around improving language understanding, particularly for more natural language/conversational queries, as BERT is able to help Search better understand the nuance and context of words in Searches and better match those queries with helpful results.
Particularly for longer, more conversational queries, or searches where prepositions like ‘for’ and ‘to’ matter a lot to the meaning, Search will be able to understand the context of the words in your query. You can search in a way that feels natural for you.”