SEO Update - What is Google BERT and how will it affect SEO?

What is Google BERT and how will it affect SEO?

BERT, which stands for Bidirectional Encoder Representations from Transformers, is Google’s latest algorithm update and its largest since RankBrain (2015). The update will affect about one in ten search queries. It was announced on the official Google blog on October 25th.

What is BERT? 

BERT is a neural network-based technique for improving natural language processing (NLP). It is the second major application of artificial intelligence to Google Search, after RankBrain.

NLP focuses on understanding how humans communicate and on enabling computers to simulate that communication, so that people can interact with machines in a more natural way. With BERT, Google aims to improve its interpretation of complex long-tail search queries and to provide more relevant search results.

With natural language processing in place, Google no longer relies entirely on keywords to determine which pages to deliver for a search query; it also draws on the syntax and sentiment of the content. This means an SEO strategy should be built with its primary focus on users rather than on search engine bots.

BERT improves Google’s understanding of the context in which words are used. Words whose meaning changes with context are now better understood by the algorithm, including so-called stop words such as “and”, “or”, “in”, and “of”.

[Image: Google AI now employs natural language processing in Search]

On Google’s official blog, Pandu Nayak, Vice President of Search, explains that users themselves sometimes don’t know how best to search for what they want. Although Google has refined its understanding of different languages over the years, its grasp of the more complex, conversational aspects of natural language was still lacking.

As a result, we often try to guess what the search robot will understand, typing only keywords and incomplete sentences rather than phrasing queries as we naturally would when talking to another person. Nayak noted that prepositions such as “to” and “for” previously carried little weight in ranking; thanks to BERT, the search engine is now able to understand how the meaning of an expression changes when these terms are used.
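To see why prepositions matter, consider a minimal Python sketch (a toy illustration, not Google’s actual implementation): a classic bag-of-keywords approach typically discards stop words, so two queries that differ only in a preposition, or in which direction a relation points, can collapse into the same keyword set.

```python
# Toy illustration: keyword matching with stop-word removal.
# The stop-word list and queries below are hypothetical examples.
STOP_WORDS = {"to", "for", "from", "in", "of", "and", "or"}

def keyword_set(query):
    """Bag-of-keywords view of a query, with stop words removed."""
    return {w for w in query.lower().split() if w not in STOP_WORDS}

q1 = "2019 brazil traveler to usa need a visa"
q2 = "2019 usa traveler to brazil need a visa"

# After stop-word removal, both queries collapse to the same keyword set,
# even though the direction of travel is reversed.
print(keyword_set(q1) == keyword_set(q2))  # True
```

A context-aware model like BERT, by contrast, reads the whole sentence in both directions, so “to” and the word order change what the query means.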

How will BERT affect SEO? 

A significant impact of this update is on the interpretation of featured snippets, once again favoring sites that use structured data. Google Assistant will now be able to process a question more effectively and then retrieve the answer.

For example:  

[Image: Google BERT search results sample]

Previously, the featured result displayed “taking a filled prescription”; now the chosen result reads something like “A patient can ask a friend to take a prescription…”.

There are other examples available in the original article.  
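Since structured data helps Google interpret page content for features like snippets, site owners commonly mark up question-and-answer content with schema.org JSON-LD. A hypothetical sketch (the question and answer text are placeholders, not from Google’s examples):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "Can someone pick up a prescription for a patient?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "A patient can ask a friend to pick up a prescription on their behalf."
    }
  }]
}
</script>
```

Markup like this does not guarantee a featured snippet, but it gives the search engine an explicit, machine-readable version of the page’s content to work from.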

This shows a significant evolution in the interpretation of searches. However, BERT has so far been fully applied only to English-language queries on Google, with plans to extend it to searches in other languages. We believe this adaptation will happen soon, since the underlying artificial intelligence can learn from one language and apply part of that knowledge to another.

What do you think about this new algorithm update? Want to know if this will have an impact on your SEO strategy and rankings? Call us today: +1 403-907-0997
