Allow us to introduce you to Bert. Or, more accurately, BERT. BERT stands for Bidirectional Encoder Representations from Transformers, and it is Google's biggest update to its search algorithm in a long time. It is also rather a mouthful that does not mean much to most of us. In essence, what Google has done with BERT is adjust its search algorithm to better understand natural language. The simplest way to grasp what that means is through something you are already familiar with: when you start typing a query into the Google search bar, the autocomplete feature predicts what you are likely trying to ask and completes your query before you have typed the whole thing out. BERT applies that same kind of language prediction to the meaning of your entire query.
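To make the idea of predicting language from context a little more concrete, here is a minimal sketch using the open-source Hugging Face transformers library and the publicly released bert-base-uncased model. This is not Google's production search system, just the public model doing the thing BERT is named for: reading the words on both sides of a gap before guessing what belongs in it.

```python
# A minimal sketch, not Google's search engine: the public BERT model
# predicting a hidden word from the words around it (bidirectional context).
from transformers import pipeline

# "fill-mask" loads BERT together with its masked-word prediction head.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT looks at the words on both sides of [MASK] before guessing.
for prediction in fill_mask("Do hairdressers [MASK] men's hair?")[:3]:
    print(prediction["token_str"], round(prediction["score"], 3))
```

The point is not the specific suggestions it prints, but that the model uses the whole sentence, to the left and to the right of the gap, to make them.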
However, where BERT really changes the game is in how it handles the small connecting words. Before its rollout, prepositions barely affected search results in the way they do now, and that is where natural language processing comes in. Previously, if you had searched “Do hairdressers cut men’s hair?”, your top results might have been articles comparing hairdresser and barber training. Now, because BERT understands the entire natural-language query, it is far more likely to give you a list of hairdressers that also cut men’s hair.
Google, with BERT’s help, now understands the context of the query instead of relying only on matching keywords to produce search results. This means the searcher receives more accurate and, importantly, more relevant results. The practical effect is that BERT hits top-of-the-funnel keywords hardest: simply targeting the most popular keywords will no longer get your content into the search results. BERT interprets the intent and context of the search query and delivers results that match that intent.
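For readers who want to see the difference between keyword matching and contextual understanding spelled out, here is a rough, hedged illustration. It assumes the open-source sentence-transformers package and the small BERT-style model all-MiniLM-L6-v2 as a stand-in for whatever Google runs internally, and the page titles are invented for the example.

```python
# A rough illustration, not Google's algorithm: comparing an old-style
# keyword-overlap signal with a BERT-style "meaning" signal.
# The model and page titles are stand-ins chosen for the example.
from sentence_transformers import SentenceTransformer, util

query = "do hairdressers cut men's hair"
pages = [
    "hairdresser vs barber training: how to cut hair professionally",
    "book a gents haircut at a salon near you",
]

def keyword_overlap(a: str, b: str) -> int:
    """Old-style signal: count the raw words the two texts share."""
    return len(set(a.lower().split()) & set(b.lower().split()))

# BERT-style signal: embed both texts and compare their meaning.
model = SentenceTransformer("all-MiniLM-L6-v2")
query_vec = model.encode(query)

for page in pages:
    meaning = util.cos_sim(query_vec, model.encode(page)).item()
    print(f"keywords={keyword_overlap(query, page)}  meaning={meaning:.2f}  {page}")
```

The keyword count rewards whichever page happens to repeat the query’s words; the similarity score is meant to reward the page that actually answers the question, which is roughly the shift BERT represents.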
This change is certainly a difficult adjustment for marketers and content creators alike. We have already examined how the sales funnel has been affected by the ready availability of information on the Internet, and how the funnel is no longer linear but follows a far more complex path. Now, with BERT, there is the added challenge of getting very specific with the content you publish. Under the “old rules”, creating long-form content was the best way to rank well in Search Engine Optimisation (SEO) terms, because of the number of keywords you could comfortably fit into a single piece. Long-form content still ranks well, but the rankings now weigh the quality of the content rather than the word count.
Content is still the best way to build an online presence and a following, but it will now have to move away from generalities and laser-focus on specific information, using the longer, more specific phrases that searchers are more likely to use the closer they get to the point of purchase, or when they are using voice search. It is no longer about keyword density – you will have to answer a real question in natural language. And to rank well, you will have to answer that question better than your competitors do.