The ability to easily find information online with a few strokes of a keyboard has changed people’s lives forever. Need an answer to a vexing question? Google it! Debating over who has the most records for whatever achievement you might imagine? Ask Siri!

Search engines emerged in 1994, with WebCrawler considered one of the first entrants into the market. WebCrawler was followed by AltaVista, Excite, Yahoo and, of course, Google, which captured the market and maintains a stronghold on it to this day.

Enter BERT

In October 2019, Google released what many are calling the biggest change to search in years—the BERT update. What is BERT?

BERT—or Bidirectional Encoder Representations from Transformers—is Google’s latest algorithm update, designed to improve its technology’s ability to understand the intent behind search queries, particularly voice queries.

While most casual users are unlikely to notice much of a difference in their own search activities, according to Ignite Visibility, Google has reported that BERT will affect one in ten searches. The good news for marketers and website managers is that, unlike earlier algorithm shifts, BERT is likely to produce better results while relieving what, for many, has felt like the artificial machinations required to boost SEO.

How BERT Works

BERT is specifically designed to address the nuances of voice search, which differs greatly from the searches people have typically typed as text. Voice search is more natural and tends to include longer (long-tail) phrases. Consequently, the BERT algorithm focuses on learning and processing natural language.

But that’s not all. According to Search Engine Journal, BERT was actually created by Google as a research project—a natural language processing (NLP) framework “that Google produced and then open-sourced so that the whole natural language processing research field could actually get better at natural language understanding overall”. 

More than just an update, BERT is a movement.

As Google explains in their blog post:

Last year, we introduced and open-sourced a neural network-based technique for natural language processing (NLP) pre-training called Bidirectional Encoder Representations from Transformers, or as we call it—BERT, for short. This technology enables anyone to train their own state-of-the-art question answering system.

BERT is an update to, but not a replacement of, RankBrain, which Google introduced in 2015. RankBrain is an artificial intelligence (AI) component of Google search designed to better understand the context of a search and drive users to relevant content. BERT goes beyond RankBrain’s functionality to address the nuances of human language by processing each word of a query in relation to the words around it—the ability to understand content in context. While RankBrain focused more on the authority of the content, BERT analyzes queries in context, weighing the words that clarify the searcher’s meaning in order to deliver better results.

The Benefits of BERT

The big benefit of BERT is a shift from worrying about exact matching for boosting SEO to considering how users use natural language in voice search to find the information they’re looking for. It’s a more user- than bot-focused approach to searching for and finding content — something that content creators are likely to applaud.

BERT will also improve voice search by boosting the ability of search algorithms to understand the nuances of search queries.

John Frigo, an SEO lead, explains: “BERT has to do with the context of searches. The best example I’ve heard was someone searching for ‘how to catch a spotted cow fishing.’ A spotted cow is a bass fish. Prior to BERT, Google would have been confused. You likely would have gotten results about cows or cattle. What BERT aims to do is use the word ‘fishing’ at the end to give context to the search and give more accurate search results.”
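The intuition behind Frigo’s example can be sketched with a toy disambiguation routine. To be clear, this is purely illustrative: the topic names and vocabularies below are made up for this example, and real systems like BERT learn contextual word representations from massive text corpora rather than relying on hand-built word lists. The sketch simply shows how an extra context word (“fishing”) can tip an ambiguous query toward the right interpretation.

```python
# Toy illustration (NOT how BERT actually works): a context word like
# "fishing" shifts the interpretation of an otherwise ambiguous query.
# Each candidate topic is scored by how many query words appear in its
# (hypothetical, hand-built) context vocabulary.

CONTEXT_VOCAB = {
    # Hypothetical vocabularies invented for this example only
    "cattle":  {"cow", "spotted", "farm", "dairy", "herd"},
    "fishing": {"cow", "bass", "fishing", "lure", "catch"},
}

def interpret(query: str) -> str:
    """Return the topic whose vocabulary overlaps most with the query."""
    words = set(query.lower().split())
    return max(CONTEXT_VOCAB, key=lambda topic: len(words & CONTEXT_VOCAB[topic]))

print(interpret("spotted cow"))                          # -> cattle
print(interpret("how to catch a spotted cow fishing"))   # -> fishing
```

Without the trailing “fishing,” the query words overlap most with the cattle vocabulary; adding it flips the score in favor of the fishing interpretation — the same effect Frigo describes, achieved by BERT through learned context rather than word lists.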

Websites are unlikely to be negatively impacted by BERT. The main potential hit may be a loss of search traffic that was being directed to sites inappropriately in the first place (as in Frigo’s spotted cow example). But that wasn’t valuable or relevant traffic anyway, so losing it should not affect real results.

Shifting Focus

For content creators who have never become comfortable with the rigor required of highly scientific SEO strategy, BERT should come as good news. It places a premium on conversational queries—the kind of queries that regular people (not bots) use when searching for information of interest. That means that content creators can use their natural ability to connect with an audience by creating super-specific content that focuses on long-tail keywords—the kind of keywords that people use when they speak with each other.  

Ultimately, it’s the quality of your content and its relevance to your audience that will generate the best results. By focusing on a few core strategies, marketers can readily rank higher on Google SERPs.

In 2020, BERT is likely to play a big—and welcome—role in search, specifically voice search; it’s certainly something to keep in mind as you plan your SEO strategy for the new year. 

Need help in planning your B2B search optimization strategy for 2020? We’ve got you covered. Contact KeyScouts, today.




About Tomer Harel

Tomer Harel is founder and CEO of KeyScouts. He has been practicing Internet marketing for over a decade, helping hundreds of businesses to thrive online. If you'd like to contact Tomer or have him speak at a conference, meeting or event, please drop him a line via email (tomer.harel at keyscouts dot com).
