There is no reason to fear the Google Algorithm

Every time there is a rumor of a Google algorithm update, a general panic ripples through the massive community of people who rely heavily on free traffic from Google’s search users. There is a collective holding of breath while the numbers are analyzed, followed (hopefully) by a sigh of relief at having survived the algorithm update unscathed.

After the update is released, and especially if it’s confirmed by Google, a slew of articles and analyses appears attempting to dissect what Google changed and how to win in the new paradigm.

In my opinion, all of this angst is entirely misplaced and is rooted in a fundamental misunderstanding of what actually happens in a Google algorithm update. The Google algorithm is made out to be some sort of mythical secret recipe, cooked up in a lab and designed to simultaneously rob and reward sites at the whims of a magical, all-knowing wizard. In this scenario, the goal of every SEO and webmaster is to dupe the wizard and come out on the winning side of every update.

Multiple Algorithms

Nothing could be further from the truth. Google’s algorithm isn’t even a single algorithm; rather, it’s a confluence of multiple algorithms. Google’s guide on how search works ALWAYS refers to the algorithm in the plural. Reading that page alongside tweets from Googlers that mention algorithms, it appears that there are three primary algorithms, each of which has a different purpose.

  1. Crawling – an algorithm designed to crawl and understand the entire web.
  2. Indexing – an algorithm that determines how to cache a webpage and which database tags should be used to categorize it.
  3. Ranking – somewhat self-explanatory; it uses the output of the first two algorithms to apply a ranking methodology to every page.

There is also arguably a fourth primary algorithm, which is tasked with understanding a user’s query and rewriting it into something else before the search engine queries the database. This is the algorithm affected by Google’s announcement of BERT.

Understood in this light, it makes a lot more sense how Google can claim to update its algorithm multiple times per day.
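To make that division of labor concrete, here is a toy sketch of the pipeline. This is purely a mental model, not Google’s actual code; every function here is hypothetical and wildly simplified compared to a real search engine:

```python
# A toy illustration of the crawl -> index -> rank -> query-understanding
# division of labor described above. Purely a mental model: every function
# here is hypothetical and wildly simplified compared to a real search engine.

def crawl(seed_urls):
    """Crawling: fetch pages and discover content across the web."""
    return {url: f"placeholder content fetched from {url}" for url in seed_urls}

def index(pages):
    """Indexing: decide how to cache each page and how to categorize it."""
    return {url: {"tokens": text.lower().split(), "category": "general"}
            for url, text in pages.items()}

def understand_query(query):
    """Query understanding: rewrite the raw query into what the user likely means."""
    return query.lower().split()

def rank(indexed, query):
    """Ranking: score every indexed page against the rewritten query."""
    terms = understand_query(query)
    scores = {url: sum(doc["tokens"].count(term) for term in terms)
              for url, doc in indexed.items()}
    return sorted(scores, key=scores.get, reverse=True)

# An "algorithm update" can patch any one stage without rewriting the others.
results = rank(index(crawl(["example.com/a", "example.com/b"])), "placeholder content")
print(results)
```

The point of the sketch is that each stage can be patched or improved independently, which is exactly why a steady drip of small updates is unremarkable.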

These algorithms are extensive and complex software programs that constantly need to be updated based on real-world scenarios. As search engineers find anomalies, they patch them just as a bug in any other software program would be patched. At any other company this would just be a bug fix, but in search it translates into an algorithm update.

Product updates

In any software company where the software is the product, there are product updates multiple times per year. Changes are always being made, some visible and others not so much. Facebook, for example, is constantly tweaking every aspect of its product; it didn’t just launch its news feed many years ago and leave it alone. Even our phone operating systems, whether Android or iOS, get a major update at least once per year.

Like any other software company, Google releases updates that take its product forward in big leaps; in Google’s case, however, they are called “major algorithm updates” instead of just product updates. The phrasing alone is enough to induce panic attacks.

Algorithms don’t hurt

Now with this knowledge of what exactly an algorithm update is, it is easier to understand why there really is never a reason to panic. When Google’s product managers determine that there are improvements to make in how the search product functions, they are usually tweaks at the margins. The updates are designed to address flaws in how users experience search. Much like a phone operating system leaps forward in a new update, Google’s major updates make significant improvements in user experiences.

If a site experiences a drop in search traffic after a major algorithm update, it is rarely because the entire site was targeted. Typically, while one collection of URLs may be demoted in search rankings, others more than likely improved.

Understanding what those leaps forward are requires taking a deep dive into Google Search Console to drill into which URLs saw drops in traffic and which saw gains. A site can certainly see a steep drop-off after an update, but that is simply because it had more losers than winners, and most definitely not because the algorithm punished it.

In many cases, sites might not have even lost traffic – they only lost impressions that were already not converting into clicks. Looking at the most recent update, where Google removed the organic listing for sites that hold the featured snippet, I have seen steep drops in impressions while clicks are virtually unchanged.
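For anyone who wants to run that check themselves, a minimal sketch of the analysis might look like this, assuming two CSV exports of Search Console’s Performance report broken down by page; the file names and the “Page”/“Clicks”/“Impressions” columns are assumptions to verify against your own export:

```python
# A minimal sketch of the per-URL winners/losers analysis described above,
# assuming two CSV exports of the Search Console Performance report broken
# down by page: one covering the weeks before the update, one covering the
# weeks after. File names and column names are assumptions to verify.
import pandas as pd

before = pd.read_csv("gsc_before_update.csv")  # hypothetical export
after = pd.read_csv("gsc_after_update.csv")    # hypothetical export

merged = before.merge(after, on="Page", how="outer",
                      suffixes=("_before", "_after")).fillna(0)
merged["click_delta"] = merged["Clicks_after"] - merged["Clicks_before"]
merged["impression_delta"] = merged["Impressions_after"] - merged["Impressions_before"]

# Site-wide totals show whether the "drop" is impressions only
print("Total click delta:      ", merged["click_delta"].sum())
print("Total impression delta: ", merged["impression_delta"].sum())

# Winners and losers at the URL level, not the site level
print(merged.nlargest(10, "click_delta")[["Page", "click_delta"]])
print(merged.nsmallest(10, "click_delta")[["Page", "click_delta"]])
```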

Declaring a site to be a winner or loser after an update neglects the granular data that might have led to the significant changes in traffic. It is for this reason that websites should not fear the algorithm if their primary focus is on providing an amazing, high-quality experience for users. The only websites that have something to fear are those that should not have had high search visibility in the first place because of a poor user experience.

Past algorithm updates

In recent times, it is rare for a site that provides a quality experience for users – defined as satisfying a user’s query intent – to have all of its URLs demoted in an update. If that does happen, the site was likely benefiting from a “bug” in how Google worked and was already living on borrowed time. Websites that exploit loopholes in the way Google ranks the web should always be aware that Google will eventually close the loophole for the good of the entire Internet.

There were certainly times in the more distant past when entire sites were targeted by algorithm updates, but that is no longer the case. Panda, which was designed to root out low-quality content; Penguin, which demoted unnatural links; and Medic, which demoted incorrect medical information, all had specific targets, but other sites were left relatively untouched. If a site was on the losing side of the algorithms prior to such an update because competitors were exploiting loopholes, it likely saw significant gains as those competitors dropped out of search.

Updates are a fact of search life

Google will, and should, continuously update its algorithms so that its product keeps evolving and retains its users. If it just left the algorithm alone, it would risk being overrun by spammers who take advantage of loopholes, and Google would go the way of AOL, Excite, Yahoo, and every other search engine that has faded into irrelevance.

Instead of chasing the algorithm, everyone who relies on search should maintain their focus on the user. The user is the ultimate customer of search, and serving users well immunizes a site against algorithm updates designed to protect the search experience. There is no algorithm wizard. The algorithm(s) have only one purpose, and that is to help a user find exactly what they seek.