There’s been much talk recently about Google implementing a broad core algorithm update.
A couple of weeks ago, webmasters started to notice changes to their search rankings which many suspected were due to an update to Google’s core algorithm. Google subsequently confirmed this via a tweet to its Search Liaison account, manned by former Search Engine Land editor and Search Engine Watch founder Danny Sullivan.
Each day, Google usually releases one or more changes designed to improve our results. Some are focused around specific improvements. Some are broad changes. Last week, we released a broad core algorithm update. We do these routinely several times per year….
— Google SearchLiaison (@searchliaison) March 12, 2018
Google has suggested that this update has nothing to do with the quality of content, and instead focuses on improving the quality of the SERPs. At SMX West, Nathan Johns, a search quality analyst at Google, stated in an AMA session that the core update was designed to “reward under-rewarded sites” rather than award penalties.
At Pi Datametrics, our data on organic search rankings would tend to confirm this, as the only real losses we’ve seen – while dramatic – were generally short-lived, and occurred in the run-up to the update itself.
However, if Google wasn’t testing quality, what exactly were they testing?
I turned to the SERPs to have a look, going back in time to the period just before, during and after the recent update. I asked Google a relatively simple question, then analyzed the results to detect any rumblings or suspicious flux.
Testing the Google broad core algorithm update
Google Query: What’s the best toothpaste?
I’ve focused primarily on content that was visible on page 1 or 2 at the start of this year.
We can clearly see that all these pages dropped out of the top 100 then reappeared on the same day. This occurred multiple times over a five week period.
Seven websites performed fairly well (visible on pages 1 and 2), with a further two sites that had no previous visibility — Expertreviews [dark pink] and Clevelandclinic [dark blue] — appearing midway through the shake-up.
The obvious shake-up started on January 24, roughly five weeks before the update was said to have fully rolled out (Sunday, March 4).
What we have here is a pattern we’ve seen many times before, something that is only visible with access to daily data on the entire SERPs landscape. It looks like a period of testing pre-full rollout, which is only to be expected.
Here’s the same chart, zoomed in from February 1:
In the chart above we can see the flux continuing from February 5 onwards. Every site involved experienced almost the exact same pattern of visibility loss.
Things finally settled down on March 8. At first glance, it looks like all sites regained their original positions.
However, on closer inspection we can see that all came out slightly worse off, by an average of just over two positions. The smallest drop was one position (which can still be painful on page one) and the largest was six.
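The before-and-after comparison above is simple to reproduce from rank-tracking exports. Here is a minimal sketch, using hypothetical rank data (the site names and positions are illustrative, not the actual figures behind the chart):

```python
# Hypothetical positions (rank in Google's top 100) for a tracked keyword,
# captured before the shake-up began and after things settled down.
ranks_before = {"site_a": 3, "site_b": 7, "site_c": 12, "site_d": 15}
ranks_after = {"site_a": 4, "site_b": 8, "site_c": 18, "site_d": 17}

# Per-site position change; positive means the site lost ground.
deltas = {site: ranks_after[site] - ranks_before[site] for site in ranks_before}

# Average positions lost across all tracked sites.
average_drop = sum(deltas.values()) / len(deltas)

print(deltas)
print(average_drop)
```

With daily data, running the same comparison across each day of the flux period shows whether losses are settling or still churning.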
Knowing when to act and when to sit tight
If this chart says one thing, it is DON’T PANIC if you drop out of the top 100 for a term you care about!
Just keep monitoring the SERPs every day. If you’ve ruled out content cannibalization, it could well be a period of algorithm testing – as with the broad core update.
If you’ve put the searcher first and created the kind of rich content that will satisfy them, then the chances are you will recover from these testing times.
Or maybe, like the Expertreviews site above — which injected a long-form, socially popular and recently updated piece of content into its ecosystem — you could even move from nowhere to position three, nudging all others down a peg.
Content that matched user intent was safe
The only two websites entirely unaffected by all of this were Reviews.com and Which.co.uk, suggesting that the combination of first-mover advantage, relevance and strong authority ensures high visibility and algorithmic stability:
So, the immediate questions are – who has benefited from this shake-up? What happened in the gaps between the spikes? Who’s lost out and why? Are we now seeing a SERP more aligned with the intent of the searcher?
Who benefited from the early shake-up?
It wasn’t Expertreviews or Clevelandclinic. They benefited later.
Let’s introduce some of the momentary winners who gained visibility during the downtime of all the other sites:
Wins for Business Insider, Colgate and Amazon
- Businessinsider.com benefited from the initial shake-up. It has some great content, but it’s not been updated since October 2017. It has been indexed all this time, but only really became visible when Google pushed the previously well positioned sites out. Result? It survived the shake-up and ended on page one.
- The same happened to the Colgate page. Note its /en-us/ URL path: arguably, it shouldn’t be visible in the UK at all. The page only provided a list of toothpaste types, e.g. ‘Fluoride’ or ‘Tartar control’. This didn’t answer my question or match my intent. Result? It dropped back to page five after the shake-up.
- The Amazon page simply displays a list of its bestsellers in toothpaste. From a content perspective, it’s not that inspiring. Result? Ended up dropping back to page three.
So the question is – if I were searching for “What’s the best toothpaste?” which of these new pages would I prefer?
All pages are mobile-friendly, but if I really wanted to know what the best toothpaste was, I’d definitely prefer to read the Businessinsider.com page – coincidentally the only page that moved up to page one following the shake-up and stayed there.
In other words, the only page to satisfy my intent was the only one that remained visible post-shake-up. This page, to me, answers my question perfectly.
What do these insights tell us about the core update?
Based on our testing, we can deduce that this algorithm is concerned with optimizing search results to support user intent, rather than to audit quality.
- Losses were not drastic, meaning we can rule out a penalty of any kind.
- Of all winners, none appeared to rise as a result of content updates.
- Some sites with strong, relevant content seemingly lost rankings in Google UK, as they were intended for the US market. This suggests that Google was auditing relevancy factors beyond just content (e.g. location and URL structure) to serve the best results and satisfy user intent.
In this respect, Google’s core update was concerned with the nature rather than the quality of content.
What better way to test the match of nature with intent than by shaking up the SERPs for a couple of weeks to determine user reaction?
Should you panic when your content visibility nosedives?
If your content visibility drops, it’s always necessary to carry out checks to ensure you have done everything within your power to mitigate the issue.
In the face of an algorithm update (like the recent broad core update), however, the best advice is to do nothing but monitor the SERPs closely.
If it is algorithmic testing, you almost certainly won’t be the only one involved. Other sites will follow the exact same pattern down to the day – a big clue that the cause is algorithmic rather than isolated. Talking to others within the SEO and webmaster communities can help you confirm that yours isn’t an isolated incident, and that you aren’t on the receiving end of a penalty from Google.
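The "same pattern down to the day" check is easy to automate if you track daily ranks for several domains. A minimal sketch, with hypothetical rank histories (101 stands in for "outside the top 100"):

```python
# Hypothetical daily rank histories for one keyword; site names are illustrative.
daily_ranks = {
    "yoursite.com": [5, 5, 101, 101, 6, 5],
    "competitor.com": [8, 9, 101, 101, 9, 8],
    "unrelated.com": [3, 3, 3, 3, 3, 3],
}

def dropout_days(series, cutoff=100):
    """Return the set of day indices on which a site fell outside the cutoff."""
    return {day for day, rank in enumerate(series) if rank > cutoff}

yours = dropout_days(daily_ranks["yoursite.com"])

# Sites sharing the exact same dropout days point to an algorithmic cause,
# not a penalty against your site alone.
matching = [site for site, series in daily_ranks.items()
            if site != "yoursite.com" and dropout_days(series) == yours]
print(matching)  # -> ['competitor.com']
```

If several unrelated domains drop out and reappear on identical days, that synchronicity is the signal to sit tight rather than start reworking content.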
Google has confirmed that sites that experienced ranking drops as a result of the broad core update aren’t necessarily doing anything wrong. As I stated at the beginning of this article, the losses that we did observe were short-lived and not drastic.
If you want to make sure that your content is insulated against future updates of this kind, focus on creating content that puts the searcher first and will satisfy user intent. But above all: don’t panic.
A version of this article was originally published to the Pi Datametrics blog.