Almost a year ago, Google began rolling out a series of updates in such quick succession that the SEO world started talking about the "infinite season of Google updates".

The fast-paced succession of three Core Updates (August, October and November), a Helpful Content Update (September), a Spam Update (October) and a Review Update (November) meant sudden death for many websites and a dramatic drop in visibility for others.

The March ’24 Core Update and the March ’24 Spam Update did not offer meaningful improvements for the many websites hit by the previous updates. On the contrary, in many cases, the situation got even worse.

As happened to other SEOs, many website owners contacted me looking for help in analyzing the reasons for their decline, which gave me the opportunity to investigate Google's updates with a sample of cases larger than any I had worked with since the Panda and Penguin days.

Although I share real case studies and real (albeit redacted) screenshots from Google Search Console and other tools, I will not reveal the names of the websites I analyzed, both because of NDAs and out of respect for my clients.

On the other hand, I do not think revealing the names of the websites would add real value to the insights I will share about why their organic traffic dropped and what actions the clients and I have started implementing to reverse the situation.

Case study one: “The Fall of the Roman Empire”.

 

Ruins of Rome (painting).

Sometimes a crawl cannot show us the real state of degradation a website is suffering.

Instead, it offers us a picture like the painting above: ancient buildings with obvious flaws that still seem capable of serving a practical function.

Technically, the Spanish portal I was asked to audit (a site with news and dedicated lead-generation landing pages targeting elderly people) looked relatively fine and did not present any dramatic issues.

However, Google hit it at almost every Core Update:

SISTRIX search visibility index of a portal about elderly people

One thing caught my attention, thanks to having configured the crawler to also crawl the organic landing pages it could retrieve from Search Console and Analytics: the presence of many orphan pages.

A page is defined as an orphan – as you surely know – when it is not linked by any page of a website, but it is still indexable.
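One way to surface orphan candidates is a simple set difference between the URLs discovered by the internal-link crawl and the landing pages exported from Search Console or Analytics. Here is a minimal sketch, assuming CSV exports; the file names and column headers are hypothetical and must be adapted to your own tools:

```python
# Minimal sketch, assuming CSV exports of the crawl and of the GSC/Analytics
# landing pages. File names and column headers are hypothetical placeholders.
import csv

def load_urls(path, column):
    """Read a CSV export and return the set of normalized URLs in one column."""
    with open(path, newline="", encoding="utf-8") as f:
        return {
            row[column].strip().rstrip("/")
            for row in csv.DictReader(f)
            if row.get(column)
        }

crawled = load_urls("crawl_internal_links.csv", "url")        # pages reachable via internal links
landing = load_urls("gsc_landing_pages.csv", "landing_page")  # pages Google actually sends traffic to

orphans = landing - crawled  # pages receiving organic traffic that no internal link points to
for url in sorted(orphans):
    print(url)
```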

Often such pages are listed only in the XML sitemap, but that was not the case here: they were old pages that, for some reason, were still live… and there were many of them.

I began to analyze them, pulled the thread, and soon discovered that they were URLs from a site architecture that predated the current one, an architecture that not only cannibalized the portal's current categorization but also offered contradictory information about the very structure of the content, both to search engines and to users.

Problems, however, never come alone, and this case was no exception.

Many of the articles in these zombie categories also turned out to be orphans; many were very old, and their content was no longer current.

For instance, the subcategory "New technologies" contained how-to articles about topics like "Microsoft Word 2003" or PDAs.

However, do not make the mistake of jumping to hasty conclusions. Not all "old" content should be pruned.

The problem arises when entire categories are made up only or largely of old content, and this was the case for many categories of the portal that I was analyzing.

Content decay, however, did not affect only these orphan categories but also the ones regularly navigable within the website, which, in theory, should have contained a greater share of fresh or recently updated content.

Articles hit by October 23 Core Update

Evidence of how the October '23 Core Update hit the portal's articles. They showed similar behavior in March '24.

Remember that this portal targets an audience of elderly people and that, therefore, many of its categories cover health, psychology, pensions, solutions such as retirement homes, inheritance and other legal topics.

In other words, topics that clearly fall within the YMYL concept, for which, at least officially, Google pays particular attention to the quality, correctness and timeliness of the information offered.

Even just applying the self-assessment questions Google presents in "Creating helpful, reliable, people-first content", the answer for many articles on the portal would have been "no".

The website also presented significant issues in other sections that had once been very successful, many of them playing on nostalgia: biographies of famous people the audience is (or was) a fan of, "what happened on this day", "dear memories" and so on.

None of these sections had been maintained over time, causing serious problems such as biographies talking about people as if they were still alive when they had died years earlier, or short articles about TV series featuring very old YouTube embeds with very poor video quality.

Maybe the website's owner took the old saying "if it works, don't change it" too literally… but users did not appreciate the mediocre quality of these pages, as the engagement metrics (time on page, bounce rate, pages per session) in Google Analytics indicated.

Taken individually, these problems may not seem so serious; they even look like the classic issues postponed to the end of a sprint because other things are perceived as more important.

However, over time they accumulate, and it is precisely this accumulation of similar small problems that Google's entire discussion of the "threshold" concept revolves around: the point beyond which a problem can trigger a negative evaluation of a website's overall quality.

What corrections had been planned for implementation?

The first recommendation I gave the client was to optimize the website's architecture, so as to create more consistency among its elements and eliminate the redundant, if not outright duplicated, ones.

The second, as you can imagine, was establishing a content-pruning process.

However, apart from eliminating the most obviously old and low-quality content, when planning the pruning I also consider metrics that are often overlooked in SEO, such as:

  • Page views, i.e. how many times a page has been seen.
  • How many sessions and total users landed on the page from channels other than organic?
  • How much did each page contribute, if it did, to conversions and micro-conversions?

As well as these less-considered SEO metrics:

  • The number of keywords for which each page generated an impression on Google Search.
  • The number and quality of backlinks each page has.
  • How much does each page contribute to internal linking with in-content inlinks?

Then, I set up these rules (a minimal sketch of how they could be scripted follows the list):

  1. If the collected metrics are mostly poor, the content must be flagged for deletion.
  2. If they are mostly positive, the content must be flagged for maintenance and its regular updates scheduled.
  3. If the metrics are very good, the URL is flagged as high-priority content, i.e. content whose performance must be tracked regularly, whose quality must be improved (e.g. replacing the embed in the case of YouTube videos) and which, if needed, must be updated before anything else so as to maintain its freshness.
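Here is a minimal sketch of how these three rules could be turned into a repeatable triage script. The metric names and thresholds are hypothetical placeholders to tune per website, not the values used in this audit:

```python
# Minimal sketch (hypothetical thresholds and field names): triage each page
# using the metrics listed above, already exported from Analytics/Search Console.

def score(page, thresholds):
    """Count how many metrics clear their minimum threshold."""
    return sum(1 for metric, minimum in thresholds.items() if page.get(metric, 0) >= minimum)

# Example thresholds; tune them per site, they are NOT universal values.
THRESHOLDS = {
    "pageviews": 200,            # views in the analyzed period
    "non_organic_sessions": 50,  # sessions from channels other than organic
    "conversions": 1,            # conversions or micro-conversions assisted
    "ranking_keywords": 10,      # keywords generating impressions in Search
    "backlinks": 1,              # external links pointing to the page
    "in_content_inlinks": 3,     # internal in-content links received
}

def triage(page):
    hits = score(page, THRESHOLDS)
    total = len(THRESHOLDS)
    if hits <= total // 3:        # mostly poor -> rule 1
        return "flag_for_deletion"
    if hits < total:              # mostly positive -> rule 2
        return "maintain_and_schedule_updates"
    return "high_priority"        # all metrics strong -> rule 3

example_page = {"url": "/old-article", "pageviews": 15, "backlinks": 0}
print(example_page["url"], triage(example_page))  # -> flag_for_deletion
```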

Case study two: when too much programmatic SEO is too much.

Another client, a website specialising in rural homes that contacted me after being hit by Google's updates, also identified content as the reason for its problems.

The client's own diagnosis: the Helpful Content Update.

The most important section of the website in terms of organic traffic was its blog, and its visibility decreased in coincidence with the rollout of the HCU:

12 months search visibility index

Blaming the Helpful Content Update seemed correct at first sight: the evidence was clear, as the blog lost positions for almost 2,500 keywords ranking in the top 10:

Ranking decrease after Google update

As you can easily notice, the losses average around 2 to 3 positions, but that is enough to represent a significant decrease in CTR and, therefore, in organic traffic.
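To give a sense of scale, here is a purely illustrative calculation; the CTR-by-position values are rough assumptions (real curves vary by query and SERP layout), not measured data from this website:

```python
# Illustrative arithmetic only: the CTR-by-position figures below are assumed,
# rounded values. The point is how a 2-3 position slide can roughly halve the
# clicks a single keyword brings.
ASSUMED_CTR = {1: 0.28, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05, 6: 0.04}

monthly_searches = 1000          # hypothetical search volume for one keyword
before, after = 3, 5             # an average slide of 2 positions

clicks_before = monthly_searches * ASSUMED_CTR[before]   # ~100 clicks
clicks_after = monthly_searches * ASSUMED_CTR[after]     # ~50 clicks
print(f"Estimated click loss: {clicks_before - clicks_after:.0f} per month")
```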

However, this did not look like the classic annihilation we saw so many examples of in September 2023, but something milder.

Moreover, the classic HCU-triggering problems did not apply in this case:

  • This is a "corporate" blog that does not rely on display ads for its revenue.
  • The authors are not anonymous (on the contrary, they are well known in the industry and considered experts).
  • The website itself is not "yet-another-travel-affiliate-website", but a recognized portal in its niche.
  • Finally, it seemed to be the only website losing visibility for a type of content that dozens of other websites published, and still publish, in almost the same style:

Screenshot SERP "Pueblos con encanto" in Google.es

What about seasonality?

Below you can see how much searches for "rural homes" decreased in Spain between the second week of August and the second and third weeks of September:

Casas rurales Google trends

It would not surprise me to see the SEO visibility of a website decrease and increase following seasonality patterns.

Google Search Console, moreover, was not indicating a loss of traffic during or just after the HCU:

Google Search Console Performance view

There were potentially problematic patterns that may have triggered a mild hit, some of which surely had a stronger impact later on an already weakened website (March 2024), but many of them were shared by other websites that suffered no negative effect from the HCU.

So I decided to remove the noise of the blog from my analyses and investigate the rural homes catalogue section, and there I found what I consider the cause of the hit from the update that represented the real "Balrog" for this website: the October '23 Core Update (again).

If you look again at the Search Console graph above, you can see that the drop coincides with the October Core Update (and the Spam Update):

GSC Performance. Effects of October 23 Core Update

The loss is clear and, reviewing the rankings with SISTRIX, the evidence was that 56,989 of the 65,769 non-blog keywords ranking in the top 10 lost positions.

Was the loss due to Google changing how it painted the search results? Not really. The search features were practically the same as before the updates, as were the perceived search intents.

What was the problem then?

The architecture, again: it had been inflated to extreme and dangerous limits through programmatic SEO, generating thousands of new categories and faceted navigation pages that caused the following issues (a detection sketch follows the list):

  • Thin content, i.e. PLPs listing only one or two rural homes, if not none at all because no available rural home matched the page.
  • Substantial duplicate content when two localities shared the same rural homes because of how the database had been built.
  • Extreme dilution of link equity because of an over-scaled internal linking policy, which caused the most important PLPs to see their "authority" dry up.
  • The majority of these PLPs targeted queries with no search volume, which, considering the relative importance of user engagement signals on the SERP, translated into 0 clicks and almost 0 impressions.
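As a rough illustration, here is a minimal sketch of how such thin and duplicated PLPs could be flagged before they go live, assuming an export that maps each PLP URL to the set of rural-home IDs it lists; the data structure, URLs and thresholds are hypothetical:

```python
# Minimal sketch (hypothetical export format): audit generated PLPs for the two
# problems above, thin listings and near-duplicate inventories.
from collections import defaultdict

def audit_plps(plp_inventory, min_homes=3):
    """plp_inventory: {plp_url: set of rural-home IDs listed on that page}."""
    thin = [url for url, homes in plp_inventory.items() if len(homes) < min_homes]

    # Group PLPs that list exactly the same set of homes (duplicate-content risk).
    by_inventory = defaultdict(list)
    for url, homes in plp_inventory.items():
        by_inventory[frozenset(homes)].append(url)
    duplicates = [urls for urls in by_inventory.values() if len(urls) > 1]

    return thin, duplicates

plps = {
    "/casas-rurales/puerta-del-sol": {"h1", "h2"},
    "/casas-rurales/parque-del-retiro": {"h1", "h2"},
    "/casas-rurales/sierra-de-gredos": {"h3", "h4", "h5", "h6"},
}
thin, dupes = audit_plps(plps)
print("Thin PLPs:", thin)
print("Duplicate inventories:", dupes)
```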

The most evident case was a section created to present rural houses close to some important landmarks.

While it is an interesting idea, it can cause enormous problems if pushed to its limits.

For instance, one PLP targeted "rural homes close to Puerta del Sol" (the central square of Madrid) and another "rural homes close to Parque del Retiro" (the most important public park in Madrid):

  1. Rural homes are… rural, so it is simply weird to create PLPs for this type of lodging in a big city (a capital city in this case). A search like "rural homes close to Madrid" is, obviously, another story.
  2. Even if “rural homes” existed in the city of Madrid, the distance between Puerta del Sol and Parque del Retiro is so short that the rural homes would inevitably be the same.

walking distance between Puerta del Sol and Parque del Retiro in Madrid

Unfortunately, this problem was not diagnosed until my consultation, which took place after the March 2024 Core Update had hit the website again, and with even greater force.

What corrections had been planned for implementation?

  • Cleaning the architecture to eliminate every faceted navigation and category PLP considered useless both for users and for the visibility of the website.
  • Reviewing and optimizing the database of rural homes and the rules for the automated creation of PLPs (e.g. reducing the geographical radius used to decide whether a rural home is related to a location, which previously was very large and caused substantial duplication; see the sketch after this list).
  • Reviewing and optimizing the automated internal linking rules to better prioritize the most important PLPs.
  • Creating a calendar for navigation and internal linking review and optimization to better target destinations depending on seasonality.
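To illustrate the radius correction mentioned above, here is a minimal sketch under assumed data structures (coordinates for homes and localities); the 15 km radius and the minimum-inventory rule are placeholder values for illustration, not the ones actually implemented:

```python
# Minimal sketch (hypothetical data structures): associate a rural home with a
# locality only if it falls within a tighter radius, and generate a PLP only
# when the locality has enough distinct inventory.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def homes_for_locality(locality, homes, max_km=15):
    """Return only the homes close enough to justify listing them for this locality."""
    return [
        home for home in homes
        if haversine_km(locality["lat"], locality["lon"], home["lat"], home["lon"]) <= max_km
    ]

def should_generate_plp(locality, homes, min_homes=3, max_km=15):
    """Skip PLPs that would end up thin or duplicated."""
    return len(homes_for_locality(locality, homes, max_km)) >= min_homes
```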

What about the blog?

Earlier I wrote that it had problems that caused the HCU to hit it mildly in September.

However, those same issues, now combined with the PLP problems (remember that the Helpful Content Update became an integral part of Google's core algorithm in March 2024), generated a perfect storm that made the blog lose 50% of its clicks and impressions:

march 24 core update gsc

But why, as also seen in September, had many other websites creating substantially the same kind of content not been hit? On the contrary, many had been rewarded with better visibility.

The keyword is “threshold”, as in the first case study.

Look at the blog posts that lost the most traffic. What do they have in common?

Patterns

Yes, you are right: the blog posts had been written following repetitive patterns, mostly listicles with not much useful information.

Not that the competitors' websites did not do the same, but the client did it at too big a scale.

The quality was no worse than that of other blogs not hit by the March '24 Core Update (let's say they all present average quality), but the frequency and quantity on the client's blog were greater… enough to pass whatever tolerance threshold for "low to average quality content" Google may have set.

The recommendations were these:

  • To transform the listicles into pillar pages.
  • Where the listicle format is maintained, to make it more useful by offering not generic information available on every other website, but answers to questions such as:
    • What can’t we miss when visiting this location?
    • What do WE as experts in organizing travels want to highlight that is not normally mentioned as an obligatory visit?
    • Does it offer interesting things to see or do if we travel with children or without?
    • When is the best time to visit the location?
    • How can we reach that location (not just by car)? And so on.
  • To create content clusters that go into detail on the theme introduced in the pillar (detailed guides to the individual Pueblos Bonitos or the hiking routes or castle X in locality Y…).
  • To optimize the internal linking of content hubs.
  • To optimize the internal linking between related pieces of content belonging to different content hubs through smarter use of classic elements such as tags (a minimal sketch of this follows the list).
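Here is a minimal sketch of the tag-based cross-hub linking idea; the URLs, hubs and tags are hypothetical examples purely for illustration:

```python
# Minimal sketch (hypothetical content model): suggest related posts across
# different content hubs through shared tags.
from collections import defaultdict

posts = [
    {"url": "/guia-pueblos-bonitos-asturias", "hub": "pueblos", "tags": {"asturias", "senderismo"}},
    {"url": "/rutas-senderismo-picos-europa", "hub": "rutas", "tags": {"asturias", "senderismo"}},
    {"url": "/castillos-castilla-leon", "hub": "castillos", "tags": {"castilla-leon"}},
]

def related_by_tags(posts, max_links=5):
    """For each post, suggest posts from OTHER hubs sharing at least one tag."""
    by_tag = defaultdict(list)
    for post in posts:
        for tag in post["tags"]:
            by_tag[tag].append(post)

    suggestions = {}
    for post in posts:
        candidates = {
            other["url"]
            for tag in post["tags"]
            for other in by_tag[tag]
            if other["url"] != post["url"] and other["hub"] != post["hub"]
        }
        suggestions[post["url"]] = sorted(candidates)[:max_links]
    return suggestions

for url, links in related_by_tags(posts).items():
    print(url, "->", links)
```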

Conclusions.

These are just two of the many cases I have analyzed of websites affected by one or more of the updates Google began launching a year ago.

Some had their peculiarities, like, for example, a site that confused Google because, after years of writing mainly football news and earning excellent backlinks for its scoops, it suddenly decided to eliminate that type of content and publish only gossip articles, erasing all the historical thematic relevance it had acquired over years of work.

But in general they all present patterns in common with those described in the two case studies presented in this post.

They are:

  1. A progressive decline in the technical quality of the website due, I cannot describe it differently, to a more relaxed attention that allowed an accumulation of problems that are trivial on their own but become serious when they pile up.
  2. A deterioration of the website's architecture, if not an architecture that was poorly structured from the beginning. I will never tire of insisting that many SEO problems arise from poor architecture.
  3. "In medio stat virtus," said the Scholastic philosophers, paraphrasing Aristotle. In every case, a website was penalized for not knowing how to maintain that balance. Programmatic SEO is fantastic and even necessary on many occasions; too much programmatic SEO inevitably leads to disaster. Likewise, you should not insist on the same type of format and content just because it "works"; its constant repetition feels unnatural, scaled and spammy.
  4. Many of the websites created content by looking at what their better-ranking competitors published, instead of basing it on what their potential users want to find and responding to those searches and needs in a unique way, different from everyone else. Why should Google reward the umpteenth piece of content that says nothing different from others that, perhaps, users know better and, therefore, trust more than you?

Remember, as I explained in a previous post, that Google itself tells us clearly through its search features what users are looking for when they perform a search. Very few use those suggestions; they simply "copy" what others do. Then they claim theirs is better content, but it is the same old soup with a different name.

Finally, updates (especially Core Updates) touch so many variables that you cannot simply walk into Mordor and think that one solution or a few ("eliminate ads", "write only about things you have experience with" and similar) can truly resolve the reasons why your website or a client's website has been negatively hit, also because sometimes those solutions cannot really be implemented (no ads? Then what business model can be used instead? Direct experience? You cannot always have "direct experience" of everything thematically related to your main topics).

Dedicate your time to deeply analyzing each section of your website using Search Console and Analytics, helping yourself with Looker Studio dashboards like the one by Aleyda Solis and extensions like GSC Guardian. Only by doing so can you discover the truly problematic areas.

Never forget that sometimes there is nothing wrong with the website at all: Google has simply reshaped the SERP with new search features and changed the search intent for the queries the website was prominently ranking for. Search evolves, and so should every SEO strategy.

If your website has been hit by one or more Google updates and you need help analyzing the effects and deciding what strategy and tactics to implement to recover lost visibility and traffic, why don't you send me a note using the contact form on the homepage?