Case Study: How the Cookie Monster Ate 22% of Our Visibility

The author’s views are entirely his or her own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.

Last year, the team at Homeday — one of the leading property tech companies in Germany — made the decision to migrate to a new content management system (CMS). The goals of the migration were, among other things, improved site speed and a state-of-the-art, future-proof website with all the necessary features. One of the main motivators for the migration was to enable content editors to work more freely when creating pages, without the help of developers.

After evaluating several CMS options, we decided on Contentful for its modern technology stack, which offers a great experience for both editors and developers. From a technical standpoint, Contentful, as a headless CMS, lets us choose which rendering strategy we want to use.

We’re currently carrying out the migration in several stages, or waves, to reduce the risk of problems with a large-scale negative impact. During the first wave, we encountered an issue with our cookie consent, which led to a visibility loss of almost 22% within five days. In this article I’ll describe the problems we faced during this first migration wave and how we resolved them.

Setting up the first test wave

For the first test wave we chose 10 SEO pages with high traffic but low conversion rates. We set up an infrastructure for reporting and monitoring those 10 pages:

  • Rank tracking for the most relevant keywords

  • SEO dashboard (Data Studio, Moz Pro, Semrush, Search Console, Google Analytics)

  • Regular crawls

After a thorough planning and testing phase, we migrated the first 10 SEO pages to the new CMS in December 2021. Although several issues surfaced during the testing phase (increased loading times, a larger HTML Document Object Model, etc.), we decided to go live, as we didn’t see any significant blockers and we wanted to migrate the first test wave before Christmas.

First performance review

Excited about completing the first step of the migration, we took a look at the performance of the migrated pages the next day.

What we saw next really didn’t please us.

Overnight, the visibility of tracked keywords for the migrated pages decreased from 62.35% to 53.59% — we lost 8.76 percentage points of visibility in one day.

As a result of this steep drop in rankings, we conducted another extensive round of testing. Among other things, we checked for coverage/indexing issues, whether all meta tags were included, structured data, internal links, page speed, and mobile friendliness.

Second performance review

All the articles had a cache date after the migration, and the content was fully indexed and being read by Google. Moreover, we could rule out several migration risk factors (changes to URLs, content, meta tags, layout, etc.) as sources of error, since nothing there had changed.

Visibility of our tracked keywords dropped again to 40.60% over the next few days, making it a total drop of almost 22 percentage points within five days. This was also clearly visible in comparison to the competition for the tracked keywords (here, “estimated traffic”), where the trend looked analogous.

Data from Semrush, for the specified keyword set of the tracked keywords of migrated pages

As other migration risk factors as well as Google updates had been excluded as sources of error, it almost certainly had to be a technical issue. Too much JavaScript, low Core Web Vitals scores, or a larger, more complex Document Object Model (DOM) could all be potential causes. The DOM represents a page as objects and nodes so that programming languages like JavaScript can interact with the page and change, for example, its style, structure, and content.

Following the cookie crumbs

We had to identify the problem as quickly as possible, do rapid bug fixing, and limit further negative effects and traffic drops. We finally got the first real hint of the technical cause when one of our tools showed us that both the number of pages with high external linking and the number of pages exceeding the maximum content size had gone up. It’s important that pages don’t exceed the maximum content size, as pages with a very large amount of body content may not be fully indexed. Regarding the high external linking, it’s important that all external links are trustworthy and relevant for users. It was suspicious that the number of external links went up out of nowhere like this.

Increase of URLs with high external linking (more than 10)
Increase of URLs which exceed the specified maximum content size (51,200 bytes)

Both metrics were disproportionately high compared to the number of pages we migrated. But why?
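The kind of threshold check our crawl tool applies can be sketched in a few lines of JavaScript. This is an illustrative sketch, not our tooling: the function names are invented, and the 10-link and 51,200-byte limits are simply the thresholds from the crawl report above.

```javascript
// Sketch of a per-page audit against the two crawl-report thresholds.
// Function names and the naive href regex are illustrative only.
const MAX_EXTERNAL_LINKS = 10;
const MAX_CONTENT_BYTES = 51200;

function countExternalLinks(html, ownHost) {
  // Collect absolute hrefs and keep those pointing to other hosts.
  const hrefs = [...html.matchAll(/href="(https?:\/\/[^"]+)"/g)].map(m => m[1]);
  return hrefs.filter(url => new URL(url).host !== ownHost).length;
}

function auditPage(html, ownHost) {
  const externalLinks = countExternalLinks(html, ownHost);
  const contentBytes = Buffer.byteLength(html, "utf8");
  return {
    externalLinks,
    contentBytes,
    tooManyExternalLinks: externalLinks > MAX_EXTERNAL_LINKS,
    exceedsMaxContentSize: contentBytes > MAX_CONTENT_BYTES,
  };
}
```

Run across a wave of migrated pages, a check like this surfaces the same anomaly the crawl tool reported: both flags flipping for pages that should not have changed.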

After checking which external links had been added to the migrated pages, we realized that Google was reading and indexing the cookie consent form on all migrated pages. We performed a site search, checking for the content of the cookie consent, and saw our theory confirmed:

A site search confirmed that the cookie consent was indexed by Google

This led to several problems:

  1. A lot of duplicate content was created for every page due to the indexing of the cookie consent form.

  2. The content size of the migrated pages increased significantly. This is a problem because pages with a very large amount of body content may not be fully indexed.

  3. The number of external outgoing links increased significantly.

  4. Our snippets suddenly showed a date on the SERPs. This would suggest a blog or news article, although most content on Homeday is evergreen. In addition, due to the date appearing, the meta description was cut off.

But why was this happening? According to our service provider, Cookiebot, search engine crawlers access websites simulating full consent. They therefore gain access to all content, and the copy from the cookie consent banner is not indexed by the crawler.

So why wasn’t this the case for the migrated pages? We crawled and rendered the pages with different user agents, but still couldn’t find a trace of the Cookiebot in the source code.

Investigating Google’s DOMs and searching for a solution

The migrated pages are rendered with dynamic data that comes from Contentful and plugins. The plugins consist of pure JavaScript code, and sometimes they come from a partner. One of these plugins was the cookie manager partner, which fetches the cookie consent HTML from outside our code base. That’s why we didn’t find a trace of the cookie consent HTML in the HTML source files in the first place. We did see a larger DOM, but traced that back to Nuxt’s default, more complex, larger DOM. Nuxt is the JavaScript framework we work with.

To verify that Google was reading the copy from the cookie consent banner, we used the URL Inspection tool in Google Search Console. We compared the DOM of a migrated page with the DOM of a non-migrated page. Within the DOM of a migrated page, we finally found the cookie consent content:

Within the DOM of a migrated page we found the cookie consent content
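A quick way to run this comparison systematically is to search the rendered HTML (for example, copied out of the URL Inspection tool) for phrases that only ever appear in the consent banner. The sketch below is illustrative: the phrases are placeholders, not our actual banner copy, and the two-match rule is just a way to avoid false positives from a single coincidental phrase.

```javascript
// Check rendered HTML for consent-banner copy that should not be there.
// The phrase list is a placeholder, not Homeday's actual banner text.
const BANNER_PHRASES = [
  "we use cookies",
  "marketing cookies",
  "show details",
];

function bannerPhrasesFound(renderedHtml) {
  const text = renderedHtml.toLowerCase();
  return BANNER_PHRASES.filter(phrase => text.includes(phrase));
}

function bannerLeaked(renderedHtml) {
  // Two or more distinct banner phrases is a strong signal,
  // rather than a single coincidental match.
  return bannerPhrasesFound(renderedHtml).length >= 2;
}
```

Running such a check on the rendered DOM of every migrated page, and comparing against a non-migrated page, makes the leak visible at a glance instead of requiring a manual diff per page.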

Something else that caught our attention was the set of JavaScript files loaded on our old pages versus the files loaded on our migrated pages. Our website has two scripts for the cookie consent banner, provided by a third party: one to show the banner and capture the consent (uc), and one that imports the banner content (cd).

  • The only script loaded on our old pages was uc.js, which is responsible for the cookie consent banner. It’s the one script we need on every page to handle user consent. It displays the cookie consent banner without its content being indexed, and saves the user’s decision (whether they agree or disagree to the use of cookies).

  • On the migrated pages, besides uc.js, a cd.js file was also loading. If we have a page where we want to show more information about our cookies to the user and have that cookie information indexed, then we have to use cd.js. We assumed that the two files depended on each other, which isn’t true: uc.js can run alone. The cd.js file was the reason the content of the cookie banner got rendered and indexed.

It took a while to find this because we thought the second file was just a prerequisite for the first one. We found that simply removing the loaded cd.js file was the solution.
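In a Nuxt app, third-party scripts are typically declared in the `head` section of `nuxt.config.js`, so the fix amounts to keeping only the consent script there. The fragment below is a hedged sketch of what such a corrected configuration might look like — the `src` URL pattern follows Cookiebot’s documented uc.js embed, but the `data-cbid` value is a placeholder, and this is not Homeday’s actual configuration:

```javascript
// nuxt.config.js (fragment) — illustrative sketch, not the real config.
export default {
  head: {
    script: [
      // uc.js shows the consent banner and stores the user's choice.
      // It is the only Cookiebot script a regular page needs.
      {
        id: "Cookiebot",
        src: "https://consent.cookiebot.com/uc.js",
        "data-cbid": "00000000-0000-0000-0000-000000000000", // placeholder ID
        async: true,
      },
      // cd.js (the cookie *declaration* script) is deliberately absent:
      // it injects the full cookie table into the page body, which is
      // exactly the content Google was rendering and indexing.
    ],
  },
};
```

Keeping the declaration script confined to a dedicated cookie policy page — the one place where indexing that content is actually desirable — preserves the banner’s function everywhere else without the duplicate-content side effect.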

Performance review after implementing the solution

The day we deleted the file, our keyword visibility was at 41.70%, which was still 21 percentage points lower than pre-migration.

However, the day after deleting the file, our visibility increased to 50.77%, and the next day it was almost back to normal at 60.11%. The estimated traffic behaved similarly. What a relief!

Soon after implementing the solution, the organic traffic went back to pre-migration levels

Summary

I can imagine that many SEOs have dealt with small problems like this one. It seems trivial, but it led to a significant drop in visibility and traffic during the migration. This is why I suggest migrating in waves and blocking enough time for investigating technical problems before and after the migration. Furthermore, keeping a close eye on the site’s performance in the weeks after the migration is crucial. These are definitely my key takeaways from this migration wave. We just completed the second migration wave at the beginning of May 2022, and I can report that so far no major bugs have appeared. We’ll have two more waves and will hopefully complete the migration successfully by the end of June 2022.

The performance of the migrated pages is almost back to normal now, and we will continue with the next wave.