Start here: how our SEO split tests work
If you aren't familiar with the fundamentals of how we run controlled SEO experiments that form the basis of all our case studies, then you might find it useful to start by reading the explanation at the end of this article before digesting the details of the case study below. If you'd like to get a new case study by email every two weeks, just enter your email address here.
For this week's #SPQuiz, we asked our followers on X/Twitter and LinkedIn what they thought the impact of removing SEO content from an e-commerce customer's category pages would be.
Here’s what they thought:
Poll Results:
Poll results on X/Twitter and LinkedIn revealed that the majority of voters expected a negative outcome, with only a small minority leaning towards a positive result.
The Case Study
Search engines rely primarily on whatever content they can find on a page to understand its subject and determine which search queries it is relevant for. Product listing pages (PLPs) that serve as category hubs on e-commerce sites often contain little more than long lists of links, with minimal content of their own, and some SEOs have expressed concerns that these pages could appear “thin” to search engines.
Historically, many have considered it an SEO best practice to add “SEO content” to category pages to improve their rankings. Others have disputed this, citing low user engagement with this type of content and Google’s instructions to create content for humans, not bots.
We hypothesized that the SEO content on category pages was irrelevant and doing more harm than good, and that removing it would increase our customer’s rankings for the more relevant keywords remaining on the pages.
What was changed
We deleted the SEO content at the bottom of the customer’s category pages.
Results
Our test analysis revealed that removing SEO content from category pages did lead to a statistically significant increase in organic traffic from mobile devices, while the influence on organic traffic from desktop devices was negligible:
[Chart: organic traffic impact – mobile devices]
[Chart: organic traffic impact – desktop devices]
Our best theory for why this impact was limited to mobile traffic is that the SEO content added to the scroll depth of the page and was perhaps negatively impacting the user experience on smaller screens.
We followed up by retesting this hypothesis on another part of their site that hosted bot-focused SEO content – and saw a positive uplift to mobile traffic there as well. Seeing the same positive impact on multiple parts of their site helped reaffirm our experiment results.
This week's experiment is a good example of why it's important to A/B test SEO changes rather than relying on the promise of SEO best practices.
How our SEO split tests work
The most important thing to know is that our case studies are based on controlled experiments with control and variant pages:
- By detecting changes in performance of the variant pages compared to the control, we know that the measured effect was not caused by seasonality, sitewide changes, Google algorithm updates, competitor changes, or any other external impact.
- The statistical analysis compares the actual outcome to a forecast, and comes with a confidence interval so we know how certain we are that the effect is real (a simplified sketch of this forecast-and-compare approach follows this list).
- We measure the impact on organic traffic in order to capture changes to rankings and/or changes to clickthrough rate (more here).
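To make the forecast-and-compare idea concrete, here is a minimal sketch in Python. The daily session counts, the simple ratio-based counterfactual model, and the bootstrap confidence interval are all our own illustrative assumptions for this example, not SearchPilot's production methodology:

```python
import numpy as np

# Hypothetical daily organic sessions for the pre-test and test periods.
# These numbers are illustrative only; a real analysis uses the site's own data.
control_pre  = np.array([1040, 1010,  990, 1030, 1005,  995, 1020])
variant_pre  = np.array([ 520,  505,  495,  515,  500,  498,  510])
control_test = np.array([1015, 1000, 1025,  990, 1010, 1005, 1020])
variant_test = np.array([ 560,  548,  555,  542,  558,  550,  547])

# Forecast what the variant pages *would* have done without the change by
# scaling the control group's test-period traffic by the historical
# variant-to-control ratio (a deliberately simple counterfactual model).
baseline_ratio = variant_pre.sum() / control_pre.sum()
forecast_variant = control_test * baseline_ratio

# Observed daily uplift = actual variant traffic minus the forecast.
uplift = variant_test - forecast_variant

# Bootstrap a 95% confidence interval for the mean daily uplift.
rng = np.random.default_rng(42)
boot_means = np.array([
    rng.choice(uplift, size=uplift.size, replace=True).mean()
    for _ in range(10_000)
])
low, high = np.percentile(boot_means, [2.5, 97.5])

print(f"Mean daily uplift: {uplift.mean():.1f} sessions")
print(f"95% CI: [{low:.1f}, {high:.1f}]")
# If the whole interval sits above zero, we would treat the uplift as
# statistically significant; an interval straddling zero is inconclusive.
```

Because the forecast is built from the control group's behaviour during the test period, sitewide shifts such as seasonality or algorithm updates affect both groups and largely cancel out of the comparison.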
Read more about how SEO A/B testing works or get a demo of the SearchPilot platform.