SEO Tip: Enterprise Level Diagnosis for Massive Site-Wide Traffic Drop


When we talk about enterprise SEO, we’re referring to websites with thousands to millions of web pages. For such a behemoth of a website, where do you start with troubleshooting a massive traffic drop?

The best way to go about it is to start from the big picture, rule out each possible factor and drill down from there. Here’s a first-step diagnosis plan that will hopefully come in handy for webmasters in charge of enterprise SEO.

Use a process of elimination to answer one question: which of the following is the likely cause of the drop?

  1. Algorithm Changes
  2. Manual Penalty
  3. Server Accessibility
  4. Website Deployment Error
  5. Tracking Error

Before you start panicking, first check your data to single out the most likely cause for the traffic drop. Here’s how to go about it.

Algorithm Changes

Do a thorough search for news of recent algorithm updates. See what other webmasters are saying and whether anyone else is experiencing traffic fluctuations. You can use tools such as MozCast and Algoroo, which closely monitor algorithm updates and make fairly accurate predictions. Match any changes and updates against your current SEO efforts and website status, and look for anything that was once fine but has become a problem under the new ranking criteria and factors.

If it’s due to algorithm changes, check who is winning and who is losing. Use tools such as Ahrefs, SEOMoz and Screaming Frog to see how your competitors are doing. For those winning, work out what it is they are doing right that matches the algorithm change. You can start learning from there.

The recommendation is to benchmark against your keyword competitors rather than blindly following what is said in blogs and forums, because SEO is a ranking game: ‘one man’s loss is another man’s gain’, and vice versa. When you lose traffic, that same amount of traffic goes to someone else; when you gain a certain amount of traffic, some combination of other websites is losing it.

Manual Penalty

Check whether your site has received any manual penalties or security alerts. You can do this in Webmaster Tools (Search Console): go to the ‘Search Traffic’ menu and click the ‘Manual Actions’ sub-menu. If there is no penalty, it should say ‘No manual webspam actions found’. If a penalty was issued, the details will be shown here.

Next, go to the ‘Crawl’ menu and click the ‘Security Issues’ sub-menu. If Google has detected malware, content injection or any other hacking or security issue on your website, the details will be shown here.

Sometimes automated penalties are issued and you receive no notice of them. To identify one, check whether you’re experiencing the same traffic drop on Bing and other search engines. If all is well elsewhere and the only drop is from Google, then it’s likely you’ve been issued an automated penalty.
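This cross-engine comparison can be automated once you export organic sessions segmented by source. Here is a minimal sketch; the session counts and the -20% alert threshold are hypothetical placeholders, not figures from any real dataset.

```python
# Compare week-over-week organic traffic change per search engine.
# A drop isolated to one engine hints at a penalty or algorithm change
# on that engine rather than a site-wide technical problem.

def pct_change(before, after):
    """Percentage change from one period to the next."""
    return (after - before) / before * 100

# Weekly organic sessions: (week before the drop, week of the drop).
# Placeholder numbers; export your own from analytics, split by source.
organic = {
    "google": (120_000, 66_000),
    "bing":   (9_500, 9_200),
    "yandex": (1_100, 1_050),
}

for engine, (before, after) in organic.items():
    change = pct_change(before, after)
    flag = "  <-- investigate" if change < -20 else ""
    print(f"{engine:>8}: {change:+.1f}%{flag}")
```

Running this against real exports makes the "Google only, or everywhere?" question a one-glance answer instead of a manual spreadsheet exercise.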

Receiving a manual penalty is perhaps the most difficult challenge to address. It means someone at Google has likely reviewed your website and business by hand, then intervened by lowering some of your SEO scores because they feel your site is not providing real value to Google search users and thus should not be in the top ranks.

In the case of a manual penalty, no matter how well you do on every other SEO factor, your ability to rank has been capped, and you will not be able to rank higher until the penalty has been lifted.

Server Accessibility

Algorithm changes and manual penalties tend to result in ranking drops (with or without a loss of indexed pages).

If you’re actively monitoring your rankings for top keywords, key phrases or landing pages and find no ranking drop, check whether your pages are being removed from the Google index. There are two ways to go about this. First, in Search Console, go to the ‘Crawl’ menu and click the ‘Crawl Stats’ sub-menu. If the graph looks out of the ordinary, such as spikes in time spent downloading a page or sharp drops in pages crawled per day, that’s a warning sign.

It means Googlebot is having problems accessing your website and collecting page information. If there are no algorithm changes and no manual or security issues but you’re still experiencing traffic loss, the second check is your server logs and bandwidth. You may want to consider increasing bandwidth or blocking less valuable crawlers and scrapers.
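The server-log side of this check can be scripted. The sketch below assumes your access log is in the common Apache/Nginx combined format; the sample lines are illustrative, and you would adjust the regular expression to your own log layout. A spike in 5xx responses to Googlebot, or a sudden drop in its daily request count, corroborates an accessibility problem.

```python
# Tally Googlebot requests per day and per HTTP status code from an
# access log in combined format, to spot crawl accessibility problems.
import re
from collections import Counter

LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[(?P<day>[^:]+)[^\]]*\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_stats(lines):
    """Return (Googlebot hits per day, Googlebot hits per status code)."""
    per_day, per_status = Counter(), Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m and "Googlebot" in m.group("agent"):
            per_day[m.group("day")] += 1
            per_status[m.group("status")] += 1
    return per_day, per_status

# Illustrative sample lines; in practice, iterate over the real log file.
sample = [
    '66.249.66.1 - - [10/Oct/2016:13:55:36 +0000] "GET /p/1 HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Oct/2016:13:55:40 +0000] "GET /p/2 HTTP/1.1" 503 0 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '10.0.0.7 - - [10/Oct/2016:13:56:00 +0000] "GET / HTTP/1.1" 200 1024 "-" "Mozilla/5.0"',
]
days, statuses = googlebot_stats(sample)
print(days, statuses)
```

Note that scrapers often spoof the Googlebot user agent, so for a definitive view you would also verify the client IPs via reverse DNS.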

Website Deployment Error

Sometimes when you deploy a new release or version of the website, the updated code can affect certain on-site SEO factors such as microdata markup, certain tags, internal linking and the site structure of your PLPs (preferred landing pages).

Go through all website changes, feature launches and hotfixes that took place from one month before the traffic drop began. Keep in mind that it may take time for Google and other search engines to relearn your web pages.
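One way to make this review systematic is to extract the SEO-critical signals from each PLP before and after a release and diff them. The sketch below uses only the standard library and checks three such signals (title, canonical URL, meta robots); the HTML snippets are illustrative, not from any real site.

```python
# Extract a few SEO-critical signals from an HTML page so they can be
# diffed across releases (old vs new version of a landing page).
from html.parser import HTMLParser

class SEOSignals(HTMLParser):
    """Collects <title>, rel=canonical and meta robots from a page."""
    def __init__(self):
        super().__init__()
        self.signals = {}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "link" and a.get("rel") == "canonical":
            self.signals["canonical"] = a.get("href")
        elif tag == "meta" and a.get("name", "").lower() == "robots":
            self.signals["robots"] = a.get("content")

    def handle_data(self, data):
        if self._in_title:
            self.signals["title"] = data.strip()

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

def extract(html):
    parser = SEOSignals()
    parser.feed(html)
    return parser.signals

# Illustrative before/after markup; fetch your real PLPs in practice.
old = '<head><title>Red Shoes</title><link rel="canonical" href="/red-shoes"></head>'
new = '<head><title>Red Shoes</title><meta name="robots" content="noindex"></head>'

before, after = extract(old), extract(new)
diff = {k: (before.get(k), after.get(k))
        for k in set(before) | set(after)
        if before.get(k) != after.get(k)}
print(diff)
```

Here the diff would surface both the dropped canonical link and the newly added noindex directive, either of which could explain deindexing after a deployment.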

Tracking Error

You’re probably the luckiest person alive in the world of SEO if this is the case! Your Google Analytics data may show a huge drop, but it will have little to no impact on your business. The best way to diagnose a tracking error on a website with thousands to millions of pages is to check your server data and confirm that the number of hits, clicks and other activity is consistent over time. Another way is to proactively install a secondary analytics tool, such as Yandex Metrica or Piwik, to help you verify.
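The server-data comparison can be reduced to a simple daily divergence check. This is a sketch with hypothetical placeholder figures and an arbitrary 25% alert threshold; the idea is that a gap appearing only on the analytics side points to a broken tracking tag, not a real traffic drop.

```python
# Cross-check daily pageviews reported by analytics against raw
# server-side hit counts to spot tracking failures.

def divergence(analytics, server):
    """Per-day relative gap between server hits and analytics pageviews."""
    return {day: (server[day] - analytics[day]) / server[day]
            for day in analytics}

# Placeholder daily figures; export real ones from both systems.
analytics_pv = {"2016-10-01": 50_000, "2016-10-02": 49_500, "2016-10-03": 4_800}
server_hits  = {"2016-10-01": 52_000, "2016-10-02": 51_000, "2016-10-03": 50_500}

for day, gap in divergence(analytics_pv, server_hits).items():
    if gap > 0.25:  # analytics reports far fewer views than the server saw
        print(f"{day}: analytics is {gap:.0%} below server hits, check the tracking tag")
```

A small steady gap is normal (bots and ad blockers never fire the tracking tag); it is the sudden jump in the gap that signals a tracking error.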