A Checklist to Identify the Cause of Your Site’s Google Organic Search Traffic Loss

Checklist: Google Organic Search Traffic Loss

Have you checked your Web analytics only to find that your site has lost a high share of its organic search traffic over the last few days? Don’t panic! Here’s a checklist that will help you identify the cause (you can also download a PDF version here):

1. A Tracking Problem?

Verify whether your site is suffering a real, general traffic loss across all of your channels and pages, or a loss that is specific to certain areas or devices and caused by configuration issues, by checking the following:

  • Has your site been offline? Verify that your site is (and has been) online and working correctly, or whether it might have been down due to technical issues. You can be alerted when this happens by using an uptime service such as Pingdom Uptime Monitoring, UptimeRobot or Little Warden.
  • Do you have analytics tracking issues? Verify that your Google Analytics Web tracking setup is correct: check whether you have lost traffic from all sources or only from organic, and whether the loss affects all of your site’s pages or only certain areas. If it’s happening to all traffic and not only to organic, use Google Tag Assistant to verify the Google Analytics tracking configuration of your top pages, as the loss could be due to a configuration error in existing pages or in new ones that aren’t being tracked correctly, such as those generated when launching AMP, for which you will need to set up the Google AMP Client ID API service.

Make sure that your site is running smoothly and that all of your properties’ pages are correctly tracked before moving on.
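If you want a quick, scripted way to spot-check the uptime point above, here’s a minimal sketch in Python; it assumes the requests library is installed, and the URLs are placeholders for your own top pages:

```python
# Minimal uptime spot-check for a handful of key URLs.
# Assumes the "requests" package; replace the URLs with your own top pages.
import requests

URLS = [
    "https://www.example.com/",
    "https://www.example.com/top-category/",
]

for url in URLS:
    try:
        response = requests.get(url, timeout=10, allow_redirects=True)
        # Anything other than 200 (or an unexpected redirect chain) is worth investigating.
        print(f"{url} -> {response.status_code} (final URL: {response.url})")
    except requests.RequestException as exc:
        print(f"{url} -> request failed: {exc}")
```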

2. A Search Behavior Change?

Verify whether there has been a change in users’ search behavior due to seasonality or even a change of preferences:

  • Compare your current traffic trend with those of previous years: Use your Web analytics to see if they coincide. If there’s a difference, verify whether it only happens with your organic traffic or with other traffic sources too.
  • Identify your audience’s search volume trend for your targeted queries: If you don’t have historical data, use the Google Keyword Planner “Search volume trends” feature or the trend shown in most keyword tools, such as KWFinder, to see if it coincides with your traffic trend.

The goal is to identify whether the negative traffic trend is due to a lower number of searches and/or lower overall traffic activity caused by seasonality or any other change in audience behavior.
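If you keep a CSV export of your daily organic sessions (for example from Google Analytics), a small pandas script like the sketch below is a quick way to compare the current trend against previous years; the file name and column names (“date”, “sessions”) are assumptions you’d adapt to your own export:

```python
# Year-over-year comparison of daily organic sessions.
# Assumes a CSV export with "date" and "sessions" columns (hypothetical file name).
import pandas as pd

df = pd.read_csv("organic_sessions_daily.csv", parse_dates=["date"])
df["year"] = df["date"].dt.year
df["day_of_year"] = df["date"].dt.dayofyear

# Pivot so each year becomes a column, aligned by day of year.
yoy = df.pivot_table(index="day_of_year", columns="year", values="sessions")

# A drop that also appears in previous years points to seasonality rather than a ranking problem.
print(yoy.tail(30))  # the last ~30 days compared against the same period of previous years
```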

3. A Search Results Page Change?

See if there has been a change in the way your pages (or those of your competitors) are displayed in search results for the queries you have been ranking for, impacting their visibility and CTR.

The goal is to check whether your pages’ CTR in search results has decreased because of a change in SERP features: rich results, direct answers, inclusion in the Top Stories carousel due to an AMP implementation, the knowledge graph, maps, image or video results, or even the number of ads displayed in the SERPs:

  • Use your own Google Search Console CTR vs. impressions data from the Performance report: Look for drops in the Google Search Console “Search Analytics” (“Performance“) CTR vs. impressions trend over time for your top queries and pages per search type (Web, image, video), as well as in the rich results and AMP metrics under the “Search Appearance” and Enhancements reports.
  • Check keyword tools’ SERP features for your domain vs. the competition: Check your site’s SERP visibility metrics in SEMrush’s or Sistrix’s SERP features reports and compare them with those of your competitors, in case they are the ones who have increased their visibility, negatively impacting your pages’ CTR.

Before worrying about a potential rankings loss, first verify whether the negative traffic trend is due to a lower CTR caused by a change in the way your pages are displayed in search results and/or by their visibility vs. other sites or ads.
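If you prefer to pull the CTR vs. impressions trend programmatically rather than from the report UI, the Google Search Console Search Analytics API can be queried with a short script like the sketch below; it assumes the google-api-python-client and google-auth packages, a service account JSON key file (hypothetical name) that has been added as a user of the property, and a placeholder site URL:

```python
# Pull the daily CTR and impressions trend from the Search Console Search Analytics API.
from googleapiclient.discovery import build
from google.oauth2 import service_account

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
SITE_URL = "https://www.example.com/"  # placeholder: your verified property

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES  # hypothetical key file name
)
service = build("webmasters", "v3", credentials=credentials)

body = {
    "startDate": "2018-05-01",  # adjust to cover the weeks around the drop
    "endDate": "2018-06-30",
    "dimensions": ["date"],
    "searchType": "web",        # repeat for "image" and "video" if relevant
    "rowLimit": 1000,
}
response = service.searchanalytics().query(siteUrl=SITE_URL, body=body).execute()

for row in response.get("rows", []):
    day = row["keys"][0]
    print(day, f"CTR: {row['ctr']:.2%}", f"impressions: {int(row['impressions'])}")
```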

 4. A Real Drop in Organic Search Rankings?

If it hasn’t been a tracking issue, a change in user search behavior, or a SERP features change, it’s time to check whether the organic traffic drop is due to a real negative change in rankings and, if so, where and how it is affecting your site by:

  • Check your rankings:
    Verifying your rankings per query and page, for each device and search type, with the Google Search Console “Search Analytics” (“Performance”) section, SEMrush, Searchmetrics or Sistrix Visibility, in case you don’t specifically monitor your rankings with tools like SEOmonitor or STAT (which is also recommended), to identify the specific areas, page types and queries affected by the loss, in case it hasn’t affected the whole site.
  • Compare with your competitors’ rankings:
    Comparing your rankings vs. those of your competitors to identify whether they have lost rankings too, and to see which of them have gained positions while you have lost them.

If you can identify a real drop in rankings, whether on certain pages or across the whole site, that also correlates with the dates of your traffic loss, then you should identify its causes in order to fix the issues.
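A simple way to quantify which queries have actually dropped is to compare two query-level exports from the Performance report, one for a period before the drop and one after; the sketch below assumes pandas and hypothetical file and column names (“query”, “position”) that you’d adapt to your own export:

```python
# Compare average positions per query before vs. after the drop.
# Assumes two query-level CSV exports from the Performance report (hypothetical names);
# adjust the column names ("query", "position") to match your export.
import pandas as pd

before = pd.read_csv("queries_before_drop.csv")
after = pd.read_csv("queries_after_drop.csv")

merged = before.merge(after, on="query", suffixes=("_before", "_after"))
merged["position_change"] = merged["position_after"] - merged["position_before"]

# Positive values mean the query now ranks lower (a higher position number is worse).
print(merged.sort_values("position_change", ascending=False).head(20))
```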

 5. A Content Crawlability Issue?

Have you changed your site’s Web structure, migrated to HTTPS or to another domain without following SEO best practices, or moved to a new CDN or hosting provider without checking that its configuration allows search bots access, so that now Google can’t reach your pages? You can be alerted to technical configurations that can cause crawlability issues with Little Warden or ContentKing.

Verify whether the loss of organic traffic and rankings is due to Web crawlability issues by:

  • Checking your Google Search Console Index Coverage and Crawl Errors and Stats:
    Review the Crawl Errors and Crawl Stats reports in the old Google Search Console version and the new “Index Coverage” report to find potential crawling errors affecting the pages for which you have identified a rankings and traffic drop. If you find errors there, check which type each one is: server errors (5xx), unauthorized requests (401), not found (404), blocked by robots.txt, redirect errors, soft 404s, among others.
  • Using the “Fetch as Google” feature in the Google Search Console:
    Validate whether the crawlability issues are still happening with Google Search Console’s “Fetch as Google”, selecting the “Fetch and Render” functionality with both the desktop and smartphone bots, to verify that the affected pages are correctly accessible. If a robots.txt blockage was specified as the cause of the crawlability issue, also verify your robots.txt configuration, for which you can use the Google Search Console robots.txt Testing Tool.
  • Simulating a crawl with SEO Crawlers:
    Verify for yourself whether you’re blocking Googlebot by crawling the site simulating both the mobile and desktop bots with SEO crawlers like Screaming Frog, DeepCrawl, Botify, Sitebulb, Ryte and OnCrawl.
  • Reviewing your own Web server logs:
    If the results are not yet clear: Check your Web server logs to look for crawling issues and identify whether there’s a correlation with the drop’s timing, areas and devices, and/or gaps vs. your own crawling and traffic data, by using tools like Screaming Frog Log Analyzer, SEOlyzer, Botify Log Analyzer, OnCrawl or Loggly.
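If you want a first-pass look at your logs before reaching for a dedicated log analyzer, a small script can count the requests that identify themselves as Googlebot per day and status code; this sketch assumes an Apache/Nginx combined-format access log named access.log, and note that confirming genuine Googlebot traffic would additionally require a reverse DNS check:

```python
# Count (self-declared) Googlebot requests per day and status code from a combined-format log.
# First-pass check only: verifying genuine Googlebot requires a reverse DNS lookup.
from collections import Counter

LOG_FILE = "access.log"  # assumed Apache/Nginx combined log format

hits = Counter()
with open(LOG_FILE, encoding="utf-8", errors="ignore") as handle:
    for line in handle:
        parts = line.split('"')
        if len(parts) < 6:
            continue  # not a combined-format line
        user_agent = parts[5]
        if "Googlebot" not in user_agent:
            continue
        day = line.split("[", 1)[1].split(":", 1)[0]   # e.g. 10/Jun/2018
        status = parts[2].strip().split(" ")[0]        # e.g. 200
        hits[(day, status)] += 1

for (day, status), count in sorted(hits.items()):
    print(day, status, count)
```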

If you have changed your URL structure and verified that you are suffering from crawlability issues, check out these Web migration SEO best practices, as well as the steps to follow to recover your traffic if a Web migration hasn’t been done well.

After fixing the issues and verifying that they are gone, remember to use the “Validate Fix” option in the new Google Search Console “Index Coverage” report for the affected pages, which will recrawl them to check that they are now correctly accessible and indexable.
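Related to the robots.txt check mentioned earlier in this step, Python’s standard-library robot parser gives you a quick scripted way to confirm whether Googlebot is allowed to fetch your key URLs; the robots.txt location and page URLs below are placeholders:

```python
# Quick spot-check: can Googlebot fetch your key URLs according to robots.txt?
# Uses only the standard library; replace the robots.txt URL and page URLs with your own.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()

URLS = [
    "https://www.example.com/",
    "https://www.example.com/important-category/important-page/",
]

for url in URLS:
    for agent in ("Googlebot", "Googlebot-Image"):
        allowed = parser.can_fetch(agent, url)
        print(f"{agent} {'can' if allowed else 'CANNOT'} fetch {url}")
```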

6. A Content Indexing Issue?

Have you noindexed your pages by mistake, or changed your content implementation to rely on JavaScript? You can be alerted to technical configurations that can cause indexing issues with Little Warden or ContentKing.

See whether Google is not correctly indexing the content of the pages affected by the traffic and rankings loss by:

  • Verifying whether there has been a decrease in your site’s number of indexed URLs with the Google Search Console “Index Coverage” report, along with the reason specified for leaving them out of the index.
  • If your pages are shown in Google Search Console as not indexed: Use the Google Search Console Index Coverage report as well as the URL Inspection tool to validate the specific reason why the affected URLs have been excluded from the index: whether the pages have been noindexed, whether there’s an alternate page selected as canonical, whether Google chose a different canonical URL than the one specified, etc., due to an overlooked configuration issue that you can change. Also do a manual check yourself by crawling your site with an SEO crawler to verify the meta robots and canonicalization configuration of all the affected pages with indexing issues, confirming what is shown in Google Search Console; a minimal scripted spot-check of these tags is sketched after this list.
  • If your pages are shown in Google Search Console as indexed: Check whether your content relies on JavaScript that might not be correctly indexed by Google. Disable JavaScript in your browser (you can use the Web Developer Chrome Extension to do this easily) and verify whether your pages’ content is still shown correctly. Then compare the content shown in the HTML and in the Google cache (use the “site:” search operator to look it up) versus the content shown in the DOM (by using Chrome DevTools) and in Google’s Mobile-Friendly Test and Google’s Rich Results Test tools, which show the rendered HTML for mobile and desktop respectively. Some SEO crawlers, like Screaming Frog as well as Sitebulb, already allow you to verify this easily. Take into consideration that:
    1. Google uses a Web Rendering Service that is based on Chrome 41, so ideally you should also do this test using that Chrome version.
    2. Google indexes JavaScript content in two waves, as mentioned by John Mueller at Google I/O: “if the page has JavaScript in it, the rendering is actually deferred until Google has the resources ready to render the client-side content”, which is why relying on it for critical content should be avoided at the moment.
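As mentioned above, you can also script a basic spot-check of the meta robots and canonical tags in the raw, server-delivered HTML of the affected URLs; the sketch below assumes the requests and beautifulsoup4 packages and placeholder URLs, and it deliberately does not cover the JavaScript-rendered DOM, which still needs the rendering checks described above:

```python
# Spot-check the meta robots and canonical tags in the raw HTML of the affected URLs.
# Note: this only inspects the server-delivered HTML, not the JavaScript-rendered DOM.
import requests
from bs4 import BeautifulSoup

URLS = [
    "https://www.example.com/affected-page/",  # placeholder: your affected URLs
]

for url in URLS:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    robots_tag = soup.find("meta", attrs={"name": "robots"})
    canonical = None
    for link in soup.find_all("link"):
        if "canonical" in link.get("rel", []):
            canonical = link.get("href")
            break

    print(url)
    print("  meta robots:", robots_tag.get("content") if robots_tag else "not present")
    print("  canonical:  ", canonical or "not present")
```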

After fixing the indexing issues, remember to use in this case too the “Validate Fix” option in the new Google Search Console “Index Coverage” report for the affected pages, which will recrawl them to check that they are now correctly indexable.

7. A Google Search Console Misconfiguration?

Check whether your Google Search Console settings have been changed, as there are features that, if misconfigured, could negatively affect your existing organic search visibility, such as the URL removal tool, international targeting, URL parameters handling, the preferred domain setting, or the disavow file.

8. A Web Security Problem?

Verify whether Google has detected any security problems (due to malware or spam) by checking the Google Search Console “Security Issues” area. If so, you should follow the process specified here to fix the issues and request a review.

9. A Manual Penalty?

If the traffic loss coincides with a rankings drop but you can’t find any new crawlability or indexing issues affecting your site, check whether it has been manually penalized for not following Google’s Webmaster Guidelines (the reasons can range from unnatural incoming links and keyword stuffing to cloaking) by checking whether you have received a warning message in the “Manual Actions” section of Google Search Console.

If so, the action type, reasons and areas of the site affected will be provided there, and you’ll need to take the appropriate clean-up or optimization steps before submitting a reconsideration request. If it’s a link-related penalty, you can use tools like Kerboo, Link Research Tools and CognitiveSEO to do it.

 10. A Google Search Algorithm Update?

Finally, if you haven’t received any manual action message, but the traffic loss coincides with a rankings drop and there aren’t any new crawling or indexing issues, verify whether you could have been negatively affected by a Google search algorithm update by checking whether any of Google’s algorithm updates correlate with it.

To do so, you can check sites that document Google’s updates, such as Moz’s Google Algorithm Update History or Search Engine Roundtable, to look for coincidences.

You can also check the organic rankings and visibility that SEMrush and Sistrix show for your site and look for any updates listed directly there that coincide with the drop. Additionally, you can use the Panguin Tool.
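Once you have a list of documented update dates, a tiny script can flag which of them fall close to the start of your drop; the drop date and update entries below are placeholder examples to be replaced with the real dates gathered from the sources above:

```python
# Flag documented Google algorithm updates that fall close to the start of your traffic drop.
# All dates and labels below are placeholder examples, not real update data.
from datetime import date, timedelta

drop_start = date(2018, 6, 10)  # hypothetical first day of the organic traffic drop

documented_updates = {
    date(2018, 6, 12): "Example update A (replace with a documented update)",
    date(2018, 4, 20): "Example update B (replace with a documented update)",
}

window = timedelta(days=7)  # how close an update has to be to count as a coincidence
for update_date, label in sorted(documented_updates.items()):
    if abs(update_date - drop_start) <= window:
        print(f"{update_date}: {label} is within {window.days} days of the drop")
```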
