On Friday, April 5, after many website owners and SEOs reported pages falling out of rankings, Google confirmed a bug that was causing pages to be deindexed:

[Image: Google's confirmation of the deindexing bug]

MozCast showed a multi-day increase in temperatures, including a 105° spike on April 6. While deindexing would naturally cause ranking flux (as pages temporarily fell out of rankings and then reappeared), SERP-monitoring tools aren't designed to separate the different causes of flux.

Can we isolate deindexing flux?

Google's own tools can help us check whether a page is indexed, but doing this at scale is difficult, and once an event has passed, we no longer have good access to historical data. What if we could isolate a set of URLs, though, that we could reasonably expect to be stable over time? Could we use that set to detect unusual patterns?

Across the month of February, the MozCast 10K daily tracking set had 149,043 unique URLs ranking on page one. I reduced that to a subset of URLs with the following properties:

  1. They appeared on page one every day in February (28 total times)
  2. The query did not have sitelinks (i.e. no clear dominant intent)
  3. The URL ranked at position #5 or better

Since MozCast only tracks page one, I wanted to reduce noise from a URL "falling off" from, say, position #9 to #11. Using these qualifiers, I was left with a set of 23,237 "stable" URLs. So, how did those URLs perform over time?
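
Before answering that, here's a minimal sketch of how a filter like this could be applied to a daily rank-tracking export. The file name, column names, and sitelinks flag are assumptions for illustration (MozCast's actual data pipeline isn't public), and the exact way the three criteria combine may differ from what's shown here:

    import pandas as pd

    # Hypothetical export of February 2019 rankings: one row per
    # (date, keyword, url) page-one appearance, with the ranking position
    # and a flag for whether the query's SERP showed sitelinks.
    rankings = pd.read_csv("mozcast_feb_2019.csv", parse_dates=["date"])

    DAYS_IN_FEBRUARY = 28

    # Criterion 2: drop queries whose SERPs included sitelinks
    # (a sign of clear dominant intent).
    no_sitelinks = rankings[~rankings["has_sitelinks"]]

    # Criterion 3: keep only appearances at position #5 or better.
    top_five = no_sitelinks[no_sitelinks["position"] <= 5]

    # Criterion 1: keep URLs that appeared every day in February (28 days).
    days_seen = top_five.groupby("url")["date"].nunique()
    stable_urls = set(days_seen[days_seen == DAYS_IN_FEBRUARY].index)

    print(f"{len(stable_urls):,} stable URLs")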

Here's the historical data from February 28, 2019 through April 10. This graph is the percentage of the 23,237 stable URLs that appeared in MozCast SERPs:

[Graph: Percentage of the 23,237 stable URLs appearing in MozCast SERPs, February 28 – April 10, 2019]

Since all of the URLs in the set were stable throughout February, we expect 100% of them to appear on February 28 (which the graph confirms).
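
Continuing the sketch above (again with placeholder file and column names), the daily percentage plotted in the graph could be computed along these lines:

    # Given the stable_urls set from the earlier sketch and a rankings feed
    # covering February 28 through April 10, compute the share of stable
    # URLs that show up in the tracked SERPs each day.
    window = pd.read_csv("mozcast_feb28_apr10.csv", parse_dates=["date"])

    seen_per_day = (
        window[window["url"].isin(stable_urls)]
        .groupby("date")["url"]
        .nunique()
    )
    pct_present = (seen_per_day / len(stable_urls) * 100).round(1)

    print(pct_present)  # should hover near 100% before the April 5 bug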

Read more from our friends at the Moz Blog