    Why Your GSC Impressions Dropped: Google’s War on Scraping Kills &num=100

    Summary: Google has permanently disabled the &num=100 search parameter, ending SEO tools’ ability to request 100 search results in a single query. The immediate impact is a roughly 10x increase in operational costs for rank trackers and SERP scrapers, which must now paginate through 10 separate pages; many tools are breaking or returning errors as a result. A secondary effect is a sudden, sharp drop in desktop impressions reported in Google Search Console. This drop is not a loss of human traffic but the removal of bot impressions generated by those same tools, revealing that historical impression data has been inflated for years. This is a deliberate, strategic move by Google to combat large-scale data scraping, and SEO professionals must immediately audit their tools and re-baseline their impression data.

    Google Kills &num=100 Parameter: Why Your SEO Tools Are Breaking and Your Data Is Wrong

    Have your SEO tools started failing in the last few days? Are your rank-tracking reports full of errors or incomplete data?

    Perhaps you logged into Google Search Console and saw a sudden, terrifying drop in your desktop impressions.

    Your first thought might be a penalty or a massive algorithm update. The truth is simpler and has more profound consequences for our industry.

    Google has effectively killed the Google &num=100 parameter.

    This isn’t a temporary bug. It’s a deliberate, permanent change. It represents a defensive move by Google to combat data scraping. The SEO industry, which has relied on this function for over a decade, is collateral damage.

    This single change fundamentally breaks the economics of most SEO tools. It also forces us to accept a hard truth: our historical impression data has been wrong for years.

    What Was the Google &num=100 Parameter (And Why Did It Matter)?

    For as long as most SEOs can remember, you could add a simple string to a Google search URL to change the number of results.

    That string was &num=100.

    A typical search for “London digital agency” would return 10 results.

    A search using google.com/search?q=london+digital+agency&num=100 would return 100 results on a single page.

    This was the backbone of the entire SEO rank tracking industry.

    Think about how your rank tracker works. It needs to check where your site ranks for a keyword, from position 1 to position 100.

    Using the Google &num=100 parameter, it could do this with a single request. One query. One page load. One proxy IP address.
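
    As a rough illustration of how simple that was, here is a minimal sketch. It assumes the public search URL; the User-Agent is illustrative, parsing is omitted, and Google actively blocks automated clients, so treat it as a description of the old mechanics rather than a working scraper:

    ```python
    # Sketch of the old single-request model (no longer effective).
    import urllib.parse

    import requests  # third-party: pip install requests

    query = "london digital agency"
    url = "https://www.google.com/search?" + urllib.parse.urlencode(
        {"q": query, "num": 100}  # &num=100 once returned 100 results at once
    )

    # One request, one page load, one proxy IP covered positions 1-100.
    response = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=10)
    html = response.text  # a single HTML document containing all 100 positions
    ```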

    It was fantastically efficient. It allowed tool providers like Semrush, Ahrefs, Moz, and hundreds of others to collect massive amounts of data at a relatively low and predictable cost.

    That entire model of efficiency is now gone.

    The New Reality: A 10x Cost for SEO Tool Data

    As of this week, the parameter is dead. Sending a request with &num=100 now simply returns the default 10 results.

    What does this mean for a tool that needs to check the top 100 positions?

    It means it must now make ten separate requests, one per page of results (see the sketch after the list below).

    1. It requests page 1 (results 1-10).
    2. It requests page 2 (results 11-20).
    3. It requests page 3 (results 21-30).
    4. …all the way to page 10 (results 91-100).
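
    A minimal sketch of that pagination, assuming Google’s standard start offset parameter; proxy rotation, HTML parsing, and CAPTCHA handling are omitted, and the function name is illustrative:

    ```python
    # Sketch of the new ten-request model using the `start` offset parameter.
    # Proxy rotation, HTML parsing, and CAPTCHA handling are omitted.
    import urllib.parse

    import requests  # third-party: pip install requests

    def fetch_top_100(query: str) -> list[str]:
        """Fetch the ten SERP pages that cover positions 1-100 for one keyword."""
        pages = []
        for start in range(0, 100, 10):  # start=0, 10, 20, ..., 90
            url = "https://www.google.com/search?" + urllib.parse.urlencode(
                {"q": query, "start": start}  # each request returns only 10 results
            )
            resp = requests.get(
                url, headers={"User-Agent": "Mozilla/5.0"}, timeout=10
            )
            pages.append(resp.text)  # ten HTML documents where one used to suffice
        return pages
    ```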

    This is a catastrophic shift in operational mechanics.

    The cost of Google SERP scraping has just multiplied by ten overnight.

    Every tool provider now faces a 10x increase in the number of requests they must make. This means 10x the proxy IPs, 10x the bandwidth, 10x the infrastructure load, and 10x the risk of being blocked or served CAPTCHAs by Google.
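
    A back-of-envelope calculation makes the scale concrete. The portfolio size here is a hypothetical example, not a vendor figure:

    ```python
    # Hypothetical daily request volume for a mid-sized rank tracker.
    keywords_tracked = 10_000      # illustrative keyword portfolio
    checks_per_day = 1             # one rank check per keyword per day

    old_requests = keywords_tracked * checks_per_day * 1   # one &num=100 call each
    new_requests = keywords_tracked * checks_per_day * 10  # ten paginated calls each

    print(old_requests)  # -> 10000 requests/day under the old model
    print(new_requests)  # -> 100000 requests/day under the new model
    ```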

    This is why your tools are breaking. They are failing to get data, timing out, or returning incomplete reports. Their entire scraping infrastructure is being re-engineered in a panic.

    This 10x cost will not be absorbed by the tool providers. It cannot be.

    Prepare for a new wave of price increases across the entire SEO tool industry. The era of cheap, daily, top-100 rank tracking is over.

    The ‘Desktop Impressions Drop’ Mystery Explained

    This tool crisis is running parallel to a second, confusing trend: the widespread desktop impressions drop in Google Search Console.

    At our London agency, we’ve seen clients’ GSC reports showing a sharp decline in desktop impressions, starting on the exact same day the parameter was disabled.

    The immediate fear is a loss of visibility or human traffic.

    This is not what is happening.

    You are not losing human visitors. You are losing bot impressions.

    For years, every time an SEO tool used the &num=100 parameter to scrape a keyword, it registered as a “desktop impression” for every single one of the 100 sites it found.

    Your site, ranking at position 78 for a high-volume keyword, might have been getting thousands of “impressions” per day. These were not humans. These were bots from Ahrefs, Semrush, and countless other rank trackers.

    Now that the &num=100 parameter is dead, that scraping has become 10x harder. The scraping volume has fallen off a cliff as tools scramble to adapt.

    The bot impressions have vanished from your GSC reports.

    The “drop” you are seeing is the removal of this long-standing data inflation. What you are left with is a much more accurate, clean baseline of actual human impressions.

    This is, in a way, a good thing. Our data is finally cleaner.

    The bad news? All your historical desktop impression data is inflated. Any report, any chart, any year-on-year comparison you’ve ever made using that data is based on a contaminated metric.

    This Is Not a Bug. This Is Google’s War on Scraping.

    Some in the SEO community have suggested this is a temporary test or a bug that will be reversed.

    This is wishful thinking.

    The widespread, uniform, and global nature of this change points to a deliberate policy decision. The 10x cost implication is not an accident; it is the entire point.

    This Google search parameter change is a strategic, defensive move.

    Google is in a data war, primarily against AI companies. Large language models (LLMs) are being trained by aggressively scraping Google’s search results. This taxes Google’s infrastructure and threatens its core business.

    By killing the &num=100 parameter, Google makes large-scale scraping ten times more expensive and far easier to detect. It’s harder for a scraper to hide when it has to make 10 distinct page requests per keyword instead of one.

    The SEO industry is not the target. We are simply the collateral damage in this larger conflict. Google is protecting its data asset, and it is willing to break our tools to do it.

    What SEO Professionals and Agencies Must Do Now

    We must accept this new reality and adapt immediately. Waiting for a fix is not an option.

    Here are the four actions you need to take this week.

    1. Audit Your SEO Tools. Contact your SEO rank tracking provider. Ask them directly how they are handling the removal of the Google &num=100 parameter. Are they stable? Are they paginating? Are they limiting tracking depth? Will their prices be increasing? You need these answers to trust your data.
    2. Re-baseline Your Impression Data. Go into Google Search Console immediately. Add an annotation for the date this change occurred. This is your new “Day Zero” for desktop impressions. You must explain to your clients, bosses, and stakeholders that the desktop impressions drop is a data correction, not a performance loss. All future reports must be benchmarked against this new, lower, more accurate baseline (a sketch for pulling that baseline via the Search Console API follows this list).
    3. Re-evaluate Your Tracking Needs. Do you really need to track 100 positions for every single keyword, every single day? For many keywords, tracking the top 20 or top 30 is more than enough. Reducing your tracking depth will be the primary way to manage the new costs that will be passed down from tool providers.
    4. Prepare for a More Expensive Future. The 10x infrastructure cost for SEO tool data is real. It will be passed on to us, the end-users. Budget for increased subscription fees for all your SERP-dependent tools.
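
    For the re-baselining step, here is a hedged sketch using the Search Console API to pull desktop-only impressions by date. The property URL, date range, and key file are hypothetical placeholders; authentication follows Google’s standard service-account flow, and the account must be granted access to the property:

    ```python
    # Sketch: pull desktop-only impressions by date from the Search Console API
    # to establish the post-change baseline. Property URL, dates, and key file
    # are hypothetical placeholders.
    from google.oauth2 import service_account  # pip install google-auth google-api-python-client
    from googleapiclient.discovery import build

    creds = service_account.Credentials.from_service_account_file(
        "service-account.json",  # hypothetical key file with GSC read access
        scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
    )
    service = build("searchconsole", "v1", credentials=creds)

    body = {
        "startDate": "2025-01-01",  # hypothetical "Day Zero" for the new baseline
        "endDate": "2025-01-31",
        "dimensions": ["date"],
        "dimensionFilterGroups": [
            {"filters": [{"dimension": "device", "expression": "DESKTOP"}]}
        ],
    }
    response = service.searchanalytics().query(
        siteUrl="https://www.example.com/", body=body
    ).execute()

    for row in response.get("rows", []):
        print(row["keys"][0], row["impressions"])  # date, desktop impressions
    ```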

    The end of the &num=100 parameter marks a new chapter in our industry. It’s one with more restricted, more expensive, but ultimately more accurate data. The sooner we adapt, the better.