🔑 Highlights
- Google disables &num=100, breaking rank tracking and inflating SEO costs.
- Chrome omnibox AI delivers instant answers, increasing zero-click searches.
- 87.7% of sites saw impression drops in GSC after bot traffic was filtered.
- Spam update fallout shows need for proactive audits and quality signals.
- SEO strategy must shift toward natural questions, not isolated keywords.
Google has shaken the search world with two moves in quick succession: killing the &num=100 parameter that allowed SEO tools to pull full-page search results efficiently, and rolling out AI-powered answers directly in Chrome’s omnibox. Together, these shifts signal a clear direction: more answers delivered inside Google, fewer clicks to websites, and higher barriers for SEO tracking.
At the same time, fallout from Google’s latest spam update continues to ripple through search rankings, while data discrepancies in Google Search Console (GSC) raise new doubts about the accuracy of impression reporting. For marketers and SEOs, the message is simple: adapt now, or risk invisibility.
Google’s New AI Era: No Keywords, No Clicks
For years, SEO has revolved around keywords and the SERP (search engine results page). That foundation is eroding fast. With “AI in the omnibox” now active in Chrome, users can type full questions—or even personal notes—directly into the browser bar. Google’s AI generates instant answers, summaries, and shopping suggestions without sending users to websites.
This represents a massive zero-click shift: websites may never get the traffic they once relied on, even if they rank. Meanwhile, short-tail keywords aren’t dead, but they are fading, replaced by conversational, natural-language queries. To stay competitive, businesses must pivot toward content that answers complete questions rather than targeting isolated keywords.
The &num=100 Shutdown: SEO Tools Hit a Wall
On September 12, 2025, Google disabled the &num=100 parameter, which had long been used by SEO tools to display 100 results per page instead of the default 10. The immediate impact was brutal: rank tracking became 10 times more resource-intensive, increasing costs and breaking data pipelines across the industry.
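To see why the cost multiplies by roughly ten, consider the request arithmetic: covering the top 100 positions used to take a single request with &num=100, and now takes ten paginated requests via the &start offset. A minimal Python sketch of that arithmetic (the URLs are illustrative only, and scraping Google SERPs violates its terms of service):

```python
import urllib.parse

def serp_page_urls(query: str, depth: int = 100, page_size: int = 10) -> list[str]:
    """Build the paginated result URLs a tracker now needs to cover `depth` results."""
    q = urllib.parse.quote_plus(query)
    return [
        f"https://www.google.com/search?q={q}&start={start}"
        for start in range(0, depth, page_size)
    ]

# Old approach: one request covered the same depth via the now-disabled parameter.
legacy_url = "https://www.google.com/search?q=best+running+shoes&num=100"

print(len(serp_page_urls("best running shoes")))  # 10 requests where 1 sufficed
```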
Why It Matters:
- Rank trackers like Semrush, Ahrefs, and SerpApi have scrambled to adjust.
- GSC impressions plunged for many sites, by as much as 25% in a single week, suggesting that bot-driven impressions had been inflating metrics all along.
- Studies show 87.7% of properties saw declines in impressions, especially in positions beyond page one.
Google’s official stance is that the parameter “was never formally supported.” But the timing—aligned with impression data shifts and outages in tools like Semrush Sensor—suggests this was a deliberate anti-scraping move, likely aimed at both SEO trackers and AI companies piggybacking on SERP data.
The broader implication: Google is tightening control of its data ecosystem, leaving third-party SEO platforms with fewer options and higher costs.
Spam Update Fallout: Winners and Losers
While the &num=100 debacle dominated headlines, Google’s recent spam update has quietly reshuffled rankings. Some sites are experiencing sudden losses in impressions, while others remain untouched.
Key red flags include:
- Pages vanishing from rankings while others hold steady.
- Spikes in backlinks from suspicious sources.
- Drops in clicks or impressions without clear cause.
For affected sites, the fix is clear: clean up toxic backlinks, merge or delete thin pages, and eliminate low-quality AI-generated content. For unaffected sites, the advice is proactive: audit anyway, reinforce trust signals, and monitor server logs. Spam updates are targeted, but complacency is risky.
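As a starting point for the thin-content audit, a quick pass over a crawl export can surface candidates for merging or removal. A minimal sketch, assuming a hypothetical crawl.csv with url and word_count columns; the 300-word cutoff is illustrative, not a Google-defined threshold:

```python
import csv

THIN_WORD_COUNT = 300  # illustrative cutoff, tune per site and page type

def flag_thin_pages(path: str) -> list[str]:
    """Return URLs whose word count falls below the cutoff."""
    with open(path, newline="", encoding="utf-8") as f:
        return [
            row["url"]
            for row in csv.DictReader(f)
            if int(row["word_count"]) < THIN_WORD_COUNT
        ]

for url in flag_thin_pages("crawl.csv"):  # hypothetical export from any crawler
    print(f"Candidate to merge or prune: {url}")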
Search Console’s Bot Problem
The sudden collapse in GSC impressions after September 10 has fueled debate in the SEO community. For months, many believed the so-called “Great Decoupling”—where impressions rose sharply while clicks stagnated—was due to AI overviews cannibalizing traffic.
But new evidence suggests a different story: much of the inflated impression data may have come from SEO bots scraping SERPs at scale. Once Google shut off &num=100, those bot impressions vanished, taking with them hundreds of thousands of “phantom” impressions from GSC reports.
This revelation shakes trust in GSC as a reliable dataset. If impressions were bot-heavy, many conclusions drawn about AI’s traffic impact may need to be revisited. Still, the long-term reality remains: AI overviews are here to stay, and they do absorb clicks that once went to websites.
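Rather than taking the aggregate studies at face value, you can test the bot-inflation theory against your own property with a simple before/after comparison on a daily GSC export. A minimal sketch, assuming a hypothetical gsc_daily.csv with date and impressions columns:

```python
import pandas as pd

df = pd.read_csv("gsc_daily.csv", parse_dates=["date"])
cutoff = pd.Timestamp("2025-09-10")  # the date the impression cliff appeared

before = df.loc[df["date"] < cutoff, "impressions"].mean()
after = df.loc[df["date"] >= cutoff, "impressions"].mean()

print(f"Avg daily impressions before {cutoff.date()}: {before:,.0f}")
print(f"Avg daily impressions after:  {after:,.0f}")
print(f"Change: {(after - before) / before:+.1%}")
```

A sharp drop concentrated in low-position impressions is consistent with vanished bot traffic; a flat line suggests your reporting was clean all along.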
The Practical Takeaways for SEOs
- Shift content strategy: Optimize for full questions, not just short-tail keywords.
- Audit regularly: Spam updates hit fast; clean backlinks, prune thin content, and improve UX.
- Don’t rely solely on GSC: Impressions may not tell the whole story. Build independent data pipelines (see the sketch after this list).
- Expect rising SEO tool costs: Without &num=100, rank tracking is more expensive and less comprehensive.
- Embrace adaptability: Google is changing its ecosystem to favor AI answers. Sites must offer real value to stay visible.
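On the independent-pipeline point above: the Search Console API lets you pull query-level data on your own schedule and store the history outside Google’s UI. A minimal sketch using google-api-python-client, assuming a service account (hypothetical key file sc-key.json) that has been granted access to the property:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file("sc-key.json", scopes=SCOPES)
service = build("searchconsole", "v1", credentials=creds)

# Pull daily query-level rows so you keep your own history, independent of the UI.
response = service.searchanalytics().query(
    siteUrl="https://example.com/",  # your verified property
    body={
        "startDate": "2025-09-01",
        "endDate": "2025-09-30",
        "dimensions": ["query", "date"],
        "rowLimit": 1000,
    },
).execute()

for row in response.get("rows", []):
    query, date = row["keys"]
    print(date, query, row["impressions"], row["clicks"])
```

Archiving these rows daily into your own warehouse means a future reporting change on Google’s side reshapes your dashboards, not your historical record.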
Conclusion: A Permanent Shift
Google’s recent moves show a clear pattern: discourage scraping, reduce external dependency, and funnel users into AI-driven, zero-click experiences. For SEOs, this means higher costs, murkier data, and tougher competition for dwindling organic clicks.
Yet opportunity remains. Those who pivot early—creating content that directly answers user intent, maintaining clean technical health, and diversifying data sources—can still thrive. The SEO landscape isn’t dead, but the old playbook is.