Last month, Google disabled the &num=100 search parameter, which allowed users and bots alike to see 100 search results per page. So, why does this matter? And how is it affecting digital marketing professionals and tracking tools? Arc Intermedia is actively monitoring the situation. Here’s a breakdown of what is happening:
What was the &num=100 Parameter?
Normally, a search on Google shows 10 results per page. The &num=100 parameter was a query-string modifier that asked Google to return up to 100 results on a single page instead of paginating every 10 results. SEO tools, rank trackers, and researchers often used it (or relied on tools that used it) to pull a larger set of SERP data in one HTTP request. Because of this, the &num=100 parameter was a key component of many analytical workflows, competitive analyses, keyword tracking systems, and scraping-based tools.
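For context, here is a minimal sketch of what that difference looks like at the URL level. The query term is just a placeholder; only &num and the long-standing &start offset parameter reflect how Google’s search URLs actually work:

```python
# Illustrative sketch: the bulk request vs. its paginated equivalent.
from urllib.parse import urlencode

BASE = "https://www.google.com/search"

# One bulk request for up to 100 results (the behavior Google disabled):
bulk_url = f"{BASE}?{urlencode({'q': 'example query', 'num': 100})}"

# The paginated equivalent: ten requests of 10 results each,
# stepping through offsets with the &start= parameter.
paged_urls = [
    f"{BASE}?{urlencode({'q': 'example query', 'start': offset})}"
    for offset in range(0, 100, 10)
]

print(bulk_url)
print(len(paged_urls), "requests needed instead of one")
```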
What Changed?
Although Google has not confirmed the change, the &num=100 parameter appears to have been disabled in mid-September 2025. Around this time, many SEO and digital marketing professionals began noticing that the parameter was no longer functioning properly. The effect has been most visible in organic metrics (especially Google Search Console) and third-party rank tracking systems.
Why Was the &num=100 Parameter Disabled?
We don’t have a definitive statement from Google yet, but some of the prevailing theories include:
- Reduce scraping / bot load
Allowing &num=100 made it easier for SEO tools and bots to fetch many pages of results in bulk. Disabling it forces tools to make more (smaller) requests.
- Improve the accuracy / quality of reported impressions
Some analysts believe that bots and rank trackers using &num=100 were inflating impression counts in Google Search Console (GSC) by “seeing” and counting results far down the list that real users never scroll to. Removing that capability helps align reported impressions with what users actually see.
- Protect Google’s infrastructure
Handling large result sets and heavy automated access may impose increased load, and Google may see this as a risk, especially with growing AI-based tools that rely on search scraping.
- Align with evolving UI / user paradigms
Google has been moving more toward infinite scroll, continuous load, and more dynamic interfaces (especially on mobile). Returning to strict pagination with smaller blocks of results may fit better with their evolving UX, metrics, or data policies.
- Commercial / anti-abuse strategies
Reducing bulk access to lower-ranked results might also reduce competitive data leakage or limit the degree to which external systems can “mine” Google’s SERP structure. Some analysts see it as part of a broader tightening of the search ecosystem.
What is the Immediate Impact of Disabling &num=100?
From what SEO and analytics communities are reporting:
- Sharp drops in Google Search Console (GSC) impressions
Many site owners saw sudden and large reductions in their “impressions” metric, especially desktop impressions.
- Stable click trends
In many cases, clicks (and actual site traffic) have remained stable or changed far less dramatically, indicating the shift is mostly in measurement, not in real user behavior.
- Fewer visible “unique ranking terms”
Sites have reported that the number of queries they rank for (visible in GSC) has dropped, especially for mid-tail and short-tail keywords.
- Apparent improvement in average position
Because impressions from deeper positions are no longer being counted, average positions in reports often move upward (i.e. “better”). But that is largely a mathematical artifact, not necessarily a real change in ranking (see the worked example below).
- Disruption / increased cost for rank tracking and SEO tools
Tools that relied on &num=100 must now revert to pagination (making up to 10 requests instead of one), increasing complexity, load, and infrastructure costs. Many tools have acknowledged gaps, slower data, or reduced tracking depth.
- Potential pricing or plan changes in SEO tool subscriptions
Because of higher data collection costs, some SEO vendors may pass increased costs on to users (e.g. by reducing the default depth of tracking or raising subscription prices).
It’s worth emphasizing again: in many cases, real SEO performance (traffic, user engagement, conversions) hasn’t dropped — what’s changed is how the metrics are recorded and reported.
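To see why average position “improves” on paper, consider a toy example. The impression counts below are invented purely for illustration, and the impression-weighted average is a simplification of how GSC reports the metric:

```python
# Hypothetical impressions by SERP position for a single query.
impressions = {3: 500, 9: 300, 45: 900, 82: 600}  # position -> impression count

def avg_position(data):
    """Impression-weighted average position (simplified GSC-style metric)."""
    return sum(pos * n for pos, n in data.items()) / sum(data.values())

print(round(avg_position(impressions), 1))  # 40.8 with deep impressions counted

# Once bot-driven impressions at deep positions stop being recorded:
visible = {pos: n for pos, n in impressions.items() if pos <= 10}
print(round(avg_position(visible), 1))  # 5.2 -- "better", yet no ranking moved
```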
What Does this Mean for Website Owners, Marketers, & SEO Professionals?
To reiterate, this change does not affect SEO performance. But there are some adjustments to tracking and reporting you may want to consider:
- Recalibrate expectations and baselines
- Treat mid-September 2025 as a breakpoint. Historical impressions and ranking data may no longer be directly comparable to post-change numbers.
- Expect lower reported impressions and fewer low-ranking query results. Don’t panic: this is expected.
- Revise benchmarks: redefine what “good” looks like under the new, more conservative measurement.
- Shift focus to more meaningful KPIs
- Place more weight on clicks, click-through rate (CTR), organic sessions / traffic, engagement, conversion metrics, and attribution rather than impressions alone.
- Use trends over absolute numbers. Patterns of uplift or decline matter more than raw counts, especially when the floor of “visible” positions is shifting.
- Monitor query trends, landing pages, behavior metrics, and funnel metrics to understand changes in user behavior, rather than relying exclusively on GSC impression data.
- Review your SEO tools and vendors
- Ask your SEO tool providers how they are adapting. Are they implementing pagination? Limiting depth? Using alternative data sources (e.g. clickstream, aggregate SERP APIs)?
- Check if your reports now contain data gaps or missing results. Expect slower refresh times or limited depth in some tools (especially for positions beyond the top 10 or top 20).
- Be prepared for potential increases in subscription pricing or reduction in included data volume from your tool vendors.
- Rework any scraping or automated SERP-analysis systems
- If your internal systems used &num=100 to efficiently scrape 100 results per query, you will need to shift to paginated requests (e.g. requests for results 11–20, 21–30, etc.); a sketch of this loop follows this list.
- Build in handling for rate limiting, delays, error conditions, and potential restrictions imposed by Google.
- Consider alternative data sources beyond direct scraping. For example, third-party SERP APIs, clickstream data, or aggregate ranking datasets (subject to licensing and reliability).
- Use sampling or limiting depth where practical (e.g. only track top 20 or top 50, since deeper results are less business-relevant in many cases).
- Reassess the value of tracking deep rankings
- The deeper results (positions 50–100) rarely generate meaningful traffic or conversions. With this change, many SEOs are concluding that heavy effort spent monitoring weak positions delivers low ROI.
- Consider focusing more effort on moving up within the top 10 or top 20, rather than tracking very deep positions.
- Some tools may reduce their default tracking depth (e.g. top 20) to control cost and noise.
- Communicate with stakeholders and clients
- Prepare to explain sudden impression drops or shifts in average positions. Many clients or internal stakeholders may misinterpret these as performance issues when they are measurement changes.
- Show contextual metrics and trends (clicks, traffic, conversions) to reassure stakeholders that true performance is intact.
- Update dashboards and internal reporting templates to reflect this change. Don’t let old metrics mislead decision-making.
- Monitor for further changes
- It’s entirely possible that Google may reverse this decision to disable &num=100. No guarantee, of course, but they’ve done things like that in the past. So, stay plugged into SEO and digital marketing news.
- Watch for possible partial reinstatements, modified behavior (e.g. &num=100 might work in limited contexts or for specific query classes), or new parameters.
- Track whether Google issues formal confirmation or guidelines for this change.
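For teams reworking in-house collection, the sketch below shows the paginated loop mentioned above, with simple backoff for rate limiting and a configurable tracking depth. It assumes a hypothetical licensed SERP API endpoint (serp-api.example.com) returning JSON with an organic_results field; both are placeholders, not a real service, and requests should not be pointed at google.com directly.

```python
# A structural sketch of paginated SERP collection with basic backoff.
import time
import requests

def fetch_serp_pages(query, depth=20, page_size=10, max_retries=3):
    """Collect up to `depth` results in `page_size` chunks, with retry/backoff."""
    results = []
    for offset in range(0, depth, page_size):
        for attempt in range(max_retries):
            resp = requests.get(
                "https://serp-api.example.com/search",  # hypothetical endpoint
                params={"q": query, "start": offset, "num": page_size},
                timeout=10,
            )
            if resp.status_code == 429:      # rate limited: back off, then retry
                time.sleep(2 ** attempt)
                continue
            resp.raise_for_status()
            results.extend(resp.json().get("organic_results", []))
            break
        time.sleep(1)  # polite delay between pages
    return results

# Example: track only the top 20, per the depth-limiting advice above.
# top20 = fetch_serp_pages("running shoes", depth=20)
```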
Want to learn more about this change, the health of your website, or how you can be proactive in the age of AI search? Contact the Arc team for a consultation and talk to us about your goals and pain points.
&num=100 FAQs
Does this mean my SEO rankings or traffic dropped?
Not necessarily. In many cases, the drop is in reported impressions, not real user behavior. Clicks, sessions, and conversions may stay steady.
Is this change permanent?
We don’t know yet. Google hasn’t confirmed anything. Some scenarios suggest this might be an experimental rollout or part of a broader evolution in SERP design.
Will paid Google Ads reports be affected?
Generally, no. The &num=100 parameter pertains to organic search result pagination, not how Ads reports are generated. Industry commentary suggests PPC / Ads reporting hasn’t shown similar distortions.
Should I use Google’s Custom Search API or some other API?
Possibly. The Custom Search JSON API (and other SERP APIs) may offer structured access to more result data, but they often come with quotas, costs, and limitations. They might be part of your revised architecture, but they are not a free “magic bullet.”
- Evaluate APIs (Google, third-party) carefully in terms of cost, rate limits, licensing, freshness, depth, and coverage.
- Use hybrid models: for example, APIs for high-volume keywords, scraping fallback for others, or sampling techniques.
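As a concrete starting point, here is a minimal sketch against the Custom Search JSON API. It requires an API key and a Programmable Search Engine ID (cx), caps num at 10 results per request, and is subject to daily query quotas; the key and engine ID below are placeholders.

```python
# Minimal sketch: querying Google's Custom Search JSON API.
import requests

API_KEY = "YOUR_API_KEY"   # placeholder: create one in Google Cloud Console
CX = "YOUR_ENGINE_ID"      # placeholder: Programmable Search Engine ID

def custom_search(query, start=1):
    """Fetch one page (max 10 results) starting at the given 1-based offset."""
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": API_KEY, "cx": CX, "q": query, "start": start, "num": 10},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("items", [])

for item in custom_search("rank tracking tools"):
    print(item["title"], "->", item["link"])
```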
Arc Intermedia is a source for news, advice, guidance, ideas, and great digital marketing services.