From local business listings to reviews of retail products, consumers are increasingly turning to the internet to make purchasing decisions, and businesses need to be ready to harness the power of search engines to boost their reach online. Unfortunately, search marketing is a constantly shifting industry, and we recently saw another example of that unpredictability in the data gathering space.
For those of us who work in search engine optimization (SEO), there has been considerable volatility in the last month, with data reporting tools reflecting significant fluctuations. But why has this happened, and how might it impact your business’s digital marketing performance?
Keep reading to learn more about a recent update from Google around the search query parameter &num=100, how it’s affecting SEO, and how it might affect your business’s performance online.
Prior to the recent updates by Google, it was possible for users (and scrapers) to view search engine results pages (SERPs) with 100 results per page by appending &num=100 to the search URL. While that might sound like a scrolling nightmare for everyday users, it was an essential resource for SEOs and scrapers. Displaying and analyzing 100 results at once was the primary method for faster rank checks, long-tail discovery, and competitor snapshots.
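For illustration, a pre-change request looked roughly like this (the search term is a made-up example):

```
https://www.google.com/search?q=best+running+shoes&num=100
```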
However, Google has been moving away from the “results per page” approach since at least 2018, when it removed the setting that let users change how many results appear per page, and this recent update is the natural endpoint of that shift.
In early September of 2025, sources first observed that the &num=100 search query parameter was no longer working. By mid-September, without any word from Google itself, sources theorized that the company was testing the removal of the 100-results-per-page option, and industry reports started connecting the change to anomalies in Google Search Console (GSC) data. Finally, about 10 days after the change was first observed, Google announced that it no longer supports &num=100.
What this means is that retrieving the top 100 results now takes 10 separate fetches instead of one, leading to higher data costs, slower tool performance, and more data volatility.
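To make the cost difference concrete, here is a minimal Python sketch comparing the fetches needed before and after the change (the query is hypothetical, and real rank trackers layer on parsing, rate limiting, and proxy handling not shown here):

```python
# Minimal sketch: how many fetches it takes to cover the top 100 results.
BASE = "https://www.google.com/search"
QUERY = "best+running+shoes"  # hypothetical example query

# Before the change: one request returned all 100 results.
before = [f"{BASE}?q={QUERY}&num=100"]

# After the change: ten requests, paging through Google's `start` offset
# ten results at a time (start=0, 10, 20, ... 90).
after = [f"{BASE}?q={QUERY}&start={offset}" for offset in range(0, 100, 10)]

print(len(before), "fetch before vs.", len(after), "fetches after")  # 1 vs. 10
```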
The removal of the &num=100 parameter was a change to how Google handles search requests, not a core update. Google core updates are more significant overhauls of the search ranking algorithms.
Previous updates have been used to improve the relevance of search results and restrict bad actors, such as the 2021 link spam update, the 2023 core updates meant to increase the presence of relevant sites in the search landscape, and the 2024 update targeting expired domain abuse, a tactic used to manipulate search rankings.
There has been a lot of speculation as to why Google decided to make this change, but a few theories have come to the forefront. First, industry professionals believe that Google aims to crack down on scraping tools: the tools SEO experts use to gather search data. With the 100-result page gone, SEO scraping and third-party replication of SERPs have become considerably more expensive. Google has long made its stance against unauthorized scrapers clear, and removing scrapers’ influence on ranking data may be intended to improve organic search results.
The second theory is that Google is attempting to reduce data usage by LLM-powered assistants and meta-search tools, which also relied on 100-result SERPs; AI tools are among Google Search’s leading competitors today.
A key thing to understand is that impressions are counted when a user’s page loads, not when the user actually views the link on their screen. That means that, for &num=100 searches, all 100 results received an impression. With this in mind, it makes sense that the end of &num=100 has led to considerable fluctuations in Google Search Console metrics, specifically impression data. Now, for a link ranked at position 95 to get an impression, the user (or crawler) has to perform 10 fetches instead of just one. With fewer results per page, there are fewer impressions.
In addition to a drop in impressions, sites may also see a boost in average position, because a link’s ranking position is only recorded when it receives an impression. As crawlers stop generating impressions for deep-ranking pages, those low positions drop out of the data entirely. In essence, as impressions decrease, the recorded data skews toward links that actually appear on the first pages, and average position improves.
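A toy example with invented numbers shows why the average moves:

```python
# Toy illustration (invented numbers): average position "improves" when
# deep-page impressions vanish from the recorded data.
recorded_before = [4, 38, 92]  # positions logged while 100-result fetches counted
recorded_after = [4]           # only the spot that earned a first-page impression

def avg(positions):
    return sum(positions) / len(positions)

print(f"average position before: {avg(recorded_before):.1f}")  # ~44.7
print(f"average position after:  {avg(recorded_after):.1f}")   # 4.0
# The page ranks no better than before; the low-position data points simply dropped out.
```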
As the impact of crawlers is reduced, Google Search Console data becomes clearer and more closely tied to genuine user activity, distilling SERP data down to real human behavior. While this represents a significant shift in SEO processes, sources like Search Engine Journal and our own experts say this could be a good thing overall.
“If Google intends to improve the quality of search by removing search bot activity, then it would make sense that average rank would increase across the board as Google is only counting the impressions and rank associated with human behavior,” our CEO, Kwabena “Koby” Ackie, told us. “In theory, this human behavior may be a lead indicator for rank. If the human being doesn't go past a certain page in the SERP, and that data isn't recorded, the total set of recorded ranking data and impressions data is much smaller, narrowing the range of recording ‘rankings’ and boosting the average rank.”
If you’re an SEO professional, you might be nodding along, but everyday business owners are probably still wondering what this change means for them. The good news is that it’s unlikely to affect your brand visibility for actual users, as most everyday users’ search experience will not change. While your impressions and the number of pages reflected in ranking data may drop, your site’s performance in terms of organic search traffic, clicks, and conversions should hold steady.
In addition, the reduction in SEO data “noise” from crawlers should help to reveal the true behavior of your target audience, improving our ability to accurately pinpoint that audience with fully optimized content. Without crawler data disruption, we can better understand search queries and search intent, improving SEO strategies and boosting your presence in local search results.
Here at Cobalt Digital Marketing, we are closely following the news about this update, and we’ve kept a careful eye on our own data. We’ve seen the changes described above: while impressions have dropped considerably, rankings have improved for most clients, and clicks have held steady. We continue to monitor the situation and are adapting our tactics to account for this new normal.
If you are looking for a digital marketing company that can help your business stay ahead of the competition, even when unexpected changes like this take place, you need Cobalt Digital Marketing. We are committed to maintaining the most up-to-date tactics and approaches, with an unshakable dedication to high-quality content and improving our craft. Reach out to our team today to learn whether our services are right for you.
