AI everywhere, but Google remains number one: the evolution of visibility
The digital market has found a shortcut: using “AI” as a universal explanation. Is traffic declining? It’s AI’s fault. Are campaigns less effective? It’s AI’s fault. Is the brand struggling to stand out? It’s AI’s fault. Is it raining? It’s AI’s fault. This word closes the discussion and stops the analysis, hiding different problems under a single convenient label.
It is true that artificial intelligence has genuinely increased its operational weight: it enters search, shortens paths, redistributes attention between organic and paid, and reuses reputation signals to choose on behalf of the user. The real question, however, is different: how much AI is there actually in our industry? What do the numbers reveal?
Our analysis works on three levels, each with operational consequences: adoption, which measures frequency of use; traffic, which measures click generation; and citations, which establish when a brand becomes a source. Here are the five pillars that define visibility, from the contraction of Google searches to the defense of authority in a market dominated by instability.
Google’s monopoly resists the decline in searches
Over the past year, we have searched Google 20% less. The State of Search Q4 2025 report by Datos shows an erosion in search intensity per user, which has fallen by a fifth in the United States, while in Europe and the United Kingdom the decline is milder, at 2-3%.
Let’s be clear: we continue to use Google, but we have radically changed the way we search. We now use the internet with surgical precision, rendering old browsing habits obsolete and marking a definitive shift towards a phase of absolute efficiency. The data confirms sharp growth in queries of 6 to 9 words, loaded with explicit and articulated intent. Searches no longer generate a SERP to be explored by trial and error, but are compressed into a response that solves the task even before visiting the site. What has been shortened is the number of iterations: we use complex descriptions to obtain results that are already close to the final goal.
When we access an AI engine, we do so to solve an immediate problem. Analysis of the most visited domains after a generative search shows volumes concentrated on already dominant platforms and highly useful destinations: Google and YouTube remain at the top of the pyramid, followed by GitHub, Microsoft, and Amazon. AI-native domains are gaining ground within a map that confirms the existing balance of power.
This is an important clue for digital marketing because it contradicts the optimistic narrative of AI creating an “alternative web” where traffic is better distributed. In reality, it is compressing discovery and shifting selection to fewer, larger, more reliable, and more recognized hubs.
The market condenses intent into fewer interactions, and demand quickly closes within the same hybrid environment. The “search, read, don’t open anything” behavior has settled in and explains why you can generate fewer sessions while maintaining the same information needs. Traditional search still accounts for a huge share of our digital activity, worth about 10% of global desktop events.
The center of gravity of visibility remains locked in by Google’s monopoly, which controls between 94% and 96% of the European market and is still the main gateway to every external territory. The combined share of AI tools such as ChatGPT, Gemini, Perplexity, and Claude stands at 0.77% in the United States and 0.89% in Europe.
Within this micro-percentage, only two real players emerge: ChatGPT maintains its leadership with a share of between 34% and 46%, while Gemini stabilizes as the second player at 14%. The battle for attention remains in the hands of the monopolist Google, which is also gaining ground on the generative front.
Visibility is reserved for an increasingly exclusive club
Only 0.13% of Italian domains survive the AI Overview filter. Visibility in the above the fold space — the area visible before any scrolling — is a privilege reserved for a select circle of sources. The analyses we conducted with the SEOZoom Observatory confirm a ruthless selection: according to our database, Google knows and ranks over 168 million sites in Italy, but chooses just 216,023 domains to feed its summary responses. The Google AI Mode interface, although still minimal (less than 0.1% of events in Europe), already acts as another gatekeeper. The system governs the transition from free navigation to an assisted path, autonomously deciding how much space to grant you and the effort required to access the original resource.
The existence of such a selective barrier transforms the value of organic positioning. While in the past occupying the first page guaranteed traffic, today presence in the SERP is merely the minimum requirement to hope for a citation. The technical disconnect between old-style ranking and the new visibility emerges from data from Brightedge’s Generative AI Tracker: the overlap between the sources cited and the top results of traditional search is stuck at 54.5%. The machine systematically discards 45.5% of the sites leading the classic ranking. Algorithmic preference falls on entities offering vertical specialization or a data structure that language models can parse cleanly. Information is extracted from domains that demonstrate pinpoint semantic consistency with the user’s intent.
We are faced with a system that rewards signal quality and information density. Being chosen as a source requires the transformation of the site into a dense information asset, ready for instant acquisition. Competition requires the provision of fragments of truth that the algorithm can validate, rewarding only those who guarantee truly quotable value.
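One widely used, concrete way to offer the machine a cleanly parseable data structure is schema.org markup embedded as JSON-LD. A minimal sketch, with entirely hypothetical values, of generating such a fragment (this is an illustration of the general technique, not a recipe the article prescribes):

```python
import json

# Hypothetical example: schema.org "Article" markup is one common way to
# expose machine-readable facts about a page. All values below are invented.
article_jsonld = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example: how AI search selects its sources",
    "author": {"@type": "Organization", "name": "Example Brand"},
    "datePublished": "2026-01-15",
    "about": "AI search visibility",
}

# The serialized object would be embedded in the page inside a
# <script type="application/ld+json"> tag in the <head>.
print(json.dumps(article_jsonld, indent=2))
```

The design point is simply that facts stated once, in a fixed vocabulary, are far cheaper for a model to validate and quote than facts buried in prose.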
What happens to clicks?
Those left out of the citation lose 61% of organic clicks. Analyses by Seer Interactive describe a market in which traffic is fiercely concentrated in the hands of those who occupy the generative space. Entry into AI Overview reverses the fate of visibility: inclusion as a source guarantees a 35% increase in organic clicks and a 91% jump in Ads performance. Conversely, absence punishes sites with a 68% drop in paid traffic. The erosion of sessions, which Similarweb quantifies at up to 40% for informational searches, transforms the nature of interaction. The click is no longer a mandatory step, evolving into a conscious choice to delve deeper.
The impact of artificial intelligence goes beyond the presence of the generative box: even queries without an Overview experience a decline in clicks. The organic CTR in the absence of AI summaries fell from 2.74% in the summer of 2024 to 1.62% in September 2025, marking a 41% decline. This variation describes a profound change in human behavior: users arrive at the results better prepared, consult multiple sources in advance, and satisfy their information needs without exploring the entire list of results. On the Paid side, the dynamic appears even clearer, with CTR plummeting from 19.7% to 6.34% in SERPs dominated by artificial intelligence. Advertising loses its historical function of discovery, transforming itself into fierce competition over already mature intentions within a less predictable context. We pay for auctions where users arrive later and with a decision already made, making the strategy less tied to the optimization of the individual channel and closer to an overall direction of attention.
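The declines quoted above are simple relative changes; a quick sketch, using the figures from the paragraph, to verify the arithmetic:

```python
def relative_change(before: float, after: float) -> float:
    """Percentage change from `before` to `after` (negative = decline)."""
    return (after - before) / before * 100

# Organic CTR without AI summaries: 2.74% (summer 2024) -> 1.62% (Sept 2025)
organic = relative_change(2.74, 1.62)
print(f"Organic CTR change: {organic:.1f}%")  # roughly -41%

# Paid CTR in AI-dominated SERPs: 19.7% -> 6.34%
paid = relative_change(19.7, 6.34)
print(f"Paid CTR change: {paid:.1f}%")  # roughly -68%
```

The two results match the 41% organic decline stated above and quantify the paid collapse at about 68%.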
The technical difference between traditional search and the generative interface lies in the referral rate. Google Search fluctuates between 17% and 19%, acting as a turnstile that moves traffic outward. The Google AI Mode interface has a referral rate of 2%, operating as a closed perimeter that resolves intent internally. Zero-click searches in Europe and the UK stabilized at around 22.5% in December 2025, creating a barrier that reduces the space for intermediate information pages. Every interaction lost on the results page fuels new validation goals located outside traditional engines. When the machine’s summary exhausts superficial curiosity, traffic flows head to YouTube, Facebook, and Reddit. Data from Hootsuite Trends 2026 certifies that 52% of users use social networks as secondary engines for final validation of information. These platforms act as verification nodes where people seek social proof, visual detail, or technical confirmation. Visibility today requires the ability to oversee the entire discovery process, ensuring that the brand remains the constant point of reference between the response generated and the final destination of navigation.
AI in Italy: imperfect adoption
48 and 15.7 are the two percentages that define the fragility of the national generative habit. These figures illustrate the deep divide in our society, where artificial intelligence has won over half the population while the productive structure struggles to respond to the new demand for information.
The Euroconsumers/Altroconsumo survey of January 2026 certifies that 48% of Italians use generative AI, marking a 20% increase in just one year. However, technical expertise remains a distant goal: only 33% of users feel qualified, and just 14% use the services daily. It is a critical mass that questions the machine without the criteria needed to validate its output independently, fueling fragmented and uncertain demand.
Adoption is growing as a habit far from mastery, generating two immediate effects. The first concerns the generation gap: among the over-60s outside the professional sphere, only 19% feel competent, while among 18-26 year olds, the figure rises to 56%. Identical results are interpreted and used differently by different audiences, shifting expectations towards brands in terms of trust and the need for verification. The second effect lies in the decline in satisfaction. The increase in use is moving in the opposite direction to satisfaction: only 55% of users say they are happy with the results, down 3 points from 2024. The initial enthusiasm has given way to an awareness of incorrect answers, the infamous hallucinations. Search behavior has undergone a defensive change, where declining trust pushes users toward a phase of continuous verification.
Consumer demand also clashes with a still embryonic corporate offering, highlighted by Istat data showing that small and medium-sized enterprises remain at 15.7% adoption. The disconnect between market curiosity and corporate unpreparedness is total: while people experiment, small and medium-sized enterprises remain spectators, missing the opportunity to feed models with proprietary and authoritative data. Italian consumers use AI as a quick guidance tool, seeking social proof and signs of human authority before finalizing a purchase. Curiosity governs discovery, while trust requires a solid and consistent brand presence on every validation platform.
On the operational front, we must manage the balance by providing the machine with the technical information to choose us and offering people the security they need to prefer us. The decline in trust brings those with established signs of reliability and reputation back to the center of the discovery process. The visibility game in Italy is won by positioning oneself as the source of truth that users turn to in order to complete their search, using brand authority to overcome the uncertainties generated by a technology that is still perceived as fallible. AI is now widespread, but its uneven use and unstable trust require us to look beyond simple usage rates and focus on the interface’s ability to decide on behalf of the user.
Authority as the only defense against model instability
The order of brands recommended by chatbots changes in 99% of cases. Fixed positioning belongs to the archaeology of digital marketing: analyses by SparkToro confirm chronic instability, with suggestions varying in almost every single search session. Ranking is a fluid variable and visibility now depends on frequency of citation, i.e., the ability to appear consistently in the responses generated across an entire set of intents.
The technical solution lies in building a rock-solid Local Authority, where the consistency of the data distributed across each node of the network becomes the primary selection parameter for the machine. Findings from the SOCi Local Visibility Index show that artificial intelligence compresses local discovery into a single recommended choice, making visibility 30 times more selective than traditional search. ChatGPT recommends just 1.2% of brand locations, compared to an average presence of 35.9% in Google’s local pack.
This barrier also punishes those who dominate classic positioning: there is only a 45% overlap between local SEO leaders and AI-recommended brands. The models act as extreme reputation filters, setting “sentiment floors” below which the brand disappears: ChatGPT selects almost exclusively businesses with an average rating above 4.3 stars.
You have to interpret stability as a technical illusion and invest in signal persistence. Only total consistency between reliable data and social proof signals allows you to overcome the uncertainty of the models. Brand authority is the only anchor capable of guaranteeing a certain presence in a system that continuously reshuffles its sources of reference.
What you can do with SEOZoom
So, are we making a lot of noise about nothing? Are AI optimization efforts only worth 0.89% of traffic?
This interpretation is wrong for several reasons – and those who rest on their laurels thinking they have escaped danger risk finding the quickest way out of the market!
In addition to underestimating the impact of Google’s AI, this figure does not even describe the actual adoption of the technology, but rather the current inability of interfaces to deliver the value they extract. While clicks languish, over 30% of people already consult AI systems to decide what to buy or whom to trust. The scarcity of referrals is a technical and regulatory bottleneck that is bound to break: if (when) citing the source becomes a legal requirement, traffic will shift suddenly and forcefully toward those who positioned themselves in advance. Visibility within Gemini, ChatGPT, or AI Overview has a brutal advantage over traditional SEO: success is mathematical and deterministic. There is no guesswork here, only calculation. We can know with pinpoint accuracy where we are going wrong, what we are missing, and what signal we need to send to be chosen by the machine.
At SEOZoom, we have translated this logic into an operating protocol that eliminates uncertainty. Through the AEO Audit and the GEO Audit, you can dissect your digital footprint to identify the semantic gaps that make you invisible to the machine. This is a necessary step to overcome the 0.13% barrier and claim space above the fold, ensuring that your brand is ready for instant acquisition by language models. The results of this strategy can be found in the Analytics section of the project, where a dedicated tab isolates AI traffic to show you how much you are actually bringing home from platforms such as Perplexity or Gemini. The evidence separates those who generate real visits from those who simply feed other people’s models without getting any return.
At the same time, the AI Prompt Tracker acts as a reputation radar: you monitor the frequency and quality of mentions in real conversations, finding out whether the algorithm suggests you as a solution or prefers your competitors. To fill any gaps detected, the AI Engine helps you produce content structured to meet the information-density requirements of language models. Dense assets ensure that the site speaks the language of entities, reducing the risk of exclusion. The ultimate measure of your strength is entrusted to Zoom Authority, the indicator that certifies the strength of your brand signal and its persistence in a market dominated by volatility. A high authority value ensures that your brand remains the constant reference point between the generated response and the final navigation goal.
Remember: SEO still matters alongside GEO. It is the only way to master artificial intelligence and transform technological complexity into a concrete competitive advantage.
