
AI Traffic Is Already in Your Analytics. You’re Just Not Seeing It.

Most marketers know we’re undercounting AI referral traffic. What I don’t think we realize is by how much.

We can see what browsers report in the referrer field, and AI platform referrals have been growing. But a lot of AI-driven traffic isn’t coming through a browser. It’s coming through apps. When those apps open a link, the referrer is stripped, or never existed, before the request leaves the device. That session lands in Direct with no source, no signal, and no way to connect it back.

How much is actually missing hasn’t been clearly answered yet. Most teams are eager to find out; they just haven’t had a way to do so. So we went looking.

What we found: where our server logs captured 56 visits from Gemini on iOS, GA4 recorded only 5 referrals in the same window. That’s 9% of the actual picture.

That ratio matters because Gemini iOS is the most detectable mobile AI surface we tested. It’s the only app that identifies itself in the User-Agent string, and only since mid-February 2026. Every other major app leaves little to no fingerprint anywhere. Not in GA4, not in server logs. Those sessions are Direct with nothing to identify them. The gap we measured with Gemini, more than ten log visits for every referral GA4 recorded, is a floor (and a new one worth monitoring). For every other AI app, the undercounting is unknown and almost certainly worse. This is why marketing teams need to get comfortable with server logs today; GA4 alone will not show you this.

How A Click Becomes Direct

When you click a link in a web browser, the browser passes a Referrer header to the destination. That’s how GA4 assigns a session to chatgpt.com or gemini.google.com and counts it as Referral. Native apps work differently. When an AI app on your phone surfaces a recommendation and you tap it, the link opens in an in-app browser or WebView. That environment typically strips the referrer before the request leaves the device, a behavior that’s become more pronounced as Apple has tightened cross-app tracking in recent years. We observed the same pattern on Android.

The important thing to understand is that this is a choice, not a technical limitation. We know it’s solvable because some platforms are already doing it. Gemini identifies itself in the User-Agent. Copilot’s product and shopping module links append utm_source=copilot.com directly to the destination URL, and that parameter passes through. The capability exists. Most platforms just aren’t using it. Here’s what we tested and what we found:

Platform | Surface | Referrer Preserved | AI Identifier | Attribution in GA4
ChatGPT | Web | Yes | chatgpt.com referrer | Attributable
ChatGPT | iOS app | No | None | Direct
ChatGPT | Android app | No | None | Direct
ChatGPT | macOS desktop app | No | None | Direct
Gemini | Web | Yes | gemini.google.com referrer | Attributable
Gemini | iOS app | No | UA string: GeminiiOS / GoogleWv | Direct in GA4; detectable in server logs only
Copilot | iOS generic chat | No | None | Direct
Copilot | iOS product/module links | No | utm_source=copilot.com in URL | Attributable
Claude | iOS app | No | None | Direct
Perplexity | iOS app | No | None | Direct
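The three signals in that table (referrer host, URL parameter, User-Agent string) can be checked server-side. Here’s a minimal sketch of a hit classifier built on those signals; the function name and labels are ours, and the host list mirrors the platforms we tested:

```python
from urllib.parse import urlparse, parse_qs

AI_REFERRER_HOSTS = {"chatgpt.com", "gemini.google.com", "perplexity.ai",
                     "copilot.microsoft.com", "claude.ai"}
AI_UA_MARKERS = ("GeminiiOS", "GoogleWv")  # Gemini's iOS app, per our testing

def classify_hit(url: str, referrer: str, user_agent: str) -> str:
    """Label a server-log hit using the three signals from the table above."""
    ref_host = urlparse(referrer).hostname if referrer else None
    if ref_host in AI_REFERRER_HOSTS:
        return f"ai-referral:{ref_host}"    # web click; GA4 sees it too
    utm = parse_qs(urlparse(url).query).get("utm_source", [])
    if "copilot.com" in utm:
        return "ai-tagged:copilot"          # Copilot's URL parameter passes through
    if any(m in user_agent for m in AI_UA_MARKERS):
        return "ai-ua:gemini-ios"           # server logs only; GA4 shows Direct
    if not referrer:
        return "direct-unknown"             # could be anything, including AI apps
    return "other-referral"

print(classify_hit("/guide?utm_source=copilot.com", "", "Mozilla/5.0"))
# ai-tagged:copilot
print(classify_hit("/guide", "", "Mozilla/5.0 (iPhone) GeminiiOS GoogleWv"))
# ai-ua:gemini-ios
```

Note what the last two branches imply: everything from ChatGPT, Claude, and Perplexity apps falls into `direct-unknown`, indistinguishable from a typed-in URL.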

None of the major AI providers publish a breakdown of usage by platform. But given how quickly AI assistants have grown on mobile, and how dominant app usage is relative to web for most consumer products, it’s reasonable to assume the app-based share of total AI traffic is significant. The surfaces generating the most unattributable traffic are likely also the ones with the highest overall usage.

What The Data Actually Shows

We looked at two accounts where the channel mix is clean enough to isolate what’s happening, both with minimal paid search spend, which removes the brand awareness spillover that makes Direct hard to read.

Client A is a technology brand. Direct traffic held at roughly 50,000 to 55,000 sessions per month across eight straight quarters, Q3 2023 through Q3 2025, with normal seasonal variance and no movement in any other channel to explain a shift. Then Q4 2025 stepped up to around 72,000 sessions per month. Q1 2026 is running around 90,000.

To understand how much AI is visibly contributing, we compared referral-only sessions from known AI sources (chatgpt.com, gemini.google.com, perplexity.ai, copilot.microsoft.com, claude.ai) quarter over quarter. Visible AI referral sessions grew roughly 163% between Q4 2024 and Q4 2025, adding approximately 3,100 sessions. Direct grew roughly 42% over the same comparison period, adding approximately 64,000 sessions.
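Worth pausing on the arithmetic here, using the round figures above: even if every one of those new visible AI referral sessions had instead landed in Direct, it would explain only a sliver of the Direct increase.

```python
# Figures from the Q4 2024 -> Q4 2025 comparison above (Client A, approximate)
visible_ai_added = 3_100    # new visible AI referral sessions
direct_added = 64_000       # new Direct sessions over the same comparison

# Visible AI referral growth covers under 5% of the Direct increase;
# whatever drove the rest is not showing up in the referral report.
share = visible_ai_added / direct_added
print(f"{share:.1%}")  # 4.8%
```

So the visible referral trend, however steep its growth rate looks, is mathematically incapable of accounting for the Direct step-up on its own.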

What makes this harder to dismiss is where the Direct traffic is actually landing. We pulled the top pages being crawled for this client by ChatGPT via Cloudflare, then looked at how Direct sessions to those same pages had grown in GA4. Every one of them follows the same pattern: multi-level content URLs that nobody is manually typing into a browser. When we isolated Direct to pages in this category year over year, Direct grew 118% in the equivalent 90-day window. Five ChatGPT-cited pages alone generated 40% of total Direct visits within that content pillar. Our top ChatGPT-cited page was created two months ago and is already dominant across the entire content hub for Direct visits.

Client B tells the same story at a different scale. It’s a non-commercial site with no paid traffic, in a subject area AI platforms cite heavily. Direct page views on content pages (excluding the homepage and a high-traffic viral tool) grew over 200% year over year across multiple comparable month pairs. And just like Client A, the growth is concentrated in deep, specific URLs that nobody types. The distribution of Direct traffic has spread across three times as many pages as it did two years ago, with individual resource and program pages pulling tens of thousands of direct visits a month. The pattern is identical.

Why This Matters for AI Monitoring

It’s easy to look at a referral report, see a modest number of sessions from chatgpt.com, and conclude that AI isn’t moving the needle enough to justify a dedicated investment. That’s the wrong read, and the attribution gap is why. If the majority of AI-driven clicks are arriving as Direct, and this data suggests they are, then visible referral sessions in GA4 represent only a fraction of what AI is actually sending. A site showing up consistently in AI results may already be seeing the benefit in its Direct trend line, with no way to make the connection through standard reporting. The teams that have written off AI monitoring because the referral numbers look small may be undervaluing a source that’s already contributing to their results.

There’s a quality dimension worth naming too. When an AI platform recommends your site in response to a specific question, that user has already been guided through a research process and sent to you as the answer. That’s a different intent profile than most paid, social or organic traffic sources, and it’s currently landing in the same bucket as every other unattributed session, where it’s invisible to most reporting.

The current measurement structure undervalues AI visibility in a specific way: web-based AI clicks are attributable and show up in Referral, while app-based clicks, likely the larger share, land in Direct. As long as that asymmetry exists, visible referral numbers will systematically understate the case for investing in AI presence and monitoring. For a deeper look at how the measurement landscape is evolving across platforms, our piece on the three layers of AI search analytics covers where platform-reported data fits alongside server-side and third-party monitoring.

How To See It For Yourself

You don’t need a full research setup to get a directional read on how AI traffic is showing up on your site.

Server log cross-reference: Starting around February 17, 2026, Gemini’s iOS app began identifying itself with GeminiiOS and GoogleWv in the User-Agent string. Pull visits containing those strings from your server logs for any date after that, then compare against GA4 Gemini referral sessions filtered to iOS for the same window. The gap between those two numbers is your visibility ratio for the one mobile AI platform you can actually measure, a real lower-bound multiplier for thinking about total AI traffic.
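Here’s what that cross-reference looks like in practice, assuming the common combined log format. The sample lines and the GA4 count are illustrative placeholders; substitute your own access log and your own GA4 export:

```python
import re

# Illustrative combined-format access-log lines; real ones come from your server.
LOG = '''\
198.51.100.7 - - [20/Feb/2026:09:14:02 +0000] "GET /guide HTTP/1.1" 200 8120 "-" "Mozilla/5.0 (iPhone) GeminiiOS GoogleWv"
198.51.100.8 - - [20/Feb/2026:09:15:40 +0000] "GET /pricing HTTP/1.1" 200 5100 "-" "Mozilla/5.0 (iPhone) GeminiiOS GoogleWv"
203.0.113.9 - - [20/Feb/2026:09:16:11 +0000] "GET /guide HTTP/1.1" 200 8120 "https://www.google.com/" "Mozilla/5.0"
'''

# Count hits whose User-Agent carries Gemini's iOS markers
GEMINI_IOS = re.compile(r"GeminiiOS|GoogleWv")
log_hits = sum(1 for line in LOG.splitlines() if GEMINI_IOS.search(line))

# Placeholder: GA4 Gemini referral sessions filtered to iOS, same date window
ga4_gemini_ios_referrals = 1

ratio = log_hits / max(ga4_gemini_ios_referrals, 1)
print(f"server-log hits: {log_hits}, GA4 referrals: {ga4_gemini_ios_referrals}, "
      f"visibility multiplier: {ratio:.0f}x")
```

Run against real logs, that multiplier is the number to carry into planning conversations: every visible Gemini iOS referral stands in for that many actual visits, at minimum.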

The manual test: Filter your server logs for your own IP address, then open ChatGPT, Claude, Perplexity, and Gemini on your phone, not the web app, and ask questions that would logically surface pages on your site. When a link appears, tap it. Check your server logs in real time and see whether that request shows a referrer from the AI platform or arrives as a direct hit with a blank referrer. The result is immediate, repeatable, and doesn’t require any additional tooling beyond log access.

Direct channel baselining: Pull your monthly Direct sessions back to 2022 or early 2023, before AI assistants reached mainstream adoption. If your Direct trend has diverged from that baseline over the past 12 to 18 months without a corresponding change in paid, email, or another known driver, AI is worth investigating as a contributor. For context on how AI is reshaping the broader search picture, our breakdown of what actually happened to organic search after ChatGPT is worth a read alongside this analysis.
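A simple way to formalize that baseline check: fit the pre-AI period’s mean and spread, then flag recent months that sit well outside it. The numbers below are illustrative, and a three-sigma threshold is one reasonable choice among several, not a standard:

```python
from statistics import mean, stdev

# Illustrative monthly Direct sessions: a flat pre-AI baseline, then a step up.
baseline = [51_000, 53_000, 50_500, 54_000, 52_000, 53_500]  # e.g. 2022-2023 months
recent   = [55_000, 61_000, 68_000, 72_000, 81_000, 90_000]  # last six months

mu, sigma = mean(baseline), stdev(baseline)

# Flag recent months more than three standard deviations above the baseline mean
flagged = [m for m in recent if m > mu + 3 * sigma]
print(f"baseline {mu:,.0f} +/- {sigma:,.0f}; months above 3-sigma: {len(flagged)}")
```

A run of consecutive flagged months, with no matching change in paid, email, or another known driver, is the divergence pattern worth investigating.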

These Are Early Findings

Two accounts, a controlled test set, and a server-log comparison on a single platform over a short window. It’s enough to establish the pattern, but not enough to quantify it precisely for every site and vertical.

What we’re confident about: the measurement gap is real, it’s larger than most teams assume, and the standard analytics stack wasn’t built to see it. The platforms that choose to pass attribution signals (Gemini’s User-Agent tag, Copilot’s URL parameter) are proving it’s solvable. Until the others follow, the teams building their own methodology around server-log analysis, Direct baselining, and AI citation tracking will have a more accurate picture of what’s actually driving their results.

The traffic is already there. The question is how much of it you can account for.
