Digital Transformation
15 January 2026

The death of the blue link (and what comes next!)

In today's article, David Low reflects on the shift to AI interfaces and its potential impact on the search results we've known and loved since the early 2000s.

There’s a running joke in digital marketing that SEO experts are the only people who still type full questions into Google. The rest of the world just asks ChatGPT.

Like most jokes, it’s funny because there’s a kernel of truth in it. And that kernel is quietly reshaping everything we thought we knew about how businesses get found online.

I’ve spent a chunk of the past few months at Waracle thinking about this shift – partly because clients are asking pointed questions about the future, and partly because the data is now impossible to ignore. What started as AI being a cute party trick (“look, it can make funny pictures of pies!”) has evolved into something that fundamentally changes how information flows on the internet.

So let’s talk about GEO – Generative Engine Optimisation – and why it matters even if you’ve never heard the term before.

Onwards.

The great decoupling

For the best part of 25 years, the basic bargain of the web was straightforward: create content, optimise it for search engines, get traffic, monetise that traffic. The “ten blue links” were the front door to the internet, and Google was the bouncer deciding who got in. Google treated those links so carefully that it famously ran a test with 41 shades of blue to be certain it had the right one.

But that model is breaking down faster than people realise – it’s under attack from multiple vectors.

Google has already been busy adding its own furniture above the ten links, be it commercial SEM boxes or its own modules. Travel search, for example, is a churn business where Google knows people go to find an agent; replacing that journey with its own real-time pricing engine took traffic directly away from its clients. To an extent it’s cannibalising its own SEM money, so it must be confident it can retain people better over time by doing it.

But now that’s spreading further, and the numbers are stark. According to SparkToro’s zero-click search study, over 60% of Google searches now end without anyone clicking through to a website at all. When an AI Overview appears (those multi-paragraph summaries Google now shows at the top of results), that figure jumps to around 83%. For informational queries – the bread and butter of content marketing – organic click-through rates have collapsed by 61% since mid-2024, with paid CTRs down 68% as users simply get their answers from the AI summary.

Let that sink in. Six out of ten searches don’t result in a single website visit. The user asks, the AI answers, job done.

Gartner reckons traditional search volume will drop by 25% by 2026 as people migrate to conversational AI assistants. Whether that specific number lands or not, the direction of travel is pretty clear. Google’s AI Overviews now appear in over 60% of all US queries according to Xponent21’s analysis – this isn’t a beta experiment anymore.

And all this is before you even factor in the assault on search from ‘influencing’ – the social curation of content, where people don’t need to go searching because it’s placed in front of them based on graphs of linked behavioural data. Look at Spotify’s ongoing product-led growth: they experimented early with real-time sharing of playback, and that has morphed over time into one of the world’s largest datasets of behaviour, which they unleash once a year as “Wrapped” – generating mainstream headlines as much as it drives person-to-person influence.

Look at any TikTok influencer advert and it comes with a baked-in “search for…” box, which stops you ever needing Google for the same thing. Growth brands would rather lock you in through content and Shopify tooling than chance their arm on SEO or fight for an expensive SEM position.

From links to citations

And this is where it gets interesting. The panic about “traffic apocalypse” might be somewhat misplaced – at least for certain types of business.

The visitors who do arrive via AI referrals are worth dramatically more. Research from Semrush suggests AI-referred visitors convert at 4.4 times the rate of traditional organic traffic. Ahrefs reported something even more eye-opening in their own analysis: whilst AI search accounted for only 0.5% of their total traffic, it drove 12.1% of their new sign-ups. That’s a 23x multiplier on conversion.

Why? Because by the time someone clicks through from an AI answer, they’ve already done their research. The AI has compared options, synthesised information, and pre-qualified the intent. The tyre-kickers got their answer in the chat window. The people who actually click through are ready to do something.

This creates a rather uncomfortable new equation. Lose 50% of your traffic but triple your conversion rate? You might end up ahead. The maths won’t work for everyone – particularly ad-supported publishers who need raw pageviews – but for transactional businesses, it’s a different story entirely.

Is it any surprise OpenAI has an “inject your products into ChatGPT” service, much like Google and Yahoo with their “add your link” tools 25 years ago? They know what they’re trying to convert – attention.

The Chegg moment

As always, not everyone’s a winner (baby). The cautionary tale of the moment is Chegg, the homework-help platform that built a massive business on students paying for answers to academic problems.

Then ChatGPT arrived, and students realised they could get the same thing for free.

Chegg reported a 49% decline in non-subscriber traffic in early 2025, forcing them to lay off 45% of their workforce. Their business model – essentially gatekeeping information that an LLM could synthesise – was cannibalised almost overnight.

The lesson isn’t subtle. If your value proposition can be summarised by an AI, your value proposition is in serious trouble.

According to analysis from The Digital Bloom, Stack Overflow saw a 17.7% traffic drop shortly after ChatGPT launched, as developers started asking the bot rather than waiting for fellow humans to respond on a forum. Business Insider is down 48.5% according to Similarweb data. The Huffington Post fell 40-42%.

Meanwhile, People.com posted 27% year-on-year growth. It turns out celebrity gossip and entertainment – things you want to engage with rather than just extract information from – remain stubbornly human.

The invisible audience

But there’s another interesting paradox that complicates all this: a growing chunk of your “traffic” isn’t human at all.

Security firms and analytics providers now report that automated bots account for over 50% of all web traffic. AI agents scraping content to train models or generate real-time answers are consuming your pages without viewing ads, signing up for newsletters, or buying anything. Wikipedia reported a 50% surge in bandwidth consumption from AI bots since January 2024 – real infrastructure costs for traffic that generates no direct revenue.

Your content might be “read” millions of times by AI systems, which then convey that information to millions of humans via ChatGPT or Perplexity. But your analytics dashboard just shows declining sessions. The disconnect between influence (how much your content gets used) and traffic (visits to your site) is getting harder to reconcile.

What actually works

Alright, enough doom. What does one actually do about this?

A seminal research paper from Princeton, Georgia Tech, and the Allen Institute for AI – titled simply “GEO: Generative Engine Optimization” – found that specific optimisations could boost visibility in AI responses by up to 40%. The strategies that worked weren’t revolutionary, but they were specific:

Fact density matters. LLMs are tuned to prioritise content stuffed with verifiable data points. Citations, statistics, quotations from named experts – all increase the likelihood of being referenced. The fluffy, conversational, “helpful” tone that SEO wisdom told us to adopt? Less useful when your reader is a machine parsing for facts.

Structure is everything. AI agents are not humans. They parse code and structure. Schema markup, clear headers, logical hierarchies – these help the machine understand what you’re saying. Walls of text get ignored. Research from Single Grain found that sites with comprehensive schema markup showed 40% higher click-through rates because AI agents can actually parse and present their data cleanly.
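To make that concrete, here’s a minimal sketch of the kind of structured data an AI agent can parse cleanly – a JSON-LD block combining Organisation and FAQ schema, dropped into a page inside a script tag of type “application/ld+json”. The company name, URLs and answer text are placeholders, not real data:

```
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Organization",
      "name": "Example Agency",
      "url": "https://www.example.com",
      "sameAs": ["https://www.linkedin.com/company/example-agency"]
    },
    {
      "@type": "FAQPage",
      "mainEntity": [
        {
          "@type": "Question",
          "name": "What is Generative Engine Optimisation?",
          "acceptedAnswer": {
            "@type": "Answer",
            "text": "GEO is the practice of structuring content so that AI answer engines can parse, trust and cite it."
          }
        }
      ]
    }
  ]
}
```

The point isn’t the specific fields – it’s that every claim on the page gets a machine-readable label, so a crawler doesn’t have to guess what your wall of text means.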

Brand beats backlinks. This one surprised me. Research from Seer Interactive found the correlation between branded web mentions (people writing about you, even without linking) and visibility in AI Overviews is 0.664. The correlation for traditional backlinks? Only 0.218. “Brand” is becoming the new “Backlink”.

Be the primary source. If you own the data, you own the citation. Publishing original research, proprietary data, or unique insights means AIs must cite you when that information is needed. In a world of synthetic text, grounded truth becomes incredibly valuable.

The concentration problem

One thing that should worry everyone: GEO appears to follow a “winner takes most” dynamic.

Analysis of Google’s AI Overviews shows the top 20 domains capture nearly 66% of all citations. Wikipedia alone accounts for over 11%. This isn’t the long tail of the open web; it’s a power law where a handful of trusted entities hoover up the vast majority of visibility.

For smaller players, this creates a genuine dilemma. Do you try to compete for general topics against the Wikipedias and WebMDs of the world? Or do you go deep and narrow, becoming the unquestioned authority in a specific niche that the generalists can’t serve as well?

I’d argue the latter. Generalist sites are losing to specialists because AI prefers to cite a healthcare site for health queries and a coding site for coding queries. Diluting your domain’s focus hurts your proximity to the topics that matter in vector space.

The scale of the shift

Just to put numbers on where attention is actually moving:

According to Sequencr AI’s GEO statistics report, ChatGPT now processes over 1 billion queries daily, with 400 million weekly active users according to OpenAI’s disclosures. Perplexity – the “answer engine” positioning itself as a research-focused alternative – saw query volume grow by 239% in less than a year, reaching 780 million queries per month by May 2025. Average session duration on Perplexity is over 23 minutes, compared to the seconds typically spent on a Google results page. People are treating these tools as research partners, not directories.

Nearly 35% of Gen Z users now use AI chatbots as their primary tool for information discovery, bypassing traditional search entirely. That’s not early adopters anymore; that’s a generational shift in behaviour.

The Boring Practical Bits

If you’re wondering where to start, here’s the dull-but-practical to-do list:

  1. Fix your Schema. Most sites have basic or broken structured data. Organisation schema, product schema, FAQ schema – get it done properly.
  2. Check Bing. Yes, really. ChatGPT and many other AI tools lean on Bing’s index, not Google’s. Bing Webmaster Tools suddenly matters again after years of being ignored.
  3. Create an llms.txt file. Like robots.txt but for AI crawlers – it tells them which content matters and provides high-level context about what you do.
  4. Answer specific questions well. Long-tail, question-based queries are the sweet spot. Don’t aim to rank for “CRM software” – aim to be the single best answer for “what’s the best CRM for a 10-person consulting firm”.

  5. Build credentials into your content. Author bylines, expert quotes, citations to primary sources. The “E-E-A-T” stuff that SEO people have been banging on about for years? It actually matters now.
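On the llms.txt point: the emerging convention (proposed at llmstxt.org) is a plain markdown file served at your site root – a title, a one-line summary in a blockquote, then sections of annotated links. Everything below is illustrative placeholder content, not a real site:

```
# Example Agency

> We design and build digital products for healthcare and fintech clients.

## Services

- [Mobile app development](https://www.example.com/services/mobile): what we build and for whom
- [AI consultancy](https://www.example.com/services/ai): how we help teams adopt AI

## Research

- [2025 app benchmark report](https://www.example.com/research/benchmark-2025): original data, free to cite
```

Think of it as a curated reading list for crawlers: it won’t force an AI to cite you, but it tells the ones that honour it which pages carry your grounded, citable material.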

What comes next?

The honest answer is: nobody entirely knows, but let’s have our lunchtime pie and look through the crystal ball of uncertain doom.

We’re probably 2-3 years away from the “Agentic Web” becoming mainstream – where AI agents acting on behalf of users do the searching, comparing, and potentially even purchasing. Gartner predicts 60% of B2B seller work will be done by AI by 2028.

But just as we were putting this article together, Google, Shopify and other big hitters in the world of e-commerce announced the Universal Commerce Protocol (UCP), a new open standard to bring commerce to agents at scale. Native shopping on Google surfaces is rolling out soon, which means Shopify merchants can sell directly in AI Mode in Google Search and the Gemini app. It’s the 2015 gold rush of agents on steroids, and it won’t be the last such surprise.


Authors

David Low
Chief AI Officer
