Matt Burgess

low brow entry to high brow topics

You're making decisions on bot data

“Lumen CEO says AI bots are taking over the internet.” It’s a good line. It’s also a PR piece - Lumen sells the infrastructure that carries internet traffic, so they have a reason to make you anxious about what’s on it. But strip the self-interest out and the data underneath holds up.

Cloudflare, which processes a substantial fraction of global internet traffic, found that automated traffic grew at 23.5% year-on-year in 2025 - eight times faster than human traffic. Imperva, tracking bot activity for over a decade, recorded 2024 as the first year non-human traffic overtook human traffic outright.

Most of what is on the internet is now machine-made. This matters for customer insight - and not only in the ways you might expect.


Start with the quant. NPS scores. Customer satisfaction surveys. The numbers that entire teams use to set targets, justify investment, and claim progress. Those surveys reach customers digitally - by email, by SMS, through apps. If half of internet activity is now automated, what proportion of your survey responses are genuine? No one knows with any certainty. If they’re contaminated, then the infrastructure built on those contaminated numbers is increasingly unreliable - and AI systems processing that data to generate summaries and recommendations are amplifying the problem, not diagnosing it. Your decisions are downstream of bots.

And where the quant is now contaminated at source, the qualitative is getting lost in AI-transit. Good research comes from genuinely engaged customers - people who have something to say because they care, not because they’re compensated to attend. The research challenge isn’t just finding those customers. It’s holding what they express in a way that survives the crossing into decision.

Every AI intervention from insight to action - the transcription, the auto-theming, the synthesis, the summary - is a processing step: a riff on a riff. Maintaining signal fidelity was always hard. AI just made it harder.

[Image: four girls outside, each whispering into the next one’s ear, down the line]

If what’s upstream has already been summarised, the AI might produce something coherent. It may well produce something with the texture of insight. It will not produce something tethered to what a customer actually experienced.

This is what I mean by customer coupling: the relationship between what a customer expressed and what eventually informs a decision. In a world where AI mediates more of that chain, the coupling degrades faster - each processing step is a translation, and translation loses something. The original signal, skilfully obtained, is anchored in something real. AI processing a summary-of-a-summary is not.

Something genuinely valuable was expressed. The hesitation, the aside, the thing that didn’t fit the question. Whether any of it survives is another matter.

As AI becomes the processing layer for everything, that original signal becomes simultaneously more valuable and more endangered. More valuable because it’s the only ground truth available. More endangered because the incentive to let AI handle it - summarise it, riff from it, approximate it - is overwhelming and accelerating.

What you do with that signal before AI touches it determines the quality of everything AI produces from it.

The architecture for holding customer signal matters more now than it did two years ago. Not less.


Follow along: mattburgess.micro.blog/subscribe… · mattburgess.micro.blog/feed.xml · micro.blog/mattburge… · Mastodon @mattburgess@micro.blog