Is AI really giving us value or is it due a reckoning?
A few years after AI tools like ChatGPT reared their uncanny heads, the sheen seems to be wearing off.
The same tech that promised to transform everything is showing the classic signs of a bubble: the Bank of England has warned that market concentration in AI stock valuations hasn’t been this extreme in half a century, while veteran analysts posit that the current frenzy mirrors the dot-com era almost perfectly.
It seems like the returns aren’t keeping pace with the rhetoric, and it’s starting to feel less like a revolution and more like recursion.
And yet the digital marketing industry has gone all in on it.
Every agency is “AI-powered”, every platform “fuelled by machine learning”, and every tool seems to have added an AI feature no one asked for. Speed has become the new god, and originality the sacrifice.
What happens when the miracle stops performing?
If the AI bubble bursts (and if you read on, you’ll see it’s starting to creak), an industry that has built its identity on automation will have to reckon with an awkward truth: the machine was never that clever to begin with.
As a content writer at Bamboo Nine, I’ve ridden the AI wave from the moment it crashed into our industry. Like many on our team, I cut my teeth on copywriting before the prompts and plugins. I’ve watched the promises, the panic, and the pivoting first-hand, and what I’ve seen has left me sceptical.
How it started.
Letâs rewind to what in digital timelines are the sepia-tinted days of yore: 2022.
For many in the industry, it was the closing months of this year when AI slinked onto the scene.
Some called it as world-altering as the Cambrian explosion. Some.
By 2023, ChatGPT was making real changes to our workflows. We were told that tools like it could easily do our jobs. For us in the Content Team, that meant that by feeding ChatGPT a few keywords and a short brief, it could supposedly produce 700 words of SEO-optimised content in seconds.
Shit.
But the reality was far less dazzling.
All pattern, no pulse.
The dust has settled somewhat since the initial hype, and it’s fair to say AI has not turned out to be the seismic creative shift that was portended.
If youâve used them as much as we have, youâll know that LLMs are⊠fine. But theyâre certainly not revolutionary.
The more we used it back then, the more we realised that everything it wrote sounded eerily similar, and thatâs still the case today. Early versions were worse still: you had to re-feed context every few prompts, and after hours of re-editing to make it sound remotely human, the time saved was negligible and you could actively feel your brain cells dying.
But even as we saw its limits, the hype kept inflating. Clients began to ask the very understandable question: if AI can write faster and cheaper, why pay humans at all?
But this is the same illusion fuelling the AI investment boom, explored below: speed mistaken for value, output mistaken for originality. ChatGPT is an imitator, not a creator.
And thatâs the crux of the AI bubble: AIâs perceived worth keeps climbing, even as the real creative and economic returns flatten. Itâs not a revolution, itâs replication, and replication doesnât hold its value for long.
So whatâs the AI bubble and what does it mean?
A bubble, in economic terms, is what happens when speculation inflates faster than value, and sooner or later, reality catches up.
The numbers donât add up.
OpenAI and Anthropic are growing, yes, but they’re nowhere near profitable. Their business model hinges on continued faith (and funding) in a technology that’s improving more slowly with each generation. As The Atlantic notes, “newer models have been marred by delays and cancellations, and those released this year have generally shown fewer big improvements despite being far more expensive to develop.”
Then there’s the money. It’s been reported that Oracle recently inked a colossal $300 billion cloud infrastructure deal with OpenAI. But OpenAI doesn’t have that capital, which means it’ll need even more colossal funding rounds. So, if OpenAI isn’t as profitable as it’s predicted to be, the financial fallout could be disastrous.
And the returns so far aren’t inspiring confidence.
A recent Forbes summary of MIT research found that 95% of enterprise AI pilots produce zero return on investment. And, according to Rogé Karma at The Atlantic, “the entire U.S. economy is being propped up by the promise of productivity gains that seem very far from materializing”. Citing data from The Wall Street Journal, Karma notes that Alphabet, Amazon, Meta, and Microsoft, the tech titans supposedly riding the AI wave, have seen free cash flow decline by 30% in two years.
So, collectively, the big five (including Tesla) have spent around $560 billion on AI-related projects since 2024, yet generated only $35 billion in AI-related revenue.
Oh dear.
The pressure is building.
This is why weâre getting warnings of âa growing risk that the AI bubble could burst,” because Big Techâs trillion-dollar valuations are built largely on expectation, not evidence.
If the AI bubble hasnât burst yet, itâs only because weâre still inflating it.
The gap between expectation and outcome is widening. And as investors, businesses, and marketers all look for the âtransformative impactâ they were promised, it increasingly feels as though AI isnât the dawn of a new era, but just another expensive dawn simulation.
How the AI bubble affects digital marketing.
The industrial complex of AI thrives on festering FOMO.
It convinces you that if you’re not prompting, you’re falling behind. That if you’re not using AI, you’re in the stone ages.
That illusion of inevitability, that AI is the only future worth having, is what keeps the hype alive. It’s not just investors pouring billions into models that can’t yet turn a profit; it’s businesses and marketers racing to prove they’re “AI-powered” before they’ve asked what problem they’re actually solving.
The result is an echo chamber of efficiency-speak and automation evangelism, where using AI is treated as a strategy in itself rather than a means to one.
AI limits our thinking.
Add to this that AI isn’t merely failing to deliver the productivity gains that would justify its price tag; it may also be dulling the very skills it was supposed to enhance. Emerging research is revealing the surprising cognitive cost of overreliance on AI tools.
In the MIT Media Labâs Your Brain on ChatGPT study, researchers used EEG (brainwave monitoring) across groups tasked with writing essays either with or without AI help. They found that participants relying heavily on language models showed weaker neural connectivity, diminished memory recall, and declining engagement in creative tasks â what the authors framed as an accumulation of âcognitive debt.â
Another study, published in Societies by Michael Gerlich, links frequent AI use with weaker critical-thinking performance, attributing it to “cognitive offloading”: letting the machine do the mental lifting.
The pattern is clear: the more we outsource our words, the less we understand what we’re actually saying.
If we let AI think for us, we stop thinking like marketers, the very people meant to understand emotion, timing, and persuasion. AI can mimic empathy, but it can’t feel it. It can reproduce the shape of persuasion, but not its substance.
Which brings us back to the bubble. Just as investors are overvaluing the economic promise of AI, we’re overestimating its cognitive one. The more faith we place in its output, the less capable we become of producing true value ourselves.
The AI content trough.
So if youâre using AI to mass-produce thoughtless content without an expert understanding of copywriting, of SEO, of strategy, of marketing, youâre doomed to fail. Good messaging needs a brain that understands marketing without a chatbot needing to explain it.
What good is speed and scale if what youâre scaling is slop?
And pure AI-driven content is slop of the highest order: pre-packaged, semi-coherent paste, shunted down a chute into the already-swirling trough of online content. Your feed.
Do you want to eat slop? No!
Neither does your audience.
Rage against the machines.
Itâs no wonder people are starting to push back. The more synthetic our content becomes, the more audiences crave something real.
Last yearâs viral fiascos said it all:Â Googleâs âAI Overviewsâ recommending people eat rocks and run with scissors; the tongue-in-cheek slur âclankerâ becoming shorthand for online frustration with AIâs growing ubiquity.
The backlash is becoming measurable, too: according to NIM research, labelling content as AI-generated tends to increase scepticism and reduce engagement. People are more cautious when they know a piece was machine-assisted.
Supporting this, a report from accounting firm KPMG found that only 8.5% of 48,000 people surveyed “always” trusted AI search results, while Gartner’s report on AI search summaries found that over 50% of consumers don’t trust AI searches at all, most citing “significant mistakes.”
If audiences don’t trust the machine, why would they trust a brand that sounds like one? In marketing terms, that’s not just a reputational risk, it’s a value collapse. The supposed efficiency of AI content is already undermining the very trust that gives brands their worth.
What’s the future?
The irony is that in chasing the future, much of the industry has started to regress: faster production, flatter ideas, and an overreliance on tools that promise transformation but deliver repetition.
If youâre hitching your future to a tool thatâs burning through capital, water, and electricity faster than it creates value, is that really future-proofing? Or is it just inflating a bubble thatâs one prompt away from popping?
But despite what personal misgivings you may have about it, AI isnât going anywhere fast in digital marketing.
There may come a reckoning for AIâs environmental cost in the future (Googleâs emissions have increased by a horrifying 50% since 2020, largely due to the enormous amount of energy that AI demands, while Sam Altman recently announced OpenAIâs next wave of AI data centres will apparently require âas much power New York City and San Diego combinedâ). But the tools arenât going away.
So, the challenge is to use them sustainably, to build smarter, not faster.
The Bamboo Nine way.
Our goal is simple: use the tools without becoming them. To create work that lasts because it’s driven by sense, strategy, and a very human kind of intelligence.
At Bamboo Nine, we use AI where it makes sense: to save time and make processes smoother so that we can deliver more bang for our clients’ buck.
But we don’t mistake it for strategy, creativity, or craft. AI can give you words, but not meaning. The best work still comes from people, from those who interpret, empathise, and push beyond what’s predictable.
And, really, that’s what audiences crave now more than ever: something real, made by someone who understands what it means to be human.
Find out more about how we set the right foundations through our Discovery Consultation service.
