AI wrote this. Probably.
(It didn't)
The amount of content being published has increased at a pace most teams haven't properly accounted for. Since 2022, the constraint has shifted. It used to be time, resources, and effort. Now it's judgement.
AI has made it possible to produce content continuously, at scale, and at a cost that would have been unrealistic only a few years ago. HubSpot's recent research found that 43–45% of marketers use AI for content creation tasks like writing copy.
The outcome is unsurprising. Publishing frequency has increased across blogs, landing pages, emails, and social. Forbes reported that 64% of content marketing now has some level of AI involvement, whether that's drafting, editing, or full generation. Even if you take the lower end of those estimates, the shift is still significant enough to change how content competes.
So, what's the result?
As output increases, average quality drifts. Messaging becomes broader, safer, and less distinct. The content technically answers the brief, but doesn't move anything forward. AI has removed the bottleneck of time, but it hasn't removed the need for thinking, or the need for a thought-out content strategy. If anything, it has made it more obvious which teams are relying on it and which are still applying real, human judgement.
What’s wrong with content written by AI?
ChatGPT, Claude, Gemini, whatever your preferred AI is, they're all lacking the same thing.
A literal human brain.
AI doesn't understand what it's writing. It predicts what comes next based on patterns it has seen before. That distinction matters. AI draws from a vast average of existing content, which means the output tends towards what is familiar, widely accepted, and statistically likely to "sound right". Not necessarily what is precise, original, or commercially sharp.
In practice, that shows up in predictable ways, starting with the tone of voice. Positioning becomes safer. Language becomes broader. Structure follows patterns you've already seen a hundred times. The copy works, in the sense that it reads cleanly and ticks boxes, but it rarely pushes an idea forward.
AI writes like someone who has read everything and committed to none of it.
The tell-tale signs of AI-generated content.
Writers can usually spot AI-written copy within a few lines. Not because it's wrong, but because it's weirdly familiar.
Clock that generic framing, spaces filled with emphasis rather than meaning, and explanations of things that don't need explaining. The tone is confident, but the thinking underneath is thin. Sentence structures repeat. Points circle rather than progress.
For example:
AI:
"In today's fast-paced digital landscape, it's more important than ever for brands to create high-quality content that resonates with their target audience and drives meaningful engagement."
Human:
"Most brands are producing more content. Very little of it is doing anything useful. The difference isn't resonance, it's understanding."
And don't get me wrong, it's smart, but it's not a writer.
The graft vs the prompt.
Writing well has never been about filling space. It's thinking, structuring, cutting, rewriting, and then doing it again until the point actually holds up. The words are the output, the output is a point of pride, and the time spent is the hard graft.
Thatâs what experienced copywriters are paid for.
AI compresses the effort it takes to produce something that looks finished, but it doesn't replicate the process that makes it effective. And that's where the tension sits. Because now you have work being generated in minutes and presented as if the thinking came with it (it usually doesn't). But from the outside, that distinction isn't always obvious.
That creates a quiet pressure on the people doing it properly. The ones still doing the graft. The ones still pouring hours of time, energy and passion into writing. Pricing becomes harder to justify. Time spent thinking looks inefficient next to instant word count. And work that took real judgement to shape is compared directly with something assembled from patterns.
None of this is new. Every shift in tooling changes how work is valued, but this one cuts closer, because writing is the product. When a tool can create something that mimics the end result of human work, it blurs where the actual skill sits.
Everyone’s a “writer” now.
The barrier to entry has collapsed with the rise of AI. Not just for copywriters, but for anyone whose work depends on language.
Content creators, journalists, authors, academics, linguists, researchers. People who have spent years learning how to shape ideas, argue clearly, and write with intent. All of them are now competing in the same space as tools that can produce passable writing in seconds.
That changes the baseline.
Output is no longer a signal of expertise. It's just a signal that something was produced.
For the people doing it properly, that's a difficult shift. Writing has always required time, attention, and cognitive effort. Reading, thinking, structuring, editing. It's slow by design, because clarity isn't immediate. Now that process sits alongside instant alternatives that look finished enough to publish.
The credibility problem no one asked for.
There's a new, slightly absurd side effect to all of this. Write something clearly, structure it properly, use correct grammar, and there's a decent chance someone will assume it's AI.
We've reached a point where the signals have flipped. Consistency, grammar, and well-formed sentences, the literal basics of good writing, are now associated with machine output.
We can't win.
You end up in a position where doing the job properly works against you. Write well, and a robot wrote it. Add a bit of friction or imperfection, and it goes against everything we've ever been taught as writers.
It's a strange place to land. The tools created to help feel like they're working against us.
For anyone who's spent years learning how to write properly, it does feel like a no-win scenario.
The “Shy Girl” drama: AI-generated content gone wrong.
The Shy Girl controversy is one of the clearest examples of where this is heading.
The horror novel, originally self-published and later picked up by Hachette, had its US release cancelled and UK distribution halted after allegations that AI had been used in its creation.
Good.
Some analyses suggested a significant portion of the text may have been AI-generated, though this remains contested. The author denied using AI directly, instead attributing it to editorial involvement, which only complicates the situation further.
What followed wasn't just a publishing decision. It was backlash. Readers questioned the legitimacy of the work. Industry voices raised concerns about how it passed editorial checks in the first place. The book was pulled, and the conversation shifted quickly from curiosity to distrust.
And this is where it lands badly for people who actually write for a living.
That slow, deliberate, passionate and often invisible work.
To then see something produced in seconds enter the same space, be judged on the same surface-level criteria, and in some cases pass more easily, is really difficult to ignore.
When authorship becomes unclear, the value of the work starts to blur with it. And once that happens, it's not just a publishing issue… it's a credibility issue for the entire writing space.
Where AI is actually useful in content.
(Yes, I'm playing devil's advocate)
For all of that, it would be dishonest to pretend AI isn't useful. It is. Just not in the way it's often being used.
AI is a great assistant, but that's all it should be.
It's good at tightening things up. Cutting repetition. Making rough thinking more structured. If you've already done the hard part, the thinking, the positioning, the argument, it can help shape that into something clearer and more concise.
Used properly, it can speed things up. You can take a long piece of writing and sharpen it. You can sense-check phrasing. You can iterate faster without losing direction.
Do I think it can do all of the above without lowering the standard?
No.
So… what now?
This isn't a takedown of AI, or of the people using it. That would be hypocritical. AI is great at planning a piece, refining a thought, or turning scruffy, haphazard (and misspelt) meeting notes into something other people can make sense of.
But nothing will ever be more valuable than the human voice when it comes to writing.
Value doesn't come from how quickly something is produced. It comes from what's being said, the living brain that's created it, the research and understanding that has been gained in order to put those very words onto a page. That part still relies on a person sitting there, thinking it through properly.
For all the progress AI has made, the most valuable thing in any piece of writing is still the same…
A clear point of view, shaped by someone who actually has one.
A more considered way to approach content writing.
If you're producing more content but not seeing better results, the problem might not be your output, but the intention behind it.
We approach copywriting and content marketing services the same way we approach everything else. Strategy first. Clear thinking and brand understanding before execution.
We use AI where it adds value. To refine, to structure, to support the process. But the work itself is shaped by people who have spent years learning how to write properly, think commercially, and make decisions that actually move things forward.
If you want a clearer view of what your content should be doing and how to make it work harder for your business, you can learn more about our Discovery Consultation Process.
*written by a copywriter who is sick of being accused of AI for using a semicolon correctly.
