AI Unplugged: A copywriter’s view of AI-generated content.

AI wrote this. Probably.

(It didn’t)


The amount of content being published has increased at a pace most teams haven’t properly accounted for. Since 2022, the constraint has shifted. It used to be time, resource, and effort. Now it’s judgement.

AI has made it possible to produce content continuously, at scale, and at a cost that would have been unrealistic only a few years ago. HubSpot’s recent research found that 43–45% of marketers use AI for content creation tasks like writing copy.

The outcome is unsurprising. Publishing frequency has increased across blogs, landing pages, emails, and social. Forbes reported that 64% of content marketing now has some level of AI involvement, whether that’s drafting, editing, or full generation. Even if you take the lower end of those estimates, the shift is still significant enough to change how content competes.

So, what’s the result?

As output increases, average quality drifts. Messaging becomes broader, safer, and less distinct. The content technically answers the brief, but doesn’t move anything forward. AI has removed the bottleneck of time, but it hasn’t removed the need for thinking, the need for a thought-out content strategy. If anything, it has made it more obvious which teams are relying on it and which teams are still applying real, human judgement.

What’s wrong with content written by AI?

ChatGPT, Claude, Gemini, whichever AI tool you prefer, they’re all lacking the same thing.

A literal human brain.

AI doesn’t understand what it’s writing. It predicts what comes next based on patterns it has seen before. That distinction matters. AI draws from a vast average of existing content, which means the output tends towards what is familiar, widely accepted, and statistically likely to “sound right”. Not necessarily what is precise, original, or commercially sharp.

In practice, that shows up in predictable ways, starting with tone of voice. Positioning becomes safer. Language becomes broader. Structure follows patterns you’ve already seen a hundred times. The copy works, in the sense that it reads cleanly and ticks boxes, but it rarely pushes an idea forward.

AI writes like someone who has read everything and committed to none of it.

The tell-tale signs of AI-generated content.

Writers can usually spot AI-written copy within a few lines. Not because it’s wrong, but because it’s weirdly familiar.

Clock the generic framing, the spaces filled with emphasis rather than meaning, and the explanations of things that don’t need explaining. The tone is confident, but the thinking underneath is thin. Sentence structures repeat. Points circle rather than progress.

For example:

AI:

“In today’s fast-paced digital landscape, it’s more important than ever for brands to create high-quality content that resonates with their target audience and drives meaningful engagement.”

Human:

“Most brands are producing more content. Very little of it is doing anything useful. The difference isn’t resonance, it’s understanding.”

And don’t get me wrong, it’s smart, but it’s not a writer.

The graft vs the prompt.

Writing well has never been about filling space. It’s thinking, structuring, cutting, rewriting, and then doing it again until the point actually holds up. The words are the output; the pride is in the hard work and the time spent getting there.

That’s what experienced copywriters are paid for.

AI compresses the effort it takes to produce something that looks finished, but it doesn’t replicate the process that makes it effective. And that’s where the tension sits. Because now you have work being generated in minutes and presented as if the thinking came with it (it usually doesn’t). But from the outside, that distinction isn’t always obvious.

That creates a quiet pressure on the people doing it properly. The ones still doing the graft. The ones still pouring hours of time, energy and passion into writing. Pricing becomes harder to justify. Time spent thinking looks inefficient next to instant word count. And work that took real judgement to shape is compared directly with something assembled from patterns.

None of this is new. Every shift in tooling changes how work is valued, but this one cuts closer, because writing is the product. When a tool can create something that mimics the end result of human work, it blurs where the actual skill sits.

Everyone’s a “writer” now.

The barrier to entry has collapsed with the rise of AI. Not just for copywriters, but for anyone whose work depends on language.

Content creators, journalists, authors, academics, linguists, researchers. People who have spent years learning how to shape ideas, argue clearly, and write with intent. All of them are now competing in the same space as tools that can produce passable writing in seconds.

That changes the baseline.

Output is no longer a signal of expertise. It’s just a signal that something was produced.

For the people doing it properly, that’s a difficult shift. Writing has always required time, attention, and cognitive effort. Reading, thinking, structuring, editing. It’s slow by design, because clarity isn’t immediate. Now that process sits alongside instant alternatives that look finished enough to publish.

The credibility problem no one asked for.

There’s a new, slightly absurd side effect to all of this. Write something clearly, structure it properly, use correct grammar, and there’s a decent chance someone will assume it’s AI.

We’ve reached a point where the signals have flipped. Consistency, grammar, and well-formed sentences, the literal basics of good writing, are now associated with machine output.

We can’t win.

You end up in a position where doing the job properly works against you. Write well, and a robot wrote it. Add a bit of friction or imperfection, and it goes against everything we’ve ever been taught as writers.

It’s a strange place to land. The tools created to help feel like they’re working against us.

For anyone who’s spent years learning how to write properly, it does feel like a no-win scenario.

The “Shy Girl” drama: AI-generated content gone wrong.

The Shy Girl controversy is one of the clearest examples of where this is heading.

The horror novel, originally self-published and later picked up by Hachette, had its US release cancelled and UK distribution halted after allegations that AI had been used in its creation.

Good.

Some analyses suggested a significant portion of the text may have been AI-generated, though this remains contested. The author denied using AI directly, attributing the text instead to editorial involvement, which only complicates the situation further.

What followed wasn’t just a publishing decision. It was backlash. Readers questioned the legitimacy of the work. Industry voices raised concerns about how it passed editorial checks in the first place. The book was pulled, and the conversation shifted quickly from curiosity to distrust.

And this is where it lands badly for people who actually write for a living.

That slow, deliberate, passionate and often invisible work.

To then see something produced in seconds enter the same space, be judged on the same surface-level criteria, and in some cases pass more easily, is really difficult to ignore.

When authorship becomes unclear, the value of the work starts to blur with it. And once that happens, it’s not just a publishing issue. It’s a credibility issue for the entire writing space.

Where AI is actually useful in content.

(Yes, I’m playing devil’s advocate)

For all of this, it would be dishonest to pretend AI isn’t useful. It is. Just not in the way it’s often being used.

AI is a great assistant, but that’s all it should be.

It’s good at tightening things up. Cutting repetition. Making rough thinking more structured. If you’ve already done the hard part, the thinking, the positioning, the argument, it can help shape that into something clearer and more concise.

Used properly, it can speed things up. You can take a long piece of writing and sharpen it. You can sense-check phrasing. You can iterate faster without losing direction.

Do I think it can do all of the above without lowering the standard?

No.

So… what now?

This isn’t a takedown of AI, or of the people using it. That would be hypocritical. AI is great at planning a piece, refining a thought, turning scruffy, haphazard (and misspelt) meeting notes into something other people can make sense of.

But nothing will ever be more valuable than the human voice when it comes to writing.

Value doesn’t come from how quickly something is produced. It comes from what’s being said, the living brain that’s created it, the research and understanding that has been gained in order to put those very words onto a page. That part still relies on a person sitting there, thinking it through properly.

For all the progress AI has made, the most valuable thing in any piece of writing is still the same:


A clear point of view, shaped by someone who actually has one.


A more considered way to approach content writing.

If you’re producing more content but not seeing better results, the problem might not be your output, but the intention behind it.

We approach copywriting and content marketing services the same way we approach everything else. Strategy first. Clear thinking and brand understanding before execution.

We use AI where it adds value. To refine, to structure, to support the process. But the work itself is shaped by people who have spent years learning how to write properly, think commercially, and make decisions that actually move things forward.

If you want a clearer view of what your content should be doing and how to make it work harder for your business, you can learn more about our Discovery Consultation Process.

*written by a copywriter who is sick of being accused of AI for using a semicolon correctly.
