
Probably the best system for copywriting peer reviews ever devised. But if you’re following it, your best copy may never see the light of day… (Though I still recommend it! Click above to get it from Amazon.)
Copywriting peer reviews can really suck. They can be horrible, horrible things. And NOT for the reason you’re thinking.
But they can also be good, productive, helpful processes.
And today, I’d like to try to address both sides.
Now here’s the really hard part… I’m going to TRY to remove my fragile ego from this.
Because there’s the “weak ego” part of me that says peer reviews suck. The petty, needy, arrogant, defensive writer version of myself who can’t stand to have my masterpiece critiqued. Who has to be right all the time.
This weak sense of self will NEVER like peer reviews, because the weak ego will always interpret an attack on copy I’ve written as an attack on ME. (Even a mere suggestion that copy could somehow be improved will be seen as a personal attack!)
And my guess is that when you read the title of this lesson — Why Copywriting Peer Reviews Suck — this is the first place your mind went. It's where mine goes first, even before I can catch myself. If someone says peer reviews suck, I assume they just went through a particularly bad one, and ranting is their way of licking their wounds.
But that’s not what I want to talk about today…
And so I’m going to try to set aside the egoic reasons why I don’t like getting my copy run through the gauntlet of peer review, and really dive into a foundational issue that I think is a flaw in the peer review process.
“Regression toward the mean” and the yin and yang of peer reviews…
I came into my own in the direct response copywriting world at about the time the Copy Logic! peer review process from Michael Masterson (Mark Ford) and Mike Palmer was spreading like wildfire through our industry.
Put very simply, it was a system for conducting peer reviews. The idea is that copy is voted on by a group. Depending on the score, the copy is dumped or passed through into an improvement process. For the improvement process, suggestions are made, and the group is given the opportunity to decide whether a suggestion will help, hurt, or is neutral.
This is a process developed at Stansberry & Associates, I believe, and tested in different Agora divisions and other companies prior to the book’s publication.
It’s also a process you’re sure to go through if you write for AWAI. I was first exposed to it even in writing simple articles for them. All the promos that ever made their way out the door for AWAI had gone through at least a couple of these peer reviews.
I don’t intend to pick on Copy Logic! in particular. It’s definitely the most refined and widest-used. It’s best known in the direct response world. And I’ve gone through more Copy Logic! peer reviews (or versions thereof) than any other approach. But the flaws are not necessarily in Copy Logic! No, they’re in the idea of “peer review” itself.
A lesson from math, and how bad copy is made good through peer review…
The idea of a peer review is that if we all bring our best, the result will be the sum of the group’s best — and therefore will be better than any of us will achieve on our own.
The thing is, nobody is a very good judge of their own “best.”
And so, in the moment, we may try to bring our best. But the reality is we’re probably bringing our average. Or at least, to the average peer review, we will on average bring our average. (That’s a whole lotta average!)
Which also means that, as a whole, a peer review group will bring its average.
In math, there’s a concept called “regression toward the mean.”
Things are pulled back toward their average — toward their trend line.
In economics and investing, there’s a saying I first heard from Doug Casey. “The cure for high prices is high prices. The cure for low prices is low prices.” Abnormally high prices can’t be sustained because they kill the economics of demand. Abnormally low prices can’t be sustained because they kill the economics of supply. Either way, prices are going back toward some sort of middle ground.
Well, a lot of things work on this regression toward the mean concept.
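A quick, hypothetical way to see regression toward the mean in action (the 0-to-100 scale, skill level, and noise figures below are all invented for illustration) is to simulate draft quality as skill plus luck:

```python
# Hypothetical simulation: each draft's quality = underlying skill + luck.
import random

random.seed(42)

def draft_quality(skill):
    # A single draft is the writer's skill plus day-to-day randomness.
    return skill + random.gauss(0, 10)

skill = 50  # an average writer, on an invented 0-100 scale
drafts = [draft_quality(skill) for _ in range(10_000)]

# Pick the drafts that looked exceptional (a lucky day)...
best = [q for q in drafts if q > 65]

# ...and simulate those same writers' *next* drafts.
followups = [draft_quality(skill) for _ in best]

avg_best = sum(best) / len(best)
avg_next = sum(followups) / len(followups)

print(f"average of standout drafts:  {avg_best:.1f}")
print(f"average of follow-up drafts: {avg_next:.1f}")  # pulled back toward 50
```

Run it and the standout drafts average well above 65, while those same writers' follow-up drafts land right back around 50. That pull back toward the average is the same force a peer review group exerts on any one piece of copy.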
If you’re in a peer review with a bunch of copywriters who are better than you, they’ll bring you up toward their average.
It’s how the process works. It’s why it’s been championed throughout the industry.
Because when you’re dealing with large numbers of copywriters — particularly more novice writers who are still getting their chops — it’s a great way to get better copy out of them.
This is what Copy Logic! is great at.
If you’re running a copy department, hiring a bunch of new copywriters, and you want to make them better…
Stick them in a peer review with a bunch of more experienced direct response writers, and do Copy Logic! It will improve the writing they do today. And if they use each review as a learning opportunity, it should improve their writing to somewhere around the group’s average through time.
And yet…
“Regression toward the mean” can also make great copy merely good in a peer review setting…
The same thing that makes the peer review process work is also what makes it not work.
Stick a truly breakthrough piece of copy in peer review, and it’s going to get slaughtered.
It thrives by being different, by being new, by breaking rules, by introducing innovative thinking.
And for all those reasons, the group will hate it. They’ll vote it down. They’ll try to improve it by introducing everything it eliminated in the process of becoming extraordinary (“extra ordinary” meaning literally more than or beyond or outside of ordinary).
This mathematical concept of regression toward the mean that brings bad copy up to the good average of the peer review group will also drag great copy down to the good average of the peer review group.
A famous example that comes to mind is Gary Bencivenga’s promotion with the headline, “Get Rich Slowly.”
The reason this worked is it went completely contrary to all the vast, fast riches promises that prospects were so used to seeing. All the other copywriters in the market at the time were simply competing on versions of the get rich quick promise. Gary turned that on its head. What do you think a peer review would have looked like on that? No doubt, Gary would have been eviscerated, and sent back to the drawing board. But the client ended up testing it, and it was a winner for the ages.
This is the yin and yang of peer review…
Bad copy is built up to good…
Great copy is brought down to good…
Exceptions, deviants, miscreants, and geniuses are tamed…
And the marketplace is filled with copy that does its job in a rather bland, vanilla way.
More of the same, more of the same, more of the same.
It stuck with me a couple years ago when Bill Bonner, speaking at AWAI, lamented what was missing from copy today.
Quirkiness.
That spice of individuality, even eccentricity, that is the je ne sais quoi of copy you can’t ignore, can’t put down, and can’t help but respond to…
The solution?
Well, that’s certainly trickier than pointing out the problem.
One of few inventors to ever grace the cover of Time Magazine, Charles Kettering, said, “A problem well defined is a problem half-solved.”
I guess that was my point today.
Okay, I can’t help myself… One idea…
If you hire copywriters or run a peer review group, here’s an idea.
It actually comes from Marty Edelston at Boardroom, and how he always treated copywriters in building a $100-million-plus direct response business…
“Give the copywriter a panel in the test.”
Each copywriting assignment should result in not one, but two pieces of copy.
The first is the one that goes through all the peer reviews, and is written within that silo.
The second is up to the copywriter. They can incorporate the lessons of the peer review, but they are not required to. As long as what they say is legal and factual, they get to choose how their copy is done.
Then, test.
My bet is with most copywriters, the result will be roughly 50/50 in terms of what wins.
Novice writers will probably write a lot better in the peer review process.
Pros will probably beat the peer review most of the time, but not always.
But I'm sure you'll find, if you do this enough times, that even when the peer review wins, it won't result in breakthroughs.
It will simply result in workhorse copy that performs and keeps the business afloat.
But an overwhelming percentage of the breakthrough packages will be because the copywriter was allowed the flexibility to do it their way.
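If you do run Edelston-style panel tests, one practical question is how to tell a real winner from statistical noise. Here's a minimal sketch of a two-proportion z-test (the panel sizes and conversion counts below are made-up numbers, purely for illustration):

```python
# Hypothetical split test: peer-reviewed copy (panel A) vs. the
# copywriter's own version (panel B). All numbers are invented.
from math import sqrt

def conversion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-score: is panel B's conversion rate
    meaningfully different from panel A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Panel A: 180 conversions from 10,000 pieces mailed.
# Panel B: 230 conversions from 10,000 pieces mailed.
z = conversion_z(conv_a=180, n_a=10_000, conv_b=230, n_b=10_000)
print(f"z = {z:.2f}")
```

A |z| above roughly 1.96 corresponds to about 95% confidence that the difference is real rather than luck; with the made-up numbers above, the writer's own panel is a legitimate winner.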
Yours for bigger breakthroughs,
Roy Furr
Editor, Breakthrough Marketing Secrets
I think it was Clayton Makepeace who said, "The greatest need of man is not shelter, food or even sex. It's to critique someone else's copy."
Quick story: I once had a copy chief change one word in my headline, turning its edgy promise into a blatant lie.
It went from walking the edge of believability to being outright fraud.
That was the beginning of the end, and a lesson hard learned.
I had just read Copy Logic about a month ago, and thought it was excellent.
It would have been nice to have a team looking over the copy, instead of just one decision maker.
I'm betting my original headline would have stood — or at least changed so it wasn't promising what it could not deliver.
Thanks for the post, Roy.
Aaron, that's definitely a lesson hard-learned! Not all copy chiefs are created equal.
Copy Logic is definitely the best system out there for peer reviews, but it's still subject to my point that it can take the edge off good copy.
It's ALWAYS a balancing act!
Coming from a background of engineering, and therefore mathematics, "regression toward the mean" is deadly accurate.
When I was a senior tech writer at Hewlett-Packard, very few of us came from engineering. Most had a degree in English or Journalism.
My first boss when I started as a senior tech writer was an English major. She wanted to see the first draft of everything I wrote. Her editing mark-ups were a very quick education in where I was making simple mistakes, and I was able to sharpen my skills substantially. After a few months, she told me she didn't need to see my work anymore, and I ran with nothing but technical reviews for the next 20 years.
But on occasion I was given a block of text to insert at various locations in a large reference manual. The instructions were essentially, "This has been approved by five (or six) committees, so use it exactly as written and don't change it."
I was unpopular with some of them, because the "committee-approved" text was muddy, lacked clarity, and was often inaccurate or fuzzy, making it hard to understand. In one case, I took their version and mine side-by-side to an engineering manager who had worked on a similar industry-standard document, and asked for his opinion.
His response was curt. "Yours is better. Use it instead."
It was yet another example of what I've maintained for decades: Committees function on compromise because everybody wants to "get along" (that's one of the problems in Congress).
Consequently, you always end up with mediocre designs, mediocre products, and mediocre results.
Progress, improvement, and useful change always come from fanatics — people who are fully committed to making things better. That's the source and foundation behind new ideas, new ways of doing, and new ways of communicating.
Don't be afraid to be different, to stand out from the crowd, and defend your position. Some won't like it, but others will love it, and you'll be a great benefit to many when your idea works and proves itself in the marketplace of ideas and learning.
Yes, yes, and yes! Some peer reviews are better than others, but some are really horrible! It's nice when — occasionally — the "copy by committee" folks recognize that they've edited the life out of something, and yield.
Best wishes!
I love your/Bonner's shoutout to quirkiness. I was just looking at some of David Ogilvy's great print ads that (if submitted anonymously today) I have little doubt would have been tarred and feathered in peer reviews and/or edited into tired, chest-pounding, me-too marketing cliches.
This great post reminds me of the times I've had drafts I sweated over neutered — I mean, edited — into something unrecognizable by clients.
And every time I've just wanted to scream "PLEASE TEST MY VERSION AGAINST THIS ONE YOU'RE ABOUT TO SEND OUT!!!"
My ego and I are never shut off to the possibility that the client's desired version actually was/is somehow better — that their changes weren't to indulge their peculiar preferences, but rather *were* the ticket to higher-converting copy.
But that's the point (that Edelston understands)–can we please just find out for sure??? If I'm wrong, they can find a copywriter who's a better fit for their audience next time. If they're wrong, they can keep their hands off my next deliverable.
Next time that sort of thing happens, perhaps I'll just forward a link to this post, with no commentary of my own 🙂
James, James, James… I warn you — your clients may not like reading this post! 🙂
BUT if they actually follow the conclusion, there's a chance they'll end up with a bigger winner.
Did I inadvertently don the ol' Gary Halbert "Clients Suck" hat up there? No, or at least I didn't mean to. Haven't earned it yet.
It's just that sense of "WHAT IF mine had run exactly as I wrote it?" which eats away at me.
At any rate, the 1-2 clients I wanted to beg for a split-test are already ex-clients anyway 🙂
Bear with me a moment while I talk about lawyers and bots before I tie it in to your discussion.
"Average" legal documents, including multipage contracts, that used to be written by committees of lawyers are now being written by computer programs. Customers are paying for them because they are "adequate." What this means for the legal profession is that "average" work by humans is in the process of becoming obsolete, and therefore won't sell.
I've heard such programs are already churning out advertising copy. Soon, they will be good enough to turn out "average" work. That will be the end of advertising committees, because no one will pay for hours of human committee time to turn out a result similar to a ten-second program output.
Here's the thing, computer programs can only rearrange or modify the blocks they are given. Like a builder building a new house out of the blocks from a previous house. It may look good, but there's no creativity in it and eventually reused material gets stale.
Speaking of programs, the best computer programmers are 10 – 100 times more productive than the "average" programmer. I would imagine that a good, creative copywriter is many times more "productive" than an average one. So yes, test.
Bob, I've done the research, and I believe we're on track for this in decades, not years. There's software out there now that is okay at writing VERY SHORT copy (a few words) and testing options to see what works best. There's a long way between that and an AI that can write a magalog! We're safe… For now!
Thinking about some of Dan Kennedy's ideas about making the copy or marketing material "good enough" and not wasting too much time and energy going beyond what is needed or wanted. I wonder how much of what Bonner said in that talk about being an "artist" copywriter rather than an "artisan" applies to many of us.
Then again, I score dozens of English essays every week written by non-English speakers and while I am retrained and double-checked regularly to keep me accurate to the scoring criteria, there are many times when the "passable" essays are almost indistinguishable from each other because they are all trying to do the same things to the same topic prompts just as they have been taught.
To me, those monotonous, carbon-copy essays, each created by a different individual, are proof positive that when we get to the point where computers start writing copy, no one will be reading it.
JBH
The question isn't when computers can write bad copy, it's when they can write good copy. I think it'll be a couple decades still, but when AI gets smart enough, there's no reason to believe it can't write incredibly compelling copy. But who knows what the world will look like when that happens.