File today’s daily essay under “More stupid mistakes from Roy’s misadventures in trying to master Facebook ads,” with a cross-reference under “Public Service Announcement.”
(The public service announcement comes from Googling “What Facebook placements are best for long copy ads?” and not finding an answer… The answer is below!)
If you want to blow through your ad budget quick…
Get all excited that you’re doing the right thing because you’re getting clicks and traffic…
But have it dawn on you that nobody’s converting…
And you’re left with not much to show for it at the end of the day…
All you have to do is copy my stupid mistakes!
I say that tongue-in-cheek, but I have yet another great lesson for you today, learned only through failure.
(See, I told you that “fail fast” is the best way to get good at anything! Heck, Claude Hopkins even devotes an entire chapter of Scientific Advertising to test campaigns. It starts, “Almost any question can be answered, cheaply, quickly and finally, by a test campaign.”)
I mentioned yesterday that I was testing a new Facebook ad. And, in fact, I was so bold as to think it was going well.
It’d been running for a while. It had gotten its first conversion pretty quickly. And it was getting a bunch of attention, as evidenced by the clicks it was getting.
By the end of the day, a different picture was emerging.
I was getting a good amount of traffic to a landing page that should have been getting at least minimal conversions.
But, alas, I had to shut down the ad. Because the people who were showing up weren’t converting at a level that I could justify letting it run.
The day before? No clicks on my old ad. Yesterday? Plenty of clicks on the new one, but no conversions. What gives?
So I started diving into the data.
Why were people interested enough in my ad to click, but then failing to opt in for the offer I was making? Considering the offer is FREE, as long as the copy in my ad reasonably sets up the ask, I should get at least 10% opt-ins.
And I know I’d done that with the copy. I’d written it in a way that when you read it, you knew I’d be asking for an email address after the link click.
Of course, that assumed you read the dang thing…
When I looked at the data, an interesting pattern emerged.
I had set it up to let Facebook find the best potential audience, based on their advanced algorithms (trusting the Facebook algorithms is often the best way to go).
I was starting with a lookalike audience, which is usually pretty helpful for targeting. But beyond that, I let it run the ads to all the placements, on the assumption they’d do a good job of connecting the audience with my ad, wherever they were.
Almost 97% of my clicks were coming from the Facebook Audience Network.
Those are the ads that appear off Facebook.
And I had a realization. For the ads that are appearing off-Facebook, there’s not always a “read more” link. And if my ad NEEDED long copy to convert… That could be a problem!
So I quickly went back and looked at all the ads. Sure enough, the ads that appeared in the Facebook News Feed all had that “read more” link.
And the ads that were off-Facebook, that were getting all the clicks and eating up my budget without any conversions to show for it?
No “read more” links!
What’s the lesson? What is the takeaway?
First off, if you’re running long copy ads, the ONLY placement option you should use is to run those ads directly on Facebook, and only in the News Feed.
Set that up at the Ad Set level. Create an Ad Set specifically for “long copy” and use that for testing any long copy ads.
If you use any other placement and hope your long copy will get read, it won’t! It’s actually impossible: depending on the placement, viewers can only see the first couple of sentences, or even just the first few words.
(However, if you do get a long copy ad to work in News Feed ads and want to see if you can replicate it in other placements, there’s an easy trick… Convert that long copy ad into a landing page on your site that functions as an advertorial jump page, redirecting visitors to the landing page you want them to end up on. Make the Facebook ad the shortest version possible to work for all placements, link to the longer copy on your advertorial page, and fill that page with links to the opt-in page. This isn’t guaranteed to work, but if you have proven long copy and want to get access to the other placements, this should be your first test.)
That’s not the ONLY takeaway though, and the rest may be even more valuable…
A quick metaphor: Nobody knows exactly how humans first figured out how to make wine. But here’s the best guess. A bunch of grapes were stored in a large, sealed pot. And forgotten. In that pot, there were also some microorganisms — yeast — that started to break down the sugars in the grapes. Give that yeast enough time, and it converts the sugars in the grapes into alcohol. Someone discovers this strange-smelling pot of old grapes, and decides to taste the juice. Voila — wine!
Many times mistakes yield unexpected results.
I wrote the ad yesterday with a very specific intention.
My first ad that wasn’t getting enough clicks was very direct. It was about the offer itself. If you’re familiar with the Eugene Schwartz market awareness spectrum (Breakthrough Advertising and Great Leads are both required reading), you’ll recognize that this means it was aimed at the most-aware segment of my market.
But, that’s not who I really wanted to target. Rather, these ads I was creating were meant to convert a cold audience. People who didn’t know me from Adam. Who would be interested in the result I was offering to provide, but who had zero awareness of me or the offer.
And so yesterday’s ad was meant to be much less direct. Much less about the offer, and more about the general problem I was going to help them solve.
I’d had an inspiration from an off-hand remark in a podcast, and decided to test it.
Turns out that problem was a mega hot-button!
When I looked back at those ads that were getting all the clicks but not conversions, all you could read was that I was offering to solve that problem.
The thing is, that’s not what the landing page was about. Because in the long-copy version of the Facebook ad, the copy connected the dots. It said, “If you have this problem, here’s how you solve it…” Once I connected the dots, then I asked for a click to get the solution. And the solution was a fit with my offer.
On the assumption that you read the intro to the post, clicked “read more” and read until at least the first call-to-action… The process made sense!
But you know what they say about assumptions… 🙂
Problem was, everybody clicking from all those placements off Facebook never had a chance to read the copy that connected the dots. So they were hoping I’d present the solution on the landing page. Instead, they got a jump straight to the offer.
So here’s a big takeaway: create more ads to address the proven hot button!
If you test an ad and it gets a ton of clicks but no conversions, that’s actually a good thing. Even if you’ve just blown a bunch of cash with no hope of making it back.
All you have to do is create a better total experience around this promise.
For example, I can do the reverse of the process I mentioned above.
I can take the ads that I already know get clicks, and accept the fact that the short-copy version works to get clicks. Then, I can take the copy from the long-copy version, and put that on a landing page. That can serve as an advertorial, and I can pack that page with links to the opt-in page I want visitors to end up on.
Then, I can track what percentage of visitors click through from there, what percentage opt in, and what percentage take later steps in my funnel.
And given enough traffic, each of these steps can be optimized to perform well.
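To see why tracking each step matters, here’s a minimal sketch of how those funnel percentages multiply together (all the numbers are made-up placeholders, not figures from my actual campaign):

```python
# Hypothetical funnel numbers -- purely illustrative.
visitors_to_advertorial = 1000   # clicked the short ad, landed on advertorial
clicks_to_optin_page = 420       # clicked through to the opt-in page
optins = 95                      # gave their email address

# Each step's rate, plus the end-to-end rate (the product of the steps).
click_through_rate = clicks_to_optin_page / visitors_to_advertorial
optin_rate = optins / clicks_to_optin_page
overall_rate = optins / visitors_to_advertorial

print(f"Advertorial -> opt-in page: {click_through_rate:.1%}")
print(f"Opt-in page conversion:     {optin_rate:.1%}")
print(f"Overall funnel conversion:  {overall_rate:.1%}")
```

Because the overall rate is the product of the step rates, a small lift at any one step lifts the whole funnel — which is exactly why each step is worth isolating and testing.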
The closer you look, the more you’ll find to optimize…
Final thought. Very early in my marketing career, I was given an incredible gift. This was actually from my very first freelance client, David Bullock.
David taught Taguchi testing, which is a method for testing a ton of variables in a single marketing campaign. But instead of testing all combinations of all variables, you test a tiny subset of the possible combinations.
So, if you want to test two versions of each of eleven different ad elements, you’d have to test 2,048 combinations to get them all. But using Taguchi statistics, you can select the best-performing of all 2,048 ads with just a dozen test panels.
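If you want to sanity-check that arithmetic, here’s a quick sketch (the factor names are illustrative placeholders, not the elements David actually tested; the 12-run figure corresponds to a standard Taguchi L12 orthogonal array for eleven two-level factors):

```python
from itertools import product

# Eleven ad elements (factors), two versions (levels) each.
factors = ["headline", "image", "opening_line", "offer_framing",
           "cta_text", "button_color", "social_proof", "guarantee",
           "price_anchor", "ps_line", "layout"]
levels = ["A", "B"]

# A full factorial test requires every combination of versions:
full_factorial = list(product(levels, repeat=len(factors)))
print(len(full_factorial))  # 2 ** 11 = 2048 combinations

# A Taguchi L12 orthogonal array covers eleven two-level factors in
# just 12 balanced runs, enough to estimate each factor's main effect.
l12_runs = 12
print(f"Full factorial: {len(full_factorial)} runs; Taguchi L12: {l12_runs} runs")
```

The trade-off is that the orthogonal array estimates each element’s individual effect rather than exhaustively testing every combination — which is exactly the “break it into component parts” skill described below.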
To do this type of multivariate testing, you have to develop x-ray vision. That is, you have to be able to break down an ad or landing page or campaign into its smallest component parts. To consider each individually, and recognize that one tiny thing (such as a headline or picture) can play an important role in the total response. But to test that, you have to be able to isolate it.
The gift David gave me was the ability to really break down any advertising approach into its component parts, and start to understand what its relative performance means.
So high clicks but no conversions is a failure of the total system, but there’s something in there that’s successful. How can we use that as a starting point for the next test, and the next? How can we be constantly iterating with our destination in mind, until we get there?
A jet airplane doesn’t fly in a straight line. Rather, the pilot takes off with a destination in mind. They aim the nose of the plane in the right direction, but wind and other factors quickly veer them slightly off course. So they make a little correction, to get back on course. This happens thousands of times in a flight. By knowing the destination and constantly course-correcting, they know they’ll get there. And it happens every time.
Yours for bigger breakthroughs,