Why Testing Doesn't Always Work...

Hi guys,

I love testing... but I've found in my own tests that it's not always 100% accurate... and treating it as such can be a killer.

I just read in "How to Write a Good Advertisement" by Victor Schwab that most advertisers (at least when the book was written, which I think was in the '60s) ignored a split-test result if there wasn't a 15% difference in response rates... because even the EXACT SAME AD could show a 15% difference in response rate from one run to the next.

That doesn't mean a 5% versus a 20% CR, BTW... it means a 15% relative difference, so say a 1% versus a 1.15% CR... just so we're clear.

I would suspect that these days tests are probably a little more accurate... but it's still an interesting theory... and one I personally agree with.
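
Just to put rough numbers on that, here's a quick back-of-the-envelope check... a minimal sketch using a standard two-proportion z-test (the 10,000 visitors per variant is a sample size I made up for illustration):

```python
# Is 1% vs 1.15% conversion actually a significant difference?
# Standard two-proportion z-test, standard library only.
import math

def z_test(conv_a, n_a, conv_b, n_b):
    """z-score and two-sided p-value for a difference in conversion counts."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (conv_b / n_b - conv_a / n_a) / se
    return z, math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value

# 1.00% vs 1.15% on 10,000 visitors per variant:
z, p = z_test(100, 10_000, 115, 10_000)
print(f"z = {z:.2f}, p = {p:.2f}")  # z ~ 1.03, p ~ 0.30 -- not significant
```

Even at those fairly generous sample sizes, 1% vs 1.15% is well within noise... which lines up pretty well with Schwab's rule of thumb.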

Anyone care to share their own experiences on this?

-Dan

P.S. Be careful though... I think Vin sleeps with a copy of this thing under his pillow, so if you are going to say something bad about the book... do it from behind some riot gear
  • Scott Murdaugh
    Hey Dan,

    I'm definitely not going to bash "How To Write A Good Advertisement"...

    But online we have a MASSIVE advantage over the old copywriting guard: we can test constantly, and we get instant feedback.

    New headlines, new bullets, guarantees, price points... And we have some amazing multivariate testing scripts available to us so we can test all of these things on an ongoing basis...

    That being said, I'll generally stick to testing the most important elements... Once there's a clear winner, I'll move on.
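
    For what it's worth, the bucketing piece of a split-test script is tiny... here's a minimal sketch (illustrative only, not any particular testing product) that hashes the visitor ID so the same visitor always sees the same variant:

    ```python
    # Deterministic split-test bucketing: same visitor, same variant.
    import hashlib

    def assign_variant(visitor_id: str, variants=("control", "new_headline")):
        digest = hashlib.sha256(visitor_id.encode("utf-8")).digest()
        return variants[digest[0] % len(variants)]

    print(assign_variant("visitor-12345"))  # stable across repeat visits
    ```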

    This is also timely for me...

    I had a client several months ago with a great product... The problem was that both the product and the sales copy were around 5 years old. The product was still relevant, but the client thought his traffic might be jaded by the old sales letter.

    I rewrote the letter for him, and he kept it live for several months. His initial feedback was that conversions had increased.

    I go check on clients from time to time, and I noticed yesterday that his old copy was back up... This kind of bummed me out... I know we can't hit a home run every time, but if my copy wasn't beating his old control I wanted to figure out why.

    It had been written by a pro, and it wasn't bad at all. I just took the whole angle/USP in an entirely different direction.

    I emailed him to ask what was going on... whether my letter had failed to beat the control, whether there was anything I could do to help, etc. I had even given him several versions of the letter, each testing a different aspect of it.

    His response was that my letter beat the old control on traffic that had seen the old copy before... But on the organic traffic to the site the two letters were converting almost exactly the same.

    I was happy to hear that he hadn't switched permanently, but was still in the process of testing both letters on various sources of traffic.
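
    If your stats export raw visits, that kind of segmenting is easy to do yourself. A rough sketch (the log format here is hypothetical):

    ```python
    # Conversion rate broken out by (traffic source, letter).
    from collections import defaultdict

    # Hypothetical log: one (source, letter, converted) record per visit.
    visits = [
        ("organic", "new", True), ("organic", "old", True),
        ("returning", "new", True), ("returning", "old", False),
        # ...thousands of rows in practice
    ]

    stats = defaultdict(lambda: [0, 0])  # key -> [conversions, visits]
    for source, letter, converted in visits:
        stats[(source, letter)][0] += converted
        stats[(source, letter)][1] += 1

    for (source, letter), (conv, n) in sorted(stats.items()):
        print(f"{source:>9} / {letter}: {conv}/{n} = {conv / n:.0%}")
    ```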

    You're right, sometimes two different elements, or even completely different letters, will yield almost identical results.

    On the other side of the coin, sometimes small changes produce massive swings in conversions.

    So even though you *might* not see enough of a difference to make it relevant, if you don't test you're almost always leaving money on the table.

    I know you weren't suggesting that testing isn't worth it; I just figured I'd jump in and share some of my recent experiences with it.

    Peace,

    -Scott

  • Raydal
    Originally Posted by Daniel Scott

    I just read in "How to Write a Good Advertisement" by Victor Schwab that most advertisers ignored a split-test result if there wasn't a 15% difference in response rates... because even the EXACT SAME AD could show a 15% difference in response rate from one run to the next.

    Hi Dan,

    You are quoting from page 174 of his book, and the point Vic is making is that the SAME ad could vary in results (against itself) by 15%, so it doesn't make sense to get excited about a 15% difference with a DIFFERENT ad.

    So you may find that your control's conversion rate varies between 2% and 2.3% all on its own (that's the 15% swing). If you then run this control against a new letter and the results come back 2.1% and 2.3%, that's not enough of a difference to get excited about.

    That's the real point the author is making, and it makes a lot of sense.
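
    If you want that rule as a quick sanity check, here's a tiny sketch (the 15% noise floor is Schwab's rule of thumb, not a statistical law, and the numbers are the ones from above):

    ```python
    # Schwab's rule of thumb: ignore any lift smaller than the ~15%
    # swing the SAME ad can show against itself.
    def meaningful(control_cr: float, new_cr: float, noise: float = 0.15) -> bool:
        lift = (new_cr - control_cr) / control_cr
        return abs(lift) > noise

    print(meaningful(0.020, 0.023))  # 15% lift -- right at the noise floor: False
    print(meaningful(0.020, 0.025))  # 25% lift -- worth paying attention to: True
    ```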

    -Ray Edwards
