Significant Split Test Results?

by TimS
5 replies
I have a question for those of you who regularly run split tests.

I was hoping for something that would jump out and slap me in the face, but the results seem almost insignificant.

After about 200 total subscribers in an A/B split test, one page has produced 94 subscribers and the other 104.

Is that enough to consider it a significant run and start testing the better page against another?

Thanks
#results #significant #split
  • jaggyjay
    No - those numbers aren't enough to be statistically significant. Are the pages very similar? If feasible, you'll want to keep testing until you get statistically significant results. If you're pressed for time, just make a decision and go with the better-producing one.
    • MikeHumphreys
      I agree with Jaggyjay... definitely not statistically significant.

      The more important question is: why isn't the testing software you're using telling you whether the result is statistically significant? That's a pretty common feature in commercial testing scripts.
      • mequad
        Yes, JaggyJay...

        TimS should test much more different copy angles, not just something slightly different.

        Add video, or test short vs. long copy... not just a word changed here or there. Although even simple headline changes can often make a huge difference.

        I think the roughly 100 ACTIONS from each page IS significant enough to pick the winner, although the better page is only about 10% ahead in this case.

        If it were 100 exposures (not actions), then no...

        But 100 ACTIONS/CONVERSIONS is significant, whether they are opt-ins or sales.

        Some statisticians say 50 ACTIONS/CONVERSIONS on both versions is enough to be 90% certain of the winner.

        From my learning and experience...
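
For anyone who wants to put numbers on this rather than argue rules of thumb, here is a minimal sketch of the check (in Python purely for illustration; TimS's setup is PHP, and the figures come straight from his post). It assumes traffic was split evenly between the two pages, which TimS confirms below, so under the null hypothesis that both pages convert equally, each of the 198 sign-ups is a fair coin flip between the two pages:

```python
# Two-sided significance check for the 94 vs 104 split, assuming even traffic.
# Under the null hypothesis (both pages convert equally), the count landing on
# one page follows Binomial(198, 0.5); we use the normal approximation.
import math

a, b = 94, 104
n = a + b                                # 198 total conversions

mean = n * 0.5                           # expected count per page: 99
sd = math.sqrt(n * 0.25)                 # standard deviation: about 7.04
z = (b - mean) / sd                      # about 0.71
p = math.erfc(abs(z) / math.sqrt(2))     # two-sided p-value: about 0.48

print(f"z = {z:.2f}, two-sided p = {p:.2f}")
# p comes out around 0.48: a gap of 94 vs 104 (or bigger) happens roughly half
# the time even when the two pages are identical, so jaggyjay and MikeHumphreys
# are right that this run is not statistically significant yet.
```

By the same arithmetic, how many conversions you need depends on the size of the gap: a roughly 10% relative difference like this one takes far more than 100 conversions per page to confirm with any confidence, which is why rules of thumb like "50 or 100 conversions per version" really only settle things when one version is dramatically better.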
  • Frank Bruno
    I would take the better version, make it the "control", and split test another version against it.

    Frank Bruno
  • TimS
    Thanks for the replies. I'm not using an A/B script; I'm using a PHP script that pulls one of the two pages at random. It loads them evenly over time and gives me a good split.

    The pages are very different. One has a headline and copy; the other has a video and a link asking visitors to fill out the form, with the text well below the fold.

    Thanks,
    Tim
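
TimS's rotation script is PHP, but the idea is simple enough to sketch; the Python below is a hypothetical stand-in rather than his actual code, and the file names are made up:

```python
# Minimal random 50/50 page splitter: the same idea as TimS's PHP rotator.
import random

PAGES = ["squeeze_a.html", "squeeze_b.html"]   # hypothetical file names

def pick_variant() -> str:
    """Return which page to serve; a plain random choice converges on an
    even 50/50 split over enough visitors."""
    return random.choice(PAGES)

# In practice you would also remember the assignment (cookie or session) so a
# returning visitor keeps seeing the same version; otherwise one person can be
# counted under both pages and muddy the subscriber totals.
```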
