Everybody thinks about it, but there’s a lot more talk than action.

There’s been a lot of conversation lately about split-testing, also called A/B testing. Very simply, the idea is that you test one version of something against another to see which one performs better.

It could be two emails with just the subject lines differing.

It could be two ads with just the headlines differing.

It could be two opt-in forms—one asking for name and email address, the other just asking for the email address.

It could be differing positions of the opt-in form on your landing page.

It could be two different graphics.

You get the idea.

The point is that you randomly send some of your traffic to one version, and some to the other, and you measure the results. It’s not unusual to find huge differences in results, which you would never know without testing.
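If you’re curious what that random split looks like mechanically, here’s a rough sketch in Python. It isn’t tied to any particular testing tool, and the variant names, conversion rates, and tallies are made up purely for illustration; it just shows the basic idea of flipping a coin for each visitor and counting results per version.

```python
import random

def assign_variant():
    """Randomly bucket each visitor into variant "A" or "B" (a 50/50 split)."""
    return random.choice(["A", "B"])

visits = {"A": 0, "B": 0}
optins = {"A": 0, "B": 0}

# Simulate 1,000 visitors; the opt-in rates below are invented
# purely to show how a difference surfaces in the tallies.
for _ in range(1000):
    variant = assign_variant()
    visits[variant] += 1
    if random.random() < (0.10 if variant == "A" else 0.12):
        optins[variant] += 1

for variant in ("A", "B"):
    rate = optins[variant] / visits[variant]
    print(f"Variant {variant}: {visits[variant]} visits, {rate:.1%} opt-in rate")
```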

It’s not quite that simple, of course, because one version might get you more clicks or opt-ins but fewer sales. For example, if you make it easier to opt in by asking only for an email address and not a name, more people may opt in. On the other hand, those subscribers might not be as interested in your proposition, and your conversions could drop off, so you have to be careful to measure results all the way through to the sale.
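To make that concrete, here’s a tiny comparison with hypothetical numbers. They’re invented, but they show the pattern to watch for: the shorter form wins on opt-ins yet loses on sales per visitor, which is the number that actually matters.

```python
# Hypothetical numbers: the "email only" form gets more opt-ins,
# but the "name + email" form produces more sales per visitor.
variants = {
    "email only":   {"visitors": 1000, "optins": 200, "sales": 10},
    "name + email": {"visitors": 1000, "optins": 150, "sales": 15},
}

for name, v in variants.items():
    optin_rate = v["optins"] / v["visitors"]
    sales_per_visitor = v["sales"] / v["visitors"]
    print(f"{name}: {optin_rate:.0%} opt-in rate, "
          f"{sales_per_visitor:.1%} sales per visitor")
```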

You also want to change only one thing at a time, or you won’t know which change really made the difference. (There is a form of split testing called multivariate testing that does allow testing several things at once, but it’s a good deal trickier and requires lots of traffic to get reliable results.)

After you determine the winner, you test that one, now called the “control”, against a new version. You’re constantly trying to beat the control.

What many people don’t recognize is that the improvements compound. If you improve your headline by 20%, then improve your opt-in form by 20%, and then change the position of the opt-in form for another 20% increase, you’ve just increased your overall results by about 73% (1.20 × 1.20 × 1.20 ≈ 1.73).
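Here’s the arithmetic if you want to check it yourself (the three 20% lifts are just the example above, not a promise):

```python
# Three successive 20% lifts multiply rather than add.
lifts = [0.20, 0.20, 0.20]
total = 1.0
for lift in lifts:
    total *= 1 + lift

print(f"Overall improvement: {total - 1:.0%}")  # -> 73%
```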

There are a number of tools to help you do all this, and there’s more info in this post: Split Testing Leads To 300% Better Conversion Rate.

Anybody have a split-testing story to share? Comments? Questions?

Scott

P.S.  I’m working on getting out of my comfort zone and writing catchier headlines so we can get more links and traffic.  Mission accomplished with this one?