An Introduction to A/B Testing

How do we know whether our website is easy to use? Is it convenient for visitors? What attracts them most?

There’s a well-known technique for answering these questions, and you’ve probably heard of it: A/B testing.


So, what is A/B testing?

A/B Testing (also known as Split Testing) is a website optimization technique that involves sending half of your users to one version of a page and the other half to another, then watching your web analytics to see which version is more effective at getting them to do what you want them to do (for example, sign up for a user account).

Simply put, it’s just a way to figure out how you can get your website visitors to click that shiny red button instead of yesterday’s green one. Or help you find out if the green one was the better choice to begin with, thus forcing you back to the drawing board.

Put another way, it’s a method for comparing two versions of a single element to determine which one performs better. Is the winner version A or version B?

Clear so far?

To discover which version is better, you expose both to live traffic at the same time. Afterward, you’re left with the more successful option to put into real-world use.

Here are some important points you should know about A/B testing.

Where to start?

It’s a very common problem to know what you should be testing but not where to begin. To get started, let’s walk through a typical A/B testing process:

  1. Get a clear understanding of what is happening on your website and find the areas where A/B testing can bring the best results. Form a clear hypothesis for your test; defining it clearly will help you come up with variations that give meaningful results, and it ensures you know what results you’re looking for before you start.
  2. Change some elements and create a new variant that you expect to convert better.
  3. Start the A/B test. Suppose your hypothesis is that a shorter form converts better: you create a variation that reduces the number of form fields, split the website traffic 50/50 between the original and the shorter variation, and let the experiment run until it reaches statistical confidence (95% confidence is the accepted standard; see the sketch below). Remember always to test against the original (the “control”) at the same time so you can compare results. This way you’ll know whether the variation is better or worse than the original.
  4. Analyze the results. Once a statistically significant number of visitors have seen your forms, ask the question: did the shorter form make a difference in generating leads?
  5. Optimize. Roll out the variant that benefits you most.

If the change increased leads to a point you’re happy with, remove the A/B testing script and make the new version of the form your default page.

If the change doesn’t significantly improve conversion, or actually reduces it, create a new test and come up with a new variant B.
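Step 3 relies on reaching statistical confidence before declaring a winner. Here is a minimal sketch of that check as a two-proportion z-test in Python. The visitor and conversion counts are invented for illustration; in practice, your testing tool runs this calculation for you.

```python
# Minimal statistical-confidence check for an A/B test: a two-proportion
# z-test comparing the control (A) against the variation (B).
from math import sqrt
from statistics import NormalDist

def z_test(conv_a, n_a, conv_b, n_b):
    """Return the z-score and two-sided p-value for the difference
    between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical numbers: 2,400 visitors per variation
z, p = z_test(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 means 95% confidence is reached
```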

What to test?

You can test virtually anything in your marketing materials: headlines, calls to action, body copy, images, etc. If you can change it, you can test it. But that doesn’t mean you should necessarily spend months testing every little thing. Instead, it’s better to focus on the things that are most likely to have a big impact.

Once you’ve decided which conversion rate you want to improve, the next stage is to work out what to change on the page to try to increase conversions. Look at the elements on the page in question that could be changed; these may include:

  • Headings – size, color, wording
  • Images – placement, different images
  • Content – amount, wording, font, size and placement on the page
  • Call-to-action buttons – buy now, sign-up and subscribe buttons can differ in size, color, placement and wording
  • Social media buttons – placement, size and wording are all worth testing
  • Logo and strapline

Time for testing

A/B testing is not an overnight project. Running a test takes at least a full business cycle, which is a week in 95% of cases. Even if you reach your Minimum Sample Size in 3 days, you should not stop the test until it has run for 7 full days, or whatever duration your business cycle is.

The only time you can break this rule is when your historical data says with confidence that the conversion rate is the same every single day. Even then, it’s better to test one week at a time: if you don’t test in full weeks, day-of-week effects will skew the results.
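To see how sample size relates to run time, here is a rough sketch of a per-variation Minimum Sample Size estimate, using the standard two-proportion power formula at 95% confidence and 80% power. The 5% baseline conversion rate and 20% relative lift are assumptions for illustration.

```python
# Rough Minimum Sample Size estimate per variation, based on the
# standard two-proportion power calculation.
from math import ceil, sqrt
from statistics import NormalDist

def min_sample_size(baseline, lift, alpha=0.05, power=0.80):
    """Visitors needed per variation to detect a relative `lift`
    over a `baseline` conversion rate."""
    p1 = baseline
    p2 = baseline * (1 + lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # 1.96 for 95% confidence
    z_beta = NormalDist().inv_cdf(power)            # 0.84 for 80% power
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p2 - p1) ** 2)
    return ceil(n)

# e.g. a 5% baseline rate and a hoped-for 20% relative lift
print(min_sample_size(baseline=0.05, lift=0.20), "visitors per variation")
```

Whatever number comes out, round the run length up to whole business cycles, as described above.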

Does A/B Testing Always Work?

A/B testing pays off when you define the purpose of your test beforehand. Set a goal you wish to accomplish, then go about doing the test. Again, there’s no need to start on a whim; do things methodically.

Before you get down to testing make sure you know exactly what you’re setting out to achieve.

Do’s and Don’ts of A/B Testing

Even though A/B testing is super simple in concept, keep some practical things in mind. These suggestions come from real-world experience of running many A/B tests:

Don’ts

  • When doing A/B testing, never wait to test the variation until after you’ve tested the control. Always test both versions simultaneously, splitting traffic between them; if you test one version one week and the second the next, you’re doing it wrong.
  • Don’t conclude too early. There is a concept called “statistical confidence” that determines whether your test results are significant (that is, whether you should take the results seriously). It prevents you from reading too much into the results if you have only a few conversions or visitors for each variation.
  • Don’t surprise regular visitors. If you are testing a core part of your website, include only new visitors in the test.
  • Don’t let your gut feeling overrule test results. The winners in A/B tests are often surprising or unintuitive. Your goal with the test is a better conversion rate, not aesthetics, so don’t reject the results because of your arbitrary judgment.

Do’s

  • Know how long to run a test before giving up. Giving up too early can cost you because you may have gotten meaningful results had you waited a little longer.
  • Show repeat visitors the same variations. Your tool should have a mechanism for remembering which variation a visitor has seen; this prevents blunders, such as showing a user a different price or a different promotional offer. A minimal sketch of one such mechanism follows this list.
  • Make your A/B test consistent across the whole website. Showing one variation on page 1 and another variation on page 2 will skew the results.
  • Do many A/B tests. Let’s face it: chances are, your first A/B test will turn out a lemon. But don’t despair. An A/B test can have only three outcomes: no result, a negative result or a positive result. The key to optimizing conversion rates is to do a ton of A/B tests, so that all positive results add up to a huge boost to your sales and achieved goals.
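The “same variations for repeat visitors” rule doesn’t have to mean storing any state: hashing a stable visitor ID (a cookie value, say) gives a deterministic 50/50 split, so the same visitor always lands in the same bucket. Here is a minimal sketch, with a hypothetical visitor ID and experiment name:

```python
# Deterministic variant assignment: the same visitor ID always maps to
# the same bucket, so repeat visitors see the same variation.
import hashlib

def assign_variant(visitor_id: str, experiment: str = "signup-form") -> str:
    """Assign a visitor to 'A' (control) or 'B' (variation)."""
    digest = hashlib.md5(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100          # 0-99, roughly uniform
    return "A" if bucket < 50 else "B"

print(assign_variant("visitor-42"))  # same ID in, same variant out, every time
```

Keying the hash on the experiment name as well means a visitor’s bucket in one test doesn’t determine their bucket in the next.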

I’d like to share a couple of examples to show how this works in practice.

Examples of A/B testing

Example 1: testing buttons

Some marketers will tell you to spend 50% of your time on the headline and the other 50% on your call to action (CTA). Smarter marketers will factor the value of your content into this equation, but you get the point. CTAs are a great place to start your A/B testing: test not only shape, color, size and position, but the text as well.

In the example below, changing the shape and color of the button resulted in a 35% increase in conversion.

(Image: button A/B test)

Example 2: testing headlines

Your headline should be the hook that keeps visitors from bouncing immediately. Copywriters at Upworthy apparently won’t choose which headlines to test until they have come up with at least 25 candidates. Most businesses rely less on extremely captivating headlines, but the headline will nonetheless be among the first elements of your website that visitors notice.

In the example below, borrowed from Unbounce, the second headline resulted in a 68% increase in click-through rate.

(Image: landing page headline A/B test)

Conclusion

A/B testing is a genuinely useful technique, with its pros and cons. And you know it. Nothing magic, but productive.

It can improve your conversion rate and give your website a second life. I don’t mean it will blow up the internet; results vary. But to avoid losing out, keep in mind the important points mentioned above. They’ll help you avoid simple mistakes and achieve success.

Yes, everything is simple. Just try it!
