Gathering Variables for A/B Split Testing

A/B split testing is no longer an enigmatic term amongst web professionals; countless articles and books cover the basics. What's more, access to tools such as Visual Website Optimizer (disclaimer: this is my startup)—which simplify the setup and maintenance of A/B tests—has made the testing process itself as straightforward as possible. Despite this, A/B split testing still isn't part and parcel of the work of UX designers and internet marketers. The question then becomes: why not?

If practitioners have a firm grasp of the concepts behind A/B testing as well as tools to aid them in the process, the only thing deterring would-be testers is the question of what to test, and why. In this article, we'll look at how to identify the elements of a website most likely to affect its users, and therefore most worth testing.

Determine your primary test objective

Most websites serve multiple goals. Take, for example, UX Booth. Three hypothetical objectives of this blog are: a) getting more subscribers; b) increasing visitor engagement (measured in terms of how many visitors comment/participate in discussion); and c) increasing clickthroughs on advertisements.

In an A/B test, it is important to optimize for only one goal at a time. Though you can always measure performance on multiple goals, when it comes to creating variations, your primary goal should set the stage. Other goals should be monitored to ensure your tradeoffs are balanced. For example, if this blog optimizes for subscribers by testing a larger RSS icon, the winning variation might actually decrease total revenue from advertisements. Hence, while testing, keep an eye on all website goals. Consider integrating your test with web analytics software to measure the impact of variations across the site.
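
To make this concrete, here is a minimal sketch in Python of one way to structure such a test. The variant and goal names are hypothetical and the bucketing and logging are deliberately simplified; a tool like Visual Website Optimizer or your analytics package would handle this for you.

```python
import hashlib
from collections import defaultdict

VARIANTS = ["control", "large_rss_icon"]   # hypothetical test variants

# One primary goal drives the decision; secondary goals are only monitored.
PRIMARY_GOAL = "rss_subscription"
SECONDARY_GOALS = ["comment_posted", "ad_click"]

visitors = defaultdict(int)     # variant -> visitor count
conversions = defaultdict(int)  # (variant, goal) -> conversion count

def assign_variant(visitor_id: str) -> str:
    """Stable bucketing: the same visitor always sees the same variant."""
    digest = int(hashlib.md5(visitor_id.encode()).hexdigest(), 16)
    return VARIANTS[digest % len(VARIANTS)]

def record_visit(visitor_id: str) -> str:
    variant = assign_variant(visitor_id)
    visitors[variant] += 1
    return variant

def record_goal(variant: str, goal: str) -> None:
    conversions[(variant, goal)] += 1

def report() -> None:
    """Judge the winner on the primary goal, but print secondary goals so
    that a rise in subscribers hiding a drop in ad clicks gets noticed."""
    for variant in VARIANTS:
        n = visitors[variant] or 1   # avoid division by zero
        print(f"{variant}: {PRIMARY_GOAL} = "
              f"{conversions[(variant, PRIMARY_GOAL)] / n:.2%}")
        for goal in SECONDARY_GOALS:
            print(f"  (monitored) {goal} = "
                  f"{conversions[(variant, goal)] / n:.2%}")
```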

Decide what to test

Once you have determined your primary test goal, the next step is to consider which elements on the website could influence performance towards that goal. This isn't straightforward; I consider it the most important (and perhaps hardest) part of doing an A/B test. Theoretically, every element on the page influences visitors' decisions to complete website goals, which means that, ideally, all major elements of the website should be tested. That strategy is obviously impractical, which is why you should prioritize which elements to test.

Here are some ideas:

  1. Usability testing

    A good strategy for generating ideas is to conduct a usability test specifically with a forthcoming A/B test in mind. Instruct participants to complete your primary website goal and, at the end of the test, ask them what influenced their decisions. Their responses will point you to the elements that matter. Recently, I ran a usability test using FeedbackArmy in which I asked whether participants would like to sign up for my product. A clear pattern emerged in the responses: there was too much text on the homepage, and the text was a bit too technical.

  2. Feedback from friends and colleagues

    As website owners/designers, we know our work intimately and therefore have plenty of blind spots when it comes to evaluating it. Honest, unbiased feedback from friends and colleagues really shines here. Take note of comments such as 'The download button was hard to notice' or 'Is it a free service?' A great thing about feedback from friends is that you can follow up with more questions and even ask for suggested improvements. An ideal candidate is someone who knows you in a non-business setting and is only vaguely aware of what your website is about. This is also known as a hallway test.

  3. Web analytics data

    Mine your analytics to determine what prevents visitors from completing the goal. For example, if increasing signups is your goal, use your web analytics tool to determine whether visitors aren't reaching the signup page at all or are simply bouncing off it after arriving. In the former case, test the signup button or link on the website; in the latter, test the signup form itself. Similarly, if your objective is to increase sales during checkout, you may be tempted to test the size and color of the Buy Now button. Before jumping to conclusions, however, see which pages visitors browse to right after the checkout page. It may be that many visitors head to your shipping policy page instead of completing checkout because your shipping charges weren't clear to them. That information changes your testing priority: try a design where the shipping policy is clearly laid out before the visitor enters the checkout process. (A minimal version of this kind of funnel check appears in the sketch after this list.)

  4. Heatmaps/clickmaps

    Heatmaps provide a visual representation of where visitors click on a page. For example, a heatmap of your homepage may reveal that visitors aren't scrolling below the fold and therefore never notice your signup button. A good test in this case would be a version where the signup button sits above the fold. Crazyegg and Clicktale are good tools for this; if you're looking for a comparison, be sure to check out UX Booth's roundup of these tools.
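
As promised in point 3, here is a minimal sketch in Python that splits a signup funnel into its two failure modes. The page paths and session data are made-up assumptions; in practice, your web analytics tool reports these numbers directly.

```python
# Hypothetical page-view logs: an ordered list of visited pages per session.
sessions = [
    ["/", "/pricing"],               # never reached the signup page
    ["/", "/signup"],                # reached signup, then bounced
    ["/", "/signup", "/welcome"],    # completed signup
    ["/", "/features", "/signup"],   # reached signup, then bounced
]

SIGNUP_PAGE = "/signup"
SUCCESS_PAGE = "/welcome"   # assumed page shown after a completed signup

total = len(sessions)
reached = sum(1 for s in sessions if SIGNUP_PAGE in s)
completed = sum(1 for s in sessions if SUCCESS_PAGE in s)

# Few visitors reaching the signup page -> test the signup button/link.
# Many reaching it but few completing   -> test the signup form itself.
print(f"Reached signup page: {reached}/{total} ({reached / total:.0%})")
print(f"Completed signup:    {completed}/{reached} of arrivals "
      f"({completed / reached:.0%})")
```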

Create test variations

The third and final step is to create variations of the element(s) you selected in the previous step. Now is the time to get creative, but there is a tradeoff: the more variations you create, the longer the test takes to run. It is therefore important to select a few variations that have a good chance of beating the original. Again, coming up with creative ideas for test variations shouldn't be a random process.
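
To see why each extra variation lengthens a test, here is a rough estimate using a standard two-proportion power calculation in Python. The baseline conversion rate, detectable lift, and traffic figures below are made-up assumptions for illustration.

```python
from statistics import NormalDist

def visitors_per_variation(baseline_rate: float,
                           relative_lift: float,
                           alpha: float = 0.05,
                           power: float = 0.80) -> int:
    """Approximate visitors needed per variation to detect the given
    relative lift with a two-sided two-proportion z-test."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)            # ~0.84 for 80% power
    pooled = (p1 + p2) / 2
    n = ((z_alpha * (2 * pooled * (1 - pooled)) ** 0.5
          + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
         / (p2 - p1) ** 2)
    return int(n) + 1

# Assumptions: a 2% baseline conversion rate, a 20% relative lift worth
# detecting, and 1,000 test visitors per day split across all variants.
n = visitors_per_variation(0.02, 0.20)
daily_visitors = 1000
for variants in (2, 4, 6):   # control plus 1, 3, or 5 variations
    days = n * variants / daily_visitors
    print(f"{variants} variants: ~{n:,} visitors each, ~{days:.0f} days")
```

Under these assumed numbers, each variant needs roughly 21,000 visitors, so a test with five variations against the control takes about three times as long as a simple A/B test.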

Poring over existing A/B split testing case studies can be a fantastic source of ideas. Although their findings aren't specific to your website, there is a good chance they'll kickstart your testing. Search the Internet or use A/B Ideafox to browse case studies specific to your industry and your test objective. Try not to copy exact variations; rather, look at the general patterns tested in each case study. For example, you may notice that converting a download link into a button increased sales in one case, or that moving the location of a homepage promotional message decreased bounce rate in another.

The key message here is that usability testing, web analytics data, and case studies help practitioners get the maximum out of an A/B test. With proper preparation, your A/B test need not be a shot in the dark. Rather, you can increase your chances of success by selecting the right elements to test and then coming up with good variations.

Good luck with your next A/B test!

Want a beta invite?

Visual Website Optimizer is currently in beta, but if you want to use it, sign up for a free account using the invite code “uxbooth” (without quotes). This is an exclusive offer for UX Booth readers.

About the Author

Paras Chopra

Paras is the founder of Wingify, a startup in the web analytics and optimization space. Its first product, Visual Website Optimizer, is the world's easiest-to-use A/B, split, and multivariate testing tool. His aim with the product is to take the fear out of A/B split testing and bring the methodology to Fortune 5 million businesses. He regularly posts detailed articles, tips, and tricks on the conversion optimization blog and the I Love Split Testing blog. You can follow him on Twitter at @wingify and @paraschopra.




Comments

  1. Good article, I’m really curious to see how your tool works out.

    A/B testing is incredibly helpful when done right. You've raised some really good points about how to test properly; I think many people are guilty of trying to test or optimize too many different things at once, and when that happens it's impossible to get good data. You're right in saying that focusing on a specific task is the key.

    I have tried all of the usability tools you’ve suggested, but for smaller designers and freelancers, I find the cost is prohibitive, and the time requirements too great for our clients. We’ve developed our own usability testing service at http://intuitionhq.com and we think we’ve hit a pretty sweet price point at $9 a test, and made it as quick and easy to use as possible.

    It's really pleasing to see this new array of tools coming out these days, and I think it bodes very well for the future of the web. Design can't help but improve with the selection of great tools available.

    Cheers, and good luck with your new service.

  2. Jacob, thanks for your comment. You make an excellent point: testing is usually done in a haphazard manner, and in the end, when people don't get results, they blame the methodology.

    Neither A/B testing nor usability testing is a magic bullet; by itself, neither can work wonders. Properly planned, however, both methodologies can create real value for a business. If you can make a plan for your business, why not make a proper plan for its optimization efforts?

    Your tool looks very interesting! Good luck to you too.

    -Paras

    • That’s exactly what I think too! I think a lot of it is about educating people about the process, the tools involved, and the outcomes that can be achieved.

      Once people are aware of the impact this can have on their site and their business they are much more willing to invest their time and money into developing stronger, more user friendly sites.

      That’s why I love UXBooth, so much educational material! Thanks.

  3. Great article; very helpful.

  4. You write very informative blog posts and are helping to accelerate the business of A/B Testing, landing pages, and good design.

  5. Thanks for the article, I was just writing a protocol about this subject! =)

  6. I love that you not only put a disclaimer on your startup but also included a rel=nofollow on the link!

    I'm currently halfway through Avinash Kaushik's book Web Analytics 2.0 and I'm excited to start my testing shortly.

    • I’m glad you are excited about testing. It is a fascinating subject.

      Yep, I put a disclaimer in the post, but the "nofollow" isn't mine. The UX Booth folks probably added it! Thanks for letting me know.

  7. Nice read. I've always believed that A/B testing is an art that needs constant learning.

    This article deepened my insight into A/B testing.

    Paras, thanks for writing this article!

  8. Great read… a well-written article.

  9. If anyone is stuck for ideas: a client of mine just got a 29% boost in conversions by changing the words in a page headline.

    Just the words.

    This is the cheapest thing to test on a page and can get impressive results.

  10. Thanks Paras. I've heard about A/B testing before, but there are already so many things to cast an eye over that I haven't given it a go yet. I have one question, if that's OK: do I use the Visual Website Optimizer software with Google Website Optimizer, or on its own?

  11. Yeah, great drill-down on A/B split testing. I've also known for a while that I needed to put more time into this area: instead of focusing on getting more traffic, I need to shift focus to converting more of my existing traffic. First stop: A/B split testing!
