We all remember Barack Obama’s 2008 presidential campaign, which was innovative in many ways. But few know that, back in 2007, the team acquired a new member: Dan Siroker, then a product manager at Google. As a digital advisor for the presidential campaign, he did what he knew best: he applied his experience as a Googler and showed Obama’s team how to do A/B testing. Using that tool, Siroker and other specialists reworked the campaign website. The improvements resulted in 40% more sign-ups and $75 million in donations.
Sounds impressive, doesn’t it?
What Is A/B Testing?
A/B testing is a method of UX research used to determine the most effective variant among different versions of the same thing. For example, you have a Call-to-Action (CTA) button, but you’re not sure what text to put in there. So you create 2 variants, place them on your website with the help of special software, and wait. In the meantime, these versions are randomly shown to different users who naturally click on the version they like most. After the test is over, you get the results and see which CTA button works best.
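The random split described above can be sketched in a few lines of Python. This is a minimal illustration, not any particular tool’s API; the variant texts and the 50/50 split are made up:

```python
import random

# Hypothetical CTA texts under test (illustrative values).
VARIANTS = {"A": "Sign up now", "B": "Get started for free"}

def assign_variant(user_id: str) -> str:
    """Assign a user to variant A or B, randomly but repeatably."""
    rng = random.Random(user_id)  # seeded per user, so the choice is stable
    return "A" if rng.random() < 0.5 else "B"

# The same visitor always sees the same version of the button.
label = assign_variant("user-123")
cta_text = VARIANTS[label]
```

Seeding the generator with the user ID keeps the experience consistent: a returning visitor sees the same variant on every page load.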
It is safe to say that split testing, another name for A/B research, was used long before the web took our lives by storm. Of course, the form and execution were different, but the core concept has stayed the same through all these years. A/B testing as we know it today was first applied by Google in 2000 to identify how many search results users wanted to see on one page. Nowadays, it’s a tool used by every goal-oriented company that wants to create a successful and popular product. The reason for such popularity is simple: you learn what your users prefer not from their words – which can sometimes be sugarcoated, to say the least – but from their behavior. And where else can you find more reliable information?
Control and Variation
Two main notions rule in the A/B testing world:
- Control – the original version of the element you want to test;
- Variation – the alternative, which can comprise a changed text, color, design, etc.
They are always tested simultaneously, as this is the only way to compare their effectiveness. If you show the different versions to your audience at separate times, the data you receive will not reflect reality: factors that affect user behavior might change in the meantime and produce entirely different findings. What you need is to see results from the control and the variation tested under the same conditions.
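A toy simulation shows why sequential testing misleads. The conversion rates and the day-2 spike in interest below are invented purely to illustrate the point:

```python
import random

def conversions(rate: float, visitors: int, seed: int = 0) -> int:
    """Count simulated visitors who convert with the given probability."""
    rng = random.Random(seed)
    return sum(rng.random() < rate for _ in range(visitors))

# Imagine underlying interest doubles on day 2 (say, a press mention).
# A sequential test - control on day 1, variation on day 2 - would credit
# the design for a lift that actually came from the news cycle.
control_day1 = conversions(0.05, 5000)    # baseline interest
variation_day2 = conversions(0.10, 5000)  # inflated by external events
# variation_day2 comes out far ahead even if the designs perform identically.
```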
Types of A/B Testing
A useful side effect of A/B testing is that it shows how well (or how poorly) you know your target audience, and how subjective your opinions and preferences might be. When Siroker started working on Obama’s campaign website, everybody on the future president’s team was positive that a video of Barack giving a speech at a rally would beat any photo on the homepage. But the A/B test results indicated that even a simple photo of Barack Obama on a turquoise background outperformed the video by 30%.
This example proves that you shouldn’t trust your instincts when it comes to the web. What you should do is test your hypotheses – and to do that, you have 3 types of A/B testing:
- A/B split testing: This is a regular experiment where an original element (or a cluster of elements) from a web page is tested against its modified version. If you involve several variations, they must all modify the same element(s).
- Multivariate testing: If you want to assess a CTA button in the 1st variation, replace the background photo in the 2nd, and rewrite the headline in the 3rd, this type of testing is for you. You will be able to see which element increases your conversion rate the most in comparison with your control variant(s).
- Experimental design: This combines the two types described above. It allows you to evaluate several changes in one variation alongside completely different modifications in another. For instance, your 1st variation has an amended CTA button and a different background photo, while your 2nd variation tests a rewritten headline and a new menu organization.
Of course, each type of A/B testing has its pros and cons. Split testing, for example, limits you to one element or the same set of elements, but its results are very easy to interpret. Multivariate testing, in turn, gives more variety, but requires a large number of users: if you have 10 different elements to test, your audience is split ten ways, and you have to wait a considerable amount of time to receive statistically reliable results. As for experimental design, its findings are quite hard to interpret, but it reduces the number of variations you need.
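The traffic problem above can be put in numbers. Lehr’s rule of thumb approximates the sample needed per variant for roughly 80% power at 5% significance; the traffic figures below are illustrative:

```python
def sample_size_per_variant(baseline_rate: float, min_detectable_change: float) -> float:
    """Lehr's rule of thumb: n ~ 16 * p * (1 - p) / delta^2 users per variant."""
    p = baseline_rate
    return 16 * p * (1 - p) / min_detectable_change ** 2

def days_needed(daily_visitors: int, n_variants: int, per_variant_sample: float) -> float:
    """Each variant only receives 1/n_variants of the daily traffic."""
    return per_variant_sample / (daily_visitors / n_variants)

# Detecting a lift from 5% to 6% conversion needs ~7,600 users per variant.
n = sample_size_per_variant(0.05, 0.01)
days_needed(1000, 2, n)    # plain A/B test: ~15 days
days_needed(1000, 10, n)   # 10-way multivariate test: ~76 days
```

The same required sample takes five times longer to collect when the audience is split ten ways instead of two, which is exactly why multivariate tests demand heavy traffic.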
The question is not whether you need A/B testing, but which type to use. Every situation is unique, so assess your resources to learn which technique to apply.
If you’re still unsure whether you should opt for A/B testing, the next section is for you.
Why Is A/B Testing Important for Your Web Product?
Before we move on to the details of the A/B testing process, let’s sum up the benefits of this UX research method. Once you know them, you’ll understand the value of split tests and have no more doubts.
- It is a proven way to get honest feedback. The users have no idea that they are participating in an experiment. Thus, they behave naturally and click on the things they truly prefer.
- A/B testing is universally applicable. It suits large corporations, small companies, and individual designers/developers alike. What’s more, you can conduct an A/B test with a large user base or a small one.
- It’s a low-risk business. You gradually introduce changes that are “approved” by your end users.
- You cut down bounce rates. This is a great way to identify what exactly makes visitors leave your website/page.
- You raise conversion rates. By improving buttons, calls to action and design overall, you convert your visitors into subscribers, clients, donors, etc.
- A/B testing optimizes the work of the team. It is no longer necessary to spend hours on determining which version of the text or which color of the background is better. You leave it all up to the users.
Now let’s find out what exactly you can test to refine your website.
What To Test
To gain the maximum benefit and feedback, you need to use A/B testing for the specific elements this technique was designed for. In this section, we’ll list these testable elements and expand on how they are analyzed during A/B testing.
Web Design: Layouts, Colors, etc.
No matter how professional and competent a designer you are or how great your design team is, it’s the user who makes the difference. That’s why when you are creating a website, landing page or blog, you should know which design leads to more engagement and conversions from the visitor’s side.
Here you can test background colors, various organizations of website elements or menu structures. Your variations don’t have to be completely different: you can test small components as well, because details play an important role in visitor behavior.
Headlines and Product Descriptions
Here you can determine which title makes readers click through and read the article. You can also find out what kind of product description persuades potential clients to make a purchase. To get the most out of A/B testing at this stage, brainstorm 10-15 versions of the same headline or short description, select 3-4 favorites, and put them to the test. The stats will indicate the best variation.
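Picking the winner among a few headline candidates then comes down to comparing click-through rates. The headlines and counts below are entirely made up:

```python
# Illustrative (impressions, clicks) stats for three headline candidates.
headline_stats = {
    "How A/B Testing Won an Election": (1200, 85),
    "A/B Testing Explained": (1180, 47),
    "Stop Guessing - Start Testing": (1210, 66),
}

def best_headline(stats: dict) -> str:
    """Return the headline with the highest click-through rate."""
    return max(stats, key=lambda h: stats[h][1] / stats[h][0])

winner = best_headline(headline_stats)  # first headline, CTR of roughly 7.1%
```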
Calls to Action and Buttons
A/B testing can identify which button shape, color or location generates the most clicks. It’s essential to test the call to action itself too: a “Learn more” label may attract much more attention than the usual “Sign up”. But you’ll never know until you try.
Pricing and Offers
It’s not always obvious whether a lower price will generate more sales than a higher one. You may find out that the higher quote gets just as many clicks as the lower one, and that the problem doesn’t lie in pricing at all. A/B testing allows you to evaluate sales pages and offers to establish the best-selling variation.
Images
It’s critical to test images for Facebook ad campaigns, homepages, sales landing pages, products in online shops, etc. You may discover that a 360-degree picture drives more sales than a carousel, where the user has to click to get a full view. But when assessing images, remember to test them not only against your control version, but against other variations as well.
Amount of Text
Work out what kind of content your readers prefer (long reads or small blog posts) or determine how much text to put into one line to make it more readable. Yes, it’s possible to do that with A/B testing!
Sign-Up Forms
A single word, the shape of the fields, the background color, and so on make a difference to a subscriber who’s deciding whether to sign up. With A/B testing, you can find out exactly which elements you need to improve by creating several variations and analyzing the number of subscriptions.
These are the main elements you can test. They are not the only ones, but they are the best suited for A/B testing. It’s essential to single out the component that requires changes: identifying it is your first step toward starting A/B testing.
How to Conduct A/B Testing
To obtain valuable results from A/B testing, you have to plan and organize the whole process. In other words, you need to create a strategy that must include:
- Define your objectives. You can certainly try to guess which improvement will lead to more engagement or where the real problem on your website lies. But A/B testing is a scientific approach, so preparing for it should be evidence-driven. There are several ways to understand your goals and support them with proof:
- Technical analysis;
- Evaluation of web analytics;
- Mouse-tracking analysis;
- User testing.
Using these techniques, you will isolate the weak points. But that is not the end of defining the objectives. A/B testing requires a clearly formulated hypothesis, for example: the CTA button must be changed so that the new version generates more clicks. There you go – the what and the why.
- Create variations. Keep in mind that testing too many variations at the same time may result in distorted findings because your audience will be split into too many groups.
- Remember about redirects. Want to test a separate element? Create an HTML version of the variation and upload it to your A/B testing tool (or create the variation within the software you use, which will generate a code snippet for you to insert into the website). The tool will replace the control with the variation before a random user accesses the website. If you wish to test a whole page, you need to develop the new page and upload it to your website; your A/B tool will then redirect some visitors to the modified variation.
- Set a time limit. The unspoken rule of A/B testing is to determine the duration up front. Experts say that the experiment should last at least a week. If you plan to run it longer, take into account that approximately 10% of users delete their cookies every 2 weeks. The last thing to consider is how many variations you test simultaneously: the bigger the number, the longer the experiment should be.
- Run the A/B test. Choose the most suitable tool and run as many tests as you have resources for.
- Interpret the results. In the simplest outcome, you can have 3 kinds of A/B test results:
- the control still wins;
- one of the variations is a hit;
- there is no difference.
This will be visible in a preview of your findings. However, if you wish to dig deeper, export the results from your A/B testing software to an Excel file. The data will be structured and easy to analyze. Researchers also advise integrating your A/B testing tool with Google Analytics to get more refined results.
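Whether a variation is “a hit” or “no different” should rest on a significance test rather than eyeballing the preview. A two-proportion z-test is the standard check; the visitor and conversion counts here are invented:

```python
import math

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal-approximation two-sided p-value via the complementary error function.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Control: 200 conversions out of 4,000; variation: 260 out of 4,000.
z, p = two_proportion_z_test(200, 4000, 260, 4000)
variation_wins = p < 0.05 and z > 0  # here: a statistically significant win
```

If the p-value stays above your significance threshold (0.05 is conventional), the honest conclusion is “there is no difference”, even when one raw rate looks higher.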
Individual cases might have additional transitional steps, but in general the process of A/B testing looks like this. Of course, it involves using special software and tools like those mentioned above. There is certainly no shortage of them nowadays.
Tools for A/B Testing for Web Products
You probably know the Bing search engine. Before Bing implements any changes into its system, about 80% of the proposed variations undergo active testing. That should answer your question of when you should opt for A/B research (that is, almost always). And to answer your second question, regarding what tools you can use to optimize the process, we have several options for you:
A/B Testing Platforms
These are resources where you hand over the whole experiment to the software. The most popular ones are Optimizely, VWO, AB Tasty, Unbounce, etc. You just register your website, create a variation, receive a code that you upload into your website – and voila! Testing platforms track the user’s behavior and show you the results at the end of the experiment.
Social Media
Buffer successfully tests headlines for its blog on Twitter. They post different titles for one article and look at the stats (retweets, favorites, mentions, clicks) to choose the most engaging one. The same thing can be done on Facebook.
Inbuilt A/B Testing Tools
Many of the platforms you may use for different purposes provide their own inbuilt A/B testing tools. For example, you can experiment with:
- newsletters at MailChimp;
- blog titles with the help of WordPress plugins;
- ad campaigns with Facebook Ads Manager.
The instruments are plentiful. To select the most suitable one, determine what you want to test first. Then it will be easier to settle the question.
A/B Testing Case Studies
Let’s look at some recent examples of A/B testing conducted by famous companies and consider the results:
Creative Market – Design Platform
The Creative Market team set out to redesign their pricing page for credit purchases to increase conversions. They created a variation relying on usability best practices: removing an extra click and placing a CTA button above the fold. And what do you think happened? The best practices didn’t work. With the variation design, total revenue dropped by 11%.
Fab – Platform for Buying and Selling Handcrafted Goods
Using A/B testing, the Fab team discovered that adding a CTA text to a cart button increased the click-through rate (CTR) by 49%. That proves the general hypothesis that users react better to text than to icons.
Omoda – Dutch Shoes Retailer
The design team at Omoda believed that their unique selling points (USPs) were not sufficiently visible in the original version. VWO proposed increasing their visibility by changing the background of the customer service elements to grey and designing a unique icon for each USP. While on desktop the variations didn’t cause any stir, sales from mobile devices skyrocketed by 13.6%.
These case studies prove that user behavior defies prediction. A/B testing is the ultimate research method for determining how your clients truly feel about your website or its individual elements. Use the knowledge gained from this guide to set proper goals, choose the right A/B testing technique, run efficient experiments and get reliable results. This is what you simply must do if you consider yourself an accomplished developer, designer or business owner.