How to Structure, Run, and Analyze an A/B Test on Your Website
Last Updated / Reviewed: Feb 1st, 2022
Execution Time: 30min-2h
Goal: To properly structure and implement an A/B testing culture across your business;
Ideal Outcome: You’ve flawlessly planned, executed, and analyzed an A/B test on your website and are able to determine whether a variant should be implemented definitively on your website or not, based on actual data.
Pre-requisites or requirements:
- You need to have Google Optimize set up on your website.
- Related SOP: SOP059- How to Setup Google Optimize on your Website;
- You should have Google Analytics conversion goals set up on your website.
- Related SOP: SOP021- Setting up Conversion Goals in Google Analytics;
Why this is important: Without a proper A/B testing process in place you are not able to confidently make business or UI/UX decisions.
Where this is done: In Google Optimize;
When this is done: Every time you want to test a new hypothesis on your website;
Who does this: The person responsible for Conversion Rate Optimization;
Determining what you are going to test
- Select the element on the landing page that you would like to test:
- The most common testing elements are:
- Headline / Sub-headline;
- E.g: Replace “Just another WordPress site” with “5x your ROI today through custom emojis”
- Form positioning / Form fields;
- E.g: Move the lead generation form from the bottom to the top of the page;
- Media on the page (Images and Videos);
- E.g: Replacing the background image on your hero image;
- CTA (Call-to-action) text, color, and shape;
- E.g: Changing the CTA text from “Submit” to “Get your free emoji cheat-sheet now”;
- Sales copy;
- E.g: Testing a completely new sales letter, or a specific part of it;
- Authority / Trust elements;
- E.g: Adding reviews to your landing page, or testing different customer reviews;
- Pricing;
- E.g: Change your product pricing from $97 to $79.
- Find specific testing elements for your business:
- Recent changes that you suspect might have impacted conversion rates (for better or worse). Test to confirm whether that was the case and, if so, quantify the impact.
- E.g: The “Pricing” page link was removed from the landing page, and the conversion rate seems to have dropped since then.
- Customer feedback and common customer questions.
- E.g: Go through your support tickets or your survey feedback. If you find that your customers keep asking questions such as “How does feature X work?”, you might want to try a version of your landing page that includes a section on how that specific feature works.
- Session Recordings:
- Watch at least 10 session recordings where a conversion happened.
- Watch at least 10 session recordings where a conversion did not happen.
- Are your users not converting due to UI issues?
- E.g.: Typing their email addresses in the “name” field, not being able to generate a password with the requirements you have at the moment, etc. You may want to create a variation that aims at fixing those issues.
- Identify commonalities among converting and non-converting users (specific sections of the page they look at, specific conversion paths, etc) and hypothesize changes that could move non-converting users down the same path that the converting users are going.
- E.g: If you are offering plumbing services, pushing users to book straight away from the landing page, and you identify that most converting users go through your contact and reviews page before going to your checkout page, you might want to create a version of your page that includes your contact information and reviews on the landing page.
Defining how you are going to test it
- Open the A/B Test documentation spreadsheet;
- Note: Although this spreadsheet is specially designed to be used with Google Optimize, it should work with most A/B testing tools available. If your A/B testing tool does not offer a specific feature (e.g.: targeting specific audiences) you can always remove/edit that column to fit your specific A/B testing tool.
- Fill out the spreadsheet:
- Test #: Incremental number, used internally. It is useful when communicating with your designer, programmer, or copywriter, or whenever you want to reference a specific A/B test in a discussion or in a project management tool.
- E.g: 001
- Start Date / End Date: Add the Start Date whenever you start running your experiment, and update the End Date once the experiment is over. This will allow you to quickly see which experiments are still running.
- Note: The spreadsheet will automatically update the status of the “Running Days” column and set it to “Still Running” if no end date was added yet;
- Note 2: If the experiment ran for less than 14 days the “Running Days” cell will turn red to warn you that the test might not have run for enough time for your results to be meaningful (although this will ultimately depend on how many people were exposed to your experiment during that period of time).
- Created by: The person responsible for this experiment;
- Running days: Leave empty, this cell contains a formula to calculate how many days your A/B test ran and also to let you know of the A/B tests that are still ongoing;
- Purpose: Clearly define the purpose of this test. The purpose should identify what you are going to test and why. You can use this template to fill out that cell if you don’t have any other ideas:
- To test if [INSERT CHANGE HERE] has a positive impact on [INSERT METRIC HERE];
- E.g.: “To test if personalizing the headline with the user’s location has a positive impact on signup conversion rate.”
- Testing Element: Define which element on your page you are going to be testing.
- E.g: “Hero Headline”
- Audience: Define which audience you are going to be targeting in your experiment. You can run experiments only for a specific group of people. Depending on which tool you are using you might be able to target specific Devices, Countries, Traffic Referrals and Traffic Sources, Browsers, etc.
- E.g: All US Visitors
- Metric #1, #2, #3: Define which metrics you want to use to evaluate the success/failure of an experiment. You should add the metrics in order of importance to the given experiment (the most important metric should go first, and the least important, last).
- E.g:
- Metric #1: Signup Conversion Rate
- Metric #2: Revenue
- Metric #3: Bounce Rate
- Version A, Version B: Insert a URL with a screenshot of your control version (Version A), and add a URL with a screenshot of your test version (Version B).
- If you don’t have a tool to screenshot your page yet, you can use the Awesome Screenshot Chrome Extension; it’s free and lets you capture the entire page with a single click:
- Testing Page: This is the URL of the landing page where your test will be running.
- Experiment URL: This is your A/B testing tool’s experiment URL, typically (depending on which tool you are using) this is the URL that will allow you to configure your experiment and check the experiment’s results. If you haven’t set up your experiment yet, leave it blank and come back to the spreadsheet to update it once done.
- Results: Once your A/B test has ended this is where you should log your results so that in a few months you can look back and understand how your previous experiments went, or share it with your team so they are all aware of them.
- E.g: “Personalizing the headline with the user's location increased signups by 36% in the US. It also increased revenue by 12% and Bounce Rate decreased by 10%. The test ran for 3 weeks and a statistically significant result was reached, with version B having a probability to be best of 95%+ on all metrics.”
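The 14-day warning in the spreadsheet is only a rule of thumb; how long a test needs to run depends on your traffic and on the size of the effect you hope to detect. As a rough sketch (the baseline rate, lift, and daily traffic below are illustrative assumptions, not numbers from this SOP), a classic two-proportion sample-size estimate looks like this:

```python
# Rough sample-size sketch for an A/B test. Assumed numbers (baseline
# conversion rate, lift, daily traffic) are illustrative only.
from math import ceil, sqrt
from statistics import NormalDist

def min_sample_size_per_variant(baseline_cr, relative_lift, alpha=0.05, power=0.8):
    """Visitors needed per variant to detect the given relative lift
    in conversion rate (two-sided test) with the given power."""
    p1 = baseline_cr
    p2 = baseline_cr * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)            # ~0.84 for 80% power
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2) / (p2 - p1) ** 2
    return ceil(n)

# Example: 3% baseline conversion rate, hoping to detect a 20% relative lift
n = min_sample_size_per_variant(0.03, 0.20)

# With an assumed ~1,000 visitors/day split 50/50 across two variants:
days = ceil(n * 2 / 1000)
```

For a 3% baseline this works out to roughly 14,000 visitors per variant, i.e. weeks of traffic for many sites, which is why low-traffic pages often cannot detect small lifts at all.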
Starting an experiment using Google Optimize
Note: If you haven’t set up Google Optimize yet you can do it now by following SOP059: How to Setup Google Optimize to run A/B tests on your Website.
- Using Google Chrome, navigate to https://optimize.google.com/ and log in with your Google Account.
- Note: Make sure you have the Google Optimize Chrome Extension on your browser, otherwise you won’t be able to create variants.
- Note 2: Make sure you are not using any privacy extension (such as the Google Analytics Opt-out Browser Add-on), or AdBlocker, or any extension that may block the Google Analytics or Google Tag Manager code, otherwise you won’t be able to create variants.
- Click “Create an experiment” in the top-right corner:
- On the experiment name field, you should type your “Test #” that you’ve already added to the A/B Test documentation spreadsheet plus some descriptive name for your experiment;
- E.g: “Exp001-US-Headline Geo-personalization”
- On the URL insert the URL of the testing page you’ve already defined in the previous chapter.
- E.g: Your homepage’s URL
- Select A/B Test from the list → Click “Create”;
- Link your Google Analytics View:
- Select a view from your Google Analytics account:
- Click “Finished”
- Add your experiment objective (Metric #1, #2, and #3 on your A/B Test Spreadsheet):
- Select “Choose from list” if your metric is already set up as a Goal on your Google Analytics view.
- Choose your objective:
- Select “Create custom” if your metric is not a Google Analytics Goal yet. This option allows you to use the Google Analytics events that you have set up on your website, as well as specific pages as experimental objectives.
Note: If you haven’t already, you can implement Google Analytics events by following the chapter “Setting up Google Analytics Events using Google Tag Manager” on SOP021.
Note 2: This SOP assumes you already have your main objectives set up as goals on your Google Analytics account. Settings for this option will vary depending on your event and page structure. You can implement your events and goals in your view by following SOP021.
- Select “Choose from list” if your metric is already set up as a Goal on your Google Analytics view.
- If you have additional objectives, add them now (just repeat step 8 for each):
- Note: You can have up to 3 objectives on Google Optimize’s free plan.
- Copy the column “Purpose” from your A/B Test documentation spreadsheet to the “Description and hypothesis” box:
- If you want this A/B test to run only on a specific targeted segment, click the “Targeting” tab:
- Select the percentage of visitors to target (typically 100%);
- Select the weighting of visitors to target (typically 50% / 50%);
- Under “Additional conditions” click “And”:
- Select the type of rule that you need and configure it accordingly.
E.g for location targeting:
- Hit “Save” in the top right corner:
- A new message will appear on the sidebar prompting you to run a diagnostic of your setup:
- Click “Run Diagnostics” and after a few seconds you should see a success message:
Creating a variant on Google Optimize
- Inside your Google Optimize experiment click “Create Variant”:
- Name your variant and click “Add”:
- Your variant will appear on the list and a red “0 changes” label will appear as well, click that variant:
- You will be taken to the editor for the page you defined in the previous chapter:
- Select the element you want to edit by right-clicking on it, and select whether you want to remove it, or edit its text.
- Edit your element:
- Once you’re done editing your element, click “Finished” in the bottom-right corner:
- Repeat steps 5-7 as many times as needed until you’ve created the variant you had envisioned.
- Remember: You should not change multiple elements at the same time, otherwise you will not know which of the elements impacted your results positively/negatively.
- Once you’re done editing your page, click “Save” in the top right corner:
- Test how your variant looks on smaller screens by clicking the device dropdown on the top:
- If your variant looks good on other screen sizes, click “Finished” in the top-right corner:
- You will be taken to your experiment dashboard, click the devices icon () on your variant → click “Share preview” → Copy that link and send it to your team so they can also preview your variant on different devices.
Note: It’s recommended that you test your variant at least on Google Chrome (Desktop & Mac), on Safari for iOS, and on Chrome for Android before publishing it.
- Once you’ve made sure your variant works on most devices, you’re ready to start running your experiment. Click “Start Experiment”:
- Now just click “Start” and that’s it! You are now running an A/B test on your website.
Note: You’ll see a status bar letting you know the A/B test is running:
How to analyze your results on Google Optimize
- Inside your Google Optimize experiment click the “Reporting” tab:
- You will be shown the current stats of your experiment. The table shows how well your variant is performing compared to the baseline. Ideally, your variant will outperform the baseline on all metrics, but even if it only performs better on a specific metric it might still be worth implementing (in this example the experiment is still running):
- By clicking on your objective you will be able to analyze how your experiment is performing for each objective specifically.
Definitions:
- Improvement: The difference in the modeled conversion rate of the variant and the baseline, for a given objective. This is the likely range in which your conversion rates will fall. (as defined by Google)
- Probability to be best: The probability that a given variant performs better than all other variants. (as defined by Google)
- Probability to beat baseline: The probability that a given variant will result in a conversion rate better than the original's conversion rate. Note that with an original and one variant, the variant's Probability to beat the baseline starts at 50 percent (which is just chance). (as defined by Google)
- Conversion Rate: A box plot of what the collected data looks like for that variant compared to the baseline. The further apart these are, the higher the likelihood that your variant will have a meaningful impact on your chosen metric (for better or for worse).
- Conversions: The number of conversions that were attributed to that variant during the experiment period.
- That’s it! When your experiment has run for more than 2 weeks, has collected enough data, and has reached statistically meaningful results, Google Optimize will end your experiment and declare a winner.
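Google’s “probability to beat baseline” comes from a Bayesian model. As a rough illustration of the idea (this is not Google Optimize’s exact model, and the conversion counts below are made up), you can approximate it with Beta posteriors over each variant’s conversion rate and Monte Carlo sampling:

```python
# Rough illustration of "probability to beat baseline" -- NOT Google
# Optimize's exact model. The conversion counts below are made up.
import random

def prob_to_beat_baseline(conv_a, visitors_a, conv_b, visitors_b,
                          draws=100_000, seed=42):
    """Monte Carlo estimate of P(rate_B > rate_A) using
    Beta(1 + conversions, 1 + non-conversions) posteriors."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(draws):
        rate_a = rng.betavariate(1 + conv_a, 1 + visitors_a - conv_a)
        rate_b = rng.betavariate(1 + conv_b, 1 + visitors_b - conv_b)
        if rate_b > rate_a:
            wins += 1
    return wins / draws

# Example: baseline converted 300/10,000 visitors, variant 360/10,000
p = prob_to_beat_baseline(300, 10_000, 360, 10_000)
```

With these made-up numbers the variant’s probability to beat the baseline comes out well above 95%, which is the kind of result you would log in the “Results” column of the spreadsheet.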