Which of these two calls-to-action would be more successful when placed on one of your website’s pages?
“Want to learn more? Click here!”
“Limited-time offer – click here now!”
Think that’s an easy one to answer? Actually, it’s a trick question; there’s no universally correct answer. For example, a reader just being introduced to a product might respond better to the first option, which offers more information, while another who’s already been pre-sold might jump on the second. Different audiences respond to content in very different ways.
In fact, there’s no legitimate way to determine which one would be “more successful” until you define “success.” Is it total number of sales? Gross income? ROI? Size of the email list you’ve built? You have to specify the goal you’re trying to accomplish before you can judge “success.”
Choosing between calls-to-action is difficult enough. It’s even harder when you’re trying to decide between two different versions of an entire page. That’s when A/B testing becomes imperative. And thankfully, Google Analytics now makes it easy to create an A/B test for your content, with Google Experiments.
What Is A/B Testing?
Even if you’re not familiar with the concept of A/B content testing, it’s simple to understand.
Let’s say you’re considering changing one of your website’s pages (call it “content A”) and have written an alternate version of the page (call it “content B”) that you think might convert better with your visitors. If you show half of your traffic “content A” and half of your traffic “content B,” and then measure the results (remember, you need to define your goal in advance) – you’ll be able to tell which page works better and whether you should make a permanent change.
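To make the mechanics concrete, here’s a minimal simulation sketch in Python. The conversion rates and visitor counts are made-up numbers, not anything Google-specific: split traffic evenly, then score each version against a goal defined in advance.

```python
import random

random.seed(0)

# Hypothetical underlying conversion rates -- in a real test you don't
# know these; measuring them is the whole point of the experiment.
TRUE_RATE = {"A": 0.10, "B": 0.12}

results = {"A": [0, 0], "B": [0, 0]}  # variant -> [visitors, conversions]
for visitor in range(10_000):
    variant = "A" if visitor % 2 == 0 else "B"   # even 50/50 split
    results[variant][0] += 1
    if random.random() < TRUE_RATE[variant]:     # did they hit the goal?
        results[variant][1] += 1

for variant, (visitors, conversions) in results.items():
    print(f"{variant}: {conversions}/{visitors} = {conversions / visitors:.1%}")
```

The key step is the first one: “conversion” only means something because we defined it before counting anything.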
You might think that anything that sounds so easy must be difficult or expensive to implement. It was, once upon a time, and there are still a number of complicated and costly in-house software programs you can use for A/B testing and much more sophisticated analysis. However, there are now a number of web-based solutions which will let you do A/B content experiments, from the inexpensive Optimizely (which has a simple-to-use WYSIWYG interface) to the well-known and much pricier KISSmetrics (which lets you dive deeply into a massive pool of data).
For standard A/B content testing, though, you can’t beat the Experiments system that’s built into Google Analytics. That’s partly because it’s surprisingly full-featured, partly because it’s not too difficult to set up – and mostly because it’s absolutely free.
Let’s look at how it works.
How To Create an A/B Content Test With Google: The Preliminaries
We’ll assume that you already have a Google Analytics account and that the code is installed on your website. If not, go ahead and set it up. We’ll wait.
Welcome back! You probably think that we’re going to dive right into setting up an A/B content test, but that’s getting ahead of things; you wouldn’t pull out a recipe and start baking unless you knew you had all of the ingredients you needed on hand, and we’re going to do the same thing here: ingredients first.
To start, make sure you have your “B content” – the page that you want to test against your current page – ready to go and posted on your website, with the URL readily available so you can cut and paste when you need it. (You can actually test up to twelve pages at one time in the Google Analytics Experiments tool, but for now, it’s easier to discuss A/B testing than A/B/C/D/E/F/G/H/I/J/K/L testing.)
If you’re going to test click-thrus to another page, have that URL available as well. Next, decide exactly what you’ll be using as your criterion for success; most will want to maximize revenue or transactions, but some may want to track the time spent on the page, ad clicks, or another metric. It’s important to make this decision before setting up your test.
Finally, be sure that your Google Analytics tracking code is posted on both pages that you’ll be testing.
How To Create an A/B Content Test With Google: The Experiment
OK, let’s get started. Open your Analytics control panel, where you’ll select the “reporting” tab, then look in the left-side navigation bar for the “Behavior” section, and click on “Experiments” underneath it. Click the “START EXPERIMENTING” button on the next page, and you’re ready to set up your A/B test on the screen that opens. (If for some reason there’s already an experiment set up in your account, click on “Create experiment” to start a new test.)
The “Create a new experiment” screen will open.
- Enter whatever name you’d like to use for the test in the “Name for this experiment” field; make it something unique that will be easy to recognize later because your test will be running for weeks or months, and you may want to create other experiments in the meantime.
- Next, choose your success metric from the “Objective for this experiment” drop-down — the goal you settled on before starting. You can also create your own goal, such as click-thrus to a new page or number of video views, by clicking “Create a new objective.” The easiest way to set this up is to select “custom” on the first page that comes up, enter a name for the goal and select its type (Destination, Duration, Pages or Event) on the second page, enter the relevant destination page or event (the one you have ready to cut and paste) and an optional monetary value per click or sale on the third page, then click “Save goal.” Click back to the “Objective for this experiment” page and you’re all set. Don’t worry, this is actually a lot easier than it might sound.
- The next choice on this screen is “Percentage of traffic to experiment.” Despite how it sounds, this drop-down does not set how many visitors see each version of the page; it sets what percentage of your visitors will participate in the test at all — and that distinction matters. If you choose 50%, you’re not deciding that 50% of visitors will see “A” and 50% will see “B.” You’re deciding that 50% will see “A” by default (the normal page they’d view) and the other 50% will be entered into the A/B experiment, split evenly between the two versions — so 75% of your visitors will end up seeing “A” (50% + 25%) and only 25% will see “B.” Bear this in mind when making your selection.
One other note: if your alternate page is very different than the original, you may want to limit the percentage of visitors participating in the experiment in order to minimize potential revenue or conversion losses. You can always increase the percentage in the middle of the experiment if things are going well.
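The arithmetic works out like this (a quick sketch in Python; the function name is ours, not anything in Analytics):

```python
def effective_split(pct_in_experiment, n_variations=2):
    """Share of ALL visitors who see the original page vs. each variation.

    Visitors outside the experiment always see the original; visitors
    inside it are divided evenly among the variations (original included).
    """
    outside = 1 - pct_in_experiment
    per_variation = pct_in_experiment / n_variations
    return outside + per_variation, per_variation

original_share, b_share = effective_split(0.50)
print(f"original: {original_share:.0%}, variation B: {b_share:.0%}")
# -> original: 75%, variation B: 25%
```

Only at 100% participation does a two-page test become a true 50/50 split.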
- The “Email notification” choice is self-evident.
- “Advanced options” has one important selection you must make. Checking “Distribute traffic evenly across all variations” will ensure that each of your pages continues to receive an equal amount of test traffic. If it’s not enabled, Analytics will automatically start sending more traffic to the page that’s performing better. The former will give you a test that’s standardized across the testing period, while the latter will start maximizing performance as the test proceeds while still rendering accurate results. There are two other advanced options you can consider: “Set a minimum time the experiment will run” will prevent Google from naming a “winner” too soon, and “Set a confidence threshold” allows you to decide how decisive a measurement you want before Google declares one page better than the other.
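To build some intuition for what a confidence threshold is guarding against, here’s a classical two-proportion z-test sketched in Python. To be clear, this is not Google’s algorithm — Analytics uses its own statistical engine — and the visitor and conversion numbers below are hypothetical.

```python
import math

def z_score(conv_a, n_a, conv_b, n_b):
    """How many standard errors apart are the two conversion rates?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical data: 500 of 5,000 visitors converted on A, 600 of 5,000 on B.
z = z_score(500, 5000, 600, 5000)
print(f"z = {z:.2f}")   # |z| > 1.96 is significant at the common 95% level
```

A higher confidence threshold is simply a demand for a larger gap (relative to the noise) before one page is declared the winner.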
Allow yourself a sigh of relief. It now gets much easier. After you’ve saved your changes, the next screen is “Configure your experiment.” This is where you copy and paste the URLs for the “A content” and “B content” pages; click “Save Changes” and you’ll be shown the experiment code for the A/B test. Copy it and paste it onto the “A” page, immediately after the opening <head> tag near the top, then click “Save Changes” again.
If you’ve done everything right, Google will validate the code and tell you that you’re all set – or if there’s a problem, you’ll be shown the errors which need to be corrected. In rare cases, Analytics won’t be able to find the code on a complicated page or a web server whose settings prevent it. If this happens and you’re sure you haven’t made any mistakes, don’t worry about the validation. Click “Start Experiment” and off you go. (You can also choose to “Save for later” if you’re thinking about making changes.)
How To Create an A/B Content Test With Google: The Results
Once everything is set up, your A/B experiment will start right away, and after a day or two you’ll begin seeing results, which can be viewed in your Experiments list.
The main window will show the test’s status, major details and the number of visits the pages have received; if you click on the experiment’s name, you’ll be taken to a more detailed window. There you can see a wealth of information in table and graph form, based on the goal you selected when the test was set up. This can include the percentage of users who accomplished the goal, and the numerical or monetary value of their goal completion, if applicable. You’ll be able to tell how well each page is doing, see comparisons between page performance, and even see Google’s estimate of the probability that the new page will outperform the old one by the time testing is complete.
If you allow the A/B test to run to its normal completion, you will see one of three possible status reports:
- Ended (Time limit reached), which means the experiment ran for three months (or the time period that you chose during set up) without a clear winner.
- Ended (No winner), which means there was no statistically significant difference between the performance of the two pages.
- Ended (Winner found), which we probably don’t have to explain. The winning page will be identified on the data page, along with all of the specifics.
Here are a few frequently asked questions about the use of Google Experiments for A/B testing – and their answers.
Q: Should I start making changes to my site if I see an early trend in my reports?
A: No. It can take several weeks for traffic to stabilize and reliable trends to emerge; even if the numbers look overwhelming, Google won’t declare a winner until the test has run for at least two weeks.
Q: Is it a good idea to test more than two pages at once?
A: It can be, but bear in mind that additional variations will mean that a lot more traffic will be needed to draw reliable conclusions, since visitors will be divided between all of the tested pages. More traffic requires more time, so be prepared to wait longer for results.
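For a rough feel for the numbers involved, Lehr’s rule of thumb (approximately 80% power at the 5% significance level) estimates the visitors needed per variation. The baseline rate and lift below are hypothetical, and this is a back-of-the-envelope sketch, not Google’s calculation:

```python
def visitors_per_variation(base_rate, relative_lift):
    """Lehr's rule of thumb: n ~ 16 * p * (1 - p) / delta**2, where delta
    is the absolute difference in conversion rate you hope to detect."""
    delta = base_rate * relative_lift
    return 16 * base_rate * (1 - base_rate) / delta ** 2

# Hypothetical: 5% baseline conversion, hoping to detect a 20% relative lift.
print(f"{visitors_per_variation(0.05, 0.20):,.0f} visitors per variation")
# -> 7,600 visitors per variation
```

Each variation you add needs its own sample of that size, which is exactly why extra variations stretch the timeline.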
Q: Should I consider a “multi-armed bandit” experiment?
A: For those who aren’t familiar with the term, it relates to the “Distribute traffic evenly across all variations” option discussed during the setup of your experiment. You’ll remember that if you leave that option unchecked, Analytics will begin diverting more traffic to the better-performing page over time; that adaptive approach is known as “multi-armed bandit” testing (the name comes from a gambler choosing among several slot machines — “one-armed bandits” — to maximize winnings).
While this may seem to run counter to proper testing procedure, there are mathematical models showing that the approach is both statistically sound and more efficient, so you get results more quickly while maximizing performance along the way. If you trust the math more than your gut, it’s a good alternative.
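The bandit idea itself is easy to sketch. The epsilon-greedy strategy below is a toy illustration — not the algorithm Analytics actually uses — and the conversion rates are invented: most of the time, show whichever page looks best so far, but keep exploring occasionally.

```python
import random

random.seed(1)

TRUE_RATE = {"A": 0.10, "B": 0.12}   # hypothetical; B genuinely converts better
EPSILON = 0.10                        # explore 10% of the time
shown = {"A": 0, "B": 0}
converted = {"A": 0, "B": 0}

def observed_rate(variant):
    return converted[variant] / shown[variant] if shown[variant] else 0.0

for _ in range(20_000):
    if random.random() < EPSILON:                 # explore: pick a random page
        variant = random.choice(["A", "B"])
    else:                                         # exploit: best page so far
        variant = max(("A", "B"), key=observed_rate)
    shown[variant] += 1
    if random.random() < TRUE_RATE[variant]:
        converted[variant] += 1

print(shown)   # over time, the better page tends to receive most of the traffic
```

The trade-off is visible in the counts: the weaker page still gets enough traffic to be measured, but most visitors see the stronger one.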
Q: Should I run more than one experiment at the same time?
A: As we’ve mentioned, you can run up to twelve concurrently, but be aware that as you run more and more tests, they can begin to interact and produce results that are difficult to analyze. You can try it, but it’s safer to run just one or two at a time unless you’re sure they won’t conflict.
Q: Can you run a Google Experiment with pages that serve dynamic content?
A: Not if the content is served by means of permalink-type URLs. If the pages use query-string parameters, you should be OK.
Kyler is a Marketing Manager at HostGator. You can usually catch him writing articles or running around to get things done. Other than that, he’s out there fighting crime. It may not be real crime, but someone has to call others out on Reddit. Check him out at KylerPatterson.com