Take a look at the two emails below. Which one do you think performed better, the one on the left or the one on the right?
If you guessed the one on the left, you were right!
We found the email with more text had an 11.83% lift in open rate and a 22.49% lift in click-through rate. The open-rate lift is something of a red herring, since readers can't see what the contents look like until they open the email. But the nearly 23% lift in click-through rate makes a strong argument that readers preferred the email with more textual context for the images.
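If you're wondering how lift is calculated, it's simply the relative improvement of one version over the other. Here's a minimal Python sketch using made-up rates for illustration (not the actual campaign numbers):

```python
def lift(control_rate, variant_rate):
    """Relative improvement of the variant over the control, as a percentage."""
    return (variant_rate - control_rate) / control_rate * 100

# Hypothetical rates, not the actual campaign data
control_ctr = 0.0200   # click-through rate of the lighter-text email
variant_ctr = 0.0245   # click-through rate of the heavier-text email

print(f"CTR lift: {lift(control_ctr, variant_ctr):.2f}%")  # roughly 22.5%
```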
Having more text that explains the images could help in cases where the HTML doesn’t come through properly on an email reader, as it gives readers more information about the value they can hope to gain from the linked content.
You can see that both emails have different formatting, as well, so that could also be contributing to the increase. We would need to test out more of the individual elements to narrow down how much of the increase was due to the heavier text.
At HostGator we send millions of emails a year. With each email comes a new opportunity to test something. The example above illustrates multivariate testing (which involves testing multiple variables), but today we’re going to focus on A/B testing.
What Is A/B Testing?
The A/B test is a simple tool that businesses of all sizes use to make their email marketing more effective. If you’re not familiar with A/B testing, we’ve put together this guide to show you why A/B tests are so useful and how to get started.
A/B email tests are real-world experiments you create to see how changing one element in an email, such as the subject line, graphics, or timing, affects the way recipients respond. Each test compares one version of that element (A) against a single alternative (B) to keep the results clear and easy to interpret.
For example, a café owner might create two different subject lines for the same email newsletter and then send version A to one part of her list and version B to another part. If subject line A gets noticeably more opens and click-throughs than B, the café owner knows subject line A is the best choice to send to the rest of her list and to serve as a template for future subject lines.
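If your email platform doesn't split the list for you, a simple random split works. Here's a rough sketch in Python; the subscriber list and the 50/50 split are just assumptions for illustration:

```python
import random

# Hypothetical subscriber list; in practice this comes from your email platform
subscribers = ["ana@example.com", "ben@example.com",
               "cam@example.com", "dee@example.com"]

random.shuffle(subscribers)        # randomize so the two groups are comparable
midpoint = len(subscribers) // 2
group_a = subscribers[:midpoint]   # gets version A of the subject line
group_b = subscribers[midpoint:]   # gets version B
```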
Speaking of subject lines, it’s time for another test. Take a look at the subject lines below. Which one do you think resulted in higher open rates – the one on the left or the one on the right?
In this test we compared subject lines that highlighted different aspects of our cloud hosting product launch. We wondered: which is more important to users – the product’s features and added value, or the discount?
We found that the features won! With 99% statistical significance, the subject line focused on features (“double your speed”) had a 7% lift in opens and a huge 46% lift in click-through rate.
We’ve rerun similar subject line tests, and again and again the subject lines that promote features outperform the ones that promote the discount.
What’s the takeaway here? Consumers first need to be interested in what you’re selling before they’ll care about getting a discount.
We’ve learned a lot from testing subject lines over the years. Here’s what we found works for us. Let us know in the comments if something different works for your brand!
- Encouraging a sense of intrigue and hinting at content drives more opens
- Spelling out the discount consistently suppresses response, both in subject lines and email content
- Positioning offers as $ off seems to work better than % off
- Using personalization – real or even implied ('you', 'your') – is more engaging
- Presenting an offer at the right time matters – response can be nearly double if you time it right
- Subject lines influence clicks as well as opens, so work to get them right!
How A/B testing boosts your marketing efforts
A/B testing provides information beyond individual email elements. It can also give you:
- Insight into what your audience cares about and responds to
- Long-term, gradual improvement to your email metrics
- A road map to build future campaigns
- A proving ground for new ideas and offers
- A test lab for design changes, such as new colors and templates, layout changes, and more
- Data for tracking your overall marketing progress and sharing with your leadership team
You can also test other elements of your marketing campaigns. Social media posts, web page design, paid ads, and more can be A/B tested to improve results.
How do you run an A/B test?
Careful planning, execution, and analysis will give you the most value from your A/B tests. Here are things to consider at each step.
First, decide what one thing you want to test. The subject line is a common example, and it’s one of the simplest elements to test. Other basic elements to test include the email’s pre-header copy, button copy, button visual design elements like color or size, button placement in the email, and the number of links you include.
Beyond that, the sky’s the limit. You can test for responses at different times of day or days of the week, the effect of including a discount offer, different images, and much more. Just remember to stick to one element per test. Additional variables can make it impossible to get useful results.
Give yourself a pre-test
You also need to examine your goals. Take the time to answer these pre-test questions.
What do you want to learn from this test? For example, do you want to know if button shape influences click-through rates?
What question will this test answer? For the example above, the question is, 'Which button shape, round or square, generates a higher CTR?'
How will you use the test results in the future? This test may settle the issue of which button shape to use in future campaigns so the designer can focus on refining other elements.
Can you generalize the test results? 'Round versus square' is easy to repeat with another test if you’d like to verify your button-shape results. If you compare round versus heart-shaped buttons for a Valentine’s Day campaign, the results won’t help you with non-Valentine’s Day campaigns.
Will the information you learn be meaningful? How much value will the test add to your brand over time? If you’re trying to raise overall CTR for your email newsletters, testing variables such as button shape makes sense. If your emails have high CTRs but low conversion rates, you’re probably better off focusing on a different variable.
Once you’ve thought about what you want to test, think about how you’ll measure it.
Choose your metrics. For a subject line test, open and click-through rates are the metrics you’ll probably want to watch. For our hypothetical round-versus-square button test, CTRs and conversion rates make the most sense.
Schedule your test. Choose a time when seasonal fluctuations like summer slowdowns and pre-holiday peaks, concurrent sales, or current events won’t skew the results.
Choose your test audience. Send versions A and B to groups that should respond in similar ways; don’t split the test across different audience segments, because they won’t behave comparably. Also make sure the group is large enough that your results can be statistically significant.
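If you want a rough idea of how large "large enough" is before you hit send, a standard power calculation can tell you. Here's a sketch using Python's statsmodels library; the 2% baseline click-through rate and the half-point improvement you hope to detect are placeholder assumptions to replace with your own numbers:

```python
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

baseline_rate = 0.020   # assumed current click-through rate
target_rate = 0.025     # smallest improvement you would care about detecting

effect = proportion_effectsize(target_rate, baseline_rate)
n_per_group = NormalIndPower().solve_power(
    effect_size=effect,
    alpha=0.05,   # corresponds to the 95% confidence level
    power=0.8,    # 80% chance of detecting a real difference of this size
)
print(f"Recipients needed per group: {n_per_group:.0f}")
```

With these assumed numbers, that works out to roughly 6,900 recipients per group, which is why detecting small differences takes a big list.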
Analyze your results. To be valid, the test must generate statistically significant results with a 95% or better level of confidence. What that means, in simple terms, is that there’s less than a 5% chance the difference you measured is just random noise rather than a real difference between A and B.
If you’re comfortable calculating statistics, you can find the p-value of your test results to determine the level of confidence. If you’re not as skilled in statistics, there are plenty of A/B test calculators online to help. This one, from Dr. Pete of Moz, is easy to use. It will also tell you how many more visitors or audience members your test needs if it falls short of the 95% level-of-confidence threshold. That’s helpful because the required sample size depends on factors like your baseline rate and the size of the difference you’re trying to detect.
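If you'd rather run the numbers yourself, a two-proportion z-test is a common way to get a p-value for an open-rate or click-rate comparison. Here's a sketch in Python using statsmodels, with invented counts for illustration:

```python
from statsmodels.stats.proportion import proportions_ztest

# Invented results: clicks and recipients for versions A and B
clicks = [120, 155]
recipients = [5000, 5000]

z_stat, p_value = proportions_ztest(clicks, recipients)
print(f"p-value: {p_value:.4f}")   # roughly 0.03 with these made-up counts

if p_value < 0.05:
    print("Statistically significant at the 95% confidence level")
else:
    print("Not significant yet; consider a larger test group")
```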
An A/B email test checklist
Here’s a quick list to jumpstart your A/B test planning.
- Name your test.
- Answer the pre-test questions:
- What do you want to learn from this test?
- What question will this test answer?
- How will you use the test results in the future?
- Can you generalize the test results?
- Will the information you learn be meaningful?
- Identify the metrics to observe (for example, open rate, CTR, or something else).
- Schedule your test at an optimal time for valid results.
- Choose test audience groups that will exhibit similar behavior.
- Choose a test audience large enough to give you statistically significant results.
- Review to see if you have run this test before.
- If this is a repeat test, clarify the reason and what, if anything, you’ll do differently this time.
- Analyze your results.
- Log and track your test results.
A solid library of A/B test data is a valuable marketing resource you can build for your business, one variable at a time.
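How you keep that library is up to you; even a plain spreadsheet or CSV file works. Here's a minimal sketch in Python of logging one test result, with hypothetical field names you'd adapt to whatever your team tracks:

```python
import csv
import os
from datetime import date

# Hypothetical fields; adapt these to whatever your team actually tracks
test_record = {
    "date": date.today().isoformat(),
    "test_name": "Subject line: features vs. discount",
    "variable_tested": "subject line",
    "primary_metric": "click-through rate",
    "winner": "features",
    "lift_pct": 46.0,
    "p_value": 0.01,
}

log_file = "ab_test_log.csv"
file_exists = os.path.exists(log_file)

with open(log_file, "a", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=test_record.keys())
    if not file_exists:
        writer.writeheader()   # write the header only the first time
    writer.writerow(test_record)
```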
What have you found in your A/B testing efforts? Share your results in the comments!
Casey Kelly-Barton is an Austin-based freelance B2B content marketing writer. Her specialty areas include SMB marketing and growth, data security, IoT, and fraud prevention.