Leaving Work Behind

Why (And How) to A/B Test Your Email Campaigns

When it comes to email marketing, not all your messages are going to be hits. Sometimes, your subscribers might not click on them. Even if they do, you might not convince them to convert. That’s all part of the business, but it doesn’t mean you shouldn’t make an effort to create better emails.

One of the most powerful tools at your disposal to increase engagement is A/B testing. By testing your emails, you can figure out what works best and what doesn’t. Armed with that information, you can improve future campaigns and ensure you get an excellent conversion rate.

In this article, I’m going to give you a quick introduction to A/B testing. Then I’ll show you how to A/B test your email campaigns step by step. Let’s step into the lab!

What A/B Testing Is

A/B or split testing is when you create multiple variants (usually two) of a design and send each one to a different segment of your audience. Then you find out which of those designs people responded to better, or which one got more conversions.

The goal behind these tests is simple. They enable you to know what works and what doesn’t when it comes to email campaigns. You can then use this data to design better emails in the future and improve conversions across the board.

It’s important to understand that when it comes to A/B testing, you usually want to vary only one or two elements between designs. If you show people two wholly different emails, for example, you’ll know which one they like more, but that’s not necessarily useful information. Here’s a quick example of two emails used in an A/B test, side by side:

As you can see, the only differences here are the colors and the language of the Call-to-Action (CTA). Testing small changes like this is more useful over the long term because it enables you to optimize each element of your email campaigns.

When it comes to email marketing, A/B testing can be incredibly powerful. Usually, a conversion rate anywhere between 2–5% is pretty good for a campaign. If an A/B test helps you bump that percentage up even a little, that’s great news for you, particularly as your subscriber list grows.

Do keep in mind, though – you need at least a couple hundred subscribers to run A/B tests with results you can trust. Anything less won’t give you statistically reliable data. Overall, the more subscribers you have, the more accurate your A/B tests will be.
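If you want to sanity-check whether a difference between two variants is statistically reliable, a standard tool is the two-proportion z-test. Here’s a minimal Python sketch using only the standard library – the click counts and list sizes are made up purely for illustration:

```python
import math

def ab_test_significant(clicks_a, sent_a, clicks_b, sent_b, alpha=0.05):
    """Two-proportion z-test: is the difference in click rates
    between variants A and B statistically significant?"""
    p_a = clicks_a / sent_a
    p_b = clicks_b / sent_b
    # Pooled click rate under the null hypothesis (no real difference)
    p = (clicks_a + clicks_b) / (sent_a + sent_b)
    se = math.sqrt(p * (1 - p) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    # Two-tailed p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_value < alpha, p_value

# With only 100 subscribers per variant, 4% vs 7% is inconclusive:
print(ab_test_significant(4, 100, 7, 100))
# The same rates over 2,000 subscribers per variant are conclusive:
print(ab_test_significant(80, 2000, 140, 2000))
```

Notice that the identical 4% vs 7% split flips from “could easily be noise” to “real difference” once the audience is large enough – which is exactly why tiny lists produce untrustworthy tests.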

How to A/B Test Your Email Campaigns

There are several ways you can A/B test your email campaigns. For example, you can try out multiple headlines to find out which ones your subscribers click on more.

Most modern email marketing platforms enable you to take advantage of this feature. For this example, we’re going to use Constant Contact. The first thing you’ll want to do is cook up a new campaign, assuming you already have your subscriber list set up.

Once you’re ready to send your campaign, Constant Contact will ask you to choose which email list you want to send it to if you have more than one:

On the next screen, you’ll be able to turn on an A/B Test toggle at the top right of the screen:

There should be two empty fields below that toggle, labeled A and B. Go ahead and enter both of the titles you want to test:

You can also configure which percentage of your email list will receive your campaign with each title. We recommend using an even split. Otherwise, your results might not be that accurate.
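Constant Contact handles the split for you, but it’s worth seeing how simple an even random split really is: shuffle the list, then cut it in half. Here’s a short Python sketch – the addresses are invented for illustration:

```python
import random

def split_list(subscribers, seed=None):
    """Randomly shuffle the subscriber list, then split it into
    two even halves, one per variant."""
    rng = random.Random(seed)      # seed makes the split reproducible
    shuffled = subscribers[:]      # copy, so the caller's list is untouched
    rng.shuffle(shuffled)
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

emails = [f"user{i}@example.com" for i in range(500)]
group_a, group_b = split_list(emails, seed=42)
print(len(group_a), len(group_b))  # 250 250
```

The shuffle matters: if you split an unshuffled list, group A might contain all your oldest subscribers and group B the newest, which would skew the results.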

Before you start your test, you can also configure how long it will last. All the emails go out at the same time, but Constant Contact will stop counting clicks toward your A/B test results after the period you specify. In my experience, 48 hours is a reasonable window. If someone hasn’t clicked on your email by then, chances are they won’t get around to it later.

Earlier on, I talked about how it’s best to test small variations between emails instead of designing entirely new ones. However, when it comes to titles, you have a little more leeway, since they’re self-contained elements. Just to give you an idea, here are three titles from emails I received a while ago from the Headspace app:

  1. Tired of sleepless nights?
  2. Goodbye sleepless nights
  3. Have you tried our new sleep sounds?

In my experience, subject lines like numbers one and three tend to do better. The first identifies a problem and implies the email can provide a solution, while the third outright tells you about a product you might be interested in. However, I only know that from experience. Even so, I’d be crazy not to A/B test my email campaign titles, since it’s free information I can use to get more conversions.

When you’re ready to launch your first A/B test, all that’s left to do is hit the send button! Keep in mind – the process might vary depending on which email marketing platform you’re using, but almost all of them include A/B testing features.

Conclusion

A/B testing sounds like the kind of feature only professional marketers use. However, these days, most modern email marketing platforms enable you to run split tests for your emails quite easily. Once you start running your first tests, you’ll be amazed at the kind of data you can dig up and how much it can help improve your emails.

Testing your emails isn’t as complicated as it sounds either. You just need to create more than one variant of the same campaign, send them to your subscribers, and check out which one does best. Then you’ll have a better idea of what works and what doesn’t.

If you want to take a shot at A/B testing your email campaigns, we recommend using Constant Contact. They offer a 30-day free trial, so you can get acquainted with how the whole process works at your own pace!