Not all email marketing campaigns are created equal. Great email marketers incorporate A/B testing into their strategy and leave room for experimentation—even after pressing send.
Though marketers may experiment with different website landing pages for a few weeks to uncover insights about a target audience, email A/B testing delivers results in as little as a few hours. Learning your readers’ preferences can help you craft more successful email campaigns with better open rates and more click-throughs—resulting in more overall conversions.
Here’s what to know about A/B testing and how to use the technique to achieve email marketing success.
What is email A/B testing?
Email A/B testing, also called email split testing, pits two versions of the same email against each other to compare engagement. Each version goes out to a sample of your audience, and the winning email (determined by performance on a chosen variable) is sent to the remainder of the recipient list.
Most email automation software offers this feature, making it easy to try new strategies with each campaign.
Email variables in A/B tests
When it comes time to A/B test, email marketers have multiple variables at their disposal. Here are a few key elements worth testing in your upcoming email marketing campaigns:
Subject line
The subject line is one of the most commonly tested variables in email marketing campaigns. The results from running a test of two different subject lines are as close to an unbiased gut check as you’re likely to get from your audience. The email with the higher open rate will be the winning version.
To get the most out of testing two subject lines, focus on notable differentiations like:
- Word order. Word order can be useful for encouraging clicks, especially if you lead with a promotion or discount code. Try rearranging the contents of your subject line to see whether customers respond better to a direct command (“Take 50% off all end-of-season sale items”) or a descriptive statement (“End-of-season sale items are now 50% off”). You can also test a statement versus a question (“Want 50% off? End-of-season items are now on sale”).
- Emojis. Emojis can help you stand out in an inbox full of text. Many brands use them to accent their subject lines. Depending on your audience, emojis may or may not drive opens. A/B testing is one way to gauge the degree of playfulness your customers like to see.
Call to action
The call to action (CTA) is a critical component of any email campaign because it is the action you hope your audience takes after opening your email. Try A/B testing your CTA as a hyperlinked button (and the language on the button, like “Buy Now,” “Get Yours,” or “Read On”) versus a plain text link to see which is more likely to boost your click-through rate.
Tone
Email A/B tests can be a great way to experiment with tone. While you don’t want to stray too far from your established brand voice and identity, your email marketing strategy might involve a more conversational tone to boost brand loyalty.
Does a lighter tone convert more often than content focused on scarcity? The click-through rates or conversions will show you what resonates with most subscribers.
Images
Testing images can mean sending one email without any images or switching which images you use in your message. Since recipients won’t see these images until they open the email, you can use click-through rates or conversions to measure the success of either option.
Design
The layout is another way to test visuals on your audience. For some brands, this might mean testing design-heavy email templates versus plain-text emails. You might imagine that a big, splashy email body would win every time, but sometimes, a simple layout can succeed.
Plain-text marketing emails worked for Glade Optics founder Curt Nichols: “The body of the email is plain text, no photos, maybe one or two hyperlinks. The email comes in looking like it’s an email from a friend or a family member.”
Curt developed this approach through A/B testing. “We tested the more traditional marketing emails, with lots of flashy pictures and things like that, against these plain text emails,” Curt told Shopify Masters. “The engagement level on these [plain-text] emails is so much higher.”
Length
Storytelling is a key pillar for many brands and businesses, which can translate to longer, narrative-driven body copy in marketing emails. For others, long emails are a conversion killer.
Testing copy length is less about adhering to one approach or the other and more about diversifying your email marketing strategy. If the longer format in your test version works better than expected, consider it an opportunity to delve into niche topics every so often.
Content
You can tweak the content of a marketing email, but most small copy edits won’t produce meaningful differences in results. The key is to consider the content of the email from a broader perspective: Do customer testimonials get better results than product details? Does a sale announcement catch more people’s attention than a Q&A with an industry expert?
Personalization
Testing a personalization variable may include sending from your company name versus an individual employee’s name or using the recipient’s name in the subject line or body copy to grab their attention. Open rates will help you identify the winner in this A/B test if you adjust the send name or the subject line.
Send time
Is your audience more likely to open your email on their lunch break or on their way home after work? Are they morning people or night owls? Experimenting with send times—and send days—narrows down when your target audience is most likely to open and click, allowing you to plan for maximum engagement going forward.
How to conduct an email A/B test
- Establish the goal
- Determine your test elements
- Choose the winning criteria
- Send to a randomized segment
- Observe the results
When you’re ready to test your marketing email, follow these steps:
1. Establish the goal
Before beginning an A/B test, know why you’re testing what you’re testing. Are there anticipated outcomes or hypotheses you’re looking to test? What are you looking to improve or learn?
Say you’re a high-end sneaker brand. Maybe the emails you’ve been sending about new product drops aren’t leading to as many opens as you’d like, and you suspect it might have to do with a dry approach to subject lines. Knowing this will determine how meaningful the exercise is to your overall strategy.
2. Determine your test elements
Upload a control version of the email into your campaign scheduler. Make this email as close as possible to the previous emails you’ve sent. Then, create a version where you alter the components you want to test in preselected fields, like the subject line.
For the sneaker brand, the control subject line might be something like “New Fall Styles Now Live,” and the test version could be “Meet the Siren Song of Every Sneakerhead You Know.”
3. Choose the winning criteria
Will you measure success by open rate? Or will the winner have a higher percentage of in-email actions like click-throughs? The variable you test will determine which metric you track.
Open rate reflects a strong subject line, while a click-through indicates that someone was interested enough to click a link in the body of your email. A conversion often means those clicks led to a purchase.
Our sneaker brand will use open rate as the winning criterion to test the strength of a more creative subject line. If tweaking the subject line leads to a higher open rate, the brand might experiment with other criteria.
4. Send to a randomized segment
Usually, your email provider will automatically send both versions to a randomly selected subset of subscribers. The percentage of your total subscriber list you test is entirely up to you. If your email client doesn’t have an automated A/B test feature, you can segment your list manually.
Ideally, you want to use a large enough segment that a notable preference emerges. If your list is small, aim for a 50/50 split; if it’s larger, testing anywhere from 20% to 50% of the total list can inform which email should go to the remaining subscribers. For example, if the sneaker brand has 10,000 email subscribers and sends the test to 20% of them, 1,000 subscribers will receive the control version and another 1,000 will receive the test version. The winner will then be sent to the remaining 80%, or 8,000 subscribers.
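The segment math above can be sketched in a few lines. This is just an illustration of the arithmetic, not any particular email platform’s feature; the function name and 20% figure mirror the sneaker-brand example.

```python
def split_for_ab_test(list_size: int, test_fraction: float) -> dict:
    """Return how many subscribers receive each email version."""
    test_total = int(list_size * test_fraction)
    control = test_total // 2            # half the test segment gets version A
    test = test_total - control          # the other half gets version B
    remainder = list_size - test_total   # receives the winning version later
    return {"control": control, "test": test, "remainder": remainder}

print(split_for_ab_test(10_000, 0.20))
# → {'control': 1000, 'test': 1000, 'remainder': 8000}
```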
5. Observe the results
Send your split test and watch for trends that emerge between the two variants over the period you’ve selected, usually an hour or two. Once one has been declared statistically superior, take note of what worked and consider sending the winning version to the members of your list not included in the test.
Email A/B testing FAQ
How do I run an A/B test email?
Most email marketing software makes it easy to test one variable or multiple variables simultaneously. The software will select a winning variant according to your chosen metric (open rate or engagement metrics like click-throughs or conversions) and then send the winning variation to the remainder of the recipients.
What are the metrics for an email A/B test?
When A/B testing a marketing email, there are three main metrics for determining a statistically significant difference between the two versions: open rate, click-through rate, and conversion rate. Open rate measures the percentage of your audience that opened an email. A click-through indicates that someone clicked a link in the body of your email. A conversion often means those clicks led to a purchase.
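The three metrics are simple ratios over the number of emails sent. A quick sketch, with hypothetical counts (note that some platforms compute click-through rate per open rather than per send):

```python
def email_metrics(sent: int, opens: int, clicks: int, purchases: int) -> dict:
    """Compute the three core A/B-test metrics as fractions of emails sent."""
    return {
        "open_rate": opens / sent,            # share of recipients who opened
        "click_through_rate": clicks / sent,  # share who clicked a link
        "conversion_rate": purchases / sent,  # share whose clicks led to a purchase
    }

print(email_metrics(sent=1000, opens=250, clicks=75, purchases=12))
# → {'open_rate': 0.25, 'click_through_rate': 0.075, 'conversion_rate': 0.012}
```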
Why is A/B testing important in email marketing?
A/B testing is a powerful tool for grounding your marketing decisions in real audience data rather than guesswork. Testing copy and design choices helps refine your marketing strategy and improve your email engagement.