This blog post accompanies our latest ebook: 15 A/B testing ideas for ecommerce email marketers. Click here to download.
It’s 2017, and the days of low-quality, irrelevant marketing messages are well and truly over; if your email doesn’t seem interesting to a recipient, it will be deleted, unsubscribed from or - dare we say it - marked as spam (😧 ).
As a result, it’s critical that your email marketing is crafted in such a way that it actually resonates with your subscriber list on a personal level.
The only way to know how your email marketing is *really* performing right now is to look at the numbers. The numbers don’t lie. If they are good, that’s great, but if they’re not so good, changes are probably needed (for some context, the average open rate for ecommerce email is 16.75% and the average click rate is 2.32%).
Fortunately, A/B testing can help you go about making those changes.
Recap: What is A/B testing in email marketing?
Also known as split testing, A/B testing refers to the method of creating and delivering different versions of an email to different portions of your subscriber base to observe which variation performs best. A test will usually include just one variable (the part of the email that is different for each version), and one metric that will be used to measure success.
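Most email platforms handle the split for you, but the mechanics are simple enough to sketch in a few lines of Python. This is a minimal illustration, not a production tool: the function name and email addresses are hypothetical, and a real test would also track the success metric per group.

```python
import random

def split_test_groups(subscribers, seed=42):
    """Randomly split a subscriber list into two equal-sized test groups.

    A fixed seed keeps the split reproducible; randomising the order
    before splitting avoids bias from how the list happens to be sorted.
    """
    shuffled = subscribers[:]
    random.Random(seed).shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]

# Illustrative subscriber list; group A gets version A of the email,
# group B gets version B, and one metric (e.g. open rate) decides the winner.
subscribers = ["a@example.com", "b@example.com", "c@example.com",
               "d@example.com", "e@example.com", "f@example.com"]
group_a, group_b = split_test_groups(subscribers)
```

The key point the sketch makes concrete: each subscriber lands in exactly one group, and the assignment is random rather than, say, alphabetical, so any difference in results can be attributed to the variable being tested.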
In this blog post, we’ll explore seven aspects of email marketing that every ecommerce marketer should be testing - as well as some ideas for how to test them. (For A/B testing best practice and advice on how to create great split testing hypotheses, download our ebook here.)
7 A/B Tests
1) Subject line
A/B testing a subject line can help a brand’s marketing success in both the short-term and the long-term; for example, it can be used to determine which of two versions works best for an isolated campaign, or as a way to detect subtle patterns/trends that emerge over time.
- Length: Short and snappy or a bit more loquacious?
- Copy: Should you go straight to the point or keep it cryptic?
- Personalisation: Try including your recipient’s name (or, if relevant, even the product/category they viewed)

- Questions: To include questions or not to include questions?
- Emojis: You either love them or hate them. How do most of your subscribers feel?
2) Sender name
Who should you say your email is from? Would a first name work best? Or would it be more professional to either use first and last name or the name of your brand?
It may be that different formats lend themselves better to different types of email; for example, a customer service email may want a completely different vibe to a fun newsletter.
- Person name and brand name: For example, Laura Jones from Rose Gold
- Brand name: Just Rose Gold
- Brand name and subject: Rose Gold, Fashion News
- Brand name and brand slogan or description: Rose Gold - Since 1880
- Team/entity/department: Rose Gold Newsletter
3) Number of emails in an automated campaign
Does a one-off welcome email convert new subscribers to first-time buyers better than a multi-stage welcome series? What’s the optimal number of times you can remind a cart abandoner of what’s left in their cart before they tune out? You can use split testing to determine the number of emails that should be in an automated campaign.
You may want to test:
- The number of emails in a campaign - e.g. one email versus a three-part series
- Sending no email at all - e.g. enrolling half of your cart abandoners in a basket abandonment campaign, while not sending the other half any emails at all. That way, you can tell whether those who didn’t receive the campaign would have converted anyway.
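Evaluating that kind of holdout test comes down to comparing conversion rates between the emailed group and the no-email group. Here is a minimal sketch with hypothetical numbers (the function name and all figures are illustrative, not real benchmarks):

```python
def conversion_lift(campaign_buyers, campaign_size, holdout_buyers, holdout_size):
    """Compare the conversion rate of the emailed group against the holdout.

    The lift is the extra conversion attributable to the campaign itself,
    rather than to abandoners who would have bought anyway.
    """
    campaign_rate = campaign_buyers / campaign_size
    holdout_rate = holdout_buyers / holdout_size
    return campaign_rate, holdout_rate, campaign_rate - holdout_rate

# Hypothetical example: 120 of 2,000 emailed abandoners bought,
# versus 70 of 2,000 in the holdout who received nothing.
campaign_rate, holdout_rate, lift = conversion_lift(120, 2000, 70, 2000)
# campaign_rate = 0.06, holdout_rate = 0.035, lift = 0.025 (2.5 points)
```

If the lift is close to zero, the campaign may simply be reaching people who would have converted anyway; with real data you would also want enough recipients in each group for the difference to be statistically meaningful.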
4) Send time
A/B test the send time of your emails to discover whether there’s a certain day and/or time your recipients are most likely to engage.
- Time of day
- Specific time vs optimised time (i.e. choose a particular time to send and schedule, or rely on data-driven tactics such as send-time optimisation)
5) Hero image
In an industry increasingly centred on the “visual” and what “looks good”, your email’s hero image is likely to influence whether recipients hit “delete” or “shop now”.
- Lifestyle imagery vs product imagery
- Stand-alone products vs people wearing products
- A single image vs multiple images
Other things you could experiment with here include user-generated content and/or social influencer content, where you use authentic images as opposed to styled photo shoots.
6) Text-only email
Today, thanks to gifs, emojis, Instagram posts, selfies, Snapchat (the list goes on…), images are a huge part of digital communication - yet in some circumstances, text-only remains best.
For example, certain automated emails - such as welcome emails, or messages sent just to your VIP customers - can sometimes work better as text-only to give the illusion it’s been personally (and manually) sent to them by a member of your team.
- A/B test a customer service-style, text-based template vs your normal template for an appropriate campaign (you may want to just try it out with a certain segment for starters. For example, VIP recipients may be more inclined to believe your email has been hand-typed than a lead).
7) Incentives
To discount or not to discount, that is the million-dollar question.
From 10% off in a welcome email to free international delivery when you spend over £40, for some brands incentives are a fundamental part of an email marketing strategy… but do they work for you? And if they do, which sort?
These are questions that can be answered via A/B testing; for example:
- Include a coupon in an automated campaign (e.g. cart abandonment)
- Go completely incentive free
- Test different types of incentives, for example:
  - The incentive itself: free returns vs free shipping
  - The presentation: % off vs £ off
  - Free product vs free experience (such as a holiday)
Want to learn more?... 👓
This blog post is just a taste of what you can do with A/B testing in ecommerce; for more testing ideas, as well as professional advice on things to bear in mind when running your tests, download our ebook on the subject here.
Whilst it’s likely that, in the future, artificial intelligence will mean marketers spend less time checking in on campaign performance (and making small tweaks), and less time manually setting up A/B tests, for now this marketing technique remains extremely powerful.