I know what you’re about to say: you’re struggling to create one email, let alone multiple versions of the same email.
It may seem like A/B testing is just a nice-to-have approach that you’ll tackle when you’ve got more bandwidth, but that’s not the case.
Read on to discover why you need to prioritize A/B testing, the different testing approaches you could take, and the huge results you could see.
Why do email marketers need to A/B test?
It’s no secret that we can always learn more about our customers and target audience. A/B testing allows you to do just that through a series of targeted, strategic tests.
Think of A/B testing as your customer consultant, whispering in your ear about how your audience reacts to certain messaging, offers, calls to action (CTAs), and other elements.
A/B tests can increase conversion rates by as much as 49%, making them a worthy time investment for anyone looking to drive conversions (aren't we all?).
Check out these countless A/B testing success stories if you need more convincing.
A/B testing: Where to start?
So how do you start A/B testing effectively and regularly?
Well, first think about the key elements in your emails and what you could switch up as part of your A/B testing strategy. Examine the results you've seen in previous campaigns and on your website: what messaging, colors, and designs worked best?
It's up to you to decide what your primary objective is, then start A/B testing to achieve it!
To get you started, marketers often test for these common email campaign elements:
- The subject line (72%)
- Message (61%)
- Layout and images (50%)
- CTA (50%)
- Days of the week sent (46%)
- Time of day sent (39%)
- Personalization (34%)
- Landing page (32%)
- Target audience (30%)
- From line (26%)
- Mobile layout and images (15%)
Testing these different elements will give you a diverse range of results and insights. For example, changing the text or color of your CTA button will likely impact the number of clicks and conversions you get, while switching up the subject line may influence how many people open your email.
It’s important to note that you should only A/B test one element at a time to ensure you get accurate results.
Additionally, for your initial tests it's recommended that you run the A/B test across your entire email list rather than limiting it to a specific segment.
A lot of experienced email marketers we speak to like to use the 10/10/80 test. This involves sending one version to 10% of your list and the other version to another 10%, then sending the winning version to the remaining 80%.
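If it helps to see the mechanics, here's a minimal sketch in Python of how a 10/10/80 split might be drawn at random. The subscriber list, function name, and group sizes are all hypothetical, not part of any particular email platform's API:

```python
import random

def split_10_10_80(email_list, seed=42):
    """Randomly split subscribers into two 10% test groups
    and an ~80% remainder that receives the winning version later."""
    shuffled = email_list[:]               # copy so the original list is untouched
    random.Random(seed).shuffle(shuffled)  # fixed seed keeps the split reproducible

    tenth = len(shuffled) // 10            # size of each 10% test group
    group_a = shuffled[:tenth]             # receives version A
    group_b = shuffled[tenth:2 * tenth]    # receives version B
    remainder = shuffled[2 * tenth:]       # gets the winner after the test
    return group_a, group_b, remainder

# Hypothetical usage with a made-up list of subscriber addresses
subscribers = [f"subscriber{i}@example.com" for i in range(1000)]
a, b, rest = split_10_10_80(subscribers)
print(len(a), len(b), len(rest))  # 100 100 800
```

The key point is that both 10% groups are drawn at random from the same list, so any difference in results comes down to the element you're testing rather than who happened to receive each version.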
Moving forward: Turning insights into action
Once you've sent out an A/B test, it's time to analyze the results and glean the insights you need to inform your next campaign.
Document the details of the A/B test: the element you changed and the design or copy details of the two versions. Then track the results and analyze how each version performed.
Zachary Williams, founder and CEO of digital marketing agency Venveo, stresses the importance of looking at the results from all angles. “If one iteration gets you more clicks, that might look like the winner of the test at first glance, but if the other iteration gets you more conversions from fewer clicks, then it’s actually the winner. Remember, you’re not just looking for a higher quantity of response, you’re looking for quality of response.”
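To make that concrete, here's a small sketch with made-up numbers showing why the version with fewer clicks can still win once you compare conversion rates:

```python
# Hypothetical results for two versions of the same email
results = {
    "Version A": {"clicks": 500, "conversions": 20},
    "Version B": {"clicks": 320, "conversions": 35},
}

for name, stats in results.items():
    # Conversion rate = conversions as a share of clicks
    rate = stats["conversions"] / stats["clicks"]
    print(f"{name}: {stats['clicks']} clicks, "
          f"{stats['conversions']} conversions ({rate:.1%} click-to-conversion)")

# Version A wins on raw clicks, but Version B converts a larger share of them —
# the "quality of response" Williams is talking about.
```

In this made-up example, Version A converts 4% of its clicks while Version B converts roughly 11%, so Version B is the real winner despite driving less traffic.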
After comparing the results of the two versions (for both quantity and quality of response), incorporate that knowledge into your ongoing A/B testing strategy and your wider email marketing strategy.
Each A/B test should build on that knowledge base and inform your strategies, ensuring your team continually learns and improves.
Above all else, remember the modern marketer's mantra: always be testing!
If you have any questions about A/B testing or email marketing in general, give us a holler by opening the chat window down on the right or by emailing hello@stensul.com.