3 Steps to More Effective Email A/B Testing
You know that you should be email A/B testing – everyone says so. But frankly, a lot of marketers test in a scattered, one-off way and never get anything useful from their tests.
Spending a ton of time on A/B tests that don’t teach you anything means you’re not actually improving your email or broader marketing strategy. And flat email performance means missed opportunities for additional revenue.
To get real value out of A/B testing, what you really need is a framework. In this post, I am going to help you develop a framework that will make it easier and more efficient to manage your A/B testing. When you’re getting all you can from testing, it’ll help you better understand your customers, improve your marketing overall, and increase sales.
Creating a spreadsheet to map out your A/B testing is the essential first step.
When coming up with a long-term A/B testing strategy, it is important to consolidate all your results and planning into one place.
I manage most of Klaviyo’s A/B testing for our own marketing. When I started the job, one of the first things I did was set up my spreadsheet and I still use it regularly. Here’s a sample version:
- Test stage: The first column should tell you whether a test has been completed, is currently in progress, or is still just an idea. This will help you stay on top of what you have already tested and which tests you should be checking results for.
- Idea: This is what you’re actually testing! In this column, include the location of your test: is it a campaign, a welcome series flow, an abandoned cart flow, etc.? Additionally, indicate whether you are simply adding something to a ‘B’ variation of an existing email, or what each variation in the test is focusing on.
- Hypothesis: You may be thinking that this is a column that you can skip. But this is actually one of the most important columns. Seldom do marketers just test wildly. If we’re testing two images or two subject lines in an email, we usually have a gut feeling which one will be more successful. By writing it down you can see how well your expectations match your customers’ behavior. This allows you to really improve your understanding of your customers. And the better your understanding of your audience, the more you can tweak your marketing to increase engagement and ultimately sales.
- Success Metric: Your test needs to match your metric, so it’s imperative to include the metric the test will measure. If you change the CTA, that won’t have an impact on open rate. Subject lines and preview text are the best places to test open rate; body content and CTAs are good for impacting click-through rate, conversion rate, and revenue.
- Results: Once your test finishes, you should include the results. Rather than having to dig back through old emails to find your results, they will all be in one place. This will also make it easier to think about what your future tests could be because you can easily look over your past results.
- Follow Up: The results of tests may lead to new ideas for next tests, so this is also something you should definitely include in your spreadsheet! If your hypothesis is successful, you can think about what the next test could be to further validate it.
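As a rough sketch, the columns above could be captured in a simple CSV that any spreadsheet tool can open. The example test and numbers below are made up for illustration:

```python
import csv
import io

# Columns matching the tracking spreadsheet described above
COLUMNS = ["Test stage", "Idea", "Hypothesis", "Success metric", "Results", "Follow up"]

# One example row -- this test and its numbers are hypothetical
row = {
    "Test stage": "Completed",
    "Idea": "Welcome series email 1: product-only image (A) vs. person using product (B)",
    "Hypothesis": "Showing a person using the product will lift click-through rate",
    "Success metric": "Click-through rate",
    "Results": "B won: 2.4% CTR vs. 2.0% for A",
    "Follow up": "Test which type of lifestyle image performs best",
}

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=COLUMNS)
writer.writeheader()
writer.writerow(row)
print(buf.getvalue())
```

However you store it, the point is the same: every test gets a stage, an idea, a hypothesis, a metric, a result, and a follow-up, all in one place.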
Big and Little Tests
So now you have a central place for all your tests and results, but you still need to think about planning strategically. One of the biggest challenges is knowing the scale to which you should be testing. An easy way to think about this is doing a big vs. little A/B test.
A little test can be thought of as an optimization test. You are testing one specific element of an email to see how it impacts behavior. For example, testing two images in your body copy to see which results in better click rates: one of just your product and one that has a person using your product. Little tests are imperative and should be done regularly.
You’ve probably heard this many times (we’ve even written it on this blog before) – you should only test one thing at a time. That is often true. However, once in a while it is a good idea to run a big test. A big test is one where pretty much everything in your email body is different. One email might be super designed with tons of images and the other text-based. Or one email is long with a lot of personalization while the other is much shorter with a clear simple call to action.
So why would you run a big test? Let’s say you spend all your time running smaller tests to optimize your current email. If it turns out that a different base design performs better than your super-optimized email even without much optimizing, then you’ve wasted valuable time.
This is known as the pitfall of the local maximum.
Let’s say your current design has a base CTR of 2%. You might be able to tweak it with tons of little tests to get a 2.5% CTR. That 2.5% would be considered your local maximum. But you could test a different design and find that without any tweaking, it comes in at 3.8% CTR. Clearly the second email design should be your starting point. If you tweak that email instead, you might be able to get your CTR to 4.3% – which could really increase your sales!
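The arithmetic behind that example is simple to check. Using the illustrative numbers above:

```python
# Illustrative CTRs from the local-maximum example above
current_base_ctr = 0.020       # current design before optimization
current_optimized_ctr = 0.025  # the local maximum after many little tests
new_base_ctr = 0.038           # a different design, no tweaking yet
new_optimized_ctr = 0.043      # the new design after its own round of little tests

# Even unoptimized, the new design beats the fully optimized current one
assert new_base_ctr > current_optimized_ctr

# Relative lift from switching starting points, then optimizing
lift = (new_optimized_ctr - current_optimized_ctr) / current_optimized_ctr
print(f"Relative CTR lift: {lift:.0%}")  # prints "Relative CTR lift: 72%"
```

In other words, no amount of small tweaks to the first design could have closed that gap – which is exactly why the occasional big test is worth the disruption.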
This is not something you’re going to want to do in every instance. This is a good thing to do periodically as your audience and brand grows and matures. It can also be a good sanity check to make sure the direction you are taking your marketing is the best fit for your audience.
Test to better understand your customers
Now that you’re organized and have a record of what is and isn’t working, it’s time to really absorb what your results are saying about your customers.
Let’s say you tested 3 subject line variations:
- Our customers say we have the softest t-shirts around
- One of our t-shirts will last you 3 times longer than your average shirt
- We have shirts in every color, what are you missing?
These subject line variations are actually telling you a lot about your customer. Let’s say variation 2 resulted in the highest open rate. This tells you that your customers really like your brand because of how long your t-shirts last. If you know people respond to that type of messaging, then it can be incorporated into other parts of your marketing – beyond email! You could add content around your long-lasting t-shirts to your website or your social media messages.
Plus you might be able to identify audience segments based on responses to different messaging. Let’s say you test adding a customer recommendation to an email. You find that overall your two variations did not perform that differently. However, when you look just at how first-time buyers responded, you find that your conversion rate was much better for your email with customer recommendations. From this you’ve now learned that first-time buyers are swayed by recommendations even if repeat buyers are not. This information could lead you to create a first-time buyer segment for your flows or campaigns that includes customer recommendations, because you know this particular sub-audience responds better to them.
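A minimal sketch of that segment-level check, using entirely made-up numbers: the two variations look nearly identical overall, but splitting the results by segment reveals the difference.

```python
# Hypothetical results: (conversions, recipients) per variation, split by buyer segment.
# Variation A has no customer recommendation; variation B includes one.
results = {
    "first_time": {"A_no_recs": (10, 1000), "B_with_recs": (20, 1000)},
    "repeat":     {"A_no_recs": (40, 1000), "B_with_recs": (31, 1000)},
}

# Overall, the two variations look nearly identical...
for name in ("A_no_recs", "B_with_recs"):
    conversions = sum(seg[name][0] for seg in results.values())
    recipients = sum(seg[name][1] for seg in results.values())
    print(f"Overall {name}: {conversions / recipients:.2%}")

# ...but splitting by segment shows first-time buyers respond to recommendations
for segment, variations in results.items():
    for name, (conversions, recipients) in variations.items():
        print(f"{segment:10s} {name}: {conversions / recipients:.2%}")
```

With these numbers, first-time buyers convert at double the rate when recommendations are included, even though the overall rates are a rounding error apart – the insight only appears once you segment.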
You can then take this a step further and start testing what kind of recommendations they prefer – for example, ones about price or comfort. Once you know what specific types of recommendations this audience segment likes, you can optimize your marketing beyond email by adding those specific types of recommendations to your website.
Once you have a solid framework for A/B testing in place, it should quickly become a core part of your marketing strategy. When you A/B test in a way that helps you to better understand your customers, then you are able to improve your emails and overall marketing.
Getting organized and putting all your tests and results in one place makes it easier to look at A/B testing as a long-term marketing strategy – not just a one-time standalone test to optimize a specific campaign. It will also help you to quickly see what you’ve already tried and inspire additional follow up tests.
Testing isn’t JUST about the little things. When you occasionally run big tests on your marketing direction, you can really challenge your expectations about how your emails perform. This will ensure you’re on the right path and quickly change direction if you’re not!
Finally, when you’re running tests in email that give you new insights about your customers, you can take these learnings and apply them to other parts of your marketing. Applying your results across marketing functions and marketing channels makes A/B testing one of your most valuable assets.