Q&A: 3 Experts on Real-World Lessons from Email Tests of the Past

As marketers, most of us have executed A/B tests that we thought would yield a specific result, only to see our hypotheses blow up in our face. These experiences usually leave us with bruised egos and more questions to answer, but they also tend to uncover insight that changes how we approach a particular strategy or channel.

One such experience from my career came when I led the redesign of a startup’s website. I chose to re-write the copy for the new site, which seemed like an easy task given my opinion of the old copy (hint: I hated it). So, when the time came to test my copy against the messaging of the past, I was confident. Obviously, mine would prevail.

Except it didn’t. And it wasn’t even close.

3 Marketing Experts, 30+ Years of Experience, 3 Lessons Learned

“Experience is simply the name we give our mistakes.” – Oscar Wilde

The lesson I learned from the experience above was that it’s never safe to assume that some big idea (even if it seems better) will deliver the results you expect. The only way to find that out is to test your ideas and analyze the results — ideally, before you unleash that new thing on the world.

It also doesn’t hurt to study the experiences of other marketing experts, which is why I reached out to three of Klaviyo’s email gurus to share the lessons they’ve learned from tests of the past.

In the Q&A below, you’ll find real-world insight from actual email tests, advice on the parameters that need to be in place for a test to be successful, and tips for establishing the right testing cadence. The expert panel includes:

Jake Cohen, Klaviyo’s Director of Product Management and the former Director of Customer Marketing at DataGravity and a co-founder at Privy
Agata Celmerowski, Klaviyo’s VP Marketing and the former VP Marketing at Databox and Campaign Monitor
Brian Whalley, Klaviyo’s Director of Marketing and a former Director of Marketing at InsightSquared and HubSpot
Collectively, Jake, Agata, and Brian have 30+ years of experience and have run hundreds of tests. So, what have they learned from those tests? And what advice would they give to fellow email marketers?

Q: What’s one of the most memorable email tests you’ve run and what did you learn from it?

Jake Cohen: One of my biggest successes came when we tested a plain-text email against a nicely designed email. Needless to say, the results were surprising. Both emails featured the same content, but the plain-text variant tremendously outperformed the designed version in terms of open rate and engagement (replies and clicks).

What we gleaned from that test was that our audience had preconditioned themselves to ignore overly designed emails. They wanted to have human conversations that didn’t feel produced. As a result, they responded best to emails that offered a provoking thought and a tone that made them feel like there was an actual human on the other side of the computer.

Now, this was for a B2B SaaS company, so what we learned may not apply to ecommerce or other consumer industries. But this lesson does: When you define the intent of your email, everything about it — design included — should align with that intent. If you want people to believe, for example, that you make the most beautiful furniture in the world, then every element in your email — images, copy, design — should support that message.

Agata Celmerowski: I’m going back to 2005 for mine, and it wasn’t a straightforward A/B test so much as an experiment.

I was managing email-based programs for a media company. We used emails to drive leads for our clients, and we were seeing declining response rates. Because clients paid on a per-lead basis, the decline in response was threatening revenue. At the same time, there were a couple of areas of the business where demand for our programs was so high that we regularly sold out of email inventory, since we were unwilling to send any given person more than one email per day.

So, I had to figure out how to drive more leads and create more inventory without messaging our subscribers any more frequently than we already did. We were already segmenting our email subscribers based on the topics they told us they were interested in when they signed up. But I thought we could factor in another dimension to how we were cutting our lists by taking a look at what other content people were consuming on our sites.

Fast forward a couple months and a lot of quality time spent with our analytics team, and we automated targeting for all campaigns to rely primarily on how people behaved on our websites. Back then, that required a hell of a lot of custom development. But it was worth it: the number of people we emailed for any given promotion declined by 50% on average, which doubled inventory overnight. And the response to those promotions increased by 10%.
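If you wanted to prototype the kind of behavior-based targeting Agata describes, the core logic is simple to sketch. The Python snippet below is purely illustrative: the data shapes, field names, and the rule that a subscriber must also have recently viewed content on the promotion’s topic are assumptions, not a description of her team’s actual implementation.

```python
from datetime import datetime, timedelta

def build_segment(subscribers, page_views, promo_topic, lookback_days=30):
    """Target a promotion at subscribers who declared interest in its topic
    AND recently viewed related content on the site (hypothetical data shapes).

    subscribers: list of dicts like {"email": ..., "topics": [...]}
    page_views:  dict mapping email -> list of (topic, viewed_at) tuples
    """
    cutoff = datetime.now() - timedelta(days=lookback_days)
    segment = []
    for sub in subscribers:
        declared = promo_topic in sub.get("topics", [])
        # Behavioral signal: at least one recent page view on the promo's topic.
        behaved = any(
            topic == promo_topic and viewed_at >= cutoff
            for topic, viewed_at in page_views.get(sub["email"], [])
        )
        if declared and behaved:
            segment.append(sub["email"])
    return segment
```

Requiring both signals is what shrinks the list: you email fewer, better-qualified people per promotion, which is how a smaller send can produce a better response rate while freeing up inventory.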

Brian Whalley: One of the most memorable email tests I ever ran was way back in 2008. I was working for a small music contest website (OurStage) and I was in charge of the email newsletter we sent to keep people active on our website. One of the audiences we wanted to engage was the musicians on our site, because we wanted them to continue participating in our community.

At the time, it was really difficult to get those artists engaged via email. They just didn’t respond. The subject lines we used would be something like: “[OurStage] May 2008 OurStage New Channels and Rules Changes”. There wasn’t a strong call to action. Nothing demanded real attention. And it essentially looked like the privacy policy update emails you get (and definitely ignore).

So, I started trying to personalize the newsletter based on the band name for that artist.

This led to subject lines like: “[OurStage] Artist Update for The Shapes – June 11, 2008”. This test more than tripled our open rates, and our click rates held steady as opens rose. That told me we’d had the right content in our weekly campaign all along, but we’d failed to show we had a meaningful message for that specific artist. By making the email feel more substantial, personal, and timely, we were able to drastically improve engagement.

Q: What are the keys to a successful email test?

Brian Whalley: For me, it comes down to answering one question: What result are you trying to drive?

Going back to my example above, I was pretty sure we had the right email content. I just needed to get more people to read it. If I could do that, I was sure they’d respond and get involved. So, I’d say this: Don’t run tests just to say you’re running them. Instead, think about the behavior you’re trying to drive and the steps you need to test to get there.

Jake Cohen: I completely agree with Brian. Outside of thinking about your goal, I’d ask some other questions:

What’s your KPI?
What’s your baseline?
What’s your control group?
Do you have a sufficient sample size?
Have you controlled for certain variables (e.g. time of day, day of week, subject line, quality of relationship with recipient, current events, etc.)?
I think it’s also important to consider how you’ll measure the performance of your tests. What tool will you use? And are you really ready to tear through some CSVs?
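On the sample-size question, a quick back-of-the-envelope calculation before you split your list can tell you whether a test is even worth running. Here is a minimal Python sketch using the standard two-proportion sample-size formula; the baseline open rate and the lift to detect in the example are placeholders, not recommendations.

```python
import math
from statistics import NormalDist

def sample_size_per_variant(p_baseline, p_target, alpha=0.05, power=0.80):
    """Recipients needed in EACH variant to detect a lift from p_baseline to
    p_target at the given significance level and statistical power."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance
    z_beta = NormalDist().inv_cdf(power)            # desired power
    p_bar = (p_baseline + p_target) / 2
    numerator = (
        z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
        + z_beta * math.sqrt(p_baseline * (1 - p_baseline) + p_target * (1 - p_target))
    ) ** 2
    return math.ceil(numerator / (p_target - p_baseline) ** 2)

# e.g. to detect an open-rate lift from 20% to 23%:
print(sample_size_per_variant(0.20, 0.23))  # prints the recipients needed per variant
```

If the number that comes out is larger than the list you actually have, the test won’t give you a reliable read, and you may be better off testing a bigger, bolder change.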

Agata Celmerowski: I’ll agree with Brian and Jake here, and add that the scientific method should be part of all testing — email and otherwise. People go wrong when they don’t have a clear hypothesis, structured so that they can actually learn from either a success or a failure.

Q: How often should marketers be running email tests?

Brian Whalley: I’d say on a weekly basis — whether it’s a dramatic test that might require a lot of prep work, like completely changing the format or styling of a message, or a small test like a subject line variation that can be run on short notice.

But, again, I think it all comes back to knowing what result you’re trying to drive. If you don’t know what behavior you’re trying to drive or what element you’re trying to test (and why), there’s no reason to run a test.

Jake Cohen: I couldn’t agree more.

I think most marketers test too many variables and for the wrong reasons. The most important thing to decide is the key metric you want to improve. Do you want more sales? Clicks? Opens? Responses? Forwards?

Pick an objective. Write down hypotheses as to why you think people AREN’T taking the action your KPI measures, and then brainstorm 2-3 ideas that you think would solve those problems. Those are your tests. For example, if you wanted to improve your clickthrough rate on a particular button or CTA, you might develop hypotheses that assume:

  • People don’t know where the button will take them
  • The button isn’t a color that promotes clicking
  • People aren’t clicking because they aren’t ready to buy and they associate buttons with buying

Now you have three tests to run. With each email you send, you can tweak or add one element and analyze that change’s impact on your objective.
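Once a test like that has run, the analysis can be as simple as comparing click rates between the two variants and checking whether the gap is large enough to trust. The sketch below uses a standard two-proportion z-test in Python; the variant counts are made up purely for illustration.

```python
import math
from statistics import NormalDist

def compare_variants(clicks_a, sends_a, clicks_b, sends_b):
    """Two-proportion z-test on click-through rates for variants A and B."""
    rate_a, rate_b = clicks_a / sends_a, clicks_b / sends_b
    pooled = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (rate_b - rate_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return rate_a, rate_b, p_value

# Hypothetical result: variant B changed the button copy.
rate_a, rate_b, p = compare_variants(clicks_a=120, sends_a=5000, clicks_b=165, sends_b=5000)
print(f"A: {rate_a:.2%}  B: {rate_b:.2%}  p-value: {p:.3f}")
```

A p-value below your chosen threshold (0.05 is a common convention) suggests the difference is unlikely to be noise; if it comes in above that, treat the test as inconclusive rather than as proof that the change did nothing.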

Agata Celmerowski: I think what Brian and Jake said is spot on. But I’ll add this: There’s no reason not to constantly test your emails, as long as you’re approaching it methodically with a clear understanding of what you’re trying to achieve.

If you operate with that mentality, then every test you run will either help you learn something you didn’t know before or it will confirm your hypothesis. Either way, you’ll be able to execute with more confidence and intelligence than you would have without testing.

Keep Learning

Interested in getting more tips and advice like this? Subscribe to our newsletter and get our freshest content on ecommerce marketing and more.

Sign Up for the Klaviyo Newsletter

