6 Email A/B Testing Rules You Need to Know

I’ve been managing email marketing in some capacity for eight years. I’ve learned a lot of lessons along the way — and made a lot of mistakes.

Like that time I almost sent an email about the NBA, but used a photo of a college basketball team. Oops, hoops.


(Thank you to my sportsball-fanatic coworkers for catching the mistake before I hit “send.”)

Luckily, there’s no shortage of email best practice content out in the wild to help guide you on your journey to email marketing success. And hopefully, prevent you from making rookie mistakes.

For starters, we wrote a guide to help you learn Email Marketing in a Mobile-First World. And this summer, we released a Mobile Marketing Trends report (Not Your Grandma’s Email) that explains how to use email as part of integrated campaigns to drive 3x greater engagement.

But guess which single best practice will make your email campaigns run a whole lot smoother?

Email A/B testing.

When it comes to email, you can A/B test a whole lot.

So grab your coffee, get comfy, and read on. Below, we take a look at everything you should A/B test in your email marketing campaigns, from desktop to app.
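
(One quick mechanical note before we dive in. Every test below relies on the same basic move: splitting your audience so each recipient consistently lands in variant A or variant B. Here is a minimal, platform-agnostic Python sketch of that split. Your email tool almost certainly handles this for you, so treat it as an illustration, not an implementation.)

  import hashlib

  def assign_variant(user_id: str, experiment: str = "subject-line-test") -> str:
      """Deterministically bucket a user into variant A or B for a given experiment."""
      digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
      return "A" if int(digest, 16) % 2 == 0 else "B"

  print(assign_variant("user-42"))  # same user + same experiment = same variant every time

Because the bucketing is deterministic, a user never flips between variants mid-test, and you can compare open and click rates between the two groups with confidence.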

1. Subject Line

You had to know we’d start here.

We’ve already written an in-depth post on email subject line A/B testing. Go ahead and give it a read now if you haven’t.

Your email subject line is your first opportunity to get super creative. Once you convince readers to open your email, the door is open for you to encourage them to click through and come back to your app to take further action.

Personally, I have found in my last few email marketing sends that using a humble em-dash leads to higher opens — it helps focus attention on an action item and stirs up a little drama. But you won’t know what works for you and your readers until you A/B test.

Keep the following tips in mind, and you might end up on a list of email subject line hall-of-famers.

Length

Experimenting with short versus long copy can yield surprising results.

Check out The Hustle, a newsletter that does a great job of experimenting with email subject line length. Here’s a subject line from them that I received a few months ago.

[Image: subject line length]

Why did I open this email that day? Because it’s so dead simple. It’s one word! And in the realm of email, shorter sometimes means more attention-grabbing. There’s a hint of mystery about what’s inside, and you’re also guaranteed to see the full title on mobile.

Personalization

Personalization makes for an eye-catching subject line. Here’s a subject line I received from Return Path back in May.

[Image: subject line personalization]

Out of all the emails in my inbox, this one jumped out at me. It feels quite special to see my name in an inbox of generic blasts.

I’m not the only one who reacts this way to personalized messages. Data shows that personalized content leads to 4x higher open rates.

[Image: email personalized messaging]

To turn heads with your next send, try adding a user’s first name — or even their company name — to the subject line. Let us know how it goes.

Power Words

Power Words are a copywriter’s best friend. These provocative words create urgency, allude to exclusivity, incite emotion, or promise value.

[Image: email A/B testing copy]

You can use Power Words in your copy to ask a question, tease a time-sensitive deal, or address customer pain points.

Emojis

Everyone loves emojis. We send them in text messages and use them prolifically on social media. The library of available emojis keeps growing, currently at 2,823. And emojis are even recognized by Dictionary.com.

So it makes sense that adding these friendly characters can boost your email open rates by up to 66 percent.

[Image: emoji use open rate]

Why do emojis boost opens? When a person sees an emoji, their brain lights up in the same way as when they see a human face. The brain recognizes emojis as nonverbal information and processes them as emotion. Naturally, more people will want to tap on something that delights them.

No wonder positive emojis show up so often in marketers’ repertoires.

Listing the CTA

Try letting your users know exactly what they’re getting up front by being specific in your subject lines. This may lead to longer copy, but, hey, that’s something you wanted to A/B test anyway, right?

Here’s a set of subject lines we tested in a webinar invite.

In that test, the two longer subject lines outperformed the shortest, simplest option of the bunch.

Even better, we’ve seen that our click-to-open rate is much higher than normal when we take this approach.
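
(If click-to-open is a new metric for you, here is how it differs from a plain open rate, shown in a tiny Python sketch with made-up numbers, purely for illustration.)

  delivered, opens, clicks = 10_000, 2_300, 460  # hypothetical campaign totals

  open_rate = opens / delivered      # how well the subject line earned attention
  click_rate = clicks / delivered    # overall click-through rate
  click_to_open = clicks / opens     # how well the body and CTA converted the opens you got

  print(f"open rate {open_rate:.1%}, CTR {click_rate:.1%}, CTOR {click_to_open:.1%}")
  # open rate 23.0%, CTR 4.6%, CTOR 20.0%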

2. Audience

Don’t overlook who is getting your email.

After all, the batch-and-blast approach has its limits. Perhaps you want to alert everyone about a 70 percent off sitewide sale, but in many cases, only certain audience segments are interested in your message.

[Image: email segment A/B testing]

At Leanplum, we often segment our emails by customers versus prospects, and unengaged versus engaged. And when we have something that’s only relevant to a region — perhaps an invite for an event in London — we keep our email sends focused on that particular locale.

No matter your business, segmentation can ensure that your sends go to the most relevant audience. And there’s a whole slew of attributes to draw from when segmenting your email audience.

[Image: email A/B testing audience]

Often, this means you can trigger emails based on user demographics (say, women ages 18–35), location, behavior, or preferences. For example, if you have a travel app, you could let users who normally fly out of JFK know when there’s a winter sale on flights from New York to the Bahamas. Or, you could let music lovers who previously listened to John Legend know when a similar artist releases a new track.

Whatever the use case, data shows that messages triggered by user behavior can see up to 800 percent higher opens. Relevant sends for the win.

[Image: behavior-based triggers email]
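
To make the idea of a behavior-based trigger concrete, here is a rough sketch. The User record and send_email helper are invented for illustration; this is not any particular platform’s API.

  from dataclasses import dataclass

  @dataclass
  class User:
      email: str
      home_airport: str

  def send_email(to: str, subject: str) -> None:
      # Stand-in for your email provider's send call.
      print(f"to={to!r} subject={subject!r}")

  def announce_winter_sale(origin: str, destination: str, users: list) -> None:
      """Alert only the users whose past behavior makes this sale relevant."""
      for user in users:
          if user.home_airport == origin:
              send_email(user.email, f"Winter sale: {origin} to {destination}")

  users = [User("a@example.com", "JFK"), User("b@example.com", "SFO")]
  announce_winter_sale("JFK", "Nassau", users)  # only the JFK flyer gets the alert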

3. Timing & Frequency

The time you send an email can be just as pivotal as the audience it goes to.

Back when I did PR, I noticed journalists were more likely to respond to my pitches if my emails landed in their inbox at 8 a.m. local time on Tuesdays and Wednesdays. That’s when my audience was most engaged.

Your audience has its own preferences. And you need to A/B test your timing to find out what they are.

Try sending emails:

  • At the start of the business day
  • In the middle of the week
  • On the weekend
  • At a specific time, in every user’s time zone
  • Using an optimal-time algorithm that analyzes when each individual user is most likely to open, based on past engagement (see the sketch right after this list)
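
Here is a rough sketch of what that last option, an optimal-time algorithm, boils down to, assuming you already have each user’s past open timestamps converted to their local time zone. Real algorithms are more sophisticated; this only shows the core heuristic.

  from collections import Counter
  from datetime import datetime

  def best_send_hour(open_times, default_hour=8):
      """Pick the local hour at which this user has historically opened most often."""
      counts = Counter(ts.hour for ts in open_times)
      if not counts:
          return default_hour  # no history yet, so fall back to a sensible default
      return counts.most_common(1)[0][0]

  history = [datetime(2018, 5, d, 8, 15) for d in range(1, 6)] + [datetime(2018, 5, 7, 20, 5)]
  print(best_send_hour(history))  # -> 8, so schedule this user's send for 8 a.m. local time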

Once you have the timing down, don’t forget about the frequency. Everyone has a different threshold.

In our email report, we found that the average open rate for apps that send one email a week is 13.4%. When apps start sending two emails a week, that number plummets to 1.67%. But then, for every additional email delivered, the average open rate increases.

[Image: email frequency open rate stats]

But that doesn’t mean everything is roses. We also found that the apps sending emails at a higher frequency weren’t blasting everyone — they were triggering emails based on user behavior, sent to a small, self-selected group of users.

In other words, it’s best to send the more generic sends less frequently, maybe once every week or two. If you send highly personalized emails, experiment with how often you can send them before your unsubscribe rate rises. Then you’ll know the ideal timing and frequency for your audience.

4. Creative

The email creative is basically anything within the body of the email. This gives you a lot of freedom to play around.

Try optimizing the…

Sender

Experiment with sending from a company (Leanplum) versus an individual (Brittany Fleit), like HubSpot did here.

[Image: personal vs. company sender A/B testing]

This is a great approach to test in blog digests and invites to webinars and events. A personal approach could have a bigger impact for you. But on the flipside, name recognition is everything — so if a person isn’t well known, but your company is, it’s worth seeing how this pans out.

Salutation

Similar to the sender, in the body of the email, try greeting your users personally — or not — with something like, “Hi, Brittany” tested against “Hi, there!”

Many email platforms offer a token that automatically inserts a person’s first name, which makes this a cinch. Just make sure you set a default value, so the greeting doesn’t show up as “Hi ,” with an awkward space and comma.
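
In code terms, the fallback logic looks something like this (a minimal sketch, not any specific platform’s token syntax):

  def render_greeting(user: dict, default: str = "there") -> str:
      """Insert the first name if we have one; otherwise fall back to a friendly default."""
      first_name = (user.get("first_name") or "").strip()
      return f"Hi, {first_name or default}!"

  print(render_greeting({"first_name": "Brittany"}))  # -> Hi, Brittany!
  print(render_greeting({}))                          # -> Hi, there!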

Copy

As a writer, one of my favorite areas to optimize is the copy. But as a data-driven marketer, I’m always surprised by what I find.

Write a short version (promoting one outstanding thing) versus a long version (a slew of things that may interest a broader group) and see which performs best.

There are endless opportunities to express your brand via email. Make them count.

Images

How big of a splash do images make? This test can be pretty fun.

Compare multiple images against each other. Or a gif or video against a static image. Or (my favorite) see if one image performs better than no images. Disguising a marketing email as a plain-text personal email is a great tactic.

Tone

Discover what causes a bigger sensation — keeping it professional versus playful, stoking fear versus delight.

PCH quickly found that the closer a user is to churning, the more urgent the tone needed to reactivate them. By experimenting with tone, the brand increased its engagement by five percent.

[Image: engagement testing]

PCH’s other tests have played with personalization, emojis, length, and more. Each campaign proved to be a huge success that deepened user engagement, enhanced user experience, and added new learnings about user behaviors and churn.

5. CTAs

Your CTAs — also known as your calls-to-action — are the heart of your email campaigns. This is what you’ve been leading your users toward all along. You convinced them to open and now you’re convincing them to click.

So approach CTAs in your email marketing with the care and attention they deserve.

A few ideas for what to A/B test when it comes to your CTAs:

Buttons

You can test a lot when it comes to your buttons. There are endless blog posts about A/B testing button color (currently about 245 million search results). And copy is another quick-and-easy test: “download now” vs. “get it,” for example.

But a more interesting A/B test is button placement. Tesco, the second largest retailer in the world, wanted to improve the number of conversions coming from the product details page within its app. Due to the number of feature details customers want to see, and the legally required information, the static “add to cart” button was pushed far down the page. As a result, it was not clear to customers how to purchase items.

[Image: A/B testing button]

So Tesco tested two “add to cart” buttons: a static button, plus a scrolling button that appeared once a user began to move down the page. As a result, Tesco saw a 3.3 percent increase in items added to cart from the product details page.

[Image: retail app cart conversions]

Amount

You can A/B test both the number of CTA links and the number of CTA variables.

Here’s what I mean.

You can place CTA links in several spots throughout the email. Of course, the button is an obvious choice. But you can also add links in text, in images, even in your header logo.

In terms of the number of CTA variables, you can experiment with asking someone to do multiple things. For example, a retail app may ask you to shop a sale and watch a video of a new product. A travel app may ask you to book a hotel and check out deals on tours in that city. A media app may ask you to upgrade your subscription and discover all the great new shows you can watch ad-free.

The options are endless, but determining whether one or two (or more!) variables have a greater impact is something you shouldn’t overlook.

6. Channels

We discussed A/B testing everything within your email — but is just A/B testing that email enough? What are you doing on other channels?

Personally, I’m email-driven. I always strive for inbox zero. But not everyone is as obsessive as I am. I know people with thousands of unread emails in their inbox.

Shudder.

My point is, if you want to truly reach your full audience, you need to A/B test that email — and A/B test sending a larger campaign. Think of it as email plus.

Email plus mobile push notifications. Email plus web push notifications. Email plus in-app messages. Email plus App Inbox. Email plus the in-app experience.

… You get the idea.

Data shows that coordinating messages across channels increases engagement 3x. Staying top of mind and reaching your users where they prefer to connect is an essential multi-channel messaging strategy for customer engagement.

[Image: cross-channel messaging stats]

A/B Test Your Email Campaigns for Better Mobile Engagement

There you have it — everything you should A/B test to create better, more captivating emails.

  • Subject line
  • Audience
  • Timing & frequency
  • Creative
  • CTAs
  • Channels

Now that you’re equipped with this email best practice, go forth and conquer. And let us know what worked best for you.

Want to read more? Download our guide, Not Your Grandma’s Email: The Transformation of Email in a Mobile World, today.