Rethinking your A/B Tests

Posted by Mary Sohn

Let’s say you ran an A/B test. The email with the higher open rate wins, right?

Maybe.

Unfortunately, most Email Service Providers (ESPs) that offer automated A/B testing in their tools will automatically pick the campaign with the best open rate, which means you may be missing more critical metrics (revenue, conversion rates, clicks, etc.).

To illustrate, here are real A/B test results I reviewed for a client who tested two subject lines with the same email content:

[Image: A/B test results for Subject Lines A and B]

When I saw these results, I noticed that Subject Line B had a higher click-through rate (CTR) and a lower unsubscribe rate. But a higher CTR alone doesn’t guarantee that Subject Line B performed better, because its open rate was lower, and total clicks depend on both how many people open and how many of those openers click through. Without access to conversion or revenue data, I re-analyzed the A/B test results with the objective of finding the campaign that drove more clicks through to the website.

To do this, I took the sending volumes for both campaigns and “reset” them to reflect results for sending to exactly 100,000 subscribers in each test group:

Client Supplied Data:

[Image: raw campaign metrics for each subject line]

“Reset” Data:

[Image: the same metrics normalized to 100,000 sends per test group]
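The client’s actual figures aren’t reproduced here, but the “reset” itself is simple arithmetic: divide each raw count by its group’s send volume, then scale to a common base of 100,000 sends. Here is a minimal Python sketch of that normalization; all the counts are invented placeholders, not the client’s data:

```python
# Normalize raw A/B counts to a common send volume so the two test
# groups can be compared head to head. All counts below are
# illustrative placeholders, not the client's actual results.

BASE = 100_000  # common send volume to "reset" both groups to

def reset_metrics(sends, opens, clicks, unsubs, base=BASE):
    """Scale raw counts to what they would be at `base` sends."""
    factor = base / sends
    return {
        "sends": base,
        "opens": round(opens * factor),
        "clicks": round(clicks * factor),
        "unsubscribes": round(unsubs * factor),
    }

# Hypothetical raw results: A was sent to more subscribers, so its
# raw open and click counts look bigger than they really are.
a = reset_metrics(sends=120_000, opens=26_400, clicks=1_580, unsubs=140)
b = reset_metrics(sends=80_000, opens=16_000, clicks=1_440, unsubs=60)

print("A per 100k sends:", a)  # more opens...
print("B per 100k sends:", b)  # ...but more clicks, fewer unsubscribes

# For a click-focused goal, the winner is the line with more clicks
# per 100k sends, not the one with the higher open rate.
print("Click winner:", "A" if a["clicks"] > b["clicks"] else "B")
```

With these made-up numbers, A opens better (22% vs. 20%) yet B still delivers more clicks per 100,000 sends, which is exactly the pattern the client’s data showed.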

Based on these normalized results, it becomes clear that Subject Line B should have been the test winner. Even though Subject Line A generated more opens, Subject Line B was better at preparing subscribers for the email content, resulting in more click-throughs and fewer unsubscribes.

When running A/B tests, make sure you measure the results against your email goals. Open rates are important, but should only determine the results of an A/B test when the goal of your email program is brand awareness or re-engagement.

For more information on A/B tests, here are some helpful links:

  • “All About A/B Testing” by Return Path. In this ebook, we’ll give you the tools to become an A/B testing expert, including how to set up and run your A/B test, how to analyze your results, A/B testing best practices, and 50 testing ideas to get you started.
  • “How to Set Up and Run an A/B Test” by Lauren McCombs.
  • Our free Sample Size Calculator to help determine what percentage of your list should receive the “test” version of your email marketing experiments (a rough sketch of that math follows below).
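To give a sense of what a sample size calculator is doing under the hood, here is a generic two-proportion power calculation in Python. This is not Return Path’s actual tool, and the baseline click rate, target lift, and list size below are all assumptions chosen for illustration:

```python
import math
from statistics import NormalDist  # Python 3.8+

def sample_size_per_variant(p1, p2, alpha=0.05, power=0.80):
    """Subscribers needed in EACH variant to reliably detect a
    change in a rate from p1 to p2 (two-sided two-proportion test)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # significance threshold
    z_beta = z.inv_cdf(power)           # desired statistical power
    p_bar = (p1 + p2) / 2
    top = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(top / (p1 - p2) ** 2)

# Assumed example: detect a lift in click rate from 2.0% to 2.5%.
n = sample_size_per_variant(0.020, 0.025)
print(f"~{n:,} subscribers per variant")  # roughly 13,800

# On a hypothetical 200,000-subscriber list, that is the share each
# test cell would need to receive.
print(f"~{n / 200_000:.1%} of the list per variant")
```

Note how quickly the numbers grow: smaller expected lifts or rarer events (like conversions) push the required sample size up sharply, which is why click- and revenue-based tests typically need far more volume than open-rate tests.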


About Mary Sohn

Mary brings years of advertising experience to email and views deliverability data from a marketer's perspective. In her spare time, you'll find Mary eating through Canada's best diners, drive-ins, and dives (without the justification of a reality TV show). Follow Mary on Twitter @juenology for a haphazard glimpse into her life as a hungry-for-food email specialist.
