Testing: The Email Program’s Equivalent of Flossing
For many marketers, email testing is the equivalent of flossing. When someone asks if you're doing it, you say yes (even if you're not). Why? You know it's beneficial, that it delivers results, and that you should be doing it. However, it takes time, it can be somewhat unpleasant, and you have to remember to do it often (preferably daily), usually at a time when you just want to get everything else done so you can move on to the next thing. Everyone knows the consequences of skipping it, and yet it's one of the easiest regimens to ignore. In fact, 30% of marketers cited testing as a barrier to overcoming their top challenges, according to MarketingSherpa's 2013 Email Marketing Benchmark Survey.
This is especially ironic because the tactics that marketers believe will have the greatest impact on their programs could all benefit from testing. For example, when those same marketers were asked "What new developments will affect your email marketing program in the next 12 months?" almost every answer given would require some element of testing to optimize effectively. The most popular answer, the "pervasiveness of mobile smart phones and tablets," is a great case in point. If you're concerned that your email program may not be optimized for mobile, a logical next step would be to gather data on the types of devices your subscribers use to access and interact with your email. If the percentage of smartphone and tablet usage is high, then testing how optimizing your email creative and landing pages for mobile viewing affects response rates becomes essential. Likewise, if those test results were positive, a further step could be to test responsive design and a mobile version of your website.
Whether a tactic is effective, and whether it can be a game-changer for your program's performance, comes down to testing. Consider social media (the second most popular answer from the survey). This channel has been particularly tricky for marketers when it comes to attributing ROI. However, if a large percentage of your email subscribers are active on social media, then testing the addition of social icons to your email templates, promoting email sign-ups on your social pages, and featuring social content in your email campaigns could all benefit your program's performance.
While it all sounds simple in theory, it can be significantly more complex in practice. What can be most overwhelming is the combination of what to test and how to get started. This shouldn't come as a surprise: email marketers could potentially test almost everything about their programs, from subject lines, to frequency, to time of day, to campaign images, to the shapes and colors of call-to-action buttons, and much more. This seemingly endless array of possibilities, combined with the various methods of testing, and the work of getting executive buy-in to spend time and resources on testing and on making program changes based on the results, can induce analysis paralysis before any data has even been gathered. Adding to the confusion, a recent Econsultancy survey showed that UK marketers find both the simplest testing approach (A/B testing) and one of the most complex (multivariate testing) "highly valuable" for improving conversion rates, rated by 60% and 59% of respondents respectively. So what's an email marketer to do?
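To make the simplest approach concrete: an A/B test of, say, two subject lines boils down to comparing two conversion rates and asking whether the difference is large enough to trust. The sketch below, with purely hypothetical send and open counts, uses a standard two-proportion z-test (via Python's standard library) to estimate the lift and a p-value; most email platforms and analytics tools do an equivalent calculation for you.

```python
import math

def ab_test(conversions_a, sent_a, conversions_b, sent_b):
    """Compare variant B against variant A; return (lift, p_value)."""
    rate_a = conversions_a / sent_a
    rate_b = conversions_b / sent_b
    # Pooled rate under the null hypothesis that both variants perform equally
    pooled = (conversions_a + conversions_b) / (sent_a + sent_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (rate_b - rate_a) / se
    # Two-sided p-value from the normal approximation
    p_value = math.erfc(abs(z) / math.sqrt(2))
    lift = (rate_b - rate_a) / rate_a
    return lift, p_value

# Hypothetical numbers: variant A opened by 1,100 of 10,000 recipients,
# variant B by 1,250 of 10,000.
lift, p = ab_test(1100, 10000, 1250, 10000)
print(f"lift: {lift:.1%}, p-value: {p:.4f}")
```

With these made-up numbers, variant B shows roughly a 13–14% relative lift with a p-value well under 0.05, which is the kind of result you would want to see before rolling a winning variant out to your full list. Multivariate testing extends the same idea to several elements at once, at the cost of needing much larger sample sizes per combination.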
Normally, I would proceed with a list of 5-10 tips; however, in my opinion, there's only one that matters: keep it simple and consistent. What I mean by that is, don't test everything at once. Determine what you test based on the metrics that matter most to you. For example, if you care about getting subscribers to watch a product video, a subject line test will be far less impactful than a creative test to determine how best to present the video within your email. Likewise, only test what you know you can optimize. If you don't have the resources or technology to stagger your email sends to various subscriber segments, don't test time of day or day of week.
Build support for your testing efforts by socializing results and promoting the outcomes of optimization efforts based on test findings. Create a schedule that allows you to test often (if not every campaign, then every week) and incorporates testing steps into standard campaign production processes. This ensures that testing becomes an ingrained practice (not a temporary or ad-hoc tactic) essential to the health of your email program.
The world of email is changing all the time and the inbox is an increasingly dynamic place. Like it or not, testing is the only way to take the pulse of your subscribers and gain the information you need to adjust your approach for continued engagement and ROI. If you need inspiration, I like to check out the featured tests on Which Test Won. My bet is that more often than not, you’ll guess wrong, and that’s perhaps the strongest case for why testing is critical to email marketing success.