Enhancing Your Email Testing Program with Competitive Insights

Last week, I had the pleasure of participating in the DMA’s Post-Conference Email Marketing Certification course. My session was on email testing, and one section of the training focused on the steps necessary for setting up a culture of testing in an organization. While there are many things that need to be done to ensure your email testing program is set up for success (getting executive buy-in and making it a team effort, for example), gathering data before launching a testing program is also a necessary step to get the program off on the right foot.

While launching a testing program blindly can work, it is bound to work better with some advance planning. To set your program up for success, gathering the following data helps ensure you start by testing the elements that will have the biggest impact on your program.

Data from your analytics platforms: What do customers purchase most? Where do they browse but not purchase? Which emails get opened and clicked the most? Which content drives the most conversions?

Data from your social media sites: What type of content resonates best with your audience? What products create the most buzz? What content generates the most positive (or negative) sentiment?

Search keywords: What are the most frequently used search keywords related to your brand? Does your audience search on different words than you usually use in your email programs?

Your customer database fields: What data elements are available for testing? What percentage of the potentially useful data fields actually have values? What additional data would be most useful for testing purposes (gender? location?), and how can you go about getting those additional data points? (A quick way to audit field completeness is sketched after this list.)

Institutional knowledge: What do you and your teammates know about which campaigns have been successful in the past, and why? Is there certain content, offers, or creative that seems to generate higher response? If something hasn’t worked, why not? Even if you haven’t launched a formal testing program, you still have a great deal of knowledge that can help establish a successful baseline and starting point.

Your competition: What type of content does your competition send? What type of testing do they employ? Have they sent something unique that you should be testing? What is working for them and what is not?
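On the customer database point above, a quick field-completeness audit is often the fastest way to see which fields you can realistically segment and test on today. The sketch below is a minimal example assuming a CSV export of your customer database; the file name and the 20% threshold are placeholders I've chosen for illustration, not part of any particular platform.

```python
import pandas as pd

# Hypothetical export of your customer database; the file and column names
# are placeholders, not a real schema.
customers = pd.read_csv("customer_export.csv")

# Percentage of records that have a value in each potentially useful field
fill_rates = customers.notna().mean().mul(100).round(1).sort_values(ascending=False)
print(fill_rates)

# Fields below an arbitrary 20% fill rate are probably too sparse to test
# against until you collect more data (e.g. gender or location gathered
# through a preference center or at signup).
sparse_fields = fill_rates[fill_rates < 20]
print("Too sparse to segment or test on yet:", list(sparse_fields.index))
```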

I'd like to expand on that last point, your competition, a bit more. With test accounts alone, it can be difficult to get a handle on the different variations of content your competition is sending and how those variations are performing. With Return Path’s new Inbox Insight product, it is easier to get a complete picture of the content your competition is sending and the engagement it is generating.

For example, Macy’s recently sent an email campaign that promoted 20% off as well as free shipping on purchases over $75. By looking at the campaigns in Inbox Insight, we were able not only to see the different subject line tactics that Macy’s employed and how they performed, but also to see the different creative treatments in one centralized location.

| Subject Line | Benchmark | Read Rate | Delete Unread | ISP-Marked Spam |
| --- | --- | --- | --- | --- |
| Just for you, _ Extra 20% off + Free Shipping at $75! | Above Average | 17.61% | 15.16% | 0.51% |
| Just for you: Extra 20% off + Free Shipping at $75! | Average | 12.60% | 15.06% | 0.58% |

In the example above, we can see that Macy’s employed some personalization in the subject line (indicated by the underscore (“_”) in the first subject line). From the data, we can see that the personalized campaign generated a roughly 40% greater read rate than the subject line without personalization. Granted, the subscribers with a first name on file in Macy’s database may be Macy’s better customers, but this type of insight not only helps you understand how Macy’s is segmenting its audience and what is working for them, it also suggests that testing personalization in the subject line is likely a worthwhile test.
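As a quick check on that figure, the relative lift can be computed directly from the two read rates in the table above. The snippet below is a minimal sketch; the variable names are my own and not anything produced by Inbox Insight.

```python
# Read rates from the two Macy's subject lines above, expressed as decimals
personalized_read_rate = 0.1761        # "Just for you, _ Extra 20% off ..."
non_personalized_read_rate = 0.1260    # "Just for you: Extra 20% off ..."

# Relative lift of the personalized subject line over the non-personalized one
lift = (personalized_read_rate - non_personalized_read_rate) / non_personalized_read_rate
print(f"Relative read-rate lift: {lift:.1%}")  # ~39.8%, i.e. roughly 40% greater
```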

While the creative for both of the subject lines above was the same (other than the use of the first name in the pre-header), a similar creative was sent on the same day with a subject line focused solely on the 20% off offer. This campaign had an “average” benchmark and a read rate of 15.79%.

But by diving into the creative, we are able to uncover a bit more about the likely segmentation used for this campaign. This campaign:

  • Looks to be targeted to an international audience
  • Did not include the free shipping offer (which is likely because the email was sent to an international audience)
  • Included fewer content blocks below the main content (the content blocks included in the international version focused on international shipping and an international savings card offer)
  • Removed the personal message from the president as well as the request for feedback at a specific email address

With access to these additional competitive data points, you can determine what data is valuable to collect, what is working for your competition, and how to integrate and test different elements within your own campaigns. Combined with your other data, these insights can set your email testing program up for success and help make your email program best in class.