A View of The Email Universe: Return Path’s Q2 Reputation Benchmark Report

Posted by George Bilbrey 

When we set out to build a reputation data network, we had a strong sense that the volume of email being sent to top receivers was staggering. But sensing is one thing. Having empirical data is another.

Which is why I’m so excited about the Return Path Q2 Reputation Benchmark Report. Now we have actual email performance data that tells us what the email traffic really looks like.

You can read the report yourself here.

Here’s my high level take on what we found:

  1. Most of the servers sending email shouldn’t be. Only 20% of the IPs we studied were legitimate, well-configured, static email servers. It’s important to point out that this doesn’t speak at all to the quality of the messages from those servers – lots of horrible spammers know how to configure a mail server. The other 80% of the IPs are either identifiably bad or unidentifiable, and probably bad. No wonder ISPs and other large receivers feel besieged.
  2. Servers with good reputations get their messages delivered. Servers with bad reputations don’t. This might seem obvious to those of you reading this blog, and of course it is. But again, having the empirical data is gratifying. We found a direct linear relationship between an IP’s Sender Score and that IP’s average delivered rate. Of course, I have to point out that a low Sender Score does not itself cause delivery problems, a common misconception. The reputation issues that give an IP a low Sender Score are the same ones that get that IP blocked from inboxes.
  3. Specific best practices have a direct effect on an IP’s delivery rates. We found a 20 point difference in delivery rates between IPs with just one spam trap hit and IPs with none. For servers with unknown user rates above 9%, the difference was 23 points versus servers with cleaner data.
  4. Blacklists don’t cause blocking; they predict it. We found that servers listed on any one of nine public blacklists (the lists studied are noted in the report) had an average delivery rate of 35%, versus 58% for servers not on these lists. But the reason isn’t that receivers use those blacklists directly; in fact, some of them see very little use at all. Much as with the Sender Score, the behaviors that land a server on a blacklist also cause that server to be blocked by many receivers.
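For the curious, gaps like the ones above are just the difference in average delivered rate between two groups of IPs. Here’s a minimal sketch of that calculation; the field names and sample numbers are invented for illustration and are not from the report:

```python
# Hypothetical sketch: computing a delivery-rate gap between IPs with
# and without spam trap hits. All data below is made up for illustration.

def delivery_rate(records):
    """Average delivered rate (in percent) across a set of IP records."""
    delivered = sum(r["delivered"] for r in records)
    sent = sum(r["sent"] for r in records)
    return 100.0 * delivered / sent

# Toy per-IP aggregates: messages sent, messages reaching the inbox,
# and whether the IP hit at least one spam trap during the period.
ips = [
    {"sent": 1000, "delivered": 930,  "trap_hit": False},
    {"sent": 2000, "delivered": 1820, "trap_hit": False},
    {"sent": 1500, "delivered": 1080, "trap_hit": True},
    {"sent": 500,  "delivered": 340,  "trap_hit": True},
]

clean = [r for r in ips if not r["trap_hit"]]
trapped = [r for r in ips if r["trap_hit"]]

gap = delivery_rate(clean) - delivery_rate(trapped)
print(f"clean: {delivery_rate(clean):.1f}%  "
      f"trapped: {delivery_rate(trapped):.1f}%  gap: {gap:.1f} points")
```

The same grouping works for any reputation attribute: bucket IPs by Sender Score band, blacklist status, or unknown user rate, then compare the groups’ average delivered rates.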

Read the report now. And if you haven’t already gotten your Sender Score, you should go to our reputation portal at www.senderscore.org. You can get your score for free, or you can register with us (still free!) and get a more detailed report on your reputation score.

About George Bilbrey

George Bilbrey is the founder of the industry’s first deliverability service provider, Assurance Systems, which merged with Return Path in 2003. He is a recognized expert on the subjects of email reputation and deliverability and is active in many industry organizations, including the Messaging Anti-Abuse Working Group (MAAWG) and the Online Trust Alliance (OTA). In his role as president of Return Path George is the driving force behind the ongoing innovation of our products and services. Prior to Return Path, George served as Director of Product Management at Worldprints.com and as a partner in the telecommunications group at Mercer Management Consulting. He holds a B.A. in economics from Duke University, and an MBA from the Kenan-Flagler School of Business, University of North Carolina.