How to Evaluate & Compare Anti-Spam Products

by J.D. Falk
Director of Product Strategy, Receiver Services

Hey, you! Yeah, you with the ethernet cable. Get in here and look at all this spam. What do you mean you can’t do anything?! Didn’t we buy one of those anti-spam thingers in 2002? Oh fine, I’ll approve an upgrade, but you can only choose one thing — make sure it’s the best. Otherwise, you’re fired. What was your name again?

Victory! Sort of. The pointy-haired boss who can never remember your name will finally let you replace the crufty old filtering appliance he bought from a failed dot com at an auction, way back when all the dot coms were failing — including the one who built the appliance. It’s amazing the thing even powers up anymore.

But with a limited budget and only a short time before he forgets, how can you make sure that whatever you buy will solve the problem — now, and in the future? Spam changes so quickly, and the most effective techniques now involve real-time queries of external blacklists, whitelists, and reputation scores — so you can’t just throw a bunch of old spam at a system to see what happens.
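To see why a pile of old spam falls short, consider what one of those real-time queries actually looks like. Here's a minimal sketch of a DNSBL lookup in Python; zen.spamhaus.org is used purely as a familiar example, and a production filter would add caching, timeouts, and proper DNS handling.

    import socket

    def dnsbl_listed(ip: str, zone: str = "zen.spamhaus.org") -> bool:
        """Query a DNS blacklist: reverse the IP's octets and look them
        up under the list's zone, so 127.0.0.2 becomes
        2.0.0.127.zen.spamhaus.org. Any A record means "listed"."""
        query = ".".join(reversed(ip.split("."))) + "." + zone
        try:
            socket.gethostbyname(query)
            return True
        except socket.gaierror:  # NXDOMAIN: the IP isn't on the list
            return False

    # 127.0.0.2 is the conventional "always listed" test entry.
    print(dnsbl_listed("127.0.0.2"))

The answer depends on what the list says at the moment you ask, which is exactly what a static corpus of last year's spam can't exercise.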

This is a problem the smart folks in MAAWG face as well; ISPs have budgets, after all. So they've written up what they call the "Email Anti-Abuse Product Evaluation Best Current Practices," which is a long-winded way of saying "here's what you do."

The document is basically a checklist. There are specific questions to consider regarding both functional and business requirements: Will it run on your hardware? Is there support? Does the vendor conform to your privacy and regulatory framework? And the document offers advice on useful metrics; obviously there's the catch rate, but also accuracy, hardware utilization, and more.
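As a quick illustration of what those headline metrics boil down to (the function and field names here are mine, not the document's): catch rate is the fraction of spam a filter blocks, and the false-positive rate is the fraction of legitimate mail it wrongly blocks.

    def evaluate(results):
        """results: list of (is_spam, was_blocked) pairs from a test run.
        Returns catch rate (spam correctly blocked) and false-positive
        rate (legitimate mail wrongly blocked)."""
        spam = [blocked for is_spam, blocked in results if is_spam]
        ham = [blocked for is_spam, blocked in results if not is_spam]
        catch_rate = sum(spam) / len(spam)
        fp_rate = sum(ham) / len(ham)
        return catch_rate, fp_rate

    # Example: 3 spam messages (2 caught), 2 legitimate (1 wrongly blocked).
    runs = [(True, True), (True, True), (True, False),
            (False, False), (False, True)]
    print(evaluate(runs))  # (0.666..., 0.5)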

Then there’s a ton of useful information about performing the analysis and comparison, including some not-so-obvious ideas on how to fit multiple filtering engines into or alongside your production mail system for evaluation — while reducing risk to users.
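The document's ideas are its own, but one common pattern along these lines is "shadow" evaluation: the incumbent filter keeps deciding delivery, while candidate engines only log what they would have done. A rough Python sketch, with the engine interfaces invented for illustration:

    import logging

    def shadow_evaluate(message, production_filter, candidates):
        """Let the incumbent filter alone decide delivery, while each
        candidate engine merely logs the verdict it would have given.
        Comparison data accumulates; users never see a difference."""
        verdict = production_filter(message)  # the only verdict that
                                              # affects real delivery
        for name, engine in candidates.items():
            try:
                logging.info("candidate %s: %s", name, engine(message))
            except Exception as exc:
                # a misbehaving candidate must never break delivery
                logging.warning("candidate %s failed: %s", name, exc)
        return verdict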

Finally, the document includes the results of a survey of MAAWG members regarding which methods they’d prefer to use, versus which they actually get to use. Not surprisingly, half would prefer to do their testing safely in a lab — but far fewer actually get to do so.

Some of MAAWG’s documents cover topics which can have a real, visible, immediate effect on the email ecosystem. The effects of this particular Best Practices document will be more subtle: ISPs, enterprises, and others who use these techniques will choose anti-spam technologies based on testing and data, rather than promises and innuendo. So they’ll have better protection, which will slowly but surely make life more difficult for the spammers — and for any technology vendors whose products don’t live up to their hype.

Best of all, end users — even the annoying ones — will have more effective protection against spam and other abuse, while still receiving email that’s important to them. Then you can sit back, put your feet up, and — oh. Time to replace the printer, too, eh? Can’t help you there.