I've got a small side-project going here code-named "Zosimos", and I thought I'd share some of the things I've learned so far from the time I've spent discussing, researching, and reaching out to people on the topic... and it centers on numbers, or rather, their meaning.
If you've ever heard the phrase Mark Twain popularized, "There are three kinds of lies: lies, damned lies, and statistics," you'll appreciate how difficult it is in our profession to make meaningful sense of the incredible amount of data out there.
Without rehashing an old problem, there are entire product lines, workshops, conferences, and corporate roles dedicated to helping organizations sort out the vast quantities of comprehensible, collectible data. Numbers are all around us... from the number of packets that bounced off your firewall(s) today, to how many viruses your anti-malware tool caught and stopped at the mail gateway, to the number of SQL injection defects your application security team found in its latest scan. What do these numbers mean?
The answer lies in context. This shouldn't shock you. But what should shock you is that after all this time and energy spent talking about metrics, KPIs and "doing it right"... many are still doing it woefully wrong.
The point of context starts with audience. Who are you collecting and refining this mountain of data for? Typically we're pulling information together for one of three tiers of audience.
The audience matters when you're collecting data and trying to make sense of it. In fact, the audience matters so much that sometimes you can't even 'refine' your way from one tier to another without going out and collecting a whole new set of data.
As an example, a number of years ago, when I had to report the status of my 'security organization' to the CIO, I had much less to work with than I do today, but the challenge was no less great.
Since I wasn't quite sure what to deliver - mainly due to inexperience, which was quickly rectified - I would bring in pages and pages of spreadsheet data. The metric of "threats neutralized" included viruses, worms, dropped packets, blocked IPS 'attacks', and other assorted things I could pull together. When I needed to "make it look good" I would use whatever was at my disposal, including line counts from the Apache log files listing HTTP queries for malicious resources, or attempts to break into the app. Yup, each one counted when I needed to have a good month... but what in the world did any of that mean?
The answer to that last question is simple - not a **bleep** thing.
Today I find many of the security-focused managers out there are still collecting random bits of security data trying to make their metrics scorecard look good... stopping viruses, ping sweeps, and IIS4-based attacks may make it seem like you're doing your job, but I suspect these metrics are meaningless in the context of the broader business.
So what do your scorecards look like when you report out to senior management?
What about what your team reports to you?
Do you feel good about the insight you're providing those consuming your reports, or are you just trying to make yourself look like the super-hero?
The subtle difference between metrics and insight is that insight is meaningful to more than just security; it carries context that almost anyone in the organization can grasp. Insight takes no interpretation, extrapolation, or additional manipulation. Insight tells us things like: it's taking too many man-hours to close the loop on security defects in our application development lifecycle. Insight tells us things like: security's impact on change management has meant 50% fewer unplanned outages/incidents, which comes to a savings of 100 consultant man-hours per month... and then translates that into something the organization really cares about - dollar figures.
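To make that translation concrete, here's a minimal sketch of the arithmetic: take a drop in unplanned incidents and express it as hours and dollars. The per-incident cleanup effort and the consultant rate below are illustrative assumptions of mine, not figures from any real engagement.

```python
# Hypothetical sketch: turning a raw security metric into a business figure.
# HOURS_PER_INCIDENT and CONSULTANT_RATE are assumed values for illustration.

HOURS_PER_INCIDENT = 10   # assumed cleanup effort per unplanned outage
CONSULTANT_RATE = 150     # assumed consultant cost, dollars per hour

def monthly_savings(incidents_before: int, incidents_after: int) -> dict:
    """Translate a drop in unplanned incidents into man-hours and dollars."""
    avoided = incidents_before - incidents_after
    hours_saved = avoided * HOURS_PER_INCIDENT
    return {
        "incidents_avoided": avoided,
        "hours_saved": hours_saved,
        "dollars_saved": hours_saved * CONSULTANT_RATE,
    }

# Example: a 50% reduction (20 incidents down to 10) frees 100 man-hours.
print(monthly_savings(20, 10))
# {'incidents_avoided': 10, 'hours_saved': 100, 'dollars_saved': 15000}
```

The model is deliberately crude; the point is that the last line is something a board room can act on, while the raw incident count is not.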
When I've worked hard to provide one of my customers with insight, I feel good knowing that no matter what the end result is, positive or negative, they'll understand what direction they're headed in. That's it, pure and simple. Do your thing, and my version of "hacking" will tell you whether you're going in the right direction or not. Sometimes we need complex mathematical models before we ultra-simplify, and sometimes it's many degrees simpler.
When you're done reading this post I want you to answer one question, to yourself, and be honest. "Are the things you're reporting supporting your case for security being beneficial to the business?" And if you answer yes, "Why?"
I'm interested in talking to you if you fit into any one of the categories below... because there is a cause that's bigger than what most of us in security are enamored with. Beyond 'hacking bits' there is hacking the board room... and that's what I do. So whether you're doing it right, ashamedly doing it wrong, or you don't know the difference - let's fix this and build a framework for sanity.
Do you fit one of these categories?
- forced to report 'metrics' but getting no impact from them?
- struggling to find the 'right' information to report to prove effectiveness?
- endlessly compiling more and more numbers that never seem to get better?
- in possession of an effective set of things you report up and/or down, and willing to share?
If so... let me know - here via comment, via email, or even on Twitter (@Wh1t3Rabbit). As I keep moving on this project, I'll keep sharing things that sound interesting, and maybe one day in the near future it'll be something worth a talk or whitepaper... until then, keep at it.
Cross-posted from Following the White Rabbit