A recent lunchtime keynote at the Los Angeles ISSA Summit was interesting, and demonstrated the difficulty in modifying behavior in our industry.
The speaker, Bruce McConnell of the Department of Homeland Security, talked about security policy and how the DHS is working to help secure the nation and our critical infrastructure.
One thing that struck me immediately: as we dove into critical infrastructure protection, there was an overwhelming sense that good standards, policies, and guidance need to be enforced.
Here's the problem - when it comes to critical infrastructure protection it's very difficult to legislate and regulate the organizations that matter into a state of better security.
The problem is that in order to enforce policy and rules there have to be consequences for failing, or incentives not to fail... or both. If we start with the one Information Security is good at, consequences, the first question is this: how do you enforce consequences on public infrastructure that is either taxpayer funded or single-source for the customer without raising the customer's rates, or spending taxpayer dollars to fix someone else's mistakes?
This isn't a simple question to answer - and the folks at my table struggled with it a great deal. My pal Dave Marcus surmised that there is no way to do enforcement through negative consequences without hurting the people you're trying to serve, the utility customers.
While I agree with Dave's point, I also recognize that over the past decade we as an industry have been trying to raise the bar on security through positive incentives - only to find that they weren't being taken advantage of.
Bruce had an interesting perspective, and mentioned that the DHS is backing bills in Congress right now that would, as a positive incentive, shield organizations from lawsuits or provide other legal support.
That's interesting - but is it relevant? Are we seeing public infrastructure organizations being sued for negligence or as a result of data breaches or cyber security failures? Are we seeing this behavior in other industries - and does it apply in the critical infrastructure sector?
While there have been a significant number of lawsuits over data breaches citing negligence and "not enough due diligence," there is still a lot of room for escape based on minimally accepted requirements and baseline levels of 'risk tolerance'. Once again we're left waiting for a catalyzing event - likely some catastrophic failure - before things change.
The difficulty, as broadly applied to the Information Security industry, is this: how do we provide enough positive incentive to drive change, while creating negative consequences when those incentives are ignored? This is very difficult, and a challenge we fight every day.
Ultimately this requires a change in behavior, and a change in culture - from a culture that blindly accepts risk until something catastrophically fails, to one that responds to the push of a positive incentive while keeping the negative consequence in the back of its mind. That balance is difficult to achieve, and something we need to keep working at.
Cross-posted from Following the White Rabbit