
ANALYSIS

Breaches: Fix the Issue, Not the Blame

Following a natural disaster that causes property damage to businesses and homes — say a hurricane, fire or flood — how often do you hear suggestions that the victims were at fault for their misfortune, or that they could have done something to prevent the event from occurring in the first place? Not often, right? We all know that events like that are possible. We plan around those possibilities, and we don’t blame the victims when they happen.

It’s different when it comes to data breaches, though. Unless you’ve been living under a rock for the past few years, chances are good that you’ve been impacted to one degree or another by a data breach. Statistically speaking, it’s a near certainty that your information has been lost, stolen, or otherwise involved in one of the many data breaches that have dominated the headlines.

In contrast to a natural disaster, though, it’s not uncommon after a breach to hear people on the sidelines suggest that the victim is at fault — that there was some action they could have taken, some tool they could have used, or some process they should have had in place to prevent being breached.

Fruitless Investment

Sometimes there’s a grain of truth in this. Just as homeowners in hurricane-prone areas can use specific building methods to minimize potential hurricane damage (building their house on pylons for example), steps like data encryption can help offset the potential impacts of a security breach.

When such measures aren’t used, the damage can be worse than it otherwise would be. Still, the event itself is largely probabilistic. You can do everything right and still get hacked — or do everything “wrong” and, through sheer luck, remain unscathed.

The natural human tendency to fix the blame can be counterproductive in a security context. It distracts from cultivating the lessons learned that could help offset or mitigate similar situations in the future.

Further, it can lead to a pattern of fruitless investment. Organizations may sink money into trying to prevent the unpreventable immediately following a breach, while grossly underinvesting immediately before one.

A Better Path

To find out what we can do instead, and how best we can marshal resources, I caught up with IDC Vice President of Security Research Pete Lindstrom in advance of his keynote session on this topic at MISTI’s InfoSecWorld 2019. His session, “Security Heresy: Cognitive Dissonance Amidst Economic Realities,” addressed this topic head on.

In an interview for this article, Pete pointed out that every breach will have a “smoking gun” — that is, some unique chain of events that allowed attackers to gain access in the context of a particular breach.

In the cold light of hindsight, it’s almost certain that a different alignment of circumstances — or some different action on the part of the victim — could have caused events to play out differently. However, this “armchair quarterbacking” is a bit of a red herring, he cautioned. Why? Because of the probabilistic nature of data breach causality. For every smoking gun that comes to light, we don’t know how many others went unexploited.

Pete proposed looking at things a new way.

“We can’t continue to look at things in binary terms. A new vulnerability is discovered and we’re insecure — we patch against it and become secure again. This implies a preordained result where cause inevitably leads to effect,” he explained.

“Instead, it’s much more like playing poker — you play the hand you’re dealt based on probabilities to maximize the likelihood of winning,” Pete continued. “Like in medicine, a course of treatment doesn’t always produce identical outcomes; instead, we maximize success by cultivating options and treating the system holistically.”

He went on to suggest that a more economics-oriented mindset can help organizations plan better. What’s needed is a mindset that accounts for the opportunity costs of how we spend (investing in one countermeasure means you have less money to invest in others), understands the tradeoffs that we make in our businesses, and considers how we communicate the impacts of those tradeoffs up the organizational chain.

Optimizing Resources

“We recently collected data about the correlation between spending on security and data breaches — they’re less connected than you’d think,” Pete noted.

“We need to stop assuming that just because you’re spending more money that you’re more secure,” he said. “Instead, we need to think like economists do: understanding unintended consequences, and building in a way to highlight them when they occur; understanding that spending in one area offsets resources for others, reallocating investments quickly if need be; and by providing transparency about this to decision makers.”

How does one do this? Pete highlighted metrics, both operational and economic, as critical. The first area — metrics about the performance of security measures — is one that many organizations have in place but could improve by making those metrics more actionable and putting them in context. For example, reporting just the number of IDS alerts over a given time period is less useful than reporting the percentage or ratio of attacks relative to legitimate requests.
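
To make that concrete, here is a minimal sketch in Python. The counts and variable names are hypothetical, invented for illustration rather than drawn from any particular IDS, but they show the difference between reporting a raw alert count and reporting attack traffic in proportion to legitimate traffic:

# Hypothetical figures for one reporting period; real values would come
# from your IDS and your web or application logs.
ids_alerts = 4_812            # raw alerts fired by the IDS
confirmed_attacks = 153       # alerts triaged as genuine attack traffic
total_requests = 2_400_000    # all requests handled in the same period

# The raw count, on its own, is hard to act on.
print(f"IDS alerts this period: {ids_alerts}")

# The same data in context: attack traffic relative to legitimate traffic.
legitimate_requests = total_requests - confirmed_attacks
attack_ratio = confirmed_attacks / legitimate_requests
attack_share = 100 * confirmed_attacks / total_requests

print(f"Attacks per legitimate request: {attack_ratio:.6f}")
print(f"Attack share of total traffic: {attack_share:.4f}%")

Reported this way, a doubling of alerts during a period when overall traffic also doubled reads very differently from the same doubling against flat traffic.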

The second area, economic metrics, is less often found in the field, because it requires an understanding of two things many organizations don’t track carefully: 1) the costs involved in security measures (both hard dollars and softer costs like personnel time); and 2) the specific risk areas an organization faces based on its operations.

Collecting and reporting on these two elements together is helpful. It allows us to invest in places where that investment will do the most good, and it also allows us to redeploy investments into different areas as situations change.
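
As a rough illustration of how those two elements might be combined (the cost figures, risk-reduction estimates and countermeasure names below are invented for this example, not taken from IDC’s data), one could rank candidate investments by how much expected loss they remove per dollar spent, and re-rank as the estimates change:

from dataclasses import dataclass

@dataclass
class Countermeasure:
    name: str
    hard_cost: float        # licenses, hardware (dollars per year)
    soft_cost: float        # estimated personnel time (dollars per year)
    risk_reduction: float   # estimated cut in expected loss (dollars per year)

# Invented figures, purely for illustration.
candidates = [
    Countermeasure("Database encryption", 60_000, 25_000, 140_000),
    Countermeasure("Additional perimeter appliance", 90_000, 40_000, 50_000),
    Countermeasure("Phishing awareness program", 15_000, 20_000, 70_000),
]

# Rank by risk reduction per dollar of total cost, so a limited budget
# flows to the options expected to do the most good.
for c in sorted(candidates,
                key=lambda m: m.risk_reduction / (m.hard_cost + m.soft_cost),
                reverse=True):
    total = c.hard_cost + c.soft_cost
    print(f"{c.name}: ${total:,.0f} total cost, "
          f"{c.risk_reduction / total:.2f} of expected loss avoided per dollar")

A ranking like this is only as good as the estimates behind it, which is exactly why the transparency Pete describes matters: decision makers can see the assumptions and challenge them.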

By adopting an economics-oriented mindset, we can move away from a culture of blaming the victim and toward a culture of recognizing that breaches can happen to anyone. Preparing for them means understanding our own readiness and making the best use of the limited resources available to us to respond.

Ed Moyle

Ed Moyle is general manager and chief content officer at Prelude Institute. He has been an ECT News Network columnist since 2007. His extensive background in computer security includes experience in forensics, application penetration testing, information security audit and secure solutions development. Ed is co-author of Cryptographic Libraries for Developers and a frequent contributor to the information security industry as author, public speaker and analyst.
