Sunday, March 11, 2012

Invasion of the Risk Managers: Altering the Complexion of Security

[Photo: the risk management panel at RSA Conference]

The philosopher of science Thomas Kuhn theorized that scientific revolutions are brought about only by practitioners who are not already trained to think a certain way - or, to use Kuhn's terminology, within a given paradigm. When people train themselves to believe something, they expect their observations to match their beliefs, and may therefore fail to observe something truly revolutionary. And observation is step one of science.

So it was with the face of Thomas Kuhn looming large overhead that a panel of two security architects, a noted Gartner researcher, and two risk management professionals met at the RSA Conference in San Francisco last week. Two worlds collided here, and this panel was one of the focal points. One side represented the existing paradigm. The revolutionaries came in suits with calculators and adjustment formulas. And Gartner's Bob Blakley literally wore a Satan suit just to make sure the fire and brimstone kept flowing.

The Balancing Act

"We teach people that risk is about science, about numbers, and about metrics. And the reality is, that only works for half of our risk. The other half of our risk is the things that we can't predict how frequently they're going to happen." This from Andy Ellis, the Chief Security Officer of Akamai. No, he's not a risk management professional by nature. He's learned the language because, as he explained, he had to. He's been converted to the extent, he says, that he has constructed a business continuity plan for Akamai in case of a zombie apocalypse.

[Photo: risk management panel]

"We do that because it's an easy way to cover a whole lot of different threat scenarios. But I cannot make a prediction of what the likelihood is of that event happening. And an awful lot of the risk that we face, you can't calculate the likelihood, we're not part of a large population that we can do actuarial studies on. So risk management becomes more of an art than a science, and we have to discern which risk is art and which is science, and not apply the principles of one to the other."

The traditional data center security paradigm is built around responding to threats as they occur or after they have been detected. An evolved version adds a layer of prevention, though in recent years that layer has taken on the flavor of a handful of household maintenance tips from the afternoon local TV news. Risk management, applied properly, should be the application of principles to planning and procurement so that the impact of whatever threats do occur is kept within tolerable levels.

That is, when it's applied properly. And here is where Bob Blakley enters the picture. "Risk management is not bad," he told attendees. "It's evil, and it's actually the enemy of security.

"You go through this exercise every year," said the devil incarnate. "You bring a bunch of security people into the room, and their job normally is to defend against threats." The exercise proceeds, he explained, with these security people generating a list of threats. That list is then presented to upper managers so that they may use the principles that they consider risk management "to decide which controls they are not going to implement. In hindsight, it's really diabolical. We get the security people to cut their own budget to participate in the exercise that builds the list of what are guaranteed to be 365-day vulnerabilities - the list of things we know are currently broken, and we're not going to fix until we get the budget to request for next year."

An Ounce of Prevention, As Compared to a Ton

One questioner in the audience admitted that his business employs risk managers, and said they understand some of both the art and the science, as Ellis described it, better than they understand how to defend against threats. Confirming Blakley's pessimistic picture, he explained how the risk managers in his business calculate how much business or capital would be lost as the result of a threat event, balance the more nominal losses against minimal expenses for protections and remedies, and in the process leave behind the more serious threats that they can't afford to throw money at. Inevitably, these risk managers have the wrong executive sign off on their finalized list of expenditures, asking that executive to decide whether to a) fix the problems, or b) accept the risk. If one of these risk managers saw a five-year-old about to run into the middle of traffic, he said, and presented the child with the same type of assessment and the same decision to make, it's impossible to imagine the boy being convinced by a table of probabilities to stay safe behind the curb.
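The triage the questioner described can be sketched as a simple expected-loss calculation: fund a control only when it costs less than the annualized loss it prevents. All threat names and figures below are invented for illustration; the point is that cheap fixes for nominal losses get funded, while the serious, expensive-to-mitigate threat ends up on the "accepted risk" list.

```python
# Hypothetical expected-loss triage, in the style the questioner described.
# Each tuple: (threat name, annual probability, loss if it occurs, control cost).
threats = [
    ("phishing",         0.80,    50_000,  20_000),
    ("lost laptop",      0.30,    40_000,   5_000),
    ("data-center fire", 0.01, 5_000_000, 250_000),
]

def triage(threats):
    """Fund a control only when it is cheaper than the loss it prevents."""
    funded, accepted = [], []
    for name, p, loss, control_cost in threats:
        expected_loss = p * loss  # annualized loss expectancy
        (funded if control_cost < expected_loss else accepted).append(name)
    return funded, accepted

funded, accepted = triage(threats)
# The two nominal threats clear the bar; the catastrophic one is "accepted",
# because its control cost (250,000) exceeds its expected loss (50,000).
print(funded, accepted)
```

Note how the arithmetic itself produces exactly the outcome the questioner lamented: the most serious threat is the one left unfunded.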

Andy Ellis took that analogy one step further. Asking parents in the audience to confirm the observation, he said that when a five-year-old runs out into the street, it's parental instinct to run toward him and grab his arm. Hopefully this is followed by a stern explanation of the risk of running into traffic. If every time the child darted or walked or nudged his way toward the street, the parent gently nudged the child back without a word spoken, Ellis said, "you're making a risk decision on their behalf. You're not educating them.

"Realistically, people have a constant level of risk tolerance," he went on. "They will tolerate a certain amount of risk, and if you take some away, they'll go find more risk. NASCAR drivers are a great example of this. They keep doing things to make the cars safer, and they drive more and more dangerously and recklessly because they now think they're invulnerable. So what we find is, there are fewer accidents, but the ones that happen are really big and really bad. The same things happen on our streets."

Many executives make the mistake of believing that when the calculated reward or benefit for a project exceeds the calculated risk, the difference between the two becomes an acceptable level of extra risk that can be tolerated on the next project. Ellis suggested that a risk decision should be met with a binary yes or no, not a calculation of probability. Security engineers should utilize those probabilities in their internal assessments, but in the end, apply their philosophies clearly and coherently. "At the end of the day, the business decision maker - the person who gets to choose to take risks, who is not INFOSEC - is making that decision," he noted.
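The executive fallacy above reduces to a few lines of arithmetic. The numbers here are invented: project A's reward exceeds its risk by 400,000, and the fallacy treats that surplus as a risk budget that project B can spend, even though B cannot justify its own risk.

```python
# Invented figures illustrating the "leftover risk budget" fallacy.
project_a_reward, project_a_risk = 1_000_000, 600_000
surplus = project_a_reward - project_a_risk  # perceived "headroom" of 400,000

project_b_reward, project_b_risk = 100_000, 300_000

# Fallacy: approve B because its risk "fits" inside A's surplus.
fallacious_approval = project_b_risk <= surplus

# Each project should stand or fall on its own reward vs. risk -
# the binary yes/no Ellis described.
sound_approval = project_b_reward > project_b_risk

print(fallacious_approval, sound_approval)
```

The same project is a "yes" under the carried-over budget and a "no" when judged on its own terms, which is why Ellis insists the decision be binary and per-project.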


Source: http://www.readwriteweb.com/enterprise/2012/03/invasion-of-the-risk-managers.php

