Smart CISOs know when not to pay attention to the "wisdom of the crowds"

If Apple had followed the 'wisdom of the crowds' in 2006-2007, they'd never have made an iPhone. If smart CISOs paid too much attention to the Information Risk Leadership Council's latest article, they'd be in as much trouble as they purportedly are right now. There is a lot wrong with CISOs who put all their hope and budget into prevention, but the word itself is definitely not the problem. Nor is the solution that the CEB IRLC (Corporate Executive Board's Information Risk Leadership Council) advocated - although there they just followed NIST's lead.

When you use ordinal scales ...

… you are committing a cardinal risk management sin. Of course that doesn't stop people from continuing to do qualitative risk assessments, and there's absolutely nothing wrong with that, so long as there is no comparison between the risks. If you use qualitative risk assessment you cannot compare assessed risks, and the reason lies in the ordinal scale that is typically used. The example above is exaggerated slightly to prove a point: while you would generally expect a value of 4 to be double that of 2, that relationship simply doesn't hold once you start using purely ordinal scales.
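To make the point concrete, here is a minimal sketch (all labels, mappings, and risk registers are invented for illustration). An ordinal scale only encodes order, so any monotone mapping to numbers is equally "consistent" with it - yet summing the scores can rank the same two risk registers in opposite ways depending on which mapping you happened to pick:

```python
# Two equally valid monotone mappings of the same ordinal labels.
# Neither is "more correct": the scale only promises order, not magnitude.
linear = {"low": 1, "medium": 2, "high": 3, "critical": 4}
skewed = {"low": 1, "medium": 2, "high": 5, "critical": 20}

# Two hypothetical risk registers scored with the same ordinal labels.
risks_a = ["critical", "low", "low"]
risks_b = ["high", "high", "high"]

for name, scale in (("linear", linear), ("skewed", skewed)):
    total_a = sum(scale[r] for r in risks_a)
    total_b = sum(scale[r] for r in risks_b)
    verdict = "A riskier" if total_a > total_b else "B riskier"
    print(f"{name}: A={total_a}, B={total_b} -> {verdict}")
```

Under the linear mapping register B looks riskier; under the skewed mapping register A does. Arithmetic on ordinal scores answers a question the scale never asked.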

Complex topic? Dumb it down and everyone loses.

It must be just me, but every time there's a need to present a complex topic to the executives or business leadership (topic for another musing, methinks) I get the typical looks of "oh no, he's going to get all lectury again". And it's true, I prefer to present complex topics as complex, even if the style of presentation makes them approachable. There's no way to dumb down something that's complex without: also sending the message that sure, they may be leaders of the organisation, people we entrust to make the right decisions, but hey, let's not present them anything that isn't so simple a 5th grader could solve it, or they'll end up in the foetal position on the floor, begging to make it go away; quite decidedly making the whole organisation poorer for the experience, and less equipped to make the right calls, because we, the experts, decided that only we should hold the knowledge; and actually making ourselves poorer for the experience, because when we dumb down complex topics, as opposed to making them approachable, we also deny ourselves the opportunity to challenge our own knowledge of the topic.

Stated and revealed preferences and risk management

Economists draw a distinction between "stated" and "revealed" preferences - loosely, what we say versus what we do - when analysing decisions and looking for utility. Luckily, in technology risk management the "what we do" part is readily available: it makes itself known in all our resource allocation decisions. When we determine a mix of activities to perform, we spend people money. When we make purchasing decisions, we spend service or capital investment money. Each of these decisions reveals something about the perceived value of that activity relative to the alternatives.
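One way to read those revealed preferences is simply to normalise spend into implied priority weights. A minimal sketch - every category and figure below is invented for illustration, not taken from the post:

```python
# Revealed preferences from resource allocation: whatever the strategy
# deck says, the spend shares are the weights the organisation acted on.
spend = {
    "prevention": 700_000,  # hypothetical annual spend per activity
    "detection": 150_000,
    "response": 100_000,
    "recovery": 50_000,
}

total = sum(spend.values())
revealed = {activity: amount / total for activity, amount in spend.items()}

for activity, weight in sorted(revealed.items(), key=lambda kv: -kv[1]):
    print(f"{activity:10s} {weight:.0%}")
```

If the stated strategy claims "detection first" while 70% of the budget sits in prevention, the revealed preference is doing the talking.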

Risk Appetite Redux

In the "Risk, risk everywhere and not an appetite for it" post I proposed the following spur-of-the-moment-inspiration-through-a-significant-dose-of-caffeine definitions for risk appetite and risk tolerance: "Risk appetite: This is your general, high-level expression of what you are, or aren't, willing to risk in order to reach your goal. Get your goodies. Join the dark side to get your hot little hands on their cookies. Whatever your long-term goal is. An example of a risk appetite would be: I'm happy to risk 10% of everything I have in order to get at least 20% profit.
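The quoted example can be written as a simple executable check. This is only a sketch of one possible reading of that sentence - the function name, parameters, and the interpretation of "20% profit" as relative to the whole bankroll are my assumptions, not definitions from the post:

```python
def within_appetite(stake, expected_profit, bankroll,
                    max_risk_fraction=0.10, min_return_fraction=0.20):
    """True if a proposed bet fits the stated risk appetite:
    'risk at most 10% of everything I have for at least 20% profit'.
    (Hypothetical formalisation; thresholds are the quoted defaults.)"""
    risk_ok = stake <= max_risk_fraction * bankroll
    return_ok = expected_profit >= min_return_fraction * bankroll
    return risk_ok and return_ok

# Staking 9k of a 100k bankroll for an expected 25k fits the appetite;
# staking 15k breaches the 10% risk ceiling.
print(within_appetite(stake=9_000, expected_profit=25_000, bankroll=100_000))
print(within_appetite(stake=15_000, expected_profit=25_000, bankroll=100_000))
```

The point of writing it down this way is that an appetite statement only becomes useful once individual decisions can be tested against it.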

Musings on risk appetite and complex issues

It could be just me, but every time there’s a need to present a complex topic to the executives or business leadership (topic for another musing, methinks) I get the typical looks of “oh no, he’s going to get all lectury again”. And it’s true, I prefer to present complex topics as complex, even if the style of presentation makes them approachable. There’s no way to dumb down something that’s complex without: also sending the message that sure, they may be leaders of the organisation, people that we entrust to make the right decisions, etc.

It's the utility, stupid!

"Managers who are isolated from the intelligence customer tend to monitor the quantity of reports produced and level of polish in intelligence products, but not the utility of the intelligence itself."[1] This sounds equally true if you replace "intelligence" with "risk".

[1] Jack Davis, The Challenge of Opportunity Analysis

Intelligence values - also for risk management

I'm reading up on contemporary intelligence as part of my grad course and came across these six intelligence values. So far everything I've read on intelligence rings very true for information risk management, and often for risk management as a whole. Have a read, see if the values for intelligence don't marry neatly with risk management values:

Accuracy: All sources and data must be evaluated for the possibility of technical error, misperception, and hostile efforts to mislead.

Objectivity: All judgments must be evaluated for the possibility of deliberate distortions and manipulations due to self-interest.

Conventional thinking and risk avoidance

In large, slow-moving bureaucracies, conventional thinking and risk avoidance become paramount, irrespective of how many times a day people at that organization use the word "strategy" or "innovation." Peak Intel: How So-Called Strategic Intelligence Actually Makes Us Dumber, The Atlantic

Risk management and intelligence

When the intelligence business works, it helps create organizational cultures where empirical evidence and concern for the long-range strategic impact of a decision trump internal politics and short-term expediency. Peak Intel: How So-Called Strategic Intelligence Actually Makes Us Dumber, The Atlantic