It could be just me, but every time there’s a need to present a complex topic to the executives or business leadership (topic for another musing, methinks) I get the typical looks of “oh no, he’s going to get all lectury again”. And it’s true, I prefer to present complex topics as complex, even if the style of presentation makes them approachable. There’s no way to dumb down something that’s complex without:

- sending the message that sure, they may be leaders of the organisation, people we entrust to make the right decisions, but let’s not present them anything that isn’t so simple a 5th grader could solve it, or they’ll end up in the foetal position on the floor, begging to make it go away;
- quite decidedly making the whole organisation poorer for the experience and less equipped to make the right calls, because we, the experts, decided that only we should hold the knowledge;
- and making ourselves poorer for the experience, because when we dumb down complex topics, as opposed to making them approachable, we also deny ourselves the opportunity to challenge our own knowledge of the topic.
Economists make the distinction between “stated” and “revealed” preferences—loosely defined as what we say versus what we do—when analyzing decisions and looking for utility. Luckily, in technology risk management the “what we do” part is readily available. It makes itself known in all our resource allocation decisions. When we determine a mix of activities to perform, we spend people money. When we make purchasing decisions, we spend service or capital investment money. All of these decisions reveal something about the perceived value of the activity relative to other actions.
In the “Risk, risk everywhere and not an appetite for it” post I proposed the following spur-of-the-moment-inspiration-through-significant-dose-of-caffeine definitions for risk appetite and risk tolerance: “Risk appetite: This is your general, high-level expression of what you are, or aren’t, willing to risk in order to reach your goal. Get your goodies. Join the dark side to get your hot little hands on their cookies. Whatever your long-term goal is. An example of a risk appetite would be: I’m happy to risk 10% of everything I have in order to get at least 20% profit.
“Managers who are isolated from the intelligence customer tend to monitor the quantity of reports produced and level of polish in intelligence products, but not the utility of the intelligence itself.”
Jack Davis, The Challenge of Opportunity Analysis

This sounds equally true if you replace “intelligence” with “risk”.
I’m reading up on contemporary intelligence as part of my grad course and came across these six intelligence values. So far all I’ve read on intelligence reads very true to information risk management, and often risk management as a whole. Have a read, see if the values for intelligence don’t marry neatly with risk management values:

- Accuracy: All sources and data must be evaluated for the possibility of technical error, misperception, and hostile efforts to mislead.
- Objectivity: All judgments must be evaluated for the possibility of deliberate distortions and manipulations due to self-interest.
In large, slow-moving bureaucracies, conventional thinking and risk avoidance become paramount, irrespective of how many times a day people at that organization use the word “strategy” or “innovation.” Peak Intel: How So-Called Strategic Intelligence Actually Makes Us Dumber - Atlantic Mobile
When the intelligence business works, it helps create organizational cultures where empirical evidence and concern for the long-range strategic impact of a decision trump internal politics and short-term expediency. Peak Intel: How So-Called Strategic Intelligence Actually Makes Us Dumber - Atlantic Mobile
Ever since Stuxnet thundered onto the global scene in the second half of 2010, the world has been awash with fresh doses of FUD. Slowly but surely, calmer and more pragmatic heads are prevailing: Stuxnet: It’s a real threat, but not something we should shovel money at - By Tom Ricks | The Best Defense The correct response to Stuxnet is to acknowledge the risks of cyber war, but be discerning in our reaction. We must separate the sensational from the legitimate, and only invest in valid and practical strategies.
Richard Clayton has a great post summarising the recent ENISA paper he co-authored on Internet resilience. Food for thought: Internet interconnectivity is a complex ecosystem with many interdependent layers. Its operation is governed by the collective self-interest of the Internet’s networks, but there is no central Network Operation Centre (NOC), staffed with technicians who leap into action when trouble occurs. The open and decentralised organisation that is the very essence of the ecosystem is essential to the success and resilience of the Internet.