...that you don't know what you've got 'til it's gone.
There seems to be an increasing opinion that the IT at home is better than the IT at work. Is this the glass half full or half empty? I remember when my mate Jeff was looking for a moped (as was I - look to your laurels, Easy Rider). He reasoned that a two-stroke machine would be a better risk than the four-stroke equivalent, on the basis that four strokes meant more parts and therefore more to go wrong. (I wonder if this was Jeff's medical training? ...systolic, diastolic and so on.) Have these IT experts considered the gestalt of the office, school, or factory IT and compared it to the variety at home? I'm not advocating that complexity absolves responsibility for a poor quality of service. It's just that the continuity is more challenging. My mate Dave the IT manager would say to his users, 'you can have it fast, cheap, and resilient... choose two'. Beyond that you increase the risk.
But consider the benefit when it works as you expect. Understanding business impact is crucial. 'Five nines' may look good on paper, but if those few minutes a year of unscheduled downtime happen in the middle of a heart operation, when the airport is operating at 99% capacity, or on pay day, lives may not only be inconvenienced; they may come to an abrupt end.
So when a 'bug' causes Skype messages to be sent to unconnected (a word used advisedly) third parties to the conversation (http://www.bbc.co.uk/news/technology-18863423), when an electrical storm on the American east coast disrupted a cloud data centre and the back-up generator failed (http://www.bbc.co.uk/news/technology-18672173), or when the traffic management software coordinating the evacuation of a city on America's west coast reset after six hours in the middle of the exercise, we need to consider the culture that influences our reaction. Then there's O2, Blackberry, RBS, and so on. Leaving aside what we may have paid for these services, let's consider our expectations and the way we talk about them. We humph! We complain. We run to Web 2.0 sites and vent our frustrations. (OK... so you can get onto Twitter, Facebook, etc. Perhaps you're not so disconnected after all. Then what were you doing? Perhaps it's time to turn off your tablet and do something less boring instead.) Let's consider software, which as the Reverend Mike reminds us is: 'Invisible, intangible, intolerant, indispensable... and totally amoral. Which makes it bloody dangerous!' We are very focused on the event of failure. We all engage in cognitive dissonance about the risk, believe in the resilience, or enjoy a bit of smugness that there'll be someone to sue if they dare to let us down. It's time to realise our own contingency planning, because these systems which were waiting to fail... were always broken. Until the 'bug', the storm, or the reset, we'd just lost track of how lucky we'd been. How many shots had we had at it? If the software only fails under certain, untested conditions, it has never worked. It's not gone... you never had it!
So what's to do? Firstly, look to the future: I'll talk about Toynbee convectors another time. Moves are afoot to encourage more and more systems to have those magical elements of reliability, dependability, and security (www.ssdri.org.uk). For now we have to risk-manage what we've got. Another initiative, to impart these skills to the computer scientists of tomorrow, is the established security element of MSc courses in the School of Computer Science at the University of Manchester. One of the tools in the box that we use there is the risk reckoner, which helps us design systems with a balanced view. It helps us get over that obsession humanity has with low-likelihood, high-impact events. Not to be ignored, but to be put into perspective. Do a bit of risk reckoning and you think about what's important, what isn't, and what's the difference.
Consider the threats and what or who is likely to realise them - the 'what' or the 'who'... the actors. Whatever the threat is, do they have the capability - the know-how or the resources - to do it? If they could, would they do it? What would their intent be? Scary decisions. But think not about your vulnerability but rather your opportunity. Where can you make a difference by reducing or closing the opening through which they can do it? And always think of the impact. Are we bothered if they do it?
If you like numbers, score this and multiply the numbers to see how big these factors get when assessed together. (See the Risk Reckoner below.)
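For those who do like numbers, the multiplication can be sketched in a few lines of code. The factor names and the 0-5 scale below are my own illustrative assumptions, not the published Risk Reckoner itself:

```python
# A minimal sketch of the risk-reckoning idea: score each factor,
# then multiply. The 0-5 scale and factor names are assumptions
# chosen for illustration, not the actual Risk Reckoner scales.

def risk_score(capability: int, intent: int, opportunity: int, impact: int) -> int:
    """Multiply the factors; any factor scored 0 zeroes the whole risk."""
    for factor in (capability, intent, opportunity, impact):
        if not 0 <= factor <= 5:
            raise ValueError("each factor is scored on a 0-5 scale")
    return capability * intent * opportunity * impact

# A capable, motivated actor with no opening scores 0: no opportunity, no risk.
print(risk_score(capability=5, intent=4, opportunity=0, impact=5))  # 0

# Compare scores before and after a countermeasure narrows the opening.
before = risk_score(capability=3, intent=3, opportunity=4, impact=5)  # 180
after = risk_score(capability=3, intent=3, opportunity=1, impact=5)   # 45
print(before, after)
```

The point of the multiplication, rather than a sum, is exactly the one made below: close any single factor completely and the whole risk falls to zero.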
Anything multiplied by 0 is 0. Simplistic? Yes. Attenuating? Yes. A good start? Yes! If not now, when? Time and resources don't indicate security. Have you used up all the mitigating actions and countermeasures to the risks or have you still got one left to introduce or redeploy to better effect? Skype private messages... is this how to send really private stuff? Do you feel lucky?
Go ahead. Make my day.