When I was still in the software business, it always amazed me how inadequate most of us are at dealing with risk (and I include myself in that group, to some degree). I guess I shouldn't have been surprised, because as a species we usually suck at it. Think about what percentage of adults are afraid of flying, versus the proportion of us who are afraid of driving. The former phobia is much more common, and yet the risks associated with flying are virtually nil compared to the risks associated with driving. Or consider how many parents worry that their children will be abducted, despite this being almost unheard of statistically speaking, while doing little or nothing to prepare those same offspring for the complexities of managing their finances, navigating sexual relationships, or fulfilling personal responsibilities, all of which they're virtually guaranteed to encounter in life.
Over the life of a typical, non-trivial software project, there will be dozens, if not hundreds or even thousands, of risks undertaken. In very broad strokes, those can include estimates of work effort being ridiculously low, dependencies not being met, new technologies being unreliable, and specifications changing haphazardly, just to name a few. Since most IT projects fail - where failure means late, over budget, or under scope, or any combination thereof - I guess we shouldn't wonder that risks are usually neither well defined nor guarded against.
All of which leads us to the point made in this excellent article, which is that the level of certainty demanded before acting on a risk should obviously vary according to the circumstances. As the author notes, 99+% certainty makes sense when talking about Higgs boson calculations, but is a ridiculously (and dangerously) high bar for something like climate change. Put another way: if you told a mother that her child had a 10% chance of developing cancer if he used a certain brand of fingerpaints, there's no way that product would ever be brought into the home. And yet climate change deniers use the fact that scientists are only, maybe, 90% certain that we're heading for a catastrophe as reason enough to discount the science and go on about their business. It's an insane inversion of logic, but no more startling than the "fear of driving vs fear of flying" result that we've all witnessed.
It's a curious failure of our species that seems to be poised to do us a whole lot of harm in the not-too-distant future.