Wicked (3)
Forget about wicked problems--what about complex ones?
Last summer I wrote a guest article on Charlie Stross's blog about wicked problems. Some of the characteristics of wicked problems are:
- There is no definitive formulation of a wicked problem (defining wicked problems is itself a wicked problem).
- Wicked problems have no stopping rule.
- Solutions to wicked problems are not true-or-false, but better or worse.
- There is no immediate and no ultimate test of a solution to a wicked problem.
- Every solution to a wicked problem is a "one-shot operation"; because there is no opportunity to learn by trial and error, every attempt counts significantly.
- Wicked problems do not have an enumerable (or an exhaustively describable) set of potential solutions, nor is there a well-described set of permissible operations that may be incorporated into the plan.
- Every wicked problem is essentially unique.
- Every wicked problem can be considered to be a symptom of another problem.
- The existence of a discrepancy representing a wicked problem can be explained in numerous ways. The choice of explanation determines the nature of the problem's resolution.
- The social planner who tackles a wicked problem has no right to be wrong (planners are liable for the consequences of the actions they generate).
Now Chris Smith has introduced me to a great article on How Complex Systems Fail by Richard I. Cook, MD. It's a very similar summary, but wickedly (if I can use that word) clever and, for anybody who's actually dealt with complex systems, so utterly true. Some of Cook's observations on the failure of complex systems include:
1. Complex systems are intrinsically hazardous systems.
3. Catastrophe requires multiple failures - single point failures are not enough.
4. Complex systems contain changing mixtures of failures latent within them.
and one of my personal favourites:
5. Complex systems run in degraded mode.
For any of us who watched the Fukushima fiasco last summer, some of these will have an uncanny familiarity:
7. Post-accident attribution of accidents to a 'root cause' is fundamentally wrong.
8. Hindsight biases post-accident assessments of human performance.
15. Views of 'cause' limit the effectiveness of defenses against future events.
16. Safety is a characteristic of systems and not of their components.
...and finally,
18. Failure free operations require experience with failure.
It's a sobering list, and every single item on it bears a great deal of thinking about. The article as a whole is brief, but each item is explained in enough detail to make the ideas understandable and to provoke some thought. Everything in it applies in many different contexts, from Fukushima and Chernobyl to the Eurozone meltdown, to current electoral issues and the unintended consequences of urban planning decisions anywhere in the world. Check out the article.
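A couple of Cook's points (3, 4 and 5) are easy to feel in your gut but hard to explain, so here's a quick back-of-the-envelope sketch (mine, not Cook's, and every number in it is invented): give a system a handful of independent defensive layers, each with a small chance of being silently broken, and ask how often you see partial failures versus total collapse.

```python
import random

# A toy Monte Carlo sketch -- my own illustration, not from Cook's paper.
# Assume a system protected by several independent defensive layers, each
# of which has a small chance of being silently broken (a latent failure)
# at any given moment. A single broken layer leaves the system "working
# but degraded"; catastrophe only happens when every layer is down at once.

LAYERS = 4           # hypothetical number of defensive layers
P_LATENT = 0.05      # hypothetical chance any one layer is silently broken
TRIALS = 1_000_000

catastrophes = 0
degraded = 0         # at least one layer broken, system still running

for _ in range(TRIALS):
    broken = sum(random.random() < P_LATENT for _ in range(LAYERS))
    if broken == LAYERS:
        catastrophes += 1
    elif broken > 0:
        degraded += 1

print(f"running degraded: {degraded / TRIALS:.2%}")      # ~18.5% of the time
print(f"catastrophe:      {catastrophes / TRIALS:.4%}")  # ~0.0006% of the time
```

With these made-up numbers the system spends nearly a fifth of its life with at least one defence quietly down -- Cook's point 5 -- while the all-layers-down catastrophe turns up only a handful of times in a million trials, which is point 3: no single failure is ever enough on its own.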
...And stop thinking in terms of root causes, damnit!
I tend to think that the really big problem with this kind of shit is that either we drastically underestimate the external factors messing with the system, or we don't and just throw our hands in the air and say "it's a systemic problem, so we're fucked". It isn't that the problems are insoluble; it's that no-one has the ability or the authority to consider the whole system.