‘Law of Unintended Consequences’ = Admission of Failure in Systems-Thinking
How many times have you heard someone mention the ‘law of unintended consequences’? Probably many times, I suspect. In fact it seems to be so common that it has become an accepted part of the management vocabulary. A quick search on Google produces something over 520,000 entries.
[Wikipedia states][1]:
The so called “law of unintended consequences” (or “law of unforeseen consequences”) is not a true Scientific law such as Ohm’s Law, but a humorous expression in common use according to which any purposeful action will produce some unintended, unanticipated, and usually unwanted consequences. Stated in other words, each cause has more than one effect, and these effects will invariably include at least one unforeseen side-effect. The unintended side-effect can be more significant than the intended effect.
The subject of this sounds suspiciously system-like: it involves ‘purposeful action’ (a human-centric view of the attributes of a system, admittedly) and the results could well be regarded as [emergent properties][2]. A little further in, the Wikipedia article suggests that complexity may be one of the causes.
The sociologist [Robert K Merton][3] is credited with making the concept popular and listed five possible causes of unanticipated consequences:
- Ignorance (It is impossible to anticipate everything, thereby leading to incomplete analysis)
- Error (Incorrect analysis of the problem or following habits that worked in the past but may not apply to the current situation)
- Immediate interest, which may override long-term interests
- Basic values may require or prohibit certain actions even if the long-term result might be unfavorable (these long-term consequences may eventually cause changes in basic values)
- Self-defeating prophecy (Fear of some consequence drives people to find solutions before the problem occurs, thus the non-occurrence of the problem is unanticipated)
This looks horribly like a failure to apply systems-thinking: what is the system of interest, and therefore what (or who) is inside and part of it and what is outside? What are the dependencies and interactions between the system and the residual world?
Ignorance. Whilst it probably is impossible to anticipate literally everything, I suspect this is a bit of a get-out: often the causes of failure are readily apparent and could have been anticipated. This certainly seems to be the case with some of the more spectacular disasters that have been the subject of public enquiries, such as the Haddon Cave report into the [Nimrod disaster][4]. It is also an error in analysis if the parts of the systems involved haven’t been identified - this is the difference between acts of omission and those of commission.
Error. Insufficient identification of the boundaries and parts, or a lack of clarity in boundaries, ownership or authority, is a recipe for disaster. Simple things like contractual boundaries that don’t align with the real system boundaries can make it difficult or impossible to ensure that control or optimisation occurs at the right time and in the right places. Of course, if you haven’t identified the boundaries and fail to appreciate the interactions and dynamics, then it is highly likely that ‘unintended consequences’ will arise. It can be as insidious as buying something off-the-shelf and building it into the bigger (and unidentified) system. It might well be that the system has been identified but inadequate design documentation exists to properly identify the dependencies between the parts. It wouldn’t be the first time that something was added to a bus or data highway, just as other mid-life improvements had been, only to find that the bigger system becomes unstable, unusable or unpredictable.
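To make the boundary and dependency questions concrete, here is a minimal sketch of my own (not from any of the sources above, and with entirely hypothetical part names such as mission_computer and ground_station) of a toy ‘system of interest’. Every dependency a new part introduces either maps onto a declared interface inside the boundary or gets flagged - and each flag is a candidate ‘unintended consequence’ waiting to happen.

```python
# A toy model of a system of interest: declared parts, declared interfaces,
# and a check that flags dependencies nobody has identified or owns.
# All part and interface names are hypothetical.
from dataclasses import dataclass, field


@dataclass
class SystemOfInterest:
    parts: set = field(default_factory=set)        # what is inside the boundary
    interfaces: set = field(default_factory=set)   # declared (user, provider) pairs

    def add_part(self, name, uses):
        """Add a new part, reporting any dependency that is undeclared or that
        crosses the boundary into the residual world."""
        issues = []
        for dependency in uses:
            if dependency not in self.parts:
                issues.append((dependency, "outside the declared boundary"))
            elif (name, dependency) not in self.interfaces:
                issues.append((dependency, "no declared interface"))
        self.parts.add(name)
        return issues


# Hypothetical baseline: three declared parts and two declared interfaces.
soi = SystemOfInterest(
    parts={"mission_computer", "data_bus", "radar"},
    interfaces={("radar", "data_bus"), ("mission_computer", "data_bus")},
)

# A mid-life addition bolted onto the bus without updating the design baseline.
for dependency, reason in soi.add_part("new_sensor", uses={"data_bus", "ground_station"}):
    print(f"new_sensor -> {dependency}: {reason}")
```

Trivial as it is, the sketch shows where the thinking has to happen: the moment you can’t say whether a dependency sits inside or outside the boundary, you have found the seed of an ‘unintended consequence’.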
Immediate Interest. I imagine we’ve all seen this at work. The pressure to deliver on time, to cost and to maintain the programme can lead to corners being cut. Even a change in the focus of an organisation as a whole can be a catalyst for failure, [as Haddon Cave observed][5] of the change in the Airworthiness function within the procurement part of the MoD organisation:
Financial pressures and cuts drove a cascade of multifarious organisational changes which led to a dilution of the airworthiness regime and culture within the MOD and distraction from vital safety and airworthiness issues as the top priority. There was a shift in culture and priorities in the MOD towards business and financial targets, at the expense of functional values such as safety and airworthiness.
I’m also pretty sure that some of the NASA failures were partly driven by the pressure to maintain launch windows. One of the never-ending mysteries is why companies that recognise systems-thinking/engineering needs to be applied to the product fail to apply the same thinking and principles to the design, structure and processes of their own organisation (a soft system).
Immediate interest is also recognisable in the deliberate de-scoping of projects to remove those nasty dependencies that complicate delivery. What then happens is that the project delivers something that all too frequently doesn’t integrate properly, leaving the next party up the food chain (System, Design or Integration Authority, or Government Department or Agency) to pick up the pieces. Unfortunately this is also a characteristic of politics - both within a company and nationally, as both thrive on self-interest. And what about the financial benefits touted for the centralisation of, say, a health authority? Yes, it might seem that way if you define the boundary in terms of the facilities themselves, but when you look at the bigger picture and the cost to the patients (not just in monetary terms) of having to travel longer distances, it isn’t so clear. Unfortunately it’s all too easy to identify big lumps of things and a lot harder to assess the cost to the smaller, more diffuse parts.
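As a purely illustrative back-of-envelope sketch (every figure below is invented for the example, not taken from any real health authority), the same decision can look like a saving or a loss depending solely on where the boundary is drawn:

```python
# Hypothetical figures, purely to illustrate how the boundary changes the answer.
facility_saving = 2_000_000      # annual saving claimed from centralising sites (£)

patients_per_year = 150_000
extra_km_per_visit = 30          # additional round trip once services are centralised
cost_per_km = 0.45               # mileage, parking and lost time folded into one rate (£)

patient_travel_cost = patients_per_year * extra_km_per_visit * cost_per_km

print(f"Narrow boundary (facilities only): saving of £{facility_saving:,.0f}")
print(f"Wider boundary (patients included): net £{facility_saving - patient_travel_cost:,.0f}")
```

The point is not the numbers, which are made up, but that the sign of the answer can flip simply because the system boundary moved.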
All in all, this notion of ‘unintended consequences’ just seems to be some sort of cop-out, a label suggesting that it couldn’t be helped and giving it tacit acceptance. No, when you start to look at the causes, most seem to be avoidable (or manageable) with a bit of thought, experience and reality.
Examples
There are many possible examples, ranging from the deliberate introduction of non-native species for pest control through to equipment failure. Classic failures such as that of the computer-aided despatch system for the [London Ambulance Service][6] and those listed by [The Register][7] all exhibit behaviour that wasn’t thought through. There is even an online [Museum of Unintended Consequences][8] maintained by California State University.
External References
- [1]: https://en.wikipedia.org/wiki/Unintended_consequence - Wikipedia - Unintended Consequence
- [2]: https://en.wikipedia.org/wiki/Emergence - Wikipedia - Emergence
- [3]: https://en.wikipedia.org/wiki/Robert_K._Merton - Wikipedia - Robert K Merton [accessed Dec 2009]
- Merton, Robert K. On Social Structure and Science. The University of Chicago Press, 1996.
- [4]: https://www.nimrod-review.org.uk/ - The Nimrod Review website - Haddon Cave enquiry
- [5]: http://www.nimrod-review.org.uk/linkedfiles/nimrod_review/haddon_cave_statement.pdf - Haddon Cave Enquiry report
- [6]: https://ifs.host.cs.st-andrews.ac.uk/Resources/CaseStudies/LondonAmbulance/LAS-failure-report.pdf - Report of the Enquiry Into the London Ambulance Service
- [7]: https://www.theregister.co.uk/2005/01/21/unintended_consequences/ - The Register - Exploring the Law of Unintended Consequences
- [8]: https://cs.calstatela.edu/wiki/index.php/Courses/CS_461/Museum_of_unintended_consequences - Museum of Unintended Consequences