Tame, Messy and Wicked Risk Leadership
About this book

The general perception amongst most project and risk managers that we can somehow control the future is, says David Hancock, one of the most ill-conceived notions in risk management. The biggest problem is how to measure risks in terms of their potential likelihood, their possible consequences, their correlation and the public's perception of them. The situation is further complicated by the different categories of problem type: 'tame' problems (straightforward, with simple linear causal relationships, solvable by analytical methods) and 'messes', which have high levels of system complexity and interrelated or interdependent problems that need to be considered holistically. When an overriding social theory or social ethic is not shared, however, the project or risk manager also faces 'wickedness'. Wicked problems are characterised by high levels of behavioural complexity, but what confuses real decision-making is that behavioural and dynamic complexities co-exist and interact in what are known as wicked messes. Tame, Messy and Wicked Risk Leadership will help professionals understand the limitations of present project and risk management techniques. It introduces the concepts of societal benefit and behavioural risk, and illustrates why project risk management has followed a particular path, developing from the basis of engineering, science and mathematics. David Hancock argues for, and offers, complementary models from the worlds of sociology, philosophy and politics to be added to the risk toolbox, and provides a framework for understanding which type of problem (tame, messy, wicked, or messy and wicked) may confront you and which tools offer the greatest potential for successful outcomes. Finally, he introduces the concept of 'risk leadership' to aid the professional in delivering projects in a world of uncertainty and ambiguity.
Anyone who has experienced the pain and blame of projects faced with overruns of time or money, dissatisfied stakeholders or outright failure will welcome this imaginative reframing of some aspects of risk management. This is a book with implications for the risk management processes, culture and outcomes of large and complex projects of all kinds.


Information

Publisher: Routledge
Year: 2017
eBook ISBN: 9781351896238

PART 1


THE BASIS FOR CURRENT PROJECT RISK METHODOLOGIES


The Introduction and Chapter 1 outline the reasons behind the inadequacy of present risk and project management practices and their inability to deal with everyday challenges faced by managers.

INTRODUCTION


The real trouble with this world of ours, is not that it is an unreasonable world, nor even that it is a reasonable one. The commonest kind of trouble is that it is nearly reasonable, but not quite. Life is not illogicality; yet it is a trap for logicians. It looks a little more mathematical and regular than it is; its exactitude is obvious, but its inexactitude is hidden; its wilderness lies in wait.
GK Chesterton1 (1908)
This book has been written because of my continuing frustration that risk management, as described by most current project management literature, does not correlate very closely with my own experience of the 'real' world – a feeling apparently shared by many project professionals in the private, public and third sectors. These beliefs appear not to be unique to the UK, or indeed Europe; having facilitated many seminars in the Middle East, North and South America and Asia, I have repeatedly encountered similar problems, namely that the behavioural and societal aspects of risk are under-represented in project risk management processes. This is compounded by the engineer's and project manager's apparent yearning for a world of unity and simplicity in which to practise their skills, a 'reality' in which there is a compulsion to reduce all the complexities of the natural world to the simple application of learnt processes and explicit knowledge.
Thus, the first problem with project risk management, as presently practised, is that it tends to reduce the likelihood and impact of harmful events to a simple mathematical equation. Typically, risk is configured technically – the preserve of specific experts who use mapping and diagnostic tools to isolate and cater for risks. Project management and financial institutions provide a variety of methods to evaluate expected levels of risk.
Risk Definitions
The Association for Project Management defines project risk management as 'a structured process that allows overall project risk to be understood and managed proactively, optimising project success by minimising threats and maximising opportunities'.2 The Office for Government Commerce, in its PRINCE2 and Management of Risk processes, states 'Project management must control and contain risks if a project is to stand a chance of being successful',3 and evaluates risk in terms of probability, impact and proximity (the distance in time from its perceived occurrence). British Standard 31100, Code of Practice for Risk Management, uses the terms 'likelihood' and 'consequence' to evaluate risk, in line with the Australian and New Zealand standards.
Whilst this capability to define risk in a few simple parameters appears useful, risk management constructed in accordance with the rules of probability can give the illusion of control and understanding when in fact there is only further confusion. Probability theory enables organizations to devise risk registers that quantify risks in figures, lending credence and authority to the consequent policy and strategy. The impression is one of control. In all of these areas, good process is the dominant control mechanism.
In the past, and I suspect in some cases still today, risk management, particularly in the construction industry, has been dominated by a 'checklist mindset'. The move to more complex projects means that this mindset no longer meets the requirements of the industry. Sir John Egan, in Rethinking Construction4 (the Egan Report), called for things to be done differently in the future to enable projects to be realized on time and under budget, in the light of new concepts such as integrated teams and partnering. Rather than seeing problems as the result of a lack of information, resolvable through the successive acquisition of data, I argue that many risk issues arise as wicked problems and messes – uncertainties rooted not so much in a lack of information as in behavioural and/or systems complexity and interaction. The simplicity of probability (likelihood) multiplied by consequence (impact) does not in any way model the reality of risk.
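The conventional scoring approach critiqued above can be sketched in a few lines. The register entries, the 1–5 scales and the scores below are invented for illustration; they are not taken from any real project.

```python
# A minimal sketch of the classic "probability x impact" risk score.
# All entries and scales here are hypothetical.

risk_register = [
    # (risk description, probability 1-5, impact 1-5)
    ("Ground conditions worse than surveyed", 3, 4),
    ("Key supplier insolvency",               2, 5),
    ("Design change late in programme",       4, 3),
]

def risk_score(probability, impact):
    """Two-factor score: likelihood multiplied by consequence."""
    return probability * impact

# Rank risks by score, highest first.
ranked = sorted(risk_register, key=lambda r: risk_score(r[1], r[2]), reverse=True)

for name, p, i in ranked:
    print(f"{risk_score(p, i):>2}  {name}")
```

The appeal is obvious: the whole register collapses into a single sortable column of numbers. That reduction is precisely what the text argues against, since it represents neither the correlation between risks nor the behavioural complexity behind them.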
The second immediate problem, which is described by Charles Perrow in his book Normal Accidents5 is the tendency to assume that events occur independently of one another. In an increasingly complex world this is proving to be a major flaw, even if and when the individual components relating to a perceived failure can be wholly identified. Combinations of events, which have hitherto been considered independent, can lead to major systems failures and events, having interacted in ways which the risk and project managers have either thought impossibly remote or have not even considered. Two examples are the Challenger Space Shuttle launch decision and the Heathrow Express collapse.
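Perrow's point about falsely assumed independence can be made with simple arithmetic. The probabilities below are invented for illustration and are not drawn from either case study.

```python
# Treating events as independent can badly understate the chance of
# joint failure. All probabilities here are hypothetical.

p_a = 0.01   # e.g. a technical defect being present
p_b = 0.01   # e.g. a management control failing

# Assuming independence, the joint probability is the product:
p_joint_independent = p_a * p_b          # 0.0001 -- "impossibly remote"

# But if B becomes far more likely once A has occurred (the events
# interact), the joint probability is P(A) * P(B given A):
p_b_given_a = 0.5
p_joint_coupled = p_a * p_b_given_a      # 0.005

print(p_joint_coupled / p_joint_independent)  # roughly fifty times larger
```

A risk assessment built on the first calculation would dismiss the joint failure as negligible; the coupled calculation shows it to be orders of magnitude more likely. This is the arithmetic behind "normal accidents" in tightly coupled systems.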
The Challenger Launch Decision
The disastrous launch of Challenger in 1986 is primarily considered to have been due to a technical failure of the primary and secondary 'O' rings on the solid rocket boosters. However, the truth is far more complicated. True, the rubber 'O' rings' resilience was impaired by the cold temperature on the morning of the launch, which allowed the propellant gases to reach and ignite the right solid booster tank with catastrophic consequences. However, NASA also came under scrutiny for organizational culpability. After the disaster, two bodies began investigations of NASA: the Presidential Commission, headed by William P. Rogers, and the House Committee. The Presidential Commission concluded that NASA's flawed decision-making process contributed to the technical malfunction that caused the disaster. While the House Committee agreed with several of the Presidential Commission's findings, it blamed not the decision-making process but the decision-makers themselves.
Various factors contributed to the fateful decision, including acute pressure on the project and the misinterpretation of risk levels. Three main sources placed significant pressure on NASA to launch the Challenger mission as speedily as possible: government, client and the media. Government (Congress) placed pressure on NASA to launch for economic reasons. At the time of the Challenger mission, NASA was operating far over budget and far below the number of flights promised. Originally, NASA aimed to help pay for itself by sending up 24 payloads per year. When NASA began to fall behind schedule, the programme began to feel the pressure to produce more results, as fast as possible. Because NASA had taken over the responsibility of launching military satellites from the Air Force, NASA felt pressure to keep one of its main clients – the military – happy. If it failed to meet expectations, it would lose this responsibility to the Air Force, who surely would have been pleased to regain control of satellite launches. The final significant and public source of pressure came from the media. With each launch delay, NASA's credibility suffered as the press made humorous comments about NASA's seeming inability to get things done. From a risk assessment perspective, Herkert6 claimed that 'outright dishonesty as opposed to misunderstanding or incautious practice' was the cause of the disaster. He describes two stages of risk assessment: the first, based on probability and likelihood, is considered the domain of the technical experts; the second, based on the acceptability of the solution, he deems to be political. NASA managers offered their own risk assessments, claiming they were based on 'engineering judgement'. The managers' risk estimates fell far below those of the engineers.
Herkert uses the example of a manager telling an engineer to take off his 'engineering hat'7 and put on his 'management hat', to show how the decision-making process can be biased by simple statements from peer groups. Herkert claims that managers appropriated the expert role of risk measurement and misrepresented it as engineering judgement. For Herkert, the disaster shows a misuse of technology for politics, and he called for stronger legal protection for whistleblowers. For a more detailed study, the reader is directed to Diane Vaughan's comprehensive book on the subject, The Challenger Launch Decision (1996).8

THE HEATHROW EXPRESS COLLAPSE (HEALTH AND SAFETY EXECUTIVE (HSE) REPORT9)

During the nightshift on 20–21 October 1994, a civil engineering disaster occurred when tunnels in the course of construction beneath Heathrow Airport’s Central Terminal Area (CTA) collapsed. They continued to collapse over the following days. The public and those engaged in the construction work were exposed to grave risk of injury. Workers were evacuated from the tunnels minutes before the first collapse; some had been carrying out repairs to critical parts of a tunnel lining while others had been advancing a parallel tunnel. By remarkable good fortune no one was injured. Major short-term disruption to the airport followed. The Heathrow Express Rail Link project, of which the tunnels were a part, suffered a severe setback as a result. The New Austrian Tunnelling Method (NATM) and compensation grouting had been used to construct the tunnels. The direct cause of the tunnel collapses was considered to be a chain of events involving:
•substandard construction in the initial length of the CTA concourse tunnel over a period of some three months;
•grout jacking that damaged the same length of tunnel, plus inadequately executed repairs to it some two months before the collapse;
•construction of a parallel tunnel in failing ground;
•a major structural failure in the tunnels, progressive failure in the adjacent ground and further badly executed repairs during October 1994.
There were also what were considered to have been weaknesses in the contingency and emergency procedures. Hazards from the in situ (i.e. cast-in-place) construction of thin shell linings and the complementary use of compensation grouting were not identified by all the parties. Risk was not avoided or reduced through the contractual arrangements, the design of the permanent works and the NATM design. Risks were not controlled during construction through the 'defensive' systems (that is, preventative management systems) used by the parties. The particular risks associated with remedial work were not recognized; the risk of collapse did not appear on the risk register and, consequently, no mitigating actions were identified.
The report concluded that the collapses could have been prevented, but a cultural mindset focused attention on the apparent economies and the need for production rather than on the particular risks. From the early stages of the project through to final collapse, there were failures to demonstrate the necessary level of care, and serious errors were made. Warnings of the impending collapse were present from an early stage in construction, but these were not recognized. The investigation found that the incident exhibited all the hallmarks of an 'organizational accident': a multiplicity of causes led to the position where systems variously used by the client, designers and contractors failed, and a major accident adversely affecting the safety of a large number of people occurred. There were undoubtedly human errors, but these were merely a consequence of foreseeable organizational failures. The causes of the incident were rooted in failures of 'defensive' systems that did not adequately deal with hazard identification, risk avoidance and reduction, and the control of remaining residual risks.
Why are these two incidents important for our understandi...

Table of contents

  1. Cover
  2. Half Title
  3. Dedication
  4. Title Page
  5. Copyright Page
  6. Table of Contents
  7. List of Figures
  8. List of Tables
  9. Preface
  10. About the Author
  11. Part 1 The Basis for Current Project Risk Methodologies
  12. Part 2 The Tame, Messy and Wicked Model
  13. Part 3 Strategies for Wicked and Messy Environments
  14. Index