PART ONE
Rethinking
01
Introduction
Learning from crises
As two-year-olds, our children discovered the question: 'Why?' They discovered how much it could irritate their parents, particularly as bedtime approached. But they also learnt how persistently asking that question revealed valuable insights into how cause and effect drive the world's workings. Now in their thirties and beyond, they retain the ability to ask the world's most penetrating question. That is partly because we, too, retained it and know that it provides the most valuable route to understanding anything, including the anatomy of failures.
As we began to ask 'why' about crises, the first answers were specific to the particular disaster. A refinery blew up because an operator cut a corner. A fairground ride crashed because procedures were not followed. A hacker disrupted a computer system because its security failed. A firm hosted a rogue trader because supervision was ineffective. A plane crashed because of a maintenance error.
But these are not satisfying answers; we wanted deeper insights. We were not alone. Academic researchers and safety analysts were also studying human error in accidents. And at the other end of the telescope, we met PR directors and company secretaries, some of whom muttered darkly that things were not really as good as they seemed from the outside.
What we wanted to discover was:
- Why do some crises tip into reputational catastrophes whereas others do not?
- Are common factors at work?
- To what extent are reputations predictably vulnerable?
Anthony's story
Through offering legal support to businesses in crisis, Anthony saw 30 years' worth of calamitous events from the inside. He saw, at close quarters, the emergence of the compensation culture, the 'blame game' and 'infotainment'.
As a liability and insurance lawyer, Anthony was trained to look for something called the 'proximate' cause of the accident – an event, error or omission but for which the accident would not have happened. That means looking for the event closest to the accident of which you can say: 'Aha! But for that, the accident would not have happened.'
This process is as convenient for leaders as it is misleading because it tends to blame lowly Fred or Freda who made a mistake – such as pressing a button at the wrong moment. Partly through his experience working with aviation – an industry that took a different approach – Anthony began to feel that a proximate cause was not enough. As Anthony Hilton, the leading City columnist, put it, 'inquiries focus on the processes within an organisation until they find some hapless individual or group who departed from the manual. Identifying that person becomes a proxy for solving the problem'.1
Blame them: sack them: job done. This is where most inquiries into crises stop. They fail to dig for the deeper answers, to questions such as: 'Why did they make that mistake?' 'To what extent did the system drive them to it?' Root-cause risks remain unidentified, unmanaged and ready to strike again.
Always close to the insurance market, Anthony was also involved, decades ago, in groups that were chewing over dietary fat, sugar and genetically modified organisms, reputations and other risks that were difficult or impossible to insure. In those days few lawyers understood much about reputations let alone reputational risk.
But Anthony had not begun his career as a lawyer. His degree in engineering had given him a particular kind of curiosity. Dealing with the liability and insurance consequences of major accidents, he was left with a big puzzle: why was it that even after crisis lawyers had minimized legal liabilities and maximized insurance pay-outs, businesses still struggled to recover fully from a crisis?
Clues began to emerge from the aviation industry. Aviation does things differently. The difference explains why airline flights have become so safe – though it was not ever thus, and the path to safety was long and deliberate.
An early pointer emerged from British Midland's 1989 crash at Kegworth. Forty-seven people died and 74 were badly injured. Its Chief Executive, Michael Bishop, focused on the plight of those affected and on learning the lessons of the disaster. His response is widely credited with putting British Midland on the map as a 'good' airline.
For Anthony, one practical insight was that the humanity and generosity shown to the victims of crises, plus demonstrably learning from mistakes, were not only morally right: they were economically right too. With judicious juggling it was possible simultaneously to meet the expectations of victims, liability insurers and future customers and minimize the damage to the reputation of the airline that had caused the victims' misfortune.
Michael Bishop's handling of the Kegworth crash was a first insight into reputational risk, and one Anthony began to put into practice, helping clients in crises to do what they saw as the 'right thing' in both moral and business terms. A challenge was to avoid upsetting their liability insurers who, encouraged by their lawyers, often resisted any expression of sympathy for victims lest it be construed as an admission of legal liability.
But astute insurers, well briefed, increasingly saw the potential benefits. Victims and their families treated with humanity typically seek recognition of any wrong done and fair compensation, and want the lessons to be learned. Relatives of victims dealt with callously are more likely to want revenge as they try to punish the perpetrator by delivering a lesson that is both expensive and painfully memorable. Insurers discovered that treating victims fairly was not just morally right, it cost less too. And it pleased many that the only losers were lawyers, who lost the opportunity to earn fees as fractious fencing dragged on for years.
Meanwhile, 1990 saw the publication of 'Human error in the cockpit',2 a paper from Swiss Re, one of the world's thought-leaders among reinsurance companies. It set out from the observation of Stanley Roscoe, an aviation psychologist, that a finding of 'pilot error' after a crash was 'in no sense an explanation of why the accident occurred', but merely 'the substitution of one mystery for another'.3
Various air crash investigations had concluded that communication failure among the flight crew had been an important cause of accidents. A junior crew member knew something important but was unable to communicate that fact so that the captain understood it. Reasons included the complexity of unfolding events and unspoken hierarchical rules that seemingly made it impossible for a co-pilot assertively to tell the captain that he (in those days it almost always was a 'he') might be making a serious mistake, even though they, their passengers and crew were facing imminent, avoidable death.
Derek's story
Derek's decision to study engineering was, if not a rebellion, at least a decision to be different. With a father who worked as a public relations (PR) manager at the Daily Mirror newspaper group, reputation was a regular feature of Atkins family life.
Derek went on to complete a doctorate and began his career at the UK Patent Office, a role with engineering, not reputation, at its heart. But, following the 1974 Flixborough disaster, an explosion at a UK chemical plant...