A Blog by Jonathan Low

 

Nov 20, 2012

'Black Swans?' 'Perfect Storms?' Lame Excuses for Poor Risk Management

No one could have foreseen an event of this magnitude...

That has become society's response to events whose impact leaves its members wondering how people with access to so much information can sometimes be so dumb.

Storms. Accidents. Terrorist attacks. Elections. Revolutions. They all get healthy doses of the 'don't blame me' gambit.

But the reality is, as the following article asserts, that there are very few events in modern life that are unpredictable. All too frequently the event is anticipated, but the news runs counter to popular belief or simply requires too much effort to address. The old MAD magazine tag line, 'What, me worry?', has become a global mantra, followed in popularity only by 'Not my job.'

As we have seen with technology implementation, the issue is often not information or the methods used to identify and disseminate it, but society's willingness to accept interpretations that run counter to its predilection for convenience. Humans are herd animals, despite our protestations of individuality and sometimes libertarian inclinations. We don't like to be too far outside the mainstream and, as a general rule, we like the path of least resistance.

The problem is one of emotion, rather than logic or knowledge. Belief systems are more powerful than information systems. The internet has further enabled our proclivity to seek only the data that supports our previously held and/or deeply held opinions.

We fail to take appropriate steps to manage identifiable risks because it is expensive and time consuming. Our increasing insulation from the world around us - and from each other - creates a sense of entitlement to comfort or, at the very least, to routine. It is ironic that we seek solace in the belief that there are forces we cannot control when we live in a world bounded and governed by identification and management of precisely those forces. JL

Kelly Servick reports in Stanford News:
The terms "black swan" and "perfect storm" have become part of the public vocabulary for describing disasters ranging from the 2008 meltdown in the financial sector to the terrorist attacks of Sept. 11, 2001.

But according to Elisabeth Paté-Cornell, a Stanford professor of management science and engineering, people in government and industry are using these terms too liberally in the aftermath of a disaster as an excuse for poor planning. Her research, published in the November issue of the journal Risk Analysis, suggests that other fields could borrow risk analysis strategies from engineering to make better management decisions, even in the case of once-in-a-blue-moon events where statistics are scant, unreliable or non-existent.

Paté-Cornell argues that a true "black swan" – an event that is impossible to imagine because we've known nothing like it in the past – is extremely rare. The AIDS virus is one of very few examples. Usually, there are important clues and warning signs of emerging hazards (e.g., a new flu virus) that can be monitored to guide quick risk management responses.

The attacks of 9/11 were not black swans, she said. The FBI knew that questionable people were taking flying lessons on large aircraft. And a group of terrorists seemed to have had a similar plan in 1994, when they took over an Air France aircraft bound for Paris in Algiers, Algeria.

Similarly, she argues that the risk of a "perfect storm," where multiple forces join to create a disaster greater than the sum of its parts, can be assessed in a systematic way before the event. Even though such conjunctions are rare, the events that compose them – and the myriad events that depend on them – have been observed in the past.

"Risk analysis is not about predicting anything before it happens, it's just giving the probability of various scenarios," she said. She argues that systematically exploring those scenarios can help companies and regulators make smarter decisions before an event in the face of uncertainty.

Think like an engineer
An engineering risk analyst thinks in terms of systems, their functional components and their dependencies, Paté-Cornell said. For instance, in many plants that require cooling, generators, turbines, water pumps, safety valves and more all contribute to making the system work. Therefore, the analyst must first understand the ways in which the system works as a whole in order to identify how it could fail. The same method applies to medical, financial or ecological systems.
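A toy version of that systems view, again with entirely hypothetical failure rates and an assumed architecture, might look like the following sketch: model the cooling function as a Boolean combination of the components named above and compute the probability that the combination fails.

# A minimal fault-tree-style sketch of the cooling example: model the
# system as a Boolean function of its components, then compute the
# failure probability. Component names follow the article; the failure
# probabilities and the AND/OR structure are assumptions for illustration.

from itertools import product

# Hypothetical per-demand failure probabilities, assumed independent.
P_FAIL = {
    "offsite_power": 0.05,
    "diesel_generator": 0.10,   # backup to offsite power
    "water_pump": 0.02,
    "safety_valve": 0.01,
}

def cooling_works(ok: dict) -> bool:
    """Cooling succeeds if at least one power source works AND the
    pump AND the safety valve work (assumed architecture)."""
    power = ok["offsite_power"] or ok["diesel_generator"]
    return power and ok["water_pump"] and ok["safety_valve"]

# Exhaustively enumerate component states to get P(system failure).
names = list(P_FAIL)
p_system_fail = 0.0
for states in product([True, False], repeat=len(names)):
    ok = dict(zip(names, states))
    p_state = 1.0
    for name, works in ok.items():
        p_state *= (1 - P_FAIL[name]) if works else P_FAIL[name]
    if not cooling_works(ok):
        p_system_fail += p_state

print(f"P(cooling failure on demand) = {p_system_fail:.4f}")
# Redundant power helps: both sources must fail (0.05 * 0.10 = 0.005),
# so the single-string pump and valve dominate the result.

The point of such a model is not the particular numbers but the structure: once the analyst knows how the pieces combine to make the system work, the ways it can fail, and their probabilities, fall out of the same description.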

In the case of a nuclear plant, the seismic activity or the potential for tsunamis in the area must be part of the equation, particularly if local earthquakes have historically led to tidal waves and destructive flooding. Paté-Cornell noted that the designers of the Fukushima Daiichi nuclear power plant ignored important historical precedents, including earthquakes in 869 and 1611 that generated waves similar to those witnessed in March 2011.

Paté-Cornell says that a systematic approach is also relevant to human aspects of risk analysis.

"Some argue that in engineering you have hard data about hard systems and hard architectures, but as soon as you involve human beings, you cannot apply the same methods due to the uncertainties of human error. I do not believe this is true," she said.

In fact, Paté-Cornell and her colleagues have long been incorporating "soft" elements into their systems analysis to calculate the probability of human error. They look at all the people with access to the system and factor in any available information about past behaviors, training and skills.

Paté-Cornell has found that human errors, far from being unpredictable, are often rooted in the way an organization is managed. "We look at how the management has trained, informed and given incentives to people to do what they do and assign risk based on those assessments," she said.
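As a purely illustrative sketch of how such "soft" factors can enter the arithmetic (this is not Paté-Cornell's actual model), a human-error probability can be conditioned on management-controlled factors like training and schedule pressure, then averaged over how often each condition occurs:

# A minimal sketch of folding 'soft' human factors into the analysis:
# condition a human-error probability on management-controlled factors
# such as training and schedule pressure. The structure and numbers are
# hypothetical, not Paté-Cornell's actual model.

# Hypothetical conditional error rates per critical task.
P_ERROR = {
    ("well_trained", "normal_schedule"):   0.001,
    ("well_trained", "high_pressure"):     0.005,
    ("poorly_trained", "normal_schedule"): 0.010,
    ("poorly_trained", "high_pressure"):   0.040,
}

# Management decisions set the mix of conditions operators work under.
P_CONDITION = {
    ("well_trained", "normal_schedule"):   0.60,
    ("well_trained", "high_pressure"):     0.20,
    ("poorly_trained", "normal_schedule"): 0.15,
    ("poorly_trained", "high_pressure"):   0.05,
}

# Total probability of human error = sum over conditions of
# P(error | condition) * P(condition).
p_error = sum(P_ERROR[c] * P_CONDITION[c] for c in P_ERROR)
print(f"P(human error per critical task) = {p_error:.4f}")

# Changing training or scheduling shifts P_CONDITION, which is exactly
# how management choices propagate into the system's risk numbers.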

A proven approach
Paté-Cornell has successfully applied this approach to the field of finance, where she has estimated the probability that an insurance company would fail given its age and its size. She said companies funded her research because they needed forward-looking models that their financial analysts generally did not provide. Traditional financial analysis, she said, is based on evaluating existing statistical data about past events. In her view, analysts can better anticipate market failures – like the financial crisis that began in 2008 – by recognizing precursors and warning signs, and factoring them into a systemic probabilistic analysis.

Medical specialists must also make decisions in the face of limited statistical data, and Paté-Cornell says the same approach is useful for calculating patient risk.

She used systems analysis to assess data about anesthesia accidents – where human mistakes can create an accident chain that, if not recognized quickly, puts the patient's life in danger. Based on her results, she suggested retraining and recertification procedures for anesthesiologists to make their system safer.
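A stripped-down illustration of that accident-chain idea, with invented stages and probabilities rather than her actual anesthesia data, multiplies the chance of an initiating error by the chance that each later opportunity to catch it is missed:

# A minimal sketch of an accident-chain calculation: an initiating error
# only leads to harm if every subsequent chance to catch it is missed.
# Stages and probabilities are hypothetical, purely to illustrate the idea.

# Hypothetical probability of an initiating error per case.
P_INITIATING_ERROR = 1e-3

# At each stage, the probability that the problem goes unrecognized.
P_MISSED_AT_STAGE = {
    "monitor_alarm_missed": 0.10,
    "vital_signs_misread":  0.20,
    "no_corrective_action": 0.30,
}

p_harm = P_INITIATING_ERROR
for p_missed in P_MISSED_AT_STAGE.values():
    p_harm *= p_missed

print(f"P(patient harm per case) = {p_harm:.2e}")
# Better retraining lowers the 'missed' probabilities at each stage,
# which multiplies through to a much lower chance of patient harm.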

"Lots of people don't like probability because they don't understand it," she said, "and they think if they don't have hard statistics, they cannot do a risk analysis." In fact, we generally do a system-based risk analysis because we do not have reliable statistics about the performance of the whole system

1 comment:

Neil Robertson said...

Two examples that further illustrate the point:

1. The 7/7 terrorist attacks in London. Initially reported to have been perpetrated by people with no previous contact with the security services. It later emerged that the perpetrators had crossed the path of the security services and been ignored for lack of resources. This is an error that could have been avoided.

2. The financial meltdown caused (in part) by sub-prime mortgages. Financial trickery had created products (CDOs) whose provenance was almost impossible to verify. However, any one of the financial companies that were investing billions of dollars in these products could have spent a few thousand dollars surveying the typical holders of these mortgages and reached its own conclusions about what was going to happen.
