
Counterfactual Risk Analysis

Friday 15 December 2017




A way to learn from the past with a 360-degree perspective


by Gareth Byatt and Gordon Woo


In October, Lloyd's published a new report on counterfactual risk analysis.
This article explains the concept of counterfactual risk analysis, and how to make the best use of it to strengthen your business resilience and management of risk.

 
The examples quoted in this article are from the Lloyd's report, which focuses in particular on disaster risk management and how to improve our readiness for unexpected major events through risk modelling.
 
 
The concept of counterfactual risk analysis
 
Counterfactual risk analysis is a lens through which to view risk and uncertainty, and appropriate organisational resilience mechanisms, by looking at and learning from what has happened, and what could have happened, in the past. It applies as much to people working in commerce and industry as it does to those in government and non-government organisations.
 
 
Adopting a ‘counterfactual mindset’: learning from events by asking ‘What if?’
 
Whenever an event occurs, we ask ourselves—usually with the benefit of hindsight—how the event might have been averted, or what additional risk management and resilience measures might have reduced the impact that resulted.
 
When you review risks that turn into events, do you look at alternative ways the event could have transpired had it followed a different path or direction? This questioning mindset is the essence of counterfactual risk analysis. Analysing how an event might otherwise have unfolded (as well as how it actually did) can provide valuable insights into the strength of your activities (your risk management plans, controls, etc.) and into your ability to manage risk and resilience.
 
Analysing what could have happened if events had deteriorated is called a ‘downward counterfactual’. By contrast, an ‘upward counterfactual’ considers how events could have had a more positive outcome. Psychologists who study counterfactual thinking (Roese, 1997) observe that upward counterfactual thoughts are more common than downward ones. There is an inherent outcome bias in reviewing events (and near-misses); as Kahneman (2011) points out, decisions tend to be judged according to their outcomes. Train yourself to look at the ‘downward’ possibilities as well as the ‘upward’ ones.
 
Examples of upward counterfactual thinking include:

  • If only I had looked at this matter more closely, or earlier, I would have been able to do something about it.
  • If only we had reviewed this part of our business earlier, we could have avoided this event or prevented it from occurring.
  • If only we knew of this possible outcome, we would have taken steps to protect ourselves from it.
 
Examples of downward counterfactual analysis, which we typically review less often than upward analysis, include:
 
  • How could this event have been even worse?
  • What if this event had taken place at the same time as another risk became an event?
  • What if this event had taken place 60 seconds later, when more activities were underway?
 
 
Discuss alternative outcomes of near-misses
 
A counterfactual risk analysis can help us to look at and learn from near-misses, as well as actual events.
 
Many organisations have robust processes in place (audits, investigations, etc.) to review and learn from near-misses. By looking at both large- and small-scale near-misses through the lens of counterfactual risk analysis, we can consider alternative scenarios and determine whether our control environment and our management of operations would have prevented those scenarios from becoming actual events, whether the implications are safety-related, environmental, reputational, financial or legal, or all of the above.
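
As a purely illustrative sketch (not something prescribed by the Lloyd's report), a near-miss register could record counterfactual variants alongside what actually happened, so that each review explicitly asks which controls each variant would have tested. The structure, field names and example entries below are hypothetical assumptions.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a near-miss register that carries counterfactual
# variants with each record. The schema and example entries are illustrative
# assumptions, not a prescribed method from the Lloyd's report.

@dataclass
class CounterfactualVariant:
    description: str                  # what could have happened instead
    direction: str                    # "upward" (better outcome) or "downward" (worse outcome)
    controls_tested: list[str]        # existing controls the variant would have tested
    controls_expected_to_hold: bool   # honest judgement of whether they would have held

@dataclass
class NearMiss:
    title: str
    what_happened: str
    variants: list[CounterfactualVariant] = field(default_factory=list)

    def downward_gaps(self) -> list[CounterfactualVariant]:
        """Downward variants where existing controls are not expected to have held."""
        return [v for v in self.variants
                if v.direction == "downward" and not v.controls_expected_to_hold]

# Illustrative usage with a made-up near-miss
near_miss = NearMiss(
    title="Forklift reversed close to pedestrian walkway",
    what_happened="Spotter intervened; no contact made.",
    variants=[
        CounterfactualVariant(
            description="Pedestrian steps out 60 seconds later, during shift change",
            direction="downward",
            controls_tested=["walkway barriers", "reversing alarm"],
            controls_expected_to_hold=False,
        ),
    ],
)
for gap in near_miss.downward_gaps():
    print(f"Control gap to address: {gap.description}")
```

Kept as a simple spreadsheet rather than code, the point is the same: each near-miss carries its downward variants with it, and the control gaps they expose become actions.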

 
 
Make enough time to openly discuss alternative views of events that have occurred
 
Let’s look at applying counterfactual risk analysis to a few actual examples. Many other examples of events and major near-misses are described in the full Lloyd's report.
 
Consider flooding risk, a current issue in many parts of the world. In February 2013, during a major blizzard, a four-foot storm surge hit the city of Boston. Fortunately, it occurred during low tide rather than high tide. The high tide was already a foot higher than average because of the new moon; had the storm surge combined with it, the result would have been a real “100-year flood” (Conti, 2015). In this example, the chance of such a coincidence was approximately one in six, and luckily, on this occasion, it didn’t happen. If it had, however, what might the outcome have been for the city, its businesses and its communities?
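
As a rough illustration of where a figure like “one in six” can come from, the sketch below perturbs the timing of the surge peak across the tidal cycle. The two-hour danger window and the 12.42-hour semidiurnal tide period are assumptions made for this example, not figures taken from the Lloyd's report or from Conti (2015).

```python
import random

# Back-of-envelope sketch with assumed figures: a semidiurnal tidal cycle of
# ~12.42 hours and a storm-surge peak that threatens flooding for ~2 hours.
# If the surge peak is equally likely to arrive at any point in the tidal
# cycle, how often does it overlap high tide?

TIDAL_CYCLE_HOURS = 12.42   # assumed semidiurnal tide period
SURGE_PEAK_HOURS = 2.0      # assumed window in which the surge peak is dangerous

def surge_hits_high_tide(trials: int = 100_000) -> float:
    """Estimate the probability that a randomly timed surge peak overlaps high tide."""
    hits = 0
    for _ in range(trials):
        # Hour (within one tidal cycle) at which the surge peak arrives.
        arrival = random.uniform(0.0, TIDAL_CYCLE_HOURS)
        # Treat high tide as occurring at hour 0; the surge is dangerous if its
        # peak window contains that moment (wrapping around the cycle).
        if arrival <= SURGE_PEAK_HOURS / 2 or arrival >= TIDAL_CYCLE_HOURS - SURGE_PEAK_HOURS / 2:
            hits += 1
    return hits / trials

if __name__ == "__main__":
    # Prints roughly 0.16, i.e. about one in six, consistent with the figure quoted above.
    print(f"Estimated chance of coinciding with high tide: {surge_hits_high_tide():.2f}")
```

The exact numbers matter less than the habit they illustrate: the downward counterfactual question “what if the surge had arrived a few hours earlier or later?” becomes something we can reason about explicitly.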

Now let’s consider a marine disaster that was thankfully averted. The Singapore Strait is one of the world’s busiest shipping routes. On 3 August 2016, a 320,000-ton Iranian oil tanker, the Dream II, owned by Iran’s leading oil tanker operator NITC, collided with a 14,000-ton container ship, the Alexandra (The Marine Executive, 2016). The tanker’s bow struck the Alexandra’s port quarter, causing significant damage to the container ship’s hull. Ten empty containers on board the Alexandra fell overboard, five of which landed on the deck of the Dream II. Before the collision, the Port Operations Control Centre of the Maritime and Port Authority of Singapore had provided traffic information and alerted the shipmasters of the Dream II and the Alexandra to the risk of collision. Both vessels remained stable and safely anchored in Singapore. Fortunately, the incident caused no injuries and no major oil pollution, but it was one of the first collisions at sea between mega-vessels.

What if there had been a full collision, perhaps causing fatalities, a major oil leak from the Dream II, or the sinking of one or both ships? Assessing the possibilities revealed by upward and downward counterfactual analysis can help us test our resilience measures.

 

Practise counterfactual risk analysis as a habit
 
Counterfactual risk analysis can help us to look in a rounded way at events and near-misses that have occurred. It helps us learn from them whilst considering alternative scenarios, in order to understand how effective our resilience measures would be in handling different possible outcomes. By expanding our thinking to include what could have happened, and by exploring and learning from the past in a deeper way, we can improve our ability to manage risk and resilience in order to achieve our objectives.
 
Think about whether you would benefit from reviewing events and near-misses with a counterfactual risk analysis mindset to ensure you learn holistically from them. Refer to the full Lloyd's report for examples of how to implement this approach and mindset.
 
 
About the authors

 
Gordon Woo has developed his interest in counterfactual analysis through extensive practical insurance experience at RMS across a diverse range of natural and man-made hazards. Trained in mathematical physics at Cambridge, MIT and Harvard, he is a visiting professor at UCL, London, and an adjunct professor at NTU, Singapore.



 
Gareth Byatt is an Independent Risk Consultant and owner of Risk Insight Consulting. He is based in Sydney and has 20 years' experience in international risk and project management.