Lessons Shared, Lessons Learned?

TOP-SET is much more than a health and safety investigation tool; it is a problem-solving system that, used effectively, can help us learn from a problem or incident and prevent its recurrence. It is crucial to share the lessons learned from investigations so that others can act on them and benefit from the experience. Mistakes happen, and bad judgement and bad management are realities, but failing to learn from others is a serious oversight that leads to history repeating itself.

In the case of the current pandemic, this is all the more acute. Although we are still very much in the middle of it, and firm conclusions cannot be reached until later, the numbers of positive cases and deaths from coronavirus in different countries do reflect whether their leaders took lessons from examples elsewhere in the world and acted on them.

Among others, South Korea’s preparedness and subsequent handling of the situation should have been a key example to the West. They had learned a hard lesson from their mismanagement of MERS (Middle East Respiratory Syndrome) in 2015, and the result was a determined planning effort to prevent similar fallout, built around a test-trace-contain approach. They, after all, had recent, hard memories to act on.

For whatever reasons, whether cognitive bias, blind optimism or supreme exceptionalism, leaders in the West, especially in the U.K. and U.S., chose not to learn from previous, shared experiences and therefore acted too late, as is evident from the huge number of coronavirus cases and deaths in those countries. Their perception of risk, and their complacency, led to the worst kind of destruction that such a virus could bring. Perhaps their memories of such crises were short-term, or short-sighted: ‘that won’t happen here’. The information and warnings were there, but they were ignored.

However, it is often the human condition rather than intention that leads us to carry on regardless and ignore valuable lessons from previous incidents. Known as the ‘ostrich paradox’, this tendency means we ignore low-probability but high-impact events, especially if the remedial actions are seen as difficult to implement; often quick but reactive fixes are made instead, with minimal effect. We also suffer from short memories. Knowing this should make it all the more important to formulate solid planning strategies that counteract cognitive biases, so that we can deal effectively with the very real threats faced by humanity. It is not just the chance of something happening that must be considered in this planning, but the potential consequences when, not if, it does.

Unfortunately, planning for such events is often seen as a worthless task because, if action is taken swiftly on the basis of international and historical lessons, not much happens as a result; there is no catastrophic event as previously hypothesised, and people cannot understand why such drastic measures were taken. Experts and leaders may be blamed for “overreacting” when no crisis materialises, as Devi Sridhar, Professor of Global Public Health at The University of Edinburgh, pointed out in The Guardian. This is the dilemma of the ‘prevention paradox’. Isn’t this better, though, than learning too late that the warnings were right?

It is the same with investigations within and between companies and countries. The lessons they reveal must not only be shared but heeded and embedded in preventative planning, otherwise there is no point. People need to work together; communication and collaboration are key to preventing the disasters that are always possible but that none of us wants to happen. They are also key to keeping employees, or the public, on board with any decisions made as a result. If we view the world as a ‘system’, we can look for ways to prevent further system failures, such as those seen in the handling of the COVID-19 outbreak.