The Institute of Enterprise Risk Practitioners (IERP®) is the world’s first and leading certification institute for Enterprise Risk Management (ERM).


Black Swan Events: Widespread Implication of The Highly Improbable

What is a Black Swan event, and what does one do when confronted by such a happening? That was the premise of IERP®’s March 2022 Tea Talk, presented by Ramesh Pillai. “Black Swans – if you understand them – are actually common sense,” he said. “It’s easy to explain, but it’s difficult to explain to a broad group from different backgrounds, who have different levels of experience.” He remarked that many people, consultants included, do not understand the nature of Black Swans. The starting point, he said, was identifying and defining true Black Swan events. “The true nature of Black Swans is that they are unpredictable,” he explained.

The presentation dealt primarily with what Black Swans are and, more importantly, what they are not; and how to deal with them. Ramesh also touched on the writings of Nassim Taleb in the course of his presentation, studying some of the statements Taleb has made about Black Swans. Many people describe an event like this as a Large-Scale Large-Impact Rare Event (LSLIRE). Black Swans sit far out on the tail of the modelled probability distribution, making them difficult to predict using current assessment tools. “If you imagine a normal distribution or Bell curve, anything to the right of the norm is positive; anything to the left of the norm is negative,” Ramesh said.
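The tail-probability point can be made concrete with a small numerical sketch (not part of the talk; the distributions and parameters are illustrative assumptions). Under a thin-tailed normal model, a six-sigma deviation is vanishingly unlikely, while under a fat-tailed model (a Pareto tail is used here as a stand-in) the same-sized deviation remains quite plausible – one reason Bell-curve-based assessment tools struggle with Black Swan-scale events:

```python
import math

def normal_tail(k: float) -> float:
    """P(Z > k) for a standard normal variable, via the complementary error function."""
    return 0.5 * math.erfc(k / math.sqrt(2))

def pareto_tail(k: float, alpha: float = 2.0) -> float:
    """P(X > k) for a Pareto-distributed variable with minimum 1.
    alpha=2 is an arbitrary illustrative choice of tail heaviness."""
    return k ** (-alpha)

# Compare how likely each model thinks a 3x and a 6x deviation is.
for k in (3, 6):
    print(f"deviation {k}: normal={normal_tail(k):.2e}, fat-tailed={pareto_tail(k):.2e}")
```

Running this shows the normal model assigns roughly a one-in-a-billion chance to a six-sigma event, while the fat-tailed model keeps it near 3% – a difference of seven orders of magnitude in how “rare” the extreme is judged to be.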

But Black Swans are very far to the left of the Bell curve; they are extremely rare events. “If you really understand the meaning of a Black Swan event,” he continued, “it means that it is so unusual, weird and unpredictable that if you had to tell me what it was, theoretically speaking, you would not be able to.” He cited the example of a directive from the central bank requesting all CROs to do scenario analysis and model for Black Swan events, instructing them to consider both events that can be foreseen and events that cannot – producing some confusion among these professionals, who were puzzled by the instruction to model events that could not be foreseen.

What the central bank was asking these risk professionals to do, clarified Ramesh, was identify Black Swan events. “A lot of people assume that just because they missed an event, it was unpredictable,” said Ramesh. “But just because many people did not see it coming, that does not make it a Black Swan event. Many people may not have seen Covid-19 coming, but that does not make Covid-19 a Black Swan event.” He remarked that some people may be aware of approaching incidents while many others may have ignored the signs. “There is a societal, organisational and practice challenge here,” he continued. “People don’t see it; they don’t get it.”

This happens because we generally lack synthetic and syncretic thinking. Our way of thinking needs to change, to enable us to deal with Black Swan events. Very often, people do identify Black Swan events but they are punished, ignored and ostracised because society generally does not like to listen to unpopular opinion. This causes people to refrain from voicing out what they know. A true Black Swan event, theoretically, cannot be predicted. Additionally, people also do not know which “expert” or “consultant” to believe because there are just too many, all claiming to be correct. “Models and organisations systematically fail to forecast Black Swan events,” Ramesh said.

What is required is to leverage mainstream analyst and model failure dynamics to address Black Swan reduction and LSLIRE forecasting. Quoting extensively from the work of Nassim Taleb, the mathematician and statistician who specialises in problems of uncertainty, Ramesh pointed out that Taleb asserts the world is shaped by rare but consequential shocks and jumps; the norm is irrelevant; the world is dominated by the extreme, the unknown and the very improbable. Black Swan logic makes what we don’t know far more relevant than what we know. In Taleb’s view, what we don’t know shapes the world, and what we know is a waste of time.

But despite Taleb’s argument, very few things can be said to ‘come out of nowhere’. Indeed, many so-called ‘unknown unknowns’ – according to Fintan O’Toole, an Irish commentator, biographer and critic – are not at all inevitable; they are easily knowable or known, but people choose to un-know them. “Many large-scale, rare, large-impact events have practically knowable emergences,” Ramesh said. “Are you saying that we cannot forecast a tsunami, a potential downturn in the economy, or an earthquake?” Events like these, with the technology now available, can be forecast.

In view of this perspective, he said that Taleb’s definition of what constitutes a Black Swan event does not make much sense, although certain things he has said are correct. “He said, for example, that we keep focusing on the minutiae, which means we keep looking at the small items,” Ramesh pointed out. “This is true; Risk managers are notorious for doing this. They look at all the smaller items and miss the bigger picture. As Risk managers we always need to step back and look at the bigger picture.” Additionally, people like portfolio managers who look at risk tend to exclude the possibility of the Black Swan, thinking it quite unlikely.

He added that Risk professionals are not very good when it comes to understanding meta rules, i.e., rules that govern the application of others; we are not good at dealing with situations where there are multiple, complex, co-mingling rules. “We think very simplistically; we think in binary. As soon as there is more than one thing, we find it difficult to think,” Ramesh said, adding that we also lack imagination, and thus tend to repress it in others. What should be discussed, then, is how cognitive or philosophical groundings and perspectives impact LSLIRE forecasting and risk uncertainty assessment; what can be done better, operationally; and what better solutions look like.

A holistic view needs to be adopted in order to solve problems. “When you are a Risk manager, you need to adopt multiple perspectives,” he continued. “You need to adopt a holistic view.” He remarked that most Risk managers’ attention is confined to their risk registers, but Black Swans are rarely covered in risk registers. Yet the thinking process that takes Black Swans into consideration matters strategically to the organisation. Contexts and circumstances change; history does not repeat itself. “Circumstances change. There will be similarities, shape-shifts and morphing within the context,” he said. There is a need to think within syndromes, using situational analysis and changing scenarios.

He stressed that it is not about the numbers but the thinking processes, biases and perspectives, i.e., the psychology of risk. “If you limit your perspectives, the quality of your risk analysis is severely limited,” he cautioned. “This means that the quality of your end-product or answer or analysis is limited as well.” Besides asking qualitative questions, a broad systems-level view and multiple integrated perspectives help illuminate emerging rare outcomes. Open-flow thinking processes allow external stimuli and infinite inputs to come in – organisations need to be open to all these within their environment.

Open-model thinking processes are not random, unknown or extreme. Rather, they are grounded within change processes and especially focused on watching changing points. There is a need to stay open to the environment, with proper situational awareness. This is called syncretic thinking. “A good Risk manager, especially one who is good at trying to analyse and predict Black Swan events, has a broad background and generalist thinking – someone who understands limitations and sensitivities,” Ramesh said. “The person should be able to process the information that is coming through.” He advocated hiring people with dissimilar backgrounds, to develop broader organisational perspectives.

If organisations want to better understand the relationship between Black Swan and LSLIR events, he suggested normalising and rationalising Taleb’s Black Swan explanations, to see what he considers Black Swan events, and comparing them to LSLIREs. One example is Taleb’s Black Swan narrative concerning the transformation of Lebanon from a “stable paradise” to hell on earth, seemingly without warning. However, analysis of the situation over the decades showed that changes were happening, involving other countries and Lebanon’s internal factions, that eventually led to civil war. This instability built up over many decades, resulting in the current unrest.

All this was known, indicating that the situation was a LSLIRE, not a Black Swan event. “Very few crises happen without ‘noise’ – the problem is that very few people listen or pay attention to the noise,” stated Ramesh, citing events like the September 11, 2001 attacks and the 2008 Financial Crisis, which had indicators that were largely ignored. But what does better practice (of recognising Black Swans) entail? Many things are forecastable, and technology today can be used quite effectively for this. Stressing that people do foresee certain things, he said that luck can be supported by experience; this should be replicated.

There is a need to distinguish between a Black Swan and a LSLIR event, so that they can be controlled and reduced. Firstly, learn holistic profiling of change processes. “Learn to observe things that are happening around you,” he advised. “Learn to note what they are, and when they change, learn to identify these changes holistically.” Secondly, learn to deal with the question of arithmetic or quantitative and qualitative interface change issues just prior to a LSLIRE. Thirdly, develop better judgement of the timing of the system’s shifting events. “If you are familiar with these changes and their speed, you will hopefully be able to predict the potential trigger point of a LSLIR event,” he said.

This means moving from synthetic thinking to syncretic thinking. Synthesis involves the integration of information, how we get it and how information is put together. Syncretic thinking involves seeing how things change. “You need both,” he said. “Both involve aspects of understanding specific emergence and entanglements. The problem is that they are not generic, and analysts generally are not good at this.” What need to be considered are syndromes, because situations or scenarios shape-shift into new scenarios without clear direction. Qualitative thinking needs to be merged with quantitative thinking, because LSLIRE emergence cannot be predicted.

He stressed that this is a constant, iterative process, similar to any risk management process. But most models and scenarios still cannot deal with it because their perspective is retrospective; they look at past incidents to forecast a future event. Research has shown that the closer one gets to a LSLIRE or a mind-blowing Black Swan event, the more likely it is that the data models being relied on are unreliable. “We don’t understand that what is changing externally affects us internally, and what is changing internally affects us externally,” Ramesh said. “It’s a very complex relationship, and a lot of people do not have the ability to do this. They also fail to see specifically-themed plays.”

Being a good forecaster in one context is no guarantee of being correct in other contexts. He emphasised the need to develop the ability to think outside the box, to look for changes in what we believe the norm to be, to understand those changes, and then to test them against the norms that we expect. “If the norms that we expect cannot explain what is happening, that is an indicator that you are close to a LSLIRE,” he cautioned. “The more it is out of whack, the closer you are.” The trigger of the Black Swan event needs to be identified; if this can be achieved, the Black Swan event can be converted into a LSLIRE.
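One simple way to operationalise “how far out of whack” the present is relative to the expected norm – purely an illustrative sketch, not a method prescribed in the talk – is to score each new observation against a historical baseline in standard-deviation units, and treat large scores as a warning sign that the expected norm no longer explains what is happening:

```python
import statistics

def norm_deviation(history: list[float], current: float) -> float:
    """How far 'current' sits from the historical norm, in standard deviations.
    The history and threshold below are hypothetical example data."""
    mu = statistics.mean(history)
    sigma = statistics.stdev(history)
    return abs(current - mu) / sigma

# A stable history: a metric hovering around 100.
history = [99, 101, 100, 98, 102, 100, 99, 101]

print(norm_deviation(history, 100.5))  # small: within the expected norm
print(norm_deviation(history, 130))    # large: the norm no longer explains this
```

The threshold at which a score counts as “out of whack” is a judgement call for each context; the point is simply that the further the score drifts, the stronger the indicator that a LSLIRE-type shift may be approaching.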

This will give us better control of our destiny. We cannot control what is going to happen but we want to reduce Black Swan to LSLIR events, even though the event itself ultimately may not emerge. Concluding the presentation, he said there was a need to differentiate between LSLIRE and emerging risk. “Experience is critical,” he said. “It allows you to pool your knowledge. Imagination is even more critical but many people lack this. It is helped by experience and having a clear mind. You want people who are unconstrained by all of the learning they have had over the years. Sometimes learning puts you in a straitjacket; sometimes too much learning is not good. You need new blood. Foresight is good; you need the moral courage and confidence to speak up and speak your mind. If not, you may have identified a potential Black Swan but you will not be able to move it to a LSLIR event because you lack the courage and confidence to do so.”
