The Black Swan - Nassim Nicholas Taleb

The Impact of the Highly Improbable

Buy book - The Black Swan by Nassim Nicholas Taleb

What is the book The Black Swan about?

The Black Swan provides insights into how we perceive randomness and the limits we encounter when attempting to anticipate the future. Our over-reliance on methods that appeal to our intuition at the expense of accuracy, our fundamental inability to comprehend and define randomness, and even our biology itself all contribute to poor decision-making and, on occasion, to "Black Swans": events that were previously thought to be impossible but which have the potential to completely transform our understanding of the world.

Who should read The Black Swan?

  • Anyone whose work involves examining graphs and trends
  • Anyone who wants to learn how to reduce their risk exposure
  • Anyone with an interest in epistemology

Who is Nassim Nicholas Taleb, and what is his background?

Nassim Nicholas Taleb is considered one of the most prolific contemporary thinkers on risk and uncertainty. He has written a number of critically acclaimed books, such as Fooled by Randomness, as well as numerous essays that have appeared in a variety of magazines and journals. Taleb is a Distinguished Professor of Risk Engineering at New York University's Polytechnic Institute.

What exactly is in it for me? Discover why sticking to your views may lead to an unpleasant surprise down the road.

The Black Swan, a book by Nassim Nicholas Taleb, delves into the nature of what we believe to be random occurrences, as well as the logical fallacies that lead us to lose sight of the larger picture. He refers to these apparently random occurrences as "Black Swans," since they often have far-reaching implications for the individual and, in some cases, for whole civilizations. Taleb helps us develop a better awareness of our own limitations when it comes to forecasting. Learning to detect when our judgment is affected by the urge to fit facts into tidy, easy-to-understand narratives can be useful in identifying when we are being deceived. In this summary, you'll learn how to avoid mistaking noise for knowledge, and how to make better use of your ignorance. You'll discover why thinking like a turkey may be detrimental to your health, and why the most serious danger to a casino may not be related to gaming at all.

Finally, you will learn why "knowing what you don't know" may prevent you from losing your whole life savings.

"Black Swans" are occurrences that are considered to be beyond the range of possibility, yet which nonetheless occur.

Human beings are especially adept at converting the inputs from our surroundings into meaningful information: a skill that has enabled us to develop the scientific method, philosophize about the essence of being, and devise sophisticated mathematical models. However, just because we can reflect on and organize the environment around us does not mean that we are particularly skilled at doing so. For starters, we tend to be narrow-minded in our views of how the world works: once we have a general understanding of it, we tend to hold on to that understanding.

However, because human knowledge is continuously expanding and changing, taking a dogmatic approach is illogical. For example, physicians and scientists were extremely confident in their understanding of medicine only two hundred years ago, yet now their assurance seems ludicrous: picture going to your doctor complaining of a simple cold and being given a prescription of snakes and leeches instead! Being dogmatic about our ideas makes us oblivious to notions that do not fit the paradigms we have already accepted as true. Imagine trying to comprehend medicine without being aware that germs exist: you might develop a reasonable explanation of disease, but it would be faulty for lack of critical facts.

This kind of dogmatic thinking can lead to some unexpected outcomes. We are often taken by surprise by events, not because they happen at random, but because our perspective is too limited. Such shocks are referred to as "Black Swans," and they can force us to radically rethink our worldview. Before anyone had seen evidence to the contrary, people believed that swans were exclusively white; all of their representations and imaginative portrayals of the swan were white, as if whiteness were an essential component of "swanness." The discovery of the world's very first black swan therefore changed the way people thought about the species forever. As you will discover, Black Swans can be as inconsequential as finding out that not all swans are white, or as life-altering as losing everything in a stock market collapse.

Black Swan occurrences may have life-altering effects on individuals who fail to recognize or prepare for them.

The ramifications of a Black Swan are not the same for everyone. Some will be severely affected, while others may hardly be affected at all. The strength of the impact is largely determined by your access to relevant information: the more you know, the less likely you are to be struck by a Black Swan, and the more ignorant you are, the more vulnerable you are to one. Consider an illustration: you place a wager on your favorite horse, Rocket. Because of Rocket's build, her track record, the competence of her jockey, and the weak competition, you believe Rocket is the safest option and bet all of your money on her to win the race. Imagine your astonishment when the starting pistol fires and Rocket refuses to leave the gate, choosing to lie down on the racetrack instead of running.

This would be a Black Swan event. Based on the facts you had collected, you were certain that Rocket would win; instead, you lost everything the moment the race started. But it was not a disaster for everyone involved: Rocket's owner earned a fortune betting against his own horse. His information was superior to yours, since he knew that Rocket was about to go on a hunger strike to protest cruelty to animals. That one extra piece of knowledge spared him from a Black Swan.

The magnitude of the effect of Black Swans may also vary significantly. When a Black Swan occurs, it may have far-reaching consequences for whole civilizations, rather than just individuals. When this occurs, a Black Swan has the potential to fundamentally alter the way the world operates, with implications for many sectors of civilization, including philosophy, religion, and physics. Consider the implications of the discovery by Copernicus that the Earth was not the center of the cosmos. His findings called into question both the authority of the reigning Catholic Church and the historical authority of the Bible itself. At the end of the day, this specific Black Swan contributed to the establishment of a new beginning for all of European civilization as a whole.

Even the most fundamental of logical fallacies may deceive us into believing what we want to believe.

Despite the fact that humans seem to be the most intelligent creatures on the planet, we still have a long way to go before we outgrow all of our bad habits. One example is making up stories about the future based on what we know of the past. We have a natural tendency to think that the past is a good predictor of the future, but this is often incorrect, and it leaves us vulnerable to error, because there are simply too many unknown variables that can work against our narratives. Consider the following scenario: you are a turkey living on a farm. Over the years, the farmer has supplied you with food, given you a place to call home, and allowed you to wander freely. With the past as your guide, there is no reason to believe that tomorrow will be any different from the day before.

Then one day everything changes: you are beheaded, stuffed with spices, and baked in an oven by the very people who took care of you and provided you with a home and food. As this example illustrates, the notion that we can predict the future from knowledge of the past is a fallacy with potentially disastrous implications. An analogous fallacy is confirmation bias: we tend to seek information only to support the views we have already formed, to the point of ignoring evidence that contradicts those beliefs. We rarely accept information that contradicts our preexisting beliefs, and we are even less likely to investigate such information further. If we do conduct an investigation, we will most likely seek out sources that support our existing view.
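The turkey's inductive reasoning can be sketched as a toy simulation. This is a hypothetical illustration, not a model from the book: `naive_confidence` simply counts how often the past confirmed the belief "the farmer feeds me."

```python
# Toy model of the turkey problem (all numbers hypothetical).
# Naive inductive confidence that tomorrow resembles today, measured
# as the fraction of observed days that confirmed the belief.

def naive_confidence(days_fed: int) -> float:
    """Confidence after `days_fed` consecutive days of being fed."""
    return days_fed / (days_fed + 1)

confidence = [naive_confidence(d) for d in (1, 10, 100, 1000)]

# Confidence climbs steadily toward certainty ...
assert all(a < b for a, b in zip(confidence, confidence[1:]))
# ... yet the model says nothing at all about day 1001, when the
# farmer arrives with an axe instead of feed.
print(confidence)
```

The point of the sketch is that the confidence function is maximized precisely on the eve of the regime change: more confirming history makes the surprise worse, not less likely.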

Consider the following scenario: if you firmly believe that climate change is a conspiracy and then happen to see a video titled "The Undeniable Evidence for a Changing Climate," you are likely to be upset. And if you then went online to look for information on climate change, you would more likely use the search phrase "climate change hoax" than "evidence for and against climate change." Although both of these fallacies are anti-scientific, there is little we can do to prevent such poor thinking entirely: it is simply part of our human nature.

Because of the way our brains classify information, it is very difficult to make accurate predictions.

Throughout our evolutionary history, the human brain has developed certain methods of categorizing information. These served us well in the wild, where humans needed to learn and adapt rapidly to hazardous surroundings, but they are detrimental in today's complicated settings. One example of improper categorization is the so-called narrative fallacy, in which we construct linear tales to explain our current condition. The problem stems from the enormous quantity of information we are exposed to daily: to make sense of it all, our brains selectively store only the information they deem essential. For example, although you are likely to recall what you had for breakfast this morning, it is unlikely that you will recall the color of everyone's shoes on the train.

In order to provide meaning to these seemingly unrelated pieces of information, we must weave them together into a cohesive narrative structure. When you think about and reflect on your own life, for example, you are likely to select only specific events that have meaning for you and arrange those events into a narrative that explains how and why you became who you are. For example, you may like music because your mother used to sing songs by The Beatles to you every night before bedtime. Creating such narratives, on the other hand, is a terrible method of gaining any real knowledge of the world. This is due to the fact that the process operates only by looking backward in time, and does not take into consideration the almost limitless number of potential explanations for any given occurrence. The reality is that even seemingly minor occurrences may have unpredictably large and far-reaching effects.

Consider, for example, the possibility that a butterfly fluttering her wings in India triggers a storm in New York City a month later. If we could track each step of cause and effect as it unfolded, we would be able to establish a clear causal connection between the events. However, since we only observe the result, in this instance the storm, all we can do is guess which of the many concurrently occurring events had the most impact on the outcome.
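A standard way to see why tracing causes backward from outcomes is so hard is sensitive dependence on initial conditions. The sketch below is not from the book; it uses the logistic map, a classic toy model of chaos, to show that two starting points differing by one part in a million soon follow entirely different trajectories, so the final state reveals almost nothing about the "butterfly" that produced it.

```python
def logistic_trajectory(x: float, steps: int) -> float:
    """Iterate the chaotic logistic map x -> 4x(1 - x) for `steps` steps."""
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)
    return x

a = logistic_trajectory(0.300000, 50)
b = logistic_trajectory(0.300001, 50)  # a 'butterfly-sized' perturbation

# Running forward, the process is perfectly deterministic; but an
# observer who sees only the final values cannot recover the tiny
# initial difference that separated the two trajectories.
print(a, b)
```

The map is fully deterministic (the same start always yields the same end), which mirrors the text's point: causality exists step by step, yet remains practically unrecoverable from the outcome alone.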

We are unable to differentiate between information that is scalable and information that is not scalable.

Humans have created many different techniques and models for classifying information and making sense of the environment. Unfortunately, we are not particularly good at discriminating between different kinds of information, most importantly between information that is "scalable" and information that is "non-scalable," and there is a significant distinction between the two. Non-scalable information, such as body weight or height, has defined upper and lower statistical limits that cannot be exceeded. There are physical constraints on how much a person can weigh: while it is conceivable for someone to weigh 1,000 pounds, no one can possibly weigh 10,000 pounds. Because the characteristics of non-scalable information are clearly bounded, we can make meaningful predictions about averages based on the information we have.

Non-physical or essentially abstract phenomena, such as the distribution of wealth or record sales, on the other hand, are scalable. Consider the following scenario: if you sell your record in digital form via iTunes, there is no limit to the number of sales you can make, since distribution is not restricted by the number of physical copies you could produce. Because the transactions are conducted online, no physical scarcity prevents you from selling a trillion copies. Understanding the distinction between scalable and non-scalable information is critical to an accurate view of the world; attempting to apply principles that work for non-scalable information to scalable data simply results in errors.

Consider the following scenario: you wish to determine the wealth of the people of England. The easiest approach is to calculate per capita wealth by summing up their total wealth and dividing that amount by the number of people in the country. Wealth, however, is scalable: it is entirely possible for a small fraction of people to possess an extremely high percentage of the country's wealth. Using per capita figures alone, you would create a depiction of income distribution that misrepresents the actual reality experienced by the people of England.
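The mismatch between the "sum and divide" average and the typical person's situation is easy to demonstrate. The figures below are hypothetical, not data from the book: nine people of modest means plus one extreme outlier, which is exactly what scalable quantities permit.

```python
import statistics

# Hypothetical wealth figures (in thousands): nine people of modest
# means plus one extremely wealthy outlier -- wealth is scalable.
wealth = [30, 35, 40, 45, 50, 55, 60, 65, 70, 1_000_000]

per_capita = statistics.mean(wealth)   # total wealth / number of people
typical = statistics.median(wealth)    # what the middle person holds

print(per_capita)  # 100045.0 -- dominated by the single outlier
print(typical)     # 52.5 -- far closer to most people's reality
```

For a genuinely non-scalable quantity like height, mean and median would sit close together; it is the absence of an upper bound that lets one observation swamp the average.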

We have a disproportionate amount of confidence in what we think we know.

As humans, we all want to protect ourselves from harm, and one of the ways we accomplish this is by evaluating and controlling risk. To this end, we purchase things like accident insurance and try not to "put all our eggs in one basket." Most of us make every effort to assess risks as precisely as possible, so that we neither miss out on opportunities nor do anything we may later come to regret. To accomplish this, we must identify any potential hazards and then calculate the likelihood that those hazards will materialize.

Consider the following scenario: you are looking to purchase insurance. You want coverage that protects you against the worst-case scenario without being a financial drain on your resources. You would weigh the danger of illness or accident against the consequences of those events occurring, and then make an educated choice based on your findings. Unfortunately, we are far too confident that we are aware of all the potential dangers we must protect ourselves against. This is known as the ludic fallacy: we tend to treat risk as we would in a game, with a fixed set of rules and probabilities that can be determined before we start.
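A minimal sketch of such a "game-like" risk calculation shows both its appeal and its blind spot. All numbers here are hypothetical, and the calculation itself is the naive model being criticized, not a recommendation:

```python
# Naive "ludic" risk model for an insurance decision (all numbers
# hypothetical and assumed to be known in advance -- which is the fallacy).
p_accident = 0.01           # assumed annual probability of a costly accident
cost_if_accident = 50_000   # assumed out-of-pocket cost without coverage
premium = 600               # assumed annual premium

expected_loss_uninsured = p_accident * cost_if_accident
print(expected_loss_uninsured)  # 500.0

# By this arithmetic the premium looks slightly overpriced. But the ludic
# fallacy is treating p_accident and the worst case as known quantities:
# the risks that matter most may lie entirely outside the model.
```

The arithmetic is trivially correct within the game; the error lies in assuming the game's rules exhaust reality.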

However, treating risk as if it were a game is a hazardous enterprise in and of itself. Casinos, for example, are motivated by a desire to earn as much money as possible, which is why they use sophisticated security measures and ban players who win too much, too often. Their method, however, is founded on a ludic fallacy. The most serious threats to a casino may not be lucky gamblers or burglars, but rather a kidnapper who abducts the owner's child, or an employee who fails to report the casino's profits to the Internal Revenue Service: dangers that are totally unexpected. As this example demonstrates, no matter how hard we try, we can never correctly predict every risk. Next, we'll discover that being conscious of our ignorance is far preferable to remaining ignorant of it.

Making a list of what you don't know can assist you in making more informed decisions about risks.

We've all heard the expression "knowledge is power." However, there are occasions when we are limited by our knowledge, and it is in those moments that understanding what you don't know is far more beneficial. By concentrating solely on what you know, you narrow your view of the potential outcomes of a given event and create fertile ground for Black Swan occurrences. Consider the following scenario: you wish to invest in stocks, but your knowledge of stock data is restricted to the period 1920-1928, ending one year before the worst stock market collapse in US history. You would see a few little dips and peaks, but the overall trend would be upward. Believing the trend will continue, you invest your life savings in stocks. The next day, the market collapses, and you lose everything you have worked so hard for.

If you had done a little more research into the market's history, you would have seen the many booms and busts that have occurred. By concentrating only on what we already know, we expose ourselves to significant and unquantifiable dangers. On the other hand, if you can at least figure out what it is that you don't know, you can significantly decrease your exposure. Good poker players understand this concept, and it is critical to their success in the game. While they are aware of the game's rules and the likelihood that their opponents hold stronger cards, they are also cognizant of certain important information they do not know, such as their opponents' strategies and how much their opponents can afford to lose.

Their understanding of these unknowns allows them to develop a strategy that is not simply focused on their own cards, allowing them to make a much more educated evaluation of the risk they are taking on.

Having a clear knowledge of our own limits as human beings may assist us in making more informed decisions.

Most likely, the greatest protection against falling into the cognitive traps described above is a thorough knowledge of the tools we use to make predictions, as well as the limits of those tools. While being aware of our own limits may not prevent every mistake, it will at least help us make better decisions. For example, when you are aware that you, like everyone else, are susceptible to cognitive bias, it is much easier to recognize when you are simply looking for evidence to back up what you already believe to be true. Similarly, if you know that humans like to organize everything into tidy, causal narratives, and that this approach oversimplifies a complex world, you will be more inclined to seek further information to get a fuller picture.

Even a little critical self-analysis can give you a competitive edge over others in your field. It is unquestionably better to be conscious of one's faults: if you know that there will always be unforeseen risks in chasing an opportunity, no matter how good it seems, you will be more cautious about investing heavily in it. While we can never defeat chance or fully comprehend the enormous complexity of our universe, we can at least minimize the harm caused by our ignorance.

Final summary

Although we are continually making predictions about the future, we are really bad at it, as the central theme of this book demonstrates. We place far too much faith in our own knowledge and far too little in our own ignorance. Our over-reliance on methods that appear to make sense and our fundamental inability to comprehend and define randomness both contribute to poor decision-making and to the occurrence of "Black Swans": events that appear impossible at the time but end up redefining our understanding of the world.

Actionable advice: Keep an eye out for the word "because." It is in our nature to seek out linear, causal connections between occurrences in order to make sense of this complicated universe. The truth, however, is that humans are hopeless both at predicting the future and at identifying the causes of current events. Rather than feeding our urge to view events in terms of cause and effect, we should examine a variety of alternatives without fixating on any one of them.

Understand what you don't know. For anyone who wants to make meaningful predictions about the future, whether buying insurance, making investments, attending college, changing jobs, conducting research, or simply being a human, it is not enough to consider only the "knowns." That leaves you with just a limited picture of the risks associated with your forecast. Instead, be mindful of what you don't know, so that you don't needlessly restrict the knowledge at your disposal.

Written by BrookPad Team based on The Black Swan by Nassim Nicholas Taleb
