The Black Swan by Nassim Nicholas Taleb: Summary & Notes

Rated: 10/10

Available at: Amazon

ISBN: 081297381X

Related: Antifragile, Fooled by Randomness, Skin in the Game

Summary

One of the most impactful books I’ve ever read.  Overall, the books of the Incerto have changed my thinking on a number of topics, but this book has probably had the largest impact.

Essentially, the book is all about "how to convert knowledge into action and figure out what knowledge is worth". What that really means is an in-depth examination of where in our world we apply false, naive models (typically Gaussian, or bell-curve type models), and the impact they can have.

It goes further, talking about how we can reduce the number of these occurrences by using alternative prediction methods (fractal distributions) and by mitigating our exposure to true Black Swans, which are "outliers of extreme impact", or "unknown unknowns".

Favorite Quotes

  • "My biggest problem with the educational system lies precisely in that it forces students to squeeze explanations out of subject matters and shames them for withholding judgment, for uttering the "I don’t know.””
  • "Simply, things that move, and therefore require knowledge, do not usually have experts, while things that don’t move seem to have some experts."
  • "What matters is not how often you are right, but how large your cumulative errors are."
  • "We humans are the victims of an asymmetry in the perception of random events. We attribute our successes to our skills, and our failures to external events outside our control, namely to randomness."
  • "The unexpected has a one-sided effect with projects. Consider the track records of builders, paper writers, and contractors. The unexpected almost always pushes in a single direction: higher costs and a longer time to completion."
  • "Corporate and government projections have an additional easy-to-spot flaw: they do not attach a possible error rate to their scenarios."
  • "Anyone who causes harm by forecasting should be treated as either a fool or a liar."
  • "Forecasting the motion of a billiard ball on a pool table requires knowledge of the dynamics of the entire universe, down to every single atom! We can easily predict the movements of large objects like planets (though not too far into the future), but the smaller entities can be difficult to figure out—and there are so many more of them."
  • "The Black Swan asymmetry allows you to be confident about what is wrong, not about what you believe is right."
  • "Capitalism is, among other things, the revitalization of the world thanks to the opportunity to be lucky."
  • "I want to be broadly right rather than precisely wrong."
  • "Elegance in the theories is often indicative of Platonicity and weakness—it invites you to seek elegance for elegance’s sake. A theory is like medicine (or government): often useless, sometimes necessary, always self-serving, and on occasion lethal. So it needs to be used with care, moderation, and close adult supervision."
  • "Missing a train is only painful if you run after it!"
  • "It is more difficult to be a loser in a game you set up yourself."
  • "I am sometimes taken aback by how people can have a miserable day or get angry because they feel cheated by a bad meal, cold coffee, a social rebuff, or a rude reception. We are quick to forget that just being alive is an extraordinary piece of good luck, a remote event, a chance occurrence of monstrous proportions."

Notes

  • Black Swan: outlier with extreme impact that gets explained (despite not being predicted) after the fact.
  • What is surprising is not the magnitude of our forecasting errors, but our absence of awareness of them.
  • The reason markets work is that they allow people to be lucky, not that they reward skill.
  • Strategy: tinker to collect as many positive Black Swan opportunities as you can.
  • Humans learn specifics, not generalities; facts instead of rules.
  • We have poor memories; we tend to distort things in retrospect.
  • Categorizing always produces a reduction in true complexity (Platonification).
  • Law of Mediocristan: when your sample is large, no single instance will significantly change the aggregate or total.
  • Ex: height, weight.
  • Extremistan: inequalities are such that a single observation can disproportionately impact the aggregate, or total (contrasted with Mediocristan in the first sketch after these notes).
  • Ex: wealth.
  • Note: Extremistan does not always imply Black Swans.
  • Black Swans occur relative to your expectations (a sucker’s problem).
  • Generally, negative Black Swans happen quickly, positive ones take a while to show.
  • The turkey problem: a turkey lives for 1,000 days, believing nothing will ever happen to it, then is killed on Thanksgiving. Until Thanksgiving, there was no evidence of the possibility of a Black Swan.
  • Round-trip fallacy: confusing absence of evidence with evidence of absence.
  • Humans tend to be poor at transferring knowledge from theory to practice, and from one domain to another (domain specificity).
  • Negative instances are much more useful - if we see counter-evidence, we know something is not true.
  • Restated: you know what is wrong with much more confidence than you know what is right.
  • Confirmation bias: our tendency to only look for corroborating evidence.
  • Narrative fallacy: our tendency to summarize/simplify/tell a story to explain something, when in reality it is more complex.
  • We tend to remember and value things that fit a clear narrative over those that don’t seem to play a causal role in it.
  • Counter this by keeping a diary.
  • We tend to use narratives and emotion more than slow, logical thinking.
  • To counter this, favour experimentation over storytelling, experience over history and clinical knowledge over theories.
  • Happiness depends far more on the number of instances of positive feelings, rather than their intensity. The same works in reverse (with bad news).
  • Bleed strategy: lose steadily, for a long time, except for a rare event where you make a lot. No single event can blow you up, while a single one can make your profit for a lifetime.
  • Anthropic bias: do not compute odds from the vantage point of a winning gambler, but from all those who started in the cohort.
  • This applies to successful figures, ourselves, etc. - we only see the successes.
  • Similarly: positive actions are much more easily examined than negative ones; it is very hard to see the consequences of a lack of action, yet they may be much better.
  • Be careful with your use of “because” - use it only for experimental results, not backwards-looking history.
  • Ludic fallacy: assuming that in real life we know the odds, as we do in games.
  • The last thing you need to do when dealing with uncertainty is focus - “focus” makes you a sucker; it translates to prediction errors.
  • Always question the error rate of an expert’s procedure (the confidence, not the procedure).
  • Things that move (are dynamic) don’t seem to have experts.
  • It doesn’t matter how often you’re right, only the magnitude of your cumulative errors.
  • Typical excuses for prediction errors:
  • You say you’re playing a different game.
  • You invoke the outlier.
  • The “almost right” defense.
  • The unexpected has a one-sided effect with projects. Consider the track records of builders, paper writers, and contractors. The unexpected almost always pushes in a single direction: higher costs and a longer time to completion.
  • With new projects (war, etc.), errors explode upwards.
  • Errors also commonly come from outside the model, or the expertise of the person who built the model.
  • Corporate and government projections have an additional easy-to-spot flaw: they do not attach a possible error rate to their scenarios.
  • You would take a different set of clothes on your trip to some remote destination if I told you that the temperature was expected to be seventy degrees Fahrenheit, with an expected error rate of forty degrees than if I told you that my margin of error was only five degrees. The policies we need to make decisions on should depend far more on the range of possible outcomes than on the expected final number.
  • The second fallacy lies in failing to take into account forecast degradation as the projected period lengthens. We do not realize the full extent of the difference between near and far futures.
  • The third fallacy, and perhaps the gravest, concerns a misunderstanding of the random character of the variables being forecast. Owing to the Black Swan, these variables can accommodate far more optimistic—or far more pessimistic—scenarios than are currently expected.
  • Ultimately, the worst case is far more consequential than the forecast itself.
  • Prediction requires knowing about technologies that will be discovered in the future. But that very knowledge would almost automatically allow us to start developing those technologies right away. Ergo, we do not know what we will know.
  • The Black Swan asymmetry allows you to be confident about what is wrong, not about what you believe is right.
  • We overestimate the effects of both pleasant and unpleasant events on our lives.
  • “Randomness” = opacity (or lack of knowledge, unknowledge).
  • Rank beliefs not according to their plausibility but by the harm they may cause.
  • Seize opportunities, and learn to distinguish between positive contingencies and negative ones; learn to open yourself to positive Black Swans and protect yourself from negative ones (aka the Barbell strategy; a toy allocation is sketched after these notes).
  • The Matthew Effect: an initial advantage provides a cumulative advantage in the long term.
  • Note: failure is also cumulative.
  • Remember this: the Gaussian–bell curve variations face a headwind that makes probabilities drop at a faster and faster rate as you move away from the mean, while "scalables", or Mandelbrotian variations, do not have such a restriction.
  • For any large total, the breakdown will be more and more asymmetric.
  • If there are strong forces of equilibrium bringing things back rather rapidly after conditions diverge from equilibrium, then again you can use the Gaussian approach. Otherwise, fuhgedaboudit.
  • Note the following principle: the rarer the event, the higher the error in our estimation of its probability—even when using the Gaussian.
  • Things that don’t apply outside Gaussian distributions: standard deviation, correlation and regression.
  • Fractal distributions (aka power laws): a small variation in the exponent causes a large deviation in tail probabilities, so the estimate is very sensitive to error. The exponent is also hard to compute, applies only beyond some "crossover" point, and tends to be overestimated (i.e. the Black Swan is underestimated); see the tail-probability sketch after these notes.
  • However, operating with this knowledge is still much better than assuming a Gaussian distribution.
  • This thinking turns many Black Swans into gray swans (i.e. modelable extreme events), but true Black Swans remain: they are unknown unknowns.
  • I will repeat the following until I am hoarse: it is contagion that determines the fate of a theory in social science, not its validity.
  • Skeptical empiricism advocates the opposite method. I care about the premises more than the theories, and I want to minimize reliance on theories, stay light on my feet, and reduce my surprises. I want to be broadly right rather than precisely wrong.
  • I am most often irritated by those who attack the bishop but somehow fall for the securities analyst—those who exercise their skepticism against religion but not against economists, social scientists, and phony statisticians. Using the confirmation bias, these people will tell you that religion was horrible for mankind by counting deaths from the Inquisition and various religious wars. But they will not show you how many people were killed by nationalism, social science, and political theory under Stalinism or during the Vietnam War.
  • ...my antidote to Black Swans is precisely to be noncommoditized in my thinking. But beyond avoiding being a sucker, this attitude lends itself to a protocol of how to act—not how to think, but how to convert knowledge into action and figure out what knowledge is worth.
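
To make the Mediocristan/Extremistan contrast above concrete, here is a minimal Python sketch of my own (not from the book): it samples heights from a normal distribution and wealth from a heavy-tailed Pareto distribution, then checks how much the single largest observation contributes to the total. The distribution parameters are illustrative assumptions.

```python
# Illustration (not from the book): share of the total contributed by the
# single largest observation in Mediocristan vs. Extremistan.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Mediocristan: heights in cm, roughly normal.
heights = rng.normal(loc=170, scale=10, size=n)

# Extremistan: wealth, heavy-tailed (classical Pareto with a low tail exponent).
wealth = (rng.pareto(a=1.2, size=n) + 1) * 10_000

for name, sample in [("height", heights), ("wealth", wealth)]:
    share = sample.max() / sample.sum()
    print(f"{name}: largest observation is {share:.6%} of the total")

# Typical output: the tallest person contributes a vanishing fraction of total
# height, while the richest person can account for a large share of total wealth.
```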
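
The barbell note above also lends itself to a toy numeric sketch, again my own illustration with made-up parameters (a hypothetical 90/10 split between a near-riskless asset and speculative bets that usually expire worthless but occasionally pay 30x). The point is only the shape of the outcomes: the downside is capped, the upside is open-ended.

```python
# Toy barbell sketch (made-up numbers): 90% in a near-riskless asset,
# 10% spread across speculative bets that mostly go to zero but rarely pay off big.
import numpy as np

rng = np.random.default_rng(1)

def barbell_return(safe_rate=0.02, n_bets=10, hit_prob=0.05, payoff=30.0):
    """One-period portfolio return under the assumed (hypothetical) parameters."""
    safe_part = 0.90 * (1 + safe_rate)                    # the conservative leg
    hits = rng.random(n_bets) < hit_prob                  # which bets pay off
    risky_part = (0.10 / n_bets) * (hits * payoff).sum()  # losing bets go to zero
    return safe_part + risky_part - 1.0

returns = np.array([barbell_return() for _ in range(10_000)])
print(f"worst year:  {returns.min():+.1%}")    # bounded near -8% (all bets lost)
print(f"median year: {np.median(returns):+.1%}")
print(f"best year:   {returns.max():+.1%}")    # open-ended upside
```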
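
Finally, the note on the fractal exponent's sensitivity can be shown directly. A minimal sketch, assuming a simple Pareto survival function P(X > x) = (x / x_min)^(-alpha) and comparing it with a Gaussian tail; the particular values of alpha are arbitrary.

```python
# Illustration (my own, not from the book): how the tail probability P(X > x)
# responds to small changes in a power-law exponent, versus a Gaussian whose
# tail thins at a faster and faster rate.
from math import erf, sqrt

def pareto_tail(x, alpha, x_min=1.0):
    """Survival function of a classical Pareto distribution: P(X > x)."""
    return (x / x_min) ** (-alpha)

def gaussian_tail(x, mu=0.0, sigma=1.0):
    """Survival function of a normal distribution: P(X > x)."""
    return 0.5 * (1 - erf((x - mu) / (sigma * sqrt(2))))

x = 100.0  # an event 100x the minimum, or 100 standard deviations out
for alpha in (1.1, 1.3, 1.5):
    print(f"alpha={alpha}: P(X > {x:g}) = {pareto_tail(x, alpha):.2e}")
print(f"Gaussian:   P(X > {x:g} sigma) = {gaussian_tail(x):.1e}")

# A shift of 0.2 in alpha changes the tail probability at x = 100 by a factor
# of about 2.5, while the Gaussian assigns such an event a probability
# indistinguishable from zero.
```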

Want to get my latest book notes? Subscribe to my newsletter to get one email a week with new book notes, blog posts, and favorite articles.
