The Art of Thinking Clearly by Rolf Dobelli: Summary & Notes

Rated: 9/10

Available at: Amazon

ISBN: 0062219693

Related: The Black Swan, Thinking, Fast and Slow

Summary

A fantastic book summarizing a variety of biases that affect our thinking and decision-making.

Dobelli leans heavily on people like Kahneman, Taleb, and others to build this extensive list (99 items!) of things to watch out for.

Well worth the read, and will likely require revisiting when making decisions.

Notes

These notes are a little different from my typical ones. I’ve summarized all the biases below, which can be considered the “book notes”. Then I’ve also put together a list of questions one can use when making decisions to try to counter these biases.

Biases

  • Survivorship bias: we tend to only hear about the successes or “survivors” - we don’t hear the stories of the failures, and thus overestimate the chances of success.

  • Swimmer’s body illusion: confusing the factor for selection with the result (ex: swimming gives you a great frame; actually, great swimmers are born with a good frame for swimming).

  • Clustering illusion: we tend to see patterns where there aren’t any.

  • Social proof: we feel we are behaving correctly when we act the same as other people.

  • Sunk cost fallacy: when we consider the costs incurred to date as a factor in our decision-making. Only your assessment of the future costs and benefits should count.

  • Reciprocity: we feel we owe something in return whenever we accept a favour or free item.

  • Confirmation bias: we interpret evidence to support our existing beliefs.

  • To counter, set out to find disconfirming evidence for your hypothesis.

  • Authority bias: we tend to defer to authority, and consider the opinions of supposedly authoritative people too strongly.

  • Contrast effect: we judge things in relation to other things. We also don’t notice small, gradual changes.

  • Availability bias: we create a picture of the world, or construct arguments, based on examples and evidence that most easily come to mind.

  • Counter by spending time with people who think differently than you do.

  • It’ll-get-worse-before-it-gets-better fallacy: a variation of confirmation bias. An expert who predicts things must get worse before they get better can’t lose: if the problem persists, the prediction is confirmed; if things improve, he can attribute the improvement to his prowess.

  • Story bias: we try to shape everything into stories.

  • Hindsight bias: in retrospect, everything seems clear and inevitable.

  • Overconfidence effect: we systematically overestimate our knowledge and our ability to predict.

  • Chauffeur knowledge: the knowledge required to make it appear as though someone understands something, when in fact they do not.

  • Illusion of control: we believe we influence far more than we actually do.

  • Incentive super-response tendency: people respond to incentives by doing what is in their own best interests, not necessarily what the incentive was meant to encourage.

  • Regression to the mean: average values will fluctuate around a mean. Decreased or increased performance may simply be these random fluctuations, not due to an identifiable cause.
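
Regression to the mean is easy to see in a quick simulation (all numbers hypothetical): give everyone identical skill plus random noise, select the “worst performers”, and re-test them with no intervention at all.

```python
import random

random.seed(1)

# Everyone has identical skill (50); each test score adds pure noise,
# so all performance differences are luck.
def score():
    return 50 + random.gauss(0, 10)

n = 1000
first = [score() for _ in range(n)]
second = [score() for _ in range(n)]  # re-test, no intervention

# Select the 100 "worst performers" from the first test.
worst = sorted(range(n), key=lambda i: first[i])[:100]

avg_first = sum(first[i] for i in worst) / len(worst)
avg_second = sum(second[i] for i in worst) / len(worst)
print(f"worst group, test 1: {avg_first:.1f}")   # far below 50
print(f"worst group, test 2: {avg_second:.1f}")  # back near 50
```

Any coaching applied between the two tests would have looked effective, even though the rebound is pure chance.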

  • Outcome bias: we tend to evaluate decisions based on the result, instead of the process.

  • Paradox of choice: an abundance of choice leads to inner paralysis, poorer decisions, and unhappiness with our decisions.

  • Liking bias: the more we like someone, the more we want to buy from or help that person.

  • Endowment effect: we consider things to be more valuable the moment we own them.

  • Coincidence: we tend to attribute meaning or causation to unlikely events, when in reality they are likely just chance.

  • Groupthink: in groups, we tend to avoid contradiction, and we tend to agree with the majority conclusion.

  • Neglect of probability: we lack an intuitive grasp of probability, and tend to respond to the expected magnitude of an event rather than its likelihood.

  • Scarcity error: when we are deprived of an option, we suddenly deem it more attractive.

  • Base-rate neglect: we disregard the basic distribution levels for a given outcome. 

  • Often exacerbated by giving more detail (narrative fallacy contributes).

  • Also made worse by survivorship bias.
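
Base-rate neglect is easiest to feel with Bayes’ rule and some made-up but plausible numbers: a rare condition (1 in 1,000) and a test that catches 99% of true cases but false-alarms on 5% of healthy people.

```python
# All numbers hypothetical: base rate 0.1%, sensitivity 99%,
# false-positive rate 5%.
base_rate = 0.001
sensitivity = 0.99
false_positive_rate = 0.05

# P(positive) = true positives + false positives
p_positive = base_rate * sensitivity + (1 - base_rate) * false_positive_rate

# Bayes' rule: P(condition | positive test)
p_condition = base_rate * sensitivity / p_positive
print(f"P(condition | positive test) = {p_condition:.1%}")  # about 1.9%, not 99%
```

Ignoring the tiny base rate makes a positive test feel near-certain; factoring it in, a positive result is still overwhelmingly likely to be a false alarm.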

  • Gambler’s fallacy: we tend to mix up independent and dependent events (e.g. the ball has landed on black 10 times, so red must be due soon).

  • “What goes around comes around” is just false.
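
A simulation makes the independence point concrete (a fair red/black wheel, purely illustrative): the chance of red immediately after a streak of blacks is still about 50%.

```python
import random

random.seed(0)

# One million independent spins of a fair red/black wheel.
spins = [random.choice("RB") for _ in range(1_000_000)]

# Collect the outcome immediately following every run of 5 blacks.
after_streak = [spins[i] for i in range(5, len(spins))
                if spins[i - 5:i] == list("BBBBB")]

p_red = after_streak.count("R") / len(after_streak)
print(f"P(red | 5 blacks in a row) = {p_red:.3f}")  # ~0.5; the streak changes nothing
```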

  • Anchors: when we estimate something, we start from a value we are sure of and adjust from there, even when the anchor is irrelevant.

  • Induction: the inclination to draw universal certainties from individual (typically past) observations.

  • The turkey problem - the turkey lives a great life until Thanksgiving.

  • Loss aversion: the fear of losing something motivates people more than the prospect of gaining something of equal value.

  • Social loafing: when people work together (and individual performance is not directly visible), their individual performance decreases.

  • Exponential growth: we do not have a good intuitive feel for exponential growth (vs. linear growth).
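
A few lines of arithmetic (illustrative numbers) show how badly linear intuition undershoots: compare 7% compound growth with a flat gain of 7 units per year, and count the doubling time.

```python
# A "modest" 7% annual growth rate, compounded, vs. the same absolute
# gain (7 units/year) applied linearly. All numbers illustrative.
start = 100.0

exp_35 = start * 1.07 ** 35
lin_35 = start + 7 * 35
print(round(exp_35), round(lin_35))  # roughly 1068 vs 345 after 35 years

# Doubling time: count the years until the value doubles.
value, years = start, 0
while value < 2 * start:
    value *= 1.07
    years += 1
print(years)  # 11, close to the rule-of-thumb estimate 70 / 7 = 10
```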

  • Winner’s curse: the winner of an auction often turns out to be the loser, having overpaid.

  • Fundamental attribution error: the tendency to overestimate the influence of an individual, and underestimate external, situational factors.

  • False causality: when we mix up correlation with causation.

  • Halo effect: when a single aspect dazzles us, and we fail to see the larger picture or evaluate other factors objectively.

  • Alternative paths: we fail to consider all the outcomes which could have happened, and therefore underestimate risk.

  • Forecast illusion: we tend to believe forecasts, despite forecasters’ poor track record and the low cost to them of being wrong.

  • Conjunction fallacy: when a specific subset seems more likely than the broader set that contains it.

  • A result of our attraction to plausible stories.
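
The arithmetic behind the fallacy is unforgiving: a conjunction can never be more probable than either of its parts. A tiny simulation with hypothetical traits confirms it.

```python
import random

random.seed(2)

# Hypothetical population: each person independently has trait A (rare)
# and trait B (common).
people = [(random.random() < 0.02, random.random() < 0.3)
          for _ in range(100_000)]

p_a = sum(a for a, b in people) / len(people)
p_a_and_b = sum(a and b for a, b in people) / len(people)

# The subset "A and B" can never outnumber the set "A".
print(p_a_and_b <= p_a)  # True, always
```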

  • Framing: we react differently to identical situations, depending on how they are presented.

  • Action bias: we feel compelled to do something, particularly in new or shaky circumstances, even if we have made things worse by acting too quickly or too often.

  • Omission bias: we tend to prefer inaction whenever both action and inaction lead to cruel consequences.

  • Self-serving bias: we attribute success to ourselves and failure to external circumstances.

  • Hedonic treadmill: we adjust to new circumstances, and are unable to correctly predict our own emotions in response to new circumstances.

  • Self-selection bias: we distort outcomes and conclusions when the sample selects itself (or when we select it poorly).

  • Association bias: we make false connections between things that are not linked.

  • Example: we condemn the bearers of bad news, due to the negative nature of the message.

  • Beginner’s luck: we create a false link with early, past results.

  • Cognitive dissonance: when inconsistencies in our thoughts, beliefs, or attitudes cause us to reinterpret events to keep things consistent.

  • Hyperbolic discounting: the option of “now” causes us to make inconsistent decisions; we value immediate rewards far more than delayed ones.

  • “Because” justification: introduction of a reason (any reason) increases our compliance.

  • Decision fatigue: willpower erodes throughout the day, particularly when we haven’t eaten or slept.

  • Contagion bias: we are incapable of ignoring the connection we feel to certain items, even if from long ago or of indirect relation.

  • Problems with averages: averages often mask the underlying distribution. 

  • Never cross a river that is “on average” four feet deep.

  • The Bill Gates phenomenon.
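
The Bill Gates phenomenon in a few lines (hypothetical incomes): one extreme outlier drags the mean far away from the typical value, while the median barely notices.

```python
# Ten bar patrons, each with a (hypothetical) $40,000 income,
# and then an outlier worth $1 billion walks in.
incomes = [40_000] * 10
incomes.append(1_000_000_000)

mean = sum(incomes) / len(incomes)
median = sorted(incomes)[len(incomes) // 2]

print(f"mean:   {mean:,.0f}")   # ~90.9 million; describes no one in the room
print(f"median: {median:,}")    # 40,000; the typical patron
```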

  • Motivation crowding: small monetary incentives may crowd out other types of incentives.

  • Twaddle tendency: reams of words used to disguise intellectual laziness, stupidity, misunderstanding or underdeveloped ideas. Often used in conjunction with authority bias.

  • Will Rogers phenomenon: raising the average of two groups simply by moving an item from one group to the other, with no underlying change.

  • Example: if you move the poorest individual from the richer group to the poorer group (while he is still richer than everyone in that group), the average net worth of both groups increases.
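
This is counterintuitive enough to be worth checking by hand (hypothetical net worths, in thousands):

```python
# Two hypothetical groups of net worths (in $k).
high = [100, 200, 300]   # average 200
low = [10, 20, 30]       # average 20

# Move the poorest member of the high group (still richer than
# everyone in the low group) across.
high.remove(100)
low.append(100)

print(sum(high) / len(high))  # 250.0: the high group's average rose
print(sum(low) / len(low))    # 40.0: the low group's average rose too
```

Nothing of value was created or destroyed; only the category boundaries moved.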

  • Information bias: the delusion that more information guarantees better decisions.

  • Effort justification: if you put a lot of effort into a task, you tend to overvalue the result.

  • Law of small numbers: when we infer characteristics of the overall population from a small sample, when in fact small samples are much more subject to random variation.
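
A coin-flip simulation (arbitrary thresholds) shows why small samples mislead: call a sample “extreme” if its heads-rate falls below 30% or above 70%, and compare sample sizes.

```python
import random

random.seed(3)

# Fraction of samples whose heads-rate is "extreme" (<30% or >70%)
# when flipping a fair coin, for a given sample size.
def extreme_rate(sample_size, trials=5_000):
    extremes = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        rate = heads / sample_size
        if rate < 0.3 or rate > 0.7:
            extremes += 1
    return extremes / trials

small = extreme_rate(10)    # ~11% of 10-flip samples look extreme
large = extreme_rate(1000)  # essentially none of the 1000-flip samples do
print(small, large)
```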

  • Expectations: expectations form our reaction to various events, and contribute to our happiness.  Set expectations high for yourself and the people you love, and lower them for things you cannot control.

  • Simple logic: we tend to default to intuition because it is less taxing.

  • Forer effect (aka Barnum effect): we tend to identify with positive traits in general descriptions, believing pseudosciences as a result.

  • Volunteer’s folly: volunteering our time is less efficient (because we do these jobs less effectively) than contributing our earnings for the equivalent amount of time. Exception: celebrities.

  • Affect heuristic: when we make complex decisions by consulting our emotions, instead of considering the risks and benefits independently.

  • Introspection illusion: the belief that reflection leads to truth or accuracy. 

  • Inability to close doors: we tend to prefer leaving options open, thinking they are free, when in reality they have a cost in distracting us.

  • Neomania: we overvalue things merely because they are new, regardless of their actual benefits.

  • Sleeper effect: if propaganda/advertising strikes a chord, its influence can grow over time, because we forget the (dubious) source faster than the message.

  • Alternative blindness: we systematically forget to compare an existing offer with the next-best alternative.

  • Social comparison bias: we tend to withhold assistance from people who might outdo us, even if we look like fools in the long run.

  • Primacy and recency effects: the first piece of information, or the most recent, holds larger sway over us.

  • Not-invented-here syndrome: when we think anything we create ourselves is unbeatable.

  • The Black Swan: an unthinkable event that massively affects your life, career, company, country.

  • Domain dependence: insights from one field do not pass well to another.

  • False-consensus effect: we overestimate the unanimity of others, believing they think and feel exactly like we do.

  • Falsification of history: our memories are riddled with inaccuracy.

  • In-group out-group bias: groups form based on minor criteria. We perceive people outside our group to be more similar to each other than they actually are (stereotypes start here), and we perceive support within the group as disproportionately strong.

  • Ambiguity aversion: we favour known probabilities over unknown ones.

  • Default effect: we prefer the status quo.

  • Fear of regret: when we fail to act to avoid potentially feeling regret. 

  • Salience effect: prominent features have an undue influence on how we think and act. We neglect hidden, slow-to-develop factors.

  • House-money effect: we treat money that we win, discover, or inherit much more frivolously than hard-earned cash.

  • Procrastination: the tendency to delay unpleasant but important acts.

  • Envy: when we compare ourselves on the basis of ownership, status, health, youth, talent, popularity or beauty. The subject of envy is a thing, whereas the subject of jealousy is the behaviour of a third person.

  • Personification: we empathize with other people when the human aspect is visible.

  • Illusion of attention: we are confident that we notice everything in front of us, despite only seeing what we are focused on.

  • Strategic misrepresentation: the more that is at stake, the more exaggerated our assertions become.

  • Overthinking: if you think too much, you will lose the wisdom of your emotional response. 

  • Planning fallacy: we overestimate benefits, and underestimate the risks, costs and duration of a project.

  • Déformation professionnelle: experts will tend to solve problems using their expertise, not necessarily the best method.

  • “To the man with a hammer, every problem is a nail.”

  • Zeigarnik effect: uncompleted tasks nag at us until we have a clear idea of how we will deal with them; only then can we set them aside.

  • Illusion of skill: luck plays a larger role than skill in many domains, like entrepreneurship and leadership. Skill is necessary but not sufficient.

  • Feature-positive effect: we place a greater emphasis on what is present than what is absent.

  • Cherry picking: selecting and showcasing the most attractive features and hiding the rest.

  • Fallacy of the single cause: the belief that a single factor caused an event or phenomenon.

  • Intention-to-treat error: when cases that drop out (e.g. failed projects) end up in the wrong category, flattering the results.

  • News illusion: we believe news is important, when in reality it is not; it is designed to attract our attention, not to inform us.

Other general advice:

  • We cannot know what makes us successful or happy. Negative knowledge (what not to do) is much more valuable than positive knowledge (what to do).

  • In other words, eliminate errors and better thinking will follow.

  • In situations where consequences are large, try to be as rational as possible.

  • In situations where the consequences are small, let intuition take over (save your effort).

  • Also let intuition take over when in your circle of competence.

Decision-Making Checklist

  • Is this an example of survivorship bias?

  • Am I confusing the factor for selection with the result?

  • Am I seeing a pattern where there isn’t one?

  • Am I changing my behaviour or opinion because others are doing/acting/thinking this way? Because of social proof?

  • Am I looking at only the future costs and benefits? Disregard any costs to date.

  • Do I feel obligated to return a favour here? Have they done something for me that might make me subject to reciprocity?

  • Can I find disconfirming evidence for my current hypothesis? What are the limitations of this evidence? How might someone with the opposing viewpoint interpret this evidence?

  • Is some sort of authority figure exerting an influence on me?

  • What am I judging this in relation to? How would this look in a different context, compared to something else? What sort of small, gradual changes might I be missing?

  • Am I overvaluing evidence because of my own experience or the ease with which I can recall it? Who can I get an opinion from who has a different expertise and experience than me?

  • What evidence would I have to see to make a judgement about whether this situation is improving?  What are clear and verifiable milestones?

  • Am I trying to shape this into a story? What is my confidence level that I actually understand this?

  • What predictions am I making about this? How confident am I? What historical decisions do I have recorded that might indicate my prediction level?

  • What is the pessimistic scenario here? How far off is my own prediction from this scenario?

  • Does this person (or do I) truly understand this situation? Or is it outside my circle of competence?

  • What specific things can I actually control in this situation?

  • What incentives are at play here? How do they likely affect the behaviour of those involved?

  • Could this situation be explained by random variation, or regression to the mean?

  • Was the process behind this good or bad, regardless of the result? Do I have enough evidence to evaluate the effectiveness of the process?  What information did I have at the time?

  • How can I reduce the number of choices here? What are the key factors I want to evaluate?

  • Do I like this person? Is that affecting my decision-making process?

  • Am I valuing this too highly because it is already mine? What does the market think?

  • How unlikely is this event? Could it be caused by random chance?

  • What is the devil’s advocate view of this situation? Have we expressed our opinions independently?

  • What is the rational response based on the probability and consequences of this event? What is the expected value or risk?

  • Have I assessed this option based solely on costs and benefits? How would I evaluate it if it were available in abundance?

  • What is the base rate in this situation? Is there an analogous situation I can rely on?

  • What factors are independent and which are dependent in this situation?

  • What anchors might I be using here when I shouldn’t be?

  • What are the objective upsides and downsides here? Am I overweighting the downside, or the fear of loss?

  • Are we behaving differently here because we are a group? How are we evaluating individual performance?

  • Is there an exponential factor at play here? Or is it linear?

  • Am I competing with someone here? Is that changing my behaviour? What is my “line in the sand” if I’m bidding for something? Can I avoid an auction situation?

  • What are the broader factors influencing the situation here? What degree of influence do they really have?

  • Is there actually a link between these two factors? How do we know that one causes the other? How do we know they are linked at all?

  • What are the limits of this piece of information? Is it causing me to look at other things favourably or unfavourably?

  • If I try and evaluate from an outside view, what are all the possible outcomes for this situation? What are the associated risks with each path?

  • What incentives is this person subject to? Is there a downside if the prediction is wrong? How good is his success rate?

  • Am I dealing with a subset here? Am I trying to fit a plausible story to the situation?

  • What if I present this situation in the opposite way? How does that change my perception?

  • Am I just trying to act here? What if I just wait? Will I be able to better assess my options?

  • Am I avoiding a particular path because the consequences are bad, but less bad than inaction?

  • What bluntly honest friends, or enemies, could I ask for an honest assessment of strengths and weaknesses?

  • Will this lead to long-term or short-term happiness? Would this lead to something guaranteed to be negative?

  • How does this sample affect the conclusions I’m trying to make? What would be the ideal sample?

  • Am I transferring qualities between things that are unrelated? Am I shooting the messenger?

  • Is the sample size enough to make a conclusion about luck vs. skill here? Are there a large number of players (making random winners likely)? Can I disprove my conclusion?

  • Am I trying to reinterpret things to maintain a previous attitude or belief?

  • Am I making an impulsive decision right now? Am I playing the long game or short game?

  • Is the reasoning behind this sound, or am I just going along with a “because” reason?

  • Am I making this decision fresh? Am I well-rested and well-fed?

  • Do I have a connection to this in some way?

  • Does the average mean anything in this situation? What is the actual underlying distribution?

  • Are financial incentives clouding my judgement? Are they crowding out other incentives for the people involved here?

  • What is being said here? Is it actually useful?

  • How are these factors grouped? How has it changed? Have the groups been rearranged to manipulate the averages?

  • What information is actually useful here? Am I falsely increasing my confidence levels because of additional, but useless information?

  • Am I overvaluing parts of this because I put effort into them?  What is the value of the result, discounting the process and effort put in?

  • Is this sample size sufficient to draw conclusions? Or am I in fact extrapolating too far from a small sample?

  • What expectations am I holding about this situation? Are they appropriate? What is the worst-case scenario?

  • Am I evaluating this situation rationally? Or using intuition?

  • Could this information apply to anyone? Are there any negatives, or are they all positive traits?

  • Is this the best use of my time?

  • Are my feelings about this subject, topic, or my current feelings contributing to my evaluation?

  • Am I being critical with myself? How would I regard these internal observations if they were coming from someone else?

  • Am I just trying to keep options open? What should I focus on not pursuing?

  • Am I overvaluing this option because of the novelty?

  • What is the source of this argument or opinion?

  • What is the next best alternative to this option?

  • Am I avoiding an option out of fear or jealousy of someone or something outdoing me?

  • Am I overvaluing this information because it was the first I’d heard? Or because I heard it more recently?

  • Am I overvaluing my own ideas? Who can give me an objective opinion?

  • Have I put us in a position to guard against negative Black Swans? And take advantage of positive Black Swans?

  • Am I within my circle of competence? Or am I trying to transfer knowledge from one domain to another?

  • How do other people feel? What are their opinions? Have I truly gathered information about them?

  • Do I know for sure this happened, or am I relying on memory?

  • What groups are currently affecting my thinking? Have I sought opinions from outside my group?

  • Am I falsely relying on probabilities just to avoid ambiguity?

  • Would I make this same decision from a different position, if the status quo was different?

  • Am I avoiding a decision out of fear of regret?

  • Am I attributing undue weight to this factor because of its prominence? Which inconspicuous factors am I failing to value?

  • Is my behaviour different because I won this money or got something for free?

  • Am I avoiding this because it’s unpleasant? Can I set a deadline to force myself to get this done? Can I make a public commitment?

  • Am I envious of something here?

  • What are the facts and statistical distribution behind this story? Is the human aspect causing bias?

  • Am I focusing on something here? What am I missing? What other scenarios are possible?

  • What is the past performance behind this claim? Are there other situations similar to this where I can find data? What safeguards do I have in place?

  • Is this a complex situation, or could I rely somewhat on my emotions?

  • What similar projects can I look at for objective data on my situation? What does the pre-mortem look like here?

  • Have I gathered a number of sufficiently different perspectives to see how experts with different tools would solve this?

  • Have I gone into enough detail in the plan on how to deal with this situation?

  • Is there an illusion of skill here? Is this likely due to chance, or is there a demonstrated record of success?

  • What features or factors am I missing here? Why do these factors exist instead of nothing?

  • What has been cherry-picked here? Where are the negative results?

  • Am I falsely attributing this to a single cause?

  • What test subjects or information has been removed from the sample?

  • Is this valuable information or just news?