Thinking in Bets Book Summary

Making Smarter Decisions When You Don't Have All the Facts

Book by Annie Duke

Summary

In "Thinking in Bets," Annie Duke draws on her experience as a professional poker player to share strategies for making sound decisions under uncertainty, such as thinking probabilistically, learning from outcomes, surrounding yourself with truthseeking groups, and using mental time travel to pressure-test beliefs and plans.

Life Is Poker, Not Chess

In Chapter 1, Annie Duke argues that life is more like poker than chess. Chess contains no hidden information and involves little luck, so the better player almost always wins. In poker, as in life, there are hidden variables and luck is a real factor. Even the best decision doesn't always lead to a good outcome, and a bad decision can sometimes work out through sheer luck. Duke's signature example is Pete Carroll's goal-line pass call in Super Bowl XLIX: the same play call would have been deemed brilliant rather than idiotic if it had worked.

Section: 1, Chapter: 1

"I'm Not Sure": Using Uncertainty To Our Advantage

One of the key insights is that we should get more comfortable saying "I'm not sure" and acknowledging uncertainty. Poker players know that they can never be fully certain if their decision is right, due to incomplete information. They focus on making the best decision possible given what they know. We should do the same in life - make the best choices we can while accepting that we don't know everything. Don't be afraid to express uncertainty, as it makes you more credible. Redefine "wrong" to mean the decision-making process was flawed, not that the outcome was bad due to factors beyond your control.

Section: 1, Chapter: 1

"Wanna Bet?" On Your Beliefs

Annie Duke introduces the powerful question "Wanna bet?" as a way to test the strength of your beliefs. When someone challenges you with a bet, it forces you to consider:

  • Why do I believe this?
  • How much information do I have to support it?
  • Under what circumstances might my belief not be true?

"Wanna bet?" triggers you to vet your beliefs and express your level of confidence in them accurately, rather than just assuming they are 100% true. This is what poker players and good decision makers do constantly. Duke argues we should all adopt this betting framework for our beliefs and predictions.

Section: 1, Chapter: 2

We Form Abstract Beliefs In A Flawed Way

Duke cites research showing that we form abstract beliefs in a backwards way - we hear something, assume it's true, and only sometimes get around to vetting it later, if at all.

Experiments found that even when questionable information was clearly labeled as false, people still tended to process it as true, especially when under time pressure. Our brains evolved to assume things we hear are true because doubting everything would be cognitively inefficient. But this means many of our beliefs about the world are not properly vetted. Even when presented with contradictory evidence, we often still cling to existing beliefs.

Section: 1, Chapter: 2

The Stubbornness Of Beliefs

To illustrate how stubbornly we hold onto beliefs even in the face of contrary evidence, Duke cites a famous study called "They Saw a Game." Researchers showed a film of a heated football game between Dartmouth and Princeton to students from those two schools.

Despite watching the same clip, the two groups had wildly different interpretations of what happened based on their tribal loyalties. The Princeton students saw Dartmouth players commit twice as many infractions as their own team, while Dartmouth students thought the teams committed fouls equally. This showed how powerfully our beliefs shape how we interpret objective events to confirm our existing views.

Section: 1, Chapter: 2

"Beliefs Are Like A Large Pile Of Matches, Not Cards"

"Beliefs, in most cases, aren't like cards that can be flipped easily when the facts change. Beliefs are like a large pile of matches that can ignite at the slightest provocation. Unlike cards, matches are hard to extinguish once they get going. We can keep throwing more and more facts on the fire and yet, in the face of the evidence, the beliefs remain ablaze."

Section: 1, Chapter: 2

Being Smart Can Make Bias Worse

Surprisingly, being more intelligent and knowledgeable can actually make bias worse in some cases. The smarter you are, the better you are at finding reasons to support your existing beliefs and explaining away or discounting contradictory evidence. Very intelligent people with more information at their disposal can more easily rationalize away facts that don't fit their opinions. This means even very smart, educated people are still highly prone to biased and motivated reasoning in defense of their beliefs. Raw intelligence alone doesn't lead to objectivity.

Section: 1, Chapter: 2

Fielding Outcomes - Separating Luck And Skill

Chapter 3 focuses on how to learn productively from outcomes. Duke argues we must get better at "fielding" outcomes - determining whether they were due to the quality of our decisions (skill) or factors beyond our control (luck).

Poker players know that even good decisions can lead to bad outcomes and vice versa due to luck. The challenge is that it's hard to tease apart the contributions of luck vs. skill. But if we attribute bad outcomes solely to luck, we miss opportunities to improve our choices. If we chalk up good outcomes solely to skill, we may wrongly reinforce bad habits.
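
A toy simulation makes the point vivid. This sketch is my illustration, not Duke's; the 60% edge and the number of bets are arbitrary assumptions:

```python
import random

def fraction_losing(win_prob: float, n_bets: int, sims: int = 100_000) -> float:
    """Fraction of simulated runs in which a favorable even-money bet
    (win_prob > 0.5) still ends with a net loss after n_bets."""
    losing_runs = 0
    for _ in range(sims):
        profit = sum(1 if random.random() < win_prob else -1
                     for _ in range(n_bets))
        if profit < 0:
            losing_runs += 1
    return losing_runs / sims

# A bet you win 60% of the time still leaves you down after
# 10 bets in roughly one run in six.
print(fraction_losing(win_prob=0.6, n_bets=10))  # ~0.17
```

In a sample that small, a string of losses says almost nothing about whether the underlying decision was good, which is exactly why fielding outcomes is hard.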

Section: 1, Chapter: 3

"Outcomes Don't Tell Us What's Our Fault And What Isn't"

"Outcomes don't tell us what's our fault and what isn't, what we should take credit for and what we shouldn't. Outcomes are rarely the result of our decision quality alone or chance alone, and outcome quality is not a perfect indicator of the influence of luck or skill. When it comes to fielding outcomes, we tend to focus on the quality of the outcome as the deciding factor between luck and skill."

Section: 1, Chapter: 3

The Danger Of Resulting - Judging Decisions Solely By Results

A classic example of the danger of judging decisions solely by their results is the rise in obesity that accompanied the low-fat diet craze. In the 1980s and '90s, public health officials encouraged people to shun fatty foods and embrace carbs and sugars instead, yet obesity rates skyrocketed.

In the moment, however, people eating "low-fat" but high-sugar snacks like SnackWell's cookies likely attributed any weight gain to bad luck or other factors. It took years to recognize that judging food quality by fat content alone was flawed. This is the peril of "resulting": assuming the quality of a decision can be judged solely by its outcome.

Section: 1, Chapter: 3

Self-Serving Bias - Taking Credit And Blaming Luck

One of the biggest barriers to learning from outcomes is self-serving bias - the tendency to attribute good outcomes to our own skill and bad ones to factors beyond our control. This is a universal human tendency - 91% of drivers in one study blamed others for their accidents, for instance.

Even when we make horrible decisions, like driving drunk and crashing, we often still find a way to blame external factors such as road conditions. Self-serving bias prevents us from acknowledging our true mistakes and learning from them. It feels better in the moment to chalk up failures to luck, but it prevents growth.

Section: 1, Chapter: 3

Why Self-Serving Bias Persists - Ego Protection

Self-serving bias persists because:

  1. Outcomes are usually a mix of luck and skill, so there is room for interpretation.
  2. Our brains hate feeling bad or wrong. Blaming a bad result on luck helps us maintain a positive self-image.
  3. We compare our outcomes to our peers. Blaming their good results on luck and ours on skill makes us feel better by comparison.

Accepting that not all good outcomes are 100% skill and not all bad ones are 100% luck would require constantly feeling somewhat wrong and bad. So our brains choose the easy path of protecting our egos instead of the more productive one of learning.

Section: 1, Chapter: 3

Recruiting Others To Debias Us

Ideally, we would just be able to recognize and overcome biases like self-serving bias through sheer force of will. But these patterns of thinking are so ingrained that individual willpower is rarely enough to change them. A better solution is to recruit others to help us see our blind spots.

Surround yourself with people who are on a "truthseeking" mission and aren't afraid to challenge you if your fielding of outcomes seems biased. Ideally, gather a group with diverse perspectives who are all committed to being open-minded, giving credit where due, and exploring alternative interpretations of events. Use them to vet your decision-making process, not just focus on outcomes.

Section: 1, Chapter: 4

The Ideal Decision Group - A Truthseeking Pod

The ideal decision group for debiasing and improving choices has the following traits:

  1. A commitment to rewarding and encouraging truthseeking, objectivity and openness
  2. Accountability - members must know they'll have to explain their choices to the group
  3. Diversity of perspectives to combat groupthink and confirmation bias

The group can't just be an echo chamber. There must be a culture of rewarding dissent, considering alternatives, and constantly asking how members might be wrong or biased. If you can find even 2-3 other people who share this ethos, you'll be far ahead of most decision makers.

Section: 1, Chapter: 4

Better Decision Making Is A Learned Skill

One of the key themes of Chapter 4 is that making better, less biased decisions is a learnable skill, not an innate ability. You can create habits and routines that will gradually improve your "batting average" on choices, even if it feels uncomfortable and unnatural at first.

Part of developing this skill is learning strategies for anticipating common decision traps, so you can spot them ahead of time and circumvent them. Groups can play a huge role by helping you catch flaws in your process in a timely way. Don't expect perfection, just aim to get a little more rational and objective over time. Those gains will compound.

Section: 1, Chapter: 4

Doctors Need Decision Assistance Too

Even highly-educated experts like doctors aren't immune to decision-making biases. When a patient presents with a cough, the doctor has to decide if it's due to a virus, allergies, acid reflux, cancer or other causes. Leaping to conclusions based on initial impressions leads to a lot of misdiagnoses.

That's why many medical facilities use checklists, group consultations, and decision aids to debias doctors. The Kaiser health system reduced the number of patients on strong opioids by 60% by having a second doctor review any long-term painkiller prescription. The outside perspective combated the prescribing doctor's faulty pattern-matching.

Section: 1, Chapter: 4

The Power Of Dissent - Encouraging Disagreement To Get To Truth

Chapter 5 explores how great decision-making groups don't just tolerate dissent; they actively encourage it. Duke cites the example of Alfred P. Sloan, the legendary CEO of General Motors. When all his executives agreed on a decision, Sloan said, "I propose we postpone further discussion of this matter until our next meeting to give ourselves time to develop disagreement and perhaps gain some understanding of what the decision is all about." He knew that the pursuit of truth required constantly stress-testing ideas and considering alternatives. Dissent isn't disloyal; it's necessary for getting to the best answer.

Section: 1, Chapter: 5

Mertonian Norms For Truthseeking Groups

To create a truthseeking culture, Duke recommends following the Mertonian norms developed by sociologist Robert Merton:

  1. Communism (data belongs to the group, not individuals)
  2. Universalism (evaluate ideas based on merit, not source)
  3. Disinterestedness (be willing to accept outcomes that go against your preferred position)
  4. Organized Skepticism (discussion is good, but agree to be bound by logic/evidence)

These principles help overcome biases like confirmation bias, motivated reasoning, and groupthink that can derail group decision making. They create an environment where the best ideas can surface and win out.

Section: 1, Chapter: 5

Adversarial Collaboration - Teaming Up With Rivals To Find Truth

A great example of building a truthseeking partnership is Daniel Kahneman's adversarial collaboration with Gary Klein. Kahneman, a cognitive psychologist, studied the flaws in human reasoning; Klein studied how experts make excellent snap judgments. Their views represented a major schism in the field.

But rather than just attacking each other's work, they decided to collaborate to get to the truth. They examined case studies together and ultimately arrived at a joint perspective - that expert intuition is powerful but only in areas with stable cues and lots of practice. Neither "won" the debate, but they both gained key insights by collaborating.

Section: 1, Chapter: 5

The Importance Of Expressing Uncertainty

One habit that aids truthseeking discussions, both in groups and one-on-one, is expressing uncertainty. Rather than stating opinions as facts, couch them in probabilistic terms. Say things like "I think there's a 60% chance that..." or "I'm pretty sure that X is the case, but I'm open to other views." Expressing uncertainty:

  1. Acknowledges that reality is complex and our knowledge is limited
  2. Makes people more willing to share dissenting opinions
  3. Sets the stage for you to change your mind gracefully if better evidence emerges

Expressing certainty, on the other hand, cuts off discussion and makes you look foolish if you're wrong. It's a lazy way to "win" arguments.

Section: 1, Chapter: 6

Why We Should Express Confidence In Degrees

"When we express our beliefs (to others or just to ourselves as part of our internal decision-making dialogue), they don't generally come with qualifications. What if, in addition to expressing what we believe, we also rated our level of confidence about the accuracy of our belief on a scale of zero to ten? Zero would mean we are certain a belief is not true. Ten would mean we are certain that our belief is true. A zero-to-ten scale translates directly to percentages."

Section: 1, Chapter: 6

Using Premortems To Stress-Test Plans

One way to harness the power of dissent is to conduct a "premortem" on important decisions. A premortem involves imagining a future where your plan failed, then working backwards to figure out why.

Have the team brainstorm as many paths to failure as possible - imagine competitors' responses, think through operational snafus, consider external risks. Then update your plan to mitigate the identified issues. This "creative dissent" makes the final plan much more robust. Premortems give permission to express doubts in a productive way.

Section: 1, Chapter: 6

Backcasting - Imagining Success, Then Tracing The Path

The flipside of a premortem is "backcasting" - envisioning a successful outcome, then reverse-engineering how you got there. If your company wants to double its market share, imagine it's five years from now and that's been accomplished. What key decisions and milestones led to that rosy future?

Telling the story of success makes it feel more tangible. It also helps identify must-have elements that might otherwise be overlooked. Backcasting is a great technique for setting and pressure-testing goals. Use it for anything from launching products to planning vacations.

Section: 1, Chapter: 6

Mental Time Travel Aids Decision Making

A core theme of Chapters 5-6 is that mentally simulating the future and past leads to better choices. Vividly imagining various futures (through backcasting and premortems) helps us select the most promising one to aim for. It also allows us to anticipate and preempt obstacles. Reflecting on past similar situations provides context on whether a proposed course of action is wise.

The more we build mental muscles to escape the here-and-now and adopt a long-term perspective, the better our judgment will be. Groups and habits that promote this kind of mental time travel are invaluable.

Section: 1, Chapter: 6
