Thinking in Bets, by Annie Duke
My nuanced summary of Duke's unique professional-poker-player perspective on effective decision making.
Thinking in Bets is a compelling look at what it takes to learn how to make good decisions. Annie Duke brings a distinctive perspective given her background as a cognitive psychology PhD candidate turned professional poker player (and her poker experiences make for some fun stories).
Don’t confuse the quality of your decision with the quality of the result. The latter is often influenced by factors completely outside of your control.
Seek out help specifically to engage in the truthseeking you need to learn from prior decisions and make better ones in the future.
Humans are credulous by nature; by framing a decision as a “bet” you can unlock some of your latent skepticism, and make a better decision as a result.
If you enjoy the summary below, I recommend that you buy the full book (and get access to all of her stories) either via Amazon or Bookshop.
Introduction: Why This Isn't a Poker Book
[Story of how Duke dropped out of her cognitive psychology PhD program, got involved in professional poker, and learned from world-class players.] “Over time, those world-class poker players taught me to understand what a bet really is: a decision about an uncertain future.” Treating decisions as bets helps to a) distinguish the quality of our decisions from luck (the “two things that determine how our lives turn out”); b) find learning opportunities in uncertain environments; c) avoid common decision traps; and d) keep emotions in check. [Note: Duke uses a lot of compelling stories from her experiences as a poker player in this book to underscore the stakes involved; it’s worth it.]
Chapter 1: Life is Poker, Not Chess
Our brains crave (and create) certainty, and we dislike the idea of luck playing a significant role in our lives. This is why we often mistakenly equate the quality of a decision with the quality of the result. [A.k.a. “resulting” in poker; the general advice is to resist the temptation to change strategy when a few hands don’t turn out well in the short run.] We want life to resemble chess, where all pieces are known, and the better player can reliably defeat their slightly worse opponent. Life resembles poker instead: it is full of uncertainty and hidden information, and you might make great decisions and still lose the hand because you don’t know what new cards will be dealt. Antidote: acknowledge uncertainty by admitting when you’re not sure. Some tips:
"Hindsight bias is the tendency, after an outcome is known, to see the outcome as having been inevitable." Red flag: saying “I should have known that would happen.”
What do you consider your best or worst decisions? Consider distinguishing "a bad decision where [you] got lucky with the result [versus] a well-reasoned decision that didn’t pan out."
Most decisions happen reflexively, so use your deliberative mind to train your automatic, reflexive mind to execute well.
Working backwards to craft tight relationships between our outcomes and our decisions is susceptible to numerous cognitive traps (confirmation bias, assuming causation, etc.). Beware!
Paraphrasing John von Neumann (the mathematician who co-founded game theory with Theory of Games and Economic Behavior): Life is not a well-defined form of computation (like chess), where any position has a correct response. Real life - and a real game - “consists of bluffing, of little tactics of deception, of asking yourself what is the other man going to think I mean to do.”
“What makes a decision great is not that it has a great outcome. A great decision is the result of a good process, and that process must include an attempt to accurately represent our own state of knowledge. That state of knowledge, in turn, is some variation of ‘I’m not sure.’”
Knowing the odds - say an 80% chance of success - does not mean you're wrong if the 20% scenario plays out.
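This point is easy to demonstrate outside the book with a quick simulation (the 80% and 55% success rates below are made-up illustrative numbers, not from Duke): a single trial of a good bet can still fail, but over many repetitions decision quality, not luck, dominates the results.

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

def simulate(p_success: float, trials: int) -> float:
    """Return the fraction of trials in which the bet pays off."""
    return sum(random.random() < p_success for _ in range(trials)) / trials

# A single trial of a good (80%) decision can still fail...
one_shot = random.random() < 0.80  # False roughly 1 time in 5

# ...but over many repetitions, the better decision reliably wins out.
good_decision = simulate(0.80, 10_000)   # ~0.80
worse_decision = simulate(0.55, 10_000)  # ~0.55
print(f"good: {good_decision:.2f}, worse: {worse_decision:.2f}")
```

Judging the single `one_shot` result would be “resulting”; judging the long-run frequencies evaluates the decision itself.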
Embracing uncertainty frees us from the anguish of being “wrong,” though we also lose the good feeling of being “right.” The world is random, and hidden information makes it even harder to predict how anything will turn out. And being wrong stings more intensely than being right feels good.
Chapter 2: Wanna Bet?
Our beliefs drive our decisions, and we rarely engage in honest truthseeking to ensure our beliefs are accurate reflections of reality. (Humans are credulous creatures.) Decisions involving uncertainty are better framed as a bet to immediately encourage an honest assessment of alternatives, consequences, trade-offs, and probabilities of different outcomes. [Story of the eccentric gambler John Hennigan accepting a bet to move to Des Moines for a month, where he only lasted 2 days before negotiating a settlement. Despite being a crazy bet, "the underlying analysis was actually very logical: a difference of opinion about alternatives, consequences, and probabilities."] Antidote: stop thinking of beliefs as 100% right or 100% wrong. Instead, when examining a belief, ask whether you'd bet on the belief being true or not (possibly against yourself). This unlocks curiosity about what matters and what is actually true. Some tips:
Most decisions you make are bets against future versions of yourself. When you decide to order the chicken instead of the steak, your future self will not be having steak.
Our beliefs include any knowledge about the world and reality which guide our understanding of risk, probability of certain outcomes, options on the table, etc. (Beliefs are not just metaphysical, moral, or religious in nature.)
Most belief formation follows a pattern: a) we hear something; b) we implicitly believe it to be true; and occasionally c) if we're motivated, we might think about whether it's true or false. Humans are disposed toward efficiency of information transfer, NOT accuracy.
Truth-seeking - “the desire to know the truth regardless of whether the truth aligns with the beliefs we currently hold” - goes against the grain of human nature. “Believing is so easy, [...] it may be more like involuntary comprehension than like rational assessment.”
Marketing and public relations - e.g. the low-fat diet craze in the 90s, which actually increased obesity and diabetes - often play into our credulous nature.
Beliefs are stubborn: we use motivated reasoning to explain them away (smarter people are better at deluding themselves), and confirmation bias shapes how we interpret new information.
Motivated reasoning comes from our desire to tell a positive self-narrative. Feeling right vs. wrong plays into that narrative.
Incorporate uncertainty into how you talk about things, and learn to acknowledge how confident you are in any given fact (“I'm 40% sure that ....”). Doing this a) makes us more credible; and b) invites others to share what they know so we can “act like scientists together” (working to advance knowledge rather than confirm our beliefs).
Chapter 3: Bet to Learn: Fielding the Unfolding Future
We can only improve our decision making by learning from prior decisions (our own and those of others). The critical factor in effective learning is how we distinguish between outcomes driven primarily by luck versus skill, since we can only improve those driven by skill. This is its own challenge, since we tend to believe good outcomes come from our skill whereas bad outcomes come from bad luck. [Story of Nick the Greek, who believed the element of surprise was important in poker, and so would play terrible hands and fold good hands. Needless to say, he failed to learn from his actual experience.] Some tips:
Learning happens best when you get "a lot of feedback tied closely in time to decisions and actions," but only if you actually figure out how to learn from that feedback.
Treat how you sort any given outcome into the skill vs. luck bucket as its own consequential bet. All outcomes pass through this initial filter, and our self-serving impulse is to sort based on whatever tells a positive self-narrative. "We take credit for the good stuff and blame the bad stuff on luck so it won’t be our fault. The result is that we don’t learn from experience well."
Pitfalls of reality: a) beliefs & bets don't improve naturally through experience; b) more information doesn't naturally lead to better decisions; and c) outcomes don't typically have unambiguous causes (especially with long-term outcomes, e.g. a low-fat, high-sugar diet leading to weight gain).
Uncertainty makes this harder: we can easily modify patterns of behavior with fixed reward schedules, but we persist at patterns which have historically had variable or intermittent reinforcement schedules.
Hone your learning loop by getting the lucky (or unlucky) outcomes out of the way and staying focused on those actually driven by skill. “The bets we make on when and how to close the feedback loop are part of the execution, all those in-the-moment decisions about whether something is a learning opportunity.”
Remember: you don't need to learn from every decision or outcome. But starting to more reliably and thoughtfully learn from some of them will add up over time.
There are a number of useful strategies to learn more effectively from experience:
Learn from the experience of others. Beware the tendency to attribute good outcomes to luck and bad outcomes to skill (the opposite of what we do with ourselves). Learn to view others’ experiences with compassion, not schadenfreude (deriving pleasure from someone else's misfortune). We are biased toward the latter because "most of the variance in our happiness is how we're doing comparatively."
Shift from feeling good about the outcome to feeling good about learning from the outcome (whether good or bad). [Story of successful poker player Phil Ivey, who spends post-game social events deconstructing his decisions with fellow pros to field his outcomes more effectively and learn to make better decisions.]
Embrace the tendency for comparison, but reframe it as wanting to be better at giving credit, admitting mistakes, or exploring possible reasons for an outcome with an open mind.
Consciously treat how you field outcomes as a bet. This triggers open-minded exploration of alternative hypotheses, encourages you to take the perspective of others, and helps you get closer to the truth. ("The truth generally lies in the middle of the way we field outcomes for ourselves and the way we field them for others.")
Chapter 4: The Buddy System
We improve best with the help of others. Consider creating a "good decision group" with a focus on thinking in bets, seeking out alternative opinions, and leaning on others to cover blind spots. This requires mutual buy-in to be productive and sustainable; don’t expect productive results when you forcibly dunk someone into harsh reality. [Duke’s "good decision group" of professional poker players, with its own rules of engagement, served an integral role in her improving as a player.] The basic rules of engagement: 1) focus on accuracy (reward truthseeking); 2) mutual accountability (for decisions made); 3) openness to diverse ideas (to escape the echo chamber of our own perspective). Some tips:
A productive truthseeking group has at least three people: "two to disagree and one to referee."
Relentless truthseeking goes against social norms; don’t expect everyone in your life to engage. “It takes effort to acknowledge and explore our mistakes without feeling bad about ourselves, to forgo credit for a great result, and to realize, with an open mind, that not all our beliefs are true.” Build in breaks to replenish willpower.
Cultivate exploratory thought (open-minded and objective consideration of alternative hypotheses). Avoid confirmatory thought (rationalizing a particular point of view).
Good decision groups “reward accuracy and intellectual honesty with social approval” we crave.
Sample response to a typical rant: “I don’t want to hear it. I’m not trying to hurt your feelings, but if you have a question about a [decision], you can ask me about strategy all day long. I just don’t think there’s much purpose in [complaining] about something you had no control over, like bad luck.”
A good group will keep you grounded when you’re doing well and productively focused when you’re not doing well; the focus is always on seeking the truth and improving your thinking.
Find ways to highlight other sides of arguments (e.g. anonymous dissent channels, CIA "red teams" to spot flaws in conventional wisdom and analysis).
Work against confirmatory drift (our tendency towards confirmation bias and listening to others who resemble ourselves). [The historical approach was for US Supreme Court justices to hire clerks from different ideological backgrounds.]
Sample strategies for encouraging diversity and dissenting opinions: a) having a stated antidiscrimination policy (against opposing viewpoints); b) developing ways to encourage people with contrary viewpoints to join the group and engage in the process; and c) surveying to gauge the actual heterogeneity or homogeneity of opinion in the group.
Chapter 5: Dissent to Win
Decision groups are most productive when they follow the norms of science as stated by Robert Merton, following the acronym CUDOS. If you find the group is not being as productive as you’d like, it’s likely failing on one of these four dimensions:
Communism: All data belong to the group; nothing pertinent to the topic is held behind a veil of secrecy. Honor requests for more information and “pull the skeletons of your own reasoning out of the closet,” even if it risks putting you in an unflattering light. (Certain things, like a poker hand or a football play, might have a lot of detail relevant to the decision.)
Universalism: Apply uniform standards to ideas, claims and evidence, regardless of where they come from. When evaluating ideas from people with different ideological perspectives, seek out aspects you agree with. Ask: "How would we feel about this if we heard it from a different source?" Separate information from the expertise and credibility (or likability) of the messenger.
Disinterestedness: Stay vigilant against potential conflicts that may influence the group’s evaluation. (Conflicts of interest are dangerous and contagious.) Focus only on the inputs for the decision and avoid sharing the outcome (so details are not interpreted to fit the outcome). Shield others from your beliefs or opinions when discussing the facts.
Organized Skepticism: Encourage productive engagement and dissent (not cynicism) within group discussion. Embrace uncertainty to encourage discussion around what we don't know. Find ways to "operationalize" skepticism (e.g. CIA "red teams" from earlier), and embrace the role of devil's advocate.
Strategies to engage in truthseeking with people outside of your carefully normed group: a) express uncertainty; b) lead with assent (have new information supplement, rather than negate, what came before, e.g. "I agree with you that [XYZ], ... and [ABC]."); c) ask whether they are open to engage in truthseeking in this moment (get a temporary agreement); and d) focus on the future (e.g. what they might do so things go better going forward).
Chapter 6: Adventures in Mental Time Travel
We make better bets when we consult with (or at least explicitly consider) our past- and future-selves. “One of our time-travel goals is to create moments like that, where we can interrupt an in-the-moment decision and take some time to consider the decision from the perspective of our past and future.” Some strategies:
Learn to empathize with your future self to counteract our bias toward temporal discounting (where our present self makes decisions at the expense of our future self).
Aim to experience regret before you make the decision (anticipated regret). This a) leads to better decisions; b) prepares us for the outcome; and c) increases self-compassion afterwards.
Place the current moment in a long-term frame to mitigate any volatile emotional swings (e.g. changing a flat tire in the freezing rain). Treat happiness as a long-term stock holding.
Avoid path-dependent distortion when fielding outcomes (e.g. how we interpret the same outcome, like leaving the poker table $100 ahead, as good or bad depending on how we got there).
Learn to identify when you are “on tilt” and not fit to make subsequent decisions. Tilt is a poker term for getting so emotionally caught up in a string of bad outcomes that your decision making is compromised, which leads to more bad outcomes, and so on. Note: “If you blow some recent event out of proportion and react in a drastic way, you’re on tilt.”
Find ways to precommit to certain behaviors, either to a) raise barriers against irrationality (choosing not to meet in the mall food court, where you buy unhealthy food); or b) lower barriers that get in the way of rational action (bringing your own healthy snack when you meet up with friends at the mall). These are known as Ulysses contracts, after the story of the hero Odysseus / Ulysses having himself bound to the mast so he could not heed the Sirens’ call.
Encourage accountability with yourself by creating decision interrupts when you see signs you are veering off the path of truthseeking. [Duke calls this the “Decision Swear Jar.”]
Conduct reconnaissance to identify and prepare for [the most likely] future possibilities or scenarios. This helps us be better prepared and less surprised by what happens.
Use backcasting (take a positive future and work backwards) to clarify what needs to happen to make that positive future a reality. This forces a long-term view, helps you avoid getting too hung up on minor immediate challenges, and encourages specificity on tactics.
Use premortems (imagine a negative future and work backwards) to clarify the true risks to achieving your goals. This is a form of organized skepticism which complements the optimism of backcasting. Imagining obstacles in our way helps us work around them effectively.
Keep track of unrealized futures to mitigate hindsight bias. Aim to remember the uncertainty we had before events finally occurred. Any given outcome was only one of a branching set of possible futures at the time (even if the other branches were later trimmed off).
“We make better decisions, and we feel better about those decisions, once we get our past-, present-, and future-selves to hang out together. This not only allows us to adjust how optimistic we are, it allows us to adjust our goals accordingly and to actively put plans in place to reduce the likelihood of bad outcomes and increase the likelihood of good ones. We are less likely to be surprised by a bad outcome and can better prepare contingency plans.”
[A collection of specific books or articles Duke directly mentions in the book.]
The Believing Brain, by science writer, historian, and skeptic Michael Shermer. This book “explains why we have historically (and prehistorically) looked for connections even if they were doubtful or false.”
Thinking, Fast and Slow, by Nobel laureate and psychology professor Daniel Kahneman. This book “popularized the labels of ‘System 1’ and ‘System 2.’ He characterized System 1 as ‘fast thinking,’ [... which] encompasses reflex, instinct, intuition, impulse, and automatic processing. System 2, “slow thinking,” is how we choose, concentrate, and expend mental energy. Kahneman explains how System 1 and System 2 are capable of dividing and conquering our decision-making but work mischief when they conflict.”
Kluge: The Haphazard Evolution of the Human Mind, by psychologist Gary Marcus. This book is similar to Thinking, Fast and Slow but with the descriptive labels “reflexive mind” and “deliberative mind.”
Theory of Games and Economic Behavior, by John von Neumann and Oskar Morgenstern. This book is on Boston Public Library’s list of the “100 Most Influential Books of the Century,” and William Poundstone, author of a widely read book on game theory, Prisoner’s Dilemma, called it “one of the most influential and least-read books of the twentieth century.”
Stumbling on Happiness, by psychologist Daniel Gilbert. This book discusses belief formation and the various factors that influence our experience of happiness. (See also Lyubomirsky’s work on the subject, and Jonathan Haidt’s The Happiness Hypothesis.)
The Power of Habit, by Charles Duhigg. This book “offers the golden rule of habit change—that the best way to deal with a habit is to respect the habit loop: ‘To change a habit, you must keep the old cue, and deliver the old reward, but insert a new routine.’”
The Righteous Mind: Why Good People Are Divided by Politics and Religion, by Jonathan Haidt, a leading expert on group thought in politics. This book builds on Philip Tetlock’s work, emphasizing the need for diversity of ideas to get to the truth. (See also their Behavioral and Brain Sciences article, “Political diversity will improve social psychological science,” for recommendations.)
The Half-Life of Facts, by Samuel Arbesman. This book “is a great read about how practically every fact we’ve ever known has been subject to revision or reversal. We are in a perpetual state of learning, and that can make any prior fact obsolete."
Rethinking Positive Thinking: Inside the New Science of Motivation, by Gabriele Oettingen. This book describes “over twenty years of research, consistently finding that people who imagine obstacles in the way of reaching their goals are more likely to achieve success, a process she has called ‘mental contrasting.’”