Author: Annie Duke
Year: 2018
Review
Thinking in Bets provides a valuable introduction to decision theory and the associated psychological principles. As Product Managers, we often don't see the immediate impact of our decisions. It's crucial to integrate long-term strategic goals into our real-time decision-making processes to enhance the likelihood of positive outcomes. Although this book isn't about product development, it helped cement the importance of Product Discovery for me (validating our assumptions allows us to peek into the future and increases the chances that we're successful).
Key Takeaways
The 20% that gave me 80% of the value.
- We must resist our natural tendency to equate the quality of a decision with the quality of its outcome.
- If we want to improve, we have to learn from the results of our decisions. The quality of our lives is the sum of our decision quality plus luck.
What makes a decision great is not that it has a great outcome. A great decision is the result of a good process, and that process must include an attempt to accurately represent our own state of knowledge.
- Don’t fall into the trap of black and white decision making, of misrepresenting the world as extremes of right and wrong, with no shades of grey. Think probabilistically. Making better decisions is about calibrating among all the shades of grey.
- Recognising that all decisions are bets between alternative futures, each with benefits and risks, can help make us better decision makers.
- Every decision commits resources and eliminates other options. There is always an opportunity cost in choosing one path over others. Even not deciding is itself a decision.
- In most decisions, we aren’t betting against another person but against a future version of ourselves that chose a different path.
- From an evolutionary perspective, type I errors (false positives) were less costly than type II errors (false negatives). It was better to be safe than sorry.
- Practice truthseeking: the desire to know the truth regardless of whether it aligns with the beliefs we currently hold.
- Motivated reasoning: our beliefs affect how we process all new things → we’re more likely to see information as strengthening our beliefs → which then drives how we process further information
- Fake news isn’t meant to change minds, instead it entrenches beliefs its intended audience already has.
- Being smart can make bias worse, as you’re better at constructing a narrative that supports your beliefs, rationalising and framing the data to fit your argument or point of view.
- New information should trigger you to update your beliefs. Questions that can help…
- How do I know this? Where did I get this information? Who did I get it from? What is the quality of my sources? How much do I trust them? How up to date is my information? How much information do I have that is relevant to the belief? What other things like this have I been confident about that turned out not to be true? What are the other plausible alternatives? What do I know about the person challenging my belief? What is their view of how credible my opinion is? What do they know that I don’t know? What is their level of expertise? What am I missing?
- “Wanna bet?” brings risk out into the open: we become more honest with ourselves about how sure we are of our beliefs and more open to updating and calibrating them.
- Learning to express our confidence on a scale of probability (e.g. 76%) can help us become less judgemental of ourselves and invite others to help us calibrate.
Scientists publish results of experiments, they share with the rest of their community their methods of gathering and analysing the data, the data itself, and their confidence in that data. That makes it possible for others to assess the quality of the information being presented, systematised through peer review before publication.
- Outcomes are feedback. We need to get better at identifying if those outcomes are caused mainly by luck or if they were a predictable result of the decisions we made.
- Fielding outcomes: an initial sorting decision on whether the outcome belongs in the “luck” bucket or the “skill” bucket. Ambiguity makes it difficult. Outcomes don’t tell us what’s our fault and what isn’t.
- Black-and-white thinking is a driver of motivated reasoning and self-serving bias.
- What accounts for most of the variance in happiness is how we’re doing comparatively.
- Engage in truthseeking and strive toward accuracy and objectivity.
- Become a better decision maker by giving credit, admitting mistakes and finding mistakes in good outcomes. Develop habits around accurate self-critique.
- Behave as if you have something at risk. Think in bets. It’ll help you refine your beliefs by making explicit what is already implicit and trigger the exploration of alternatives and perspective taking.
- Striving for accuracy through probabilistic thinking is worthwhile; the cumulative effect of being a little better at decision-making is huge in the long run.
If it weren’t for luck, I’d win every one. Phil Hellmuth.
- Truthseeking can be uncomfortable at times, but forming a diverse truthseeking pod with a few other people who are interested can help you. The group should:
- reward truthseeking behaviours and express disapproval of the opposite.
- be open-minded toward those who disagree and toward diversity of ideas.
- give credit where it’s due.
- encourage accountability to answer for our actions.
- have a focus on accuracy (over confirmation).
- try and find mistakes even in good outcomes.
- treat evidence against held beliefs as helpful.
- The group should run through a series of questions to examine the accuracy of held beliefs.
- Why might my belief not be true?
- What other evidence might be out there bearing on my belief?
- Are there similar areas I can look toward to gauge whether similar beliefs to mine are true?
- What sources of information could I have missed or minimised on the way to reaching my belief?
- What are the reasons someone else could have a different belief, what’s their support, and why might they be right instead of me?
- What other perspectives are there as to why things turned out the way they did?
- Robert Merton’s 1942 paper (reprinted in 1973) established norms for the scientific community, known by the acronym CUDOS.
- Communism: data belong to the group. In your decision pod, be transparent and share all relevant information. Avoid the Rashomon Effect, where two accounts of the same event conflict because details have been left out. Sharing all the data is the best way to move toward accuracy because it lets your listeners offer the highest-fidelity insight.
- Universalism: apply uniform standards to claims and evidence, regardless of where they came from. Don’t shoot the message, don’t disparage or ignore an idea just because you don’t like who or where it came from. The accuracy of the statement should be evaluated independent of its source. The only way to gain knowledge and approach truth is by examining every variety of opinion.
- Disinterestedness: vigilance against potential conflicts that can influence the group’s evaluation. Keep the group blind to the outcome to produce a higher-fidelity evaluation of decision quality. The best way to do this is to deconstruct decisions before an outcome is known. Reward members for debating opposing points of view.
- “Wait! How did the hand turn out?” I gave them the red pill: “It doesn’t matter.”
- Organised Skepticism: discussion among the group to encourage engagement and dissent. Skepticism is about approaching the world by asking why things might not be true rather than why they are true. It encourages us to examine what we do and don’t know and what our level of confidence is in our beliefs and predictions, moving us closer to what is objectively true. True skepticism isn’t confrontational.
- Methods for communicating with people outside our truthseeking group (focus on future goals and future actions):
- Lead with your own uncertainty to encourage others to voice theirs.
- Lead with assent. Listen for the things you agree with, state those and be specific, and then follow with “and” instead of “but.” Our listeners will be more open to any dissent that might follow.
- Ask for a temporary agreement to engage in truthseeking.
- Focus on the future: ask, “Do you think there’s anything you can do to improve in the future?” The future can always be better if we can get them to focus on things in their control.
- We don’t experience the consequences of most of the decisions we make right away. We need to find a way to incorporate long-term strategic goals into in-the-moment decisions to increase our chances of good outcomes.
- Regret can do nothing to change what has already happened. So we need to interrupt an in-the-moment decision and consider the decision from the perspective of our past and future.
- The 10-10-10 Rule by Suzy Welch: “What are the consequences of each of my options in ten minutes? In ten months? In ten years?”
- It works inverted too: “How would I feel today if I had made this decision ten minutes ago? Ten months ago? Ten years ago?”
- A mindset swear jar for non-truthseeking behaviours. Recognising these behaviours can trigger reflection and interrupt them:
- Overconfidence and illusion of certainty
- Irrational outcome fielding and complaining about bad luck
- Dismissive characterizations of people and their ideas
- Motivated reasoning and echo chamber thinking
- Overuse of the word "wrong" in exploratory discussions
- Lack of self-compassion and unproductive self-criticism
- Over-editing personal stories
- Biasing others by including conclusions or beliefs when seeking advice
- Discouraging engagement with phrases like "no" or "but"
- Reconnaissance: consider in detail what the possible futures might look like. Figure out the possibilities, then take a stab at the probabilities. We do reconnaissance because we are uncertain. You’re already making a prediction about the future, so you might as well make it explicit. Prospective hindsight, imagining that an event has already occurred, increases the ability to correctly identify reasons for future outcomes by 30%.
- Backcasting: Imagine you’ve already achieved the positive outcome you seek, then think about how you got there. It should help identify strategies, tactics, and actions that need to be implemented to get to the goal. You might spot low-probability events that must occur to reach the goal, and then choose to develop strategies to increase the chances of those events happening.
- Premortems: Imagine not achieving your goal. Then investigate what went wrong, what prevented you from doing so. All those reasons why we didn’t achieve our goal help us anticipate potential obstacles and improve our likelihood of succeeding. We need to have positive goals, but we are more likely to execute on those goals if we think about the negative futures.
Life, like poker, is one long game, and there are going to be a lot of losses, even after making the best possible bets. We are going to do better, and be happier, if we start by recognizing that we’ll never be sure of the future. That changes our task from trying to be right every time, an impossible job, to navigating our way through the uncertainty by calibrating our beliefs to move toward, little by little, a more accurate and objective representation of the world. With strategic foresight and perspective, that’s manageable work. If we keep learning and calibrating, we might even get good at it.
Deep Summary
Longer form notes, typically condensed, reworded and de-duplicated.
Chapter 1: Life Is Poker, Not Chess
- There’s an imperfect relationship between results and decision quality. You can have a bad result without having made a bad decision.
- We must resist our natural tendency to equate the quality of a decision with the quality of its outcome, which Duke calls ‘resulting’.
- Hindsight bias: after an outcome is known, we see it as having been inevitable, a certain consequence rather than a probabilistic one. It develops when we draw an overly tight connection between outcomes and decisions.
- Our brains weren’t built for rationality. We have two systems:
- Reflexive: fast, automatic, and largely unconscious
- Deliberative: slow, deliberate, and judicious
- Most of what we do is reflexive. Many decision-making missteps originate from not engaging in deliberation.
- For professional poker players, the goal is to get their reflexive minds to execute on their deliberative minds’ best intentions.
- After the fact, you need to learn from the mix of decisions and outcomes, separate the luck from skill, the signal from the noise, and guard against ‘resulting’.
- Talent doesn’t matter without execution: avoiding common decision traps, learning from results, and keeping emotions in check. We all struggle to execute our best intentions.
- In life we have to make decisions under conditions of uncertainty. Valuable information remains hidden and there is an element of luck too.
- If we want to improve, we have to learn from the results of our decisions. The quality of our lives is the sum of decision quality plus luck.
- Embrace saying “I’m not sure”. Conscious ignorance is a helpful first step in making a great decision.
What makes a decision great is not that it has a great outcome. A great decision is the result of a good process, and that process must include an attempt to accurately represent our own state of knowledge.
- Don’t fall into the trap of black and white decision making, of misrepresenting the world as extremes of right and wrong, with no shades of grey. Think probabilistically. Making better decisions is about calibrating among all the shades of grey.
- Any probabilistic decision (that’s not 0% or 100%) can’t be wrong solely because the most likely future doesn’t unfold.
- The second-best choice isn’t wrong, it’s better than third-best after all.
- Loss Aversion: Being wrong hurts us more than being right feels good.
Chapter 2: Wanna Bet?
- Recognising that all decisions are bets between alternative futures, each with benefits and risks, can help make us better decision makers.
- Every decision commits resources and eliminates other options. There is always an opportunity cost in choosing one path over others. Even not deciding is itself a decision.
- In most decisions, we aren’t betting against another person but against a future version of ourselves that chose a different path.
- Anticipate and protect against irrationality that prevents you from acting in your best interest.
- We can’t be sure. We can only make our best guess, given what we know and don’t know.
- Getting comfortable with uncertainty can help us see the world more accurately.
- Aim to update your beliefs to more accurately represent the world through experience and information seeking.
- Try and identify when thinking patterns might lead you astray.
- We hear something → We believe it to be true (only later do we think about it and vet it)
- From an evolutionary perspective, type I errors (false positives) were less costly than type II errors (false negatives). It was better to be safe than sorry.
- The evolution of complex language gave us the ability to form beliefs about things we had not actually experienced for ourselves.
- Practice truthseeking: the desire to know the truth regardless of whether it aligns with the beliefs we currently hold.
- Motivated reasoning: our beliefs affect how we process all new things → we’re more likely to see information as strengthening our beliefs → which then drives how we process further information
- Fake news isn’t meant to change minds, instead it entrenches beliefs its intended audience already has.
- Seeing things as black and white makes it harder for new information to change our view. When additional information agrees with us, we effortlessly embrace it.
- Being smart can make bias worse, as you’re better at constructing a narrative that supports your beliefs, rationalising and framing the data to fit your argument or point of view.
- People who are aware of their own biases are no better able to overcome them.
- New information should trigger you to update your belief. Questions that can help…
- How do I know this? Where did I get this information? Who did I get it from? What is the quality of my sources? How much do I trust them? How up to date is my information? How much information do I have that is relevant to the belief? What other things like this have I been confident about that turned out not to be true? What are the other plausible alternatives? What do I know about the person challenging my belief? What is their view of how credible my opinion is? What do they know that I don’t know? What is their level of expertise? What am I missing?
- “Wanna bet?” brings risk out into the open: we become more honest with ourselves about how sure we are of our beliefs and more open to updating and calibrating them.
- Learn to express your confidence on a scale of probability (e.g. 76%). Acknowledging uncertainty is the first step in measuring and narrowing it. Incorporating percentages or ranges of alternatives into the expression of our beliefs can help us become less judgemental of ourselves. (A rough sketch of tracking this appears at the end of this chapter.)
- Expressing your level of confidence invites people to be our collaborators and tell us what they know.
There are effective strategies to be more open-minded, more objective, more accurate in our beliefs, more rational in our decisions and actions, and more compassionate toward ourselves in the process.
Scientists publish results of experiments, they share with the rest of their community their methods of gathering and analyzing the data, the data itself, and their confidence in that data. That makes it possible for others to assess the quality of the information being presented, systematized through peer review before publication.
- Scientists have institutionalised the expression of uncertainty and invite their community to share relevant information and to test and challenge the results and explanations.
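Not from the book, but a minimal sketch of the ideas above: log each belief as a bet with a stated probability, record how it turned out, and summarise how well stated confidence tracked reality. The `Prediction` class, the Brier-style score, and the example entries are illustrative assumptions, not the author’s method.

```python
from dataclasses import dataclass

@dataclass
class Prediction:
    claim: str          # the belief, phrased as a checkable claim
    confidence: float   # stated probability that the claim is true (0.0-1.0)
    came_true: bool     # what actually happened

def brier_score(predictions: list[Prediction]) -> float:
    """Mean squared gap between stated confidence and outcomes (0 is perfect; lower is better)."""
    return sum((p.confidence - (1.0 if p.came_true else 0.0)) ** 2
               for p in predictions) / len(predictions)

# Hypothetical log of bets on beliefs.
log = [
    Prediction("Feature X ships this quarter", confidence=0.76, came_true=True),
    Prediction("Competitor launches first", confidence=0.30, came_true=True),
    Prediction("Churn drops after the change", confidence=0.60, came_true=False),
]

print(f"Brier score: {brier_score(log):.3f}")  # how well confidence matched reality
```

Reviewing a log like this (ideally with a truthseeking pod) is one concrete way to turn “Wanna bet?” into a habit of calibration.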
Chapter 3: Bet to Learn: Fielding the Unfolding Future
- People who learn from experience improve, advance, and become leaders in their fields. Our beliefs and our bets should improve with time as we learn from experience.
- Outcomes are feedback. We need to get better at identifying if those outcomes are caused mainly by luck or if they were a predictable result of the decisions we made.
- Fielding outcomes: an initial sorting decision on whether the outcome belongs in the “luck” bucket or the “skill” bucket. Ambiguity makes it difficult. Outcomes don’t tell us what’s our fault and what isn’t.
If it weren’t for luck, I’d win every one. Phil Hellmuth.
- Black-and-white thinking is a driver of motivated reasoning and self-serving bias.
- What accounts for most of the variance in happiness is how we’re doing comparatively.
- Engage in truthseeking and strive toward accuracy and objectivity.
- Become a better decision maker by giving credit, admitting mistakes and finding mistakes in good outcomes. Develop habits around accurate self-critique.
- Top poker players deconstruct every potential playing error with peers after the event, even when they win.
- Shift your mindset → seek truth with the deliberative mind → eventually you’ll reflexively think in the same way.
- Behave as if you have something at risk. Think in bets. It’ll help you refine your beliefs by making explicit what is already implicit.
- Thinking in bets triggers the exploration of alternatives and perspective taking.
- Move away from black and white thinking, putting your beliefs on a spectrum allows you to refine them when new information comes in.
- Striving for accuracy through probabilistic thinking is worthwhile; the cumulative effect of being a little better at decision-making is huge in the long run.
Chapter 4: The Buddy System
- Truthseeking can be uncomfortable at times, but forming a diverse truthseeking pod with a few other people who are interested can help you. The group should:
- reward truthseeking behaviours and express disapproval of the opposite.
- be open-minded toward those who disagree and toward diversity of ideas.
- give credit where it’s due.
- encourage accountability to answer for our actions.
- have a focus on accuracy (over confirmation).
- try and find mistakes even in good outcomes.
- treat evidence against held beliefs as helpful.
- You need at least three people: two to disagree and one to referee.
- A group can help especially if you know in advance that you’ll have to answer to them for your decisions.
- The group should run through a series of questions to examine the accuracy of held beliefs.
- Why might my belief not be true?
- What other evidence might be out there bearing on my belief?
- Are there similar areas I can look toward to gauge whether similar beliefs to mine are true?
- What sources of information could I have missed or minimised on the way to reaching my belief?
- What are the reasons someone else could have a different belief, what’s their support, and why might they be right instead of me?
- What other perspectives are there as to why things turned out the way they did?
- Others aren’t anchored by our biases. A diverse group can do some of the heavy lifting of de-biasing for us. Diversity of thought is hard to maintain, though: fight groupthink and confirmatory thought.
- Expert opinions expressed as bets are more accurate than expert opinions expressed through peer review.
Chapter 5: Dissent to Win
- Robert Merton’s 1942 paper (reprinted in 1973) established norms for the scientific community, known by the acronym CUDOS.
- Communism: data belong to the group. In your decision pod, be transparent and share all relevant information. Avoid the Rashomon Effect, where two accounts of the same event conflict because details have been left out. Sharing all the data is the best way to move toward accuracy because it lets your listeners offer the highest-fidelity insight.
- Universalism: apply uniform standards to claims and evidence, regardless of where they came from. Don’t shoot the message, don’t disparage or ignore an idea just because you don’t like who or where it came from. The accuracy of the statement should be evaluated independent of its source. The only way to gain knowledge and approach truth is by examining every variety of opinion.
- Disinterestedness: vigilance against potential conflicts that can influence the group’s evaluation. Keep the group blind to the outcome to produce a higher-fidelity evaluation of decision quality. The best way to do this is to deconstruct decisions before an outcome is known. Reward members for debating opposing points of view.
- “Wait! How did the hand turn out?” I gave them the red pill: “It doesn’t matter.”
- Organised Skepticism: discussion among the group to encourage engagement and dissent. Skepticism is about approaching the world by asking why things might not be true rather than why they are true. It encourages us to examine what we do and don’t know and what our level of confidence is in our beliefs and predictions, moving us closer to what is objectively true. True skepticism isn’t confrontational.
- Companies should reward constructive dissent by taking suggestions seriously or the expression of diverse viewpoints won’t be reinforced.
- Methods for communicating with people outside our truthseeking group (focus on future goals and future actions):
- Lead with your own uncertainty to encourage others to voice theirs.
- Lead with assent. Listen for the things you agree with, state those and be specific, and then follow with “and” instead of “but.” Our listeners will be more open to any dissent that might follow.
- Ask for a temporary agreement to engage in truthseeking.
- Focus on the future: ask, “Do you think there’s anything you can do to improve in the future?” The future can always be better if we can get them to focus on things in their control.
Chapter 6: Adventures in Mental Time Travel
- We don’t experience the consequences of most of the decisions we make right away. We need to find a way to incorporate long-term strategic goals into in-the-moment decisions to increase our chances of good outcomes.
- If we don’t ponder the past or future we are more likely to be irrational and impulsive. A common example is temporal discounting: favouring our present-self at the expense of our future-self.
- We are more likely to make choices consistent with our long-term goals when we can get out of the moment and engage our past- and future-selves.
- Regret can do nothing to change what has already happened. So we need to interrupt an in-the-moment decision and consider the decision from the perspective of our past and future.
- The 10-10-10 Rule by Suzy Welch: “What are the consequences of each of my options in ten minutes? In ten months? In ten years?”
- It works inverted too: “How would I feel today if I had made this decision ten minutes ago? Ten months ago? Ten years ago?”
- Use your past and future selves as a lens to provide perspective; they can pull you out of the moment and dull your reaction. By zooming out, you can see how amplified your emotional responses are in the moment.
- When we are on tilt (emotionally compromised), we aren’t decision fit.
- Remember “It’s all just one long poker game.” Take the long view, especially when something big happened in the last half hour. When you take the long view, you’re going to think in a more rational way.
Strategies
- A Ulysses contract: getting past-us to prevent present-us from doing something stupid. Most Ulysses contracts raise a barrier against irrationality. In most situations you can’t make a precommitment that’s 100% tamper-proof, but regardless of the level of binding, a precommitment contract is much more likely to make you stop and think.
- A mindset swear jar for non-truthseeking behaviours. Recognising these behaviours can trigger reflection and interrupt them:
- Overconfidence and illusion of certainty
- Irrational outcome fielding and complaining about bad luck
- Dismissive characterizations of people and their ideas
- Motivated reasoning and echo chamber thinking
- Overuse of the word "wrong" in exploratory discussions
- Lack of self-compassion and unproductive self-criticism
- Over-editing personal stories
- Biasing others by including conclusions or beliefs when seeking advice
- Discouraging engagement with phrases like "no" or "but"
- Reconnaissance: consider in detail what the possible futures might look like. Figure out the possibilities, then take a stab at the probabilities. We do reconnaissance because we are uncertain. You’re already making a prediction about the future, so you might as well make it explicit. Prospective hindsight, imagining that an event has already occurred, increases the ability to correctly identify reasons for future outcomes by 30%. (A toy sketch of making such a forecast explicit follows this list.)
- Backcasting: Imagine you’ve already achieved the positive outcome you seek, then think about how you got there. It should help identify strategies, tactics, and actions that need to be implemented to get to the goal. You might spot low-probability events that must occur to reach the goal, and then choose to develop strategies to increase the chances of those events happening.
- Premortems: Imagine not achieving your goal. Then investigate what went wrong, what prevented you from doing so. All those reasons why we didn’t achieve our goal help us anticipate potential obstacles and improve our likelihood of succeeding. We need to have positive goals, but we are more likely to execute on those goals if we think about the negative futures.
- Imagining both positive and negative futures (pre-mortems, backcasting) helps us build a more realistic vision of the future.
- Dendrology and hindsight bias: when we look into the past and see only the thing that happened, it seems to have been inevitable. Hindsight bias is an enemy of probabilistic thinking. Keeping an accurate representation of what could have happened (and not a version edited by hindsight) can help you become a better calibrator going forward.
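A toy sketch (mine, not the author’s) of the reconnaissance idea above: enumerate the plausible futures for a decision, put rough probabilities on them, and make the implicit bet explicit as a probability-weighted view of the outcomes. The scenarios, probabilities, and payoff numbers are hypothetical.

```python
# Hypothetical futures for a single decision, with rough probabilities and payoffs.
scenarios = [
    {"future": "launch succeeds and users adopt quickly", "prob": 0.45, "value": 100},
    {"future": "launch slips a quarter", "prob": 0.35, "value": 40},
    {"future": "launch fails and needs rework (the premortem case)", "prob": 0.20, "value": -60},
]

# The probabilities over the imagined futures should cover the whole space.
assert abs(sum(s["prob"] for s in scenarios) - 1.0) < 1e-9

# Making the bet explicit: a probability-weighted view of the outcomes.
expected_value = sum(s["prob"] * s["value"] for s in scenarios)

for s in scenarios:
    print(f"{s['prob']:>4.0%}  {s['future']}")
print(f"Probability-weighted value of the bet: {expected_value:+.1f}")
```

Backcasting and premortems then work over the same list: the positive rows tell you what has to go right to reach the goal, and the negative rows are the reasons for failure to plan around.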
Life, like poker, is one long game, and there are going to be a lot of losses, even after making the best possible bets. We are going to do better, and be happier, if we start by recognizing that we’ll never be sure of the future. That changes our task from trying to be right every time, an impossible job, to navigating our way through the uncertainty by calibrating our beliefs to move toward, little by little, a more accurate and objective representation of the world. With strategic foresight and perspective, that’s manageable work. If we keep learning and calibrating, we might even get good at it.