REVIEW OF THINKING IN BETS—Annie Duke
This work builds on Tetlock's "Superforecasting" and Kahneman's
"Thinking, Fast and Slow" but is much more digestible and anecdotal, while also attempting to offer
workable solutions to the problems those books raise (IMO).
The aim is to improve decision-making through objectivity,
accuracy and open-mindedness.
Error one: "resulting", where people mix up the best decisions with
the best outcomes. This opens up cognitive traps such as mistaking correlation for
causation or cherry-picking data to support our narrative. Incomplete information
makes it difficult to learn from past decisions.
Best decisions are the result of a good process, and that
process must include an attempt to accurately represent our state of knowledge,
which is usually a variation of "I'm not sure". The accuracy of guesses depends
on the information you have and how experienced you are at making such guesses.
When we think in advance about the chances of alternative
outcomes and make a decision based on those chances, it doesn't automatically
make us wrong when things don't work out; the result is just one event in a set of possible
future outcomes.
The aim here is to abandon the belief that everything is 0%
or 100%, and to accept that good decisions can fail to work out because of hidden information and
luck. These make investing uncertain. The aim of a process is not to be right
all the time but to win over time by taking the right bets.
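The "win over time by taking the right bets" idea can be made concrete with a small simulation. This is a hypothetical illustration, not from the book: a bet with positive expected value still loses often on any single trial, yet comes out ahead over many trials.

```python
import random

random.seed(1)

# Hypothetical bet (illustration only): 60% chance to win $100,
# 40% chance to lose $100, so the expected value is +$20 per bet.
p_win, win, lose = 0.60, 100, -100
expected_value = p_win * win + (1 - p_win) * lose

def play_once():
    """One trial: a good decision that can still have a bad outcome."""
    return win if random.random() < p_win else lose

results = [play_once() for _ in range(10_000)]
average = sum(results) / len(results)

# Any single bet loses 40% of the time, yet over many bets the
# average payoff converges towards the +$20 expected value.
print(f"EV per bet: {expected_value:+.0f}")
print(f"Average over 10,000 bets: {average:+.1f}")
```

Losing a single bet here says nothing about the quality of the decision, which is exactly the point about resulting.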
2. Use a framework of making bets when making a decision;
this explicitly forces you to think of alternate scenarios and also makes you
aware of the probabilities of different outcomes. Being a better calibrator, adjusting your
beliefs as new information comes to light, is something that can be learned. This is
the same idea Tetlock develops in Superforecasting. Accuracy in beliefs leads
to better bets. The bias we face here is the inability to change our beliefs:
we seek out ways to confirm them (error two: confirmation bias). Truth-seeking does
not come naturally to us; we are much more likely not to alter our beliefs to
fit new information but rather to alter our interpretation of new data to fit our existing
beliefs. Motivated reasoning = confirmation bias. Interestingly (or so it appears),
higher intelligence embeds beliefs further, and even being aware of your biases
does not loosen their grip; higher intelligence makes them harder to overcome. The brain works extra hard
at clinging to biases and beliefs in intelligent people.
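Calibration can be made concrete. One common measure, which I am supplying as a hypothetical illustration rather than something from the book, is the Brier score: the mean squared error between the probabilities you stated and what actually happened.

```python
# Calibration sketch (made-up forecasts, not from the book).
# The Brier score is the mean squared error between stated probabilities
# and 0/1 outcomes: 0.0 is perfect, and always saying 50/50 scores 0.25.

def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts and 0/1 outcomes."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

outcomes   = [1, 0, 1, 1, 0]                # what actually happened
hedger     = [0.5, 0.5, 0.5, 0.5, 0.5]      # never commits beyond 50/50
calibrated = [0.8, 0.2, 0.7, 0.9, 0.3]      # varies confidence with evidence

print(brier_score(hedger, outcomes))      # 0.25
print(brier_score(calibrated, outcomes))  # lower is better calibrated
```

A well-calibrated forecaster who commits to informed probabilities beats one who hedges everything at 50/50, which is the sense in which accuracy in beliefs leads to better bets.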
Strategies to overcome these deficiencies include remembering
that we are in a constant state of learning, where facts can become redundant. When
it comes to predictions, a range of outcomes is possible, and the less luck
involved, the tighter the range. We can vary our level of certainty depending
on whether we are dealing with facts or predictions/opinions. Do not feel you
are compromising yourself if a change in opinion is required by the facts; there
is no loss of face in that.
3. The path from being an experienced practitioner to being
an expert is the ability to identify when the outcomes of our decisions have something
to teach us and what that lesson may be. Unfortunately, the world does not
connect the dots between outcomes and causes for us; working through that data
is an important part of completing the feedback loop that improves decision-making over
time. That leads to differentiating luck from skill, and the dividing line is
the level of control you have: where you have no control it is luck; where you have control
it is skill. If this differentiation is done well, we can gain valuable insights
to use in the future for the skill bucket. Outcomes don't tell us what is our
fault and what isn't, what we should take credit for and what we shouldn't. In
reality, outcomes are rarely all luck or all skill but a blend. Misreading
luck as skill, or vice versa, is what Duke calls a fielding error. Self-deception is an issue: the error
most people make here is attributing all gains to skill and all losses to luck. It gets
worse; we are also more likely to dismiss the gains others make as luck and their
losses as a lack of skill. This bias prevents us from learning from other
investors who may add valuable insight. [Actually,
this is a fault of mine, what could that knucklehead or jabberer possibly know!
Dismissing outright reduces a potentially valuable learning path]
One of the most debilitating aspects of human behaviour is
the willingness to compare ourselves with others. This can lead to adverse
decisions, and this robust and perverse habit impedes learning. To improve
accuracy and the search for truth, we need to give credit where it is due, admit
where our decisions could have been better, and acknowledge almost nothing is
black and white. [I would add that comparisons with others are fraught with apples
and oranges issues]
Changing habits is very difficult. They are built into our psyche
and many revolve around us wanting to feel better about ourselves, especially in
the short term. The author suggests new, more constructive ways to feel better
about yourself: be a better giver of credit, a good admitter of mistakes, a good
finder of mistakes in good outcomes, a good learner, and thus a better decision-maker.
Focus on accuracy and truth-seeking.
Truth-seeking is recognizing that the way we field an outcome
is itself a bet, and that we should consider a greater number of alternative causes more
seriously. Thinking in terms of bets (alternate possible outcomes) triggers a
more open-minded exploration of alternative hypotheses and of reasons supporting
the opposite conclusion, countering self-serving bias (confirmation/resulting). Two other
benefits are that we can be more compassionate with ourselves over genuine bad
luck, and that by acknowledging scenarios in advance we become aware of
possible outcomes and courses of action and mentally prepared for them. Moreover,
realising that skill and luck both play a part means explicitly accepting
that it is not all you (positive and negative).
There is no doubt that implementing these changes is
difficult but small improvements over time will reap large rewards.
4. This chapter focuses on using groups to further accuracy
and truth-seeking. Firstly, some people are never open to truth-seeking, while
others are simply not open to it at a particular moment.
Focus on things you can control (decisions) and let go of
things you have no control over (luck). Work to be able to tell the difference between
the two.
In groups, it means learning to be open-minded to those who
disagree with us, giving credit where it is due, and taking responsibility where
appropriate, especially when that is uncomfortable. It takes effort to acknowledge
mistakes, to admit our beliefs are not true, and to forgo credit for a great result
without feeling bad about ourselves. It is hard work and we need breaks to
replenish the energy expended.
Other people can spot our errors better than we can. Groups,
however, can be tricky: they have to find a balance between merely confirming each
other's views and exposing biases and alternate scenarios (confirmation versus exploration).
The book quotes Tetlock to show that the best outcomes occur when the presenter is accountable
to an audience whose views are unknown, and that
audience is interested in accuracy, reasonably well informed, and has a legitimate
reason for inquiry. In summary: be accountable to a group whose interest is
in accuracy. Adding diversity of thought to the group is important,
but it must not spill over into incivility. There can be tension between getting
to the truth and getting along with somebody, and that tension has to be managed. The benefit is that others in the group are
not wrapped up in preserving our narrative or anchored by our biases. Accountability
is defined as the willingness or obligation to answer for our actions to others.
Page 138 gives a series of questions for examining the accuracy of our beliefs.
As an aside in my experience, the quality of an investing team
is perhaps the most underestimated part of investing success. Having worked in
outstanding teams as well as in teams that struggled to finish a meeting or reach
a conclusion, I have seen a huge difference in the quality and effectiveness of
teams. Why they work and why they don't is an issue for another paper. The interesting
points are that intelligence and hard work are often not the problem, and that some
members who have struggled in one team environment become stars in another.
Of course, I have theories on this!
5. The chapter works through a decision-making framework. Firstly,
data should be wide-ranging, and indulge the broadest definition of what could
be relevant. Without facts accuracy suffers. Data should be shared, with no
hoarding. There is a reluctance to disclose information harmful to our thesis. Secondly,
don’t disparage or ignore an idea just because you don’t like who or where it
came from. It works both ways, we accept ideas from those we like without
vetting them and vice versa. Take the idea on its own merits. [I accept this is
a personal bias, one issue is how much data you have to filter, bias is to back
idea winners, but I get the point]. The point here is to widen the perspective.
Thirdly, be disinterested; by that the author means be careful about carrying your biases
into any assessment. One bias highlighted is "resulting": assuming an idea must be
good because it worked, when what matters is the soundness of the process and whether
it is replicable. Fourthly, attempt to be open-minded with ideas presented to you.
Fifthly, be sceptical, i.e. ask why things might not be true rather than why they
are true. Highlight the uncertainty in any possible outcome; we need to lean
over backwards to work out where we could be wrong, without being nasty or
dismissive and undermining truth-seeking. Sixthly, be diplomatic within the group
and when communicating outwards, which aids the process instead of
disrupting it.
The last chapter is generally about self-accountability, achieved by
imagining reporting to our future self or assessing our past performance. When we
think about the future or the past we make more rational decisions, because it
takes away the pressure of the now, which we usually over-emphasise. Regret has
limited usefulness because it comes after the event; moving it before the event in
our thinking helps, and it also reduces the chances of one bad decision becoming
two bad decisions through a remorse-fuelled reaction. We can do this by thinking
about how a future "us" is likely to feel about a decision, or how we would feel
today if a past "us" had made the decision. Planning has been discussed above
with the benefits noted. Of course, planning ahead familiarises us with
the likelihood of a negative outcome and how we would feel about it. Coming to
terms with the likelihood of a bad outcome in advance will feel better than
refusing to acknowledge it, or facing it after the outcome. Controlling regret,
a severe emotion, reduces the chance of making another poor decision. Recruiting
a past or future "us" to assess the situation from the perspective of another
time discourages us from magnifying the problems of today, when we may be overcome
with negative emotions and act irrationally. The book, on pp. 204-7, gives a list of
irrational behaviours/sayings to look out for, each of which should trigger a timeout to break
the emotional circuit. [Interesting: I recall some successful investors having pictures
of those close to them around their desks, the point being what their loved ones
think about this decision or the investor thinking about the consequences of the
decision they are about to make on their loved ones]
The book then goes through the process of scenario analysis
to help decisions. An interesting piece is arguing the opposite case (of which
we should all be at least aware). The aim here is to not be surprised by the future,
it may not be what you want, but having considered the adverse outcome you are better
able to respond.
The book discusses "backcasting", where you imagine you have achieved
your goal and consider what went right to get there. Sometimes this is easier
than plotting forward from the present. The aim is to highlight the low-probability outcomes
required to reach your goals. Looking at the problem from different angles may
help. The other positive of this method is that it highlights inflection points
for re-evaluating the plan and reacting to developments that interfere with
it. Prepare for bad outcomes, in both your actions and your emotions.
Improving decision quality is about increasing the chances
of good outcomes, not guaranteeing them. Good results compound and good
processes become habits and make possible future calibration and improvements.
That’s it, so what? you may ask. What people take away from
this will be person dependent. For me, it comes to being more open-minded and
not too dismissive of ideas and people. As an example, we can differentiate
between stock analysts and industry analysts: being dismissive of an industry analyst because
they can't properly value a stock also dismisses their considerable knowledge of
the industry, which can help immensely. Secondly, realising that biases are so
ingrained that thinking we can overcome them is probably wishful thinking and
the best we can do is to continually remember that biases lurk within us and attempt
some methods discussed here to lessen their impact. Keep asking the counterpoint.
The message for me is to have a proper plan to follow, built on knowledge and an
acceptance of uncertainty (i.e. that outcomes depend on many variables), which
should improve the consistency of results.