This fascinating book is
full of insights into human judgement and decision making, with an emphasis on
the Nobel-winning collaboration between Daniel Kahneman and Amos Tversky, in
which they challenged the idea that human thinking is fundamentally rational,
and showed how “systematic errors” of judgement can be explained, not by
emotional interference, but by “the design of the machinery of cognition”. The book includes two of Kahneman &
Tversky’s key papers, among them 1974’s “Judgment under uncertainty: heuristics
and biases”, in which they presented three major heuristics that we use to
answer questions about probabilities or estimated values (representativeness,
availability, and adjustment from an anchor) and demonstrated a range of biases
to which each frequently gives rise. The
“fast and slow” thinking of the title refers to the “two systems” or modes of
thought that much of the book discusses: the automatic, quick, involuntary and
intuitive “System 1”, and the effortful, slow, complex and logical “System 2”.
One thing I particularly
enjoyed about this book was the inclusion of lots of striking case studies and
experimental results, such as:
- An office kitchen honesty box where the level of contributions increased sharply when an image of eyes “primed” the idea of being watched;
- Zajonc’s classic experiment demonstrating our liking for the familiar, where made-up words were more likely to be judged to mean something good if the participant had previously seen them;
- Kahneman’s own discomfort as an examiner when he realised how strongly his grades for subsequent questions were influenced by the first essay he’d read from that student (a “halo effect”);
- Experiments where ratings of competence based on brief viewings of politicians’ faces (generally higher for faces having “a strong chin with a slight confident-appearing smile”) correctly predicted (with around 70% success) the winner of an election;
- “Anchoring effects” where even an obviously random number picked on a wheel of fortune influences estimated answers to numeric questions, and even estate agents are influenced by the asking price when valuing a house, despite claiming not to be;
- An analysis showing the absence of any persistent differences in the results obtained by professional investment advisors, suggesting that successful stock picking is a matter of chance rather than skill – hard to believe because “we are far too willing to reject the belief that much of what we see in life is random”;
- A few shocking illustrations of the “planning fallacy”, where forecasts are “unrealistically close to best-case scenarios” because others’ experiences and potential setbacks are not considered – such as Kahneman’s own team of textbook writers (who estimated they had two years’ work left, then learned that similar projects had a 40% failure rate and took around eight years when they did succeed, but persevered anyway, finishing eight years later when the book was no longer in demand), and the Scottish Parliament building, originally budgeted at £40 million but finally completed for £431 million;
- An illustration of the costly effects of loss-aversion, where self-imposed goals act as reference points for our decisions, such as the New York cab drivers who go home early on (potentially high-earning) rainy days because they’ve met their daily targets, and work long hours on sunny ones when fares are scarce;
- A graph on reported life satisfaction, showing a peak at the time people got married and a sharp dip afterwards, which becomes less disturbing when read as “a graph of the likelihood that people will think of their recent or forthcoming marriage when asked about their life”.
Although I’m sure I’m
still susceptible to the many biases described in this book, I have noticed
the ideas changing my own thinking on several
occasions. I found myself including “the
law of small numbers” (that “small samples yield extreme results more often”)
in a mental list of possible reasons for the difference in litter distribution
in different areas of my road – an effect the quick simulation after this paragraph illustrates. I once
shouted “regression to the mean” at the TV when a presenter spoke of poor
performance following a win as a “curse”. I resisted the urge to throw in extra items
when selling on eBay, because I’d learned that people evaluate sets by forming
an impression of the average or typical item, and therefore adding lower-value
items can actually devalue the entire set.
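To convince myself of the small-samples claim, here’s a minimal simulation of my own (not from the book; the 20% and 80% cut-offs are arbitrary choices of mine) that draws repeated samples from a fair 50/50 process and counts how often the sample proportion comes out extreme:

```python
import random

def extreme_rate(sample_size, trials=100_000):
    """Fraction of samples of a fair 50/50 process whose proportion
    of successes is 'extreme' (at most 20% or at least 80%)."""
    extreme = 0
    for _ in range(trials):
        successes = sum(random.random() < 0.5 for _ in range(sample_size))
        proportion = successes / sample_size
        if proportion <= 0.2 or proportion >= 0.8:
            extreme += 1
    return extreme / trials

for n in (5, 20, 100):
    print(n, extreme_rate(n))
# Roughly 37% of samples of size 5 are extreme, about 1% of samples
# of size 20, and essentially none of size 100: small samples yield
# extreme results more often.
```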
More significantly, two sections of this book have led to my strangely
calm and agreeable attitude to a potentially life-changing decision that last
year filled me with dread. Firstly, I
became convinced that much of my resistance to a big change is probably because
our innate loss aversion makes us value what we already have more than what we
might gain (“the disadvantages of a change loom larger than its advantages,
inducing a bias that favours the status quo”).
Secondly, I was reminded that, because even a drastic change like
paralysis only affects people’s mental state when they’re focused on it, and
attention returns to other things over time, “nothing in life is as important as you
think it is when you’re thinking about it”.
Some of the questions
raised in this book are extremely hard to answer – such as whether people’s
reluctance to automate life-or-death decisions should be overruled if an
algorithm is available that makes fewer mistakes than a human expert. That
reluctance rests on the “stubborn psychological reality” that “for most
people, the cause of a mistake matters”, which makes a death caused by
computer error seem more tragic than one caused by a human. I
particularly struggled with the material in the final section about the
contrast between the evaluations and priorities of the “experiencing self” and
the “remembering self” (which disregards the duration of experiences, basing
evaluations instead on moments of peak intensity and the way situations
end). Despite acknowledging that
“memories are all we get to keep from the experience of living”, Kahneman mostly
presents our tendency to base our decisions on the priorities of the
“remembering self” as a mistake, but I felt (with my own experience of
childbirth in mind) that – since we only experience each situation once but
live with the memory of it for the rest of our lives – the emphasis on memory
was reasonable, although choosing to disregard lived experience does of
course raise major concerns. This section seemed to pose more questions than
it answered, such as the challenging question of whether medical interventions
should be prioritised and designed based on how feared a condition is, how much
suffering people actually experience in their daily lives, or how they evaluate
their situation when reflecting on it.
In his conclusion,
Kahneman notes with regret that in attempting to improve judgements and
decisions, “little can be achieved without a considerable amount of effort”,
and that the main route to avoid biases is to learn enough about them to recognise
situations where errors occur, and then to consciously slow down and engage
System 2. He does, though, also offer some helpful
advice and techniques:
- “Decorrelate error” by making judgements as independent as possible, e.g. when conducting a meeting, ask each participant to summarise their opinions in writing first;
- In single-issue negotiations, avoid anchoring effects by going first if possible, and if you think the other side has made an outrageous offer, “make a scene, storm out or threaten to do so, and make it clear – to yourself as well as to the other side – that you will not continue the negotiation with that number on the table”;
- Before making significant decisions, hold a “pre-mortem” meeting where people imagine a disastrous outcome for the currently favoured plan;
- Start by considering base rates and averages when making estimates and predictions, e.g. always obtain statistics from similar projects to use as a baseline, and consider how strong the correlation is between the predictors you’re considering and the result (see the sketch after this list);
- Use simple checklists and formulas (such as the Apgar score) to assist with decision making – e.g. conduct interviews with standardized, factual questions, using “a disciplined collection of objective information and disciplined scoring of separate traits” to rate candidates;
- Use “risk policies” such as “never buy extended warranties” to group gambles together when considering potential losses and gains, and remember that “you win a few, you lose a few”;
- Remember that “intuition cannot be trusted in the absence of stable regularities in the environment”, and that experts are only reliable when they have experienced good feedback from a predictable environment.
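As I understand the corrective procedure Kahneman describes for taming intuitive predictions, you start from the baseline and move towards your intuitive estimate only in proportion to how well your evidence actually correlates with the outcome. A minimal sketch of that idea, with made-up numbers:

```python
def corrected_prediction(baseline, intuition, correlation):
    """Move from the baseline towards the intuitive estimate in
    proportion to the predictive value of the evidence:
    correlation = 0 -> just use the baseline;
    correlation = 1 -> trust the intuition fully."""
    return baseline + correlation * (intuition - baseline)

# Hypothetical example: similar projects overrun their budgets by 40%
# on average, my gut says this one will overrun by only 10%, but my
# impressions correlate only weakly (r = 0.3) with actual outcomes.
print(corrected_prediction(baseline=40, intuition=10, correlation=0.3))
# -> 31.0, a prediction pulled most of the way back to the base rate
```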
Perhaps more
controversially, he also concludes (with Richard Thaler and Cass Sunstein, the
authors of “Nudge”) that the vulnerability of human decision making to biases,
particularly the powerful impact of framing effects, means that people “often
need help to make more accurate judgements and better decisions, and in some cases
policies and institutions can provide that help”. The “libertarian paternalism” advocated here
consists of things like making pension plans opt-out rather than opt-in, and regulating
the way that important messages are communicated (such as displaying the
more intuitive “gallons-per-mile” alongside mpg on new cars) in order to
provide a “nudge” towards decisions believed to serve people’s long-term
interests.
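The gallons-per-mile example rewards a minute of arithmetic. With my own illustrative numbers (not the book’s): over a fixed distance, an upgrade from 12 to 14 mpg saves more fuel than one from 30 to 40 mpg – surprising in mpg terms, obvious in gallons-per-mile terms:

```python
def gallons_used(mpg, miles=10_000):
    """Fuel consumed over a fixed distance – the gallons-per-mile view
    that makes savings directly comparable."""
    return miles / mpg

# Hypothetical upgrades, each over 10,000 miles of driving:
print(gallons_used(12) - gallons_used(14))  # 12 -> 14 mpg saves ~119 gallons
print(gallons_used(30) - gallons_used(40))  # 30 -> 40 mpg saves ~83 gallons
```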
Early in the book, Kahneman
describes experiments using problems with intuitively appealing wrong answers that
could be solved correctly with only a little mental effort. He describes people who are more able to
resist the tempting fallacies (presumably by deliberately slowing down to check
their initial answer and search their memories for additional relevant facts)
as “more alert, more mentally active, less willing to be satisfied by
superficially attractive answers, more skeptical about their intuitions”. According to Keith Stanovich, these qualities
are not guaranteed by the possession of high “algorithmic” intelligence, but
rather constitute a separate ability, “rationality” (or being “engaged”, as
Kahneman describes it). It would appear
that this is the aptitude that we need to develop if we wish to avoid some of
the pitfalls described so convincingly in this book.
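Returning to those problems with intuitively appealing wrong answers: the best known is the bat-and-ball question from early in the book. A bat and ball cost $1.10 together, and the bat costs a dollar more than the ball – so how much is the ball? Intuition shouts “ten cents”, but a little System 2 algebra gives five:

```python
# Work in cents to avoid floating-point noise.
total, difference = 110, 100      # bat + ball = 110; bat = ball + 100
ball = (total - difference) // 2  # so 2 * ball + 100 = 110
bat = ball + difference
print(ball, bat)  # 5 105 – the ball costs five cents, not ten
```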