Saturday, May 12, 2012

Thinking, Fast and Slow (a review)


I’ve been shaken by a recent book called Thinking, Fast and Slow (2011), by Daniel Kahneman.  I read the book mainly because Steven Pinker (one of my top-shelf authors) called Kahneman “the most important psychologist alive today.” His work, Pinker wrote, “has reshaped social psychology, cognitive science, the study of reason and of happiness, and behavioral economics …”  I figured anything Pinker endorses with superlatives can’t be bad.  And it wasn’t.  It’s actually amazing.

Kahneman, it turns out, is a Nobel laureate for developing prospect theory, and he has been actively working in the field for nearly 60 years.  The theory (developed with his longtime collaborator Amos Tversky) holds that people evaluate risks in terms of losses and gains from a reference point, and not simply in terms of final outcomes, as utility theory would suggest.


The book is a summary of his and others' research on judgments and decisions.  He started his own work at 20, when he was asked to determine which recruits in the Israeli Army were suited for officer training. His PhD is from Berkeley and he’s now faculty emeritus at Princeton. He is collaborative, and ingenious, and focused on the ways in which the human mind cuts corners and on what can be done about it. At the end of each chapter he suggests how lessons from these often startling findings can be used in daily life, to think more clearly.

So here is the main point.  In ways we can understand and predict, we make systematic errors in judgment throughout the day.  It’s probably because of natural selection that we’re good at making quick decisions and, although they are often good enough, they are also rife with shortcuts, exaggerations and mistakes.  Then they are capped off neatly with overconfidence.

He says the mind has two operating systems which he labels 1 and 2.  The first is the intuitive mind, the transparent bit that recognizes a pencil as a pencil, or a face as angry or sad.  It's the part that chooses between fight and flight, that judges the size of numbers, keeps track of time, recalls memories, and basically takes the first quick swing at everything.  We consider this “knowing.” System 2, what you might call “thinking,” includes estimation, calculation, and all manner of figuring out – much of which happens very fast and we are at least partially aware when it's at work.  Both systems cut corners, jump to conclusions, omit inconvenient information, exaggerate, and guess.  System 2’s biggest problem, according to Kahneman, is laziness.  It leaves System 1 in charge, and when called on it does the minimum that is required.  Generally this means piecing together a credible solution, then going back to rest.
At first this dichotomy seemed a bit forced to me, as I thought there must be a continuum instead.  But it turned out to be a fabulous way to parse out how, when, and why our thinking can go wrong. 
Things we encounter firsthand seem more important and more common than they should, as do things we can easily recall.  When the chance of something (good or bad) is small, we tend to either ignore or exaggerate the possibility of it happening.  We treat a loss as about twice as important as an equal gain.  Thoughts immediately preceding a judgment can dramatically influence it.  In retrospect, assessing an event, we give far too little weight to blind luck.  Considering options separately often leads to different conclusions than you would reach within a broader framework.  We use stereotypes to draw inferences from the group to the individual, and the other way around.  And we’re bad at statistics.  Consider this:
A woman is seen reading the New York Times on the New York subway.  Which of the following is a better bet about her level of education?
    1. She has a PhD
    2. She does not have a college degree
If you chose the first, as I did, think again.  There are a lot more non-graduates than PhDs riding the subway.  We overlooked the base rate.
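To see how the base rate overwhelms the stereotype, here is a back-of-the-envelope Bayes calculation in Python.  Every number in it is invented purely for illustration; the book gives no figures for this example.

    # Hypothetical numbers, chosen only to illustrate base-rate neglect.
    phd_riders = 0.02             # assumed share of subway riders with a PhD
    nongrad_riders = 0.50         # assumed share without a college degree
    p_times_given_phd = 0.40      # assumed chance a PhD rider reads the Times
    p_times_given_nongrad = 0.05  # assumed chance a non-graduate rider does

    phd_and_times = phd_riders * p_times_given_phd              # 0.008
    nongrad_and_times = nongrad_riders * p_times_given_nongrad  # 0.025

    # Even with a PhD rider eight times as likely to be reading the Times,
    # the non-graduate is still the better bet, by roughly 3 to 1.
    print(nongrad_and_times / phd_and_times)

The stereotype only supplies the likelihood ratio; the base rate supplies the rest.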
Here’s a clever one:
Hearing that the highest incidences of kidney cancer in the U.S. are found in rural, sparsely populated counties, you might guess the reason: these places are poorer and less educated, with less access to health care, where people have high-fat diets, drink more and smoke.  That would certainly make some sense.  It’s a coherent, compelling, and completely fabricated story.
If you had heard instead that these same areas have the lowest rates of kidney cancer, you might come up with this one: rural living is healthier, with clean air and fresh locally grown food, and people get more exercise and suffer less stress than they would in urban areas.

So which of these two statements about kidney cancer is true?  Both are; there are fewer people in rural counties, and when we compare cases per 1,000, a small denominator means that random variation in the numerator causes large swings in the rate.  Example: if a disease affects 20% of the general population, then among counties with just 2 people most will show a rate of 0%, some 50%, and a few 100%.  None will reflect the actual average.  This is a statistical phenomenon that even statisticians often overlook.
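Here is a small simulation of that effect, a sketch of my own rather than anything from the book: every county has exactly the same true prevalence, yet the tiny counties produce both the highest and the lowest observed rates.

    import random

    random.seed(0)
    TRUE_RATE = 0.20  # the same underlying prevalence in every county

    def observed_rate(population):
        """Observed rate in one simulated county of the given population."""
        cases = sum(random.random() < TRUE_RATE for _ in range(population))
        return cases / population

    small = [observed_rate(10) for _ in range(1000)]      # 1,000 tiny counties
    large = [observed_rate(10_000) for _ in range(1000)]  # 1,000 big counties

    print("small counties: %.0f%% to %.0f%%" % (min(small) * 100, max(small) * 100))
    print("large counties: %.1f%% to %.1f%%" % (min(large) * 100, max(large) * 100))
    # The small counties swing from 0% to well over 50%; the large ones barely
    # budge from 20%.  The extreme rates, high and low, all come from small counties.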
Some of the findings in the book are so bizarre I still have my doubts, although the research seems sound, has been replicated, the results are consistent, and the effects are strong.  For example, when subjects were asked to estimate the percentage of African nations that are members of the UN, it mattered whether they had just spun a 10 or a 65 on a (rigged) “random” wheel.  Their estimates for UN membership averaged 25% and 45%, respectively!  While people are holding a sequence of numbers in their heads they have less self-control around cake!  If they grip a pencil between their teeth (forcing a smile), cartoons seem funnier to them than when they hold it between pursed lips (forcing a frown)!  People pencil-frowning will concentrate more strongly on the task at hand.  And those required to nod were more likely to agree with an editorial they listened to than those who were told to shake their heads.  These results just seem absurd.  I admit that my incredulity is not evidence that they aren't true.  I just don't see how such simplistic influences would not be exploited, and therefore removed, by evolutionary forces.

However, I found myself falling for many of the little test-yourself demonstrations in the book.  Consider this question:

Was Gandhi 130 years old when he died?  You’ll know he wasn’t.  But how old was he when he died?  The typical response is a significantly larger number than that given by subjects who were first asked whether he had died at the age of 40.  The first group tends to adjust downward from 130 until they hit ages they think are plausible; the others do the same upward from 40.  Someone who figured Gandhi died somewhere between 65 and 85 might give answers nearly 20 years apart, depending only on the anchor they had been primed with.

Anchors are one of many subtle influences that bias our thinking. Here are some others:
The availability heuristic (page 129), confidence by coherence (209), the illusion of validity (211), the illusion of skill (212), the illusion of intuition / strength of algorithms (223), the hindsight bias (202), the illusion of control (259), the halo effect (199), nonregression to the mean (175), inside vs. outside view (248), the planning fallacy (250), stereotyping (168), representativeness (149), the substitution effect (139), probability neglect (144), statistical and causal base rates (168), the conjunction fallacy (158), theory-induced blindness (277), risk aversion (270), loss aversion (284), the endowment effect (293), bad news bias (301), the possibility effect (311), the certainty effect (311), the expectation principle (312), denominator neglect (329), and overweighting (333).  There is narrow vs. broad framing (336), reference points (281), the sunk cost fallacy (345), the evaluability hypothesis (360), and the effect of categories (356).  The focusing illusion (402) is a simple one, with reach (“nothing in life is as important as you think it is when you are thinking about it”), as is WYSIATI (153: “what you see is all there is”).
This may seem like a chore to work through, but it’s not.  The writing is very clear and engaging, and the 38 well-crafted chapters are each, remarkably, about 10 pages long.  The book has plenty of examples, end notes with references, concluding advice, full reprints of two seminal journal articles in two appendices, and lots of little demonstrations you can try on yourself.  Here’s another of those.  Which of these two opportunities would you prefer?
  1. a gamble that offers a 10% chance to win $95 and 90% chance to lose $5
  2. or would you pay $5 for a 10% chance to win $100 and a 90% chance to win nothing?
Most people find the second one more attractive.  The two are in fact identical: either way you have a 10% chance of coming out $95 ahead and a 90% chance of ending up $5 behind.  This is difficult to see because we are more averse to losing $5 than we are to paying $5.  Here’s another: you are considering surgery (with slightly better long-term results than radiation therapy) and you are told one of the following about it:
    1. The one-month survival rate is 90%
    2. There is a 10% mortality in the first month.
Doesn't the first one sound better?  Now consider this:
  • Adam switches from a gas-guzzler of 12 mpg to a slightly less voracious guzzler: 14 mpg
  • Beth switches from a 30 mpg car to one that runs at 40 mpg
Who will save more gas by switching?  The answer is Adam, by a large margin.
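The trap is that miles per gallon is the inverse of what actually matters, gallons burned per mile.  A quick check of the arithmetic (mine, not the book’s), over the same 10,000 miles of driving:

    MILES = 10_000  # compare fuel used over the same distance

    def gallons_saved(old_mpg, new_mpg, miles=MILES):
        """Gallons saved per `miles` driven after switching cars."""
        return miles / old_mpg - miles / new_mpg

    print(gallons_saved(12, 14))  # Adam: about 119 gallons saved
    print(gallons_saved(30, 40))  # Beth: about 83 gallons saved

Adam's modest-looking 2 mpg improvement saves roughly half again as much fuel as Beth's 10 mpg jump.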
One of the best sections of the book is the last. It describes two very distinct senses of self: one is the experiencing self and the other, the remembering self.  When someone asks “how are you?” it might mean "how's your mood right now?" but more likely "how have things been going?” … in other words ... “how happy are you with your memories?”  It’s the remembering self that most people care about, whether they're looking at their own lives (it’s why we take pictures on vacations) or others'.  When subjects were asked to imagine taking a dream vacation, after which all memory of it would be erased, most would not send themselves (or another amnesic) to hike mountains or trek through the jungle.  "These experiences are mostly painful in real time and gain value from the expectation that both the pain and the joy of reaching the goal will be memorable.”  “Odd as it may seem,” Kahneman concludes, “I am my remembering self, and the experiencing self, who does my living, is like a stranger to me.”
Unfortunately it's a little worse still, because the remembering self doesn’t represent the experiencing self very accurately. Looking back, durations of experiences don't matter much.  Only two things do:  1. peaks (good or bad), and 2. how the event ended.  Kahneman explains how these findings can be put to practical use, and also how education, religion, illness, poverty, wealth, and other factors impact the two forms of satisfaction.
It’s a great book -- I believe it's a full life’s work -- and my criticisms are minor.  Maybe not enough distinction was made between one-time judgments and iterated ones.  In the Prisoner’s Dilemma, for example, one-off games lead to mutual defection, but repeated games with the same players (as in the real world) lead instead to cooperation.  Some of the results in a psychology lab may be quite different in the real world.  There are also a few surprising typos, like a mix-up on page 335, where “AD” and “BC” are switched, throwing off the explanation; and on page 337 two miscalculations in the formulas actually invalidate the results.
But if these are the low points, they are not very low.  The peaks were many and very high -- and even though it took me almost a month to read, duration doesn't count for much -- I remember it very fondly and it will sit on my top shelf.  Next to Pinker.
(Gandhi died at 78, BTW)

5 comments:

  1. Mistakes on page 335 are not just a switch between AD and BC - the combinations are just incorrect!!!

  2. I do not agree with the claims of error with the case of AD and BC. It took me a while, but I agree with what is stated.

  3. I read and reread the bit on 335, and I really think it is a typo. Anyone else think so?

  4. This comment has been removed by the author.

  5. Also a typo on page 186, where he is speaking about a person's intelligence using GPA and then later in the sentence uses GDP in the same context :)


I've allowed comments without login.