Tuesday, December 11, 2012

Human Evolution: A Geographer's Perspective

Not long ago, I received an email asking me to speak to a church group about the evolution of humans from a geographer's perspective. The request came from a member of the Ethical Humanist Society in Skokie, Illinois; he'd seen my blog. I knew enough about the organization to quickly agree, but I asked for several months to go over my books and notes on hominids and humans and their migration out of Africa. At first I had in mind a sort of Jared Diamond approach; I wanted to reread his books (Guns, Germs, and Steel and The Third Chimpanzee -- I tried, but could never get through Collapse), Darwin's The Descent of Man, and an anthropology text called Darwin's Legacy. But the more weekends I tinkered with my thoughts, the more the scope of my talk expanded, and I chose in the end to weave together themes from many of my existing blogs, culminating with a new set of ideas that has intrigued me. This blog is a summary of the talk, which I gave to a full house on Dec. 2.

As I put down my thoughts in narrative here, I will embed links to my various essays and mention specific books so anyone interested in going deeper can do so easily. As my date approached, whenever I timed myself honestly I could not get through my address in anywhere near 45 minutes, so I invoked what most people call natural selection, evolution, or survival of the fittest; it is simply the process of elimination. I jettisoned whole chunks, distilled others, and then cut even more to allow myself some digression -- to switch it up a bit with something spontaneous if the mood struck. After all, innovation is an essential part of evolution too.

I must admit I was a bit outside my comfort zone as this was the first time I was to speak on evolution in a formal setting.  I worried, as is normal for me, that something might go terribly, crashingly wrong.  I teach cartography and GIS, and while I’ve been designing a class on biogeography I haven’t nearly rolled it out yet.  But I have plenty of raw enthusiasm, quite a few thoughtful perspectives, formal coursework, and my heavily annotated science books which are listed to the right. This was quite a welcome opportunity too.  By now I’ve exhausted my family members on the topic. To be fair, they are still receptive to my ruminations -- but only in small doses.

A little background about Ethical Humanists. A few years ago, when I took an online survey to determine which religion best matched my views, I hadn't known of ethical humanists, but it turned out I was one. I'm no authority, but Humanists seem a bit like the Unitarians I've known and many of the liberal Quakers I grew up with -- keenly interested in knowledge, goodness, and community -- and pointedly light on the supernatural talk. Like the Unitarian church I briefly attended in Southern Illinois, these Ethical Humanists bring in speakers. I recently saw Dan Barker, author of Godless, address this very group; he is a former evangelical preacher and is now an articulate, lighthearted, and very effective spokesman for those who have shed their religious beliefs.

And as I was milling about before the talk I picked up a little card summarizing the mission statement of the parent organization, the American Humanists. Excerpts from their manifesto include:
 



"Humanism is a progressive philosophy of life that, without supernaturalism, affirms our ability and responsibility to lead ethical lives of personal fulfillment that aspire to the greater good of humanity."
 
"Knowledge of the world is derived by observation, experimentation, and rational analysis." …
 
"Humans are an integral part of nature, the result of unguided evolutionary change."  
I found that refreshing; clearly I could expect a kind and educated audience and the last bit was particularly helpful with my last-minute jitters. Here is their web site -- they have events all through the week, and for children, too.
 
My title was "Human Evolution: A Geographer's Perspective" and I started off with my standard clarification of the scope of geography -- a discipline which is oddly inconspicuous in America but has become much easier to explain with the advent of Geographic Information Systems. GIS uses a variety of digital maps of the same location. One might be of winds, another of population, and a third of hospitals. And each one carries an inventory; imagine a spreadsheet wherein each row represents one little mark on the map: a weather station, census tract, or particular hospital. The respective columns contain wind speed and direction at various times of day; everything from the U.S. census; and all the information about hospital facilities. Stack these maps, and GIS will let you count the number of young children 5 miles downwind from a chemical plant, and determine whether nearby pediatric wards could handle an explosion. The databases are relational, with location being the relational link. This is how geographers think -- location is our hook, and we can hook into all things for which location matters.
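For anyone who likes to see the idea in code, here is a toy sketch of that "stack the maps" query -- plain Python rather than any particular GIS package, with made-up coordinates and counts, and with wind direction ignored for brevity:

```python
# A toy illustration of the idea that each map layer is a table whose rows are
# marks on the map, and that location is the key relating the layers.
from math import radians, sin, cos, asin, sqrt

def miles_between(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3959 * 2 * asin(sqrt(a))

# Hypothetical layers: a chemical plant, census tracts, hospitals.
plant = {"lat": 42.03, "lon": -87.75}
tracts = [  # each row: a location plus its attribute columns
    {"lat": 42.05, "lon": -87.74, "kids_under_5": 320},
    {"lat": 42.10, "lon": -87.70, "kids_under_5": 150},
    {"lat": 41.90, "lon": -87.90, "kids_under_5": 410},
]
hospitals = [
    {"lat": 42.06, "lon": -87.73, "pediatric_beds": 40},
    {"lat": 41.95, "lon": -87.80, "pediatric_beds": 12},
]

# "Stack the maps": relate the layers through location alone.
kids_at_risk = sum(t["kids_under_5"] for t in tracts
                   if miles_between(plant["lat"], plant["lon"], t["lat"], t["lon"]) <= 5)
beds_nearby = sum(h["pediatric_beds"] for h in hospitals
                  if miles_between(plant["lat"], plant["lon"], h["lat"], h["lon"]) <= 5)
print(f"{kids_at_risk} young children within 5 miles; {beds_nearby} pediatric beds nearby")
```

A real GIS does the same relational trick with far richer geometry, but the essence is right there: every layer is a table, and location is the join key.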
 

I also quickly ran through some of the more common misunderstandings about evolution -- common, that is, among the 45% or so of Americans who believe in evolution at all (I know, WTF!). Evolution is not a drive toward perfection, humans didn't come from chimps, very few traits result from a single gene, biology actually does interact with environment, and so on. You only need three things for evolution to occur, according to Richard Dawkins in the seminal The Selfish Gene: 1. something must duplicate, 2. it must do so imperfectly, and 3. the differences must matter in the real world. That's it. Then it evolves. And I defined species, using the biological definition: groups which cannot naturally produce fertile offspring together. Species, I'll add, are usually the result of geographical barriers separating populations, which are then groomed by selective pressures or drift apart randomly. The exception to this is called sympatric speciation, where different groups fall into different niches within the same locale. There is much about this last one which is still a mystery to me; I understand why groups would settle into different niches and why the populations would drift apart, but why would the hybrid (like a mule), vigorous in all other ways, be sterile? I've read expert opinions, but I still don't know.
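Dawkins' three requirements are so spare that a dozen lines of code can act them out. This little toy is my own, not from the book; the bit-string "genomes" and the fitness rule are invented purely for illustration:

```python
# A toy sketch of Dawkins' three conditions: things duplicate (copying),
# imperfectly (mutation), and the imperfections matter (selection).
import random

def mutate(genome, rate=0.01):
    """Copy a genome, flipping each bit with a small probability."""
    return [bit ^ (random.random() < rate) for bit in genome]

def fitness(genome):
    """'Mattering in the real world,' reduced to a number: more 1s survive better."""
    return sum(genome)

population = [[0] * 50 for _ in range(100)]          # start with identical, all-zero genomes
for generation in range(200):
    # duplication with error
    offspring = [mutate(random.choice(population)) for _ in range(200)]
    # the environment filters: only the fittest half get to duplicate again
    population = sorted(offspring, key=fitness, reverse=True)[:100]

print("average fitness after 200 generations:", sum(map(fitness, population)) / 100)
```

Duplication, imperfect duplication, and consequences -- run it and the average fitness climbs, with nobody steering.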


I showed the well-known map of human migration out of Africa about 50,000 years ago, shortly after the Toba super-volcano pressed human populations down into the thousands -- to near extinction -- for thousands of years. That genetic bottleneck is thought to explain why there is very little variation in the human genome today. There is said to be more variation within a single social group of chimps than there is in the entire human race. The routes humans took from there are shown to the left.

Then I rolled out the story of how I traced my own DNA, using National Geographic’s Genographic Project and 23andme. 
 
As I was preparing my talk I very much wanted to comment on 3.5 billion years of life, not just the last 200,000 years in which humans have existed. Here's how I did it: first I described ring species, using the example of the circumpolar Larus gulls, all the way around at 80 degrees north. Larus characteristics vary only slightly from place to place; neighboring populations satisfy the biological definition of a single species by producing fertile offspring with one another. However, where the ends of this ring meet, the gulls won't have anything to do with one another. The Herring Gull and the Lesser Black-backed Gull are therefore different species if you face east, but the same species if you're looking west! And here is where it becomes relevant to humans. There is a similar and utter lack of species breaks when you trace humans all the way back to the first spark of life; it's incremental variation, all the way down. Never would one generation be unable to produce fertile offspring with the next, and this is true not only going down but back up the trunk, branches, and twigs, to all living things today. You can travel from the woman to the ladybug on her shoulder if you crawl from one leaf (her), back through time, all the way to the junction, then back up another branch and twig to the bug. You'll never step from one species to another.

So, to understand humans it only makes sense to start 3.5 billion years ago when all life began. And we know life only started once, because all genetic code, from protozoan to blue whale, is written in the same language with the same four letters: the nucleotides A, C, G, and T.
 
When I was practicing this next bit, which is really the only new thing I had to offer, I always ran long so I created a graphic to help me skip through it.  My thesis is that plants and animals were driven by four essential needs (gathering food, finding mates, evading predators, and protecting young), and in response evolved patently spatial strategies: improved perception and control of the environment, and improved navigation through it.

I've ruminated before about first life. From there forward, the strand which would become human usually opted for complexity, given the option. There is nothing shabby about simplicity -- bacteria took that route and ... how does that saying go: "He who laughs last, laughs best." We tooled up, while they stripped down. They hunker. We lumber. You might say our ancestors were both imprudent and lucky. Good fortune has blessed us with brains and consciousness.

Digression: It is simply awesome to be self-reflective; it's spectacular, really. Dumbfounding. I must add that I also feel like a spectator; I don't presume to think I am independently charged. On the topic of free will I'm with Sam Harris -- we only feel that we have free will because we can't predict what we are going to decide next! But it makes sense that we feel so self-directed. As Robert Wright clearly explains in The Moral Animal, self-delusion has some clear evolutionary advantages.
Now we come to the meat of this story. Several brilliant authors have traced life from its origins to the modern day. Richard Fortey's Life, Matt Ridley's Genome, Nick Lane's Life Ascending, Bill Bryson's A Short History of Nearly Everything, and of course Richard Dawkins' The Ancestor's Tale ... they all take us on a long, long journey into the past.
 
I won't touch on every invention, but if you read the chart carefully I think you'll appreciate the spatial dimension of each juncture. It's commonly thought that we first duplicated in porous rock by the deep-sea vents' stable gradient of heat and pH. You might say we were born in bondage. When we finally developed our own cell wall we were free to blithely drift and duplicate for more than a billion years, without a care (so to speak), until the moment a mutation caused one bit to eat another. Nutritious! It was rewarded by survival, and procreation. So predation, around 2.4 billion years ago, placed a premium on movement for both parties, and when one of these cellular creatures grew a tail -- actually a flittering hair, a "flagellum" -- it too was rewarded in offspring. Soon we (our ancestors) all had little wigglers. Two billion years later, when a flotation bladder accidentally became a rudimentary lung, we crawled onto land. But there, cold nights were particularly important -- there had never been such temperature fluctuations underwater. Warm-bloodedness emerged, as soon as it could, simply because of the survival advantage of being first up for breakfast. Our eyes had developed around this time, so we could scan our environment for predators and (conversely) for food. In the broad view this was nothing special. Eyes have happened about 40 times. Judged against the other eyes, human eyes are just mediocre.

The amniotic egg, about 200 million years ago, allowed us to have young even where there were no ponds. This is a spatial explosion! Next came the secondary palate, and we could run with food in our mouths, still breathing. Larger meals, greater territory still.
 
There is a period in our history, roughly 145-65 mya, when we reversed our general trend toward complexity and territorial expansion. There were dinosaurs about; they were better at eating than we were. So we hunkered down for nearly 100 million years, burrowing, developing timid habits, constricting in a spatial sense ... until that wonderful asteroid landed near the Yucatan Peninsula, causing rains of burning sulfur and effectively removing our oppressors. It was suddenly safe not only to emerge from underground, but to take to the trees, and we did, thereby improving our stereo vision and building up bony eye sockets. The process of elimination (by fatal falls) certainly helped with these new innovations.

We were never destined to greatness, remember.  It's not fate, but luck.  
 
Eventually Homo habilis (the handyman) started making tools: first, a stone knife, which was simply an improved, replaceable tooth and nail. This cleverness paid off in food, as man attacked beast, and no doubt in mates -- as man attacked man. Lots of animals make tools, but this was a particularly good one. Not as good as the spear, though, and I've recently read that the spear may have spurred a social revolution. Prior to the spear and bow, hanging back whenever the group tried to take down a large animal paid off in survival tokens, so everyone tended to be a little tentative. Then spears made cooperative hunting possible by limiting the personal risk. It built community; it's the prisoner's dilemma, pure and simple.
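Here's a back-of-the-envelope way to see that logic. All the payoffs below are invented -- a stylized sketch, not a claim about actual Paleolithic economics:

```python
# Toy payoffs for one hunter deciding whether to press the attack or hang back,
# given that the partner presses. Numbers are made up purely for illustration.

def expected_payoff(press_attack, partner_presses, injury_risk):
    # the hunt yields more meat when both hunters engage the animal
    if press_attack and partner_presses:
        meat = 10
    elif press_attack or partner_presses:
        meat = 4
    else:
        meat = 0
    share = meat / 2
    cost = injury_risk * 8 if press_attack else 0   # only the attacker risks the horns
    return share - cost

for label, risk in [("bare hands", 0.6), ("with spear", 0.1)]:
    attack = expected_payoff(True, partner_presses=True, injury_risk=risk)
    hang_back = expected_payoff(False, partner_presses=True, injury_risk=risk)
    print(f"{label}: press the attack = {attack:+.1f}, hang back = {hang_back:+.1f}")
```

With bare hands, hanging back is the better personal bet even when your partner presses the attack; give everyone a spear and pressing the attack wins. Lower the cost of cooperating and cooperation stops being a sucker's move.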
 
But then came the big leap -- language -- 100,000 years ago. We had been getting smarter, no doubt -- in the previous 2 million years our brains had tripled in size -- but with language we could suddenly tap into the brains of others. My 16-year-old son recently observed that language is a lot like telepathy, and it's true. When we developed written language we could access the thoughts of people far away (and those of some dead people too). Language was not the first wireless network, however; it was probably second to mirror neurons. It is commonly believed (among scientists) that mirror neurons allow one to actually experience the experience of others. I watch you grasping a coffee mug, and my brain feels the grasping -- my grasping neurons fire. Some have speculated that this is how fish school, birds flock, and why yawns are infectious. Mirror neurons may very well have been the dawn of empathy.

Then, just 600 years ago, the movable-type printing press brought an immense knowledge base to the literate commoner, and it allowed the literate commoner to share his intelligence too. Do you see how this leaps and stretches across space?

Each bar in the diagram represents a much shorter time frame, reflecting the acceleration of change. The bar on the right, spanning just 5,000 years, might as well have focused on innovations in transportation -- wheel through boat, horse, horseshoe, steamboat, bicycle, locomotive, car, airplane, helicopter, jet, and rocket ... but I’ve chosen instead to emphasize communication.

After the telegraph (1843), telephone (1876), radio (1901), television (1925), photocopier (1958), satellite (1963), and World Wide Web (1989), and now with the advent of the computer, the internet, and the inexpensive mobile phone, we are witnessing a revolution in information as important as what happened with language 100,000 years ago -- but with 100,000 times the spatial bang. The two maps show internet connectivity (node to node, local connections ignored) and Facebook friendships. But think of Google searches, Netflix, Wikipedia, Facebook, Skype, MOOCs, YouTube, and all manner of mobile apps. Isn't it interesting to speculate that, while no other animal is even aware of it, Homo sapiens brains around the world are coalescing, in a very real sense, into one brain. The new brain still has a slippery grip, but it's a global grip -- an interplanetary reach, even, as we have recently sent an extension of our own eyes to Mars.
 
I ended my talk with a little fun, speculating about ideas, which are virus-like replicators and therefore also subject to evolution. Much like genes, ideas duplicate, imperfectly, and they are filtered by the external world. A good joke, bad news, and useful information are repeated, while other noise fizzles out. Like genes, concepts can also team up to duplicate more reliably.

For example, think of this. Combine empiricism, conjecture, hypothesis testing, evidence-based reasoning, statistics, peer review, transparency, and a few more concepts and you have the scientific method, which has spread. To carry it further (in my way of thinking), science often comes up against religions, which have at their core just three little ideas: 1. there is an invisible, mysterious entity, 2. it can and will hurt you if you don't follow its rules, and 3. the first rule is to believe in it. This circular core could not withstand scrutiny by itself, so it is protected by its own cell: faith, of the sort that means belief without evidence. Now comes the hat-trick: this faith is presented as a virtue. And then distractions are attached, such as ritual, song, cathedrals, art, diet, dress, scripture, etc. Hence the mad, circular core concepts are sufficiently insulated that the whole package replicates, most easily in less educated minds, and those of children -- imprinted, like a duckling, for a lifetime. And that's all you need, but if it becomes lucrative there will be those who promote and defend it for other reasons as well.
 
This idea of ideas having their way with us -- whether it’s science, religion, or neighborhood gossip -- well I find it ... er .. infectious.
 
Looking back on my hour, there were two big surprises. The first was that while I was arguing that human evolution can be seen as a series of advances in spatial awareness and control, I nearly stepped off the stage.  The humor in that was not lost on the crowd, I’m happy to report.  Secondly, when someone asked me if I’d written my thoughts down, and I said I’d probably do it in my blog, there was actually applause! This, recall, is the same blog my vigilant statistics tracker regularly reminds me that almost no one reads. That encouragement probably pushed me over the edge and I actually took the time to do it.  Everyone, but particularly if you are an Ethical Humanist, and if you've read this far ... bliss you.

Tuesday, November 13, 2012

God: A Hypothesis Worth Testing (Book Review)


Two Pew Research Center surveys in 2006 and 2009 showed 83% of Americans and 33% of American scientists believe in a traditional God. If you include a "universal spirit or higher power" this becomes 95% and 51%.

Victor Stenger's 2007 book God: The Failed Hypothesis is probably best suited for the 33 or 51% of scientists, assuming they're real scientists.  That is, that they base their beliefs on evidence and understand the necessary protocol. Scientists who don't believe have probably already thought these issues through and don't need the book, and most of the general public may find Stenger's approach either tedious or offensive. For a more conversational treatment of the same issues they should pick up Sam Harris' The End of Faith or the slightly more confrontational The God Delusion by Richard Dawkins.

The hypothesis that God exists is purposefully written in the positive. It's an extraordinary claim which normally would require extraordinary evidence; but the author looks for any evidence at all, setting the threshold for failure very, very low. The threshold is not zero, though, because it would not be possible to disprove God that way (for example, God may exist but has never revealed himself). The hypothesis, the author claims, can be confidently dismissed by an overwhelming lack of evidence, like that used in a court of law. This would be analogous to concluding that your elderly neighbor on Estes Ave. in Chicago is not also, say, the masked gunman terrorizing Toronto. You don't know that she's not a killer, but there is absolutely no evidence to suggest it, and so you can confidently consider the hypothesis falsified. The same type of standard can be applied to the hypothesis that God exists. Let's look for any reason to believe the Canadian killer might be the old woman next door.

The first step is to nail down the definitions. This was actually my exact lament in a previous post. Here, the author distinguishes between the lowercase god and the uppercase God, with the former including the hands-off sort of deist god, the odd assortment of deities, spiritualist beliefs, animism, and all creative manner of supernatural forces -- probably the sort referred to by the 12% and 18% in the graph above. The author is not arguing for or against these deities -- he just doesn't include them in his analysis. His hypothesis refers only to the capital-G God, the interventionist Judeo-Christian-Islamic God, the sort which would be recognizable to the vast majority of believers. It does not include the abstracted, esoteric interpretations of God which often result from erudite theological gymnastics, even within these three big religions. The God he hypothesizes is the God most Americans (and many, many others) actually believe in.


And the hypothesis and method must be carefully drawn up. He describes the parameters: 1. protocols must be impeccable, 2. the hypothesis must be established before the data are collected, 3. the work must be done without prejudice, 4. the hypothesis must be falsifiable, and 5. results must be independently replicable. Sounds like a plan.

Although devout believers will not read this book because of the title, they should actually feel comfortable with the inquiry. If there's a reasonable chance that God exists, it may strengthen their faith and thereby curry His favor. If there is no evidence to support the hypothesis, well, maybe they have been wasting their time. That's good to know, too. And it wouldn't be all bad news; I'm reminded of a t-shirt: "Smile, there is no Hell." If there is no evidence for one, clinging to the idea that God might be true anyway would be analogous to living in fear of the elderly neighbor because she might be the frequent-flier killer of Canadians. Why would anyone choose to live with that fear?

There are plenty of ways the hypothesis of God could be supported. If lightning were to strike only wicked people, for example, it would be evidence. Or if revelation actually predicted future events, or if prayer did improve the health of the prayed-for ... any of these would support the hypothesis. The author relies on solid studies to test each one and more. The hypothesis is rejected, which doesn't come as a surprise to me, as I've seen enough of these studies -- with an open mind -- to lose my own misconceptions. The God hypothesis, when you test it seriously, does not survive.

The book considers many angles: the illusion of design, evidence from the cosmos, failures of revelation, questions of values and morality, the argument from Evil. Scientists will be pleased with the rigor and (judging from the survey) perhaps surprised by the results. There's no evidence for God existing. At all. Mercifully, for those who might be surprised, he asks whether beauty, hope, morality, kindness, generosity, love, forgiveness, and all those good things can exist if there is no God. The evidence shows clearly, strongly, thankfully: yes. They do exist. There are biological reasons for altruism (I've explored this carefully in previous posts), and atheists actually behave as well as or better than believers.

Anyone who seriously wonders, and is influenced by evidence and reason, will appreciate the careful treatment of this important question. There are extensive endnotes, references, and citations, as one would expect. To believers it could be jarring, and many will likely be curious whether the author chose his studies with bias. If there are good studies or solid evidence for God which the author overlooked, by all means, bring them forward. But it is -- how could anyone not agree -- a hypothesis worth testing and a question worth looking into, in the glorious years you still have left.

Saturday, November 3, 2012

Three Books Covering It All

Now and then someone writes a hugely ambitious book intended for regular guys like me. This is what Bill Bryson did in 2003 with A Short History of Nearly Everything. To be honest, I own a copy but I've never actually read it. I listened to the audio recording a couple of times, mostly as I bike to work. It's Bryson himself speaking, with brightness and enthusiasm matching his prose. A few days ago I started listening a third time, and shucks, I'm hooked again. I won't comment on what it means about my memory (hey, we listen to songs we love over and over, right?), but there go the next six road hours. Bryson apologizes for the title of his book right away -- it's not really about everything that is known, but since he spends a good part of it on the origin of the universe, in a sense it is about everything. I'm a geographer after all, so my professional opinion is an endorsement. But as I found with a subsequent book, At Home, Bill Bryson is much better at writing chapters than he is at titling them. Some of these -- "Muster Mark's Quarks" or "Goodbye to All That" -- are just too cute and opaque to do justice to the illuminating and serious content they contain. On the other hand, his sprinkles of jaunty humor do lighten what could otherwise be, for many of us, a heavy load of STEM disciplines: physics, astronomy, chemistry, geology, and biology.

And that pretty much is the aggregate sum of my complaints. Bryson takes the reader on -- if you can believe this -- a romp which starts at the "singularity" -- when matter has an infinite density and zero volume (!?) -- and goes all the way to the advent of homo sapiens. He uses a long series of pivotal discoveries and a colorful cast of characters (scientists, all) as stepping stones along paths which diverge, dead-end, criss-cross, and backtrack, but in the aggregate move science forward, and move the reader toward understanding. The book is nicely referenced with end notes, a bibliography, and an index. A word of warning: a whole lot happens before life even begins. "The Rise of Life" is Chapter 19, page 350. By the time we get to humankind, Bryson is pretty much winding down.

That was quite a lot to wrap my mind around, so it was a few years before I searched for a book Bryson had referred to fairly often: Richard Fortey's Life: An Unauthorized Biography (1998). I just could not find it, but I did come across a used copy of his Life: A Natural History of the First Four Billion Years of Life on Earth (1997). I'm guessing it's the same book? Seemed so, so I ordered it. It arrived poorly packaged; it had been rained on and the pages were swollen. But I discovered I could all but flatten it with a couple of weeks in a bench vise. Some of the photographs were nearly ruined, but I got to it eventually, and was not disappointed.

In one way this is a nice companion to Bryson, and in another it's a similar run through the same terrain. Fortey's has an index and glossary but, oddly, no references or bibliography. Another big difference, and a welcome one to me, is that Fortey starts life out in Chapter 2; that means he's basically Earth-bound and thereby skips a whole lot Bryson had gone over. Almost everything, you could say. There is no talk of the singularity or the Big Bang, and no hypothetical trips toward the end of a curved universe. But it's life onward, and with more pages to do it in. Fortey's writing style is clear and engaging but less effervescent than Bryson's. He doesn't continually pause to delight over quirky details of the odd personalities and academic villains among history's most brilliant minds.

Though they start about 10 billion years apart, Bryson and Fortey both end their stories with the rise of Homo sapiens; neither ventures into the complexities of the modern world except to describe the scientists and discoveries which make the story possible.

And that is exactly where Richard Dawkins starts -- at the advent of Homo sapiens -- in his 2004 tome The Ancestor's Tale. But instead of moving forward from there he cleverly goes backward -- from modern humans to the origin of life. It's brilliant. He covers the same time period as Fortey: 3.5 billion years. But where Fortey started at the seed and followed life up through time, Dawkins starts at a leaf -- modern man -- and travels downward from twig to larger twig to branch to larger branch, converging at last with all life forms. There are 39 junctions, or "rendezvous," along the way. Each juncture lets him comment on the diverging branches and their own twigs, but he always comes right back to the human ancestry to go one branch lower. It reminds me of National Geographic's Genographic Project, by which I traced my paternal lineage backward (with genetic markers). At each generation backward I expanded my contemporary circle of kin. My Y chromosome, used by the Genographic Project, passes only from male to male, so at each generation the female line is ignored, and each ancestor one step farther back contributed only half as much of what is now me. Go back 2,000 generations and it's really just an interesting academic exercise; I'm homeopathically diluted. However, and ironically, in tracing the evolutionary tree this doesn't happen. All of what humans were to become was embedded, long, long ago, in something like a small mouse. It wasn't a mouse, but its genetic offspring became a mouse, and a beaver, and a lot more, and us. It was a common ancestor.
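To put "homeopathically diluted" into numbers -- a back-of-the-envelope sketch, with generation counts and time spans only approximate:

```python
# Each generation back along the paternal line, that ancestor's contribution to my
# autosomal DNA roughly halves (the Y chromosome itself passes down intact).
from fractions import Fraction

share_40_generations = Fraction(1, 2) ** 40      # roughly 1,000 years back
share_2000_generations = Fraction(1, 2) ** 2000  # the Genographic time scale

print(float(share_40_generations))                    # about 9e-13 -- already vanishingly small
print(len(str(share_2000_generations.denominator)))   # 1 over a number roughly 600 digits long
```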

As you might imagine, the journey backward is a familiar one early on. Those we encounter are recognizably like us. After navigating the Cro-Magnon, Neanderthal, Ergaster, Habilis, and so on, we come to rendezvous #1, about 6 million years ago, where we meet up with the ancestor of modern chimpanzees. At rendezvous #3 we merge with the gorillas' line, at #6 with the New World monkeys', and at #9 with the tree shrews'. You can see where this is going; farther back the company gets more general and less, we are sure to feel, like ourselves. Rendezvous #17 is with amphibians, #22 with lampreys and hagfish, #31 with sponges, #36 with plants, until the end, #39 ... eubacteria. There are replicating iotas even farther back ... protobacteria, DNA, RNA ... but these are not yet "life" as we think of it.

As I was going through this I thought surely the farther back I went, the drier it would become and the more difficult it would be to stay interested. But if anyone has a knack for drawing out the true wonder of the natural world, it's Richard Dawkins, and I found the latter chapters excellent reading too.

Each of these books is truly an epic. Adding Bryson (574 pages of text), Fortey (315 pages), and Dawkins (614 pages), one has more than 1,500 pages, and that's enough to ruminate on for quite a while. If you're an extreme reader, add on the 192 pages of notes, references, and bibliography.

Yes, these are potentially intimidating books, to be sure.  But if you pick up a copy of one (maybe start with Dawkins) – don’t be surprised if you get sucked right in to emerge -- one way or another -- 3.5 billion years, or 14 billion, later.

Monday, September 17, 2012

The Science/Religion Impasse

At a conference designed to improve the teaching of science in higher education, I recently found myself surrounded by physicists, biologists, chemists, geologists, and mathematicians. There was only one other geographer that I know of, and very few social scientists at all, yet I found the main lessons of the week very stimulating: the sciences will benefit from 1. greater community involvement, 2. a more interdisciplinary approach, and 3. an inversion of the curriculum such that a vexing problem is introduced first, and then the various disciplines are mined for solutions. These changes, I came to believe, would revolutionize education not only in the STEM disciplines but in others too. More than once I argued that "interdisciplinary" should cross over to the social sciences as well.

There was, in the audience, one theologian. I was curious that he was listed as a presenter in several sessions, so I made it a point to attend the one he'd called "I Thought I Could Learn Something…", which promised to directly address the science/religion conflict. I not only wanted to hear the speaker's position but also the audience's response.

It is difficult to pin down the percentage of scientists who believe in God because of definitional considerations: what about agnostics? Which concept of God? What makes a "scientist"? Would you count Deists? Philosophical Buddhists? Unitarians? Ethical Humanists? Pastafarians? The culturally religious? Any way it's done, the percentage believing is relatively small among scientists. I've seen it estimated at 40%, 33%, 15%, and (among members of the National Academy of Sciences) as low as 6 or 7%. In contrast, when Gallup asked all Americans in 2011 "Do you believe in a God or Universal Spirit?" 91% said yes. When asked if they were "atheist, agnostic, or don't identify with a religion," the polls indicated that some groups -- Liberal Democrats, 18-29 year olds, students, men -- are 5-10 points lower. But let's say that about 25% of scientists are firm believers in a higher power ... and 85% of everyone else. This is in the U.S.

The dispute between science and religion comes down to the difference between faith-based and evidence-based belief; that is, the (in)advisability of believing something for which there is no empirical evidence. Religious truths may be strongly felt, revealed in dreams or inspiration, they may be taken on authority, and they may be very internally consistent. Science ties belief to evidence, period. When it comes to religious matters, for example, scientists may attempt to measure the efficacy of prayer, the existence of miracles, and various scientific claims extracted from scripture. And there it is -- faith vs. reason.

Among the religious, faith is often a source of pride. It may be a perverse feedback loop, but the more someone can accept by sheer force of will, the more strongly they may feel it is true. Even from a psychological perspective there are benefits to faith, to be sure. It may provide ready answers to life's most difficult questions. It binds community and fosters culture. It also can give confidence -- false confidence, I would say, but confidence nonetheless. It can make one feel connected to a higher power -- whether or not there is one -- and it can make one feel indestructible, invulnerable, and eternal. These, I remember, are great comforts.

A good friend and fervent believer recently discussed with me the meaning of faith.  She rejected the notion that it is belief without evidence.  Faith doesn't deny evidence, she said.  It's the "substance of things hoped for ...  if you can physically see it no faith is necessary."  So perhaps faith, I mused, doesn't shy from evidence but just fills the gaps in knowledge with hope.  I proposed that while she and I can agree that the existence of God can't be disproven, without evidence her faith fills in that missing information with hope, while the same persistent lack of evidence makes me more and more doubtful.  I was pleased that we agreed on this.  "Yes, and I can pray that God will remove any doubt," she added, to which I replied that in doing so she must hope that the same God exists.  There it is, and we're still friends. 

Stephen Jay Gould, a highly regarded scientist and leading evolutionist, famously claimed that science and religion are “non-overlapping magisteria” – they don’t clash because they are in different realms.   But this is not possible, if the God in question is interventionist; that is, if it’s a God that matters.  If God ever responds to prayer, ever punishes wrongdoing or rewards fidelity, if He can cause miracles or even subtly influence events … that would be a supernatural force.  And science does not allow for forces that are ultimately unexplainable.
Religious believers I have known often disparage the "New Atheists," whom they perceive as aggressive and confrontational. Pat Condell and Christopher Hitchens come to mind as especially acidic. But as Dan Barker, author of Godless, pointed out in a recent lecture I attended, aggression is often in the eye of the beholder; he who takes offense often believes the other person was offensive. I suspect atheism is simply where homosexuality was 30 years ago, and thanks to Dawkins, Harris, Dennett, Barker, Ray, Wright, Pinker, and others, a thoughtful, articulate conversation has been broached. Another closet door has opened.

Back at my science conference, and not surprisingly, the first request made of the speaker (and not by me) was for comment on faith-based vs. evidence-based reasoning. The response was jarring. "Faith," we were told, doesn't actually mean belief without evidence; it is a term invented in the 13th or 14th century for "the duty of fulfilling one's trust." Clearly, this definition of faith would render the question itself irrelevant. What's more, our speaker had no word for "belief without evidence" -- he could not (or would not) even talk about it. I wish I had offered one: "hope."

But instead, it was actually a frustrating hour; from my perspective no one could get traction, and I had been hoping to engage in real discussion. A few kind and supportive remarks were made, the pointed ones were deflected, and after 45 minutes or so he got a little huffy, as if the crowd had turned rude -- which we hadn't.
Yet I admit I was quietly offended when he drew a bell curve on the board and labeled the right tail "religious fundamentalists"; the middle was not labeled (apparently those were the normal people); agnostics were toward the left end; and at the little tip were the atheists, which he said constituted 2% of the population. Atheists, he seemed to imply, are fanatics and can be safely dismissed.
But here we are with a definitional mismatch again. If atheism means someone who is certain there is no God, then yes, it's probably a small number, 2% is possible, and I consider them extreme. This is a common definition, as it turns out. I poked through a few dictionaries and (along with "ungodliness," "wickedness," and "immorality") I found atheism defined as "the doctrine that there is no deity." Another calls atheism "rejection of belief in God or gods," and a third says "a lack of belief in the existence of God or gods." I, and most of the atheist writers I've mentioned, use the last definition. Atheism is a belief like not collecting stamps is a hobby. There is no creed, no binding principles ... nothing is required of atheists. Someone will counter that "not believing in God" is required of an atheist, but then "not believing 2+2=658" is required of mathematicians. So would not believing 2+2=403, ad infinitum. The concept simply doesn't have to be on the table.

Atheism is the default view. Our brains may be predisposed to cut corners in understanding causality; we may be susceptible to supernatural explanations. But we are not born with religious beliefs.

What's more, agnosticism is not moderate atheism, as the speaker seemed to suggest. It's not like Methodist to Pentecostal. Agnosticism means believing it is not possible to really know. Anyone with a shred of doubt -- the whole inner portion of the curve, probably -- is agnostic. I've looked into it. Atheists (of the not-collecting-stamps kind) make up about 12 percent of the U.S. population, not counting the very young. They don't belong on his diagram at all, just as non-stamp-collectors wouldn't belong on a chart showing different hobbies.

In retrospect, the biggest problem we all had at the session was neither theological nor scientific, it was semantic.  We simply used the same words for different things.   Scientists use language precisely – they have to.  But everyone else is more casual and the problem is particularly bad, I think, with religion because scripture so often calls out for reinterpretation. 
My first blog explored the meanings of the word God. The same could be done for atheist, agnostic, and faith. An odd thought to end on: a fundamentalist Christian might agree with Richard Dawkins that people are born atheist -- but they would mean atheist of the "wicked, immoral, sinful" sort, not the "not-a-stamp-collector" kind.
If we could just nail down a few good words, we might actually be able to learn something.

Saturday, August 11, 2012

Reframing Organizations (reviews)

This may be one of the best known books on leadership, as far as I can tell; it's in its fourth edition now and often used as a text in management and leadership classes. There are various ways to look at organizations - the main point of the book is to explore some of them and show that it's often useful to shift frames to get a new perspective.   A scenario near the end in which a public insult is made to an incoming manager shows how the various frames suggest very different ways to reply.  It also shows how each one of them can be overdone.

Of the four frames taken up in this book - structural, human resources, political, and symbolic - three are excellent. The Structural approach uses a factory or machine as a metaphor - it's concerned with goals, order, precision, planning, consistency, reliability, specialization, and assessment. Human Resources views the organization more like a family; it's sensitive to personalities, communication, strengths and weaknesses, motivations, and employee well-being. The Political approach deals with coalitions, partnerships, team building, domains, and group conflict; its metaphor is a jungle.

I've read elsewhere of a couple of others: the Natural Systems Model apparently views the organization as an organic whole -- a sort of Gaia hypothesis of organizations, which I don't put much stock in. The Sociotechnical Model focuses on the humans and their tools. But most intriguing, to me, is the Cognitive Model, which recognizes that personal goals can be aligned with organizational objectives; it focuses on specialization and the flow of information.

Back to the book at hand. The Symbolic Frame is the fourth in the Bolman/Deal presentation, but it probably doesn't deserve the position. Its metaphor is theater, carnival, or temple, but symbols are actually more of a tool for the other perspectives than a viable frame of their own. For example, symbolism and symbolic gestures can be very influential in a family -- say, with a birthday cake; in a jungle -- where a bright color might signal "poison"; or even in a factory, where a large red X by the chopping pit might save a few lives. But when symbols happen in a theater -- while it may seem real, it's really make-believe.

Regardless, the authors spend a chapter proposing symbols as a viable organizational framework. Symbols "project legitimacy"; they "engender support, faith, and hope." The purpose of meetings, from this view, is to provide an "expressive occasion to clear the air and promote collective bonding." A plan is just a badge of honor, a game, or an advertisement. A strategic plan is an especially high honor, no matter that it's often nothing more than "fanciful," they say. An organization "needs to foster faith and confidence" and therefore it undertakes evaluation (e.g. of employees) to "assure spectators that [it is] responsible, serious, and well managed." To top it off, get this: "Successful leadership is having followers who believe in the power of the leader."

This chapter made me ill.

Fortunately, the authors are probably wrong about symbolism. Sure, it can be powerful, emotive, and motivating; it can signal or commemorate important things and offer a shorthand way to evoke complex concepts. Words are symbols. Facial expressions are symbols. So is the American flag, the Facebook logo, a song, birds chirping. We're swimming in symbols, and this is important because they mean something. But when they don't mean what they pretend to, they can be very dangerous, so the ability to detect fakes has been rewarded by natural selection. Even so, we are still victims of illusion. We can be manipulated. But if an organization were to actually use symbols as if they were meaningful in themselves you'd have ... North Korea? It's hard to even think of a company that has tried this, because it would fail. Enron?

No, smoke and mirrors is not a legitimate organizational frame. Symbols for real things are useful; that's as far as it goes.

But honestly, this shortcoming doesn't detract from the insightful summary of the three other approaches, or from much of the discussion of symbols as tools in other chapters. It's content-rich, well organized, and well written, though sometimes with a few more vignettes than I would prefer. For someone new to organizational thought, this will be a great primer.
...

Then a friend recommended a more recent book, also by Bolman but with a different co-author (Joan Gallos, a professor of leadership at the University of Missouri); it's called Reframing Academic Leadership, and it takes the same four-frame schema but applies it specifically to higher education. The first few chapters did seem like a rehash of the first book: structural, political, and so on ... but then the examples became infused with something that is unique to academia: the rift that often develops between faculty and administration. "Faculty can see staff as unduly constrained and bureaucratic," they explain. "Staff often wonder why they have to track their hours and vacation days when faculty seem to come and go as they please."
Being an expert within a discipline or -- more often -- a sub-discipline is not very amenable to hierarchical control. Disciplines themselves may become "silos," with little horizontal communication with other departments. But more important, it's not always easy for faculty members to see or appreciate the complex institutional machinery required to assemble groups of inquisitive youth in rooms at the appointed hour, year after year, in a fluid and unstable external environment. Meanwhile, academic administrators (unlike those running a factory or grocery store) cannot hope to understand what actually happens at the other end of the hierarchy. They do not have the disciplinary expertise. There's a built-in volatility which is difficult to control.

The previous book only went so far as to compare universities to hospitals.  That’s an interesting thought – doctors and patients there may be counterparts to faculty and students here.   

But they go much farther in this book. The reference group for an individual faculty member, for example, often does not include the administration or staff, colleagues in other divisions, or even colleagues in their own department. They may, instead, be aligned intellectually with like-minded specialists at other institutions -- institutions which, from a business standpoint, may even be competitors. The culture, goals, outlook, perspective, motivation, associations, and knowledge base of Professor Jones may be worlds apart from those of Dean Johnson or Provost Peters. And although they may distrust one another, and may even fight, they must also pull in the same direction if the institution is to survive. Such is life in academia.

The audience for this book is probably small, not only because it is specific to academia, but because it will probably be of more value to administrators than to faculty. Faculty can often all but ignore the broad institutional view of the administration -- although they shouldn't, of course. Still, faculty will find the book interesting, and will recognize their own importance in the larger schema. The first law of higher education leadership, the authors write, is "If you lose the faculty, you lose." And they also discuss the "pervasive faculty scorn for bureaucracy, administrators, and hierarchy."

The same three "frameworks" as in the previous book are wound again through academia -- political, structural, human resources -- to great effect. It's all different in this unique environment. As before, an effort is made on behalf of symbolism too, which fell short, I thought, for the same reasons I expressed above. There is no denying the symbolic power of Arizona State University President Michael Crow's 2002 inaugural address (reprinted in part), in which he describes "a new gold standard" of higher education. The speech was moving, inspirational, and effective because it presented a vision; but it was a vision which involved people, politics, and structure. It wasn't that the symbols were so valuable in themselves; it was that they were an effective form of communication. If symbology is an entire frame of academic leadership, so is shouting. If there was anything symbolic that needed a little explaining, I thought, it might be the stark difference in dress code that often divides faculty and administration. If communication, interaction, and cooperation are so important between these groups -- if they are all on the same team -- why the broad-brushed underscoring of differences, with apparel? I understand the jeans and sneakers on faculty -- they're comfortable, and that's also what students wear. What's with the suits and ties? Probably, administrators must communicate with politicians, businessmen, legislators, donors, and others who may appreciate the formality. But it does affect the faculty-administration dynamic, and it's an issue that may be worth taking up in the next edition.

Despite that shortcoming, and a couple of chapters at the end that refer vaguely to "feeding the soul" and the "sacred nature of academic leadership," the insights keep coming, chapter by chapter: transparency and secrecy, reward structures, recruiting and hiring, managing budgets and personnel, review, accountability, motivation, cross-disciplinary cooperation, communication, self-control, autonomy, conflict resolution, assessment, regulations and guidelines -- these are all addressed. Anyone involved in higher education will be thankful for this illuminating book.

Tuesday, May 29, 2012

The Omnivore's Dilemma (a review)

Probably, if you were going to read it you already have, but if you haven't ... you might want to pick it up. It was a #1 New York Times bestseller in 2006, and it's still quite relevant today -- maybe more so, who knows.

Normally we may think of evolution as a drive toward complexity, but bacteria have gone the other direction; they've evolved downward into simplicity and into very narrow niches. This is an excellent survival strategy - what will survive any catastrophe you can imagine? Somewhere, bacteria, probably.

But most animals have opted for complexity and flexibility instead. They are able to move about and adapt to new environments. But here's the rub: since they choose to be flexible, they have to be flexible.

There is a direct analogy in the gastronomic world, it turns out. Some creatures have taken the simple approach by consuming a limited range of things. They can afford to do this because they have evolved elaborate intestines with which to work food over thoroughly and in which to harbor bacteria that convert one sort of input into all the various nutrients their bodies need. These are the herbivores and carnivores, and genetic code alone, which we call instinct, is sufficient to get them fed. On the other hand, omnivores have taken the high road; their innards are leaner and less elaborate, so they must gather the right mix of inputs themselves. And in doing so, they must avoid the dangerous ones. This requires a lot of care, and thought, and therefore ... big brains. It's a tradeoff: a simple lifestyle and an elaborate belly, or a complicated lifestyle and a lean interior. So the omnivore's dilemma is how to gather the right foods and not take in the harmful kind. That in itself is a dilemma, but Pollan points out there are plenty of moral quandaries as well.

The book is as entertaining as The Botany of Desire (2001), in which he looked at the story of apples, potatoes, tulips, and marijuana from the plants' perspective. Here he takes on corn, grass, meat, and fungus, and once again we benefit from his careful research and introspection (the latter occasionally laid on a little thick, for my taste). He also does a great deal of field observation, visiting the food factories and farms, talking to many different kinds of people, gathering mushrooms, even slitting some chicken necks himself, and shooting a wild boar. He describes much of this so well I felt I had done it too.

His best field trips included a large sustainable farm in Virginia where production is high, costs are relatively low, waste is almost nil, and the animals are mostly content. What's most impressive is the cleverness with which it all works, and the owner explains it in detail. It's a stark contrast to some of the more corporate operations - like a standard corn-fed feedlot, a poultry farm, even an organic farm that turned out to be pretty much like the others. In these chapters the moral dilemmas come into the sharpest focus.

Food -- if you haven't noticed -- has become a new moral battleground, and when Pollan disparaged the new methods, and the lower quality of food they sometimes produce, at times I felt he didn't fully appreciate the countervailing moral implications of the much larger quantities turned out now. All that food is a good thing, too. When he marveled that corn production increased from 75 bu/acre in 1950 to 180 in 2006 (a 140% increase, and often to the detriment of small-time farmers), he didn't mention that world population increased by 172% in the same period. Sometimes I thought it was a little one-sided, because the older methods could not easily produce the food we need now. The hybrid vigor that gives us pumped-up ears of corn comes at the price of fertility. That's not a Monsanto conspiracy -- as he intimates -- it is a fact of nature. Vigorous hybrids, like mules, are often -- oddly -- infertile. In the end it appears he was sometimes just exploring some of the more extreme views of his interviewees, as his own conclusions seemed balanced and reasonable, in my opinion. As a reader I felt I had been treated fairly.

First, there's corn's dizzying ascendancy as a food source, with a field trip to a chemical plant that rips the kernel pulp apart, sending it out through a tangle of different spigots -- some headed for the gas tank, others to the various mixers of myriad foodstuffs, others to make non-edibles. There's a good discussion of the political and economic forces driving the corn industry too. In the second section, on "grass," he works on the sustainable Virginia farm, among other things. When it comes to meat, he compares the sustainable approach to that of a large organic poultry operation, a feedlot, a commercial slaughterhouse, and more. And all through the book he comments on underlying philosophical issues.

And the section on fungus (mushrooms) is interesting, mostly from a botanical perspective. It could have been in The Botany of Desire. In the end he pulls the story together by describing his "perfect meal," made up of perfect ingredients, served to perfect guests. That just seemed unnecessary to me, but by that time I'd had enough good intellectual and philosophical material to chew on - enough food for thought, you might say - to forgive him a little retrospective self-indulgence.

Saturday, May 12, 2012

Thinking, Fast and Slow (a review)


I’ve been shaken by a recent book called Thinking, Fast and Slow (2011), by Daniel Kahneman.  I read the book mainly because Steven Pinker (one of my top-shelf authors) called Kahneman “the most important psychologist alive today.” His work, Pinker wrote, “has reshaped social psychology, cognitive science, the study of reason and of happiness, and behavioral economics …”  I figured anything Pinker endorses with superlatives can’t be bad.  And it wasn’t.  It’s actually amazing.

Kahneman, it turns out, is a Nobel laureate for developing prospect theory, and he has been actively working in the field for nearly 60 years. The theory (developed with his longtime collaborator Amos Tversky) suggests that people evaluate risks in terms of losses and gains from a reference point, and not simply in terms of final outcomes, as utility theory would suggest.
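For the curious, the standard Tversky-Kahneman value function looks roughly like this. The exponent and the loss-aversion coefficient below are their commonly cited estimates; treat the whole thing as illustrative rather than gospel:

```python
# A sketch of the prospect-theory value function: outcomes are valued relative
# to a reference point, and losses loom larger than equivalent gains.
def value(x, alpha=0.88, lam=2.25):
    """Subjective value of a gain or loss x relative to the reference point."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

print(value(100), value(-100))   # a $100 loss "hurts" roughly twice as much as a $100 gain feels good
```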


The book is a summary of his and others' research on judgments and decisions. He started his own work at 20, when he was asked to determine which prospective officers in the Israeli Army should be promoted. His PhD is from Berkeley and he's now faculty emeritus at Princeton. He is collaborative, ingenious, and focused on the ways in which the human mind cuts corners and what can be done about it. At the end of each chapter he suggests how lessons from these often startling findings can be used in daily life, to think more clearly. So here is the main point: in ways we can understand and predict, we make systematic errors in judgment throughout the day. It's probably because of natural selection that we're good at making quick decisions and, although they are often good enough, they are also rife with shortcuts, exaggerations, and mistakes. Then they are capped off neatly with overconfidence.

He says the mind has two operating systems, which he labels 1 and 2.  The first is the intuitive mind, the transparent bit that recognizes a pencil as a pencil, or a face as angry or sad.  It's the part that chooses between fight and flight, judges the size of numbers, keeps track of time, recalls memories, and basically takes the first quick swing at everything.  We consider this “knowing.” System 2, what you might call “thinking,” includes estimation, calculation, and all manner of figuring out -- slower, more deliberate work, and we are at least partially aware when it's under way.  Both systems cut corners, jump to conclusions, omit inconvenient information, exaggerate, and guess.  System 2’s biggest problem, according to Kahneman, is laziness.  It leaves System 1 in charge, and when called on it does the minimum that is required.  Generally this means piecing together a credible solution, then going back to rest.
At first this dichotomy seemed a bit forced to me, as I thought there must be a continuum instead.  But it turned out to be a fabulous way to parse out how, when, and why our thinking can go wrong. 
Things we encounter firsthand seem more important and more common than they are, as do things we can easily recall.  When the chance of something (good or bad) is small, we tend either to ignore the possibility or to exaggerate it.  We consider losses about twice as important as equivalent gains.  Thoughts immediately preceding a judgment can dramatically influence it.  In retrospect, assessing an event, we give far too little weight to blind luck.  Considering options separately often leads to different conclusions than considering them within a broader frame.  We use stereotypes to draw inferences from the group to the individual, and the other way around.  And we’re bad at statistics.  Consider this:
A woman is seen reading the New York Times on the New York subway.  Which of the following is a better bet about her level of education?
    1. She has a PhD
    2. She does not have a college degree
If you chose the first, as I did, think again.  There are a lot more non-graduates than PhDs riding the subway.  We overlooked the base rates.
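To convince myself, I scratched out a quick back-of-the-envelope check.  The shares and reading rates below are numbers I made up for the illustration; only the base-rate logic comes from the book:

    # Hypothetical base-rate check (made-up numbers): even if PhDs are far more
    # likely to read the Times, there are so few of them on the train that
    # non-graduates still dominate among Times readers.
    riders = 1_000_000                      # imagined pool of subway riders
    phd_share, nongrad_share = 0.02, 0.50   # assumed shares of the riding public
    phd_reads, nongrad_reads = 0.40, 0.05   # assumed chance each group reads the NYT

    phd_readers = riders * phd_share * phd_reads              # 8,000
    nongrad_readers = riders * nongrad_share * nongrad_reads  # 25,000

    print(int(phd_readers), int(nongrad_readers))

Even granting the PhDs an eight-fold higher reading rate, they are outnumbered roughly three to one, so the second option is still the better bet.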
Here’s a clever one:
Hearing that the U.S. counties with the highest incidence of kidney cancer are mostly rural and sparsely populated, you might guess the reason: these places are poorer and less educated, with less access to health care, high-fat diets, and more drinking and smoking.  That would certainly make some sense.  It’s a coherent, compelling, and completely fabricated story.
If you had heard instead that these same areas have the lowest rates of kidney cancer, you might come up with this one: rural living is healthier, with clean air, fresh locally grown food, more exercise, and less stress than in urban areas.

So which of these two statements about kidney cancer is true?  Both are true; there are fewer people in rural counties, and we are comparing cases per 1,000, and where the denominator is small, random variation in the numerator causes large swings in the rate.  Example: if a disease affects 20% of the general population, then in counties with just 2 people the observed rate will usually be 0%, sometimes 50%, and occasionally 100%.  None will match the actual average.  This is a statistical phenomenon that even statisticians often overlook.
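If the arithmetic doesn't persuade you, a tiny simulation might.  This is my own toy sketch, not anything from the book, and the 20% prevalence and county sizes are invented:

    import random

    random.seed(42)
    TRUE_RATE = 0.20  # assumed prevalence in the whole population

    def observed_rate(population):
        # Fraction of one county's residents who happen to be affected.
        cases = sum(1 for _ in range(population) if random.random() < TRUE_RATE)
        return cases / population

    small = [observed_rate(2) for _ in range(1000)]       # 1,000 tiny counties
    large = [observed_rate(10_000) for _ in range(1000)]  # 1,000 big counties

    print(min(small), max(small))   # the extremes -- 0% and 100% -- live here
    print(min(large), max(large))   # these all hover very close to 20%

Both the highest and the lowest "cancer rates" come out of the smallest counties, which is exactly the trap Kahneman describes.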
Some of the findings in the book are so bizarre I still have my doubts, although the research seems sound, has been replicated, the results are consistent, and the effects are strong.  For example, when subjects were asked to estimate the percentage of African nations that are members of the UN, it mattered whether they had just spun a 10 or a 65 on a (rigged) “random” wheel: their estimates averaged 25% and 45%, respectively!  People holding a string of digits in their heads have less self-control around cake!  Gripping a pencil between the teeth (forcing a smile) makes cartoons seem funnier than holding it between pursed lips (forcing a frown)!  People pencil-frowning will concentrate more strongly on the task at hand.  And those required to nod were more likely to agree with an editorial they listened to than those told to shake their heads.  These results just seem absurd.  I admit that my incredulity is not evidence that they're false.  I just don't see how such simplistic influences would not be exploited, and therefore removed, by evolutionary forces.

However, I found myself falling for many of the little test-yourself demonstrations in the book.  Consider this question:

Was Gandhi 130 years old when he died?  You’ll know he wasn’t.  But how old was he when he died?  The typical response is a significantly larger number than that given by subjects who were first asked whether he had died at the age of 40.  The first group tends to adjust downward from 130 until they hit ages that seem plausible; the others adjust upward from 40.  Someone who figured Gandhi died somewhere between 65 and 85 might give answers nearly 20 years apart, depending only on the anchor they had been primed with.

Anchors are just one of many subtle influences that bias our thinking. Here are some others:
The availability heuristic (page 129), confidence by coherence (209), the illusion of validity (211), the illusion of skill (212), the illusion of intuition / the strength of algorithms (223), the hindsight bias (202), the illusion of control (259), the halo effect (199), nonregression to the mean (175), inside vs. outside view (248), the planning fallacy (250), stereotyping (168), representativeness (149), the substitution effect (139), probability neglect (144), statistical and causal base rates (168), the conjunction fallacy (158), theory-induced blindness (277), risk aversion (270), loss aversion (284), the endowment effect (293), bad-news bias (301), the possibility effect (311), the certainty effect (311), the expectation principle (312), denominator neglect (329), and overweighting (333).  There are also narrow vs. broad framing (336), reference points (281), the sunk cost fallacy (345), the evaluability hypothesis (360), and the effect of categories (356).  The focusing illusion (402) is a simple one, with reach (“nothing in life is as important as you think it is when you are thinking about it”), as is WYSIATI (153: “what you see is all there is”).
This may seem like a chore to work through, but it’s not.  The writing is clear and engaging, and the 38 well-crafted chapters are each, remarkably, 10 pages long.  The book has plenty of examples, end notes with references, concluding advice, full reprints of two seminal journal articles as appendices, and lots of little demonstrations you can try on yourself.  Here's another of those.  Which of these two opportunities would you prefer?
  1. a gamble that offers a 10% chance to win $95 and 90% chance to lose $5
  2. or would you pay $5 for a 10% chance to win $100 and a 90% chance to win nothing?
Most people find the second one more attractive.  The fact that they are identical is difficult to see because we are more averse to losing $5 than we are to paying $5.  Here’s another: you are considering surgery (with slightly better long-term results than radiation therapy) and you are told, about the surgery:
    1. The one-month survival rate is 90%
    2. There is a 10% mortality in the first month.
Doesn't the first one sound better?  Now consider this:
  • Adam switches from a gas-guzzler of 12 mpg to a slightly less voracious guzzler: 14 mpg
  • Beth switches from a 30 mpg car to one that runs at 40 mpg
Who will save more gas by switching?  The answer is Adam, by a large margin.
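For the curious, here's the arithmetic behind that answer -- a quick sketch in which the 10,000-mile year is my assumption, not the book's:

    MILES = 10_000  # assumed annual driving distance

    def gallons(mpg):
        # Fuel burned to cover MILES at a given miles-per-gallon rating.
        return MILES / mpg

    adam_saves = gallons(12) - gallons(14)   # about 833 - 714 = 119 gallons
    beth_saves = gallons(30) - gallons(40)   # about 333 - 250 = 83 gallons

    print(round(adam_saves), round(beth_saves))

The trouble is that mpg is the reciprocal of what we actually care about, gallons per mile, so the modest-looking jump from 12 to 14 saves more fuel than the impressive-looking jump from 30 to 40.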
One of the best sections of the book is the last. It describes two very distinct senses of self: one is the experiencing self and the other, the remembering self.  When someone asks “how are you?” it might mean "how's your mood right now?" but more likely "how have things been going?” … in other words ... “how happy are you with your memories?”  It’s the remembering self that most people care about, whether they're looking at their own lives (it’s why we take pictures on vacations) or others'.  When subjects were asked to imagine taking a dream vacation, after which all memory of it would be erased, most would not send themselves (or another amnesic) to hike mountains or trek through the jungle.  "These experiences are mostly painful in real time and gain value from the expectation that both the pain and the joy of reaching the goal will be memorable.”  “Odd as it may seem,” Kahneman concludes, “I am my remembering self, and the experiencing self, who does my living, is like a stranger to me.”
Unfortunately it's a little worse still, because the remembering self doesn’t represent the experiencing self very accurately. Looking back, durations of experiences don't matter much.  Only two things do:  1. peaks (good or bad), and 2. how the event ended.  Kahneman explains how these findings can be put to practical use, and also how education, religion, illness, poverty, wealth, and other factors impact the two forms of satisfaction.
It’s a great book -- I believe it's a full life’s work -- and my criticisms are minor.  Maybe not enough distinction was made between one-time judgments and iterated ones.  In the Prisoner’s Dilemma, for example, one-shot games lead to mutual defection, but repeated play with the same players (as in the real world) leads instead to altruism.  Some of the results from a psychology lab may look quite different in the real world.  There are also a few surprising typos, like a mix-up on page 335, where “AD” and “BC” are switched, throwing off the explanation; and on page 337 two miscalculations in the formulas actually invalidate the results.
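To make that first objection -- one-shot versus repeated games -- concrete, here is a toy version of the repeated game.  It's my own sketch using the standard textbook payoffs, not anything from Kahneman's book:

    # Payoffs (player A's points, player B's points) for cooperate/defect pairs.
    PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
              ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

    def play(strategy_a, strategy_b, rounds):
        # Total scores when two strategies meet repeatedly.
        score_a = score_b = 0
        history_a, history_b = [], []
        for _ in range(rounds):
            move_a = strategy_a(history_b)   # each sees only the other's past moves
            move_b = strategy_b(history_a)
            gain_a, gain_b = PAYOFF[(move_a, move_b)]
            score_a, score_b = score_a + gain_a, score_b + gain_b
            history_a.append(move_a)
            history_b.append(move_b)
        return score_a, score_b

    always_defect = lambda opponent_moves: "D"
    tit_for_tat = lambda opponent_moves: opponent_moves[-1] if opponent_moves else "C"

    print(play(always_defect, always_defect, 1))   # one-shot: (1, 1)
    print(play(always_defect, tit_for_tat, 100))   # (104, 99) -- exploitation fizzles
    print(play(tit_for_tat, tit_for_tat, 100))     # (300, 300) -- cooperation wins

In a single round defection is the "rational" move; over a hundred rounds against a player who remembers, it isn't -- which is why I would have liked more on how one-shot lab judgments carry over to repeated, real-world ones.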
But if these are the low points, they are not very low.  The peaks were many and very high -- and even though it took me almost a month to read, duration doesn't count for much -- I'll remember it very fondly, and it will sit on my top shelf.  Next to Pinker.
(Gandhi died at 78, BTW)