“Thinking, Fast and Slow” by Daniel Kahneman: Part Three

The final installment of my numerous notes following Part One and Part Two.

  1. Fechner was obsessed with the relation of mind and matter. On one side there is a physical quantity that can vary, such as the energy of a light, the frequency of a tone, or an amount of money. On the other side there is a subjective experience of brightness, pitch or value.
  2. Bernoulli observed that most people dislike risk (the chance of receiving the lowest possible outcome), and if they are offered a choice between a gamble and an amount equal to its expected value they will pick the sure thing. In fact, a risk-averse decision maker will choose a sure thing that is less than the expected value, in effect paying a premium to avoid the uncertainty.
  3. I call it theory-induced blindness: once you have accepted a theory and used it as a tool in your thinking, it is extraordinarily difficult to notice its flaws. If you come upon an observation that does not seem to fit the model, you assume that there must be a perfectly good explanation that you are somehow missing. You give the theory the benefit of the doubt, trusting the community of experts who have accepted it.
  4. Loss aversion refers to the relative strength of two motives: we are driven more strongly to avoid losses than to achieve gains. A reference point is sometimes the status quo, but it can also be a goal in the future: not achieving a goal is a loss, exceeding the goal is a gain.
  5. We quickly reached two conclusions: people attach values to gains and losses rather than to wealth, and the decision weights that they assign to outcomes are different from probabilities.
  6. The successful execution of a plan is specific and easy to imagine when one tries to forecast the outcome of a project. In contrast, the alternative of failure is diffuse, because there are innumerable ways for things to go wrong.
  7. The idea of denominator neglect helps explain why different ways of communicating risks vary so much in their effects. You read that “a vaccine that protects children from a fatal disease carries a 0.001% risk of permanent disability.” The risk appears small. Now consider another description of the same risk: “One of 100,000 vaccinated children will be permanently disabled.”
  8. The power of format creates opportunities for manipulation, which people with an axe to grind know how to exploit.
  9. For one thing, it helps us see the logical consistency of Human preferences for what it is: a hopeless mirage.
  10. The escalation of commitment to failing endeavors is a mistake from the perspective of the firm but not necessarily from the perspective of the executive who “owns” a floundering project. Canceling the project will leave a permanent stain on the executive’s record, and his personal interests are perhaps best served by gambling further with the organization’s resources in the hope of recouping the original investment, or at least in an attempt to postpone the day of reckoning.
  11. The sunk-cost fallacy keeps people for too long in poor jobs, unhappy marriages, and unpromising research projects. I have often observed young scientists struggling to salvage a doomed project when they would be better advised to drop it and start a new one. Fortunately, research suggests that at least in some contexts the fallacy can be overcome.
  12. Regret is one of the counterfactual emotions that are triggered by the availability of alternatives to reality.
  13. People expect to have stronger emotional reactions (including regret) to an outcome that is produced by an action than to the same outcome when it is produced by inaction.
  14. In the regulatory context, the precautionary principle imposes the entire burden of proving safety on anyone who undertakes actions that might harm people or the environment.
  15. Decision makers tend to prefer the sure thing over the gamble (they are risk averse) when the outcomes are good. They tend to reject the sure thing and accept the gamble (they are risk seeking) when both outcomes are negative.
  16. As economists and decision theorists apply the term, it means “wantability,” and I have called it decision utility. Expected utility theory, for example, is entirely about the rules of rationality that should govern decision utilities; it has nothing at all to say about hedonic experiences.
  17. A decision maker who pays different amounts to achieve the same gain of experienced utility (or be spared the same loss) is making a mistake.
  18. Nothing in life is as important as you think it is when you are thinking about it.
  19. Thoughts on any aspect of life are more likely to be salient if a contrasting alternative is highly available.
  20. The central fact of our existence is that time is the ultimate finite resource, but the remembering self ignores that reality. The neglect of duration combined with the peak-end rule causes a bias that favors a short period of intense joy over a long period of moderate happiness.
  21. The only test of rationality is not whether a person’s belief and preferences are reasonable, but whether they are internally consistent.
  22. Rationality is logical coherence, reasonable or not.
  23. The definition of rationality as coherence is impossibly restrictive; it demands adherence to rules of logic that a finite mind is not able to implement. Reasonable people cannot be rational by that definition, but they should not be branded as irrational for that reason.
  24. Deviating from the normal choice is an act of commission, which requires more effortful deliberation, takes on more responsibility, and is more likely to evoke regret than doing nothing.
  25. System 1 registers the cognitive ease with which it processes information, but it does not generate a warning signal when it becomes unreliable. Intuitive answers come to mind quickly and confidently, whether they originate from skills or from heuristics. There is no simple way for System 2 to distinguish between a skilled and a heuristic response. Its only recourse is to slow down and attempt to construct an answer on its own, which it is reluctant to do because it is indolent.
  26. The voice of reason may be much fainter than the loud and clear voice of an erroneous intuition, and questioning your intuitions is unpleasant when you face the stress of a big decision.
  27. Organizations can institute and enforce the application of useful checklists, as well as more elaborate exercises, such as reference-class forecasting and the premortem.
  28. Every factory must have ways to ensure the quality of its products in the initial design, in fabrication, and in final inspections. The corresponding stages in the production of decisions are framing of the problem that is to be solved, the collection of relevant information leading to a decision, and reflection and review. An organization that seeks to improve its decision product should routinely look for efficiency improvements at each of these stages.
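Several of the notes above are quantitative at heart: Bernoulli's risk premium (note 2), valuing gains and losses rather than wealth (note 5), denominator neglect (note 7), and loss aversion (note 8). A minimal sketch of these ideas, using the commonly cited value-function parameters from Tversky and Kahneman's 1992 cumulative prospect theory (α ≈ 0.88, λ ≈ 2.25, which are assumptions for illustration, not figures from this post):

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function (illustrative parameters).

    Outcomes are valued as gains or losses relative to a reference
    point, and losses are weighted more heavily than equal gains
    (loss aversion, note 8).
    """
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

# Loss aversion: a $100 loss outweighs a $100 gain.
assert abs(value(-100)) > value(100)

# Bernoulli's observation (note 2): with diminishing sensitivity,
# the expected value of a 50/50 gamble for $0 or $200, taken as a
# sure thing ($100), is preferred to the gamble itself.
gamble_value = 0.5 * value(0) + 0.5 * value(200)
sure_thing_value = value(100)
assert sure_thing_value > gamble_value  # risk aversion for gains

# Denominator neglect (note 7): the two risk descriptions are the
# same number, even though they do not feel the same.
assert (0.001 / 100) * 100_000 == 1.0
```

The assertions all pass; the point of the sketch is only that the "sure thing" premium and the equivalence of the two vaccine framings fall straight out of the arithmetic, while the feelings they produce do not.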

Thinking, Fast and Slow
