Notes On Thinking Fast and Slow by Daniel Kahneman

Rating 8.5/10

Economics Nobel Prize winner Daniel Kahneman summarizes his research at the intersection of psychology and economics. The book is one of the most thought-provoking I have read in the past year and one I come back to daily in my own observations of my environment. It is long but divided into easily digested chapters.

Part 1: Two Systems

  1. The Characters of the Story

System 1: Automatic and quick, no effort needed, involuntary.

System 2: Allocates attention to effortful mental activities involving computation. This is the system we identify ourselves with.


System 1 has learned associations between ideas. It reads, and it understands social situations. It retrieves information from knowledge in memory. Attention can be pulled away from an unwanted focus by concentrating on something else.

System 2 has some power over System 1: it allows you to “pay attention”, and it lets you do multiple tasks at once, but only if the tasks are easy and undemanding.

Both systems are active whenever we are awake. Biases come mainly from System 1, which is susceptible to illusions; to avoid falling into its traps, one must learn to mistrust impressions and intuitions. The two systems described in the book are abstract entities standing for automatic and effortful thinking.

  1. Attention and Effort

System 2 is the support of System 1 but believes itself the hero: the main actor is System 1, yet System 2 is fooled into believing it leads. A characteristic of System 2 effort is that it changes your pupil size; the pupils offer a metric of the current rate at which mental energy is being spent. Laziness is built deep into our nature because cognitive expenditure is demanding, and switching from one task to another is worse still, especially under pressure.

  1. The Lazy Controller

Flow is described much as in Flow by Mihaly Csikszentmihalyi: a state of effortless concentration so deep that you lose the sense of time. People who are cognitively busy are more likely to make selfish choices; the depletion of self-control by prior effort has been called ego depletion. The evidence suggests that activities placing high demands on System 2 require self-control. System 2 monitors and controls the thoughts and actions suggested by System 1. What happens when System 1 reaches a conclusion and System 2 is lazy in accepting it? System 2 endorses an intuitive answer that could have been rejected with minimal effort: because of the law of least effort, people avoid thinking the answer through with System 2, which leads to overconfidence and too much faith in intuition.

  1. The associative machine

Association is an operation of System 1. An unexpected event will be made sense of as far as possible by System 1. Priming is exposing someone to a word or image so that it nudges a desired reaction; primed ideas may ripple outward and prime other ideas with less intensity. Priming can be very subtle: some cultures provide frequent reminders of respect, god, or obedience. People feel disbelief when taught about priming because System 2 believes it is in charge of making the choices.

  1. Cognitive Ease

Conscious and unconscious computations happen in the brain. Unconscious assessments of the environment are made by System 1, and they determine whether the extra effort of a System 2 analysis is needed. Cognitive ease produces an illusion of familiarity with things you have read or seen before. If you want to sound credible, avoid complex language and keep your message simple; if you cite a source, choose one with an easily pronounced name.

  1. Norms, Surprises, and Causes

System 1 assesses normality: it maintains and updates a model of the world. A single occurrence of an incident makes its recurrence less surprising, and System 1 also detects abnormality. There is a built-in search for causality, as with Nassim Taleb’s black swans: two opposite headlines can explain the same event a posteriori, and your mind is ready and eager to identify agents and assign them personality traits and motives. System 2 can think statistically, but it needs proper training.

  1. A Machine for Jumping to Conclusions

Jumping to conclusions is a property of System 1, because you are not always aware that a choice is being made; when System 1 is uncertain, it simply bets on an answer. System 1 defaults to believing automatically; unbelieving is a task for System 2. Confirmation bias arises when, instead of testing hypotheses by trying to refute them, we search for evidence that supports them. The halo effect arises when the feeling you have about one trait of a person is extrapolated to their whole personality without real evidence, increasing the weight of first impressions.

  1. How Judgements Happen

There is no limit to the questions you can answer or the attributes you can evaluate; such questions are addressed by System 2, while System 1 runs a continuous assessment of the situation in order to survive.

  1. Answering an Easier Question

If the answer to a question is not found quickly, System 1 substitutes a related, easier question and answers that instead. Heuristics are simple processes that yield adequate yet imperfect answers, making it easy to generate quick answers to difficult questions. The affect heuristic is a dominance of conclusions over arguments.

Part 2

  1. The Law of Small Numbers

When information is given to you, System 2 automatically formulates a hypothesis. The trouble is that most explanations of a fact could serve equally to prove it or disprove it. Statistical facts depend heavily on sample size; causation merely gives a satisfying sense of explanation. The smaller the sample, the more likely an extreme result, which is the source of sampling error in small populations, for example a small town. “People are not adequately sensitive to sample size.” Certainty is favoured over doubt, and we exaggerate coherence and consistency. Causal thinking exposes us to mistakes when evaluating randomness: the human mind seeks patterns, a great example being the “hot hand” or any kind of streak in sports.
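
The small-sample effect is easy to see in a quick simulation. Here is a minimal Python sketch (my own illustration, not from the book) comparing the most extreme boy/girl ratios observed in small versus large hospitals:

```python
# Sketch (my own illustration, not from the book): draw many samples from a
# population where 50% of births are boys, and record the most extreme
# deviation from 50% seen across 1,000 hospitals of each size.
import random

random.seed(0)

def most_extreme_deviation(sample_size, n_hospitals=1000):
    """Largest deviation from 50% boys observed across all hospitals."""
    worst = 0.0
    for _ in range(n_hospitals):
        boys = sum(random.random() < 0.5 for _ in range(sample_size))
        worst = max(worst, abs(boys / sample_size - 0.5))
    return worst

small = most_extreme_deviation(10)    # small hospital: 10 births
large = most_extreme_deviation(1000)  # large hospital: 1,000 births
# Small samples routinely produce far more extreme results than large ones.
```

The 10-birth hospitals regularly show days with 80-100% boys; the 1,000-birth hospitals never stray far from 50%.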

  1. Anchors

The anchoring effect occurs when people consider a particular value for an unknown quantity before estimating it; something more expensive looks more valuable as a result. An everyday example is driving too fast in the city after coming off the highway. Adjusting away from an anchor once it has been dropped requires System 2 effort. Our behaviour is influenced far more than we know or want.

  1. The Science of Availability

The availability heuristic is what people use to estimate the frequency of a category: they think of examples of the class they know and respond accordingly. The heuristic replaces the question of frequency with the ease with which examples come to mind, and this substitution produces a systematic error, so one must be aware of one’s own biases. System 1 sets expectations and is surprised when those expectations are violated.

  1. Availability Emotion and Risk

Availability is very important to the study of risk. A study by Howard Kunreuther found that victims and near-victims are highly concerned immediately after a disaster. Governments normally build infrastructure adequate for the worst disaster in history, not the worst possible disaster; a clear and widely used example is the Fukushima power station disaster in Japan. The practice goes back to the ancient Egyptians, who marked the highest historic level of the Nile with a line. The affect heuristic creates a world for us that is much tidier than the real one, where all decisions are easy, all technology is cheap, and so on; it downplays trade-offs. The evaluation of risk depends on the choice of measure.

  1. Tom W’s speciality

The proportion of each category in a population is called the base rate. People generally don’t think of the base rate when categorizing an item; they think of the item’s characteristics and fit it into a category, confusing similarity with probability. This is the representativeness heuristic: assigning an item to a category by inference from its characteristics. It is the root cause of stereotyping. Suppose there are two categories, one very big and one comparatively small, and an individual is drawn at random from the combined population. Even if the person’s description fits the stereotype of the smaller category, it is more probable that the person belongs to the bigger one: there are simply more people in it, and therefore more variation within it. Rational probability lives in the numbers, not in the similarities; relative size is king. “What you see is all there is” makes System 1 treat all the information that comes its way as true. Bayes’ theorem formalizes how base rates and evidence should be combined.
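
Bayes’ theorem makes the base-rate argument concrete. A minimal sketch with made-up numbers (the 3%/97% base rates and the likelihoods are my assumptions, not figures from the book):

```python
# A minimal Bayes'-rule sketch with made-up numbers (the 3%/97% base rates
# and the likelihoods are my assumptions, not figures from the book).
def posterior(prior_a, prior_b, likelihood_a, likelihood_b):
    """P(category A | evidence) for two mutually exclusive categories."""
    unnorm_a = prior_a * likelihood_a
    unnorm_b = prior_b * likelihood_b
    return unnorm_a / (unnorm_a + unnorm_b)

# The description fits the small category four times better (0.8 vs 0.2),
# yet the base rate (3% vs 97%) keeps the small category unlikely.
p_small = posterior(0.03, 0.97, 0.8, 0.2)
print(round(p_small, 3))  # → 0.11
```

Even with evidence that favours the small category four to one, the posterior stays around 11%: relative size is king.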

  1. Linda: Less Is More

Linda is a fictitious character Kahneman created for an experiment, designed to fit the stereotype of a left-wing feminist. Subjects are asked to judge likelihood, and both “bank teller” and “feminist bank teller” are included as options to describe Linda in the questionnaire. Logically, “bank teller” is the wider category and therefore the more probable description; however, the primed subject picks “feminist bank teller” because it fits the stereotype better, even though it is a strict subset of “bank teller”. This failure of System 2 is what Kahneman calls the conjunction fallacy. As with the Müller-Lyer illusion, the wrong answer remains attractive even after the subject discovers it is an illusion. The more detailed statement sounds more plausible, but adding detail can only make a statement less probable.
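
The conjunction rule behind the fallacy takes only two lines of Python (the probabilities below are illustrative assumptions of mine, not numbers from the experiment):

```python
# The conjunction rule with illustrative probabilities of my own choosing:
# P(bank teller AND feminist) can never exceed P(bank teller).
p_bank_teller = 0.05            # assumed marginal probability
p_feminist_given_teller = 0.6   # assumed conditional probability
p_both = p_bank_teller * p_feminist_given_teller  # 0.03

# The conjunction is less probable no matter what numbers are chosen,
# because a conditional probability is at most 1.
```

Whatever values are plugged in, the conjunction never beats its parent category.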

  1. Causes Trump Statistics

People fail to use Bayesian inference and instead default to non-statistical causal inference. Stereotypes, in the neutral sense, work when an individual forms the image of a model person, categorizing by appearance. The helping experiment shows how individuals feel relieved of responsibility when they are part of a group: people convince themselves they would have helped the victim, but in truth they would rather another member of the group dealt with it. The problem is that changing one’s mind about human nature is difficult. “The test of learning psychology is whether your understanding of situations you encounter has changed, not whether you have learned.”

  1. Regression to the Mean

The psychology of effective training says it is better to reward improved behaviour than to punish mistakes. However, in training, both mistakes and improvements may be produced by random fluctuations in the quality of performance. Over time, extreme results return to the average, and the more extreme the result, the bigger the return. Regression is not causal, yet misguided causal stories are often used to explain its effects. The correlation coefficient, between 0 and 1, measures the relative weight of the factors shared between two variables that explain the outcome. Our minds are nonetheless biased towards causal explanations. This is why a control group is used: to see whether the same results are obtained without the variable under study.
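
Regression to the mean falls out of any model where performance mixes stable skill with random luck. Here is a small simulation sketch (my own, not from the book):

```python
# Sketch: observed performance = stable skill + random luck. Select the top
# scorers on day 1 and compare their day-1 and day-2 averages.
import random

random.seed(1)

skills = [random.gauss(0, 1) for _ in range(10000)]
day1 = [s + random.gauss(0, 1) for s in skills]
day2 = [s + random.gauss(0, 1) for s in skills]

# Indices of the top 1% of day-1 performers.
top = sorted(range(len(day1)), key=lambda i: day1[i], reverse=True)[:100]
avg_day1 = sum(day1[i] for i in top) / len(top)
avg_day2 = sum(day2[i] for i in top) / len(top)
# avg_day2 sits well below avg_day1: the luck component does not repeat,
# though it stays above the population mean because the skill is real.
```

No punishment or reward is involved: the day-2 drop is pure selection plus noise, which is exactly why causal stories about it mislead.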

Part 3: Overconfidence.

  1. The Illusion of Understanding.

Kahneman references Nassim Taleb’s work: in The Black Swan, Taleb introduces the narrative fallacy, described as “flawed stories of the past that shape our views and our expectations of the future”. We try to make sense of the world. A story explains an event in hindsight, and the simpler and more coherent the better. The test of an explanation is whether it would have made the event predictable in advance. The more luck involved in an event, the less there is to be learned from it.
When something unexpected occurs, we change our view of the world to accommodate the event in our framework. Once the updated view has crystallized, it is very difficult to recall previous beliefs. This is the “I-knew-it-all-along” or hindsight bias: if an event occurs, people tend to exaggerate the probability they had assigned to it earlier. Then there is outcome bias: when outcomes are bad, we blame decision-makers for not seeing what is only visible after the fact. Kahneman proceeds to critique the figure of the superstar CEO, who is generally praised for results with a huge component of luck.

  1. The Illusion of Validity

System 1 likes to jump to conclusions from little evidence, and you cannot forecast success with certainty. Returning to the Müller-Lyer illusion: even after subjects are told the lines are of equal length, they still see them as different. This is what Kahneman calls the illusion of validity. He then investigates this illusion in the stock market as the illusion of stock-picking skill. Who buys the stock you sell? Probably another trader. What do sellers know that buyers don’t, given that both parties have the same information? Both think the price is wrong. Two out of every three mutual funds underperform the market; firms are rewarding luck as if it were skill.

  1. Intuitions vs. Formulas.

Experts are inferior to algorithms. Experts are too confident in their own opinions and assign too much weight to their intuitions.

  1. Expert Intuition: When can we trust it?

Exchanges and professional controversies are largely a waste of time; they are rarely instructive. Pavlov’s conditioning experiments, in which dogs learn that the bell means food is coming, illustrate a kind of learned intuition that also applies to humans. Kahneman argues that firefighters and chess players are examples: their success has a large intuitive component. He calls this professional intuition; it is reliable in roles like nursing or firefighting, but it should not be trusted in fields built on unsupportable long-term forecasts, such as stock picking or political commentary. Even the best algorithms, although better than humans, are never very accurate.

  1. The Outside View

The proper way to elicit information from a group is not to start with a public discussion but to collect everyone’s judgement confidentially. There is an inside view and an outside view. The inside view focuses on specific circumstances and personal experience and tends to extrapolate; it does not take the base rate into account. The planning fallacy describes forecasts that are unrealistically close to best-case scenarios; they could be improved by consulting the statistics of similar cases. Buildings are a clear example, as they always go over budget, and contractors make most of their money on additions to the original plan.

  1. The Engine of Capitalism

All of this reflects the optimistic bias: we see the world as more benign than it really is, and our own prospects as better than they really are. With optimistic bias you always feel lucky, so you take on more challenges and risks, and an optimistic temperament encourages persistence in the face of obstacles. Confidence is valued over uncertainty: an expert who discloses the full extent of their ignorance will be replaced by a more confident competitor who wins the client’s trust.

Part 4: Choices

  1. Bernoulli’s errors

To an economist, the agent of economic theory is rational and selfish, and his tastes do not change. The foundation of expected utility theory is this rational agent. Bernoulli reasoned that the psychological response to a change of wealth is proportional to the initial amount of wealth (we naturally speak of percentage rises), which makes utility a logarithmic function of wealth. The problem with Bernoulli’s thesis is that people choose the sure thing when the options increase wealth, but choose the gamble when they stand to lose. Bernoulli’s idea lacks a reference point: some outcomes are good and others bad, and not all pairs of risks and sure things are equal.
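
A minimal sketch of Bernoulli’s logarithmic utility (all wealth figures below are illustrative, not from the book):

```python
# Sketch of Bernoulli's idea: utility is logarithmic in wealth, so equal
# dollar gains matter less at higher wealth, and a log-utility agent is
# risk-averse. All wealth figures are illustrative.
import math

def utility(wealth):
    return math.log(wealth)

# Diminishing marginal utility: the same $100k gain at two wealth levels.
gain_when_poorer = utility(300_000) - utility(200_000)
gain_when_richer = utility(1_100_000) - utility(1_000_000)

# Risk aversion: a sure $1.5M beats a 50/50 gamble between $1M and $2M,
# even though both options have the same expected wealth.
sure = utility(1_500_000)
gamble = 0.5 * utility(1_000_000) + 0.5 * utility(2_000_000)
```

Note that both comparisons are stated in levels of total wealth, which is exactly the missing piece Kahneman criticizes: the function has no reference point from which gains and losses are felt.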

  1. Prospect Theory

Attitudes to risk differ between gains and losses: a sure loss is strongly aversive, which drives people towards the gamble. Attitudes to gains and losses are not referenced to total wealth. Prospect theory is the alternative: it is represented as an S-shaped value function in which “losses loom larger than gains”. Studies put the loss-aversion ratio between 1.5 and 2.5. Even so, rationality remains a pillar of economics.
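
A common way to write the S-shaped value function is a power curve with a loss-aversion coefficient. The sketch below uses an exponent of 0.88 and a coefficient of 2.0 as illustrative parameters (2.0 sits in the 1.5-2.5 range the notes cite; neither figure is taken from the book):

```python
# Sketch of an S-shaped prospect-theory value function. The exponent (0.88)
# and loss-aversion coefficient (2.0) are illustrative parameters.
def value(x, alpha=0.88, loss_aversion=2.0):
    """Subjective value of a gain or loss x relative to the reference point."""
    if x >= 0:
        return x ** alpha
    return -loss_aversion * ((-x) ** alpha)

# Losses loom larger than gains of the same size:
loss_looms = abs(value(-100)) > value(100)
# Diminishing sensitivity: the step from 100 to 200 adds less value
# than the step from 0 to 100.
diminishing = value(200) - value(100) < value(100) - value(0)
```

The two properties checked at the end are what give the curve its S shape: steeper for losses than gains, and flattening as the amounts grow.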

  1. The Endowment Effect

Now Kahneman introduces indifference curves and marginal utility; the elegance of the model blinds students and scholars. There is a preference for the status quo that is inherent to loss aversion: choice is biased towards the reference situation. The endowment effect is that owning a good increases the owner’s perception of its value, so the minimum selling price and the maximum buying price don’t match. There is pain in giving something up and pleasure in obtaining it. There is no loss aversion on either side of routine commercial exchanges, for example when comparing the relative worth of one item against another. The poor always choose at a loss: acquiring one good makes it impossible to acquire another. This is a textbook budget constraint.

  1. Bad Events

The amygdala is the threat centre of the brain; it gives priority to bad news. Kahneman notes that golfers putt better for par than for birdie, which loss aversion explains (the effect holds even after accounting for the difficulty of the putt): a bogey registers as a loss, so players try harder to avoid it. A rule of fairness is that the exploitation of market power to impose losses on others is unacceptable.

  1. The Fourfold Pattern

When assessing a single item, weights are assigned to its characteristics, and some characteristics influence the assessment more than others. The expectation principle says that equal increases in probability should carry equal weight, but they do not: the weight depends on where the increase falls (e.g. from 0% to 5% versus from 95% to 100%). Going from 5% to 10% doubles the probability, yet studies show it doesn’t psychologically feel like a doubling, while going from 0% to 5% creates the possibility effect, which is akin to why people buy lottery tickets. In relative terms, the same five-point change can feel very different depending on where it falls. At the other end is the dichotomy of near-certainty versus certainty, the gap between 95% and 100%: a 95% chance of disaster still leaves a sliver of hope. The Allais paradox shows the inconsistency people display when a sure thing is pitted against a gamble.

                    GAIN           LOSS
High probability    risk-averse    risk-seeking
Low probability     risk-seeking   risk-averse

  1. Rare events

Availability cascade: the mechanism behind terrorism’s impact. The media spread images of terrorism and it is widely talked about; although System 2 may “know” that the probability of being a victim is low, the emotion is disproportionate to the actual chance. Oversimplifying: people overestimate the probabilities of unlikely events and weight unlikely events too heavily in their decisions. A vivid representation of the outcome also reduces the weight of probability in the decision. Denominator neglect: subjects choose the option with the larger absolute number of “marbles” of the desired colour instead of the one with the higher probability of drawing them. Negative events suffer most from this when framed as relative frequencies, like one in a million; System 1 deals better with individuals than with categories. 1,286 out of 10,000 is judged as more dangerous than 24.4 out of every 100 (check this).
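
The arithmetic behind that last comparison, using the figures as noted (the note itself flags the 24.4 figure for verification):

```python
# Checking the rates behind the example above (figures as noted; the note
# itself flags the 24.4 figure for verification).
rate_a = 1286 / 10000  # 12.86% — judged as MORE dangerous
rate_b = 24.4 / 100    # 24.4%  — judged as LESS dangerous
# The larger-numerator framing wins even though its actual rate is lower:
# System 1 reacts to the 1,286 individuals, not to the denominator.
```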

  1. Risk Policies

There is an intuition that playing a very risky but favourable gamble many times reduces the subjective risk. This is only true if the gambles are truly independent of each other and the possible loss is small relative to your wealth. Broad framing in trading: you are less prone to churning your portfolio if you don’t check how every single stock is doing every day. Having a risk policy, i.e. always acting the same way in the face of a class of decisions, prevents narrow framing, and although it may bring losses at times, it is financially beneficial in the long run.
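
The broad-framing intuition can be checked by simulation. The sketch below is my own, with an assumed even-odds gamble of +$200/-$100 (not a gamble taken from the book); it estimates how rarely a bundle of 100 independent plays ends in a net loss:

```python
# Sketch (assumed gamble, not from the book): a single even-odds play that
# wins $200 or loses $100 feels risky, but a bundle of 100 independent
# plays almost never ends in a net loss.
import random

random.seed(2)

def bundle_outcome(n_plays=100):
    return sum(200 if random.random() < 0.5 else -100 for _ in range(n_plays))

n_bundles = 10000
losing_bundles = sum(bundle_outcome() < 0 for _ in range(n_bundles))
loss_fraction = losing_bundles / n_bundles  # a tiny fraction of bundles lose
```

This is the policy view: evaluated one play at a time (narrow framing) the gamble feels threatening, but the bundle is a near-certain gain.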

  1. Keeping Score

The main motives for seeking money are, in most cases (apart from the obviously very poor), not economic. Accounts, both mental and financial, are a way to keep things under control and manageable by a finite mind. The disposition effect: an investment is evaluated as a winner or a loser relative to its purchase price rather than by its actual value and future prospects; the past shouldn’t matter. Investing additional resources into a losing account is the sunk-cost fallacy; the correct decision is based on future prospects alone, whether that means cancelling the current project and investing elsewhere or finishing it.

  1. Reversals

Our world is broken down into categories for which we have norms. The assessment of dollar value is normally handled by substitution.

  1. Frames and Reality

Equal truth conditions mean that if one sentence is true, the other is true as well, for example “France lost” and “Italy won”. Costs are not losses: a discount and a surcharge are not psychologically equivalent even if the end price is the same. Even professionals fall prey to framing; it takes an alert System 2.

  1. Two Selves

To economists, utility means “wantability”. In a negative experience, the peak-end rule applies: remembered pain is the average of the pain at the worst moment and the pain at the end. There is also duration neglect: the duration of the episode has almost no effect on the pain rating. This is the subject of Kahneman’s TED talk about the experiencing self and the remembering self. Confusing experience with memory is a cognitive illusion.
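
The peak-end rule is just an average of two samples, which a short sketch makes concrete (the pain values on a 0-10 scale are invented for illustration):

```python
# Peak-end sketch: remembered pain = average of the worst moment and the
# final moment; total duration is neglected. Pain values (0-10) are invented.
def remembered_pain(pain_over_time):
    return (max(pain_over_time) + pain_over_time[-1]) / 2

short_intense = [2, 8]            # ends right at its peak
longer_tapered = [2, 8, 7, 5, 4]  # same peak, but a milder ending added

# The longer episode contains strictly more total pain, yet is remembered
# as LESS painful because it ends on a milder note (duration neglect).
```

This mirrors the experiencing-self vs remembering-self split: adding extra discomfort at the end improves the memory while worsening the experience.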

  1. Life as a Story

We think of life as a story and wish it to end well. Caring for people often takes the form of concern for their stories. This is the vacation example from the TED talk: Kahneman describes vacations as serving memory and storytelling.

  1. Experienced Well-Being

Referring to Flow by Mihaly Csikszentmihalyi: attention is the key to the state of flow, and our emotional state depends on what we focus on. Combining eating with other activities, for example, dilutes the experience.

  1. Thinking About Life

Affective forecasting: not believing that the statistics apply to you. Many simple questions get replaced by a global evaluation of life. People who set money as a life goal and don’t reach it are more dissatisfied than the equally poor who never had that goal. The focusing illusion: nothing in life is as important as you think it is while you are thinking about it. Kahneman references a study showing that climate doesn’t affect well-being; only people who have recently moved to a better climate name it as an influence. The same happens with people who become paraplegic after an accident.

Conclusions


There are absurd conflicts between the two selves; an objective observer making the decisions would, in many choices, take the alternative option. In economics, rationality doesn’t mean being reasonable; it means being consistent. The assumption that agents are rational provides the intellectual foundation for the libertarian approach to policy. When people act in ways that look odd, we should examine the possibility that they have a good reason to do so. There is a debate between the Chicago school and behavioural economists. Organizations are better at avoiding errors than individuals.