Thinking Fast and Slow: 36 Life as a Story

Following the schedule of our Reading Calendar

Well… This chapter is like really well… boring.

My rephrasing: some friends of Kahneman’s, after watching “Total Recall”, decided to run a pointless experiment about how important memories are when evaluating past experiences. He put them in his book so they would get some extra credit. Plus, the chapter starts with a little digression on La Traviata so he also sounds cool and intellectual. Meh!

 


Thinking, Fast and Slow. 34 Frames and Reality

Following the schedule of our Reading Calendar

We are presented here with another bias, the framing bias. The author also uses this chapter, I guess for no other reason than that it is the last chapter of part 4, to defend the existence of Humans as a species different from Econs. The chapter ends with a sort of bitter complaint that economics is not accepting the truths coming from behavioral psychology fast enough.

This is one of my main caveats about this book. When reading it, it is good advice to separate very carefully two kinds of insight. On one side, there are genuinely new and helpful ideas that help us know ourselves better and that can help us avoid costly mistakes in life (at least in theory). On the other side, there is a lot in it that only makes sense as a rebuttal of a set of outlandish ideas held by economists living in another universe. I am not interested at all in whether it is true, as mainstream economists supposedly believe, that cows are spherical. Take a good book on marketing and I suspect (I don’t know the field) that there will be much more information, and more fun, about real human economic behavior than there is here. This is a book about a fight between scholars for respect and prestige, written in scholarly language, and I am happy that we are finally nearing the end.

As for the framing bias, this is something we all intuitively know: we respond differently to identical or very similar questions depending on how they are presented to us. The chapter explains some interesting examples, although I don’t understand the miles-per-gallon story. Beth consumes less petrol than Adam, and makes a move to consume even less, a reduction of 25%. Adam consumes a lot, makes a reduction of only 14%, and ends up consuming more than twice what Beth consumed before her change. In what possible sense is Adam’s action more significant than Beth’s? Because the absolute amount of petrol saved is bigger?
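Checking the arithmetic helps here. Assuming the chapter’s figures are Adam going from 12 to 14 mpg and Beth from 30 to 40 mpg (which matches the 14% and 25% reductions above), and assuming some fixed yearly mileage, say 10,000 miles (my number, not the book’s):

```python
# Gallons used over a fixed distance = distance / mpg,
# so savings depend on the change in gallons-per-mile, not in mpg.
MILES = 10_000  # assumed annual mileage (not from the chapter)

def gallons(mpg):
    return MILES / mpg

adam_saved = gallons(12) - gallons(14)  # guzzler: 12 -> 14 mpg
beth_saved = gallons(30) - gallons(40)  # efficient car: 30 -> 40 mpg

print(round(adam_saved, 1))  # 119.0 gallons
print(round(beth_saved, 1))  # 83.3 gallons
```

So yes: despite the smaller percentage improvement, Adam saves more actual petrol, because mpg is the inverse of consumption and small mpg gains at the low end matter more.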

Thinking Fast and Slow: 33 Reversals

Following the schedule of our Reading Calendar

In this chapter we are presented with a different type of bias: how we humans make different decisions depending on whether we can compare two different outcomes or not. When we are presented with two situations to evaluate, the way we decide is highly influenced by the way the information is presented. In one of the first experiments described, people choose a safer bet over a riskier one when the two are presented together, even though the riskier bet can pay more money. However, if the bets are presented separately and one has to state the minimum amount of money one would accept for each of them, people tend to put a much higher price on the riskier bet.

Of course, such results are devastating for the von Neumann/Morgenstern model, which supposes that the choices humans make are “rational” and consistent, but this systematic inconsistency in the way we make decisions should worry us too, as shown in several experiments related to trials and juries.

And in isolation we can forget very basic facts. In another experiment, people are asked for a donation either to save dolphins from pollution or to protect farmworkers from skin cancer. Evaluated alone, the dolphin cause tends to receive more money: the emotional impact of an endangered dolphin is greater than that of a farmer getting skin cancer; after all, farmers are supposed to work under the sun… However, when the two causes are presented together, people give more money to the farmers. After all, they are human, and dolphins are not.

The main reason for such a discrepancy is that when we have to choose among several options it is more likely that system 2 activates. So this chapter gives us an interesting heuristic: when you suspect you are making a rushed system 1 evaluation, consider other choices and force system 2 to show up. I think I’ll start using it right now.

 

Thinking, Fast and Slow. 32 Keeping Score

Following the schedule of our Reading Calendar

Money is important, not per se, but because of its role as a proxy for feelings of accomplishment. Behavior in real humans is dictated not by rational calculations of gains and losses but by the pursuit of emotional satisfaction. And emotions have their particularities: sometimes they coincide with the mythical rational man and sometimes they do not.

In this chapter we read about some psychological traits that are everywhere, are very relevant in everyday life, and are quite well known. We are talking about the “sunk cost” effect, which keeps you losing time and money on bad projects. We have the “disposition effect”, which pushes you to sell well-performing investments and keep the losers. We have an asymmetry between the regret we feel when our actions have bad consequences and what we feel when our omissions have bad consequences. We feel much less pain when things go badly after following the norm than when they go badly after breaking the norm.

That’s how we are, and if it is difficult to overcome these feelings in oneself, trying to fight them in the people around you, when discussing joint actions that affect you, is impossible. And that brings us to the interesting discussion that closes the chapter. All of these are biases that decrease our overall practical and financial success. Sometimes they can seriously damage our health and life expectancy (we are overmedicated and overtreated because of these biases). However, it may simply be impossible to fight them. We hate losing money in a transaction in the same way that we like sugar. It is physiological. You can’t help it. Maybe you just have to accept it and try to optimize your emotional well-being, even knowing that you are harming your portfolio, wealth and health.

And that reminds me of what I have always defended as the most important argument in favor of having kids: it is what everybody does. If, when you are old, you discover that you made a mistake by doing what everybody does, you will get over it and be quite happy nevertheless. However, if your mistake was going against the norm, you are going to suffer badly.

Thinking Fast and Slow 31: Risk Policies

Following the schedule of our Reading Calendar

As if on a rollercoaster, we are back to a pseudo-bias in which stupid humans are unable to think the way probability experts think they should.

It is clear that Carlos and I share the view of rationality that Kahneman presents here:

it helps us see the logical consistency of Human Preferences for what it is -a hopeless mirage.

However, I don’t think that the example presented in the first pages is a proof of such a consideration.

In the example, first we have to choose between a sure gain and a certain probability of winning some more. Without attempting any calculation, most people will settle for the sure gain. However, if we are presented with two choices that both involve probabilities, then we’ll do the calculation.

I don’t think that’s a problem, really. There is another way of reading the two situations. In the first situation we are faced with a real decision, and in the real world going for something sure is, after all, not a bad heuristic. However, in the second situation it is clear from the framing of the problem that the experimenter wants us to make probabilistic calculations, so we do.
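The book’s exact figures escape me, so here is a made-up illustration of the kind of calculation the probabilistic framing invites, and of what the sure-thing heuristic leaves on the table:

```python
# Hypothetical numbers (not the chapter's): compare a sure gain with
# a risky prospect by expected value, as a probability expert would.
sure_gain = 46
risky_ev = 0.80 * 60 + 0.20 * 0  # 80% chance of winning $60

print(sure_gain)  # 46
print(risky_ev)   # 48.0 -- the gamble "wins" on expected value,
                  # yet most people still take the sure thing
```

The two-point difference is the price of certainty, which in a one-shot real-world decision may be a perfectly reasonable price to pay.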

If the main point here were that we humans are not rational in the sense described by Morgenstern and von Neumann, I’d agree completely, but here Kahneman seems to argue that because we are not computing probabilities all the time we are doomed to failure and are irrational.

The main problem described in this chapter, Samuelson’s problem, is boring as hell and is another elaboration on how far humans are from being perfect statisticians.

The last part of the chapter, on risk policies, makes sense (have a risk policy and apply it routinely whenever a relevant problem arises), but the description is too broad to be actionable.

Besides, nothing new is presented. This chapter is mostly dispensable.

 

 

Thinking, Fast and Slow. 30. Rare Events

Following the schedule of our Reading Calendar

Well, here we are again, dealing with a real bias. By that I mean a psychological characteristic of humans that provokes decisions that can harm them. We are talking here about the tendency to overestimate and overweight very unlikely events.

This is a natural tendency, difficult to overcome even when we are perfectly conscious of its presence and of the absurdity of our behavior. Kahneman explains his experience with buses in Israel, but I am sure every one of us can come up with similar stories from our private lives.

The central point of the chapter is that this insensitivity to probability (by overestimation or neglect) is the result of the general biases of our system 1: cognitive ease, the availability bias and the confirmation bias. In particular, the most important point is the vividness we attribute to the image of the rare event we are judging. Events that can be pictured clearly and vividly generate a response from our system 1 that contributes to the overweighting of that event. That is the case, for example, with “denominator neglect”, because system 1 understands cases and individuals much better than percentages and statistics.
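Denominator neglect is easy to state in numbers. If I recall the chapter’s urn experiment correctly, subjects could draw from an urn with 1 red marble out of 10 or from one with 8 red out of 100, and many preferred the larger urn, seduced by the vivid image of eight winning marbles:

```python
# Denominator neglect: the urn with more red marbles "feels" better
# even though its proportion of reds is worse.
small_urn = 1 / 10    # 10% chance of drawing red
large_urn = 8 / 100   # 8% chance of drawing red

print(small_urn > large_urn)  # True: the small urn is the better bet
```

System 1 counts the eight marbles; system 2 would have to divide by the denominator, and often it never gets asked.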

This knowledge creates tools for manipulation and deception. The way a case is presented has a big influence on whether it generates an overreaction to a rare event.

It is interesting to see the different responses of the subjects studied to what is called here “choice from experience” versus “choice from description”. Once again, it comes down to the question, which we have commented on previously, of how many of these problems and biases are real problems in decision making and how many are artifacts that appear only when a verbal interpretation of the choices mediates the decision process.