Thinking, Fast and Slow. 13 Availability, Emotion and Risk

Following the schedule of our Reading Calendar.

This chapter presents a variation of the availability heuristic, the affect heuristic, by which emotions (positive or negative) make certain ideas more available to us, and we play the classic substitution game: since we can’t really answer a complex question (would X be a good president?), we substitute a simpler question that System 1 can easily process (what are my feelings about X?).

Here we see how human perception and processing of risk is deeply influenced by the media, which present very improbable events, like dying of botulism or being killed in a terrorist attack, as highly relevant. The chapter also presents empirical evidence that messages about the benefits of a certain technology make most people judge that technology as less risky than they thought before. Nassim Taleb argues a similar position, showing how advocates of GMOs, when they have to argue that GMOs are not risky, instead start explaining all the benefits that GMOs will bring to humankind.

We are also introduced to the concept of the “availability cascade”, in which a series of connected availability and affect biases (usually generated by the mass media) “lead up to public panic and large-scale government action”.

The main point of this chapter, in Kahneman’s own words: there is “a basic limitation in the ability of our mind to deal with small risks: we either ignore them altogether or give them far too much weight, nothing in between.”

This chapter was, for me, less surprising than what I’ve been reading so far. It is a well-known fact that emotional impact makes us evaluate small risks as a lot greater than they are, that the media actively feed such fears, and that public panic is easy to create.

Thinking, Fast and Slow. 12 The Science of Availability

Following the schedule of our Reading Calendar

We are presented here with the availability heuristic. This is a particular case of the substitution trick that our System 1 performs, one we have already seen several times throughout the book. When trying to assess the relevance or frequency of a certain category, what we do is simply try to retrieve instances of that category from memory and assume that the ease of retrieval corresponds to the frequency of the category.

Such a procedure, like all substitutions, is prone to bias, because ease of retrieval may not equal the objective frequency of that category in the world, for a lot of reasons. The availability heuristic has been studied in depth and presents some curious characteristics. First, it is not necessary to retrieve any instance at all: before any retrieval, System 1 generates an intuitive guess of how easy the retrieval process will be and constructs the substitution judgement upon that guess.

It seems that, in the process of retrieval, the ease of retrieving matters much more than the number of instances retrieved. This creates the paradox that subjects of an experiment asked to generate 12 instances of a category label it less frequent than those asked to generate only 6. The point is that the 6 additional instances are much more difficult to retrieve. This difficulty surprises our System 1 (which is not a genius at making predictions) and influences the final assessment. And the interesting twist of it all is that this effect is erased if subjects are told that the background music they are hearing impairs their ability to concentrate. Then the number of instances retrieved becomes decisive, because the retrieval difficulty is explained by other causes. This is another example of System 2 resetting System 1’s expectations.

And this is our only hope: to learn how, even imperfectly, to use our System 2 to try to control the actions of System 1, because for most daily tasks System 2 simply cannot do the work.

It is this, or we just stop worrying about reality altogether, Gorgias style:

Nothing exists;

Even if something exists, nothing can be known about it;

Even if something can be known about it, knowledge about it can’t be communicated to others; and

Even if it can be communicated, it cannot be understood.


Thinking, Fast and Slow. Chapter 11: Anchors

Following the schedule of our Reading Calendar

This chapter introduces us to the anchoring effect. Closely related to the priming effects we saw in a former chapter, the idea is that if someone drops a number, even randomly, before one has to make an estimate, that estimate will be influenced by the number one heard before. In the words of Kahneman: “If you are asked whether Gandhi was more than 114 years old when he died you will end up with a much higher estimate of his age at death than you would if the anchoring question referred to death at 35. If you consider how much you should pay for a house, you will be influenced by the asking price.”

There are two different ways in which an anchoring effect may happen: anchoring as adjustment and anchoring as a priming effect.

In anchoring as adjustment, one starts from the anchoring number, decides whether it is too high or too low, and adjusts the final estimate by moving away from the anchor, usually stopping too soon.

In anchoring as a priming effect, we can’t actually do an adjustment because the number is too extreme to be believable (was Gandhi more or less than 144 years old when he died?), but we will still produce a higher number than we would without a previous anchor. When we are asked whether Gandhi was older or younger than 144 years, we can’t help but imagine a very old and venerable person, and so our estimate is misguided.

Kahneman presents a series of well-designed experiments that show the stability of the effect as well as measure it, using temperatures, the height of redwoods, the price of houses, or how much a person is going to give to a charity.
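Kahneman measures the effect with an “anchoring index”: how far the mean estimates of the high-anchor and low-anchor groups move apart, as a fraction of the distance between the anchors themselves. A minimal sketch in Python; the redwood figures below are the ones I recall from the chapter, so treat them as illustrative:

```python
def anchoring_index(high_anchor, low_anchor, mean_high, mean_low):
    """Kahneman's anchoring index: movement of the mean estimates as a
    fraction of the movement of the anchors (100% means the estimates
    track the anchors completely; 0% means no anchoring at all)."""
    return (mean_high - mean_low) / (high_anchor - low_anchor)

# Redwood experiment, figures as I recall them: anchors of 1,200 ft
# vs 180 ft produced mean estimates of 844 ft vs 282 ft.
print(f"{anchoring_index(1200, 180, 844, 282):.0%}")  # -> 55%
```

An index of 100% would mean people simply repeat the anchor, and 0% would mean they ignore it entirely; landing halfway between the two is a disturbingly large effect for a number that carries no information.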

There is one experiment, however, that I found especially terrifying: experienced judges were influenced by a roll of dice when having to decide the exact prison sentence they would give to a shoplifter. I imagine publicists, house sellers, lawyers and politicians reading this chapter and smiling wickedly…

Fortunately, the chapter also includes indications on how to fight against an anchor, when negotiating the price of a house, for example. The trick is to bring up System 2 to the rescue by looking for arguments against the anchor: “You should assume that any number that is on the table has had an anchoring effect on you, and if the stakes are high you should mobilize yourself (your System 2) to combat the effect”.


Thinking, Fast and Slow. 10 The Law of Small Numbers

Following the schedule of our Reading Calendar

Our automatic System 1, we know, is a genius of mathematics when complicated calculations have to be performed in order, for instance, to catch a ball thrown to us at high speed. However, this chapter is devoted to explaining to us, and convincing us, that this very System 1 is an extraordinarily lousy statistician. Statistics works with probabilities of things that may or may not happen and keeps in play simultaneously a lot of different options that could eventually materialize. This is something our System 1 is not prepared to work with. He needs certainty, simplicity and a chain of causalities that makes sense of the events he perceives around him.

We are introduced to this idea through what the author calls the “law of small numbers”. This is simply the fact that small samples generate a higher frequency of extreme observations than larger ones. Easy, right? Yet the chapter begins by confronting us with some observations on the incidence of kidney cancer in the US, and that allows us to notice how difficult it is to use in practice the knowledge (the law of small numbers) that we all understand in theory. This is Kahneman’s favorite strategy: begin the chapter by reducing the reader’s self-esteem to make them more attentive to the subsequent discussion.
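The statistical point is easy to reproduce. Below is a minimal simulation sketch (my own illustration, not from the book) in which every county shares exactly the same true incidence rate; the extreme observations, both the zero-incidence counties and the apparent hotspots, still come almost exclusively from the small ones:

```python
import numpy as np

rng = np.random.default_rng(0)
TRUE_RATE = 0.001  # identical underlying risk in every county (illustrative)

def observed_rates(population, n_counties=1000):
    """Observed incidence in n_counties counties of the given size."""
    cases = rng.binomial(population, TRUE_RATE, size=n_counties)
    return cases / population

small = observed_rates(1_000)    # small rural counties
large = observed_rates(100_000)  # large urban counties

# Small counties span from 0 to several times the true rate;
# large counties cluster tightly around 0.001.
print("small:", small.min(), small.max())
print("large:", large.min(), large.max())
```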

There are two cognitive biases presented to us in the chapter. One of them says that we are always much more interested in knowing facts than in assessing the reliability of those facts. Assigning reliabilities implies assigning degrees of reality to the facts that we know. System 1 cannot cope with that. For him, events either happen or they don’t. No fuzzy logic here.

The second is our deep inability to recognize randomness. Faced with randomness, we see patterns and clusters. We look for causal explanations for everything, although most of what happens is just random. We are “fooled by randomness”. This is all very Talebian.
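A quick way to feel this bias: truly random sequences contain much longer streaks than our intuition expects, and every streak begs for a causal story. A small sketch, my own illustration with fair coin flips:

```python
import random

random.seed(1)

def longest_streak(n_flips):
    """Length of the longest run of identical outcomes in n fair coin flips."""
    flips = [random.choice("HT") for _ in range(n_flips)]
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

# In 100 fair flips the longest streak is typically 6 to 8, long enough
# to look like a "hot hand" to System 1.
print(longest_streak(100))
```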

Understanding these two biases is of paramount importance in order to understand the limitations of our cognitive systems. Being able to develop mechanisms to fight them in practice, and to be conscious of their presence when we are falling into them, looks to me like a Herculean task. I am really curious to know whether this book, beyond its theoretical description of human cognitive weaknesses, will be able to provide any kind of help.