Thinking Fast and Slow 25: Bernoulli’s errors

Following the schedule of our Reading Calendar.

We are entering a new section of the book, Part IV. What is explained in this section made Kahneman and his colleague and friend Tversky famous, and gave Kahneman his Nobel Prize in Economics: challenging the assumptions at the core of economic theory about how we make decisions.

Kahneman explains the main ideas behind Morgenstern and von Neumann's model of how we make decisions (game theory), the experiments Kahneman and Tversky developed to check whether this theory applied to real people, and how this led to their prospect theory of how real humans make decisions. Prospect theory is described in more detail in the next chapter.


Morgenstern (left) and von Neumann (right), the two guys behind game theory and “rational” economics.

The main case discussed here is the “bias” of preferring the sure thing (like $80) over a gamble with an 80% chance of winning $100 and a 20% chance of winning only $10, which makes an expected value of $82.
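Bernoulli's answer to this kind of puzzle, which Kahneman recounts, was that people evaluate outcomes through a concave utility function (Bernoulli himself proposed the logarithm) rather than by raw dollar amounts. A minimal sketch of the arithmetic with the chapter's numbers:

```python
import math

# The gamble from the chapter: 80% chance of $100, 20% chance of $10.
outcomes = [(0.8, 100), (0.2, 10)]
expected_value = sum(p * x for p, x in outcomes)  # 0.8*100 + 0.2*10 = 82

# Bernoulli's move: value outcomes by a concave (logarithmic) utility,
# not by dollars. Under log utility the sure $80 beats the gamble,
# even though the gamble's expected dollar value is higher.
def log_utility(wealth):
    return math.log(wealth)

eu_gamble = sum(p * log_utility(x) for p, x in outcomes)
u_sure = log_utility(80)

print(expected_value)      # 82.0
print(u_sure > eu_gamble)  # True: log utility prefers the sure thing
```

This is why a preference for the sure $80 is “irrational” only if you measure in dollars; measured in Bernoulli's utility, it is exactly what the model predicts.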

We are also presented with a formulation of the loss aversion bias, which implies that losing something weighs more in a decision than winning the equivalent. The pain of losing $100 is not compensated by the happiness of winning $100.
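Kahneman and Tversky later quantified this asymmetry in the prospect-theory value function. The parameters below (exponent 0.88, loss-aversion coefficient λ ≈ 2.25) come from their later published estimates, not from this chapter, so treat the sketch as an illustration:

```python
# Sketch of the prospect-theory value function with Tversky and
# Kahneman's later parameter estimates (alpha = 0.88, lambda = 2.25).
ALPHA = 0.88   # diminishing sensitivity to both gains and losses
LAMBDA = 2.25  # loss-aversion coefficient: losses loom larger

def value(x):
    """Subjective value of a gain (x >= 0) or loss (x < 0) in dollars."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

gain = value(100)    # pleasure of winning $100
loss = value(-100)   # pain of losing $100
print(-loss / gain)  # 2.25: the loss hurts more than twice as much
```

With these numbers, a 50/50 bet to win or lose $100 has negative subjective value, which is exactly why most people refuse it.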

The first time I read this book I wondered how this is possible. And I am wondering it again: if humans do not behave at all like the rational decision makers that Bernoulli, Morgenstern and von Neumann imagined, if traditional economics is wrong at its very core, why are we still using it to build economic, political and sociological models of how markets are going to develop and what people are going to do?

An aside comment: consider John von Neumann. He was one of the main minds helping to create:

One of the first computers

The mathematical theory behind such computers (von Neumann Architecture)

Cybernetics

The atom bombs

The theory of Mutually Assured Destruction (MAD), which defined the Cold War

The formalism of quantum mechanics

Game theory and its applications to economics

And yet he is quite unknown. Very few people have heard of him. Maybe he embodies the 20th century so closely that nobody feels really comfortable with such a character.


Thinking, Fast and Slow. 24. The Engine of Capitalism

Following the schedule of our Reading Calendar

Here we are presented with unreasonable optimism. All of us, to a certain degree, overestimate our good traits and our capability to achieve success in any project we may start. Some people, however, are extreme optimists, to a point close to detachment from reality. Optimism may come as a result of a particular emotional predisposition in some individuals, but it also comes as the result of several of the cognitive biases presented in previous chapters. The beautiful chapter 23 was on the planning fallacy, which is nothing more than a particular kind of unreasonable optimism. Together with the individual factors underlying our tendency toward this optimism, there is a social pressure that seems to demand from us a continuous state of self-confidence and that forbids us to show any doubt about the solidity of our knowledge and beliefs.

Now, the interesting part comes with the conjecture that, unlike everything we have seen previously, where biases are essentially presented as damaging characteristics, here there seems to be an upside to it all. Maybe not at the individual level, but at the collective one. Overconfidence promotes risk taking, and societies need risk-taking individuals to try new ventures in economic activity, science, technology and so on. Probably, most entrepreneurs would never set up a firm if they really understood the odds of success and failure. That's why this chapter is entitled “The Engine of Capitalism”.

All that's OK, but if there is something that makes this chapter worth reading, it is the “premortem” procedure. In fact, the whole book may be worth reading just to get that piece of advice.

Thinking Fast and Slow Chapter 23: The Outside View

Following the schedule of our reading calendar

This is another Talebian chapter, touching a subject that Taleb also develops in Antifragile: the planning fallacy, and how difficult it is to calculate how long a complex project is going to take.

The main point here is to observe how different planning looks depending on whether you take the inside view (that of the people working on the project) or an outside view that brings in statistical information on how long, on average, a certain type of project takes. From the inside, the temptation to think “this is different” is so great that one focuses only on the specific, unique case and does not consider the statistics of the class to which the case belongs.

Once again, the examples, in this case of planning fallacies, are depressing, like the new Scottish Parliament, estimated at £40 million and ending up costing £431 million. But the cure for this fallacy, as Kahneman explains, is simple: just get some statistics on how long similar projects took and how much they cost before you make your estimates.
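Kahneman's cure is essentially what is now called reference-class forecasting: correct your inside-view estimate by the typical overrun of similar past projects. A toy sketch with made-up figures (the numbers below are hypothetical, not from the book):

```python
# Reference-class forecasting sketch: scale a new inside-view estimate
# by the average cost overrun of similar past projects.
# (estimated_cost, actual_cost) pairs -- hypothetical figures.
past_projects = [
    (10, 18),
    (25, 60),
    (40, 95),
]

overruns = [actual / estimate for estimate, actual in past_projects]
avg_overrun = sum(overruns) / len(overruns)  # how wrong estimates usually are

my_inside_view_estimate = 50
outside_view_estimate = my_inside_view_estimate * avg_overrun
print(round(outside_view_estimate, 1))
```

The point is not the precision of the correction factor but the habit of looking up the base rate for the class before trusting the case.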

We can all draw relevant lessons for our daily lives if we consider the outside view. A couple of years ago or so I realized that I sucked at planning how much time any project was going to take, and then I realized that I was taking the inside view: my mind was set on the idea that “next month is going to be different”. But then I realized it was more proper to take an outside view and make a simple calculation. If, in the last ten months, I have been so busy with other stuff that I couldn't finish so and so, why do I think the next month is going to be different? And now I suck a little less at planning.

Thank you Daniel, thank you Nassim


Thinking, Fast and Slow. 22. Expert Intuition: When can we Trust it?

Following the schedule of our Reading Calendar

This chapter, although extended into a biographical narration of an academic discussion, explains a single idea: expertise and expert intuitions are possible, but only in environments with enough regularity to make predictions possible, and where the subjects are exposed to those regularities long enough to learn them (consciously or unconsciously). Firefighters, nurses and chess masters fall into this category. In domains with less validity (stock picking, political analysis, etc.) expertise and predictions are not possible.

I have a much simpler heuristic: real experts do not use the word “expert” to refer to themselves. They are firefighters, nurses, plumbers, translators, programmers… The charlatans use the word: finance experts, international politics experts, management experts. So, when you hear the word “expert”, run as fast as you can.