Following the schedule of our Reading Calendar
In this chapter we are presented with a different type of bias: how we humans make different decisions depending on whether or not we can compare two outcomes. When we are given two situations to evaluate, the way we decide is highly influenced by the way the information is presented. In one of the first experiments described, people choose a safer bet over a riskier one when the two are presented together, even though the riskier bet can pay more money. However, if the bets are presented separately and people have to state the minimum amount of money for which they would sell each one, they tend to put a much higher price on the riskier bet.
Of course, such results are devastating for the von Neumann/Morgenstern model, which supposes that the choices humans make are “rational” and consistent. But the systematic inconsistency in the way we make decisions should worry us too, as shown in several experiments related to trials and juries.
And we can forget very basic facts. In another experiment, when people are asked, separately, to donate either to save dolphins from pollution or to save farmworkers from skin cancer, they tend to give more money to the dolphin cause. The emotional impact of an endangered dolphin is greater than that of a farmworker getting skin cancer; after all, they are supposed to work under the sun… However, when the two causes are presented together, people give more money to the farmworkers. After all, they are human, and dolphins are not.
The main reason for this discrepancy is that when we have to compare several options, System 2 is more likely to activate. So this chapter gives us an interesting heuristic: when you suspect you are making a rushed System 1 evaluation, consider other options and force System 2 to show up. I think I’ll start using it right now.