Week by week, we discuss a chapter of Nassim Taleb’s Antifragile. According to our reading calendar, it’s time for chapter 14.
In this chapter, Taleb continues elaborating on probably his most philosophical (in the good sense of the word) and provocative thesis: the superiority of tinkering and doing over theoretical knowledge.
The chapter is built basically around what Taleb calls the "Green Lumber Fallacy", inspired by an anecdote described in the book What I Learned Losing a Million Dollars. It is about a trader called Joe Siegel who was an expert in the commodity called "green lumber" and, despite being very successful selling it, got the concept all wrong, thinking it was lumber painted green (instead of what it really is: lumber that is not yet dry).
The Green Lumber Fallacy is the belief that in order to be successful in a subject, you have to have lots of theoretical knowledge about it. To illustrate this, Taleb tells a personal story from when he started working in foreign exchange. To his surprise, what he found there were street people who had started out doing transfers in the back offices of banks, with no geopolitical, sociological, or economic knowledge. In fact, the best Swiss franc trader in the world wasn't able to place Switzerland on a map!
At the beginning of the chapter, Taleb also discusses a very annoying fallacy we hear every day: that education is key to the economic development of a country. Taleb debunks this thesis with clear arguments and references, and argues that education should be pursued for its own sake, to reduce inequality and increase freedom, but not in pursuit of economic gain. It won't work.
As a matter of fact, theoretical knowledge can be counterproductive:
The “right thing” here is typically an antifragile payoff. And my argument is that you don’t go to school to learn optionality, but the reverse: to become blind to it.
The most relevant idea in this chapter, to me, is his discussion of narrative. Doers are very bad at explaining what they do and why, but they get things done. Academics are very good at presenting narratives, but those stories are not applicable to the real world. Taleb is not against narratives: they are common among those ancient sages he loves so much, but those sages knew that narratives don't have to be true to be effective; they just need to lead you to the proper heuristics to get things done.
As in the previous chapter, Taleb is speaking in general terms. Certainly, theoretical knowledge has some value, but we have put so much faith in it that we need to reverse course and start accepting that we can't really predict what is going to happen based on what happened in the past, and that we don't need to know everything about a subject to find optionality.
To summarize, let me use Taleb's own words:
We separated knowledge into two categories, the formal and the Fat Tony-ish, heavily grounded in the antifragility of trial and error and risk taking with less downside, barbell-style, a de-intellectualized form of risk taking (or, rather, intellectual in its own way). In an opaque world, that is the only way to go.