Tuesday, June 5, 2018

Filters, Beliefs, Survival and Crotchets

One of the most interesting things for me in Nassim Taleb's book Skin in the Game was his discussion of the importance of filters and rules that lead to survival. Much of his discussion centered on religion, and how "beliefs" that may seem irrational to those who take them literally are actually rational because they aid in survival, and survival should be the true test:
So when we look at religion, and, to some extent, ancestral superstitions, we should consider what purpose they serve, rather than focusing on the notion of “belief,” epistemic belief in its strict scientific definition. In science, belief is literal belief; it is right or wrong, never metaphorical. In real life, belief is an instrument to do things, not the end product. This is similar to vision: the purpose of your eyes is to orient you in the best possible way, and get you out of trouble when needed, or help you find prey at a distance. Your eyes are not sensors designed to capture the electromagnetic spectrum. Their job description is not to produce the most accurate scientific representation of reality; rather the most useful one for survival. 
...Survival comes first, truth, understanding, and science later. 
In other words, you do not need science to survive (we’ve survived for several hundred million years or more, depending on how you define the “we”), but you must survive to do science. As your grandmother would have said, better safe than sorry. Or as per the expression attributed to Hobbes: Primum vivere, deinde philosophari (First, live; then philosophize). This logical precedence is well understood by traders and people in the real world, as per the Warren Buffett truism “to make money you must first survive”— skin in the game again; those of us who take risks have their priorities firmer than vague textbook pseudo-rationalism.
And then Taleb gets back to Buffett a little later in the book:
Let us return to Warren Buffett. He did not make his billions by cost-benefit analysis; rather, he did so simply by establishing a high filter, then picking opportunities that pass such a threshold. “The difference between successful people and really successful people is that really successful people say no to almost everything,” he said. Likewise our wiring might be adapted to “say no” to tail risk. For there are a zillion ways to make money without taking tail risk.
Those excerpts reminded me of the comments Warren Buffett and Charlie Munger have made over the years relating to filters, which I've mentioned several times on this blog. My favorite example from Buffett is probably this comment he made in 2015:
At Berkshire we have certain filters that have been developed. If in the course of a presentation or evaluation part of a proposal an idea hits a filter, then there is no way I will invest. Charlie has similar filters. We don’t worry about a lot of things as we only have to be right about a certain number of things – things that are within our circle of competence.
And Alice Schroeder, Buffett's biographer, has described his filtering process in a similar way:
Typically, and this is not well understood, his way of thinking is that there are disqualifying features to an investment. So he rifles through and as soon as you hit one of those it’s done. Doesn’t like the CEO, forget it. Too much tail risk, forget it. Low-margin business, forget it. Many people would try to see whether a balance of other factors made up for these things. He doesn’t analyze from A to Z; it’s a time-waster.
I'm also grateful to Taleb for his discussion of this topic because it has made me think more clearly about a comment Charlie Munger made at the 2014 Daily Journal Annual Meeting:
There's no rule I can't have crotchets [crotchet: a perverse or unfounded belief or notion]. I don't have to be totally rational. Don't we all do that? We probably should, as a matter of fact. Certainly a crotchet that says this is too hard for me, I'm not going to try to understand it. That's a very useful crotchet.
And while Munger was using the word 'rational' as it might be defined by the economics profession, his crotchets would fit Taleb's definition of rational:
Rationality does not depend on explicit verbalistic explanatory factors; it is only what aids survival, what avoids ruin. 
Why? Clearly as we saw in the Lindy discussion: 
Not everything that happens happens for a reason, but everything that survives survives for a reason. 
Rationality is risk management, period.