
Fooled by Randomness


Ramblings about randomness and how our minds aren’t properly equipped to deal with it. Most people don’t understand the nature of randomness very well. But even those who do can easily be fooled by it, because our minds are flawed and emotional and haven’t evolved to live in highly random, high-information environments like ours today.

Context: Nassim Taleb is a derivatives trader. He specialized in making money on rare but extreme events (“black swan” events), believing that most traders don’t understand their nature well enough to appreciate the value of those positions.

When I say “ramblings”, I mean that the book, though full of wisdom, lacks an obvious structure. This made it difficult to compile these notes.

See also: Naked Statistics (Amazon, Audible)


Sampling rate. In a highly random environment, the higher the sampling rate, the smaller the share of actual information in what you see. If you sample, say, a stock price every second, almost all of the data is pure noise. If you sample every six months, the noise largely averages out, and most of what you see is actual information. Time eliminates randomness. This is counter-intuitive, as technology drives us to collect ever more data with ever more precision. But our emotional brains have a hard time interpreting such data correctly. If you’re investing passively, don’t check your portfolio too often — it will just subject your emotions to a stream of mostly noise.
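
A minimal sketch of the effect, assuming a hypothetical asset with a 15% annual expected return and 10% annual volatility (illustrative numbers, not a claim about any real asset): over an interval t, the expected move scales with t while the noise scales with √t, so the shorter the interval, the closer each observation is to a coin flip.

```python
# Probability that a single observed return is positive, per sampling
# interval, for an assumed asset: 15% annual drift, 10% annual volatility.
from statistics import NormalDist

mu, sigma = 0.15, 0.10  # assumed annual drift and volatility

intervals = [("1 second", 1 / (252 * 8 * 3600)), ("1 day", 1 / 252),
             ("1 month", 1 / 12), ("6 months", 1 / 2), ("1 year", 1.0)]

for label, t in intervals:
    # Drift grows with t, noise with sqrt(t): signal wins only over time.
    p_gain = NormalDist().cdf(mu * t / (sigma * t ** 0.5))
    print(f"{label:>9}: P(observation shows a gain) = {p_gain:.3f}")
```

With these assumptions, a one-second sample shows a gain with probability ≈0.500 (a pure coin flip), while a one-year sample shows one with probability ≈0.93.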

Asymmetric distribution. The bell curve is a nice, idealized model, but in reality many events are distributed asymmetrically. In such an environment, the probability of an event alone is meaningless; it has to be weighed by the event’s magnitude (the expectation). Consider: a 70% probability of a price rising a little bit vs. a 30% probability of it falling by a lot. A rise is the most probable outcome, but the expectation is a fall.
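
A toy expected-value calculation with made-up numbers (a 1% rise vs. a 10% fall, in the same 70/30 proportions):

```python
# Toy numbers for illustration: the most probable outcome is a small
# gain, yet the expectation is a loss.
p_up, gain = 0.70, 0.01     # 70% chance the price rises 1%
p_down, loss = 0.30, -0.10  # 30% chance it falls 10%

expectation = p_up * gain + p_down * loss
print(f"Most probable outcome: {gain:+.0%}")         # +1%
print(f"Expected outcome:      {expectation:+.1%}")  # -2.3%
```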

In the short term, the structure of randomness favors those who are unfit to survive in the long term. Wall Street judges traders based on profit, not tenure. This is foolish. You can make a lot of money by following a strategy that works most of the time, but that wipes out more than you’ve ever made when one of the rare events that happen every 5 or 10 years finally hits.
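
A rough Monte Carlo sketch of such a career, with assumed numbers ($1M profit in a typical year, and a 10%-per-year rare event costing $15M at once):

```python
# Assumed numbers for illustration, not from the book: a strategy that
# works most years but occasionally blows up.
import random

random.seed(1)

def career(years=5, p_blowup=0.10):
    wealth = 0.0
    for _ in range(years):
        if random.random() < p_blowup:
            wealth -= 15.0  # the rare event wipes out past profits
        else:
            wealth += 1.0   # a normal, profitable year
    return wealth

runs = sorted(career() for _ in range(10_000))
print(f"Median 5-year P&L:  {runs[5_000]:+.1f}M")            # typically +5.0M
print(f"Average 5-year P&L: {sum(runs) / len(runs):+.2f}M")  # around -3M
```

Under these assumptions, the median trader looks brilliant after 5 years, while the expectation is firmly negative. Judging by profit alone rewards exactly the wrong people.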

What’s at play here is, of course, survivorship bias. You only see the irresponsible traders who have not yet blown up. You see only the profits, and you invent characteristics that are evidence of their greatness (confirmation bias). Once they do blow up, you judge them as having stopped doing what brought them success before.

Accurately judging probabilities in the real world is impossible. We often just don’t know what the distribution of events is (for example, we think something is impossible because it has never happened before). And often, the distribution keeps changing out from under us (just because something has always been true doesn’t mean it will always be true).

On the limits of empiricism and inductive reasoning. Karl Popper. You can inspect 5000 sheep and see that they are all white, but you can never conclude with 100% certainty that all sheep are white. (Unless you know for a fact that you’ve inspected all sheep that exist.) Yet it takes just one black sheep to falsify the statement that all sheep are white. So, in a way, you can only falsify statements empirically; you can never really prove them. We shouldn’t take science too seriously because of this. There are too many opportunities for limited information to fool us. Social sciences are particularly prone to this. (On the other hand, the mark of good science is its falsifiability. If a theory cannot be falsified, it’s not really scientific.)

Normative sciences are kinda bullshit. They don’t study reality as it is, but as it “should” be. For example, rational economics: “studying” the economy under the assumption that humans are rational and will follow the mathematical models that are in their best interest to follow. But we’re not rational.

Beware literary intellectuals. A very different breed from scientific intellectuals. The former can say things that sound smart to a lay person; the latter actually care about the process of finding the truth through evidence.

Sample size matters. If someone does something extraordinary, but they’re one of thousands who tried and didn’t, perhaps it’s not so extraordinary. But if it’s one in ten, pay attention. (The smaller the sample size, the greater the probability that an outlier is not random.)

Example of this in trading: if you start with a sample of 1000 traders, all of whom are incompetent (their profit expectation is negative), you’d still expect to find a few traders who will make money 5 years in a row entirely due to randomness. (Sample size + survivorship bias)
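
A quick simulation of this; the 45% yearly win probability is an assumption, and the only thing that matters is that the edge is negative:

```python
# Every trader here is incompetent by construction: each year they have
# only a 45% chance of being profitable. Some still win 5 years straight.
import random

random.seed(42)

def lucky_survivors(n_traders=1_000, n_years=5, p_win=0.45):
    return sum(
        all(random.random() < p_win for _ in range(n_years))
        for _ in range(n_traders)
    )

print(lucky_survivors())  # typically ~18, since 1000 * 0.45**5 ≈ 18.5
```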

Idea for a con: send 10000 letters to people, predicting a market movement. Tell half the people it will go up, the other half it will go down. Repeat next month, sending 5000 letters to those people whom you told the right movement. A bit over 300 people will get letters correctly predicting market movements 5 months in a row. Having thus “proven” your market brilliance, ask them to give you their money to invest. “This can’t be a coincidence”, one can be fooled into thinking, not knowing the sample size.
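
The arithmetic of the con, sketched out:

```python
# Halve the mailing list each month, keeping only the recipients whose
# letter happened to predict the market correctly.
recipients = 10_000
for month in range(1, 6):
    recipients //= 2  # half the predictions were right purely by chance
    print(f"Month {month}: {recipients} people have seen only correct calls")
# Month 5: 312 people have received 5 correct predictions in a row.
```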

Conditional probabilities. Say life expectancy is 80 years. You’re 65. Does that mean you should expect to live 15 more years? No. 80 years is the expected lifespan at the moment you’re born. The conditional life expectancy for someone who is already 65 is more than 80 years in total. To show the absurdity: if you were 90, your life expectancy wouldn’t be negative 10 years; it would still be a positive number.
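
A toy illustration with a made-up discrete lifespan distribution (the numbers are assumptions, not actuarial data):

```python
# Made-up distribution of age at death; expectation at birth is 80 years.
ages  = [60, 70, 80, 90, 100]
probs = [0.10, 0.20, 0.35, 0.30, 0.05]  # P(dying at that age); sums to 1

at_birth = sum(a * p for a, p in zip(ages, probs))
print(f"Life expectancy at birth: {at_birth:.0f}")  # 80

# Conditional on reaching 65: rule out dying at 60, renormalize the rest.
alive = [(a, p) for a, p in zip(ages, probs) if a > 65]
p_survive = sum(p for _, p in alive)
given_65 = sum(a * p for a, p in alive) / p_survive
print(f"Expected lifespan given you're 65: {given_65:.1f}")  # ~82.2
```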

Small world effect. Surprising only because we’re fooled by randomness.
Birthday paradox. Not a real paradox, just counter-intuitive; see the sketch below.
Data mining. With enough data, you’ll find spurious correlations.
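
For the birthday paradox specifically, a minimal exact computation (assuming 365 equally likely birthdays):

```python
# Probability that, in a group of n people, at least two share a birthday.
def p_shared_birthday(n: int) -> float:
    p_all_distinct = 1.0
    for i in range(n):
        p_all_distinct *= (365 - i) / 365  # i-th person avoids all previous
    return 1.0 - p_all_distinct

print(f"{p_shared_birthday(23):.3f}")  # ~0.507: over 50% with just 23 people
```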

Emotions. You become very successful. You’re in the top 1%. But then you move into a rich neighborhood, and you’re the least wealthy of the families living there. Nothing has changed, but suddenly you feel like shit. Contrasting yourself with them, you feel like a failure. Accept your irrationality as a human being: you don’t get to feel good in such an environment through rationality alone. Just don’t move into the super-rich neighborhood in the first place. (You can think of plenty of more mundane examples of this.)

Perceiving absolutes vs change. Our emotions are geared towards loss aversion, without regard to how well off we are in absolute terms. For example, if you made $500k two years in a row, you’d feel very good about it. But if you made $2M one year and then lost $1M the next, you’d likely feel devastated, a failure for having lost half your money. Even though you end up with the same $1M in both scenarios!

Mild unpredictability can be a good thing when emotions are involved. Example: a currency is pegged to another currency. In a brief moment of weakness, the central bank fails to maintain the peg. This signals that devaluation is coming, everybody acts on that belief, and the prophecy fulfills itself. But if there were no promise of an exact peg, only that a certain range would be maintained, a slight deviation would not cause panic. Another example: if you always act predictably, say, always taking the same amount of abuse before snapping, some people will learn to take advantage of you.

Why read news? Journalists aren’t historians; most of what they serve is intellectual junk food. You’re feeding yourself a lot of noise, most of which won’t matter in a week. When you listen to expert commentary on, say, stocks, most of the time you’re listening to someone inventing post-hoc explanations for randomness. You’re letting your emotions run wild because of noisy information. What’s the point? The things that really matter will get to you anyway.

Endowment effect. We ascribe more value to things we already own than to the same things we don’t yet own. Path dependence of beliefs is one form of this: we become married to the beliefs and opinions we already hold. Once we’ve invested in a belief, we’re stubborn about changing it, even when it’s in our best interest to do so.

We’re flawed. Not just imperfect, but fundamentally flawed. We’re only human: irrational, emotional animals. We need tricks, mind hacks, not moralizing lectures. Our rational minds can do only so much to influence our emotions directly. It’s far better to understand our limitations, and rationally create an environment in which our emotions are less likely to fool us into making stupid decisions.
