Let’s play a game.
See this coin? This coin has an equal chance of landing heads or tails. We start flipping the coin and recording the results. After 100 flips, we have 99 heads.
What is the probability that the next flip will be heads?
Doctor Joe, proudly wearing his shirt and tie, says “50 percent! Obviously. Each flip is an independent event, and each flip has a 50% probability of heads or tails.”
Doctor Joe took some statistics during his undergraduate studies.
His logic makes complete sense. But you have a feeling that the next flip is going to be heads, and that feeling is close to 100%.
We flip the coin. No surprise. Heads.
Real Life and Games
You were almost 100% confident, and you were right; Doctor Joe gave it a coin toss. Over the long term this edge adds up, while Doctor Joe blows up.
The high-IQ individual, the “nerd”, or the sucker says 50%, reasoning from the preset ground rules and holding those ground rules to be completely certain. In a game, you know all the rules, and the boundaries are completely certain. When solving problems on paper and in exams, you are given complete information and a predetermined set of tools. Academia ingrains this practice.
In real life, certainty is an illusion. The real world is opaque, full of hidden variables, costs, and risks. Probabilities exist within the confines of theory and games, and they instill naive confidence in those who try to apply them directly to real life. No economist’s lognormal model predicted that oil prices could fall below zero, as they did on April 20, 2020; a lognormal distribution cannot even produce a negative value. As much as we crave certainty, the real world’s hidden variables cannot be accounted for on paper.
The coin was likely loaded. You had that feeling and didn’t ignore it. It was outside the bounds of the game. It was real life.
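The intuition that the coin was loaded can be made concrete with a quick Bayesian sketch. The numbers below are illustrative assumptions, not part of the original game: a one-in-a-million prior that the coin is loaded, and a loaded coin that lands heads 99% of the time.

```python
from math import comb

def likelihood(k, n, p):
    """Binomial likelihood of observing k heads in n flips
    when each flip lands heads with probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Observed: 99 heads in 100 flips.
k, n = 99, 100

# Hypothetical prior: we trust the "fair coin" claim almost completely,
# allowing only a one-in-a-million chance that the coin is loaded.
prior_fair, prior_loaded = 1 - 1e-6, 1e-6
p_loaded = 0.99  # assumed heads bias of a loaded coin (illustrative)

# Bayes' rule: posterior weight = prior * likelihood, then normalize.
post_fair = prior_fair * likelihood(k, n, 0.5)
post_loaded = prior_loaded * likelihood(k, n, p_loaded)
p_coin_loaded = post_loaded / (post_fair + post_loaded)

# Probability the NEXT flip is heads, averaged over both hypotheses.
p_next_heads = p_coin_loaded * p_loaded + (1 - p_coin_loaded) * 0.5

print(f"P(coin is loaded | data) = {p_coin_loaded:.6f}")
print(f"P(next flip is heads)    = {p_next_heads:.4f}")
```

Even a one-in-a-million prior is overwhelmed by the evidence: 99 heads under a fair coin has probability on the order of 10⁻²⁸, so the posterior lands almost entirely on “loaded”, and the next flip is heads with probability near 0.99, not 0.5.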
Doctor Joe and those who think like him – these are the suckers. They force their rules, models, and “certainties” onto the real world, and claim ignorance when counter-evidence arrives. What is more wrong – their models or reality?
This is what Nassim Taleb calls the Ludic Fallacy: mistaking the known, stated rules of a game for the unknown, unstated rules of the real world.¹
Decision-making and the Ludic Fallacy
What does this have to do with decision-making?
When your data and actions do not align with real-world results, like your P&L, something is likely amiss: an unknown variable, risk, or distraction. You have uncertainty, and you always will. Under uncertainty, assume there is more risk than your data and models show. This protects your decision-making against unknown risk. If you can survive when you meet the unknowns, and you WILL meet unknowns, you’ll be the one left standing to capture opportunities.
Probabilities and ideas that hold within the confines of a simple system keep obeying its preset rules. Once applied to the complexities of the real world, those probabilities shift as unknown variables and their interactions come into play.
Surviving, then thriving
To ensure survival, simplify the complex.
For you and your workspace, prime the environment and take actions that reduce complexity and therefore decrease exposure to the downsides of randomness. A reduction in complexity is a reduction in uncertainty. Reducing the number of variables decreases the number of interactions and their unknown results, thereby domesticating uncertainty. This leads to an overall decrease in risk exposure.
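The claim that removing variables tames uncertainty can be made concrete: the number of possible pairwise interactions grows roughly with the square of the variable count, so cutting variables pays off disproportionately. A minimal sketch:

```python
from math import comb

# Each pair of variables is a potential interaction with an unknown result.
for n in [5, 10, 20, 40]:
    pairs = comb(n, 2)  # n-choose-2 possible pairwise interactions
    print(f"{n:>3} variables -> {pairs:>4} pairwise interactions")
```

Halving a system from 40 variables to 20 shrinks the pairwise interactions from 780 to 190, roughly a fourfold reduction in places where the unknown can hide; and that ignores higher-order interactions, which fall off even faster.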
The less time you spend exposed to risk, the more time you spend exposed to benefits and opportunities.
When developing or creating a process, start simple. Build bottom-up using trial and error. Small failure after small failure, each iteration makes your process more robust. You throw out the bad, keep the good, and build only from what works. Contrast this with a top-down approach: building and implementing a complex system instead of starting simple. When you start with a pre-built system or solution, you accept a lack of understanding of where it fails, thus assuming hidden risk.
You may start out ahead and confident using a top-down approach, but the unseen risk may break your system. This is the real world after all.
Working bottom up, you are exposing yourself to the upsides of randomness as well. When you encounter a favorable result, you have the option of moving forward with it.
Most discoveries are not made teleologically, but by mistake. Take two of humankind’s greatest medical advancements, penicillin and Viagra. Penicillin was discovered because of a contamination in a bacteria culture. Viagra was initially intended to treat chest pain; instead, it became the cure for what most call “limp dick”, “whisky dick”, and other confidence-related issues. Both are unexpected discoveries and opportunities made through building bottom-up. No one set out to create the world’s first antibiotic or boner pill. They were solutions found with time, and you only get time for discovery if your system ensures you survive.
Subtraction is greater than addition
We have to undo a lot of the complexity created by academics and their top-down theory-based approaches. We’ve been conditioned to always add, creating more complex interactions, processes, and theories. These complexities expose us to more uncertainty and hidden risk. To shelter ourselves from hidden risk, subtract. Remove complexities, expose yourself to the small risks, and build from there.
Top-down vendors sell you the “right” answer. Having the right answer is fragile: one error or counter-example and the “right” answer falls apart. Knowing what’s wrong, on the other hand, keeps you moving forward, protecting you from the downside while you explore the upside. Don’t be fooled by what you’ve learned in academia and games, and don’t succumb to the guise of certainty. The real world always tells the truth.
Thanks for reading!
- [Affiliate Link] The Black Swan
Before you go, join my mailing list so we can stay connected: