A great many smart people permanently hamper their ability to better understand the world by refusing to accept defeat when reality proves them wrong. Armed with an intellect that's at once too proud to recognize its own failings and cunningly capable of producing sophisticated excuses, they're adept at spotting this failing in others but not in themselves.
You see this all the time in the realm of public policy. Someone presents a plausible thesis on how to deal with a given problem. They drum up support for an attempt that follows their ideas, but when the attempt is defeated by reality, they can't retreat, and become stuck trying to defend the path in ever-more creative yet ludicrous ways.
Watch the cadre of "harm reduction" activists attempt to defend the increasingly decrepit state of San Francisco, for example. I don't actually have a problem with the genesis of the original thesis. That maybe you could indeed help people off the streets and off the drugs by refusing to forcefully intervene and by only offering help where it was wanted. Maybe the problem was indeed just that there wasn't enough money going into these programs.
Except, no. As Michael Shellenberger originally documented so well in San Fransicko, that thesis has just not panned out. Reality has revealed something very different, and anyone who's paid even casual attention to the state of that once proud city can attest to the consequences. It just didn't work!
This is where the fork in the road is usually met. Lots of average people without PhDs in a social science subgenre are able to believe their own eyes when reality puts on such a vivid show. Meanwhile, plenty of very smart people, as defined by academic credentials or political prowess, can't seem to do the same. Believing your own eyes is a skill that actually appears less reliable the higher up the intellectual tree of knowledge you go.
Isn't that curious!
But we don't even have to swim into the hot waters of politics and social policy to see the effects of this syndrome. It's all around us in technology and in business too. People falling in love with their favorite thesis or into hatred of their most despised character. Then blocking out any ability to correct course in a timely manner when reality reveals the truth.
We're all liable to this. The smarter we are, the more creative we can get at coming up with those self-deceivingly compelling rationalizations for why, actually, in this one case, the world is not what it appears.
The best way I've found to break out of this loop is to look at the longer game. There is never one single policy, one single business experiment, or one single technical argument so worth saving that you should risk not learning from reality. Rolling those learnings into the next thousand decisions and analyses you have to make before you get to the end is worth far more.
Going the distance means eating your intellectual losses. Accepting that reality is the referee. And the prize is that you get to keep playing, and keep getting better.