Joan Westenberg

August 5, 2025

You Don't Need More Input. You Need Fewer Assumptions.

People ask me which podcasts I listen to, which books I read, which thinkers I follow. Sometimes they ask because they want recommendations, but often it's something else: a belief that consuming the right combination of content will produce the right worldview. If they listen to enough Lex Fridman and read enough Taleb and browse enough LessWrong, maybe they'll arrive at the correct opinions, like a chemical reaction reaching equilibrium.

But ideas don't work that way. 

What you read matters far less than what you bring to the reading. A good input filtered through a bad frame gives you garbage. And most people’s frames are packed with silent, unexamined assumptions.

Francis Bacon called these "idols": illusions of the tribe, the cave, the marketplace, and the theater. Centuries later, Kahneman and Tversky rebranded them as cognitive biases. But the older framing might be more useful. An idol is something sacred, something you protect. Assumptions are often too comforting to discard. That’s what makes them dangerous. They're not unconscious because you're unaware of them. They're unconscious because you've trained yourself not to look.

People yell about climate models and culture wars without realizing that what they really disagree on is baseline trust in institutions. Or baseline trust in statistical models. Or the value of precaution over risk tolerance. No dataset will resolve those disagreements, because the conflict is happening upstream of the data. It's happening inside the frame.

You see the same thing in productivity culture. People try new apps, methods, and systems every month, convinced that the right calendar tool will finally fix their life. But the tools don't matter. What's upstream is usually an assumption about how much control one ought to have over time in the first place. Or a belief that output is proof of worth. Or a fear of stillness. You don't need more Getting Things Done inputs. You need to question what you're trying to manage.

The temptation is to assume that more information will clarify things. Sometimes it does. But more often, it reinforces the wrong lens. Reading a hundred articles won't help if you're interpreting all of them through the same flawed model. Like Ahab chasing the whale, you'll just end up more certain, more doomed. Sometimes the only cure for obsessive interpretation is subtraction. Less data, fewer takes, fewer defaults. A beginner's mind, if you like Zen. Or in Augustine's terms, a heart unencumbered.

You don't need another input. You need to dethrone the idols. Or at least, see where they sit.

About Joan Westenberg

I write about tech + humans.