Friday morning, March 6th. A committee is deciding whether to fund a new initiative. The proposal is good—not perfect, but good. One by one, the members speak. Each adds a qualification: the cultural dynamics are complex, the incentive structures cut both ways, the evidence is mixed, analogous interventions have had uneven results elsewhere. All of this is true. They decide to commission further research. Six months later the opportunity is gone, a competitor has moved first, and the committee members feel, with complete sincerity, that they were being rigorous. They were not being rigorous. They were performing rigor as a way to avoid being wrong about anything.

Nuanced thinking is a genuine intellectual virtue. Nuance as identity is one of the most reliable routes to impotence available to otherwise capable people. The difference matters enormously, and the two are difficult to distinguish from the inside—which is precisely why the trap works.

The Status Function of Complexity

In most educated social environments, expressing nuanced views is a high-status move. Saying "it's complicated" signals that you've thought carefully, that you're not naive, that you've moved beyond simplistic narratives. Saying something clear and committing to it exposes you to being wrong. The asymmetry is sharp: nuance is low-risk and high-status; commitment is high-risk and neutral-status at best.

This incentive structure has predictable consequences. People learn that complexity acknowledgment is rewarded regardless of whether it leads anywhere. The move becomes an end in itself. You get the social reward of appearing thoughtful without the cost of being accountable for an actual view.

The tell is whether nuance resolves. Real nuance is temporary—you gather complexity, hold it, and eventually synthesize it into a clearer model. "It's complicated" is a waystation, not a destination. When complexity acknowledgment becomes a permanent resting state, something has gone wrong. The person has confused the appearance of sophisticated thinking with its function.

What Nuance Is Actually For

Philip Tetlock's superforecaster research is instructive here. The best predictors of real-world events are not people who refuse to commit to positions because things are too complex. They are people who hold complexity while still generating concrete, testable predictions. They track their accuracy. They update. They commit.
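
Accuracy tracking in that literature is typically scored with the Brier score: the mean squared difference between the probability you assigned and what actually happened. A minimal sketch, with a hypothetical track record:

```python
def brier_score(forecasts: list[tuple[float, bool]]) -> float:
    """Mean squared error between stated probabilities and binary outcomes.

    Lower is better. A perfect forecaster scores 0.0; always
    answering 50% scores exactly 0.25.
    """
    return sum((p - float(happened)) ** 2 for p, happened in forecasts) / len(forecasts)

# Hypothetical records: (probability assigned, whether it happened).
committed = [(0.8, True), (0.3, False), (0.9, True), (0.4, False)]
hedger = [(0.5, True), (0.5, False), (0.5, True), (0.5, False)]

print(brier_score(committed))  # 0.075
print(brier_score(hedger))     # 0.25 -- permanent hedging caps your score
```

The perpetual hedger is pinned at 0.25: never clearly wrong, but never better than a coin flip either. Commitment is what makes a good score possible at all.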

Tetlock borrows Isaiah Berlin's distinction between "foxes" and "hedgehogs": foxes know many things and update continuously; hedgehogs know one big thing and stick to it. Foxes outperform. But notice what foxes do: they use complexity as input to a process that still produces outputs. The fox who never makes a prediction is not a sophisticated fox. It is a fox that has learned to perform foxiness while avoiding the accountability that makes foxiness valuable.

The purpose of perceiving complexity is to make more accurate models. If your nuance isn't improving your predictions or sharpening your decisions, it isn't doing its job. It's decoration.

Gerd Gigerenzer's research on fast-and-frugal heuristics makes the point from another direction. In complex real-world domains—medical diagnosis, financial forecasting, ecological prediction—simple decision rules often outperform elaborate models that incorporate more variables. Not because complexity doesn't exist, but because complex models overfit: they incorporate so much nuance that they lose the ability to generalize. The expert who says "look at these three things and ignore everything else" is not less nuanced than the expert who says "it depends on forty-seven factors." In many domains, the former is doing better science.
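
The overfitting point can be seen in miniature. The sketch below (hypothetical data, not Gigerenzer's own models) fits a one-parameter line and a nine-degree polynomial to the same noisy linear signal; the flexible model typically wins on the data it was fit to and loses on the holdout:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: a noisy linear signal, split into train and holdout.
x_train = np.linspace(0.0, 1.0, 15)
y_train = 2.0 * x_train + rng.normal(0.0, 0.3, size=x_train.shape)
x_test = np.linspace(0.03, 0.97, 15)
y_test = 2.0 * x_test + rng.normal(0.0, 0.3, size=x_test.shape)

for degree in (1, 9):
    coeffs = np.polyfit(x_train, y_train, degree)  # fit on training data only
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: train MSE {train_mse:.3f}, holdout MSE {test_mse:.3f}")
```

The nine-degree fit has absorbed the noise; the extra nuance costs it the ability to generalize. That is the mechanism behind the three-factors expert outperforming the forty-seven-factors expert.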

The Paralysis Mechanism

The mechanism by which nuance becomes paralyzing is well understood. When you perceive multiple valid considerations pulling in different directions, decision-making requires weighting them, and weighting requires commitment to a hierarchy of values. If you're unwilling to commit to that hierarchy, you can't resolve the tension. You're stuck.
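
The weighting step can be made concrete. In the sketch below (all numbers hypothetical), each option leads on some criterion, so no ordering exists until a weight vector is chosen; the weights are the value hierarchy:

```python
# Two options scored on the same criteria (hypothetical scores in [0, 1]).
options = {
    "fund_now": {"upside": 0.8, "downside_risk": 0.6, "cost": 0.7},
    "wait":     {"upside": 0.3, "downside_risk": 0.2, "cost": 0.1},
}

# Until these weights are chosen, the options are incomparable: each
# leads on some criterion. Choosing the weights is the commitment.
weights = {"upside": 1.0, "downside_risk": -0.6, "cost": -0.4}

def score(option: dict[str, float]) -> float:
    """Weighted sum: committing to weights is what resolves the tension."""
    return sum(weights[k] * v for k, v in option.items())

best = max(options, key=lambda name: score(options[name]))
print(best, {name: round(score(o), 2) for name, o in options.items()})
```

Gathering more criteria never resolves the deadlock; only choosing the weights does.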

The sophisticated-sounding response to this is to keep gathering information, to consult more perspectives, to await further evidence. This response is sometimes correct: if the decision is irreversible and the information is genuinely forthcoming, waiting is right. But most decisions are not like this. Most decisions have a real cost to delay, the relevant information is already mostly available, and the "further research" being sought is the kind that would confirm a conclusion rather than inform it.

Barry Schwartz's research on choice overload documents the psychological version of this. Giving people more options, more information, more nuance about the tradeoffs between options tends to increase dissatisfaction with the choice they eventually make and, past a threshold, to reduce the likelihood of making any choice at all. The people who choose better are often the people who decide which considerations matter and apply them, not the people who continue multiplying the considerations indefinitely.

The irony is that people who perform nuance feel more rigorous than people who commit. They have considered more. They are less exposed. But in most real-world domains, the person who thought less and acted more will produce better outcomes—not because they're smarter, but because a good-enough decision taken quickly outperforms a better decision taken too late.

"It Depends" as Evasion

There's a specific linguistic form this takes in professional environments: "it depends."

"It depends" is sometimes the right answer. If someone asks whether you should use microservices or a monolith for a new project, the real answer does depend on team size, deployment constraints, scaling requirements. A doctor who says "it depends on the patient's specific profile" before prescribing treatment is being appropriately careful.

But "it depends" is also the most socially safe thing to say when asked a direct question. It signals expertise (you know the factors it depends on), defers commitment, and cannot be wrong. It is the intellectual equivalent of a politician saying "we need to have a national conversation."

When someone with genuine expertise is asked a question in their domain, they should usually be able to tell you what it depends on and then tell you what to do given the most common cases. "It depends on X; for most situations like yours, do Y" is a useful answer. "It depends" as a complete response is evasion wearing expertise as a costume.
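
The shape of the useful answer can be written down. Here is a toy sketch of "it depends on X; for most situations like yours, do Y," using the microservices question from above; the factors and thresholds are hypothetical placeholders, not real architecture guidance:

```python
def recommend_architecture(team_size: int, needs_independent_scaling: bool) -> str:
    """Name what the answer depends on, then commit to a default."""
    # The "it depends" part: the relevant factors, checked explicitly.
    if team_size > 50 or needs_independent_scaling:
        return "microservices"
    # The commitment: a stated default for the common case.
    return "monolith"

print(recommend_architecture(team_size=6, needs_independent_scaling=False))
```

The thresholds are beside the point. What matters is that the factors are named and a default is stated, so the recommendation can be wrong, and is therefore usable.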

The clinician who says "that depends on several factors" and then gives you nothing actionable has not demonstrated depth. They've demonstrated either that they don't actually know the answer or that they've learned to protect themselves from being wrong. Neither is helpful.

The Commitment Aversion at the Core

The nuance trap is ultimately a commitment aversion. Committing to a position means risking being wrong. Being wrong has costs: social embarrassment, damaged credibility, the requirement to update publicly. Nuance allows you to never be committed enough to be clearly wrong.

This is comfortable and expensive. The cost is that you never help anyone. A doctor who won't diagnose isn't practicing good medicine—they're practicing defensive medicine. A manager who won't make decisions isn't being careful—they're pushing the discomfort onto everyone else. A friend who responds to every dilemma with "I can see both sides" isn't being thoughtful—they're being useless.

Real intellectual seriousness involves not just seeing complexity but taking responsibility for resolving it. You gather the considerations. You weigh them. You commit to a view. You hold that view accountable against evidence. You update when the evidence demands. This process requires accepting that you will sometimes be wrong, and that being wrong is the cost of being useful.

The people worth talking to about hard problems are the people who will tell you what they actually think. Not "there are many perspectives to consider," but "here's what I believe, and here's why." You can argue with that. You can update on it. "It's complicated" is a wall.

The Practical Adjustment

Test whether your nuance is leading somewhere. Ask of any complex view you hold: does this help me act better, predict better, or decide better? If it does, the nuance is working. If it just makes you feel sophisticated while producing no actionable output, you've drifted into performance.

Commit to positions on things you've thought about. State what you believe. Accept that you'll be wrong about some of it and update accordingly. The update record is the actual evidence of careful thinking—not the complexity of your initial formulation.

When you catch yourself saying "it's complicated" or "it depends," ask whether you're about to give a useful answer or stop at the disclaimer. If you genuinely don't know what it depends on in a way that leads somewhere, say that. If you do know but you're unwilling to commit, notice that.

The goal is not to be simple. The goal is to be useful. Complexity that doesn't resolve into something actionable isn't rigor—it's furniture.

Today's Sketch

March 06, 2026