Gustave Le Bon’s The Crowd: A Study of the Popular Mind (1895) has had the half-life of uranium. His thesis is simple and disturbing: rational individuals become irrational when they merge into a crowd. The crowd hypnotises, contagion spreads, and reason dissolves in the swell of collective emotion.
It’s neat. It’s elegant. It’s also far too flattering.
Audio: NotebookLM podcast on this topic.
Le Bon assumes that the default state of the human being is rational, some Enlightenment holdover where citizens, left to their own devices, behave like tidy mini-Kants. Then, and only then, do they lose their reason in the mob. The trouble is that a century of behavioural science has made it painfully clear that “rational man” is a fairy tale.
Daniel Kahneman mapped our cognitive machinery in Thinking, Fast and Slow: System 1 (fast, intuitive, emotional) is running the show while System 2 (slow, deliberate, logical) is mostly hired as the PR manager after the fact. Dan Ariely built a career documenting just how predictably irrational we are – anchoring, framing, sunk-cost fallacies, you name it. Add in Tversky, Thaler, Gigerenzer, and the usual suspects, and the picture is clear: we don’t need a crowd to become irrational. We start irrational, and then the crowd amplifies it.
In that sense, Le Bon wasn’t wrong about crowds being dangerous, but he may have missed the darker point. A crowd doesn’t corrupt a rational base – it accelerates the irrational baseline. It’s not Jekyll turning into Hyde; it’s Hyde on a megaphone.
This matters because if you take Le Bon literally, the problem is situational: avoid crowds and you’ll preserve reason. But if you take Kahneman seriously, the problem is structural: crowds only reveal what was already there. The positive feedback loop of group psychology doesn’t replace rationality; it feeds on the biases, illusions, and shortcuts already baked into the individual mind.
Le Bon handed us the stage directions for mass manipulation. Modern behavioural economics shows us that the script was already written in our heads before we ever left the house. Put the two together and you have the perfect recipe for political spectacle, advertising, and algorithmic nudges.
Which makes Le Bon’s century-old observations correct – but not nearly bleak enough.
Are you rational, or merely rehearsing your tribe’s catechism? Bayes’ theorem insists we should all update our beliefs the same way when presented with the same evidence. Yet in today’s political divide, identical events harden opposing convictions. The problem isn’t the math—it’s the priors. When your starting assumptions are inherited, acculturated, or indoctrinated, no amount of “evidence” will move you into enemy territory.
A Bayesian Sketch of the Divide
Let H be a contested claim (pick your poison: "the election was fair," "immigration helps," whatever).
People in Camp R and Camp B begin with different priors P_R(H) and P_B(H). That's acculturation if you're being polite, indoctrination if you've run out of patience.
They observe evidence E (news, a court ruling, a video clip, a statistic).
They update:
posterior odds = prior odds × likelihood ratio, where the likelihood ratio is P(E | H) / P(E | ¬H).
Except they don’t, not cleanly, because trust in sources warps the likelihoods.
Video: Jonny Thompson on Bayes’ Theorem. I love Jonny’s content, which is why I reference it so often. He and I have such different philosophical worldviews. Vive la différence (or différance).
Why this locks in polarisation
1. Wildly different priors. If Camp R starts at P_R(H) = 0.9 and Camp B at P_B(H) = 0.1, then even moderately pro-H evidence (say a likelihood ratio of 3) yields:
R: prior odds 9:1 → posterior odds 27:1, i.e. ≈ 0.96
B: prior odds 1:9 → posterior odds 1:3, i.e. 0.25
Same evidence, one camp "settled," the other still unconvinced. Repeat ad infinitum, preferably on primetime.
2. Identity-weighted likelihoods. People don't evaluate P(E | H); they evaluate P(E | H, my tribe vouches for the source). Disconfirming evidence is down-weighted by a trust factor smaller than one. This is called "being rational" on your own planet and "motivated reasoning" on everyone else's.
3. Different hypothesis sets. Camps don't just disagree on P(H); they entertain different H's. If one side's model includes "coordinated elite malfeasance" and the other's does not, then identical data streams update into different universes.
4. Selective exposure = selection bias. Evidence isn’t i.i.d.; it’s curated by feeds, friends, and fury. You are sampling from your own posterior predictive distribution and calling it “reality.”
5. Asymmetric loss functions. Even if beliefs converged, choices won’t. If the social cost of dissent is high, the decision threshold moves. People report a “belief” that minimises ostracism rather than error.
6. No common knowledge, no convergence. Aumann told us honest Bayesians with common priors and common knowledge of one another's posteriors must agree. Remove either—common priors or the "we both know we both saw the same thing" bit—and you get the modern news cycle.
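The arithmetic of point 1 can be sketched in a few lines (a minimal illustration with hypothetical priors of 0.9 and 0.1 and a likelihood ratio of 3; the numbers are for flavour, not data):

```python
def bayes_update(prior, likelihood_ratio):
    """One Bayesian update in odds form: posterior odds = prior odds x LR."""
    odds = prior / (1 - prior)      # convert probability to odds
    odds *= likelihood_ratio        # apply the evidence
    return odds / (1 + odds)        # convert back to probability

# Same evidence (LR = 3 in favour of H), wildly different priors:
camp_r = bayes_update(0.9, 3)   # ~0.96: "settled"
camp_b = bayes_update(0.1, 3)   # 0.25: still unconvinced
```

Iterate that update on curated, one-sided evidence streams (point 4) and the camps drift apart rather than converging.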
“Acculturation” vs “Indoctrination”
Same mechanism, different moral valence. Priors are installed by families, schools, churches, unions, algorithms. Call it culture if you approve of the installers; call it indoctrination if you don’t. The probability calculus doesn’t care. Your tribal totems do.
Two quick toy moves you can use in prose
Likelihood hacking: “When evidence arrives, the tribe doesn’t deny the datum; it edits the likelihoods. ‘If my side did it, it’s an outlier; if your side did it, it’s a pattern.’ This is not hypocrisy; it’s a parameter update where the parameter is loyalty.”
Posterior divergence despite ‘facts’: “Give two citizens the same court ruling. One updates towards legitimacy because courts are reliable; the other away from legitimacy because courts are captured. The ruling is constant; the trust vector is not.”
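The "likelihood hacking" move can be sketched as well. Here a trust parameter shrinks the likelihood ratio geometrically toward 1 (toward "no information"); the parameterisation is my own illustrative choice, not a standard model:

```python
def trusted_update(prior, likelihood_ratio, trust=1.0):
    """Bayes update where low trust in the source shrinks the
    likelihood ratio toward 1, neutering the evidence."""
    effective_lr = likelihood_ratio ** trust  # trust=0 means LR becomes 1
    odds = prior / (1 - prior) * effective_lr
    return odds / (1 + odds)

# Same disconfirming evidence (LR = 0.25), same prior (0.8):
trusting = trusted_update(0.8, 0.25, trust=1.0)  # drops to 0.5
tribal   = trusted_update(0.8, 0.25, trust=0.1)  # barely moves (~0.78)
```

The datum is never denied; the likelihood is simply edited until it stops hurting.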
If one wanted to reduce the split (perish the thought)
Forecast, don’t opine. Run cross-camp prediction markets or calibration tournaments. Bayes behaves when you pay people for accuracy rather than performance art.
Adversarial collaboration. Force both sides to pre-register what evidence would move them and how much. If someone's pre-registered update for disconfirming evidence is effectively zero, you've identified faith, not inference.
Reference classes, not anecdotes. Pull arguments out of the single-case trap and into base-rate land. Yes, it’s boring. So is surgery, but people still do it.
The punchline
Polarisation isn’t the failure of reason; it’s what happens when reason is strapped to identity. Priors are social. Likelihoods are political. Posteriors are performative. You can call it acculturation if you want to feel civilised, indoctrination if you want to throw a brick, but either way you’re watching Bayes’ theorem run inside a culture war. The maths is sober; the humans are not.
This clip of Rachel Barr slid into my feed today, fashionably late by a week, and I thought it deserved a little dissection. The video wouldn’t embed directly – Instagram always has to be precious – so I downloaded it and linked it here. Don’t worry, Rachel, I’m not stealing your clicks.
Now, the United States. Or rather, the United States In Name Only – USINO. A nation perpetually rebranding itself as a “union” whilst its citizens claw at each other like alley cats in a bin fire. Yes, divisions abound – economic, racial, ideological, pick your poison – but some fissures cut to the bone. Today’s example: Charlie Kirk and the rabid congregation of defenders he’s managed to cultivate.
Audio: NotebookLM podcast on this topic.
The Competing Liturgies
To hear one camp tell it, Kirk is no hater at all. He’s a gentle, God-soaked soul, brimming with Christian love and trying – halo tilted just so – to shepherd stray sheep toward Our Lord and Saviour™. A real Sunday-school sweetheart.
But this is not, shockingly, the consensus. The other camp (my camp, if disclosure still matters in a post-truth age) sees him as a snarling opportunist, a huckster of hate packaged in the familiar varnish of patriotism and piety. In short: a hate-merchant with a mailing list.
Spectacle as Weapon
I’ve watched Kirk at work. He loves to stage “debates” – quotation marks mandatory – where a token dissenter is dropped into an amphitheatre of loyalists. It’s the rhetorical equivalent of feeding Christians to lions, except the lions roar on cue and the crowd thinks the blood is wine. He laces misogyny, racism, and reheated premodern dogma into cheap soundbites, and the audience laps it up as though they were attending a revival. For the believers, it’s a festival. For everyone else, it’s a hostile takeover of public discourse.
Deaf Ears, Loud Mouths
Here’s the rub: Cohort A doesn’t perceive his words as hate because they already share the operating system. It’s not hate to them – it’s common sense. Cohort B, meanwhile, hears every syllable as the screech of a chalkboard dragged across the public square. Same words, two worlds.
And when I dare to suggest that if you can’t hear the hatred, you might just be complicit in it, the pushback is instantaneous: Stop imposing your worldview! Which is rich, since their worldview is already blaring through megaphones at tax-exempt rallies. If my worldview is one that insists on less hate, less dehumanisation, less sanctified bullying, then fine, I’ll take the charge.
The deeper accusation, though, is almost comic: that I’m hallucinating hate in a man of pure, lamb-like love. That’s the gaslighting twist of the knife – turning critique into pathology. As if the problem isn’t the bile spilling from the stage but my faulty perception of it.
Perspective is everything, yes – but some perspectives reek of wilful blindness.
Humans can’t seem to stop clawing after morality. The primates among us chuck cucumbers when their neighbours get grapes, and the rest of us grumble about fairness on social media. The impulse is practically universal, an evolutionary quirk that kept us from throttling each other long enough to raise children and build cities.
Image: A seemingly perturbed capuchin monkey.
But universality is not objectivity. Just because every ape howls about fairness doesn’t mean “Justice” floats somewhere in Platonic space, waiting to be downloaded. It only means we’re the kind of animal that survives by narrating rules and enforcing them with shunning, shame, or, when necessary, cudgels.
Audio: NotebookLM podcast on this topic.
This is where Alasdair MacIntyre trips over his own robes. After Virtue skewers Enlightenment rationalists who tried to prop morality on reason, it then dismisses Nietzsche for being “irrational.” MacIntyre’s fix? Resurrect Aristotle’s teleology. If reason can’t save morality, maybe an ancient oak tree can. But this is wish-thinking with a Greek accent. He’s still arguing by reason that reason can’t do the job, then sneaking back in through Aristotle’s back door with a “firmer ground.” Firmer only because he says so.
Nietzsche, at least, had the decency to call the bluff: no telos, no floor, no cosmic anchor. Just will, style, and the abyss. Uncomfortable? Absolutely. Honest? Yes.
Deleuze went further. He pointed out that morality, like culture, doesn’t look like a tree at all. It’s a rhizome: tangled, proliferating, hybridising, never grounded in a single root. The fragments MacIntyre despairs over aren’t evidence of collapse. They’re evidence of how moral life actually grows—messy, contingent, interconnected. The only reason it looks chaotic is that we keep demanding a trunk where only tubers exist.
So here we are, apes with a craving for rules, building cities and philosophies on scaffolds of habit, language, and mutual illusion. We are supported as surely as the Earth is supported – by nothing. And yet, we go on living.
The need for morality is real. The yearning for telos is real. The floor is not.
This isn’t a political post. It’s about language, the insufficiency of it, and the games we play when pretending words carry more weight than they do.
Luigi Mangione is the man accused of killing UnitedHealthcare CEO Brian Thompson. After his arrest, prosecutors stacked the usual charges – murder, firearms, assorted legal bric-a-brac – then added the cherry on top: domestic terrorism.
Audio: NotebookLM podcast on this topic.
Recently, a pretrial judge cut the cherry loose.
Murder, yes. Terrorism, no. Not because murder is less grotesque, but because the statutory definition won’t stretch that far without breaking.
NEW YORK, Sept 16 (Reuters) – A New York state judge dismissed on Tuesday two terrorism-related counts against Luigi Mangione over the December 2024 killing of health insurance executive Brian Thompson, though the 27-year-old remains charged with second-degree murder and eight other criminal counts in the case.
“There was no evidence presented of a desire to terrorize the public, inspire widespread fear, engage in a broader campaign of violence, or to conspire with organized terrorist groups,” Judge Gregory Carro found in a 12-page written decision (pdf). “Here, the crime – the heinous, but targeted and discrete killing of one person – is very different from the examples of terrorism set forth in the statute.” (source)
The prosecution insisted the label fit. The judge disagreed. Cue outrage, applause, and confusion. The crime is still horrific, but suddenly the word “terrorist” is off-limits.
The Elasticity of Terror
How can two educated parties look at the same set of facts and come to opposite conclusions? Because “terrorism” isn’t a Platonic form. It’s an elastic linguistic category. The prosecutor drags it out because “terrorist” is a magical word in American law: it inflates an already ugly act into a civilisation-level threat, unlocks harsher penalties, and lets politicians posture about national security.
The judge, however, reminded everyone that a bullet in Manhattan does not equal al-Qaeda.
Language Games, Legal Hierarchies
This is where it gets trickier. The judge isn’t merely “pulling rank”—though rank does matter. American jurisprudence is hierarchical: trial judges hand down rulings, appellate judges review them, and nine robed partisans in Washington can one day rewrite the whole script. On paper, these tiers are meant to iron out ambiguity. In practice, they multiply it.
Even co-equal judges, reading the same facts, can diverge wildly. Split decisions at the Supreme Court prove the point: five minds say “constitutional,” four say “unconstitutional,” and the one-vote margin becomes binding law for 330 million people. That’s not the discovery of truth; it’s the triumph of one language game over another, enforced by hierarchy.
The Insufficiency Laid Bare
So we return to Mangione. He has been charged with murder – the second-degree flavour; that much is uncontested. But is he a "terrorist"? The prosecution said yes, the judge said no, and another judge, higher up or sitting elsewhere, might well say yes again. Each claim is defensible. Each is motivated by language, by politics, and by the institutional pressures of the bench.
And that's the point. Language doesn't tether itself to reality; it choreographs our endless arguments about reality. The law tries to tame it with hierarchies and definitions, but the seams always show. Mangione is a murderer. Whether he is a terrorist depends less on his actions than on which interpretive dance is winning in the courtroom that day.
The Enlightenment promised a universal Reason; what we got was a carnival mirror that flatters philosophers and fools the rest of us. MacIntyre and Anscombe diagnosed the corpse with precision, but then tried to resurrect it with Aristotelian or theological magic tricks. I’m less charitable: you can’t will petrol into an empty tank. In my latest essay, I put ‘Reason’ on the slab, call in Kahneman, Hume, Nietzsche, and others as expert witnesses, and deliver the verdict: morality is a house rule, not a cosmic law. This piece is part of a larger project that includes my Language Insufficiency Hypothesis and Against Dumocracy. The Enlightenment isn’t dying – it’s already dead. We’re just cataloguing the remains.
The Enlightenment was many things: a bonfire of superstition, a hymn to autonomy, a fever dream of “Reason” enthroned. Its philosophers fancied themselves heirs to Aristotle and midwives to a new humanity. And to be fair, they were clever enough to trick even themselves. Too clever by half.
Alasdair MacIntyre, in After Virtue, plays the role of forensic pathologist with admirable precision. He shows us how the Enlightenment dynamited the teleological scaffolding of Aristotle, then tried to keep the vocabulary of virtue, duty, and rights standing in mid-air. The result: what he calls a “moral Babel,” a chorus of shrill assertions dressed up as rational law. Elizabeth Anscombe had already filed the death certificate back in 1958 with Modern Moral Philosophy, where she pointed out that our talk of “moral obligation” is just a Christian relic without a deity to enforce it. And Nietzsche, that perennial party-crasher, cheerfully declared the whole project bankrupt: once the gods are dead, “ought” is nothing but resentment pretending to be metaphysics.
And yet, when MacIntyre reaches the heart of the matter, he can’t quite let the body stay buried. He wants to reattach a soul by importing an Aristotelian telos, even summoning a “new St Benedict” to shepherd us through the ruins. It plays beautifully with those still tethered by a golden string to Aquinas and the premodern, but let’s be honest: this is just hypnosis with a Latin chorus. Descartes told us je pense, donc je suis; MacIntyre updates it to je pense, donc j’ai raison. The trouble is that thinking doesn’t guarantee rightness any more than an empty petrol tank guarantees motion. You can will fuel into existence all you like; the car still isn’t going anywhere.
The behavioral economists – Kahneman, Tversky, Ariely, Gigerenzer – have already demonstrated that human reason is less compass than carnival mirror. Jonathan Haidt has shown that our “moral reasoning” usually lags behind our gut feelings like a PR department scrambling after a scandal. Meanwhile, political practice reduces “just war” to a matter of who gets to publish the rule book. Progress™ is declared, rights are invoked, but the verdict is always written by the most powerful litigant in the room.
So yes, MacIntyre and Anscombe diagnose the corpse with impressive clarity. But then they can’t resist playing resurrectionist, insisting that if we only chant the right metaphysical formula, the Enlightenment’s heart will start beating again. My own wager is bleaker – or maybe just more honest. There is no golden thread back to Aristotle, no metaphysical petrol station in the desert. Morality is not a universal constant; it’s a set of rules as contingent as the offside law. Killing becomes “murder” only when the tribe – or the state – says so. “Life is sacred” is not a discovery but a spell, a linguistic sleight of hand that lets us kill in one context while weeping in another.
The Enlightenment wanted to enthrone Reason as our common oracle. Instead, it handed us a corpse and told us to pretend it was still breathing. My contribution is simply to keep the coroner’s mask on and say: The magic tricks aren’t working anymore. Stop looking for a metaphysical anchor that isn’t there. If there’s to be an “after,” it won’t come from another Saint Benedict. It will come from admitting that the Enlightenment died of believing its own hype – and that language itself was never built to carry the weight of gods.
Freud once quipped that people are “normal” only on average. To the degree that they deviate from the mean, they are neurotic, psychotic, or otherwise abnormal. Whatever else one thinks of Freud, the metaphor holds for Modernity.
Image: Picture and quote by Sigmund Freud: Every normal person, in fact, is only normal on the average. His ego approximates to that of the psychotic in some part or other and to a greater or lesser extent. —Analysis Terminable And Interminable (1937), Chapter V
We are “Modern” only on average, and only for the first standard deviation. Within one sigma, you can wave a flag and declare: rational, secular, Enlightened. But step further into the tails and the façade dissolves. The “normal” modern turns out to attend megachurches, consult horoscopes, share conspiracy memes, or cling to metaphysical relics that Enlightenment reason was supposed to have torched centuries ago.
„Jeder Normale ist eben nur durchschnittlich normal, sein Ich nähert sich dem des Psychotikers in dem oder jenem Stück, in größerem oder geringerem Ausmaß."
The problem isn’t that these people aren’t Modern. The problem is that nobody is Modern, not in the sense the story requires. The mean is an over-fitted abstraction. “Modernity” works like Freud’s “normal”: a statistical average that erases the deviations, then insists that the erased bits are pathology rather than reality.
But the tails are where most of human life actually happens. The “average Modern” is as mythical as the “reasonable person.” What we call Modernity is just a bell curve costume draped over the same mix of superstition, desire, and contingency that has always driven human behaviour.
We humans pride ourselves on being civilised. Unlike animals, we don’t let biology call the shots. A chimp reaches puberty and reproduces; a human reaches puberty and is told, not yet – society has rules. Biologically mature isn’t socially mature, and we pat ourselves on the back for having spotted the difference.
But watch how quickly that distinction vanishes when it threatens the in-group narrative. Bring up gender, and suddenly there’s no such thing as a social construct. Forget the puberty-vs-adulthood distinction we were just defending – now biology is destiny, immutable and absolute. Cross-gender clothing? “Against nature.” Transition? “You can’t be born into the wrong body.” Our selective vision flips depending on whose ox is being gored.
The same trick appears in how we talk about maturity. You can’t vote until 18. You’re not old enough to drink until 21. You’re not old enough to stop working until 67. These numbers aren’t natural; they’re paperwork. They’re flags planted in the soil of human life, and without the right flag, you don’t count.
The very people who insist on distinguishing biological maturity from social maturity when it comes to puberty suddenly forget the distinction when it comes to gender. They know perfectly well that “maturity” is a construct – after all, they’ve built entire legal systems around arbitrary thresholds – but they enforce the amnesia whenever it suits them. Nietzsche would say it plainly: the powerful don’t need to follow the rules, they only need to make sure you do.
So the next time someone appeals to “nature,” ask: which one? The nature that declares you old enough to marry at puberty? The nature that withholds voting, drinking, or retirement rights until a bureaucrat’s calendar says so? Or the nature that quietly mutates whenever the in-group needs to draw a new line around civilisation?
The truth is, “nature” and “maturity” are less about describing the world than about policing it. They’re flags, shibboleths, passwords. We keep calling them natural, but the only thing natural about them is how often they’re used to enforce someone else’s story.
Kant, bless him, thought he was staging the trial of Reason itself, putting the judge in the dock and asking whether the court had jurisdiction. It was a noble spectacle, high theatre of self-scrutiny. But the trick was always rigged. The presiding judge, the prosecution, the jury, the accused, all wore the same powdered wig. Unsurprisingly, Reason acquitted itself.
The Enlightenment’s central syllogism was never more than a parlour trick:
P1: The best path is Reason.
P2: I practice Reason.
C: Therefore, Reason is best.
It’s the self-licking ice-cream cone of intellectual history. And if you dare to object, the trap springs shut: what, you hate Reason? Then you must be irrational. Inquisitors once demanded heretics prove they weren’t in league with Satan; the modern equivalent is being told you’re “anti-science.” The categories defend themselves by anathematising doubt.
The problem is twofold:
First, Reason never guaranteed agreement. Two thinkers can pore over the same “facts” and emerge with opposite verdicts, each sincerely convinced that Reason has anointed their side. In a power-laden society, it is always the stronger voice that gets to declare its reasoning the reasoning. As Dan Hind acidly observed, Reason is often nothing more than a marketing label the powerful slap on their interests.
Second, and this is the darker point, Reason itself is metaphysical, a ghost in a powdered wig. To call something “rational” is already to invoke an invisible authority, as if Truth had a clerical seal. Alasdair MacIntyre was right: strip away the old rituals and you’re left with fragments, not foundations.
Other witnesses have tried to say as much. Horkheimer and Adorno reminded us that Enlightenment rationality curdles into myth the moment it tries to dominate the world. Nietzsche laughed until his throat bled at the pretence of universal reason, then promptly built his own metaphysics of will. Bruno Latour, in We Have Never Been Modern, dared to expose Science as what it actually is – a messy network of institutions, instruments, and politics masquerading as purity. The backlash was so swift and sanctimonious that he later called it his “worst” book, a public recantation that reads more like forced penance than revelation. Even those who glimpsed the scaffolding had to return to the pews.
So when we talk about “Reason” as the bedrock of Modernity, let’s admit the joke. The bedrock was always mist. The house we built upon it is held up by ritual, inertia, and vested interest, not granite clarity. Enlightenment sold us the fantasy of a universal judge, when what we got was a self-justifying oracle. Reason is not the judge in the courtroom. Reason is the courtroom itself, and the courtroom is a carnival tent – all mirrors, no floor.
We’re told we live in the Enlightenment, that Reason™ sits on the throne and superstition has been banished to the attic. Yet when I disguised a little survey as “metamodern,” almost none came out as fully Enlightened. Three managed to shed every trace of the premodern ghost, one Dutch wanderer bypassed Modernity entirely, and not a single soul emerged free of postmodern suspicion. So much for humanity’s great rational awakening. Perhaps Modernity wasn’t a phase we passed through at all, but a mirage we still genuflect before, a lifestyle brand draped over a naked emperor.
Audio: NotebookLM podcast on this topic.
The Enlightenment as Marketing Campaign
The Enlightenment is sold to us as civilisation’s great coming-of-age: the dawn when the fog of superstition lifted and Reason took the throne. Kant framed it as “man’s emergence from his self-incurred immaturity” – an Enlightenment bumper sticker that academics still like to polish and reapply. But Kant wasn’t writing for peasants hauling mud or women without the vote; he was writing for his own coterie of powdered-wig mandarins, men convinced their own habits of rational debate were humanity’s new universal destiny.
Modernity, in this story, isn’t a historical stage we all inhabited. It’s an advertising campaign: Reason™ as lifestyle brand, equality as tagline, “progress” as the logo on the tote bag. Modernity, in the textbooks, is billed as a historical epoch, a kind of secular Pentecost in which the lights came on and we all finally started thinking for ourselves. In practice, it was more of a boutique fantasy, a handful of gentlemen mistaking their own rarefied intellectual posture for humanity’s destiny.
The Archetype That Nobody Lives In
At the core of the Enlightenment lies the archetype of Man™: rational, autonomous, unencumbered by superstition, guided by evidence, weighing pros and cons with the detachment of a celestial accountant. Economics repackaged him as homo economicus, forever optimising his utility function as if he were a spreadsheet in breeches.
But like all archetypes, this figure is a mirage. Our survey data, even when baited as a “metamodern survey”, never produced a “pure” Enlightenment subject.
3 scored 0% Premodern (managing, perhaps, to kick the gods and ghosts to the kerb).
1 scored 0% Modern (the Dutch outlier: 17% Premodern, 0% Modern, 83% Post, skipping the Enlightenment altogether, apparently by bike).
0 scored 0% Postmodern. Every single participant carried at least some residue of suspicion, irony, or relativism.
The averages themselves were telling: roughly 18% Premodern, 45% Modern, 37% Postmodern. That’s not an age of Reason. That’s a muddle, a cocktail of priestly deference, rationalist daydreams, and ironic doubt.
Even the Greats Needed Their Crutches
If the masses never lived as Enlightenment subjects, what about the luminaries? Did they achieve the ideal? Hardly.
Descartes, desperate to secure the cogito, called in God as guarantor, dragging medieval metaphysics back on stage.
Kant built a cathedral of reason only to leave its foundations propped up by noumena: an unseeable, unknowable beyond.
Nietzsche, supposed undertaker of gods, smuggled in his own metaphysics of will to power and eternal recurrence.
William James, surveying the wreckage, declared that “truth” is simply “what works”, a sort of intellectual aspirin for the Enlightenment headache.
And economists, in a fit of professional humiliation, pared the rational subject down to a corpse on life support. Homo economicus became a creature who — at the very least, surely — wouldn’t choose to make himself worse off. But behavioural economics proved even that meagre hope to be a fantasy. People burn their wages on scratch tickets, sign up for exploitative loans, and vote themselves into oblivion because a meme told them to.
If even the “best specimens” never fully embodied the rational archetype, expecting Joe Everyman, who statistically struggles to parse a sixth-grade text and hasn’t cracked a book since puberty, to suddenly blossom into a mini-Kant is wishful thinking of the highest order.
The Dual Inertia
The real story isn’t progress through epochs; it’s the simultaneous drag of two kinds of inertia:
Premodern inertia: we still cling to sacred myths, national totems, and moral certainties.
Modern inertia: we still pretend the rational subject exists, because democracy, capitalism, and bureaucracy require him to.
The result isn’t a new epoch. It’s a cultural chimaera: half-superstitious, half-rationalist, shot through with irony. A mess, not a phase.
Arrow’s Mathematical Guillotine
Even if the Enlightenment dream of a rational demos were real, Kenneth Arrow proved it was doomed. His Impossibility Theorem shows that no voting system can turn individual rational preferences into a coherent “general will.” In other words, even a parliament of perfect Kants would deadlock when voting on dinner. The rational utopia is mathematically impossible.
So when we are told that democracy channels Reason, we should hear it as a polite modern incantation, no sturdier than a priest blessing crops.
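Arrow's deadlock is easy to demonstrate with the classic Condorcet cycle: three perfectly rational voters, three options, and majority rule chasing its own tail (a toy sketch, not Arrow's general proof):

```python
from itertools import combinations

# Each tuple is one voter's ranking, best option first.
profiles = [("A", "B", "C"), ("B", "C", "A"), ("C", "A", "B")]

def majority_winner(x, y):
    """Pairwise majority vote between options x and y."""
    votes_x = sum(1 for ranking in profiles if ranking.index(x) < ranking.index(y))
    return x if votes_x * 2 > len(profiles) else y

results = {(x, y): majority_winner(x, y) for x, y in combinations("ABC", 2)}
# A beats B, B beats C, yet C beats A: the "general will" is a cycle.
```

Each voter is individually coherent; the collective preference goes round in a circle, so "what the demos wants" for dinner simply does not exist.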
Equality and the Emperor’s Wardrobe
The refrain comes like a hymn: “All men are created equal.” But the history is less inspiring. “Men” once meant property-owning Europeans; later it was generously expanded to mean all adult citizens who’d managed to stay alive until eighteen. Pass that biological milestone, and voilà — you are now certified Rational, qualified to determine the fate of nations.
And when you dare to question this threadbare arrangement, the chorus rises: “If you don’t like democracy, capitalism, or private property, just leave.” As if you could step outside the world like a theatre where the play displeases you. Heidegger’s Geworfenheit makes the joke bitter: we are thrown into this world without choice, and then instructed to exit if we find the wallpaper distasteful. Leave? To where, precisely? The void? Mars?
The Pre-Modern lord said: Obey, or be exiled. The Modern democrat says: Vote, or leave. And the Post-Enlightenment sceptic mutters: Leave? To where, exactly? Gravity? History? The species? There is no “outside” to exit into. The system is not a hotel; it’s the weather.
Here the ghost of Baudrillard hovers in the wings, pointing out that we are no longer defending Reason, but the simulacrum of Reason. The Emperor’s New Clothes parable once mocked cowardice: everyone saw the nudity but stayed silent. Our situation is worse. We don’t even see that the Emperor is naked. We genuinely believe in the fineries, the Democracy™, the Rational Man™, the sacred textile of Progress. And those who point out the obvious are ridiculed: How dare you mock such fineries, you cad!
Conclusion: The Comfort of a Ghost
So here we are, defending the ghost of a phase we never truly lived. We cling to Modernity as if it were a sturdy foundation, when in truth it was always an archetype – a phantom rational subject, a Platonic ideal projected onto a species of apes with smartphones. We mistook it for bedrock, built our institutions upon it, and now expend colossal energy propping up the papier-mâché ruins. The unfit defend it out of faith in their own “voice,” the elites defend it to preserve their privilege, and the rest of us muddle along pragmatically, dosing ourselves with Jamesian aspirin and pretending it’s progress.
Metamodernism, with its marketed oscillation between sincerity and irony, is less a “new stage” than a glossy rebranding of the same old admixture: a bit of myth, a bit of reason, a dash of scepticism. And pragmatism – James’s weary “truth is what works” – is the hangover cure that keeps us muddling through.
Modernity promised emancipation from immaturity. What we got was a new set of chains: reason as dogma, democracy as ritual, capitalism as destiny. And when we protest, the system replies with its favourite Enlightenment lullaby: If you don’t like it, just leave.
But you can’t leave. You were thrown here. What we call “Enlightenment” is not a stage in history but a zombie-simulation of an ideal that never drew breath. And yet, like villagers in Andersen’s tale, we not only guard the Emperor’s empty wardrobe – we see the garments as real. The Enlightenment subject is not naked. He is spectral, and we are the ones haunting him.