Modernity: The Phase That Never Was


We’re told we live in the Enlightenment, that Reason™ sits on the throne and superstition has been banished to the attic. Yet when I disguised a little survey as “metamodern,” almost none came out as fully Enlightened. Three managed to shed every trace of the premodern ghost, one Dutch wanderer bypassed Modernity entirely, and not a single soul emerged free of postmodern suspicion. So much for humanity’s great rational awakening. Perhaps Modernity wasn’t a phase we passed through at all, but a mirage we still genuflect before, a lifestyle brand draped over a naked emperor.

Audio: NotebookLM podcast on this topic.

The Enlightenment as Marketing Campaign

The Enlightenment is sold to us as civilisation’s great coming-of-age: the dawn when the fog of superstition lifted and Reason took the throne. Kant framed it as “man’s emergence from his self-incurred immaturity” – an Enlightenment bumper sticker that academics still like to polish and reapply. But Kant wasn’t writing for peasants hauling mud or women without the vote; he was writing for his own coterie of powdered-wig mandarins, men convinced their own habits of rational debate were humanity’s new universal destiny.

Modernity, in the textbooks, is billed as a historical epoch, a kind of secular Pentecost in which the lights came on and we all finally started thinking for ourselves. In practice, it was more of a boutique fantasy, a handful of gentlemen mistaking their own rarefied intellectual posture for humanity’s destiny. Modernity, in this story, isn’t a historical stage we all inhabited; it’s an advertising campaign: Reason™ as lifestyle brand, equality as tagline, “progress” as the logo on the tote bag.

The Archetype That Nobody Lives In

At the core of the Enlightenment lies the archetype of Man™: rational, autonomous, unencumbered by superstition, guided by evidence, weighing pros and cons with the detachment of a celestial accountant. Economics repackaged him as homo economicus, forever optimising his utility function as if he were a spreadsheet in breeches.

But like all archetypes, this figure is a mirage. Our survey data, even when baited as a “metamodern survey”, never produced a “pure” Enlightenment subject:

  • 3 scored 0% Premodern (managing, perhaps, to kick the gods and ghosts to the kerb).
  • 1 scored 0% Modern (the Dutch outlier: 17% Premodern, 0% Modern, 83% Postmodern, skipping the Enlightenment altogether, apparently by bike).
  • 0 scored 0% Postmodern. Every single participant carried at least some residue of suspicion, irony, or relativism.

The averages themselves were telling: roughly 18% Premodern, 45% Modern, 37% Postmodern. That’s not an age of Reason. That’s a muddle, a cocktail of priestly deference, rationalist daydreams, and ironic doubt.

Even the Greats Needed Their Crutches

If the masses never lived as Enlightenment subjects, what about the luminaries? Did they achieve the ideal? Hardly.

  • Descartes, desperate to secure the cogito, called in God as guarantor, dragging medieval metaphysics back on stage.
  • Kant built a cathedral of reason only to leave its foundations propped up by noumena: an unseeable, unknowable beyond.
  • Nietzsche, supposed undertaker of gods, smuggled in his own metaphysics of will to power and eternal recurrence.
  • William James, surveying the wreckage, declared that “truth” is simply “what works”, a sort of intellectual aspirin for the Enlightenment headache.

And economists, in a fit of professional humiliation, pared the rational subject down to a corpse on life support. Homo economicus became a creature who — at the very least, surely — wouldn’t choose to make himself worse off. But behavioural economics proved even that meagre hope to be a fantasy. People burn their wages on scratch tickets, sign up for exploitative loans, and vote themselves into oblivion because a meme told them to.

If even the “best specimens” never fully embodied the rational archetype, expecting Joe Everyman, who statistically struggles to parse a sixth-grade text and hasn’t cracked a book since puberty, to suddenly blossom into a mini-Kant is wishful thinking of the highest order.

The Dual Inertia

The real story isn’t progress through epochs; it’s the simultaneous drag of two kinds of inertia:

  • Premodern inertia: we still cling to sacred myths, national totems, and moral certainties.
  • Modern inertia: we still pretend the rational subject exists, because democracy, capitalism, and bureaucracy require him to.

The result isn’t a new epoch. It’s a cultural chimaera: half-superstitious, half-rationalist, shot through with irony. A mess, not a phase.

Arrow’s Mathematical Guillotine

Even if the Enlightenment dream of a rational demos were real, Kenneth Arrow proved it was doomed. His Impossibility Theorem shows that no ranked voting system over three or more options can aggregate individual rational preferences into a coherent “general will” without violating at least one elementary condition of fairness. In other words, even a parliament of perfect Kants would deadlock when voting on dinner. The rational utopia is mathematically impossible.
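To make the deadlock concrete, here is a minimal sketch of the Condorcet cycle that motivates Arrow’s result. The menu, the voters, and their rankings below are invented for illustration; Arrow’s theorem is broader than this toy case, but the cycle shows why pairwise majority rule can fail to yield any coherent group preference.

```python
# A toy Condorcet cycle: three hypothetical voters, three dinner options.
from itertools import combinations

# Each ballot is a strict ranking, best first (invented preferences).
ballots = [
    ["soup", "salad", "stew"],   # voter 1
    ["salad", "stew", "soup"],   # voter 2
    ["stew", "soup", "salad"],   # voter 3
]

def majority_prefers(a: str, b: str) -> bool:
    """True if a strict majority of ballots rank option a above option b."""
    wins = sum(1 for ballot in ballots if ballot.index(a) < ballot.index(b))
    return wins > len(ballots) / 2

for a, b in combinations(["soup", "salad", "stew"], 2):
    winner, loser = (a, b) if majority_prefers(a, b) else (b, a)
    print(f"majority prefers {winner} over {loser}")

# Prints: soup over salad, stew over soup, salad over stew.
# Every voter is individually rational, yet the group ranking is a cycle.
```

Each voter is perfectly consistent, yet the collective preference chases its own tail: soup beats salad, salad beats stew, and stew beats soup. No amount of procedural polish produces a stable winner.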

So when we are told that democracy channels Reason, we should hear it as a polite modern incantation, no sturdier than a priest blessing crops.

Equality and the Emperor’s Wardrobe

The refrain comes like a hymn: “All men are created equal.” But the history is less inspiring. “Men” once meant property-owning Europeans; later it was generously expanded to mean all adult citizens who’d managed to stay alive until eighteen. Pass that biological milestone, and voilà — you are now certified Rational, qualified to determine the fate of nations.

And when you dare to question this threadbare arrangement, the chorus rises: “If you don’t like democracy, capitalism, or private property, just leave.” As if you could step outside the world like a theatre where the play displeases you. Heidegger’s Geworfenheit makes the joke bitter: we are thrown into this world without choice, and then instructed to exit if we find the wallpaper distasteful. Leave? To where, precisely? The void? Mars?

The Premodern lord said: Obey, or be exiled. The Modern democrat says: Vote, or leave. And the Post-Enlightenment sceptic mutters: Leave? To where, exactly? Gravity? History? The species? There is no “outside” to exit into. The system is not a hotel; it’s the weather.

Here the ghost of Baudrillard hovers in the wings, pointing out that we are no longer defending Reason, but the simulacrum of Reason. The Emperor’s New Clothes parable once mocked cowardice: everyone saw the nudity but stayed silent. Our situation is worse. We don’t even see that the Emperor is naked. We genuinely believe in the fineries, the Democracy™, the Rational Man™, the sacred textile of Progress. And those who point out the obvious are ridiculed: How dare you mock such fineries, you cad!

Conclusion: The Comfort of a Ghost

So here we are, defending the ghost of a phase we never truly lived. We cling to Modernity as if it were a sturdy foundation, when in truth it was always an archetype – a phantom rational subject, a Platonic ideal projected onto a species of apes with smartphones. We mistook it for bedrock, built our institutions upon it, and now expend colossal energy propping up the papier-mâché ruins. The unfit defend it out of faith in their own “voice,” the elites defend it to preserve their privilege, and the rest of us muddle along pragmatically, dosing ourselves with Jamesian aspirin and pretending it’s progress.

Metamodernism, with its marketed oscillation between sincerity and irony, is less a “new stage” than a glossy rebranding of the same old admixture: a bit of myth, a bit of reason, a dash of scepticism. And pragmatism – James’s weary “truth is what works” – is the hangover cure that keeps us muddling through.

Modernity promised emancipation from immaturity. What we got was a new set of chains: reason as dogma, democracy as ritual, capitalism as destiny. And when we protest, the system replies with its favourite Enlightenment lullaby: If you don’t like it, just leave.

But you can’t leave. You were thrown here. What we call “Enlightenment” is not a stage in history but a zombie-simulation of an ideal that never drew breath. And yet, like villagers in Andersen’s tale, we not only guard the Emperor’s empty wardrobe – we see the garments as real. The Enlightenment subject is not naked. He is spectral, and we are the ones haunting him.

The Scourge They’re Really Fighting Is Ambiguity

A Sequel to “The Disorder of Saying No” and a Companion to “When ‘Advanced’ Means Genocide”

In my previous post, The Disorder of Saying No, I explored the way resistance to authority is pathologised, particularly when that authority is cloaked in benevolence and armed with diagnostic manuals. When one refuses — gently, thoughtfully, or with a sharp polemic — one is no longer principled. One is “difficult.” Or in my case, oppositional.

Audio: NotebookLM podcast on this topic.

So when I had the gall to call out Bill Maher for his recent linguistic stunt — declaring that a woman is simply “a person who menstruates” — I thought I was doing the rational thing: pointing out a classic bit of reductionist nonsense masquerading as clarity. Maher, after all, was not doing biology. He was playing lexicographer-in-chief, defining a term with centuries of philosophical, sociological, and political baggage as though it were a checkbox on a medical form.

I said as much: that he was abusing his platform, presenting himself as the sole arbiter of the English language, and that his little performance was less about clarity and more about controlling the terms of discourse.

My friend, a post-menopausal woman herself, responded not by engaging the argument, but by insinuating — as others have — that I was simply being contrary. Oppositional. Difficult. Again. (She was clearly moved by When “Advanced” Means Genocide, but may have missed the point.)

So let’s unpack this — not to win the debate, but to show what the debate actually is.

This Isn’t About Biology — It’s About Boundary Maintenance

Maher’s statement wasn’t intended to clarify. It was intended to exclude. It wasn’t some linguistic slip; it was a rhetorical scalpel — one used not to analyse, but to amputate.

And the applause from some cisgender women — particularly those who’ve “graduated” from menstruation — reveals the heart of the matter: it’s not about reproductive biology. It’s about controlling who gets to claim the term woman.

Let’s steelman the argument, just for the sport of it:

Menstruation is a symbolic threshold. Even if one no longer menstruates, having done so places one irrevocably within the category of woman. It’s not about exclusion; it’s about grounding identity in material experience.

Fine. But now let’s ask:

  • What about women who’ve never menstruated?
  • What about intersex people?
  • What about trans women?
  • What about cultures with radically different markers of womanhood?

You see, it only works if you pretend the world is simpler than it is.

The Language Insufficiency Hypothesis: Applied

This is precisely where the Language Insufficiency Hypothesis earns its keep.

The word woman is not a locked vault. It is a floating signifier, to borrow from Barthes — a term whose meaning is perpetually re-negotiated in use. There is no singular essence to the word. It is not rooted in biology, nor in social role, nor in performance. It is a hybrid, historically contingent construct — and the moment you try to fix its meaning, it slips sideways like a greased Wittgensteinian beetle.

“Meaning is use,” says Wittgenstein, and this is what frightens people.

If woman is defined by use and not by rule, then anyone might claim it. And suddenly, the club is no longer exclusive.

That’s the threat Maher and his defenders are really reacting to. Not trans women. Not intersex people. Not language activists or queer theorists.

The threat is ambiguity.

What They Want: A World That Can Be Named

The push for rigid definitions — for menstruation as membership — is a plea for a world that can be named and known. A world where words are secure, stable, and final. Where meaning doesn’t leak.

But language doesn’t offer that comfort.

It never did.

And when that linguistic instability gets too close to something personal, like gender identity, or the foundation of one’s own sense of self, the defensive response is to fortify the language, as though building walls around a collapsing church.

Maher’s defenders aren’t making scientific arguments. They’re waging semantic warfare. If they can hold the definition, they can win the cultural narrative. They can hold the gates to Womanhood and keep the undesirables out.

That’s the fantasy.

But language doesn’t play along.

Conclusion: Words Will Not Save You — but They Might Soothe the Dead

In the end, Maher’s definition is not merely incorrect. It is insufficient. It cannot accommodate the complexity of lived experience and cannot sustain the illusion of clarity for long.

And those who cling to it — friend or stranger, progressive or conservative — are not defending biology. They are defending nostalgia. Specifically, a pathological nostalgia for a world that no longer exists, and arguably never did: a world where gender roles were static, language was absolute, and womanhood was neatly circumscribed by bodily functions and suburban etiquette.

Ozzie and Harriet loom large here — not as individuals but as archetypes. Icons of a mid-century dream in which everyone knew their place, and deviation was something to be corrected, not celebrated. My friend, of that generation, clings to this fantasy not out of malice but out of a desperate yearning for order. The idea that woman could mean many things, and mean them differently across contexts, is not liberating to her — it’s destabilising.

But that world is gone. And no amount of menstruation-based gatekeeping will restore it.

The Real Scourge Is Ambiguity

Maher’s tantrum wasn’t about truth. It was about fear — fear of linguistic drift, of gender flux, of a world in which meaning no longer obeys. The desire to fix the definition of “woman” is not a biological impulse. It’s a theological one.

And theology, like nostalgia, often makes terrible policy.

This is why the Language Insufficiency Hypothesis matters. Because it reminds us that language does not stabilise reality — it masks its instability. The attempt to define “woman” once and for all is not just futile — it’s an act of violence against difference, a linguistic colonisation of lived experience.

So Let Them Rest

Ozzie and Harriet are dead. Let them rest.
Let their picket fence moulder. Let their signage decay.

The world has moved on. The language is shifting beneath your feet. And no amount of retroactive gatekeeping can halt that tremor.

The club is burning. And the only thing left to save is honesty.

The Disorder of Saying No

A Polite Rebuttal to a Diagnosis I Didn’t Ask For

A dear friend — and I do mean dear, though this may be the last time they risk diagnosing me over brunch — recently suggested, with all the benevolent concern of a well-meaning inquisitor, that I might be showing signs of Oppositional Defiant Disorder.

You know the tone: “I say this with love… but have you considered that your refusal to play nicely with institutions might be clinical?”

Let’s set aside the tea and biscuits for a moment and take a scalpel to this charming little pathology. Because if ODD is a diagnosis, then I propose we start diagnosing systems — not people.

Audio: NotebookLM podcast on this topic.

When the Empire Diagnoses Its Rebels

Oppositional Defiant Disorder, for those blissfully unscarred by its jargon, refers to a “persistent pattern” of defiance, argumentativeness, rule-breaking, and — the pièce de résistance — resentment of authority. In other words, it is a medical label for being insufficiently obedient.

What a marvel: not only has resistance been de-politicised, it has been medicalised. The refusal to comply is not treated as an ethical stance or a contextual response, but as a defect of the self. The child (or adult) is not resisting something; they are resisting everything, and this — according to the canon — makes them sick.

One wonders: sick according to whom?

Derrida’s Diagnosis: The Binary Fetish

Jacques Derrida, of course, would waste no time in eviscerating the logic at play. ODD depends on a structural binary: compliant/defiant, healthy/disordered, rule-follower/troublemaker. But, as Derrida reminds us, binaries are not descriptive — they are hierarchies in disguise. One term is always elevated; the other is marked, marginal, suspect.

Here, “compliance” is rendered invisible — the assumed baseline, the white space on the page. Defiance is the ink that stains it. But this only works because “normal” has already been declared. The system names itself sane.

Derrida would deconstruct this self-justifying loop and note that disorder exists only in relation to an order that never justifies itself. Why must the subject submit? That’s not up for discussion. The child who asks that question is already halfway to a diagnosis.

Foucault’s Turn: Disciplinary Power and the Clinic as Court

Enter Foucault, who would regard ODD as yet another exquisite specimen in the taxonomy of control. For him, modern power is not exercised through visible violence but through the subtler mechanisms of surveillance, normalisation, and the production of docile bodies.

ODD is a textbook case of biopower — the system’s ability to define and regulate life itself through classification, diagnosis, and intervention. It is not enough for the child to behave; they must believe. They must internalise authority to the marrow. To question it, or worse, to resent it, is to reveal one’s pathology.

This is not discipline; this is soulcraft. And ODD is not a disorder — it is a symptom of a civilisation that cannot tolerate unmediated subjectivity. See Discipline & Punish.

Ivan Illich: The Compulsory Institutions of Care

Illich would call the whole charade what it is: a coercive dependency masquerading as therapeutic care. In Deschooling Society, he warns of systems — especially schools — that render people passive recipients of norms. ODD, in this light, is not a syndrome. It is the final gasp of autonomy before it is sedated.

What the diagnosis reveals is not a child in crisis, but an institution that cannot imagine education without obedience. Illich would applaud the so-called defiant child for doing the one thing schools rarely reward: thinking.

R.D. Laing: Sanity as a Political Position

Laing, too, would recognise the ruse. His anti-psychiatry position held that “madness” is often the only sane response to a fundamentally broken world. ODD is not insanity — it is sanity on fire. It is the refusal to adapt to structures that demand submission as a prerequisite for inclusion.

To quote Laing: “They are playing a game. They are playing at not playing a game. If I show them I see they are, I shall break the rules and they will punish me. I must play their game, of not seeing I see the game.”

ODD is what happens when a child refuses to play the game.

bell hooks: Refusal as Liberation

bell hooks, writing in Teaching to Transgress, framed the classroom as a potential site of radical transformation — if it rejects domination. The child who refuses to be disciplined is often the one who sees most clearly that the system has confused education with indoctrination.

Resistance, hooks argues, is not a flaw. It is a form of knowledge. ODD becomes, in this frame, a radical pedagogy. The defiant student is not failing — they are teaching.

Deleuze & Guattari: Desire Against the Machine

And then, should you wish to watch the diagnostic edifice melt entirely, we summon Deleuze and Guattari. For them, the psyche is not a plumbing system with blockages, but a set of desiring-machines short-circuiting the factory floor of capitalism and conformity.

ODD, to them, would be schizoanalysis in action — a body refusing to be plugged into the circuits of docility. The tantrum, the refusal, the eye-roll: these are not symptoms. They are breakdowns in the control grid.

The child isn’t disordered — the system is. The child simply noticed.

Freire: The Educated Oppressed

Lastly, Paulo Freire would ask: What kind of pedagogy demands the death of resistance? In Pedagogy of the Oppressed, he warns of an education model that treats students as empty vessels. ODD, reframed, is the moment a subject insists on being more than a receptacle.

In refusing the “banking model” of knowledge, the so-called defiant child is already halfway to freedom. Freire would call this not a disorder but a moment of awakening.

Conclusion: Diagnostic Colonialism

So yes, dear friend — I am oppositional. I challenge authority, especially when it mistakes its position for truth. I argue, question, resist. I am not unwell for doing so. I am, if anything, allergic to the idea that obedience is a virtue in itself.

Let us be clear: ODD is not a mirror held up to the subject. It is a spotlight shining from the system, desperately trying to blind anyone who dares to squint.

Now, shall we talk about your compliance disorder?


Full Disclosure: I used ChatGPT for insights beyond Derrida and Foucault, two of my mainstays.

The Tyranny of “Human Nature”

There is a kind of political necromancy afoot in modern discourse—a dreary chant murmured by pundits, CEOs, and power-drunk bureaucrats alike: “It’s just human nature.” As if this incantation explains, excuses, and absolves all manner of violent absurdities. As if, by invoking the mystic forces of evolution or primal instinct, one can justify the grotesque state of things. Income inequality? Human nature. War? Human nature. Corporate psychopathy? Oh, sweetie, it’s just how we’re wired.

What a convenient mythology.

Audio: NotebookLM podcast on this topic.

If “human nature” is inherently brutish and selfish, then resistance is not only futile, it is unnatural. The doctrine of dominance gets sanctified, the lust to rule painted as destiny rather than deviance. Meanwhile, the quiet, unglamorous yearning of most people—to live undisturbed, to coöperate rather than conquer—is dismissed as naïve, childish, and unrealistic. How curious that the preferences of the vast majority are always sacrificed at the altar of some aggressive minority’s ambitions.

Let us dispense with this dogma. The desire to dominate is not a feature of human nature writ large; it is a glitch exploited by systems that reward pathological ambition. Most of us would rather not be ruled, and certainly not managed by glorified algorithms in meat suits. The real human inclination, buried beneath centuries of conquest and control, is to live in peace, tend to our gardens, and perhaps be left the hell alone.

And yet, we are not. Because there exists a virulent cohort—call them oligarchs, executives, generals, kings—whose raison d’être is the acquisition and consolidation of power. Not content to build a life, they must build empires. Not content to share, they must extract. They regard the rest of us as livestock: occasionally troublesome, but ultimately manageable.

To pacify us, they offer the Social Contract™—a sort of ideological bribe that says, “Give us your freedom, and we promise not to let the wolves in.” But what if the wolves are already inside the gates, wearing suits and passing legislation? What if the protection racket is the threat itself?

So no, it is not “human nature” that is the problem. Cancer is natural, too, but we don’t celebrate its tenacity. We treat it, research it, and fight like hell to survive it. Likewise, we must treat pathological power-lust not as an inevitability to be managed but as a disease to be diagnosed and dismantled.

The real scandal isn’t that humans sometimes fail to coöperate. It’s that we’re constantly told we’re incapable of it by those whose power depends on keeping it that way.

Let the ruling classes peddle their myths. The rest of us might just choose to write new ones.