A Critique of Reason (Not to Be Confused with Kant’s)


Kant, bless him, thought he was staging the trial of Reason itself, putting the judge in the dock and asking whether the court had jurisdiction. It was a noble spectacle, high theatre of self-scrutiny. But the trial was always rigged. The presiding judge, the prosecution, the jury, the accused, all wore the same powdered wig. Unsurprisingly, Reason acquitted itself.

The Enlightenment’s central syllogism was never more than a parlour trick:

  • P1: The best path is Reason.
  • P2: I practise Reason.
  • C: Therefore, Reason is best.

It’s the self-licking ice-cream cone of intellectual history. And if you dare to object, the trap springs shut: what, you hate Reason? Then you must be irrational. Inquisitors once demanded heretics prove they weren’t in league with Satan; the modern equivalent is being told you’re “anti-science.” The categories defend themselves by anathematising doubt.

The problem is twofold:

First, Reason never guaranteed agreement. Two thinkers can pore over the same “facts” and emerge with opposite verdicts, each sincerely convinced that Reason has anointed their side. In a power-laden society, it is always the stronger voice that gets to declare its reasoning the reasoning. As Dan Hind acidly observed, Reason is often nothing more than a marketing label the powerful slap on their interests.

Second, and this is the darker point, Reason itself is metaphysical, a ghost in a powdered wig. To call something “rational” is already to invoke an invisible authority, as if Truth had a clerical seal. Alasdair MacIntyre was right: strip away the old rituals and you’re left with fragments, not foundations.

Other witnesses have tried to say as much. Horkheimer and Adorno reminded us that Enlightenment rationality curdles into myth the moment it tries to dominate the world. Nietzsche laughed until his throat bled at the pretence of universal reason, then promptly built his own metaphysics of will. Bruno Latour, in We Have Never Been Modern, dared to expose Science as what it actually is – a messy network of institutions, instruments, and politics masquerading as purity. The backlash was so swift and sanctimonious that he later called it his “worst” book, a public recantation that reads more like forced penance than revelation. Even those who glimpsed the scaffolding had to return to the pews.

So when we talk about “Reason” as the bedrock of Modernity, let’s admit the joke. The bedrock was always mist. The house we built upon it is held up by ritual, inertia, and vested interest, not granite clarity. Enlightenment sold us the fantasy of a universal judge, when what we got was a self-justifying oracle. Reason is not the judge in the courtroom. Reason is the courtroom itself, and the courtroom is a carnival tent – all mirrors, no floor.

Modernity: The Phase That Never Was


We’re told we live in the Enlightenment, that Reason™ sits on the throne and superstition has been banished to the attic. Yet when I disguised a little survey as “metamodern,” almost none came out as fully Enlightened. Three managed to shed every trace of the premodern ghost, one Dutch wanderer bypassed Modernity entirely, and not a single soul emerged free of postmodern suspicion. So much for humanity’s great rational awakening. Perhaps Modernity wasn’t a phase we passed through at all, but a mirage we still genuflect before, a lifestyle brand draped over a naked emperor.

Audio: NotebookLM podcast on this topic

The Enlightenment as Marketing Campaign

The Enlightenment is sold to us as civilisation’s great coming-of-age: the dawn when the fog of superstition lifted and Reason took the throne. Kant framed it as “man’s emergence from his self-incurred immaturity” – an Enlightenment bumper sticker that academics still like to polish and reapply. But Kant wasn’t writing for peasants hauling mud or women without the vote; he was writing for his own coterie of powdered-wig mandarins, men convinced their own habits of rational debate were humanity’s new universal destiny.

Modernity, in this story, isn’t a historical stage we all inhabited. It’s an advertising campaign: Reason™ as lifestyle brand, equality as tagline, “progress” as the logo on the tote bag. The textbooks bill it as an epoch, a kind of secular Pentecost in which the lights came on and we all finally started thinking for ourselves. In practice, it was more of a boutique fantasy, a handful of gentlemen mistaking their own rarefied intellectual posture for humanity’s destiny.

The Archetype That Nobody Lives In

At the core of the Enlightenment lies the archetype of Man™: rational, autonomous, unencumbered by superstition, guided by evidence, weighing pros and cons with the detachment of a celestial accountant. Economics repackaged him as homo economicus, forever optimising his utility function as if he were a spreadsheet in breeches.

But like all archetypes, this figure is a mirage. Our survey data, even when baited as a “metamodern survey”, never produced a “pure” Enlightenment subject:

  • 3 scored 0% Premodern (managing, perhaps, to kick the gods and ghosts to the kerb).
  • 1 scored 0% Modern (the Dutch outlier: 17% Premodern, 0% Modern, 83% Post, skipping the Enlightenment altogether, apparently by bike).
  • 0 scored 0% Postmodern. Every single participant carried at least some residue of suspicion, irony, or relativism.

The averages themselves were telling: roughly 18% Premodern, 45% Modern, 37% Postmodern. That’s not an age of Reason. That’s a muddle, a cocktail of priestly deference, rationalist daydreams, and ironic doubt.

Even the Greats Needed Their Crutches

If the masses never lived as Enlightenment subjects, what about the luminaries? Did they achieve the ideal? Hardly.

  • Descartes, desperate to secure the cogito, called in God as guarantor, dragging medieval metaphysics back on stage.
  • Kant built a cathedral of reason only to leave its foundations propped up by noumena: an unseeable, unknowable beyond.
  • Nietzsche, supposed undertaker of gods, smuggled in his own metaphysics of will to power and eternal recurrence.
  • William James, surveying the wreckage, declared that “truth” is simply “what works”, a sort of intellectual aspirin for the Enlightenment headache.

And economists, in a fit of professional humiliation, pared the rational subject down to a corpse on life support. Homo economicus became a creature who — at the very least, surely — wouldn’t choose to make himself worse off. But behavioural economics proved even that meagre hope to be a fantasy. People burn their wages on scratch tickets, sign up for exploitative loans, and vote themselves into oblivion because a meme told them to.

If even the “best specimens” never fully embodied the rational archetype, expecting Joe Everyman, who statistically struggles to parse a sixth-grade text and hasn’t cracked a book since puberty, to suddenly blossom into a mini-Kant is wishful thinking of the highest order.

The Dual Inertia

The real story isn’t progress through epochs; it’s the simultaneous drag of two kinds of inertia:

  • Premodern inertia: we still cling to sacred myths, national totems, and moral certainties.
  • Modern inertia: we still pretend the rational subject exists, because democracy, capitalism, and bureaucracy require him to.

The result isn’t a new epoch. It’s a cultural chimaera: half-superstitious, half-rationalist, shot through with irony. A mess, not a phase.

Arrow’s Mathematical Guillotine

Even if the Enlightenment dream of a rational demos were real, Kenneth Arrow proved it was doomed. His Impossibility Theorem shows that no ranked voting system can aggregate individual rational preferences into a coherent “general will” without violating at least one minimal fairness condition. In other words, even a parliament of perfect Kants would deadlock when voting on dinner. The rational utopia is mathematically impossible.
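
Deadlock at dinner is easy to make concrete. Below is a minimal Python sketch of the Condorcet cycle, the classic illustration that motivates Arrow’s result (the three voters and the menu are my own invention, not anything from the survey): each voter is perfectly rational, yet pairwise majority voting loops.

```python
# A minimal sketch of the Condorcet cycle: three individually rational
# rankings whose pairwise majorities form a loop, so no coherent
# collective ranking ("general will") exists. Illustrative only.
from itertools import combinations

# Each ballot is a strict ranking, best first.
ballots = [
    ["soup", "salad", "stew"],   # Kant no. 1
    ["salad", "stew", "soup"],   # Kant no. 2
    ["stew", "soup", "salad"],   # Kant no. 3
]

def prefers(ballot, a, b):
    """True if this ballot ranks option a above option b."""
    return ballot.index(a) < ballot.index(b)

for a, b in combinations(ballots[0], 2):
    a_votes = sum(prefers(ballot, a, b) for ballot in ballots)
    winner, loser = (a, b) if 2 * a_votes > len(ballots) else (b, a)
    print(f"majority prefers {winner} over {loser}")

# Prints: soup over salad, stew over soup, salad over stew.
# Every voter is consistent; the parliament is not.
```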

So when we are told that democracy channels Reason, we should hear it as a polite modern incantation, no sturdier than a priest blessing crops.

Equality and the Emperor’s Wardrobe

The refrain comes like a hymn: “All men are created equal.” But the history is less inspiring. “Men” once meant property-owning Europeans; later it was generously expanded to mean all adult citizens who’d managed to stay alive until eighteen. Pass that biological milestone, and voilà — you are now certified Rational, qualified to determine the fate of nations.

And when you dare to question this threadbare arrangement, the chorus rises: “If you don’t like democracy, capitalism, or private property, just leave.” As if you could step outside the world like a theatre where the play displeases you. Heidegger’s Geworfenheit makes the joke bitter: we are thrown into this world without choice, and then instructed to exit if we find the wallpaper distasteful. Leave? To where, precisely? The void? Mars?

The Pre-Modern lord said: Obey, or be exiled. The Modern democrat says: Vote, or leave. And the Post-Enlightenment sceptic mutters: Leave? To where, exactly? Gravity? History? The species? There is no “outside” to exit into. The system is not a hotel; it’s the weather.

Here the ghost of Baudrillard hovers in the wings, pointing out that we are no longer defending Reason, but the simulacrum of Reason. The Emperor’s New Clothes parable once mocked cowardice: everyone saw the nudity but stayed silent. Our situation is worse. We don’t even see that the Emperor is naked. We genuinely believe in the fineries, the Democracy™, the Rational Man™, the sacred textile of Progress. And those who point out the obvious are ridiculed: How dare you mock such fineries, you cad!

Conclusion: The Comfort of a Ghost

So here we are, defending the ghost of a phase we never truly lived. We cling to Modernity as if it were a sturdy foundation, when in truth it was always an archetype – a phantom rational subject, a Platonic ideal projected onto a species of apes with smartphones. We mistook it for bedrock, built our institutions upon it, and now expend colossal energy propping up the papier-mâché ruins. The unfit defend it out of faith in their own “voice,” the elites defend it to preserve their privilege, and the rest of us muddle along pragmatically, dosing ourselves with Jamesian aspirin and pretending it’s progress.

Metamodernism, with its marketed oscillation between sincerity and irony, is less a “new stage” than a glossy rebranding of the same old admixture: a bit of myth, a bit of reason, a dash of scepticism. And pragmatism – James’s weary “truth is what works” – is the hangover cure that keeps us muddling through.

Modernity promised emancipation from immaturity. What we got was a new set of chains: reason as dogma, democracy as ritual, capitalism as destiny. And when we protest, the system replies with its favourite Enlightenment lullaby: If you don’t like it, just leave.

But you can’t leave. You were thrown here. What we call “Enlightenment” is not a stage in history but a zombie-simulation of an ideal that never drew breath. And yet, like villagers in Andersen’s tale, we not only guard the Emperor’s empty wardrobe – we see the garments as real. The Enlightenment subject is not naked. He is spectral, and we are the ones haunting him.

Keeping Ourselves in the Dark: Depressive Realism and the Fiction of Agency

Philosopher Muse brought Colin Feltham to my attention, so I read his Keeping Ourselves in the Dark. It’s in limited supply, so I found an online copy.

So much of modern life rests on promises of improvement. Governments promise progress, religions promise redemption, therapists promise healing. Feltham’s Keeping Ourselves in the Dark (2015) takes a blunt axe to this edifice. In a series of sharp, aphoristic fragments, he suggests that most of these promises are self-deceptions. They keep us busy and comforted, but they do not correspond to the reality of our condition. For Feltham, reality is not an upward arc but a fog – a place of incoherence, accident, and suffering, which we disguise with stories of hope.

Audio: NotebookLM podcast summarising this post.

It is a book that situates itself in a lineage of pessimism. Like Schopenhauer, Feltham thinks life is saturated with dissatisfaction. Like Emil Cioran, he delights in puncturing illusions. Like Peter Wessel Zapffe, he worries that consciousness is an overdeveloped faculty, a tragic gift that leaves us exposed to too much meaninglessness.

Depressive Realism – Lucidity or Illusion?

One of Feltham’s recurring themes is the psychological idea of “depressive realism.” Researchers such as Lauren Alloy and Lyn Abramson suggested that depressed individuals may judge reality more accurately than their non-depressed peers, particularly when it comes to their own lack of control. Where the “healthy” mind is buoyed by optimism bias, the depressed mind may be sober.

Feltham uses this as a pivot: if the depressed see things more clearly, then much of what we call mental health is simply a shared delusion, a refusal to see the world’s bleakness. He is not romanticising depression, but he is deliberately destabilising the assumption that cheerfulness equals clarity.

Here I find myself diverging. Depression is not simply lucidity; it is also, inescapably, a condition of suffering. To say “the depressed see the truth” risks sanctifying what is, for those who live it, a heavy and painful distortion. Following Foucault, I would rather say that “mental illness” is itself a category of social control – but that does not mean the suffering it names is any less real.

Video: Depressive Realism by Philosopher Muse, the impetus for this blog article

Agency Under the Same Shadow

Feltham’s suspicion of optimism resonates with other critiques of human self-concepts. Octavia Butler, in her fiction, often frames “agency” as a structural mirage: we think we choose, but our choices are already scripted by language and power. Jean-Paul Sartre, on the other hand, insists on the opposite extreme: that we are “condemned to be free,” responsible even for our refusal to act. Howard Zinn echoes this in his famous warning that “you can’t be neutral on a moving train.”

My own work, the Language Insufficiency Hypothesis, takes a fourth line. Like Feltham, I doubt that our central myths – agency, freedom, progress – correspond to any stable reality. But unlike him, I do not think stripping them away forces us into depressive despair. The feeling of depression is itself another state, another configuration of affect and narrative. To call it “realistic” is to smuggle in a judgment, as though truth must wound.

Agency, Optimism, and Their Kin

Feltham’s bleak realism has interesting affinities with other figures who unpick human self-mythology:

  • Octavia Butler presents “agency” itself as a kind of structural illusion. From the Oankali’s alien vantage in Dawn, humanity looks like a single destructive will, not a set of sovereign choosers.
  • Sartre, by contrast, radicalises agency: even passivity is a choice; we are condemned to be free.
  • Howard Zinn universalises responsibility in a similar register: “You can’t be neutral on a moving train.”
  • Cioran and Zapffe, like Feltham, treat human self-consciousness as a trap, a source of suffering that no optimistic narrative can finally dissolve.

Across these positions, the common thread is suspicion of the Enlightenment story in which rational agency and progress are guarantors of meaning. Some embrace the myth, some invert it, some discard it.

Dis-integration Rather Than Despair

Where pessimists like Feltham (or Cioran, or Zapffe) tend to narrate our condition as tragic, my “dis-integrationist” view is more Zen: the collapse of our stories is not a disaster but a fact. Consciousness spins myths of control and meaning; when those myths fail, we may feel disoriented, but that disorientation is simply another mode of being. There is no imperative to replace one illusion with another – whether it is progress, will, or “depressive clarity.”

From this perspective, life is not rescued by optimism, nor is it condemned by realism. It is simply flux, dissonance, and transient pattern. The task is not to shore up agency but to notice its absence without rushing to fill the void with either hope or despair.

Four Ways to Mistake Agency

I’ve long wrestled with the metaphysical aura that clings to “agency.” I don’t buy it. Philosophers – even those I’d have thought would know better – keep smuggling it back into their systems, as though “will” or “choice” were some indispensable essence rather than a narrative convenience.

Take the famous mid-century split: Sartre insisted we are “condemned to be free,” and so must spend that freedom in political action; Camus shrugged at the same premise and redirected it toward art, creation in the face of absurdity. Different prescriptions, same underlying assumption – that agency is real, universal, and cannot be escaped.

What if that’s the problem? What if “agency” is not a fact of human being but a Modernist fable, a device designed to sustain certain worldviews – freedom, responsibility, retribution – that collapse without it?

Sartre and Zinn: Agency as Compulsion

Sartre insists: “There are no innocent victims. Even inaction is a choice.” Zinn echoes: “You can’t be neutral on a moving train.” Both lines of rhetoric collapse hesitation, fatigue, or constraint into an all-encompassing voluntarism. The train is rolling, and you are guilty for sitting still.

Feltham’s Depressive Realism

Colin Feltham’s Keeping Ourselves in the Dark extends the thesis: our optimism and “progress” are delusions. He leans into “depressive realism,” suggesting that the depressive gaze is clearer, less self-deceived. Here, too, agency is unmasked as myth – but the myth is replaced with another story, one of lucidity through despair.

A Fourth Position: Dis-integration

Where I diverge is here: why smuggle in judgment at all? Butler, Sartre, Zinn, and Feltham each turn absence into a moral. They inflate or invert “agency” so it remains indispensable. My sense is more Zen: perhaps agency is not necessary. Not as fact, not as fiction, not even as a tragic lack.

Life continues without it. Stabilisers cling to the cart, Tippers tip, Egoists recline, Sycophants ride the wake, Survivors endure. These are dispositions, not decisions. The train moves whether or not anyone is at the controls. To say “you chose” is to mistake drift for will, inertia for responsibility.

From this angle, nihilism doesn’t require despair. It is simply the atmosphere we breathe. Meaning and will are constructs that serve Modernist institutions – law, nation, punishment. Remove them, and nothing essential is lost, except the illusion that we were ever driving.

Octavia E. Butler’s Alien Verdict

Not Judith Butler. In the opening of Dawn, the Oankali tell Lilith: “You committed mass suicide.” The charge erases distinctions between perpetrators, victims, resisters, and bystanders. From their vantage, humanity is one agent, one will. A neat explanation – but a flattening nonetheless.

👉 Full essay: On Agency, Suicide, and the Moving Train

Why Feltham Matters

Even if one resists his alignment of depression with truth, Feltham’s work is valuable as a counterweight to the cult of positivity. It reminds us that much of what we call “mental health” or “progress” depends on not seeing too clearly the futility, fragility, and cruelty that structure our world.

Where he sees darkness as revelation, I see it as atmosphere: the medium in which we always already move. To keep ourselves in the dark is not just to lie to ourselves, but to continue walking the tracks of a train whose destination we do not control. Feltham’s bleak realism, like Butler’s alien rebuke or Sartre’s burden of freedom, presses us to recognise that what we call “agency” may itself be part of the dream.

On Agency, Suicide, and the Moving Train

I’ve been working through the opening chapters of Octavia Butler’s Dawn. At one point, the alien Jdahya tells Lilith, “We watched you commit mass suicide.”*

The line unsettles not because of the apocalypse itself, but because of what it presumes: that “humanity” acted as one, as if billions of disparate lives could be collapsed into a single decision. A few pulled triggers, a few applauded, some resisted despite the odds, and most simply endured. From the alien vantage, nuance vanishes. A species is judged by its outcome, not by the uneven distribution of responsibility that produced it.

This is hardly foreign to us. Nationalism thrives on the same flattening. We won the war. We lost the match. A handful act; the many claim the glory or swallow the shame by association. Sartre takes it further with his “no excuses” dictum: even to do nothing is to choose. Howard Zinn’s “You can’t be neutral on a moving train” makes the same move, cloaked in the borrowed authority of physics. Yet relativity undermines it: on the train, you are still; on the ground, you are moving. Whether neutrality is possible depends entirely on your frame of reference.

What all these formulations share is a kind of metaphysical inflation. “Agency” is treated as a universal essence, something evenly spread across the human condition. But in practice, it is anything but. Most people are not shaping history; they are being dragged along by it.

One might sketch the orientations toward the collective “apple cart” like this:

  • Tippers with a vision: the revolutionaries, ideologues, or would-be prophets who claim to know how the cart should be overturned.
  • Sycophants: clinging to the side, riding the momentum of others’ power, hoping for crumbs.
  • Egoists: indifferent to the cart’s fate, focused on personal comfort, advantage, or escape.
  • Stabilisers: most people, clinging to the cart as it wobbles, preferring continuity to upheaval.
  • Survivors: those who endure, waiting out storms, not out of “agency” but necessity.

The Stabilisers and Survivors blur into the same crowd, the former still half-convinced their vote between arsenic and cyanide matters, the latter no longer believing the story at all. They resemble Seligman’s shocked dogs, conditioned to sit through pain because movement feels futile.

And so “humanity” never truly acts as one. Agency is uneven, fragile, and often absent. Yet whether in Sartre’s philosophy, Zinn’s slogans, or Jdahya’s extraterrestrial indictment, the temptation is always to collapse plurality into a single will: you chose this, all of you. It is neat, rhetorically satisfying, and yet wrong.

Perhaps Butler’s aliens, clinical in their judgment, are simply holding up a mirror to the fictions we already tell about ourselves.


As an aside, this version of the book cover is risible. Not to devolve into identity politics, but Lilith is a dark-skinned woman, not a pale ginger. I can only assume the target science-fiction readership is presumed to prefer white, sapphic-adjacent characters.

I won’t even comment further on the faux-3D title treatment, a relic of 1980s marketing.


*Spoiler Alert: As this statement about mass suicide is a Chapter 2 event, I am not inclined to consider it a spoiler. False alarm.

The Reasonable Person: From Judge Judy to SCOTUS


When I was a child, the United States Supreme Court was still spoken of in hushed, reverent tones, as though nine robed sages in Washington were the Platonic guardians of justice. Impartiality was the word on everyone’s lips, and we were meant to believe that “the law” floated above the grubby realm of politics, as pure and crystalline as the Ten Commandments descending from Sinai.

Audio: NotebookLM podcast on this topic (MP3).

Even then, I didn’t buy it. The whole thing reeked of theatre. And the past few decades have proved that scepticism correct: the Court has become a pantomime. In this robed reality show, nine unelected lawyers cosplaying as oracles interpret the world for us, often by a razor-thin vote that splits exactly along partisan lines. Impartial? Please. A coin toss would be less predictable.

This is why I perked up when I heard Iain McGilchrist, in his recent interview with Curt Jaimungal, wax lyrical about rationality versus reasonableness. Schizophrenia, he tells us, is like a left hemisphere gone berserk, parsing the world in a literalist frenzy without the right hemisphere’s sense of context. The schizophrenic hears a voice in an empty room and, lacking the capacity for metaphor, deduces that it must be the neighbours whispering through the electrical socket. Rational, in its way, but absurd.

Video: Iain McGilchrist and Curt Jaimungal

McGilchrist’s corrective is “reasonableness,” which he casts as the quality of a wise judge: not a slave to mechanistic logic, but able to balance intuition, context, and experience. The problem, of course, is that “reasonable” is one of those delightful weasel words I keep writing about. It claims to be neutral – a universal standard, above the fray – but in practice, it’s just a ventriloquism act. “Reasonable” always turns out to mean what I, personally, consider obvious.

Enter Judge Judy, daytime television’s answer to jurisprudence. Watch her wag a finger and declare, “Any reasonable person would have kept the receipt!” And the studio audience – hand-picked to agree with her every twitch – erupts in applause. It’s reasonableness as spectacle, the mob dressed up as jurisprudence.

Now scale that performance up to SCOTUS. The “reasonable person” test is embedded deep in the common law tradition, but the reasonable person is not you, me, or anyone who has actually missed a bus, pawned a wedding ring, or heard a neighbour’s radio through thin walls. No, the reasonable person is an imaginary, well-groomed gentleman of property whose intuitions happen to dovetail nicely with the prejudices of the bench. The Court, like Judge Judy, insists it is Reason incarnate, when in truth it is reasonableness-by-consensus, a carefully curated consensus at that.

McGilchrist is right that rationality, stripped of context, can lead to absurdity. But in elevating “reasonableness” as if it were a transcendent virtue, he mistakes projection for philosophy. A judge is “reasonable” only when her intuitions rhyme with yours. And when they don’t? Suddenly, she’s a madwoman in robes, and her “reasonableness” is exposed as nothing more than taste disguised as universal law.

The “reasonable person” – whether invoked by the Supreme Court or by Judge Judy – is a ghost that conveniently resembles the speaker. We imagine we’re appealing to some objective standard, when in fact we’re gazing into a mirror. The tragedy of schizophrenia, as McGilchrist notes, is to take metaphor literally. The tragedy of law and politics is the opposite: to dress literal bias in metaphor, to call it “reason,” and to applaud ourselves for our wisdom while the stage set burns behind us.

Democracy: The Worst Form of Government, and Other Bedtime Stories

3–5 minutes

Karl Popper’s Paradox of Tolerance has become a kind of intellectual talisman, clutched like a rosary whenever fascists start goose-stepping into the town square. Its message is simple enough: to preserve tolerance, one must be intolerant of intolerance. Shine enough sunlight on bad ideas, and – so the pious hope – they’ll shrivel into dust like a vampire caught out at dawn.

If only.

The trouble with this Enlightenment fairy tale is that it presumes bad ideas melt under the warm lamp of Reason, as if ignorance were merely a patch of mildew waiting for the bleach of debate. But bad ideas are not bacteria; they are weeds, hydra-headed and delighting in the sun. Put them on television, and they metastasise. Confront them with logic, and they metastasise faster, now with a martyr’s halo.

Audio: NotebookLM podcast on this topic.

And here’s the part no liberal dinner-party theorist likes to face: the people most wedded to these “bad ideas” often don’t play the game of reason at all. Their critical faculties have been packed up, bubble-wrapped, and left in the loft decades ago. They don’t want dialogue. They want to chant. They don’t want evidence. They want affirmation. The Socratic method bounces off them like a ping-pong ball fired at a tank.

But let’s be generous. Suppose, just for a moment, we had Plato’s dream: a citizenry of Philosopher Kings™, all enlightened, all rational. Would democracy then work? Cue Arrow’s Impossibility Theorem, that mathematical killjoy which proves that even under perfect conditions – omniscient voters, saintly preferences, universal literacy – you still cannot aggregate those preferences into a system that is both fair and internally consistent. Democracy can’t even get out of its own way on paper.

Now throw in actual humans. Not the Platonic paragons, but Brexit-uncle at the pub, Facebook aunt with her memes, the American cousin in a red cap insisting a convicted felon is the second coming. Suddenly, democracy looks less like a forum of reasoned debate and more like a lottery machine coughing up numbers while we all pretend they mean “the will of the people.”

And this is where the Churchill quip waddles in, cigar smoke curling round its bowler hat: “Democracy is the worst form of government, except for all the others.” Ah yes, Winston, do please save us with a quip so well-worn it’s practically elevator music. But the problem is deeper than taste in quotations. If democracy is logically impossible (Arrow) and practically dysfunctional (Trump, Brexit, fill in your own national catastrophe), then congratulating ourselves that it’s “better than the alternatives” is simply an admission that we’ve run out of imagination.

Because there are alternatives. A disinterested AI, for instance, could distribute resources with mathematical fairness, free from lobbyists and grievance-mongers. Nursery schools versus nursing homes? Feed in the data, spit out the optimal allocation. No shouting matches, no demagoguery, no ballots stuffed with slogans. But here the defenders of democracy suddenly become Derrida in disguise: “Ah, but what does fair really mean?” And just like that, we are back in the funhouse of rhetorical mirrors where “fair” is a word everyone loves until it costs them something.

So perhaps democracy doesn’t require an “educated populace” at all; that was always just sugar-paper wrapping. It requires, instead, a population sufficiently docile, sufficiently narcotised by the spectacle, to accept the carnival of elections as a substitute for politics. Which is why calling the devotees of a Trump, or any other demagogue, a gaggle of lemmings is both accurate and impolitic: they know they’re not reasoning; they’re revelling. Your contempt merely confirms the script they’ve already written for you.

Video: Short callout to Karl Popper and Hilary Lawson.

The philosopher, meanwhile, is left polishing his lantern, muttering about reason to an audience who would rather scroll memes about paedophile pizza parlours. Popper warned us that tolerance cannot survive if it tolerates its own annihilation. Arrow proved that even if everyone were perfectly reasonable, the maths would still collapse. And Churchill, bless him, left us a one-liner to make it all seem inevitable.

Perhaps democracy isn’t the worst form of government except for all the others. Perhaps it’s simply the most palatable form of chaos, ballots instead of barricades, polling booths instead of pitchforks. And maybe the real scandal isn’t that people are too stupid for democracy, but that democracy was never designed to be about intelligence in the first place. It was always about managing losers while telling them they’d “had their say.”

The Enlightenment promised us reason; what it delivered was a carnival where the loudest barker gets the booth. The rest of us can either keep muttering about paradoxes in the corner or admit that the show is a farce and start imagining something else.

Cogito, Ergo… Who?

Everyone knows the line: cogito ergo sum. Descartes’ great party trick. A man alone in his study, fretting about demons, announces that because he’s doubting, he must exist. Ta-da! Curtain call. Except, of course, it’s less of a revelation than a conjuring trick: he pulls an I out of a hat that was never proved to be there in the first place. Thinking is happening, indeed – but who invited the “thinker”?

Video: David Guignion talks about Descartes’ Cogito.

And let’s not forget the dramatis personae Descartes smuggles in for atmosphere. A malicious demon, a benevolent God, both necessary props to justify his paranoia and his certainty. Philosophy as melodrama: cue organ music, lightning strike.

Audio: NotebookLM podcast on this topic.

Enter the Critics

Spinoza rolls his eyes. Doubt isn’t some heroic starting point, he says – it’s just ignorance, a lack of adequate ideas. To elevate doubt into method is like treating vertigo as a navigational tool. Error isn’t demonic trickery; it’s our own confusion.

Kant arrives next, shaking his head. Descartes thinks he’s proven a substantial “I,” but all he’s actually shown is the form of subjectivity – the empty requirement that experiences hang together. The “I think” is a necessary placeholder, not a discovery. A grammatical “you are here” arrow, not a metaphysical treasure chest.

Hegel, of course, can’t resist upping the disdain. Descartes’ I is an empty abstraction, a hollow balloon floating above reality. The self isn’t given in some solitary moment of doubt; it emerges through process – social, historical, dialectical. The cogito is the philosophical equivalent of a selfie: lots of certainty, zero depth.

The Insufficiency Twist

And yet, maybe all of them are still dancing to the same fiddler. Because here’s the real suspicion: what if the whole problem is a trick of language? English, with its bossy Indo-European grammar, refuses to let verbs stand alone. “Thinking” must have a “thinker,” “seeing” a “seer.” Grammar insists on a subject; ontology obediently provides one.

Other languages don’t always play this game. Sanskrit or Pali can shrug and say simply, “it is seen.” Japanese leaves subjects implied, floating like ghosts. Some Indigenous languages describe perception as relational events – “seeing-with-the-tree occurs” – no heroic subject required. So perhaps the real villain here isn’t Descartes or even metaphysics, but syntax itself, conscripting us into a subject-shaped theatre.

Now, I don’t want to come off like a one-trick pony, forever waving the flag of “language insufficiency” like some tired philosopher’s catchphrase. But we should be suspicious when our limited grammar keeps painting us into corners, insisting on perceivers where maybe there are only perceptions, conjuring selves because our verbs can’t tolerate dangling.

Curtain Call

So in the end, Descartes’ famous “I” might be no more than a grammatical fiction, a casting error in the great play of philosophy. The cogito isn’t the foundation of modern thought; it’s the world’s most influential typo.

The Red Flag of Truth

Nothing says “I’ve stopped thinking” quite like someone waving the banner of Truth. The word itself, when capitalised and flapped about like a holy relic, isn’t a signal of wisdom but of closure. A red flag.

The short video by Jonny Thompson that inspired this post.

Those who proclaim to “speak the Truth” or “know the Truth” rarely mean they’ve stumbled upon a tentative insight awaiting refinement. No, what they mean is: I have grasped reality in its totality, and—surprise!—it looks exactly like my prejudices. It’s the epistemic equivalent of a toddler declaring ownership of the playground by drooling on the swings.

The Fetish of Objectivity

The conceit is that Truth is singular, objective, eternal, a monolithic obelisk towering over human folly. But history’s scrapyard is full of such obelisks, toppled and broken: phlogiston, bloodletting, Manifest Destiny, “the market will regulate itself.” Each was once trumpeted as capital-T Truth. Each is now embarrassing clutter for the dustbin.

Still, the zealots never learn. Every generation delivers its own batch of peddlers, flogging their version of Truth as if it were snake oil guaranteed to cure ignorance and impotence. (Side effects may include dogmatism, authoritarianism, and an inability to read the room.)

Why It’s a Red Flag

When someone says, “It’s just the truth”, what they mean is “I am not listening”, like the parent who argues, “because I said so.” Dialogue is dead; curiosity cremated. Truth, in their hands, is less a lantern than a cosh. It is wielded not to illuminate, but to bludgeon.

Ralph Waldo Emerson’s voice breaks in, urging us to trust ourselves and to think for ourselves. Nothing is more degrading than to borrow another’s convictions wholesale and parade them as universal law. Better to err in the wilderness of one’s own reason than to be shepherded safely into another man’s paddock of certainties.

A Better Alternative

Rather than fetishising Truth, perhaps we ought to cultivate its neglected cousins: curiosity, provisionality, and doubt. These won’t look as good on a placard, admittedly. Picture a mob waving banners emblazoned with Ambiguity! – not exactly the stuff of revolutions. But infinitely more honest, and infinitely more humane.

So when you see someone waving the flag of Truth, don’t salute. Recognise it for what it is: a warning sign. Proceed with suspicion, and for God’s sake, bring Emerson.

Don’t Tread on My Ass

Absolute liberty means absolute liberty, but what if the liberty you seek is death? The moment you carve out exceptions – speech you can’t say, choices you can’t make, exits you can’t take – you’ve left the realm of liberty and entered the gated community of permission.

Video: YouTube video by Philosopher Muse.

And here’s the test most self-styled liberty lovers fail: you’re free to skydive without a parachute, but try ending your life peacefully and watch how quickly the freedom brigade calls in the moral SWAT team.

I’m not his usual audience; I’m already in the choir, but this eight-minute clip by Philosopher Muse is worth your time. It’s a lucid walk through the ethical terrain mapped by Sarah Perry in Every Cradle Is a Grave, and it’s one of the better distillations of antinatalist thought I’ve seen for the general public. Perry’s libertarian starting point is straightforward: if you truly own your life, you must also have the right to give it up.

He threads in the dark-glimmer insights of Emil Cioran’s poetic despair, Thomas Ligotti’s existential horror, David Benatar’s asymmetry, and Peter Wessel Zapffe’s tragic consciousness. Together they point to an uncomfortable truth: autonomy that stops short of death isn’t autonomy at all; it’s a petting zoo of freedoms where the gate is locked.

I’ve said this before, but it bears repeating. I once had a girlfriend who hated her life but was too afraid of hell to end it. She didn’t “pull through.” She overdosed by accident. Loophole closed, I suppose. That’s what happens when metaphysical prohibitions are allowed to run the operating system.

And here’s where I diverge from the purist libertarians. I don’t believe most people are competent enough to have the liberty they think they deserve. Not because they’re all dribbling idiots, but because they’ve been marinated for generations in a stew of indoctrination. For thousands of years, nobody talked about “liberty” or “freedom” as inalienable rights. Once the notion caught on, it was packaged and sold – complete with an asterisk, endless fine print, and a service desk that’s never open.

We tell ourselves we’re free, but only in the ways that don’t threaten the custodians. You can vote for whoever the party machine serves up, but you can’t opt out of the game. You can live any way you like, as long as it looks enough like everyone else’s life. You can risk death in countless state-approved ways, but the moment you calmly choose it, your autonomy gets revoked.

So yes, watch the video. Read Perry’s Every Cradle Is a Grave. Then ask yourself whether your liberty is liberty, or just a longer leash.

If liberty means anything, it means the right to live and the right to leave. The former without the latter is just life imprisonment with better marketing.

Ages of Consent: A Heap of Nonsense

A response on another social media site got me thinking about the Sorites paradox again. The notion just bothers me. I’ve long held that it is less a paradox than an intellectually lazy way to manoeuvre around language insufficiency.

<rant>

The law loves a nice, clean number. Eighteen to vote. Sixteen to marry. This-or-that to consent. As if we all emerge from adolescence on the same morning like synchronised cicadas, suddenly equipped to choose leaders, pick spouses, and tell the bad lovers from the good ones.

But the Sorites paradox gives the game away: if you’re fit to vote at 18 years and 0 days, why not at 17 years, 364 days? Why not 17 years, 363 days? Eventually, you’re handing the ballot to a toddler who thinks the Prime Minister is Peppa Pig. Somewhere between there and adulthood, the legislator simply throws a dart and calls it “science.”
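
For the morbidly literal, here is a minimal Python sketch of that dart-throw (the day-count threshold is my own illustrative stand-in, not any particular statute): competence, as the law models it, flips on a single day, which is precisely the arbitrariness the Sorites induction exploits.

```python
# A minimal sketch of the legal cliff-edge the Sorites induction attacks.
# The threshold below is an illustrative stand-in, not a real statute.
LEGAL_THRESHOLD_DAYS = 18 * 365  # the legislator's dart-throw

def fit_to_vote(age_in_days: int) -> bool:
    """The law's model of competence: a cliff-edge, not a curve."""
    return age_in_days >= LEGAL_THRESHOLD_DAYS

# The Sorites step: no single day plausibly flips competence...
age = LEGAL_THRESHOLD_DAYS
while fit_to_vote(age):
    age -= 1  # "...so why not one day younger?"

# ...yet the statute flips exactly here:
print(f"competent at {age + 1} days, incompetent at {age} days")
```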

To bolster this fiction, we’re offered pseudo-facts: “Women mature faster than men”, or “Men’s brains don’t finish developing until thirty.” These claims, when taken seriously, only undermine the case for a single universal threshold. If “maturity” were truly the measure, we’d have to track neural plasticity curves, hormonal arcs, and a kaleidoscope of individual factors. Instead, the state settles for the cheapest approximation: a birthday.

This obsession with fixed thresholds is the bastard child of Enlightenment rationalism — the fantasy that human variation can be flattened into a single neat line on a chart. The eighteenth-century mind adored universals: universal reason, universal rights, universal man. In this worldview, there must be one age at which all are “ready,” just as there must be one unit of measure for a metre or a kilogram. It is tidy, legible, and above all, administratively convenient.

Cue the retorts:

  • “We need something.” True, but “something” doesn’t have to mean a cliff-edge number. We could design systems of phased rights, periodic evaluations, or contextual permissions — approaches that acknowledge people as more than interchangeable cut-outs from a brain-development chart.
  • “It would be too complicated.” Translation: “We prefer to be wrong in a simple way than right in a messy way.” Reality is messy. Pretending otherwise isn’t pragmatism; it’s intellectual cowardice. Law is supposed to contend with complexity, not avert its gaze from it.

And so we persist, reducing a continuous, irregular, and profoundly personal process to an administratively convenient fiction — then dressing it in a lab coat to feign objectivity. A number is just a number, and in this case, a particularly silly one.

</rant>