Defying Death

I died in March 2023 — or so the rumour mill would have you believe.

Of course, given that I’m still here, hammering away at this keyboard, it must be said that I didn’t technically die. We don’t bring people back. Death, real death, doesn’t work on a “return to sender” basis. Once you’re gone, you’re gone, and the only thing bringing you back is a heavily fictionalised Netflix series.

Audio: NotebookLM podcast of this content.

No, this is a semantic cock-up, yet another stinking exhibit in the crumbling Museum of Language Insufficiency. “I died,” people say, usually while slurping a Pumpkin Spice Latte and live-streaming their trauma to 53 followers. What they mean is that they flirted with death, clumsily, like a drunk uncle at a wedding. No consummation, just a lot of embarrassing groping at the pearly gates.

And since we’re clarifying terms: there was no tunnel of light, no angels, no celestial choir belting out Coldplay covers. No bearded codgers in slippers. No 72 virgins. (Or, more plausibly, 72 incels whining about their lack of Wi-Fi reception.)

There was, in fact, nothing. Nothing but the slow, undignified realisation that the body, that traitorous meat vessel, was shutting down — and the only gates I was approaching belonged to A&E, with its flickering fluorescent lights and a faint smell of overcooked cabbage.

To be fair, it’s called a near-death experience (NDE) for a reason. Language, coward that it is, hedges its bets. “Near-death” means you dipped a toe into the abyss and then screamed for your mummy. You didn’t die. You loitered. You loitered in the existential equivalent of an airport Wetherspoons, clutching your boarding pass and wondering why the flight to Oblivion was delayed.

As the stories go, people waft into the next world and are yanked back with stirring tales of unicorns, long-dead relatives, and furniture catalogues made of clouds. I, an atheist to my scorched and shrivelled soul, expected none of that — and was therefore not disappointed.

What I do recall, before the curtain wobbled, was struggling for breath, thinking, “Pick a side. In or out. But for pity’s sake, no more dithering.”
In a last act of rational agency, I asked an A&E nurse — a bored-looking Athena in scrubs — to intubate me. She responded with the rousing medical affirmation, “We may have to,” which roughly translates to, “Stop making a scene, love. We’ve got fifteen others ahead of you.”

After that, nothing. I was out. Like a light. Like a minor character in a Dickens novel whose death is so insignificant it happens between paragraphs.

I woke up the next day: groggy, sliced open, a tube rammed down my throat, and absolutely no closer to solving the cosmic riddle of it all. Not exactly the triumphant return of Odysseus. Not even a second-rate Ulysses.

Here’s the reality:
There is no coming back from death.
You can’t “visit” death, any more than you can spend the afternoon being non-existent and return with a suntan.

Those near-death visions? Oxygen-starved brains farting out fever dreams. Cerebral cortices short-circuiting like Poundland fairy lights. Hallucinations, not heralds. A final, frantic light show performed for an audience of none.

Epicurus, that cheerful nihilist, said, “When we are, death is not. When death is, we are not.” He forgot to mention that, in between, people would invent entire publishing industries peddling twaddle about journeys beyond the veil — and charging $29.99 for the paperback edition.

No angels. No harps. No antechamber to the divine.
Just the damp whirr of hospital machinery and the faint beep-beep of capitalism, patiently billing you for your own demise.

If there’s a soundtrack to death, it’s not choirs of the blessed. It’s a disgruntled junior surgeon muttering, “Where the hell’s the anaesthetist?” while pawing desperately through a drawer full of out-of-date latex gloves.

And thus, reader, I lived.
But only in the most vulgar, anticlimactic, and utterly mortal sense.

There will be no afterlife memoir. No second chance to settle the score. No sequel.
Just this: breath, blood, occasional barbed words — and then silence.

Deal with it.

Questioning Traditional Families

I neither champion nor condemn tradition—whether it’s marriage, family, or whatever dusty relic society is currently parading around like a prize marrow at a village fête.

Audio: NotebookLM podcast on traditional families.

In a candid group conversation recently, I met “Jenny”, who declared she would have enjoyed her childhood much more had her father not “ruined everything” simply by existing. “Marie” countered that it was her mother who had been the wrecker-in-chief. Then “Lulu” breezed in, claiming, “We had a perfect family — we practically raised ourselves.”

Now, here’s where it gets delicious:

Each of these women, bright-eyed defenders of “traditional marriage” and “traditional family” (cue the brass band), had themselves ticked every box on the Modern Chaos Bingo Card: children out of wedlock? Check. Divorces? Check. Performative, cold-marriage pantomimes? Absolutely—and scene.
Their definition of “traditional marriage” is the vintage model: one cis-male, one cis-female, Dad brings home the bacon, Mum weeps quietly into the washing-up. Standard.

Let’s meet the players properly:

Jenny sprang from a union of two serial divorcés, each dragging along the tattered remnants of previous families. She was herself a “love child,” born out of wedlock and “forcing” another reluctant stroll down the aisle. Her father? A man of singular achievements: he paid the bills and terrorised the household. Jenny now pays a therapist to untangle the psychological wreckage.

Marie, the second of two daughters, was the product of a more textbook “traditional family”—if by textbook you mean a Victorian novel where everyone is miserable but keeps a stiff upper lip about it. Her mother didn’t want children but acquiesced to her husband’s demands (standard operating procedure at the time). Marie’s childhood was a kingdom where Daddy was a demigod and Mummy was the green-eyed witch guarding the gates of hell.

Lulu grew up in a household so “traditional” that it might have been painted by Hogarth: an underemployed, mostly useless father and a mother stretched thinner than the patience of a British Rail commuter. Despite—or because of—the chaos, Lulu claims it was “perfect,” presumably redefining the word in a way the Oxford English Dictionary would find hysterical. She, too, had a child out of wedlock, with the explicit goal of keeping feckless men at bay.

And yet—and yet—all three women cling, white-knuckled, to the fantasy of the “traditional family.” They did not achieve stability. Their families of origin were temples of dysfunction. But somehow, the “traditional family” remains the sacred cow, lovingly polished and paraded on Sundays.

Why?

Because what they’re chasing isn’t “tradition” at all — it’s stability, that glittering chimera. It’s nostalgia for a stability they never actually experienced. A mirage constructed from second-hand dreams, glossy 1950s propaganda, and whatever leftover fairy tales their therapists hadn’t yet charged them £150 an hour to dismantle.

Interestingly, none of them cared two figs about gay marriage, though opinions about gay parenting varied wildly—a kettle of fish I’ll leave splashing outside this piece.

Which brings us back to the central conundrum:

If lived experience tells you that “traditional family” equals trauma, neglect, and thinly veiled loathing, why in the name of all that’s rational would you still yearn for it?

Societal pressure, perhaps. Local customs. Generational rot. The relentless cultural drumbeat that insists that marriage (preferably heterosexual and miserable) is the cornerstone of civilisation.

Still, it’s telling that Jenny and Marie were both advised by therapists to cut ties with their toxic families—yet in the same breath urged to create sturdy nuclear families for their own children. It was as if summoning a functional household from the smoking ruins of dysfunction were a simple matter of willpower and a properly ironed apron.

Meanwhile, Lulu—therapy-free and stubbornly independent—declares that raising oneself in a dysfunctional mess is not only survivable but positively idyllic. One can only assume her standards of “perfect” are charmingly flexible.

As the title suggests, this piece questions traditional families. I offer no solutions—only a raised eyebrow and a sharper question:

What is the appeal of clinging to a fantasy so thoroughly at odds with reality?
Your thoughts, dear reader? I’d love to hear your defences, your protests, or your own tales from the trenches.

Lies as Shibboleth

Watching Sam Harris ruminate on the nature of political lies (still believing, poor lamb, that reason might one day triumph) reminds me of something more sinister: lies today are not attempts at persuasion. They are shibboleths — tribal passwords, loyalty oaths, secret handshakes performed in the broad light of day.

Video: Sam Harris tells us why Trump and his ilk lie.

Forget “alternative facts.” That charming euphemism was merely a decoy, a jangling set of keys to distract the infantile media. The real game was always deeper: strategic distortion, the deliberate blurring of perception not to deceive the outsider, but to identify the insider.

Audio: NotebookLM podcast on this topic.

When Trump — or any other post-truth demagogue — proclaims that penguins are, in fact, highly trained alien operatives from the Andromeda galaxy, the objective is not persuasion. The point is to force a choice: will you, standing before this glistening absurdity, blink and retreat into reason, stammering something about ornithology? Or will you step forward, clasp the hand of madness, and mutter, “Yes, my liege, the penguins have been among us all along”?

Those who demur, those who scoff or gasp or say “You’re an idiot,” have failed the loyalty test. They have outed themselves as enemy combatants in the epistemic war. Truth, in this brave new world, is not a destination; it is an allegiance. To speak honestly is to wage rebellion.

Orwell, who tried very hard to warn us, understood this dynamic well: the real triumph of Big Brother was not merely to compel you to lie but to compel you to believe the lie. Koestler, another battered prophet of the age, charted how political movements sink into ritualistic unreason, demanding not conviction but performance. Swift, for his part, knew it was all hilarious if you tilted your head just right.

The bigger the lie, the better the shibboleth. Claim that two and two make five, and you catch out the weak-willed rationalists. Claim that penguins are extraterrestrials, and you find the truly devoted, the ones willing to build altars from ice and sacrifice to their feathery overlords.

It’s no accident that modern political theatre resembles a deranged initiation ritual. Each day brings a new absurdity, a fresh madness to affirm: “Men can become women by declaration alone!” “Billionaires are victims of systemic oppression!” “The penguins are amongst us, plotting!” Each claim a little more grotesque than the last, each compliance a little more degrading, a little more irreversible.

And oh, how eagerly the initiates rush forward! Clap for the penguins, or be cast out into the howling wilderness! Better to bend the knee to absurdity than be marked as an unbeliever. Better to humiliate yourself publicly than to admit that the Emperor’s penguin suit is just a costume.

Meanwhile, the opposition — earnest, naive — keeps trying to argue, to rebut, to point out that penguins are terrestrial flightless birds. How quaint. How pathetic. They do not understand that the moment they say, “You’re an idiot,” they’ve broken the spell, declared themselves apostates, and rendered themselves politically irrelevant.

The shibboleth, once uttered, divides the world cleanly: the believers, who will say anything, do anything, believe anything, provided it marks them safe from exile; and the infidels, who cling stupidly to reality.

The future belongs, not to the true, but to the loyal. Not to the rational, but to the ritualistic. The more extravagant the lie, the greater the proof of your faith.

So raise a glass to the penguins, ye of faint heart, and prepare your soul for abasement. Or stand firm, if you dare, and be prepared to be eaten alive by those who traded reason for the rapture of belonging.

After all, in the land of the blind, the one-eyed man is not king. He’s a heretic.


Flat-Earth Politics in a Cubic World

Audio: NotebookLM podcast on this topic.

What’s Probability?

The contestation over the definition of probability is alive and well—like a philosophical zombie that refuses to lie down and accept the tranquilliser of consensus. Despite over three centuries of intense mathematical, philosophical, and even theological wrangling, no single, universally accepted definition reigns supreme. Instead, we have a constellation of rival interpretations, each staking its claim on the epistemological turf, each clutching its own metaphysical baggage.

Audio: NotebookLM podcast on this topic.

Let us survey the battlefield:

1. Classical Probability (Laplacean Determinism in a Tuxedo)

This old warhorse defines probability as the ratio of favourable outcomes to possible outcomes, assuming all outcomes are equally likely. The problem? That assumption is doing all the heavy lifting, like a butler carrying a grand piano up five flights of stairs. It’s circular: we define probability using equiprobability, which itself presumes a notion of probability. Charming, but logically suspect.
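In symbols, with a finite outcome space $\Omega$ and an event $A \subseteq \Omega$, the classical definition is just counting:

$$P(A) = \frac{|A|}{|\Omega|}$$

A fair die gives $P(\text{even}) = 3/6 = 1/2$, but only because we have already decreed the six faces “equally likely”, which is precisely the circularity in question.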

2. Frequentist Probability (The Empiricist’s Fantasy)

Here, probability is the limit of relative frequencies as the number of trials tends to infinity. This gives us the illusion of objectivity—but only in a Platonic realm where we can conduct infinite coin tosses without the coin disintegrating or the heat death of the universe intervening. Also, it tells us nothing about singular cases. What’s the probability this specific bridge will collapse? Undefined, says the frequentist, helpfully.
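Formally, the frequentist identifies probability with a limiting relative frequency: if $n_A$ counts occurrences of $A$ in $n$ trials, then

$$P(A) = \lim_{n \to \infty} \frac{n_A}{n}$$

The limit is the Platonic part: no finite run of experiments ever reaches it, and a one-off event offers no run at all.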

3. Bayesian Probability (Subjectivity Dressed as Rigour)

Bayesians treat probability as a degree of belief—quantified plausibility updated with evidence. This is useful, flexible, and epistemically honest, but also deeply subjective. Two Bayesians can start with wildly different priors and, unless carefully constrained, remain in separate probabilistic realities. It’s like epistemology for solipsists with calculators.
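The engine under the bonnet is Bayes’ theorem, $P(H \mid E) = P(E \mid H)\,P(H)/P(E)$, and the subjectivity lives in the prior $P(H)$. Here is a minimal sketch, with invented numbers, of two Bayesians watching the same evidence and landing in different probabilistic realities:

```python
# Two Bayesians, one piece of evidence, two posteriors.
# Illustrative numbers only; the update is Bayes' rule in odds form.

def posterior(prior: float, likelihood_ratio: float) -> float:
    """Update P(H) given evidence E, where likelihood_ratio = P(E|H) / P(E|not H)."""
    odds = (prior / (1 - prior)) * likelihood_ratio
    return odds / (1 + odds)

likelihood_ratio = 4.0  # the evidence is four times likelier if H is true

for name, prior in [("Credulous", 0.50), ("Sceptical", 0.01)]:
    print(f"{name}: prior {prior:.2f} -> posterior {posterior(prior, likelihood_ratio):.2f}")

# Credulous: prior 0.50 -> posterior 0.80
# Sceptical: prior 0.01 -> posterior 0.04
```

Same evidence, same arithmetic, different worlds; only a long diet of further evidence, or carefully constrained priors, closes the gap.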

4. Propensity Interpretation (The Ontology of Maybes)

Karl Popper and his ilk proposed that probability is a tendency or disposition of a physical system to produce certain outcomes. Sounds scientific, but try locating a “propensity” in a particle collider—it’s a metaphysical ghost, not a measurable entity. Worse, it struggles with repeatability and relevance outside of controlled environments.

5. Logical Probability (A Sober Attempt at Rationality)

Think of this as probability based on logical relations between propositions—à la Keynes or Carnap. It aims to be objective without being empirical. The problem? Assigning these logical relations is no easier than choosing priors in Bayesianism, and just as subjective when it comes to anything meaty.

6. Quantum Probability (Schrödinger’s Definition)

In quantum mechanics, probability emerges from the squared modulus of a wave function—so this is where physics says, “Shut up and calculate.” But this doesn’t solve the philosophical issue—it just kicks the can into Hilbert space. Interpretations of quantum theory (Copenhagen? Many Worlds?) embed different philosophies of probability, so the contestation merely changes battlegrounds.
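Concretely, the Born rule: write the state in the measurement basis as $|\psi\rangle = \sum_i c_i\,|i\rangle$; then the probability of outcome $i$ is

$$p_i = |c_i|^2, \qquad \sum_i |c_i|^2 = 1$$

The recipe is unambiguous. What the $p_i$ are (long-run frequencies? propensities? branch weights?) is exactly where the interpretive hostilities resume.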

Current Status: War of Attrition

There is no universal agreement, and likely never will be. Probability is used successfully across the sciences, economics, AI, and everyday reasoning—but the fact that these wildly different interpretations all “work” suggests that the concept is operationally robust yet philosophically slippery. Like money, love, or art, we use it constantly but define it poorly.

In short: the contestation endures because probability is not one thing—it is a shape-shifting chimera that serves multiple masters. Each interpretation captures part of the truth, but none hold it entire. Philosophers continue to argue, mathematicians continue to formalise, and practitioners continue to deploy it as if there were no disagreement at all.

And so the probability of this contest being resolved any time soon?
About zero.
Or one.
Depending on your interpretation.

Against the Intelligence Industrial Complex

Why IQ is Not Enough – and Never Was

I’m not a fan of IQ as a general metric. Let us be done with the cult of the clever. Let us drag the IQ score from its pedestal, strip it of its statistical robes, and parade it through the streets of history where it belongs—next to phrenology, eugenics, and other well-meaning pseudosciences once weaponised by men in waistcoats.

The so-called Intelligence Industrial Complex—an infernal alliance of psychologists, bureaucrats, and HR departments—has for too long dictated the terms of thought. It has pretended to measure the immeasurable. It has sold us a fiction in numerical drag: that human intelligence can be distilled, packaged, and ranked.

Audio: NotebookLM podcast on this topic.

What it measures, it defines. What it defines, it controls.

IQ is not intelligence. It is cognitive GDP: a snapshot of what your brain can do under fluorescent lights with a timer running. It rewards abstraction, not understanding; speed, not depth; pattern recognition, not wisdom. It’s a test of how well you’ve been conditioned to think like the test-makers.

This is not to say IQ has no value. Of course it does—within its own ecosystem of schools, bureaucracies, and technocracies. But let us not mistake the ruler for the terrain. Let us not map the entire landscape of human potential using a single colonial compass.

True intelligence is not a number. It is a spectrum of situated knowings, a polyphony of minds tuned to different frequencies. The Inuit hunter tracking a seal through silence. The griot remembering centuries of lineage. The autistic coder intuiting an algorithm in dreamtime. The grandmother sensing a lie with her bones. IQ cannot touch these.

To speak of intelligence as if it belonged to a single theory is to mistake a monoculture for a forest. Let us burn the monoculture. Let us plant a thousand new seeds.

A Comparative Vivisection of Intelligence Theories

| Theory / Model | Core Premise | Strengths | Blind Spots / Critiques | Cultural Framing |
| --- | --- | --- | --- | --- |
| IQ (Psychometric g) | Intelligence is a single, general cognitive ability measurable via testing | Predicts academic & job performance; standardised | Skewed toward Western logic; ignores context; devalues non-abstract intelligences | Western, industrial, meritocratic |
| Multiple Intelligences (Gardner) | Intelligence is plural: linguistic, spatial, musical, bodily, etc. | Recognises diversity; challenges IQ monopoly | Still individualistic; categories often vague; Western in formulation | Liberal Western pluralism |
| Triarchic Theory (Sternberg) | Intelligence = analytical + creative + practical | Includes adaptability, real-world success | Still performance-focused; weak empirical grounding | Western managerial |
| Emotional Intelligence (Goleman) | Intelligence includes emotion regulation and interpersonal skill | Useful in leadership & education contexts | Commodified into corporate toolkits; leans self-help | Western therapeutic |
| Socio-Cultural (Vygotsky) | Intelligence develops through social interaction and cultural mediation | Recognises developmental context and culture | Less attention to adult or cross-cultural intelligence | Soviet / constructivist |
| Distributed Cognition / Extended Mind | Intelligence is distributed across people, tools, systems | Breaks skull-bound model; real-world cognition | Hard to measure; difficult to institutionalise | Post-cognitive, systems-based |
| Indigenous Epistemologies | Intelligence is relational, ecological, spiritual, embodied, ancestral | Holistic; grounded in lived experience | Marginalised by academia; often untranslatable into standard metrics | Global South / decolonial |

Conclusion: Beyond the Monoculture of Mind

If we want a more encompassing theory of intelligence, we must stop looking for a single theory. We must accept plurality—not as a nod to diversity, but as an ontological truth.

Intelligence is not a fixed entity to be bottled and graded. It is a living, breathing phenomenon: relational, situated, contextual, historical, ecological, and cultural.

And no test devised in a Princeton psych lab will ever tell you how to walk through a forest without being seen, how to tell when rain is coming by smell alone, or how to speak across generations through story.

It’s time we told the Intelligence Industrial Complex: your number’s up.

Will Singularity Be Anticlimactic?

Given current IQ trends, humanity is getting dumber. Let’s not mince words. This implies the AGI singularity—our long-heralded techno-apotheosis—will arrive against a backdrop of cognitive decay. A dimming species, squinting into the algorithmic sun.

Audio: NotebookLM podcast discussing this content.

Now, I’d argue that AI—as instantiated in generative models like Claude and ChatGPT—already outperforms at least half of the human population. Likely more. The only question worth asking is this: at what percentile does AI need to outperform the human herd to qualify as having “surpassed” us?

Living in the United States, I’m painfully aware that the average IQ hovers somewhere in the mid-90s—comfortably below the global benchmark of 100. If you’re a cynic (and I sincerely hope you are), this explains quite a bit. The declining quality of discourse. The triumph of vibes over facts. The national obsession with astrology apps and conspiracy podcasts.
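For the record, IQ scores are conventionally normed to a normal distribution with mean 100 and standard deviation 15, so converting a score to a percentile is a one-liner. A rough sketch, assuming only that norming convention:

```python
# Convert IQ scores to population percentiles, assuming the standard
# norming convention: IQ ~ Normal(mean=100, sd=15).
from statistics import NormalDist

iq = NormalDist(mu=100, sigma=15)

for score in (95, 100, 115, 130):
    # cdf(score) = fraction of the population scoring at or below `score`
    print(f"IQ {score}: outperforms roughly {iq.cdf(score):.0%} of the population")

# IQ 95: ~37%; IQ 100: 50%; IQ 115: ~84%; IQ 130: ~98%
```

Wherever you choose to set the “surpassed” threshold, the same curve converts it into an IQ score and back again.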

Harvard astronomer Avi Loeb argues that as humans outsource cognition to AI, they lose the capacity to think. It’s the old worry: if the machines do the heavy lifting, we grow intellectually flaccid. There are two prevailing metaphors. One, Platonic in origin, likens cognition to muscle—atrophying through neglect. Plato himself worried that writing would ruin memory. He wasn’t wrong.

But there’s a counterpoint: the cooking hypothesis. Once humans learned to heat food, digestion became easier, freeing up metabolic energy to grow bigger brains. In this light, AI might not be a crutch but a catalyst—offloading grunt work to make space for higher-order thought.

So which is it? Are we becoming intellectually enfeebled? Or are we on the cusp of a renaissance—provided we don’t burn it all down first?

Crucially, most people don’t use their full cognitive capacity anyway. So for the bottom half—hell, maybe the bottom 70%—nothing is really lost. No one’s delegating their calculus homework to ChatGPT if they were never going to attempt it themselves. For the top 5%, AI is already a glorified research assistant—a handy tool, not a replacement.

The real question is what happens to the middle band. The workaday professionals. The strivers. The accountants, engineers, copywriters, and analysts hovering between the 70th and 95th percentiles—assuming our crude IQ heuristics even hold. They’re the ones who have just enough brainpower to be displaced.
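Run the same normal-curve assumption in reverse and that middle band acquires numbers:

```python
# Which IQ scores bracket the 70th-95th percentile band?
# Same assumption as before: IQ ~ Normal(mean=100, sd=15).
from statistics import NormalDist

iq = NormalDist(mu=100, sigma=15)
low, high = iq.inv_cdf(0.70), iq.inv_cdf(0.95)
print(f"70th-95th percentile: IQ {low:.0f} to {high:.0f}")  # IQ 108 to 125
```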

That’s where the cognitive carnage will be felt. Not in the depths, not at the heights—but in the middle.

When Suspension of Disbelief Escapes the Page

Welcome to the Age of Realism Fatigue

Once upon a time — which is how all good fairy tales begin — suspension of disbelief was a tidy little tool we used to indulge in dragons, space travel, talking animals, and the idea that people in rom-coms have apartments that match their personalities and incomes. It was a temporary transaction, a gentleman’s agreement, a pact signed between audience and creator with metaphorical ink: I know this is nonsense, but I’ll play along if you don’t insult my intelligence.

Audio: NotebookLM podcast of this page content.

This idea, famously coined by Samuel Taylor Coleridge as the “willing suspension of disbelief,” was meant to give art its necessary air to breathe. Coleridge’s hope was that audiences would momentarily silence their rational faculties in favour of emotional truth. The dragons weren’t real, but the heartbreak was. The ghosts were fabrications, but the guilt was palpable.

But that was then. Before the world itself began auditioning for the role of absurdist theatre. Before reality TV became neither reality nor television. Before politicians quoted memes, tech CEOs roleplayed as gods, and conspiracy theorists became bestsellers on Amazon. These days, suspension of disbelief is no longer a leisure activity — it’s a survival strategy.

The Fictional Contract: Broken but Not Forgotten

Traditionally, suspension of disbelief was deployed like a visitor’s badge. You wore it when entering the imagined world and returned it at the door on your way out. Fiction, fantasy, speculative fiction — they all relied on that badge. You accepted the implausible if it served the probable. Gandalf could fall into shadow and return whiter than before because he was, after all, a wizard. We were fine with warp speed as long as the emotional logic of Spock’s sacrifice made sense. There were rules — even in rule-breaking.

The genres varied. Hard sci-fi asked you to believe in quantum wormholes but not in lazy plotting. Magical realism got away with absurdities wrapped in metaphor. Superhero films? Well, their disbelief threshold collapsed somewhere between the multiverse and the Bat-credit card.

Still, we always knew we were pretending. We had a tether to the real, even when we floated in the surreal.

But Then Real Life Said, “Hold My Beer.”

At some point — let’s call it the twenty-first century — the need to suspend disbelief seeped off the screen and into the bloodstream of everyday life. News cycles became indistinguishable from satire (except that satire still had editors). Headlines read like rejected Black Mirror scripts. A reality TV star became president, and nobody even blinked. Billionaires declared plans to colonise Mars whilst democracy quietly lost its pulse.

We began to live inside a fiction that demanded that our disbelief be suspended daily. Except now, it wasn’t voluntary. It was mandatory. If you wanted to participate in public life — or just maintain your sanity — you had to turn off some corner of your rational mind.

You had to believe, or pretend to, that the same people calling for “freedom” were banning books. That artificial intelligence would definitely save us, just as soon as it was done replacing us. That social media was both the great democratiser and the sewer mainline of civilisation.

The boundary between fiction and reality? Eroded. Fact-checking? Optional. Satire? Redundant. We’re all characters now, improvising in a genreless world that refuses to pick a lane.

Cognitive Gymnastics: Welcome to the Cirque du Surréalisme

What happens to a psyche caught in this funhouse? Nothing good.

Our brains, bless them, were designed for some contradiction — religion’s been pulling that trick for millennia — but the constant toggling between belief and disbelief, trust and cynicism, is another matter. We’re gaslit by the world itself. Each day, a parade of facts and fabrications marches past, and we’re told to clap for both.

Cognitive dissonance becomes the default. We scroll through doom and memes in the same breath. We read a fact, then three rebuttals, then a conspiracy theory, then a joke about the conspiracy, then a counter-conspiracy about why the joke is state-sponsored. Rinse. Repeat. Sleep if you can.

The result? Mental fatigue. Not just garden-variety exhaustion, but a creeping sense that nothing means anything unless it’s viral. Critical thinking atrophies not because we lack the will but because the floodwaters never recede. You cannot analyse the firehose. You can only drink — or drown.

Culture in Crisis: A Symptom or the Disease?

This isn’t just a media problem. It’s cultural, epistemological, and possibly even metaphysical.

We’ve become simultaneously more sceptical — distrusting institutions, doubting authorities — and more gullible, accepting the wildly implausible so long as it’s entertaining. It’s the postmodern paradox in fast-forward: we know everything is a construct, but we still can’t look away. The magician shows us the trick, and we cheer harder.

In a world where everything is performance, authenticity becomes the ultimate fiction. And with that, the line between narrative and news, between aesthetic and actuality, collapses.

So what kind of society does this create?

One where engagement replaces understanding. Where identity is a curated feed. Where politics is cosplay, religion is algorithm, and truth is whatever gets the most shares. We aren’t suspending disbelief anymore. We’re embalming it.

The Future: A Choose-Your-Own-Delusion Adventure

So where does this all end?

There’s a dark path, of course: total epistemic breakdown. Truth becomes just another fandom and reality a subscription model. But there’s another route — one with a sliver of hope — where we become literate in illusion.

We can learn to hold disbelief like a scalpel, not a blindfold. To engage the implausible with curiosity, not capitulation. To distinguish between narratives that serve power and those that serve understanding.

It will require a new kind of literacy. One part media scepticism, one part philosophical rigour, and one part good old-fashioned bullshit detection. We’ll have to train ourselves not just to ask “Is this true?” but “Who benefits if I believe it?”

That doesn’t mean closing our minds. It means opening them with caution. Curiosity without credulity. Wonder without worship. A willingness to imagine the impossible whilst keeping a firm grip on the probable.

In Conclusion, Reality Is Optional, But Reason Is Not

In the age of AI, deepfakes, alt-facts, and hyperreality, we don’t need less imagination. We need more discernment. The world may demand our suspension of disbelief, but we must demand our belief back. In truth, in sense, in each other.

Because if everything becomes fiction, then fiction itself loses its magic. And we, the audience, are left applauding an empty stage.

Lights down. Curtain call.
Time to read the footnotes.

Where There’s a Will, There’s a Way

I’ve read Part I of Hobbes’ Leviathan and wonder what it would have been like had he filtered his thoughts through Hume or Wittgenstein. Hobbes makes Dickens read like Pollyanna. It’s an interesting historical piece, worth reading on that basis alone. It reads as if the Christian Bible had to pass through a legal review before publication, sapped of vigour. As bad a rap as Schopenhauer seems to get, Hobbes is the consummate Ebenezer Scrooge. Bah, humbug – you nasty, brutish, filthy animals!*

Audio: NotebookLM podcast conversation on this topic.

In any case, it got me thinking of free will and, more to the point, of will itself.

A Brief History of Humanity’s Favourite Metaphysical Scapegoat

By the time Free Will turned up to the party, the real guest of honour—the Will—had already been drinking heavily, muttering incoherently in the corner, and starting fights with anyone who made eye contact. We like to pretend that the “will” is a noble concept: the engine of our autonomy, the core of our moral selves, the brave little metaphysical organ that lets us choose kale over crisps. But in truth, it’s a bloody mess—philosophy’s equivalent of a family heirloom that no one quite understands but refuses to throw away.

So, let’s rewind. Where did this thing come from? And why, after 2,500 years of name-dropping, finger-pointing, and metaphysical gymnastics, are we still not quite sure whether we have a will, are a will, or should be suing it for damages?

Plato: Soul, Reason, and That Poor Horse

In the beginning, there was Plato, who—as with most things—half-invented the question and then wandered off before giving a straight answer. For him, the soul was a tripartite circus act: reason, spirit, and appetite. Will, as a term, didn’t get top billing—it didn’t even get its name on the poster. But the idea was there, muddling along somewhere between the charioteer (reason) and his two horses (noble spiritedness and unruly appetite).

No explicit will, mind you. Just a vague sense that the rational soul ought to be in charge, even if it had to beat the rest of itself into submission.

Aristotle: Purpose Without Pathos

Aristotle, ever the tidy-minded taxonomist, introduced prohairesis—deliberate choice—as a sort of proto-will. But again, it was all about rational calculation toward an end. Ethics was teleological, goal-oriented. You chose what aligned with eudaimonia, that smug Greek term for flourishing. Will, if it existed at all, was just reason picking out dinner options based on your telos. No inner torment, no existential rebellion—just logos in a toga.

Augustine: Sin, Suffering, and That Eternal No

Fast-forward a few hundred years, and along comes Saint Augustine, traumatised by his libido and determined to make the rest of us suffer for it. Enter voluntas: the will as the seat of choice—and the scene of the crime. Augustine is the first to really make the will bleed. He discovers he can want two incompatible things at once and feels properly appalled about it.

From this comes the classic Christian cocktail: freedom plus failure equals guilt. The will is free, but broken. It’s responsible for sin, for disobedience, for not loving God enough on Wednesdays. Thanks to Augustine, we’re stuck with the idea that the will is both the instrument of salvation and the reason we’re going to Hell.

Cheers.

Medievals: God’s Will or Yours, Pick One

The Scholastics, never ones to let an ambiguity pass unanalysed, promptly split into camps. Aquinas, ever the reasonable Dominican, says the will is subordinate to the intellect. God is rational, and so are we, mostly. But Duns Scotus and William of Ockham, the original voluntarist hooligans, argue that the will is superior—even in God. God could have made murder a virtue, they claim, and you’d just have to live with it.

From this cheerful perspective, will becomes a force of arbitrary fiat, and humans, made in God’s image, inherit the same capacity for irrational choice. The will is now more than moral; it’s metaphysical. Less reason’s servant, more chaos goblin.

Hobbes: Appetite with Delusions of Grandeur

Then along comes Thomas Hobbes, who looks at the soul and sees a wheezing machine of appetites. Will, in his famously cheery view, is simply “the last appetite before action.” No higher calling, no spiritual struggle—just the twitch that wins. Man is not a rational animal, but a selfish algorithm on legs. For Hobbes, will is where desire stumbles into motion, and morality is a polite euphemism for not getting stabbed.

Kant: The Will Gets a Makeover

Enter Immanuel Kant: powdered wig, pursed lips, and the moral rectitude of a man who scheduled his bowel movements. Kant gives us the good will, which acts from duty, not desire. Suddenly, the will is autonomous, rational, and morally legislative—a one-man Parliament of inner law.

It’s all terribly noble, terribly German, and entirely exhausting. For Kant, free will is not the ability to do whatever you like—it’s the capacity to choose according to moral law, even when you’d rather be asleep. The will is finally heroic—but only if it agrees to hate itself a little.

Schopenhauer: Cosmic Will, Cosmic Joke

And then the mood turns. Schopenhauer, the world’s grumpiest mystic, takes Kant’s sublime will and reveals it to be a blind, thrashing, cosmic force. Will, for him, isn’t reason—it’s suffering in motion. The entire universe is will-to-live: a desperate, pointless striving that dooms us to perpetual dissatisfaction.

There is no freedom, no morality, no point. The only escape is to negate the will, preferably through aesthetic contemplation or Buddhist-like renunciation. In Schopenhauer’s world, the will is not what makes us human—it’s what makes us miserable.

Nietzsche: Transvaluation and the Will to Shout Loudest

Cue Nietzsche, who takes Schopenhauer’s howling void and says: yes, but what if we made it fabulous? For him, the will is no longer to live, but to power—to assert, to create, to impose value. “Free will” is a theologian’s fantasy, a tool of priests and moral accountants. But will itself? That’s the fire in the forge. The Übermensch doesn’t renounce the will—he rides it like a stallion into the sunset of morality.

Nietzsche doesn’t want to deny the abyss. He wants to waltz with it.

Today: Free Will and the Neuroscientific Hangover

And now? Now we’re left with compatibilists, libertarians, determinists, and neuroscientists all shouting past each other, armed with fMRI machines and TED talks. Some claim free will is an illusion, a post hoc rationalisation made by brains doing what they were always going to do. Others insist that moral responsibility requires it, even if we can’t quite locate it between the neurons.

We talk about willpower, will-to-change, political will, and free will like they’re real things. But under the hood, we’re still wrestling with the same questions Augustine posed in a North African villa: Why do I do what I don’t want to do? And more importantly, who’s doing it?

Conclusion: Where There’s a Will, There’s a Mess

From Plato’s silent horses to Nietzsche’s Dionysian pyrotechnics, the will has shape-shifted more times than a politician in an election year. It has been a rational chooser, a moral failure, a divine spark, a mechanical twitch, a cosmic torment, and an existential triumph.

Despite centuries of philosophical handwringing, what it has never been is settled.

So where there’s a will, there’s a way. But the way? Twisting, contradictory, and littered with the corpses of half-baked metaphysical systems.

Welcome to the labyrinth. Bring snacks.

* The solitary, poor, nasty, brutish, and short quote is forthcoming. Filthy animals is a nod to Home Alone.

Elites Ruined It For Everyone

David Brooks and the Hollowing Out of Conservatism

David Brooks is the quintessential old-school Conservative—the kind who once upheld a semblance of ideological coherence. He belongs to the pre-Reagan-Thatcher vintage, a time when Conservatism at least had the decency to argue from principles rather than blind tribalism. We could debate these people in good faith. Those days are gone. The current incarnation of Conservatism contains only homoeopathic traces of its Classical™ predecessor—diluted beyond recognition.

The Degeneration of Conservatism

The rot set in with Reagan, who caught it from Thatcher. Greed and selfishness were laundered into virtues, repackaged as “individual responsibility,” and the party’s intellectual ballast began to erode. By the time Bush II’s administration rolled in, Neo-Conservatism had replaced any lingering Burkean ethos, and by Trump’s tenure, even the pretence of ideology was gone. Conservatism-in-Name-Only—whatever Trump’s brand of reactionary nihilism was—swallowed the party whole. Do they even call themselves Conservatives anymore, or has that ship sailed along with basic literacy?

To be fair, this didn’t go unnoticed. Plenty of old-school Republicans recoiled in horror when Trump became their figurehead. Before the 2016 election, conservative pundits could barely contain their disdain for his incompetence, lack of moral compass, and general buffoonery. And yet, once they realised he was the party’s golden goose, they clambered aboard the Trump Train with the enthusiasm of lottery winners at a payday loan office. His staunchest critics became his most obsequious apologists. What does this tell us about their value system? Spoiler: nothing good.

Brooks’ Lament

Which brings us back to Brooks, who now bemoans the death of Conservative values. On this, we agree. Where we part ways is on whether those values were worth saving. Say you’re boarding a train from New York to Los Angeles. Conservatism might argue that a Miami-bound train is still a train, so what’s the problem? It’s the same vehicle, just going somewhere else. Except, of course, Conservatism has always insisted on the slow train over the fast train—because urgency is unseemly, and progress must be rationed.

If I’m an affluent middle-classer, I might prefer Conservatism’s careful incrementalism—it keeps my apple cart stable. Admirable, if you enjoy tunnel vision. Progressives, by contrast, recognise that some people don’t even have apple carts. Some are starving while others hoard orchards. To the Conservative, the poor just aren’t trying hard enough. To the Progressive, the system is broken, and the playing field needs a serious re-levelling. Even when Conservatives acknowledge inequality, their instinct is to tiptoe toward justice rather than risk disrupting their own affluence.

The Fallacy of Objective Reality

Leaving politics for philosophy, Brooks predictably rails against Postmodernism, decrying relativism in favour of good old-fashioned Modernist “reality.” He’s horrified by subjectivism, as though personal interpretation weren’t the foundation of all human experience. Like Jordan Peterson, he believes his subjective truth is the objective truth. And like Peterson, he takes umbrage at anyone pointing out otherwise. It feels so absolute to them that they mistake their own convictions for universal constants.

As a subjectivist, I accept that reality is socially mediated. We interpret truth claims based on cognitive biases, cultural conditioning, and personal experience. Even when we strive for objectivity, we do so through subjective lenses. Brooks’ Modernist nostalgia is touching but delusional—akin to demanding we all agree on a single flavour of ice cream.

The Existential Problem

And so, I find myself in partial agreement with Brooks. Yes, there is an existential crisis. The patient has a broken leg. But our prescriptions differ wildly. I won’t offer a metaphor for that—consider it your homework as a reader.

Brooks is likely a better writer than a public speaker, but you may still find yourself nodding along with some of his arguments. If you’re a “true” Christian Conservative—if you still believe in something beyond crass self-interest—he may well be preaching to the choir. But let’s be honest: how many in that choir are still listening?