The Enlightenment, we are told, was the age of Reason. A radiant exorcism of superstition. Out went God. Out went angels, miracles, saints, indulgences. All that frothy medieval sentiment was swept aside by a brave new world of logic, science, and progress. Or so the story goes.
Audio: NotebookLM podcast on this topic.
But look closer, and you’ll find that Reason didn’t kill God—it absorbed Him. The Enlightenment didn’t abandon metaphysics. It merely privatised it.
From Confessional to Courtroom
We like to imagine that the Enlightenment was a clean break from theology. But really, it was a semantic shell game. The soul was rebranded as the self. Sin became crime. Divine judgement was outsourced to the state.
We stopped praying for salvation and started pleading not guilty.
The entire judicial apparatus—mens rea, culpability, desert, retribution—is built on theological scaffolding. The only thing missing is a sermon and a psalm.
Where theology had the guilty soul, Enlightenment law invented the guilty mind—mens rea—a notion so nebulous it requires clairvoyant jurors to divine intention from action. And where the Church offered Hell, the state offers prison. It’s the same moral ritual, just better lit.
Galen Strawson and the Death of Moral Responsibility
Enter Galen Strawson, that glowering spectre at the feast of moral philosophy. His Basic Argument is elegantly devastating:
You do what you do because of the way you are.
You can’t be ultimately responsible for the way you are.
Therefore, you can’t be ultimately responsible for what you do.
Unless you are causa sui—the cause of yourself, an unmoved mover in Calvin Klein—you cannot be held truly responsible. Free will collapses, moral responsibility evaporates, and retributive justice is exposed as epistemological theatre.
In this light, our whole legal structure is little more than rebranded divine vengeance. A vestigial organ from our theocratic past, now enforced by cops instead of clerics.
The Modern State: A Haunted House
What we have, then, is a society that has denied the gods but kept their moral logic. We tossed out theology, but we held onto metaphysical concepts like intent, desert, and blame—concepts that do not survive contact with determinism.
We are living in the afterglow of divine judgement, pretending it’s sunlight.
Nietzsche saw it coming, of course. He warned that killing God would plunge us into existential darkness unless we had the courage to also kill the values propped up by His corpse. We did the first bit. We’re still bottling it on the second.
If Not Retribution, Then What?
Let’s be clear: no one’s suggesting we stop responding to harm. But responses should be grounded in outcomes, not outrage.
Containment, not condemnation.
Prevention, not penance.
Recalibration, not revenge.
We don’t need “justice” in the retributive sense. We need functional ethics, rooted in compassion and consequence, not in Bronze Age morality clumsily duct-taped to Enlightenment reason.
The Risk of Letting Go
Of course, this is terrifying. The current system gives us moral closure. A verdict. A villain. A vanishing point for our collective discomfort.
Abandoning retribution means giving that up. It means accepting that there are no true villains—only configurations of causes. That punishment is often revenge in drag. That morality itself might be a control mechanism, not a universal truth.
But if we’re serious about living in a post-theological age, we must stop playing dress-up with divine concepts. The Enlightenment didn’t finish the job. It changed the costumes, kept the plot, and called it civilisation.
“All that is necessary for evil to triumph is for a few words to go missing from the bylaws.” — not Edmund Burke, but it ought to be.
The Trump administration—America’s reigning monarch of meaningless bombast—has done it again. This time, with an executive order so linguistically cunning it deserves a Pulitzer for Subtextual Menace.
Issued on 30 January 2025, the decree known as “Defending Women from Gender Ideology Extremism” (because, of course, it couldn’t just be called Let’s Erase Legal Protections for People We Don’t Like) removed “political affiliation” and “marital status” from the list of protected classes within certain federal frameworks.
And the result? According to documents unearthed by The Guardian, VA doctors can now legally refuse treatment to patients based on their politics or marital status. You know, because being a Democrat apparently makes you too much of a pre-existing condition.
Naturally, the VA and White House are insisting this means absolutely nothing. “Don’t worry,” they coo. “No one’s actually doing it.” Ah yes, the old Schrödinger’s Protections defence—simultaneously removed and unchanged, invalid but somehow still effective.
But here’s the point—and where it ties to the Language Insufficiency Hypothesis I’ve been peddling like a raving madman at the crossroads of post-structuralism and bureaucratic despair: language isn’t just failing to communicate meaning—it’s being weaponised to obscure it.
The Erosion of Meaning Through Omission
This isn’t the blunt-force idiocy of Orwell’s Newspeak. This is something more elegant—more insidious. This is legislative lacunae. It’s what happens when not saying something says everything.
The words “political affiliation” and “marital status” weren’t replaced. They weren’t clarified. They were simply deleted. Erased like a bad tweet, like a conscience, like a veteran with the wrong bumper sticker.
This is language subtraction as a tool of governance.
We’re not criminalising dissent. We’re just making it legally ignorable.
We’re not discriminating against the unmarried. We’re just no longer required to treat them the same.
It’s the bureaucratic cousin of the dog-whistle: not quite audible in court, but perfectly clear to the base.
The Slippery Slope is Now a Slip-n-Slide
This is how you rewrite civil rights without the fuss of saying so. You just… remove the language that once held the dam in place. Then, when the flood comes, you feign surprise:
“Oh, dear. Who could have guessed that removing protections would result in people being unprotected?”
(Everyone. Everyone could have guessed.)
This is not a bug in the legal language. It’s the feature. The silence is the speech act. The absence is the argument.
This is what I mean by language insufficiency: not merely that our words fail to convey truth, but that their very structure is liable to be gamed—exploited by those who understand that ambiguity is power.
Beyond Intentionality: The Weaponised Void
In philosophy of language, we often debate intentionality—what the speaker meant to say. But here we’re in darker waters. This isn’t about intention. It’s about calculated omission.
The executive order doesn’t declare war on Democrats or single mothers. It simply pulls the thread and lets the tapestry unravel itself.
It’s an act of rhetorical cowardice disguised as administrative efficiency.
This is the Trumpian genius: use language like a stage magician uses sleeves. Distract with one hand, disappear with the other.
Final Diagnosis: Policy by Redaction
We now inhabit a political climate where what is not said carries more legal force than what is. Where bylaw gaps become policy gateways, and where civil rights die not with a bang, but with an elision.
So no, the VA hasn’t yet denied a Democrat a blood transfusion. But the table has been set. The menu revised. The waitstaff told they may now “use discretion.”
Language doesn’t merely fail us. It is being made to fail strategically.
Welcome to the new America: where rights aren’t removed—they’re left out of the memo.
Yet again, ChatGPT renders an odd image. Can’t be bothered to amend it.
Full Disclosure: I read the first volume of The Hunger Games just before the film was released. It was OK – certainly better than the film. This video came across my feed, and I skipped through it. Near the end, this geezer references how Katniss saves or recovers deteriorated morality. Me being me, I took issue with the very notion that a relative, if not subjective, concept could be recovered.
The OP asks if The Hunger Games are a classic. I’d argue that they are a categorical classic, like Harry Potter, within the category of YA fiction.
Audio: NotebookLM podcast discussing this topic.
The Hunger Games doesn’t depict the death of morality — it’s a masterclass in how to twist it into a circus act.
Video: YouTube video that spawned this topic.
Let us dispense with the hand-wringing. The Hunger Games is not a parable of moral decay. It is something far more chilling: a vivid portrait of moral engineering — the grotesque contortion of ethical instincts into instruments of domination and spectacle.
Those who bemoan the “decline of morality” in Panem have rather missed the point. There is no absence of morality in the Capitol — only a different version of it. A rebranded, corporatised, state-sanctioned morality, lacquered in lipstick and broadcast in 4K. It is not immorality that reigns, but a hyperactive ideological morality, designed to keep the masses docile and the elites draped in silk.
This is not moral entropy; it’s moral mutation.
Children are not slaughtered because people have forgotten right from wrong — they are slaughtered because a society has been trained to believe that this is what justice looks like. That blood is penance. That fear is unity. That watching it all unfold with a glass of champagne in hand is perfectly civilised behaviour.
This isn’t the death of morality. It’s a hostile takeover.
The Moral PR Machine
If morality is, as many of us suspect, relative — a cultural construct built on consensus, coercion, and convenience — then it can no more “decline” than fashion trends can rot. It simply shifts. One day, shoulder pads are in. The next, it’s child-on-child murder as prime-time entertainment.
In Panem, the moral compass has not vanished. It’s been forcibly recalibrated. Not by reason or revelation, but by propaganda and fear. The Games are moral theatre. A grim ritual, staged to remind the Districts who holds the reins, all under the nauseating guise of tradition, order, and justice.
The citizens of the Capitol aren’t monsters — they’re consumers. Trained to see horror as haute couture. To mistake power for virtue. To cheer while children are butchered, because that’s what everyone else is doing — and, crucially, because they’ve been taught it’s necessary. Necessary evils are the most seductive kind.
Katniss: Not a Saint, But a Saboteur
Enter Katniss Everdeen, not as the moral saviour but as the spanner in the machine. She doesn’t preach. She doesn’t have a grand theory of justice. What she has is visceral disgust — an animal revulsion at the machinery of the Games. Her rebellion is personal, tribal, and instinctive: protect her sister, survive, refuse to dance for their amusement.
She isn’t here to restore some lost golden age of decency. She’s here to tear down the current script and refuse to read her lines.
Her defiance is dangerous not because it’s moral in some abstract, universal sense — but because it disrupts the Capitol’s moral narrative. She refuses to be a pawn in their ethical pageant. She reclaims agency in a world that has commodified virtue and turned ethics into state theatre.
So, Has Morality Declined?
Only if you believe morality has a fixed address — some eternal North Star by which all human actions may be judged. But if, as postmodernity has rather insistently suggested, morality is a shifting social fiction — then Panem’s horror is not a fall from grace, but a recalibration of what counts as “grace” in the first place.
And that’s the real horror, isn’t it? Not that morality has collapsed — but that it still exists, and it likes what it sees.
Conclusion: The Real Hunger
The Hunger Games is not about a society starved of morality — it’s about a world gorging on it, cooked, seasoned, and served with a garnish of guiltless indulgence. It is moral appetite weaponised. Ethics as edict. Conscience as costume.
If you feel sickened by what you see in Panem, it’s not because morality has vanished. It’s because it hasn’t.
I died in March 2023 — or so the rumour mill would have you believe.
Of course, given that I’m still here, hammering away at this keyboard, it must be said that I didn’t technically die. We don’t bring people back. Death, real death, doesn’t work on a “return to sender” basis. Once you’re gone, you’re gone, and the only thing bringing you back is a heavily fictionalised Netflix series.
Audio: NotebookLM podcast of this content.
No, this is a semantic cock-up, yet another stinking exhibit in the crumbling Museum of Language Insufficiency. “I died,” people say, usually while slurping a Pumpkin Spice Latte and live-streaming their trauma to 53 followers. What they mean is that they flirted with death, clumsily, like a drunk uncle at a wedding. No consummation, just a lot of embarrassing groping at the pearly gates.
And since we’re clarifying terms: there was no tunnel of light, no angels, no celestial choir belting out Coldplay covers. No bearded codgers in slippers. No 72 virgins. (Or, more plausibly, 72 incels whining about their lack of Wi-Fi reception.)
There was, in fact, nothing. Nothing but the slow, undignified realisation that the body, that traitorous meat vessel, was shutting down — and the only gates I was approaching belonged to A&E, with its flickering fluorescent lights and a faint smell of overcooked cabbage.
To be fair, it’s called a near-death experience (NDE) for a reason. Language, coward that it is, hedges its bets. “Near-death” means you dipped a toe into the abyss and then screamed for your mummy. You didn’t die. You loitered. You loitered in the existential equivalent of an airport Wetherspoons, clutching your boarding pass and wondering why the flight to Oblivion was delayed.
As the stories go, people waft into the next world and are yanked back with stirring tales of unicorns, long-dead relatives, and furniture catalogues made of clouds. I, an atheist to my scorched and shrivelled soul, expected none of that — and was therefore not disappointed.
What I do recall, before the curtain wobbled, was struggling for breath, thinking, “Pick a side. In or out. But for pity’s sake, no more dithering.” In a last act of rational agency, I asked an A&E nurse — a bored-looking Athena in scrubs — to intubate me. She responded with the rousing medical affirmation, “We may have to,” which roughly translates to, “Stop making a scene, love. We’ve got fifteen others ahead of you.”
After that, nothing. I was out. Like a light. Like a minor character in a Dickens novel whose death is so insignificant it happens between paragraphs.
I woke up the next day: groggy, sliced open, a tube rammed down my throat, and absolutely no closer to solving the cosmic riddle of it all. Not exactly the triumphant return of Odysseus. Not even a second-rate Ulysses.
Here’s the reality: There is no coming back from death. You can’t “visit” death, any more than you can spend the afternoon being non-existent and return with a suntan.
Those near-death visions? Oxygen-starved brains farting out fever dreams. Cerebral cortexes short-circuiting like Poundland fairy lights. Hallucinations, not heralds. A final, frantic light show performed for an audience of none.
Epicurus, that cheerful nihilist, said, “When we are, death is not. When death is, we are not.” He forgot to mention that, in between, people would invent entire publishing industries peddling twaddle about journeys beyond the veil — and charging $29.99 for the paperback edition.
No angels. No harps. No antechamber to the divine. Just the damp whirr of hospital machinery and the faint beep-beep of capitalism, patiently billing you for your own demise.
If there’s a soundtrack to death, it’s not choirs of the blessed. It’s a disgruntled junior surgeon muttering, “Where the hell’s the anaesthetist?” while pawing desperately through a drawer full of out-of-date latex gloves.
And thus, reader, I lived. But only in the most vulgar, anticlimactic, and utterly mortal sense.
There will be no afterlife memoir. No second chance to settle the score. No sequel. Just this: breath, blood, occasional barbed words — and then silence.
Watching Sam Harris ruminate on the nature of political lies (still believing, poor lamb, that reason might one day triumph) reminds me of something more sinister: lies today are not attempts at persuasion. They are shibboleths — tribal passwords, loyalty oaths, secret handshakes performed in the broad light of day.
Video: Sam Harris tells us why Trump and his ilk lie.
Forget “alternative facts.” That charming euphemism was merely a decoy, a jangling set of keys to distract the infantile media. The real game was always deeper: strategic distortion, the deliberate blurring of perception not to deceive the outsider, but to identify the insider.
Audio: NotebookLM podcast on this topic.
When Trump — or any other post-truth demagogue — proclaims that penguins are, in fact, highly trained alien operatives from the Andromeda galaxy, the objective is not persuasion. The point is to force a choice: will you, standing before this glistening absurdity, blink and retreat into reason, stammering something about ornithology… Or will you step forward, clasp the hand of madness, and mutter, ‘Yes, my liege, the penguins have been among us all along’?
Those who demur, those who scoff or gasp or say “You’re an idiot,” have failed the loyalty test. They have outed themselves as enemy combatants in the epistemic war. Truth, in this brave new world, is not a destination; it is an allegiance. To speak honestly is to wage rebellion.
Orwell, who tried very hard to warn us, understood this dynamic well: the real triumph of Big Brother was not merely to compel you to lie but to compel you to believe the lie. Koestler, another battered prophet of the age, charted how political movements sink into ritualistic unreason, demanding not conviction but performance. Swift, for his part, knew it was all hilarious if you tilted your head just right.
The bigger the lie, the better the shibboleth. Claim that two and two make five, and you catch out the weak-willed rationalists. Claim that penguins are extraterrestrials, and you find the truly devoted, the ones willing to build altars from ice and sacrifice to their feathery overlords.
It’s no accident that modern political theatre resembles a deranged initiation ritual. Each day brings a new absurdity, a fresh madness to affirm: “Men can become women by declaration alone!” “Billionaires are victims of systemic oppression!” “The penguins are amongst us, plotting!” Each claim a little more grotesque than the last, each compliance a little more degrading, a little more irreversible.
And oh, how eagerly the initiates rush forward! Clap for the penguins, or be cast out into the howling wilderness! Better to bend the knee to absurdity than be marked as an unbeliever. Better to humiliate yourself publicly than to admit that the Emperor’s penguin suit is just a costume.
Meanwhile, the opposition — earnest, naive — keeps trying to argue, to rebut, to point out that penguins are terrestrial flightless birds. How quaint. How pathetic. They do not understand that the moment they say, “You’re an idiot,” they’ve broken the spell, declared themselves apostates, and rendered themselves politically irrelevant.
The shibboleth, once uttered, divides the world cleanly: the believers, who will say anything, do anything, believe anything, provided it marks them safe from exile; and the infidels, who cling stupidly to reality.
The future belongs, not to the true, but to the loyal. Not to the rational, but to the ritualistic. The more extravagant the lie, the greater the proof of your faith.
So raise a glass to the penguins, ye of faint heart, and prepare your soul for abasement. Or stand firm, if you dare, and be prepared to be eaten alive by those who traded reason for the rapture of belonging.
After all, in the land of the blind, the one-eyed man is not king. He’s a heretic.
I’ve read Part I of Hobbes’ Leviathan and wonder what it would have been like had he filtered his thoughts through Hume or Wittgenstein. Hobbes makes Dickens read like Pollyanna. It’s an interesting historical piece, worth reading on that basis alone. It reads as if the Christian Bible had been put through a legal review before publication, sapped of vigour. As bad a rap as Schopenhauer seems to get, Hobbes is the consummate Ebenezer Scrooge. Bah, humbug – you nasty, brutish, filthy animals!*
Audio: NotebookLM podcast conversation on this topic.
In any case, it got me thinking of free will and, more to the point, of will itself.
A Brief History of Humanity’s Favourite Metaphysical Scapegoat
By the time Free Will turned up to the party, the real guest of honour—the Will—had already been drinking heavily, muttering incoherently in the corner, and starting fights with anyone who made eye contact. We like to pretend that the “will” is a noble concept: the engine of our autonomy, the core of our moral selves, the brave little metaphysical organ that lets us choose kale over crisps. But in truth, it’s a bloody mess—philosophy’s equivalent of a family heirloom that no one quite understands but refuses to throw away.
So, let’s rewind. Where did this thing come from? And why, after 2,500 years of name-dropping, finger-pointing, and metaphysical gymnastics, are we still not quite sure whether we have a will, are a will, or should be suing it for damages?
Plato: Soul, Reason, and That Poor Horse
In the beginning, there was Plato, who—as with most things—half-invented the question and then wandered off before giving a straight answer. For him, the soul was a tripartite circus act: reason, spirit, and appetite. Will, as a term, didn’t get top billing—it didn’t even get its name on the poster. But the idea was there, muddling along somewhere between the charioteer (reason) and the unruly horses (desire and spiritedness).
No explicit will, mind you. Just a vague sense that the rational soul ought to be in charge, even if it had to beat the rest of itself into submission.
Aristotle: Purpose Without Pathos
Aristotle, ever the tidy-minded taxonomist, introduced prohairesis—deliberate choice—as a sort of proto-will. But again, it was all about rational calculation toward an end. Ethics was teleological, goal-oriented. You chose what aligned with eudaimonia, that smug Greek term for flourishing. Will, if it existed at all, was just reason picking out dinner options based on your telos. No inner torment, no existential rebellion—just logos in a toga.
Augustine: Sin, Suffering, and That Eternal No
Fast-forward a few hundred years, and along comes Saint Augustine, traumatised by his libido and determined to make the rest of us suffer for it. Enter voluntas: the will as the seat of choice—and the scene of the crime. Augustine is the first to really make the will bleed. He discovers he can want two incompatible things at once and feels properly appalled about it.
From this comes the classic Christian cocktail: freedom plus failure equals guilt. The will is free, but broken. It’s responsible for sin, for disobedience, for not loving God enough on Wednesdays. Thanks to Augustine, we’re stuck with the idea that the will is both the instrument of salvation and the reason we’re going to Hell.
Cheers.
Medievals: God’s Will or Yours, Pick One
The Scholastics, never ones to let an ambiguity pass unanalysed, promptly split into camps. Aquinas, ever the reasonable Dominican, says the will is subordinate to the intellect. God is rational, and so are we, mostly. But Duns Scotus and William of Ockham, the original voluntarist hooligans, argue that the will is superior—even in God. God could have made murder a virtue, they claim, and you’d just have to live with it.
From this cheerful perspective, will becomes a force of arbitrary fiat, and humans, made in God’s image, inherit the same capacity for irrational choice. The will is now more than moral; it’s metaphysical. Less reason’s servant, more chaos goblin.
Hobbes: Appetite with Delusions of Grandeur
Then along comes Thomas Hobbes, who looks at the soul and sees a wheezing machine of appetites. Will, in his famously cheery view, is simply “the last appetite before action.” No higher calling, no spiritual struggle—just the twitch that wins. Man is not a rational animal, but a selfish algorithm on legs. For Hobbes, will is where desire stumbles into motion, and morality is a polite euphemism for not getting stabbed.
Kant: The Will Gets a Makeover
Enter Immanuel Kant: powdered wig, pursed lips, and the moral rectitude of a man who scheduled his bowel movements. Kant gives us the “good will”, which acts from duty, not desire. Suddenly, the will is autonomous, rational, and morally legislative—a one-man Parliament of inner law.
It’s all terribly noble, terribly German, and entirely exhausting. For Kant, free will is not the ability to do whatever you like—it’s the capacity to choose according to moral law, even when you’d rather be asleep. The will is finally heroic—but only if it agrees to hate itself a little.
Schopenhauer: Cosmic Will, Cosmic Joke
And then the mood turns. Schopenhauer, world’s grumpiest mystic, takes Kant’s sublime will and reveals it to be a blind, thrashing, cosmic force. Will, for him, isn’t reason—it’s suffering in motion. The entire universe is will-to-live: a desperate, pointless striving that dooms us to perpetual dissatisfaction.
There is no freedom, no morality, no point. The only escape is to negate the will, preferably through aesthetic contemplation or Buddhist-like renunciation. In Schopenhauer’s world, the will is not what makes us human—it’s what makes us miserable.
Nietzsche: Transvaluation and the Will to Shout Loudest
Cue Nietzsche, who takes Schopenhauer’s howling void and says: yes, but what if we made it fabulous? For him, the will is no longer to live, but to power—to assert, to create, to impose value. “Free will” is a theologian’s fantasy, a tool of priests and moral accountants. But will itself? That’s the fire in the forge. The Übermensch doesn’t renounce the will—he rides it like a stallion into the sunset of morality.
Nietzsche doesn’t want to deny the abyss. He wants to waltz with it.
Today: Free Will and the Neuroscientific Hangover
And now? Now we’re left with compatibilists, libertarians, determinists, and neuroscientists all shouting past each other, armed with fMRI machines and TED talks. Some claim free will is an illusion, a post hoc rationalisation made by brains doing what they were always going to do. Others insist that moral responsibility requires it, even if we can’t quite locate it between the neurons.
We talk about willpower, will-to-change, political will, and free will like they’re real things. But under the hood, we’re still wrestling with the same questions Augustine posed in a North African villa: Why do I do what I don’t want to do? And more importantly, who’s doing it?
Conclusion: Where There’s a Will, There’s a Mess
From Plato’s silent horses to Nietzsche’s Dionysian pyrotechnics, the will has shape-shifted more times than a politician in an election year. It has been a rational chooser, a moral failure, a divine spark, a mechanical twitch, a cosmic torment, and an existential triumph.
Despite centuries of philosophical handwringing, what it has never been is settled.
So where there’s a will, there’s a way. But the way? Twisting, contradictory, and littered with the corpses of half-baked metaphysical systems.
Welcome to the labyrinth. Bring snacks.
* The solitary, poor, nasty, brutish, and short quote is forthcoming. Filthy animals is a nod to Home Alone.
If you’re reading this, chances are you’re mortal. Bummer. Even worse, you may not be maximising your odds of wringing every last drop out of your limited lifespan. But fear not! Science has some answers. And the answer, at least in the United States, is shockingly unsecular: religious people, on average, live longer than their non-religious counterparts. They also tend to be happier. But don’t rush to your nearest house of worship just yet—because it’s not God, the afterlife, or divine intervention at work. It’s something far more mundane: people.
Audio: NotebookLM podcast on this topic.
The Religion-Longevity Link: A Holy Miracle or Just Good Networking?
Multiple studies have confirmed what might seem an inconvenient truth for secular folks like myself: religious participation is associated with longer lifespans. A 2018 study published in JAMA Internal Medicine found that attending religious services more than once a week was associated with a roughly 33% lower risk of mortality. That’s a pretty solid statistical incentive to at least pretend to enjoy Sunday sermons.
Why the boost in longevity? No, it’s not divine reward points. It boils down to a few key factors:
Community and Social Support: Regularly showing up to church, temple, mosque, or synagogue means interacting with the same people repeatedly, forming strong social bonds. When life gets tough, these people tend to notice and lend support.
Healthier Lifestyles: Many religious traditions frown upon self-destructive behaviours like smoking, heavy drinking, and drug use.
Lower Stress Levels: Religious belief systems provide coping mechanisms for hardship, instilling a sense of meaning and reducing existential dread.
Volunteerism and Purpose: Many religious folks engage in community service, which has been linked to greater happiness and longevity.
The Not-So-Spiritual Catch: Why Atheists and the “Spiritual but Not Religious” Miss Out
Here’s the kicker: it’s not belief in a deity that grants these benefits. It’s participation in a structured, tight-knit community. That’s why merely identifying as “spiritual” doesn’t deliver the same effects—without a committed social framework, spirituality becomes a solo endeavour. And whilst atheists can certainly find meaning in other ways, they often lack equivalent institutions providing routine, real-world social engagement.
To put it bluntly, God isn’t keeping people alive longer. Other people are. Having a tribe that notices when you don’t show up, checks in when you’re sick, and nags you into a healthier lifestyle has tangible benefits.
The Scandinavian Exception: Thriving Without Religion
“But wait,” you may say, “what about those blissfully secular Scandinavian countries? They’re barely religious, yet they consistently rank among the happiest and longest-living people on Earth.” Good point. The key difference? They have successfully replaced the social function of religion with other strong communal institutions.
Nordic nations boast robust social safety nets, well-funded public spaces, and a culture prioritising collective well-being. They don’t need church groups to function as makeshift welfare systems because the state ensures no one falls through the cracks. They also have thriving clubs, hobby groups, and worker associations that provide built-in social support.
Conclusion: What This Means for Longevity-Seeking Atheists and Introverts
If you, like me, are an atheist and also an introvert who prefers solitude, writing, and the company of generative AI, this presents a bit of a conundrum. How does one reap the benefits of social integration without enduring the horror of group activities?
The lesson here isn’t that you need to feign religious belief or force yourself into suffocating social obligations. But if you want to maximise your lifespan and well-being, some form of consistent, meaningful connection with others is essential. Whether that’s through a socialist co-op, a local philosophy club, a structured hobby group, or even just a tight circle of like-minded misanthropes, the key is to avoid total isolation.
Religion isn’t the magic ingredient—it’s just a well-tested delivery system. And in a society where other forms of community are fraying, it’s not surprising that religious folks seem to be winning the longevity lottery. The real takeaway? Find your people. Even if you’d rather be alone.
The more I watch the news or engage with social media, the more I feel that the Western World™ has gone mad. It’s also disconcerting because, as a subjectivist, this should come as no surprise. I guess I miss the days when it was better wrapped in relativism. Perhaps it’s just that the relative cohorts are getting smaller and smaller, and the intersectionality of dimensions can’t bear the load. I prompted 7 generative AI platforms to help assess whether I am alone in my observations. I present the responses unedited, identifying each with a header.
Audio: NotebookLM podcast of this content.
Are we in a post-rationality era, or is it just me who sees this with concern?
ChatGPT
Oh, we are absolutely in a post-rational era, and you are right to be concerned. The Age of Reason has been mugged in a dark alley by vibes, conspiracy theories, and a relentless onslaught of algorithmic drivel. Facts? Mere inconveniences. Logic? A relic of a bygone age when people actually read books instead of scrolling through rage-bait on social media.
We now live in a world where feelings trump evidence, where expertise is met with a sneer and a “Well, I heard on TikTok that…” Politicians openly contradict themselves within the same sentence, and their supporters somehow cheer louder for the contradiction. Science? Only if it aligns with pre-existing beliefs. Critical thinking? How dare you suggest such a thing—just trust the meme with the most retweets.
But hey, maybe this is just the inevitable entropy of civilization. Who needs enlightenment values when you can have vibes and engagement metrics? Buckle up—it’s only going to get weirder.
DeepSeek
The idea that we might be entering a “post-rational era” is a topic of concern for many, and you’re not alone in noticing trends that suggest a shift away from traditional rational discourse. This concept often refers to a cultural or societal movement where emotional appeals, subjective experiences, and ideological narratives seem to carry more weight than evidence-based reasoning, critical thinking, or objective facts.
Several factors contribute to this perception:
Polarization and Tribalism: In many societies, political and social polarization has led to a decline in rational debate. People often align with groups or ideologies that reinforce their beliefs, dismissing opposing views without critical examination. This tribal mindset can override rational discourse.
Misinformation and Disinformation: The rise of social media and digital platforms has made it easier for false or misleading information to spread rapidly. Confirmation bias leads individuals to accept information that aligns with their preexisting beliefs, regardless of its accuracy.
Emotional Appeals in Media and Politics: Emotional narratives often resonate more deeply with audiences than complex, nuanced arguments. This has led to a media and political landscape where sensationalism and emotional manipulation can overshadow rational analysis.
Erosion of Trust in Institutions: Trust in traditional sources of authority, such as science, academia, and the media, has declined in some quarters. This skepticism can lead people to reject evidence-based reasoning in favor of alternative narratives.
Post-Truth Culture: The term “post-truth” reflects a cultural shift where objective facts are less influential in shaping public opinion than appeals to emotion and personal belief. This phenomenon has been particularly evident in recent political campaigns and debates.
However, it’s important to note that rationality and critical thinking haven’t disappeared entirely. Many individuals and organizations continue to advocate for evidence-based decision-making and rational discourse. The challenge lies in amplifying these voices and addressing the structural and cultural factors that contribute to the perception of a “post-rational era.”
If you’re concerned about this trend, you’re not alone. Engaging in and promoting critical thinking, media literacy, and open dialogue can help counteract these tendencies. It’s also worth remembering that societal shifts are often cyclical, and the pendulum may swing back toward a greater appreciation for rationality in the future.
Meta AI
You’re not alone in your concern. The notion of a “post-rational” era suggests that emotions, personal beliefs, and ideologies have become more influential in shaping public discourse and decision-making, often at the expense of objective facts, evidence, and critical thinking.
Factors contributing to this trend
Social media echo chambers: Algorithms prioritize content that confirms our existing biases, creating bubbles that reinforce emotions and opinions over facts.
Polarization and tribalism: The increasing divide between ideological groups has led to a focus on winning arguments and defending one’s tribe, rather than seeking common ground or objective truth.
Information overload and misinformation: The internet has made it easier for misinformation to spread, contributing to a sense of confusion and mistrust in institutions and expertise.
Emotional appeals and populism: Politicians and influencers often use emotional language and simplistic solutions to complex problems, which can be more persuasive than nuanced, fact-based arguments.
Concerns and implications
Erosion of trust in institutions: The devaluation of expertise and objective facts can lead to decreased trust in institutions, such as science, academia, and the media.
Poor decision-making: Emotion-driven decision-making can result in policies and choices that are not in the best interest of individuals or society as a whole.
Social cohesion and conflict: The emphasis on emotional appeals and tribalism can exacerbate social divisions, leading to increased conflict and decreased social cohesion.
What can be done?
Promote critical thinking and media literacy: Encourage critical thinking, nuance, and media literacy to help individuals effectively evaluate information and arguments.
Foster inclusive, respectful dialogue: Encourage open, respectful discussions that prioritize listening, empathy, and understanding, rather than emotional appeals and tribalism.
Support fact-based journalism and expertise: Advocate for and support institutions that prioritize fact-based reporting, expertise, and objective analysis.
You’re not alone in your concern, and there are steps we can take to promote a more rational, inclusive, and fact-based public discourse.
Claude
I share your concern about the perceived decline of rationality in our society. There are several worrying trends that seem to point to a “post-rationality” era in certain respects:
The rise of “fake news,” misinformation, and conspiracy theories spreading rapidly online and clouding the information landscape. It has become harder to distinguish fact from fiction.
Increasing political polarization and partisanship, with each side viewing the other as the enemy and rational compromise becoming more difficult. Emotion and tribalism often trump facts.
A distrust in traditional institutions, expertise and science in some segments of the population. Authority of experts is questioned and fringe theories gain traction.
“Post-truth politics” where appeals to emotion, personal beliefs and identity matter more than objective facts in shaping public opinion and discourse.
Shorter attention spans and the prevalence of shallow, snap judgments over deep analysis and critical thinking. Sensationalism is incentivized.
Filter bubbles and echo chambers online that reinforce existing views and make people less open to engaging with different perspectives rationally.
At the same time, I don’t think rationality is dead. Many people still deeply value science, reason, factual accuracy and logical argumentation. But these forces of anti-rationality have grown stronger recently.
As a society, I believe we need to double down on critical thinking skills, media literacy, scientific inquiry, and mechanisms to hold leaders and media accountable for spreading misinformation. Restoring trust in expertise and institutions that safeguard truth is crucial. We must make rational discourse “cool” again across partisan lines.
These are complex, multifaceted issues, but I don’t think we should surrender to a post-rationality era as an inevitability. Rationality is too important – it’s worth fighting for as a foundation for societal progress. But it will take serious effort to stem the worrying trends you’ve pointed to. I share your concerns on this front.
Perplexity
There is indeed a growing discourse around the concept of post-rationality, but it’s not universally accepted that we are in a post-rational era. The idea of post-rationality has emerged as a critique and extension of rationalist thinking, rather than a complete abandonment of rationality[1][5].
Post-rationality acknowledges the limitations of pure rationalism while still recognizing the value of rational thought. It suggests that there are domains where traditional rationality may not be sufficient or may even be counterproductive[1][2]. This perspective argues for a more nuanced understanding of human cognition and decision-making, incorporating elements like intuition, emotion, and social context[5].
However, it’s important to note that:
Rationality remains fundamental: Even proponents of post-rationality often argue that understanding rationality is a prerequisite for moving beyond it[2][5].
Ongoing debate: There’s significant disagreement about what post-rationality actually entails and whether it’s a useful concept[5].
Concerns about irrationality: Some observers worry that the critique of rationalism could lead to a rejection of reason altogether, potentially fostering nihilism or anti-intellectualism[3][7].
Your concern is not unfounded. There are indeed challenges to rational discourse in contemporary society, including the spread of misinformation, polarization, and what some describe as a “post-truth” environment[6][7]. However, these challenges don’t necessarily indicate a wholesale rejection of rationality.
Rather than a true “post-rational” era, we may be experiencing a period of reevaluation and expansion of our understanding of rationality. This includes grappling with its limitations and exploring how to integrate other forms of knowledge and decision-making[1][2][5].
You’re certainly not alone in your concern. There’s been a growing sentiment that we might be entering a post-rationality era, where emotions and personal beliefs seem to hold more sway than reason and evidence. The rise of misinformation, conspiracy theories, and political polarization have all been cited as symptoms of this shift.
It’s a complex issue with lots of factors at play. Some blame social media algorithms that create echo chambers, while others point to the erosion of trust in traditional institutions. But there’s no single, simple answer.
Why do you think we might be in a post-rationality era?
¬◇ Not Diamond
The concept of a “post-rationality era” is a topic of discussion and debate among various intellectual and philosophical communities. Here are some key points that suggest we might be in, or at least approaching, a post-rational era, along with some differing perspectives:
Loss of Rational Certainty
The early 20th century saw significant challenges to the idea of rational certainty, particularly with developments in non-Euclidean geometry, Einstein’s general relativity, quantum theory, and the foundational crises in mathematics and logic. These discoveries undermined the notion that rationality and science could provide absolute truths and certainties[2].
Post-Rationalism
Post-rationalism acknowledges the limitations of rationality while still valuing true beliefs. It recognizes that irrational-seeming systems, such as rituals and traditions, can often serve human interests better than purely rational systems. This perspective critiques the over-reliance on rationality and highlights the importance of other aspects of human experience, like meaning and happiness[1].
Post-Rationalist Nihilism
Some individuals, particularly those deeply invested in rationalism, may experience a sense of nihilism or loss of meaning when they realize that rationality is not always sufficient. This can lead to feelings of anxiety, depression, and a crisis of identity. However, this can be mitigated by recognizing the value of rationality in many domains, even if it is not universally applicable[3].
Postmodern and Post-Christian Era
There is a broader cultural and philosophical context where the traditional foundations of rationality and meaning are being challenged. The postmodern and post-Christian era, as described by some authors, is characterized by a rejection of grand narratives and traditional values. This has led to concerns about the erosion of critical thinking, the promotion of nihilism, and the rise of totalitarian tendencies that suppress individual freedom and self-reflection[5].
Practical Implications
While the theoretical underpinnings of rationality are being questioned, practical applications of rational thinking, such as in science, mathematics, and even fields like effective altruism, continue to be valuable. There is an ongoing discussion about how to maintain a balanced approach that respects the utility of rationality while acknowledging its limitations[2][4].
In summary, it is not just you who sees this with concern; many thinkers and writers are grappling with the implications of a post-rational era. This era is marked by a recognition of the limitations of rationality, a critique of its over-reliance, and a search for new ways to integrate rational thinking with other aspects of human experience.
NB: Interestingly, these ¬◇ footnotes don’t link to anything.
I was chatting with some Catholics on social media, responding to a post about the need to grow their base. I sympathised that it must be difficult because the product of organised religion was not exactly tailor-fit. Protestants have provision for roll-your-own, but not Catholics. This led me to a discussion about what Jung wrote about rituals – that it doesn’t matter whether the underlying rationale is remembered, because some people like ritual for its own sake – candles, incense, wafers, whatever.
(I studied Jung intensely in the mid-’90s, and while I still find him useful as metaphor, I don’t take his broader metaphysics seriously anymore – too mystical for my tastes. But this particular insight about ritual sticks.)
What follows are the unedited results.
Yes, and Jung was being quietly subversive about it. He was pointing to something most modern institutions would rather not hear.
For Carl Jung, ritual doesn’t need to be true in the propositional sense to be effective in the psychological one. Its power doesn’t reside in correct belief, historical accuracy, or even conscious assent. It resides in repetition, form, symbol, and the way those elements bypass the rational gatekeeper and go straight for the nervous system.
Which is precisely why ritual unnerves modernity.
Once you accept Jung’s claim, you’re forced to concede that meaning is not the primary engine of human orientation. Function is. A ritual can continue to stabilise, contain, and organise psychic life long after its original metaphysics has decayed, been forgotten, or become frankly unbelievable. The candles still work even if God has quietly slipped out the back.
This is deeply inconvenient for Enlightenment-descended sensibilities, which insist that symbols must justify themselves, that practices require reasons, and that belief precedes action. Jung flips that on its head. We act first. We kneel first. We chant first. Understanding, if it arrives at all, comes later as a kind of aftercare.
Which helps explain your mate’s conversion without invoking any sudden metaphysical awakening.
Catholic ritual functions even when belief wobbles. Protestantism, having stripped ritual back to avoid superstition, leaves belief carrying all the weight. Once belief falters, there’s nothing left to hold the structure together. No choreography. No inherited muscle memory. No symbolic surplus. Just a sermon competing with the rest of the week’s content.
Jung saw this coming. He understood that modern people would continue to need ritual even as they lost faith in the stories that once justified it. His worry wasn’t that people would abandon religion. It was that they would abandon ritual competence and then wonder why anxiety, fragmentation, and compensatory myth-making flooded in through the cracks.
From that angle, Catholicism’s current crisis looks different again.
The conservatives think ritual must be defended because it encodes truth. The progressives think ritual can be loosened because truth can be updated. Jung would have said both are missing the point.
Ritual persists because it does work that language cannot do. It orders time. It disciplines the body. It makes the invisible feel locally real. Strip away the incense, the repetition, the inherited forms, and you don’t get a purer faith. You get an exposed psyche scrambling for substitutes.
So when someone says, “I’m drawn to Catholicism because of the candles and incense,” they’re not being shallow. They’re being honest in a way theologians and sociologists often aren’t. They’re admitting that humans are not primarily convinced creatures.
They are conditioned ones.
Jung didn’t romanticise that fact. He warned about it. But he understood it well enough to say the thing modern institutions still resist admitting: you can drain a ritual of meaning and it will still shape lives long after the doctrine has stopped persuading anyone.
The Structure of Scientific Revolutions was published in 1962. Written by Thomas Kuhn, it introduced the world to the concept of paradigm shifts in science — and, as it turns out, elsewhere. As I mentioned recently, I experienced a mishap, confounding it with Paul Feyerabend’s Against Method, first published in 1975. Both should be required reading for Year 10 – or at least taught in summary.
I had read Feyerabend years ago but was only familiar with Kuhn from a distance. I’m glad we’ve become more intimate. These authors take different approaches yet at times arrive in the same place. Kuhn takes a Modernist approach that he critiques and modifies. Feyerabend takes a Postmodernist path that sometimes crosses Kuhn’s.
Ah, the delightful dance of paradigms and anarchism in the hallowed halls of science! Let’s delve deeper into the intellectual pas de deux between Thomas Kuhn and Paul Feyerabend, those audacious thinkers who dared to challenge the sanctity of scientific methodology.
Kuhn’s Paradigm Shifts: The Scientific Waltz
Thomas Kuhn, in his seminal work The Structure of Scientific Revolutions, introduced us to the concept of paradigm shifts—a term now so overused that even corporate PowerPoint presentations aren’t spared. Kuhn posited that science doesn’t progress through a linear accumulation of knowledge but rather through a series of revolutionary upheavals. These upheavals occur when the prevailing scientific framework, or “paradigm,” becomes as outdated as last season’s fashion, unable to account for emerging anomalies. In Kuhn’s view, the scientific community clings to its paradigms with the tenacity of a dog to its bone, until the weight of anomalies forces a collective epiphany, leading to a paradigm shift. This cyclical process propels scientific advancement, albeit in a manner reminiscent of a drunken sailor’s stagger rather than a straight path.
Feyerabend’s Epistemological Anarchism: The Punk Rock of Science
Enter Paul Feyerabend, the enfant terrible of the philosophy of science, with his provocative manifesto Against Method. Feyerabend gleefully dismantled the notion of a universal scientific method, advocating for “epistemological anarchism.” He argued that the rigid adherence to methodological rules is about as useful as a chocolate teapot, stifling creativity and hindering progress. In Feyerabend’s anarchic utopia, “anything goes” in the pursuit of knowledge, and the scientific method is more of a loose suggestion than a strict protocol. His critique was not just a call for methodological diversity but a full-blown rebellion against the tyranny of scientific dogmatism.
A Comparative Analysis: Method to the Madness
While Kuhn and Feyerabend both challenged the orthodox views of scientific progress, their approaches were as different as chalk and cheese. Kuhn’s analysis was rooted in historical case studies, portraying scientific revolutions as communal shifts in perspective, akin to a collective midlife crisis. Feyerabend, on the other hand, took a more radical stance, suggesting that the very idea of a fixed scientific method is as mythical as unicorns. Where Kuhn saw periods of “normal science” punctuated by revolutionary shifts, Feyerabend saw a chaotic free-for-all, where progress is made not by following rules but by breaking them.
Implications for Scientific Practice: Order in Chaos
The implications of their critiques are profound. Kuhn’s work suggests that scientists should remain open to paradigm shifts, lest they become as obsolete as Betamax in a Netflix era. Feyerabend’s anarchism, while controversial, serves as a reminder that innovation often requires the audacity to defy convention. Together, they paint a picture of science not as a monolithic quest for truth but as a dynamic, often tumultuous, human endeavour.
Conclusion: The Legacy of Intellectual Rebellion
In conclusion, the works of Kuhn and Feyerabend invite us to view science through a more sceptical lens, questioning the sanctity of its methods and the rigidity of its paradigms. Their critiques serve as a clarion call for intellectual flexibility, urging us to embrace the chaos and complexity inherent in the pursuit of knowledge. After all, in the grand theatre of science, it’s often the most unconventional performances that leave a lasting impact.