Can One Obstruct Justice in a Place It Doesn’t Exist?

ICE is out in force again, dragging brown bodies out of homes in Los Angeles like it’s some righteous carnival of due process. Another day, another federal theatre production titled Law and Order: Ethnic Cleansing Unit, where men with guns and names like Chad or Hank mistake cruelty for patriotism and paperwork for moral clarity.

Audio: NotebookLM podcast on this topic.

Naturally, critics of these raids are now being threatened with that great juridical cudgel: “obstructing justice.” Yes, you heard that right. If you interfere – say, by filming, shouting, refusing to roll over like a good little colonial subject – you are obstructing justice. As though justice were something you could actually put your hands on in the United States without a hazmat suit and a decade of appeals.

Let’s be clear. There is no justice here to obstruct. What you are obstructing is bureaucratic violence wrapped in legal latex. You are obstructing a system that functions like a vending machine for state-sanctioned trauma: insert immigrant, extract ruin.

Justice: The Imaginary Friend of Empire

Ah, “justice.” That hallowed ideal trotted out whenever the state wants to put a boot through your front door. The U.S. has long since traded its Justice for Security Theatre and capitalist choreography. The blindfold is still there, sure – but these days, it’s a branded sleep mask from Lockheed Martin, and the scales are rigged to weigh white tears heavier than brown bodies.

Let’s run through the usual suspects:

  • ICE – America’s own domestic Gestapo, but with better PR and significantly worse fashion.
  • CBP – Border fetishists whose job seems less about national defence and more about satisfying their Freud-bereft fantasies of control.
  • SCOTUS – That great moral weather vane, spinning wildly between “originalist necromancy” and outright lunacy, depending on how recently Thomas and Alito read Leviticus.
  • Congress – An assembly of millionaires cosplaying as public servants, holding hearings on “the threat of immigration” while outsourcing their lawn care.

And of course, the President – whichever septuagenarian husk happens to be in office – offers the usual bromides about order, safety, and enforcement, all while the real crimes (you know, the kind involving tax fraud, corporate pollution, or drone strikes) go entirely unmolested.

Can You Obstruct a Simulation?

If you stand in front of a deportation van, are you obstructing justice, or merely interrupting the bureaucratic excretion of empire? It’s the philosophical equivalent of trying to punch a hologram. The system pretends to uphold fairness while routinely violating its own principles, then charges you with “obstruction” when you call out the sleight of hand.

This is not justice. This is kabuki. A ritual. A performance piece sponsored by Raytheon.

A Modest Proposal

Let’s just be honest and rename the charge. Not “Obstruction of Justice”—too ironic, too pompous. Call it what it is: Obstruction of Procedure, Obstruction of Power, or if we’re being especially accurate: Obstruction of the Industrial Deportation Complex™. Hell, add a corporate sponsor while you’re at it:

You are being charged with Obstruction of Justice, Presented by Amazon Web Services.

Because when justice itself is a ghost, when the rule of law has become the rule of lawfare, the real obscenity is pretending any of this is noble.

Final Thought

So no, dear reader, you’re not obstructing justice. You’re obstructing a machine that mistakes itself for a moral order. And if you’re going to obstruct something, make it that.

Parfit’s Long-Termism and Property Rights

Cause and effect: This clip by Jonny Thomson influenced this post.

I’ve written extensively (and, some might say, relentlessly) on the immorality of private property, particularly the theological nonsense that undergirds its supposed legitimacy. Locke’s first-come, first-served logic might have sounded dashing in the 17th century, but it now reads like a boarding queue at Ryanair: desperate, arbitrary, and hostile to basic decency.

Audio: NotebookLM podcast on this content.

The core problem? Locke’s formulation assumes land was once freely available, as if Earth were a kind of colonial vending machine: insert labour, receive title. But that vending machine was already jammed by the time most of humanity got a look-in. Worse, it bakes in two kinds of chauvinism: temporal (screw the future) and speciesist (screw anything non-human).

Parfit’s long-termism lays bare the absurdity: why should a bit of land or atmospheric stability belong to those who happened to get here first, especially when their stewardship amounts to strip-mining the pantry and then boarding up the exit?

And no, “mixing your labour” with the land does not miraculously confer ownership—any more than a damp bint lobbing a sword at you from a pond makes you sovereign. That’s not philosophy; that’s Arthurian cosplay.

The Rhetoric of Realism: When Language Pretends to Know

Let us begin with the heresy: Truth is a rhetorical artefact. Not a revelation. Not a metaphysical essence glimmering behind the veil. Just language — persuasive, repeatable, institutionally ratified language. In other words: branding.

Audio: NotebookLM podcast on this topic.

This is not merely a postmodern tantrum thrown at the altar of Enlightenment rationalism. It is a sober, if impolite, reminder that nearly everything we call “knowledge” is stitched together with narrative glue and semantic spit. Psychology. Neuroscience. Ethics. Economics. Each presents itself as a science — or worse, a moral imperative — but their foundations are built atop a linguistic faultline. They are, at best, elegant approximations; at worst, dogma in drag.

Let’s take psychology. Here is a field that diagnoses your soul via consensus. A committee of credentialed clerics sits down and declares a cluster of behaviours to be a disorder, assigns it a code, and hands you a script. It is then canonised in the DSM, the Diagnostic Scripture Manual. Doubt its legitimacy and you are either naïve or ill — which is to say, you’ve just confirmed the diagnosis. It’s a theological trap dressed in the language of care.

Or neuroscience — the church of the glowing blob. An fMRI shows a region “lighting up” and we are meant to believe we’ve located the seat of love, the anchor of morality, or the birthplace of free will. Never mind that we’re interpreting blood-oxygen fluctuations in composite images smoothed by statistical witchcraft. It looks scientific, therefore it must be real. The map is not the territory, but in neuroscience, it’s often a mood board.

And then there is language itself, the medium through which all these illusions are transmitted. It is the stage, the scenery, and the unreliable narrator. My Language Insufficiency Hypothesis proposes that language is not simply a flawed tool — it is fundamentally unfit for the task it pretends to perform. It was forged in the furnace of survival, not truth. We are asking a fork to play the violin.

This insufficiency is not an error to be corrected by better definitions or clever metaphors. It is the architecture of the system. To speak is to abstract. To abstract is to exclude. To exclude is to falsify. Every time we speak of a thing, we lose the thing itself. Language functions best not as a window to the real but as a veil — translucent, patterned, and perpetually in the way.

So what, then, are our Truths™? They are narratives that have won. Stories that survived the epistemic hunger games. They are rendered authoritative not by accuracy, but by resonance — psychological, cultural, institutional. A “truth” is what is widely accepted, not because it is right, but because it is rhetorically unassailable — for now.

This is the dirty secret of epistemology: coherence masquerades as correspondence. If enough concepts link arms convincingly, we grant them status. Not because they touch reality, but because they echo each other convincingly in our linguistic theatre.

Libet’s experiment, Foucault’s genealogies, McGilchrist’s hemispheric metaphors — each peels back the curtain in its own way. Libet shows that agency might be a post-hoc illusion. Foucault reveals that disciplines don’t describe the subject; they produce it. McGilchrist laments that the Emissary now rules the Master, and the world is flatter for it.

But all of them — and all of us — are trapped in the same game: the tyranny of the signifier. We speak not to uncover truth, but to make truth-sounding noises. And the tragedy is, we often convince ourselves.

So no, we cannot escape the prison of language. But we can acknowledge its bars. And maybe, just maybe, we can rattle them loudly enough that others hear the clank.

Until then, we continue — philosophers, scientists, diagnosticians, rhetoricians — playing epistemology like a parlour game with rigged dice, congratulating each other on how well the rules make sense.

And why wouldn’t they? We wrote them.

The Scourge They’re Really Fighting Is Ambiguity

A Sequel to “The Disorder of Saying No” and a Companion to “When ‘Advanced’ Means Genocide”

In my previous post, The Disorder of Saying No, I explored the way resistance to authority is pathologised, particularly when that authority is cloaked in benevolence and armed with diagnostic manuals. When one refuses — gently, thoughtfully, or with a sharp polemic — one is no longer principled. One is “difficult.” Or in my case, oppositional.

Audio: NotebookLM podcast on this topic.

So when I had the gall to call out Bill Maher for his recent linguistic stunt — declaring that a woman is simply “a person who menstruates” — I thought I was doing the rational thing: pointing out a classic bit of reductionist nonsense masquerading as clarity. Maher, after all, was not doing biology. He was playing lexicographer-in-chief, defining a term with centuries of philosophical, sociological, and political baggage as though it were a checkbox on a medical form.

I said as much: that he was abusing his platform, presenting himself as the sole arbiter of the English language, and that his little performance was less about clarity and more about controlling the terms of discourse.

My friend, a post-menopausal woman herself, responded not by engaging the argument, but by insinuating — as others have — that I was simply being contrary. Oppositional. Difficult. Again. (She was clearly moved by When “Advanced” Means Genocide, but may have missed the point.)

So let’s unpack this — not to win the debate, but to show what the debate actually is.

This Isn’t About Biology — It’s About Boundary Maintenance

Maher’s statement wasn’t intended to clarify. It was intended to exclude. It wasn’t some linguistic slip; it was a rhetorical scalpel — one used not to analyse, but to amputate.

And the applause from some cisgender women — particularly those who’ve “graduated” from menstruation — reveals the heart of the matter: it’s not about reproductive biology. It’s about controlling who gets to claim the term woman.

Let’s steelman the argument, just for the sport of it:

Menstruation is a symbolic threshold. Even if one no longer menstruates, having done so places you irrevocably within the category of woman. It’s not about exclusion; it’s about grounding identity in material experience.

Fine. But now let’s ask:

  • What about women who’ve never menstruated?
  • What about intersex people?
  • What about trans women?
  • What about cultures with radically different markers of womanhood?

You see, it only works if you pretend the world is simpler than it is.

The Language Insufficiency Hypothesis: Applied

This is precisely where the Language Insufficiency Hypothesis earns its keep.

The word woman is not a locked vault. It is a floating signifier, to borrow from Barthes — a term whose meaning is perpetually re-negotiated in use. There is no singular essence to the word. It is not rooted in biology, nor in social role, nor in performance. It is a hybrid, historically contingent construct — and the moment you try to fix its meaning, it slips sideways like a greased Wittgensteinian beetle.

“Meaning is use,” says Wittgenstein, and this is what frightens people.

If woman is defined by use and not by rule, then anyone might claim it. And suddenly, the club is no longer exclusive.

That’s the threat Maher and his defenders are really reacting to. Not trans women. Not intersex people. Not language activists or queer theorists.

The threat is ambiguity.

What They Want: A World That Can Be Named

The push for rigid definitions — for menstruation as membership — is a plea for a world that can be named and known. A world where words are secure, stable, and final. Where meaning doesn’t leak.

But language doesn’t offer that comfort.

It never did.

And when that linguistic instability gets too close to something personal, like gender identity, or the foundation of one’s own sense of self, the defensive response is to fortify the language, as though building walls around a collapsing church.

Maher’s defenders aren’t making scientific arguments. They’re waging semantic warfare. If they can hold the definition, they can win the cultural narrative. They can hold the gates to Womanhood and keep the undesirables out.

That’s the fantasy.

But language doesn’t play along.

Conclusion: Words Will Not Save You — but They Might Soothe the Dead

In the end, Maher’s definition is not merely incorrect. It is insufficient. It cannot accommodate the complexity of lived experience and cannot sustain the illusion of clarity for long.

And those who cling to it — friend or stranger, progressive or conservative — are not defending biology. They are defending nostalgia. Specifically, a pathological nostalgia for a world that no longer exists, and arguably never did: a world where gender roles were static, language was absolute, and womanhood was neatly circumscribed by bodily functions and suburban etiquette.

Ozzie and Harriet loom large here — not as individuals but as archetypes. Icons of a mid-century dream in which everyone knew their place, and deviation was something to be corrected, not celebrated. My friend, of that generation, clings to this fantasy not out of malice but out of a desperate yearning for order. The idea that woman could mean many things, and mean them differently across contexts, is not liberating to her — it’s destabilising.

But that world is gone. And no amount of menstruation-based gatekeeping will restore it.

The Real Scourge Is Ambiguity

Maher’s tantrum wasn’t about truth. It was about fear — fear of linguistic drift, of gender flux, of a world in which meaning no longer obeys. The desire to fix the definition of “woman” is not a biological impulse. It’s a theological one.

And theology, like nostalgia, often makes terrible policy.

This is why the Language Insufficiency Hypothesis matters. Because it reminds us that language does not stabilise reality — it masks its instability. The attempt to define “woman” once and for all is not just futile — it’s an act of violence against difference, a linguistic colonisation of lived experience.

So Let Them Rest

Ozzie and Harriet are dead. Let them rest.
Let their picket fence moulder. Let their signage decay.

The world has moved on. The language is shifting beneath your feet. And no amount of retroactive gatekeeping can halt that tremor.

The club is burning. And the only thing left to save is honesty.

The Disorder of Saying No

A Polite Rebuttal to a Diagnosis I Didn’t Ask For

A dear friend — and I do mean dear, though this may be the last time they risk diagnosing me over brunch — recently suggested, with all the benevolent concern of a well-meaning inquisitor, that I might be showing signs of Oppositional Defiant Disorder.

You know the tone: “I say this with love… but have you considered that your refusal to play nicely with institutions might be clinical?”

Let’s set aside the tea and biscuits for a moment and take a scalpel to this charming little pathology. Because if ODD is a diagnosis, then I propose we start diagnosing systems — not people.

Audio: NotebookLM podcast on this topic.

When the Empire Diagnoses Its Rebels

Oppositional Defiant Disorder, for those blissfully unscarred by its jargon, refers to a “persistent pattern” of defiance, argumentativeness, rule-breaking, and — the pièce de résistance — resentment of authority. In other words, it is a medical label for being insufficiently obedient.

What a marvel: not only has resistance been de-politicised, it has been medicalised. The refusal to comply is not treated as an ethical stance or a contextual response, but as a defect of the self. The child (or adult) is not resisting something; they are resisting everything, and this — according to the canon — makes them sick.

One wonders: sick according to whom?

Derrida’s Diagnosis: The Binary Fetish

Jacques Derrida, of course, would waste no time in eviscerating the logic at play. ODD depends on a structural binary: compliant/defiant, healthy/disordered, rule-follower/troublemaker. But, as Derrida reminds us, binaries are not descriptive — they are hierarchies in disguise. One term is always elevated; the other is marked, marginal, suspect.

Here, “compliance” is rendered invisible — the assumed baseline, the white space on the page. Defiance is the ink that stains it. But this only works because “normal” has already been declared. The system names itself sane.

Derrida would deconstruct this self-justifying loop and note that disorder exists only in relation to an order that never justifies itself. Why must the subject submit? That’s not up for discussion. The child who asks that question is already halfway to a diagnosis.

Foucault’s Turn: Disciplinary Power and the Clinic as Court

Enter Foucault, who would regard ODD as yet another exquisite specimen in the taxonomy of control. For him, modern power is not exercised through visible violence but through the subtler mechanisms of surveillance, normalisation, and the production of docile bodies.

ODD is a textbook case of biopower — the system’s ability to define and regulate life itself through classification, diagnosis, and intervention. It is not enough for the child to behave; they must believe. They must internalise authority to the marrow. To question it, or worse, to resent it, is to reveal one’s pathology.

This is not discipline; this is soulcraft. And ODD is not a disorder — it is a symptom of a civilisation that cannot tolerate unmediated subjectivity. See Discipline & Punish.

Ivan Illich: The Compulsory Institutions of Care

Illich would call the whole charade what it is: a coercive dependency masquerading as therapeutic care. In Deschooling Society, he warns of systems — especially schools — that render people passive recipients of norms. ODD, in this light, is not a syndrome. It is the final gasp of autonomy before it is sedated.

What the diagnosis reveals is not a child in crisis, but an institution that cannot imagine education without obedience. Illich would applaud the so-called defiant child for doing the one thing schools rarely reward: thinking.

R.D. Laing: Sanity as a Political Position

Laing, too, would recognise the ruse. His anti-psychiatry position held that “madness” is often the only sane response to a fundamentally broken world. ODD is not insanity — it is sanity on fire. It is the refusal to adapt to structures that demand submission as a prerequisite for inclusion.

To quote Laing: “They are playing a game. They are playing at not playing a game. If I show them I see they are, I shall break the rules and they will punish me. I must play their game, of not seeing I see the game.”

ODD is what happens when a child refuses to play the game.

bell hooks: Refusal as Liberation

bell hooks, writing in Teaching to Transgress, framed the classroom as a potential site of radical transformation — if it rejects domination. The child who refuses to be disciplined is often the one who sees most clearly that the system has confused education with indoctrination.

Resistance, hooks argues, is not a flaw. It is a form of knowledge. ODD becomes, in this frame, a radical pedagogy. The defiant student is not failing — they are teaching.

Deleuze & Guattari: Desire Against the Machine

And then, should you wish to watch the diagnostic edifice melt entirely, we summon Deleuze and Guattari. For them, the psyche is not a plumbing system with blockages, but a set of desiring-machines short-circuiting the factory floor of capitalism and conformity.

ODD, to them, would be schizoanalysis in action — a body refusing to be plugged into the circuits of docility. The tantrum, the refusal, the eye-roll: these are not symptoms. They are breakdowns in the control grid.

The child isn’t disordered — the system is. The child simply noticed.

Freire: The Educated Oppressed

Lastly, Paulo Freire would ask: What kind of pedagogy demands the death of resistance? In Pedagogy of the Oppressed, he warns of an education model that treats students as empty vessels. ODD, reframed, is the moment a subject insists on being more than a receptacle.

In refusing the “banking model” of knowledge, the so-called defiant child is already halfway to freedom. Freire would call this not a disorder but a moment of awakening.

Conclusion: Diagnostic Colonialism

So yes, dear friend — I am oppositional. I challenge authority, especially when it mistakes its position for truth. I argue, question, resist. I am not unwell for doing so. I am, if anything, allergic to the idea that obedience is a virtue in itself.

Let us be clear: ODD is not a mirror held up to the subject. It is a spotlight shining from the system, desperately trying to blind anyone who dares to squint.

Now, shall we talk about your compliance disorder?


Full Disclosure: I used ChatGPT for insights beyond Derrida and Foucault, two of my mainstays.

On Ishiguro, Cioran, and Whatever I Think I’m Doing

Sora-generated image of Emil Cioran and Kazuo Ishiguro reading a generic book together

Having just finished Never Let Me Go by Kazuo Ishiguro, I’ve now cracked open my first taste of Cioran—History and Utopia. You might reasonably ask why. Why these two? And what, if anything, do they have in common? Better yet—what do the three of us have in common?

Audio: NotebookLM podcast on this topic.

Recently, I finished writing a novella titled Propensity (currently gathering metaphorical dust on the release runway). Out of curiosity—or narcissism—I fed it to AI and asked whose style it resembled. Among the usual suspects were two names I hadn’t yet read: Ishiguro and Cioran. I’d read the others and understood the links. These two, though, were unknown quantities. So I gave them a go.

Ishiguro is perhaps best known for The Remains of the Day, which, like Never Let Me Go, got the Hollywood treatment. I chose the latter, arbitrarily. I even asked ChatGPT to compare both books with their cinematic counterparts. The AI was less than charitable, describing Hollywood’s adaptations as bastardised and bowdlerised—flattened into tidy narratives for American palates too dim to digest ambiguity. On this, we agree.

What struck me about Never Let Me Go was its richly textured mundanity. That’s apparently where AI saw the resemblance to Propensity. I’m not here to write a book report—partly because I detest spoilers, and partly because summaries miss the point. It took about seven chapters before anything ‘happened’, and then it kept happening. What had at first seemed like a neurotic, wandering narrative from the maddeningly passive Kathy H. suddenly hooked me. The reveals began to unfold. It’s a book that resists retelling. It demands firsthand experience. A vibe. A tone. A slow, aching dread.

Which brings me neatly to Cioran.

History and Utopia is a collection of essays penned in French (not his mother tongue, but you’d never guess it) while Cioran was holed up in postwar Paris. I opted for the English translation—unapologetically—and was instantly drawn in. His prose? Electric. His wit? Acidic. If Ishiguro was a comparison of style, then Cioran was one of spirit. Snark, pessimism, fatalistic shrugs toward civilisation—finally, someone speaking my language.

Unlike the cardboard cut-outs of Cold War polemics we get from most Western writers of the era, Cioran’s take is layered, uncomfortably self-aware, and written by someone who actually fled political chaos. There’s no naïve idealism here, no facile hero-villain binaries. Just a deeply weary intellect peering into the abyss and refusing to blink. It’s not just what he says, but the tone—the curled-lip sneer at utopian pretensions and historical self-delusions. If I earned even a drop of that comparison, I’ll take it.

Both Ishiguro and Cioran delivered what I didn’t know I needed: the reminder that some writers aren’t there to tell you a story. They’re there to infect you with an atmosphere. An idea. A quiet existential panic you can’t shake.

I’ve gotten what I came for from these two, though I suspect I’ll be returning, especially to Cioran. Philosophically, he’s my kind of bastard. I doubt this’ll be my last post on his work.

The Trust Myth: Harari’s Binary and the Collapse of Political Credibility

Yuval Noah Harari, always ready with a digestible morsel for the TED-addled masses, recently declared that “democracy runs on trust, dictatorship on terror.” It’s a line with the crispness of a fortune cookie and about as much analytical depth. Designed for applause, not interrogation, it’s the sort of soundbite that flatters liberal sensibilities while sanding off the inconvenient edges of history.

Audio: NotebookLM podcast on this topic.

Let’s be honest: this dichotomy is not merely simplistic – it’s a rhetorical sedative. It reassures those who still believe political systems are like kitchen appliances: plug-and-play models with clear instructions and honest warranties. But for anyone who’s paid attention to the actual mechanics of power, this framing is delusional.

1. Trust Was Never Earned

In the United States, trust in democratic institutions was never some noble compact forged through mutual respect and enlightened governance. It was cultivated through exclusion, propaganda, and economic bribery. The post-WWII boom offered the illusion of institutional legitimacy – but only if you were white, male, middle-class, and preferably asleep.

Black Americans, Indigenous peoples, immigrants, women – none were granted the luxury of naïve trust. They were told to trust while being actively disenfranchised. To participate while being systemically excluded. So no, Harari, the machine didn’t run on trust. It ran on marketing. It ran on strategic ignorance.

2. Dictatorship Doesn’t Require Terror

Equally cartoonish is the notion that dictatorships subsist purely on terror. Many of them run quite comfortably on bureaucracy, passive conformity, and the grim seduction of order. Authoritarians know how to massage the same trust reflexes as democracies – only more bluntly. People don’t just obey out of fear. They obey out of habit. Out of resignation. Out of a grim kind of faith that someone – anyone – is in charge.

Dictatorships don’t extinguish trust. They re-route it. Away from institutions and toward strongmen. Toward myths of national greatness. Toward performative stability. It’s not that terror is absent—it’s just not the whole machine. The real engine is misplaced trust.

3. Collapse Is Bipartisan

The present moment isn’t about the erosion of a once-trustworthy system. It’s the slow-motion implosion of a confidence game on all sides. The old liberal institutions are collapsing under the weight of their hypocrisies. But the loudest critics – tech messiahs, culture warriors, authoritarian nostalgists – are no better. Their solutions are just new brands of snake oil in sleeker bottles.

Everyone is pointing fingers, and no one is credible. The public, caught between cynicism and desperation, gravitates either toward restoration fantasy (“make democracy work again”) or authoritarian theatre (“at least someone’s doing something”). Both are dead ends.

4. The Only Way Forward: Structural Reimagination

The only viable path isn’t restoration or regression. It’s reinvention. Systems that demand unconditional trust – like religions and stock markets – are bound to fail, because they rely on sustained illusions. Instead, we need systems built on earned, revocable, and continually tested trust – systems that can survive scrutiny, decentralise power, and adapt to complexity.

In other words: stop trying to repair a house built on sand. Build something else. Something messier, more modular, less mythological.

Let the TED crowd have their slogans. We’ve got work to do.

Unwilling: The Neuroscience Against Free Will

Why the cherished myth of human autonomy dissolves under the weight of our own biology

We cling to free will like a comfort blanket—the reassuring belief that our actions spring from deliberation, character, and autonomous choice. This narrative has powered everything from our justice systems to our sense of personal achievement. It feels good, even necessary, to believe we author our own stories.

But what if this cornerstone of human self-conception is merely a useful fiction? What if, with each advance in neuroscience, our cherished notion of autonomy becomes increasingly untenable?

Audio: NotebookLM podcast on this topic.

I. The Myth of Autonomy: A Beautiful Delusion

Free will requires that we—some essential, decision-making “self”—stand somehow separate from the causal chains of biology and physics. But where exactly would this magical pocket of causation exist? And what evidence do we have for it?

Your preferences, values, and impulses emerge from a complex interplay of factors you never chose:

The genetic lottery determined your baseline neurochemistry and cognitive architecture before your first breath. You didn’t select your dopamine sensitivity, your amygdala reactivity, or your executive function capacity.

The hormonal symphony that controls your emotional responses operates largely beneath conscious awareness. These chemical messengers—testosterone, oxytocin, and cortisol—don’t ask permission before altering your perceptions and priorities.

Environmental exposures—from lead in your childhood drinking water to the specific traumas of your upbringing—have sculpted neural pathways you didn’t design and can’t easily rewire.

Developmental contingencies have shaped your moral reasoning, impulse control, and capacity for empathy through processes invisible to conscious inspection.

Your prized ability to weigh options, inhibit impulses, and make “rational” choices depends entirely on specific brain structures—particularly the dorsolateral prefrontal cortex (DLPFC)—operating within a neurochemical environment you inherited rather than created.

You occupy this biological machinery; you do not transcend it. Yet society holds you responsible for its outputs as if you stood separate from these deterministic processes.

II. The DLPFC: Puppet Master of Moral Choice

The dorsolateral prefrontal cortex serves as command central for what we proudly call executive function—our capacity to plan, inhibit, decide, and morally judge. We experience its operations as deliberation, as the weighing of options, as the essence of choice itself.

And yet this supposed seat of autonomy can be manipulated with disturbing ease.

When researchers apply transcranial magnetic stimulation to inhibit DLPFC function, test subjects make dramatically different moral judgments about identical scenarios. Under different stimulation protocols, the same person arrives at contradictory conclusions about right and wrong without any awareness of the external influence.

Similarly, transcranial direct current stimulation over the DLPFC alters moral reasoning, especially regarding personal moral dilemmas. The subject experiences these externally induced judgments as entirely their own, with no sense that their moral compass has been hijacked.

If our most cherished moral deliberations can be redirected through simple electromagnetic manipulation, what does this reveal about the nature of “choice”? If will can be so easily influenced, how free could it possibly be?

III. Hormonal Puppetmasters: The Will in Your Bloodstream

Your decision-making machinery doesn’t stop at neural architecture. Your hormonal profile actively shapes what you perceive as your autonomous choices.

Consider oxytocin, popularly known as the “love hormone.” Research suggests that elevated oxytocin levels can enhance feelings of guilt and shame while reducing willingness to harm others. This isn’t a subtle effect—it’s a direct biological override of what you might otherwise “choose.”

Testosterone tells an equally compelling story. Administration of this hormone increases utilitarian moral judgments, particularly when such decisions involve aggression or social dominance. The subject doesn’t experience this as a foreign influence but as their own authentic reasoning.

These aren’t anomalies or edge cases. They represent the normal operation of the biological systems governing what we experience as choice. You aren’t choosing so much as regulating, responding, and rebalancing a biochemical economy you inherited rather than designed.

IV. The Accident of Will: Uncomfortable Conclusions

If the will can be manipulated through such straightforward biological interventions, was it ever truly “yours” to begin with?

Philosopher Galen Strawson’s causa sui argument becomes unavoidable here: to be morally responsible, one must be the cause of oneself, yet no one creates their own neural and hormonal architecture. To have chosen your own nature, you would have needed prior preferences to choose it by – preferences you would also have had to choose, and so on, in an infinite regress. By extension, no one can be ultimately responsible for actions emerging from that architecture.

What we dignify as “will” may be nothing more than a fortunate (or unfortunate) biochemical accident—the particular configuration of neurons and neurochemicals you happened to inherit and develop.

This lens forces unsettling questions:

  • How many behaviours we praise or condemn are merely phenotypic expressions masquerading as choices? How many acts of cruelty or compassion reflect neurochemistry rather than character?
  • How many punishments and rewards are we assigning not to autonomous agents, but to biological processes operating beyond conscious control?
  • And perhaps most disturbingly: If we could perfect the moral self through direct biological intervention—rewiring neural pathways or adjusting neurotransmitter levels to ensure “better” choices—should we?
  • Or would such manipulation, however well-intentioned, represent the final acknowledgement that what we’ve called free will was never free at all?

A Compatibilist Rebuttal? Not So Fast.

Some philosophers argue for compatibilism, the view that determinism and free will can coexist if we redefine free will as “uncoerced action aligned with one’s desires.” But this semantic shuffle doesn’t rescue moral responsibility.

If your desires themselves are products of biology and environment—if even your capacity to evaluate those desires depends on inherited neural architecture—then “acting according to your desires” just pushes the problem back a step. You’re still not the ultimate author of those desires or your response to them.

What’s Left?

Perhaps we need not a defence of free will but a new framework for understanding human behaviour—one that acknowledges our biological embeddedness while preserving meaningful concepts of agency and responsibility without magical thinking.

The evidence doesn’t suggest we are without agency; it suggests our agency operates within biological constraints we’re only beginning to understand. The question isn’t whether biology influences choice—it’s whether anything else does.

For now, the neuroscientific evidence points in one direction: The will exists, but its freedom is the illusion.

Hungering for Morality: When Right and Wrong Are Just a Matter of PR

Full Disclosure: I read the first volume of The Hunger Games just before the film was released. It was OK – certainly better than the film. This video came across my feed, and I skimmed through it. Near the end, this geezer claims that Katniss saves, or recovers, a deteriorated morality. Me being me, I took issue with the very notion that a relative, if not outright subjective, concept could be “recovered” at all.

The OP asks whether The Hunger Games is a classic. I’d argue that it is a categorical classic – like Harry Potter – within the category of YA fiction.

Audio: NotebookLM podcast discussing this topic.

The Hunger Games doesn’t depict the death of morality — it’s a masterclass in how to twist it into a circus act.

Video: YouTube video that spawned this topic.

Let us dispense with the hand-wringing. The Hunger Games is not a parable of moral decay. It is something far more chilling: a vivid portrait of moral engineering — the grotesque contortion of ethical instincts into instruments of domination and spectacle.

Those who bemoan the “decline of morality” in Panem have rather missed the point. There is no absence of morality in the Capitol — only a different version of it. A rebranded, corporatised, state-sanctioned morality, lacquered in lipstick and broadcast in 4K. It is not immorality that reigns, but a hyperactive ideological morality, designed to keep the masses docile and the elites draped in silk.

This is not moral entropy; it’s moral mutation.

Children are not slaughtered because people have forgotten right from wrong — they are slaughtered because a society has been trained to believe that this is what justice looks like. That blood is penance. That fear is unity. That watching it all unfold with a glass of champagne in hand is perfectly civilised behaviour.

This isn’t the death of morality. It’s a hostile takeover.

The Moral PR Machine

If morality is, as many of us suspect, relative — a cultural construct built on consensus, coercion, and convenience — then it can no more “decline” than fashion trends can rot. It simply shifts. One day, shoulder pads are in. The next, it’s child-on-child murder as prime-time entertainment.

In Panem, the moral compass has not vanished. It’s been forcibly recalibrated. Not by reason or revelation, but by propaganda and fear. The Games are moral theatre. A grim ritual, staged to remind the Districts who holds the reins, all under the nauseating guise of tradition, order, and justice.

The citizens of the Capitol aren’t monsters — they’re consumers. Trained to see horror as haute couture. To mistake power for virtue. To cheer while children are butchered, because that’s what everyone else is doing — and, crucially, because they’ve been taught it’s necessary. Necessary evils are the most seductive kind.

Katniss: Not a Saint, But a Saboteur

Enter Katniss Everdeen, not as the moral saviour but as the spanner in the machine. She doesn’t preach. She doesn’t have a grand theory of justice. What she has is visceral disgust — an animal revulsion at the machinery of the Games. Her rebellion is personal, tribal, and instinctive: protect her sister, survive, refuse to dance for their amusement.

She isn’t here to restore some lost golden age of decency. She’s here to tear down the current script and refuse to read her lines.

Her defiance is dangerous not because it’s moral in some abstract, universal sense — but because it disrupts the Capitol’s moral narrative. She refuses to be a pawn in their ethical pageant. She reclaims agency in a world that has commodified virtue and turned ethics into state theatre.

So, Has Morality Declined?

Only if you believe morality has a fixed address — some eternal North Star by which all human actions may be judged. But if, as postmodernity has rather insistently suggested, morality is a shifting social fiction — then Panem’s horror is not a fall from grace, but a recalibration of what counts as “grace” in the first place.

And that’s the real horror, isn’t it? Not that morality has collapsed — but that it still exists, and it likes what it sees.

Conclusion: The Real Hunger

The Hunger Games is not about a society starved of morality — it’s about a world gorging on it, cooked, seasoned, and served with a garnish of guiltless indulgence. It is moral appetite weaponised. Ethics as edict. Conscience as costume.

If you feel sickened by what you see in Panem, it’s not because morality has vanished.

It’s because it hasn’t.

The Tyranny of “Human Nature”

There is a kind of political necromancy afoot in modern discourse—a dreary chant murmured by pundits, CEOs, and power-drunk bureaucrats alike: “It’s just human nature.” As if this incantation explains, excuses, and absolves all manner of violent absurdities. As if, by invoking the mystic forces of evolution or primal instinct, one can justify the grotesque state of things. Income inequality? Human nature. War? Human nature. Corporate psychopathy? Oh, sweetie, it’s just how we’re wired.

What a convenient mythology.

Audio: NotebookLM podcast on this topic.

If “human nature” is inherently brutish and selfish, then resistance is not only futile, it is unnatural. The doctrine of dominance gets sanctified, the lust to rule painted as destiny rather than deviance. Meanwhile, the quiet, unglamorous yearning of most people—to live undisturbed, to coöperate rather than conquer—is dismissed as naïve, childish, and unrealistic. How curious that the preferences of the vast majority are always sacrificed at the altar of some aggressive minority’s ambitions.

Let us dispense with this dogma. The desire to dominate is not a feature of human nature writ large; it is a glitch exploited by systems that reward pathological ambition. Most of us would rather not be ruled, and certainly not managed by glorified algorithms in meat suits. The real human inclination, buried beneath centuries of conquest and control, is to live in peace, tend to our gardens, and perhaps be left the hell alone.

And yet, we are not. Because there exists a virulent cohort—call them oligarchs, executives, generals, kings—whose raison d’être is the acquisition and consolidation of power. Not content to build a life, they must build empires. Not content to share, they must extract. They regard the rest of us as livestock: occasionally troublesome, but ultimately manageable.

To pacify us, they offer the Social Contract™—a sort of ideological bribe that says, “Give us your freedom, and we promise not to let the wolves in.” But what if the wolves are already inside the gates, wearing suits and passing legislation? What if the protection racket is the threat itself?

So no, it is not “human nature” that is the problem. Cancer is natural, too, but we don’t celebrate its tenacity. We treat it, research it, and fight like hell to survive it. Likewise, we must treat pathological power-lust not as an inevitability to be managed but as a disease to be diagnosed and dismantled.

The real scandal isn’t that humans sometimes fail to coöperate. It’s that we’re constantly told we’re incapable of it by those whose power depends on keeping it that way.

Let the ruling classes peddle their myths. The rest of us might just choose to write new ones.