The Rhetoric of Realism: When Language Pretends to Know

Let us begin with the heresy: Truth is a rhetorical artefact. Not a revelation. Not a metaphysical essence glimmering behind the veil. Just language — persuasive, repeatable, institutionally ratified language. In other words: branding.

Audio: NotebookLM podcast on this topic.

This is not merely a postmodern tantrum thrown at the altar of Enlightenment rationalism. It is a sober, if impolite, reminder that nearly everything we call “knowledge” is stitched together with narrative glue and semantic spit. Psychology. Neuroscience. Ethics. Economics. Each presents itself as a science — or worse, a moral imperative — but their foundations are built atop a linguistic faultline. They are, at best, elegant approximations; at worst, dogma in drag.

Let’s take psychology. Here is a field that diagnoses your soul via consensus. A committee of credentialed clerics sits down and declares a cluster of behaviours to be a disorder, assigns it a code, and hands you a script. It is then canonised in the DSM, the Diagnostic Scripture Manual. Doubt its legitimacy and you are either naïve or ill — which is to say, you’ve just confirmed the diagnosis. It’s a theological trap dressed in the language of care.

Or neuroscience — the church of the glowing blob. An fMRI shows a region “lighting up” and we are meant to believe we’ve located the seat of love, the anchor of morality, or the birthplace of free will. Never mind that we’re interpreting blood-oxygen fluctuations in composite images smoothed by statistical witchcraft. It looks scientific, therefore it must be real. The map is not the territory, but in neuroscience, it’s often a mood board.

And then there is language itself, the medium through which all these illusions are transmitted. It is the stage, the scenery, and the unreliable narrator. My Language Insufficiency Hypothesis proposes that language is not simply a flawed tool — it is fundamentally unfit for the task it pretends to perform. It was forged in the furnace of survival, not truth. We are asking a fork to play the violin.

This insufficiency is not an error to be corrected by better definitions or clever metaphors. It is the architecture of the system. To speak is to abstract. To abstract is to exclude. To exclude is to falsify. Every time we speak of a thing, we lose the thing itself. Language functions best not as a window to the real but as a veil — translucent, patterned, and perpetually in the way.

So what, then, are our Truths™? They are narratives that have won. Stories that survived the epistemic hunger games. They are rendered authoritative not by accuracy, but by resonance — psychological, cultural, institutional. A “truth” is what is widely accepted, not because it is right, but because it is rhetorically unassailable — for now.

This is the dirty secret of epistemology: coherence masquerades as correspondence. If enough concepts link arms convincingly, we grant them status. Not because they touch reality, but because they echo each other in our linguistic theatre.

Libet’s experiment, Foucault’s genealogies, McGilchrist’s hemispheric metaphors — each peels back the curtain in its own way. Libet shows that agency might be a post-hoc illusion. Foucault reveals that disciplines don’t describe the subject; they produce it. McGilchrist laments that the Emissary now rules the Master, and the world is flatter for it.

But all of them — and all of us — are trapped in the same game: the tyranny of the signifier. We speak not to uncover truth, but to make truth-sounding noises. And the tragedy is, we often convince ourselves.

So no, we cannot escape the prison of language. But we can acknowledge its bars. And maybe, just maybe, we can rattle them loudly enough that others hear the clank.

Until then, we continue — philosophers, scientists, diagnosticians, rhetoricians — playing epistemology like a parlour game with rigged dice, congratulating each other on how well the rules make sense.

And why wouldn’t they? We wrote them.

The Scourge They’re Really Fighting Is Ambiguity

A Sequel to “The Disorder of Saying No” and a Companion to “When ‘Advanced’ Means Genocide”

In my previous post, The Disorder of Saying No, I explored the way resistance to authority is pathologised, particularly when that authority is cloaked in benevolence and armed with diagnostic manuals. When one refuses — gently, thoughtfully, or with a sharp polemic — one is no longer principled. One is “difficult.” Or in my case, oppositional.


So when I had the gall to call out Bill Maher for his recent linguistic stunt — declaring that a woman is simply “a person who menstruates” — I thought I was doing the rational thing: pointing out a classic bit of reductionist nonsense masquerading as clarity. Maher, after all, was not doing biology. He was playing lexicographer-in-chief, defining a term with centuries of philosophical, sociological, and political baggage as though it were a checkbox on a medical form.

I said as much: that he was abusing his platform, presenting himself as the sole arbiter of the English language, and that his little performance was less about clarity and more about controlling the terms of discourse.

My friend, a post-menopausal woman herself, responded not by engaging the argument, but by insinuating — as others have — that I was simply being contrary. Oppositional. Difficult. Again. (She was clearly moved by When “Advanced” Means Genocide, but may have missed the point.)

So let’s unpack this — not to win the debate, but to show what the debate actually is.

This Isn’t About Biology — It’s About Boundary Maintenance

Maher’s statement wasn’t intended to clarify. It was intended to exclude. It wasn’t some linguistic slip; it was a rhetorical scalpel — one used not to analyse, but to amputate.

And the applause from some cisgender women — particularly those who’ve “graduated” from menstruation — reveals the heart of the matter: it’s not about reproductive biology. It’s about controlling who gets to claim the term woman.

Let’s steelman the argument, just for the sport of it:

Menstruation is a symbolic threshold. Even if one no longer menstruates, having done so places you irrevocably within the category of woman. It’s not about exclusion; it’s about grounding identity in material experience.

Fine. But now let’s ask:

  • What about women who’ve never menstruated?
  • What about intersex people?
  • What about trans women?
  • What about cultures with radically different markers of womanhood?

You see, it only works if you pretend the world is simpler than it is.

The Language Insufficiency Hypothesis: Applied

This is precisely where the Language Insufficiency Hypothesis earns its keep.

The word woman is not a locked vault. It is a floating signifier, to borrow from Lévi-Strauss — a term whose meaning is perpetually re-negotiated in use. There is no singular essence to the word. It is not rooted in biology, nor in social role, nor in performance. It is a hybrid, historically contingent construct — and the moment you try to fix its meaning, it slips sideways like a greased Wittgensteinian beetle.

“Meaning is use,” says Wittgenstein, and this is what frightens people.

If woman is defined by use and not by rule, then anyone might claim it. And suddenly, the club is no longer exclusive.

That’s the threat Maher and his defenders are really reacting to. Not trans women. Not intersex people. Not language activists or queer theorists.

The threat is ambiguity.

What They Want: A World That Can Be Named

The push for rigid definitions — for menstruation as membership — is a plea for a world that can be named and known. A world where words are secure, stable, and final. Where meaning doesn’t leak.

But language doesn’t offer that comfort.

It never did.

And when that linguistic instability gets too close to something personal, like gender identity, or the foundation of one’s own sense of self, the defensive response is to fortify the language, as though building walls around a collapsing church.

Maher’s defenders aren’t making scientific arguments. They’re waging semantic warfare. If they can hold the definition, they can win the cultural narrative. They can hold the gates to Womanhood and keep the undesirables out.

That’s the fantasy.

But language doesn’t play along.

Conclusion: Words Will Not Save You — but They Might Soothe the Dead

In the end, Maher’s definition is not merely incorrect. It is insufficient. It cannot accommodate the complexity of lived experience and cannot sustain the illusion of clarity for long.

And those who cling to it — friend or stranger, progressive or conservative — are not defending biology. They are defending nostalgia. Specifically, a pathological nostalgia for a world that no longer exists, and arguably never did: a world where gender roles were static, language was absolute, and womanhood was neatly circumscribed by bodily functions and suburban etiquette.

Ozzie and Harriet loom large here — not as individuals but as archetypes. Icons of a mid-century dream in which everyone knew their place, and deviation was something to be corrected, not celebrated. My friend, of that generation, clings to this fantasy not out of malice but out of a desperate yearning for order. The idea that woman could mean many things, and mean them differently across contexts, is not liberating to her — it’s destabilising.

But that world is gone. And no amount of menstruation-based gatekeeping will restore it.

The Real Scourge Is Ambiguity

Maher’s tantrum wasn’t about truth. It was about fear — fear of linguistic drift, of gender flux, of a world in which meaning no longer obeys. The desire to fix the definition of “woman” is not a biological impulse. It’s a theological one.

And theology, like nostalgia, often makes terrible policy.

This is why the Language Insufficiency Hypothesis matters. Because it reminds us that language does not stabilise reality — it masks its instability. The attempt to define “woman” once and for all is not just futile — it’s an act of violence against difference, a linguistic colonisation of lived experience.

So Let Them Rest

Ozzie and Harriet are dead. Let them rest.
Let their picket fence moulder. Let their signage decay.

The world has moved on. The language is shifting beneath your feet. And no amount of retroactive gatekeeping can halt that tremor.

The club is burning. And the only thing left to save is honesty.

When “Advanced” Means Genocide: A Case Study in Linguistic Implosion

This post draws on themes from my upcoming book, A Language Insufficiency Hypothesis. The transcript below is taken from a publicly available exchange, which you can view here. Consider it Exhibit A in language’s ongoing failure to bear the weight of meaning.

Transcript:

KK: Konstantin Kisin
DFW: Deborah Frances-White

KK: I’m saying we were technologically more advanced.
DFW: So you’re saying we’re superior to Australian Aboriginals?
KK: That’s quite the opposite of what I’m saying. I’m not saying we were superior, I’m saying we were technologically more advanced.
DFW: So, how is that the opposite?
KK: Superior implies a moral quality. I’m not making any moral implication. You seem to be, but what I’m saying is…
DFW: I think most people would hear it that way.
KK: No.
DFW: Again, you’re a very intelligent man. How would most people hear that?
KK: Most people would hear what I’m saying for what I’m saying, which is…
DFW: I don’t think they would.
KK: You seem to get quite heated about this, which is completely unnecessary.
DFW: Um…
KK: You think it’s necessary?
DFW: I’m a bit stunned by what you’re implying.
KK: No, you’re acting in a kind of passive aggressive way which indicates that you’re not happy…
DFW: I genuinely… I’m being 100% authentic. My visceral reaction to a white man sitting and saying to me, “And why were we able to commit genocide on them?” and then just pausing—
KK: Yes.
DFW: …is very visceral to me.
KK: Well, let’s go back. First of all, it’s interesting that you brought up my skin colour because I thought that was the exact opposite of the point you’re trying to make in the book.


The Language Insufficiency Hypothesis begins with this premise: language is not merely flawed, it is structurally inadequate for mediating complex, layered realities – especially those laced with power, morality, and history. This transcript is not a debate. It is a linguistic trench war in which every utterance is laced with shrapnel, and each side thinks they’re defending reason.

Let’s pull a few of the shell casings from the mud.

KK attempts to offer a dry, neutral descriptor. DFW hears supremacist teleology. Why? Because “advanced” is culturally radioactive. It doesn’t merely denote a technical state—it connotes a ladder, with someone inevitably on the bottom rung.

When language carries historical residue, neutrality is a delusion. Words don’t just mean. They echo.

KK is making a semantic distinction. DFW hears a moral claim. Both are right. And both are talking past one another, because language is attempting to cleave affect from description, and it simply can’t.

KK’s insistence—“I’m not saying we’re superior”—is a textbook example of denotative desperation. He believes clarification will rescue intent. But as any linguist (or postcolonial theorist) will tell you: intent does not sterilise implication.

Language cannot be laundered by explanation. Once spoken, words belong to context, not intention.

KK thinks he’s holding a scalpel. DFW hears a cudgel. And here we are.

This is where the wheels come off. KK argues from semantic specificity. DFW argues from sociolinguistic reception. It’s Saussure versus the TikTok algorithm. Neither will win.

Communication disintegrates not because anyone is lying, but because they are playing incompatible games with the same tokens.

DFW’s invocation of “a white man” is not a derailment—it’s the inevitable endpoint of a system where words no longer float free but are yoked to their utterer. This is the moment the failure of language becomes a failure of interlocution. Argument collapses into indexical entrapment.

At this point, you’re no longer debating ideas. You’re defending your right to use certain words at all.

Which brings us to the final breakdown.

KK: I am making a logical distinction.
DFW: I am having a visceral reaction.

The failure isn’t moral. It isn’t historical. It’s grammatical. One is operating in a truth-functional logic game. The other is reacting within a trauma-informed, socially indexed register. These are grammars that do not overlap.

If this brief and brutal dialogue proves anything, it’s this: you cannot extract meaning cleanly from words when the words themselves are sponges for history, hierarchy, and harm. The moment we ask language to do too much—to carry precision, affect, ethics, and identity—it folds in on itself.

And that, dear reader, is precisely the argument of A Language Insufficiency Hypothesis: that meaning does not reside in words, and never has. It lives in the gaps, the silences, the misfires. That’s where the truth—whatever’s left of it—might be hiding.

Follow the wreckage. That’s where the signal lives.

The Disorder of Saying No

A Polite Rebuttal to a Diagnosis I Didn’t Ask For

A dear friend — and I do mean dear, though this may be the last time they risk diagnosing me over brunch — recently suggested, with all the benevolent concern of a well-meaning inquisitor, that I might be showing signs of Oppositional Defiant Disorder.

You know the tone: “I say this with love… but have you considered that your refusal to play nicely with institutions might be clinical?”

Let’s set aside the tea and biscuits for a moment and take a scalpel to this charming little pathology. Because if ODD is a diagnosis, then I propose we start diagnosing systems — not people.


When the Empire Diagnoses Its Rebels

Oppositional Defiant Disorder, for those blissfully unscarred by its jargon, refers to a “persistent pattern” of defiance, argumentativeness, rule-breaking, and — the pièce de résistance — resentment of authority. In other words, it is a medical label for being insufficiently obedient.

What a marvel: not only has resistance been de-politicised, it has been medicalised. The refusal to comply is not treated as an ethical stance or a contextual response, but as a defect of the self. The child (or adult) is not resisting something; they are resisting everything, and this — according to the canon — makes them sick.

One wonders: sick according to whom?

Derrida’s Diagnosis: The Binary Fetish

Jacques Derrida, of course, would waste no time in eviscerating the logic at play. ODD depends on a structural binary: compliant/defiant, healthy/disordered, rule-follower/troublemaker. But, as Derrida reminds us, binaries are not descriptive — they are hierarchies in disguise. One term is always elevated; the other is marked, marginal, suspect.

Here, “compliance” is rendered invisible — the assumed baseline, the white space on the page. Defiance is the ink that stains it. But this only works because “normal” has already been declared. The system names itself sane.

Derrida would deconstruct this self-justifying loop and note that disorder exists only in relation to an order that never justifies itself. Why must the subject submit? That’s not up for discussion. The child who asks that question is already halfway to a diagnosis.

Foucault’s Turn: Disciplinary Power and the Clinic as Court

Enter Foucault, who would regard ODD as yet another exquisite specimen in the taxonomy of control. For him, modern power is not exercised through visible violence but through the subtler mechanisms of surveillance, normalisation, and the production of docile bodies.

ODD is a textbook case of biopower — the system’s ability to define and regulate life itself through classification, diagnosis, and intervention. It is not enough for the child to behave; they must believe. They must internalise authority to the marrow. To question it, or worse, to resent it, is to reveal one’s pathology.

This is not discipline; this is soulcraft. And ODD is not a disorder — it is a symptom of a civilisation that cannot tolerate unmediated subjectivity. See Discipline & Punish.

Ivan Illich: The Compulsory Institutions of Care

Illich would call the whole charade what it is: a coercive dependency masquerading as therapeutic care. In Deschooling Society, he warns of systems — especially schools — that render people passive recipients of norms. ODD, in this light, is not a syndrome. It is the final gasp of autonomy before it is sedated.

What the diagnosis reveals is not a child in crisis, but an institution that cannot imagine education without obedience. Illich would applaud the so-called defiant child for doing the one thing schools rarely reward: thinking.

R.D. Laing: Sanity as a Political Position

Laing, too, would recognise the ruse. His anti-psychiatry position held that “madness” is often the only sane response to a fundamentally broken world. ODD is not insanity — it is sanity on fire. It is the refusal to adapt to structures that demand submission as a prerequisite for inclusion.

To quote Laing: “They are playing a game. They are playing at not playing a game. If I show them I see they are, I shall break the rules and they will punish me. I must play their game, of not seeing I see the game.”

ODD is what happens when a child refuses to play the game.

bell hooks: Refusal as Liberation

bell hooks, writing in Teaching to Transgress, framed the classroom as a potential site of radical transformation — if it rejects domination. The child who refuses to be disciplined is often the one who sees most clearly that the system has confused education with indoctrination.

Resistance, hooks argues, is not a flaw. It is a form of knowledge. ODD becomes, in this frame, a radical pedagogy. The defiant student is not failing — they are teaching.

Deleuze & Guattari: Desire Against the Machine

And then, should you wish to watch the diagnostic edifice melt entirely, we summon Deleuze and Guattari. For them, the psyche is not a plumbing system with blockages, but a set of desiring-machines short-circuiting the factory floor of capitalism and conformity.

ODD, to them, would be schizoanalysis in action — a body refusing to be plugged into the circuits of docility. The tantrum, the refusal, the eye-roll: these are not symptoms. They are breakdowns in the control grid.

The child isn’t disordered — the system is. The child simply noticed.

Freire: The Educated Oppressed

Lastly, Paulo Freire would ask: What kind of pedagogy demands the death of resistance? In Pedagogy of the Oppressed, he warns of an education model that treats students as empty vessels. ODD, reframed, is the moment a subject insists on being more than a receptacle.

In refusing the “banking model” of knowledge, the so-called defiant child is already halfway to freedom. Freire would call this not a disorder but a moment of awakening.

Conclusion: Diagnostic Colonialism

So yes, dear friend — I am oppositional. I challenge authority, especially when it mistakes its position for truth. I argue, question, resist. I am not unwell for doing so. I am, if anything, allergic to the idea that obedience is a virtue in itself.

Let us be clear: ODD is not a mirror held up to the subject. It is a spotlight shining from the system, desperately trying to blind anyone who dares to squint.

Now, shall we talk about your compliance disorder?


Full Disclosure: I used ChatGPT for insights beyond Derrida and Foucault, two of my mainstays.

Semantic Drift: When Language Outruns the Science

Science has a language problem. Not a lack of it – if anything, a surfeit. But words, unlike test tubes, do not stay sterile. They evolve, mutate, and metastasise. They get borrowed, bent, misused, and misremembered. And when the public discourse gets hold of them, particularly on platforms like TikTok, it’s the language that gets top billing. The science? Second lead, if it’s lucky.

Semantic drift is at the centre of this: the gradual shift in meaning of a word or phrase over time. It’s how “literally” came to mean “figuratively,” how “organic” went from “carbon-based” to “morally superior,” and how “theory” in science means robust explanatory framework but in the public square means vague guess with no homework.

In short, semantic drift lets rhetoric masquerade as reason. Once a word acquires enough connotation, you can deploy it like a spell. No need to define your terms when the vibe will do.


When “Vitamin” No Longer Means Vitamin

Take the word vitamin. It sounds objective. Authoritative. Something codified in the genetic commandments of all living things.

But it isn’t.

A vitamin is simply a substance that an organism needs but cannot synthesise internally, and must obtain through its diet. That’s it. It’s a functional definition, not a chemical one.

So:

  • Vitamin C is a vitamin for humans, but not for dogs, cats, or goats. They make their own. We lost the gene. Tough luck.
  • Vitamin D, meanwhile, isn’t a vitamin at all. It’s a hormone, synthesised when sunlight hits your skin. Its vitamin status is a historical relic – named before we knew better, and now marketed too profitably to correct.

But in the land of TikTok and supplement shelves, these nuances evaporate. “Vitamin” has drifted from scientific designation to halo term – a linguistic fig leaf draped over everything from snake oil to ultraviolet-induced steroidogenesis.

The Rhetorical Sleight of Hand

This linguistic slippage is precisely what allows the rhetorical shenanigans to thrive.

In one video, a bloke claims a burger left out for 151 days neither moulds nor decays, and therefore, “nature won’t touch it.” From there, he leaps (with Olympic disregard for coherence) into talk of sugar spikes, mood swings, and “metabolic chaos.” You can almost hear the conspiratorial music rising.

The science here is, let’s be generous, circumstantial. But the language? Oh, the language is airtight.

Words like “processed,” “chemical,” and “natural” are deployed like moral verdicts, not descriptive categories. The implication isn’t argued – it’s assumed, because the semantics have been doing quiet groundwork for years. “Natural” = good. “Chemical” = bad. “Vitamin” = necessary. “Addiction” = no agency.

By the time the viewer blinks, they’re nodding along to a story told by words in costume, not facts in context.

The Linguistic Metabolism of Misunderstanding

This is why semantic drift isn’t just an academic curiosity – it’s a vector. A vector by which misinformation spreads, not through outright falsehood, but through weaponised ambiguity.

A term like “sugar crash” sounds scientific. It even maps onto a real physiological process: postprandial hypoglycaemia. But when yoked to vague claims about mood, willpower, and “chemical hijacking,” it becomes a meme with lab coat cosplay. And the science, if mentioned at all, is there merely to decorate the argument, not drive it.

That’s the crux of my forthcoming book, A Language Insufficiency Hypothesis: that our inherited languages, designed for trade, prayer, and gossip, are woefully ill-equipped for modern scientific clarity. They lag behind our knowledge, and worse, they often distort it.

Words arrive first. Definitions come limping after.

In Closing: You Are What You Consume (Linguistically)

The real problem isn’t that TikTokers get the science wrong. The problem is that they get the words right – right enough to slip past your critical filters. Rhetoric wears the lab coat. Logic gets left in the locker room.

If vitamin C is a vitamin only for some species, and vitamin D isn’t a vitamin at all, then what else are we mislabelling in the great nutritional theatre? What other linguistic zombies are still wandering the scientific lexicon?

Language may be the best tool we have, but don’t mistake it for a mirror. It’s a carnival funhouse – distorting, framing, and reflecting what we expect to see. And until we fix that, science will keep playing second fiddle to the words pretending to explain it.

The Indexing Abyss: A Cautionary Tale in Eight Chapters

There, I said it.

I’m almost finished with A Language Insufficiency Hypothesis, the book I’ve been labouring over for what feels like the gestation period of a particularly reluctant elephant. To be clear: the manuscript is done. Written. Edited. Blessed. But there remains one final circle of publishing hell—the index.

Now, if you’re wondering how motivated I am to return to indexing, consider this: I’m writing this blog post instead. If that doesn’t scream avoidance with an airhorn, nothing will.


I began indexing over a month ago. I made it through two chapters of eight, then promptly wandered off to write a couple of novellas. As you do. One started as a short story—famous last words—and evolved into a novella. The muse struck again. Another “short story” appeared, and like an unattended sourdough starter, it fermented into a 15,000-word novelette. Apparently, I write short stories the way Americans pour wine: unintentionally generous.

With several unpublished manuscripts loitering on my hard drive like unemployed theatre majors, I figured it was time to release one into the wild. So I did. I published the novelette to Kindle, and just today, the paperback proof landed in my postbox like a smug little trophy.

And then, because I’m an unrepentant completionist (or a masochist—jury’s out), I thought: why not release the novella too? I’ve been told novellas and novelettes are unpopular due to “perceived value.” Apparently, people would rather buy a pound of gristle than 200 grams of sirloin. And yet, in the same breath, they claim no one has time for long books anymore. Perhaps these are different tribes of illiterates. I suppose we’ll find out.

Let’s talk logistics. Writing a book is only the beginning—and frankly, it’s the easy part. Fingers to keyboard, ideas to page. Done. I use Word, like most tragically conventional authors. Planning? Minimal. These were short stories, remember? That was the plan.

Next comes layout. Enter Adobe InDesign—because once you’ve seen what Word does to complex layouts, you never go back. Export to PDF, pray to the typographic gods, and move on.

Then there’s the cover. I lean on Illustrator and Photoshop. Photoshop is familiar, like a worn-in shoe; Illustrator is the smug cousin who turns up late but saves the day with scalable vectors. This time, I used Illustrator for the cover—lesson learnt from past pixelation traumas. Hardback to paperback conversion? A breeze when your artwork isn’t made of crayon scribbles and hope.

Covers, in case you’ve never assembled one, are ridiculous. Front. Back. Spine. Optional dust jacket if you’re feeling fancy (I wasn’t). You need titles, subtitles, your name in a legible font, and let’s not forget the barcode, which you will place correctly on the first attempt exactly never.

Unlike my first novel, where I enlisted someone with a proper design eye to handle the cover text, this time I went full minimalist. Think Scandinavian furniture catalogue meets existential despair. Classy.

Once the cover and interior are done, it’s time to wrestle with the publishing platforms. Everything is automated these days—provided you follow their arcane formatting commandments, avoid forbidden fonts, and offer up your soul. Submitting each book takes about an hour, not including the time lost choosing a price that balances “undervalued labour” and “won’t scare away cheapskates.”

Want a Kindle version? That’s another workflow entirely, full of tortured formatting, broken line breaks, and wondering why your chapter headings are now in Wingdings. Audiobooks? That’s a whole other circus, with its own animals and ringmasters. Honestly, it’s no wonder authors hire publishers. Or develop drinking problems.

But I’m stubborn. Which brings us full circle.

I’ve now got two books heading for daylight, a few more waiting in the wings, and one bloody non-fiction beast that won’t see release until I finish the damn index. No pseudonym this time. No hiding. Just me, owning my sins and hoping the final product lands somewhere between “insightful” and “mercifully short.”

So yes, life may well be a journey. But indexing is the bit where the satnav breaks, the road floods, and the boot falls off the car. Give me the destination any day. The journey can fuck right off.

Sustenance: A Book About Aliens, Language, and Everything You’re Getting Wrong

Violet aliens on a farm

So, I wrote a book and published it under Ridley Park, the pseudonym I use for fiction.

It has aliens. But don’t get excited—they’re not here to save us, probe us, or blow up the White House. They’re not even here for us.

Which is, frankly, the point.


The book’s called Sustenance, and while it’s technically speculative fiction, it’s more about us than them. Or rather, it’s about how we can’t stop making everything about us—even when it shouldn’t be. Especially when it shouldn’t be.

Let’s talk themes. And yes, we’re using that word like academics do: as a smokescreen for saying uncomfortable things abstractly.

Language: The Original Scam

Language is the ultimate colonial tool. We call it communication, but it’s mostly projection. You speak. You hope. You assume. You superimpose meaning on other people like a cling film of your own ego.

Sustenance leans into this—not by showing a breakdown of communication, but by showing what happens when communication was never mutual in the first place. When the very idea of “meaning” has no purchase. It’s not about mishearing—it’s about misbeing.

Culture: A Meme You Were Born Into

Culture is the software you didn’t choose to install, and probably can’t uninstall. Most people treat it like a universal law—until they meet someone running a different OS. Cue confusion, arrogance, or violence.

The book explores what happens when cultural norms aren’t shared, and worse, aren’t even legible. Imagine trying to enforce property rights on beings who don’t understand “ownership.” It’s like trying to baptise a toaster.

Sex/Gender: You Keep Using Those Words…

One of the quiet joys of writing non-human characters is discarding human assumptions about sex and gender—and watching readers squirm.

What if sex wasn’t about power, pleasure, or identity? What if it was just a biological procedure, like cell division or pruning roses? Would you still be interested? Would you still moralise about it?

We love to believe our sex/gender constructs are inevitable. They’re not. They’re habits—often bad ones.

Consent: Your Framework Is Showing

Consent, as we use it, assumes mutual understanding, shared stakes, and equivalent agency. Remove any one of those and what’s left?

Sustenance doesn’t try to solve this—it just shows what happens when those assumptions fall apart. Spoiler: it’s not pretty, but it is honest.

Projection: The Mirror That Lies

Humans are deeply committed to anthropocentrism. If it walks like us, or flinches like us, it must be us. This is why we get so disoriented when faced with the truly alien: it won’t dance to our tune, and we’re left staring at ourselves in the funhouse mirror.

This isn’t a book about aliens.

It’s a book about the ways we refuse to see what’s not us.

Memory: The Autobiography of Your Justifications

Memory is not a record. It’s a defence attorney with a narrative licence. We rewrite the past to make ourselves look consistent, or innocent, or right.

In Sustenance, memory acts less as a tether to truth and more as a sculpting tool—a way to carve guilt into something manageable. Something you can live with. Until you can’t.

In Summary: It’s Not About Them. It’s About You.

If that sounds bleak, good. It’s meant to.

But it’s also a warning: don’t get too comfortable in your own categories. They’re only universal until you meet someone who doesn’t share them.

Like I said, it’s not really about the aliens.

It’s about us.


If you enjoy fiction that’s more unsettling than escapist, more question than answer, you might be interested in Sustenance. It’s live on Kindle now for the cost of a regrettable coffee:

📘 Sustenance on Amazon US
Also available in the UK, DE, FR, ES, IT, NL, JP, BR, CA, MX, AU, and IN—because alienation is a universal language.

On Ishiguro, Cioran, and Whatever I Think I’m Doing

Sora-generated image of Emil Cioran and Kazuo Ishiguro reading a generic book together

Having just finished Never Let Me Go by Kazuo Ishiguro, I’ve now cracked open my first taste of Cioran—History and Utopia. You might reasonably ask why. Why these two? And what, if anything, do they have in common? Better yet—what do the three of us have in common?

Audio: NotebookLM podcast on this topic.

Recently, I finished writing a novella titled Propensity (currently gathering metaphorical dust on the release runway). Out of curiosity—or narcissism—I fed it to AI and asked whose style it resembled. Among the usual suspects were two names I hadn’t yet read: Ishiguro and Cioran. I’d read the others and understood the links. These two, though, were unknown quantities. So I gave them a go.

Ishiguro is perhaps best known for The Remains of the Day, which, like Never Let Me Go, got the Hollywood treatment. I chose the latter, arbitrarily. I even asked ChatGPT to compare both books with their cinematic counterparts. The AI was less than charitable, describing Hollywood’s adaptations as bastardised and bowdlerised—flattened into tidy narratives for American palates too dim to digest ambiguity. On this, we agree.

What struck me about Never Let Me Go was its richly textured mundanity. That’s apparently where AI saw the resemblance to Propensity. I’m not here to write a book report—partly because I detest spoilers, and partly because summaries miss the point. It took about seven chapters before anything ‘happened’, and then it kept happening. What had at first seemed like a neurotic, wandering narrative from the maddeningly passive Kathy H. suddenly hooked me. The reveals began to unfold. It’s a book that resists retelling. It demands firsthand experience. A vibe. A tone. A slow, aching dread.

Which brings me neatly to Cioran.

History and Utopia is a collection of essays penned in French (not his mother tongue, but you’d never guess it) while Cioran was holed up in postwar Paris. I opted for the English translation—unapologetically—and was instantly drawn in. His prose? Electric. His wit? Acidic. If Ishiguro was a comparison of style, then Cioran was one of spirit. Snark, pessimism, fatalistic shrugs toward civilisation—finally, someone speaking my language.

Unlike the cardboard cut-outs of Cold War polemics we get from most Western writers of the era, Cioran’s take is layered, uncomfortably self-aware, and written by someone who actually fled political chaos. There’s no naïve idealism here, no facile hero-villain binaries. Just a deeply weary intellect peering into the abyss and refusing to blink. It’s not just what he says, but the tone—the curled-lip sneer at utopian pretensions and historical self-delusions. If I earned even a drop of that comparison, I’ll take it.

Both Ishiguro and Cioran delivered what I didn’t know I needed: the reminder that some writers aren’t there to tell you a story. They’re there to infect you with an atmosphere. An idea. A quiet existential panic you can’t shake.

I’ve got what I came for from these two, though I suspect I’ll be returning, especially to Cioran. Philosophically, he’s my kind of bastard. I doubt this’ll be my last post on his work.

“Trust the Science,” They Said. “It’s Reproducible,” They Lied.

—On Epistemology, Pop Psychology, and the Cult of Empirical Pretence

Science, we’re told, is the beacon in the fog – a gleaming lighthouse of reason guiding us through the turbulent seas of superstition and ignorance. But peer a bit closer, and the lens is cracked, the bulb flickers, and the so-called lighthouse keeper is just some bloke on TikTok shouting about gut flora and intermittent fasting.

Audio: NotebookLM podcast on this topic.

We are creatures of pattern. We impose order. We mistake correlation for causation, narrative for truth, confidence for knowledge. What we have, in polite academic parlance, is an epistemology problem. What we call science is often less Newton and more Nostradamus—albeit wearing a lab coat and wielding a p-hacked dataset.

Let’s start with the low-hanging fruit—the rotting mango of modern inquiry: nutritional science, which is to actual science what alchemy is to chemistry, or vibes are to calculus. We study food the way 13th-century monks studied demons: through superstition, confirmation bias, and deeply committed guesswork. Eat fat, don’t eat fat. Eat eggs, don’t eat eggs. Eat only between the hours of 10:00 and 14:00 under a waxing moon while humming in Lydian mode. It’s a cargo cult with chia seeds.

But why stop there? Let’s put the whole scientific-industrial complex on the slab.

Psychology: The Empirical Astrological Society

Psychology likes to think it’s scientific. Peer-reviewed journals, statistical models, the odd brain scan tossed in for gravitas. But at heart, much of it is pop divination, sugar-dusted for mass consumption. The replication crisis didn’t merely reveal cracks – it bulldozed entire fields. The Stanford Prison Experiment? A theatrical farce. Power poses? Empty gestural theatre. Half of what you read in Psychology Today could be replaced with horoscopes and no one would notice.

Medical Science: Bloodletting, But With Better Branding

Now onto medicine, that other sacred cow. We tend to imagine it as precise, data-driven, evidence-based. In practice? It’s a Byzantine fusion of guesswork, insurance forms, and pharmaceutical lobbying. As Crémieux rightly implies, medicine’s predictive power is deeply compromised by overfitting, statistical fog, and a staggering dependence on non-replicable clinical studies, many funded by those who stand to profit from the result.

And don’t get me started on epidemiology, that modern priesthood that speaks in incantations of “relative risk” and “confidence intervals” while changing the commandments every fortnight. If nutrition is theology, epidemiology is exegesis.

The Reproducibility Farce

Let us not forget the gleaming ideal: reproducibility, that cornerstone of Enlightenment confidence. The trouble is, in field after field—from economics to cancer biology—reproducibility is more aspiration than reality. What we actually get is a cacophony of studies no one bothers to repeat, published to pad CVs, p-hacked into publishable shape, and then cited into canonical status. It’s knowledge by momentum. We don’t understand the world. We just retweet it.
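The mechanics of “p-hacked into publishable shape” are depressingly simple to demonstrate. Under the null hypothesis — no real effect at all — p-values are uniformly distributed, so if you run enough comparisons, roughly 5% will cross the sacred p < 0.05 line by chance alone. A minimal sketch (the numbers here are illustrative, not from any real study):

```python
import random

# Simulate "p-hacking": run many tests of entirely effect-free
# hypotheses and keep only the ones that look "significant".
# Under the null hypothesis, a p-value is just a uniform draw
# on [0, 1], so about 5% will fall below 0.05 by luck alone.

random.seed(42)  # fixed seed for a repeatable illustration

def null_p_value():
    """A p-value from a test where no true effect exists."""
    return random.random()

n_tests = 1000
significant = [i for i in range(n_tests) if null_p_value() < 0.05]

print(f"{len(significant)} of {n_tests} null effects look 'significant'")
```

Run a thousand nonexistent effects through the machine and you’ll harvest around fifty “discoveries” — publish those, file-drawer the rest, and the literature fills itself.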

What, Then, Is To Be Done?

Should we become mystics? Take up tarot and goat sacrifice? Not necessarily. But we should strip science of its papal robes. We should stop mistaking publication for truth, consensus for accuracy, and method for epistemic sanctity. The scientific method is not the problem. The pretence that it’s constantly being followed is.

Perhaps knowledge doesn’t have a half-life because of progress, but because it was never alive to begin with. We are not disproving truth; we are watching fictions expire.

Closing Jab

Next time someone says “trust the science,” ask them: which bit? The part that told us margarine was manna? The part that thought ulcers were psychosomatic? The part that still can’t explain consciousness, but is confident about your breakfast?

Science is a toolkit. But too often, it’s treated like scripture. And we? We’re just trying to lose weight while clinging to whatever gospel lets us eat more cheese.

Unwilling Steelman, Part V

A five-part descent into the illusion of autonomy, where biology writes the script, reason provides the excuse, and the self is merely the echo of its own conditioning. This is a follow-up to a recent post on the implausibility of free will.

You Cannot Originate Yourself

The causa sui argument, and the final collapse of moral responsibility

“If you cannot cause yourself, you cannot cause your choices.
And if you cannot cause your choices, you cannot own them.”

Audio: NotebookLM podcast on this topic.

Everything until now has pointed to erosion:

  • Your choices are state-dependent.
  • Your identity is cumulative, not authored.
  • Your evaluations are judged by compromised observers.

But here, finally, we strike at the bedrock.

It isn’t merely that you are manipulated.
It isn’t merely that you are misperceived.
It’s that you never could have been free, even in theory.

Because you did not make yourself.

The Causa Sui Problem

To be ultimately morally responsible, you must be the origin of who you are.

  • You must have chosen your disposition.
  • You must have selected your values.
  • You must have designed your will.

But you didn’t.

You emerged:

  • With a particular genetic cocktail.
  • Into a particular historical moment.
  • Through particular developmental experiences.
  • With particular neurological quirks and vulnerabilities.

And at no point did you step outside yourself to say:

“I would like to be this kind of agent, with this kind of character.”

You were thrown — as Heidegger might say — into a situation not of your choosing, with equipment you didn’t request, subject to pressures you couldn’t anticipate.

And everything you think of as “yours” — your courage, your laziness, your generosity, your rage — is the unfolding of that original unchosen situation.

No Escape via Reflexivity

Some will protest:

“But I can reflect! I can change myself!”

But this, too, is a mirage.

Because:

  • The desire to reflect is conditioned.
  • The capacity to reflect is conditioned.
  • The courage to act on reflection is conditioned.

You didn’t author your ability to self-correct.
You simply inherited it — like a river inheriting a particular gradient.

Even your rebellion is written in your blueprint.

Freedom by Degrees Is Not Freedom

The compatibilist fallback — that freedom is just “acting according to oneself” — collapses under causa sui.

Because the self that acts was never authored. It was configured by prior causes.

If you cannot be the cause of yourself,
then you cannot be the cause of your actions in any ultimate sense.

Thus:

  • No ultimate credit for your virtues.
  • No ultimate blame for your vices.
  • Only causal flow, chemical procession, narrative stitching after the fact.

The criminal and the saint are both unlucky configurations of biology and circumstance.

TL;DR: No Self, No Sovereignty

  • To be responsible, you must be causa sui — the cause of yourself.
  • You are not.
  • Therefore, you are not ultimately responsible for your actions.
  • Therefore, free will — as traditionally imagined — does not exist.

There is choice.
But there is no chooser behind the choice.
Only the momentum of prior conditions, impersonating agency.


Series Summary: Unwilling Steelmen

A five-part descent into the illusion of autonomy

What remains, if not free will?
Something perhaps stranger — and possibly, more humane:

A universe of actors who deserve understanding, but not blame.
Compassion, but not judgment.
Help, but not hagiography.