Perspective Is Everything

2–3 minutes

This clip of Rachel Barr slid into my feed today, fashionably late by a week, and I thought it deserved a little dissection. The video wouldn’t embed directly – Instagram always has to be precious – so I downloaded it and linked it here. Don’t worry, Rachel, I’m not stealing your clicks.

Video: Neuroscientist Dr Rachel Barr discusses Charlie Kirk and gun violence.
Source: https://www.instagram.com/p/DOd3LnjDUW8

Now, the United States. Or rather, the United States In Name Only – USINO. A nation perpetually rebranding itself as a “union” whilst its citizens claw at each other like alley cats in a bin fire. Yes, divisions abound – economic, racial, ideological, pick your poison – but some fissures cut to the bone. Today’s example: Charlie Kirk and the rabid congregation of defenders he’s managed to cultivate.

Audio: NotebookLM podcast on this topic.

The Competing Liturgies

To hear one camp tell it, Kirk is no hater at all. He’s a gentle, God-soaked soul, brimming with Christian love and trying – halo tilted just so – to shepherd stray sheep toward Our Lord and Saviour™. A real Sunday-school sweetheart.

But this is not, shockingly, the consensus. The other camp (my camp, if disclosure still matters in a post-truth age) sees him as a snarling opportunist, a huckster of hate packaged in the familiar varnish of patriotism and piety. In short: a hate-merchant with a mailing list.

Spectacle as Weapon

I’ve watched Kirk at work. He loves to stage “debates” – quotation marks mandatory – where a token dissenter is dropped into an amphitheatre of loyalists. It’s the rhetorical equivalent of feeding Christians to lions, except the lions roar on cue and the crowd thinks the blood is wine. He laces misogyny, racism, and reheated premodern dogma into cheap soundbites, and the audience laps it up as though they were attending a revival. For the believers, it’s a festival. For everyone else, it’s a hostile takeover of public discourse.

Deaf Ears, Loud Mouths

Here’s the rub: Cohort A doesn’t perceive his words as hate because they already share the operating system. It’s not hate to them – it’s common sense. Cohort B, meanwhile, hears every syllable as the screech of a chalkboard dragged across the public square. Same words, two worlds.

And when I dare to suggest that if you can’t hear the hatred, you might just be complicit in it, the pushback is instantaneous: Stop imposing your worldview! Which is rich, since their worldview is already blaring through megaphones at tax-exempt rallies. If my worldview is one that insists on less hate, less dehumanisation, less sanctified bullying, then fine, I’ll take the charge.

The deeper accusation, though, is almost comic: that I’m hallucinating hate in a man of pure, lamb-like love. That’s the gaslighting twist of the knife – turning critique into pathology. As if the problem isn’t the bile spilling from the stage but my faulty perception of it.

Perspective is everything, yes – but some perspectives reek of wilful blindness.

The Morality We Can’t Stop Wanting

1–2 minutes

Humans can’t seem to stop clawing after morality. The primates among us chuck cucumbers when their neighbours get grapes, and the rest of us grumble about fairness on social media. The impulse is practically universal, an evolutionary quirk that kept us from throttling each other long enough to raise children and build cities.

Image: A seemingly perturbed capuchin monkey.

But universality is not objectivity. Just because every ape howls about fairness doesn’t mean “Justice” floats somewhere in Platonic space, waiting to be downloaded. It only means we’re the kind of animal that survives by narrating rules and enforcing them with shunning, shame, or, when necessary, cudgels.

Audio: NotebookLM podcast on this topic.

This is where Alasdair MacIntyre trips over his own robes. After Virtue skewers Enlightenment rationalists who tried to prop morality on reason, it then dismisses Nietzsche for being “irrational.” MacIntyre’s fix? Resurrect Aristotle’s teleology. If reason can’t save morality, maybe an ancient oak tree can. But this is wish-thinking with a Greek accent. He’s still arguing by reason that reason can’t do the job, then sneaking back in through Aristotle’s back door with a “firmer ground.” Firmer only because he says so.

Nietzsche, at least, had the decency to call the bluff: no telos, no floor, no cosmic anchor. Just will, style, and the abyss. Uncomfortable? Absolutely. Honest? Yes.

Deleuze went further. He pointed out that morality, like culture, doesn’t look like a tree at all. It’s a rhizome: tangled, proliferating, hybridising, never grounded in a single root. The fragments MacIntyre despairs over aren’t evidence of collapse. They’re evidence of how moral life actually grows—messy, contingent, interconnected. The only reason it looks chaotic is that we keep demanding a trunk where only tubers exist.

So here we are, apes with a craving for rules, building cities and philosophies on scaffolds of habit, language, and mutual illusion. We are supported as surely as the Earth is supported – by nothing. And yet, we go on living.

The need for morality is real. The yearning for telos is real. The floor is not.

‘Luigi Mangione Is Not a Terrorist’

3–4 minutes

This isn’t a political post. It’s about language, the insufficiency of it, and the games we play when pretending words carry more weight than they do.

Luigi Mangione is the man accused of killing UnitedHealthcare CEO Brian Thompson. After his arrest, prosecutors stacked the usual charges – murder, firearms, assorted legal bric-a-brac – then added the cherry on top: domestic terrorism.

Audio: NotebookLM podcast on this topic.

Recently, a pretrial judge cut the cherry loose.

NEW YORK, Sept 16 (Reuters) – A New York state judge dismissed on Tuesday two terrorism-related counts against Luigi Mangione over the December 2024 killing of health insurance executive Brian Thompson, though the 27-year-old remains charged with second-degree murder and eight other criminal counts in the case.

“There was no evidence presented of a desire to terrorize the public, inspire widespread fear, engage in a broader campaign of violence, or to conspire with organized terrorist groups,” Judge Gregory Carro found in a 12-page written decision (pdf). “Here, the crime – the heinous, but targeted and discrete killing of one person – is very different from the examples of terrorism set forth in the statute.” (source)

The prosecution insisted the label fit. The judge disagreed. Cue outrage, applause, and confusion. The crime is still horrific, but suddenly the word “terrorist” is off-limits.

The Elasticity of Terror

How can two educated parties look at the same set of facts and come to opposite conclusions? Because “terrorism” isn’t a Platonic form. It’s an elastic linguistic category. The prosecutor drags it out because “terrorist” is a magical word in American law: it inflates an already ugly act into a civilisation-level threat, unlocks harsher penalties, and lets politicians posture about national security.

The judge, however, reminded everyone that a bullet in Manhattan does not equal al-Qaeda. Murder, yes. Terrorism, no. Not because murder is less grotesque, but because the statutory definition won’t stretch that far without breaking.

Language Games, Legal Hierarchies

This is where it gets trickier. The judge isn’t merely “pulling rank”—though rank does matter. American jurisprudence is hierarchical: trial judges hand down rulings, appellate judges review them, and nine robed partisans in Washington can one day rewrite the whole script. On paper, these tiers are meant to iron out ambiguity. In practice, they multiply it.

Even co-equal judges, reading the same facts, can diverge wildly. Split decisions at the Supreme Court prove the point: five minds say “constitutional,” four say “unconstitutional,” and the one-vote margin becomes binding law for 330 million people. That’s not the discovery of truth; it’s the triumph of one language game over another, enforced by hierarchy.

The Insufficiency Laid Bare

So we return to Mangione. He has been charged with murder – the second-degree flavour; that much is uncontested. But is he a “terrorist”? The prosecution said yes, the judge said no, and another judge, higher up or sitting elsewhere, might well say yes again. Each claim is defensible. Each is motivated by language, by politics, and by the institutional pressures of the bench.

And that’s the point. Language doesn’t tether itself to reality; it choreographs our endless arguments about reality. The law tries to tame it with hierarchies and definitions, but the seams always show. Mangione is a murderer. Whether he is a terrorist depends less on his actions than on which interpretive dance is winning in the courtroom that day.

Within One Sigma of Civilisation

Freud once quipped that people are “normal” only on average. To the degree that they deviate from the mean, they are neurotic, psychotic, or otherwise abnormal. Whatever else one thinks of Freud, the metaphor holds for Modernity.

Image: Picture and quote by Sigmund Freud: Every normal person, in fact, is only normal on the average. His ego approximates to that of the psychotic in some part or other and to a greater or lesser extent. —Analysis Terminable And Interminable (1937), Chapter V

We are “Modern” only on average, and only for the first standard deviation. Within one sigma, you can wave a flag and declare: rational, secular, Enlightened. But step further into the tails and the façade dissolves. The “normal” modern turns out to attend megachurches, consult horoscopes, share conspiracy memes, or cling to metaphysical relics that Enlightenment reason was supposed to have torched centuries ago.
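For the statistically unbaptised, here is a quick sketch of what “within one sigma” actually buys you on a normal curve – roughly 68%, leaving nearly a third of any population out in the tails. (A minimal illustration using the standard normal distribution; the “Modernity” framing is, of course, my metaphor, not measured data.)

  from math import erf, sqrt

  def share_within(k_sigma):
      # Share of a normal population within k standard deviations of the mean.
      return erf(k_sigma / sqrt(2))

  for k in (1, 2, 3):
      inside = share_within(k)
      print(f"within {k} sigma: {inside:.1%}; in the tails: {1 - inside:.1%}")

  # within 1 sigma: 68.3%; in the tails: 31.7%
  # Nearly a third of any 'normal' population lives in the tails,
  # which is where the Modern facade dissolves.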

The problem isn’t that these people aren’t Modern. The problem is that nobody is Modern, not in the sense the story requires. The mean is an over-fitted abstraction. “Modernity” works like Freud’s “normal”: a statistical average that erases the deviations, then insists that the erased bits are pathology rather than reality.

But the tails are where most of human life actually happens. The “average Modern” is as mythical as the “reasonable person.” What we call Modernity is just a bell curve costume draped over the same mix of superstition, desire, and contingency that has always driven human behaviour.

Ground News

I like to stay updated on the news of the day, so I just registered for a Ground News account. Ground News is a news aggregator: it gathers stories and categorises them by political leaning and by each publication’s record on factuality. The claim is that this reveals blind spots and helps people avoid getting trapped in perspective bubbles. It also shows when a story is picked up predominantly by one side or the other. I’ve been seeing ads for the service on many channels for a while now, so it’s likely you have, too. This is not an ad.

This article attracted my attention, not because of the content but because of the headline. As a statistician, I find it bothersome; as a communicator, I find the damage trebled. I receive no compensation if you click the link; I include it only as a reference for those unfamiliar with the service.

Image: Ground News Screengrab

Notice the choice of wording: ‘1-in-6 parents reject vaccine recommendations’.

Two things shine through.

  1. The use of ‘reject’ – a negative verb.
  2. The use of ‘1-in-6’ – the figure accompanying the negative verb – 17%.

Statistically, this means that 5-in-6 parents follow vaccine recommendations – 83%.
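For the record, the arithmetic of the reframe is trivial – a one-line complement. (A minimal sketch; the 1-in-6 figure comes from the headline, the rest follows from it.)

  reject = 1 / 6        # the headline's framing: '1-in-6 parents reject'
  follow = 1 - reject   # the complement the headline declines to mention

  print(f"reject: {reject:.0%}")  # ~17%
  print(f"follow: {follow:.0%}")  # ~83% -- same data, opposite emotional charge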

This is the summary view. Scan down and notice that the Left-leaning Raw Story references a ‘staggering number’ of parents who reject vaccines. Notice also how the language softens – the claim is revised to ‘delay or reject’. Without clicking into the story, what is the breakdown between delaying and rejecting? I’m not sure, but this is what sensationalism engineered to attract clicks looks like.

Image: Ground News Summary View

Interestingly, the outlets tend to use different language and give the story different degrees of attention. How much of this is political bias and how much is benign editorial licence is unclear.

On balance, the articles – Left, Right, and Centre – unanimously note that vaccine uptake is down, that the incidence of measles is up, and that RFK’s policies appear to be exacerbating the health-management issue. The worst offenders are ‘very’ religious, white, politically conservative people – a cohort that aligns with RFK and the current administration.

The poll also found that parents who have postponed or avoided vaccinating their children tend to be white, conservative, and highly religious, and some choose to homeschool.

For this story, one of the sources was Greek and another French. Some claim to be behind a paywall, but this didn’t pose a problem for me. Perhaps they offer some complementary views.

Separately, on the right-hand side of the top image, there is a bias indicator: it shows that 57% of the reports came from Left-leaning outlets and 36% from the Centre, leaving the remaining 7% to Right-leaning sources.

Image: Updated Bias Distribution

When I returned to write this post, I noticed that the reporting had changed as more Centre-focused reports picked up the story.

If I were to guess, this story shines a negative light on the Right, so Right-leaning outlets may simply be waiting for the news cycle to pass.

In the (Right-facing) Greek story I read, the reporting wasn’t materially different from the other coverage, which is to say it didn’t try to render the story through rose-coloured glasses.

Nature and Its Paperwork

We humans pride ourselves on being civilised. Unlike animals, we don’t let biology call the shots. A chimp reaches puberty and reproduces; a human reaches puberty and is told, not yet – society has rules. Biologically mature isn’t socially mature, and we pat ourselves on the back for having spotted the difference.

But watch how quickly that distinction vanishes when it threatens the in-group narrative. Bring up gender, and suddenly there’s no such thing as a social construct. Forget the puberty-vs-adulthood distinction we were just defending – now biology is destiny, immutable and absolute. Cross-gender clothing? “Against nature.” Transition? “You can’t be born into the wrong body.” Our selective vision flips depending on whose ox is being gored.

The same trick appears in how we talk about maturity. You can’t vote until 18. You’re not old enough to drink until 21. You’re not old enough to stop working until 67. These numbers aren’t natural; they’re paperwork. They’re flags planted in the soil of human life, and without the right flag, you don’t count.

The very people who insist on distinguishing biological maturity from social maturity when it comes to puberty suddenly forget the distinction when it comes to gender. They know perfectly well that “maturity” is a construct – after all, they’ve built entire legal systems around arbitrary thresholds – but they enforce the amnesia whenever it suits them. Nietzsche would say it plainly: the powerful don’t need to follow the rules, they only need to make sure you do.

So the next time someone appeals to “nature,” ask: which one? The nature that declares you old enough to marry at puberty? The nature that withholds voting, drinking, or retirement rights until a bureaucrat’s calendar says so? Or the nature that quietly mutates whenever the in-group needs to draw a new line around civilisation?

The truth is, “nature” and “maturity” are less about describing the world than about policing it. They’re flags, shibboleths, passwords. We keep calling them natural, but the only thing natural about them is how often they’re used to enforce someone else’s story.

A Critique of Reason (Not to Be Confused with Kant’s)

2–3 minutes

Kant, bless him, thought he was staging the trial of Reason itself, putting the judge in the dock and asking whether the court had jurisdiction. It was a noble spectacle, high theatre of self-scrutiny. But the trick was always rigged. The presiding judge, the prosecution, the jury, the accused, all wore the same powdered wig. Unsurprisingly, Reason acquitted itself.

The Enlightenment’s central syllogism was never more than a parlour trick:

  • P1: The best path is Reason.
  • P2: I practise Reason.
  • C: Therefore, Reason is best.

It’s the self-licking ice-cream cone of intellectual history. And if you dare to object, the trap springs shut: what, you hate Reason? Then you must be irrational. Inquisitors once demanded heretics prove they weren’t in league with Satan; the modern equivalent is being told you’re “anti-science.” The categories defend themselves by anathematising doubt.

The problem is twofold:

First, Reason never guaranteed agreement. Two thinkers can pore over the same “facts” and emerge with opposite verdicts, each sincerely convinced that Reason has anointed their side. In a power-laden society, it is always the stronger voice that gets to declare its reasoning the reasoning. As Dan Hind acidly observed, Reason is often nothing more than a marketing label the powerful slap on their interests.

Second, and this is the darker point, Reason itself is metaphysical, a ghost in a powdered wig. To call something “rational” is already to invoke an invisible authority, as if Truth had a clerical seal. Alasdair MacIntyre was right: strip away the old rituals and you’re left with fragments, not foundations.

Other witnesses have tried to say as much. Horkheimer and Adorno reminded us that Enlightenment rationality curdles into myth the moment it tries to dominate the world. Nietzsche laughed until his throat bled at the pretence of universal reason, then promptly built his own metaphysics of will. Bruno Latour, in We Have Never Been Modern, dared to expose Science as what it actually is – a messy network of institutions, instruments, and politics masquerading as purity. The backlash was so swift and sanctimonious that he later called it his “worst” book, a public recantation that reads more like forced penance than revelation. Even those who glimpsed the scaffolding had to return to the pews.

So when we talk about “Reason” as the bedrock of Modernity, let’s admit the joke. The bedrock was always mist. The house we built upon it is held up by ritual, inertia, and vested interest, not granite clarity. Enlightenment sold us the fantasy of a universal judge, when what we got was a self-justifying oracle. Reason is not the judge in the courtroom. Reason is the courtroom itself, and the courtroom is a carnival tent – all mirrors, no floor.

Modernity: The Phase That Never Was

6–8 minutes

We’re told we live in the Enlightenment, that Reason™ sits on the throne and superstition has been banished to the attic. Yet when I disguised a little survey as “metamodern,” almost none came out as fully Enlightened. Three managed to shed every trace of the premodern ghost, one Dutch wanderer bypassed Modernity entirely, and not a single soul emerged free of postmodern suspicion. So much for humanity’s great rational awakening. Perhaps Modernity wasn’t a phase we passed through at all, but a mirage we still genuflect before, a lifestyle brand draped over a naked emperor.

Audio: NotebookLM podcast on this topic

The Enlightenment as Marketing Campaign

The Enlightenment is sold to us as civilisation’s great coming-of-age: the dawn when the fog of superstition lifted and Reason took the throne. Kant framed it as “man’s emergence from his self-incurred immaturity” – an Enlightenment bumper sticker that academics still like to polish and reapply. But Kant wasn’t writing for peasants hauling mud or women without the vote; he was writing for his own coterie of powdered-wig mandarins, men convinced their own habits of rational debate were humanity’s new universal destiny.

Modernity, in this story, isn’t a historical stage we all inhabited. It’s an advertising campaign: Reason™ as lifestyle brand, equality as tagline, “progress” as the logo on the tote bag. The textbooks bill it as a historical epoch, a kind of secular Pentecost in which the lights came on and we all finally started thinking for ourselves. In practice, it was more of a boutique fantasy, a handful of gentlemen mistaking their own rarefied intellectual posture for humanity’s destiny.

The Archetype That Nobody Lives In

At the core of the Enlightenment lies the archetype of Man™: rational, autonomous, unencumbered by superstition, guided by evidence, weighing pros and cons with the detachment of a celestial accountant. Economics repackaged him as homo economicus, forever optimising his utility function as if he were a spreadsheet in breeches.

But like all archetypes, this figure is a mirage. Our survey data, even when baited as a “metamodern survey”, never produced a “pure” Enlightenment subject.

  • 3 scored 0% Premodern (managing, perhaps, to kick the gods and ghosts to the kerb).
  • 1 scored 0% Modern (the Dutch outlier: 17% Premodern, 0% Modern, 83% Post, skipping the Enlightenment altogether, apparently by bike).
  • 0 scored 0% Postmodern. Every single participant carried at least some residue of suspicion, irony, or relativism.

The averages themselves were telling: roughly 18% Premodern, 45% Modern, 37% Postmodern. That’s not an age of Reason. That’s a muddle, a cocktail of priestly deference, rationalist daydreams, and ironic doubt.
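To make the scoring logic concrete, here is a minimal sketch of how the tallies above fall out. (The respondent mixes below are hypothetical stand-ins chosen to reproduce the reported aggregates, not the raw survey data; each respondent carries a Premodern/Modern/Postmodern mix summing to 100%, and a “pure” Enlightenment subject would score 0/100/0.)

  # Hypothetical (premodern, modern, postmodern) mixes, each summing to 100.
  # Stand-ins for the survey data described above, not the raw responses.
  respondents = [
      (0, 55, 45),
      (17, 0, 83),   # the Dutch outlier: skipped Modernity entirely
      (0, 60, 40),
      (45, 40, 15),
      (46, 40, 14),
      (0, 75, 25),
  ]

  pure = sum(r == (0, 100, 0) for r in respondents)
  no_premodern = sum(r[0] == 0 for r in respondents)
  no_modern = sum(r[1] == 0 for r in respondents)
  no_postmodern = sum(r[2] == 0 for r in respondents)

  print(f"pure Enlightenment subjects: {pure}")   # 0
  print(f"0% Premodern: {no_premodern}")          # 3
  print(f"0% Modern: {no_modern}")                # 1
  print(f"0% Postmodern: {no_postmodern}")        # 0

  # Averages across respondents: the muddle reported above (18/45/37).
  averages = [sum(col) / len(respondents) for col in zip(*respondents)]
  print("average mix (pre, mod, post):", [f"{a:.0f}%" for a in averages])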

Even the Greats Needed Their Crutches

If the masses never lived as Enlightenment subjects, what about the luminaries? Did they achieve the ideal? Hardly.

  • Descartes, desperate to secure the cogito, called in God as guarantor, dragging medieval metaphysics back on stage.
  • Kant built a cathedral of reason only to leave its foundations propped up by noumena: an unseeable, unknowable beyond.
  • Nietzsche, supposed undertaker of gods, smuggled in his own metaphysics of will to power and eternal recurrence.
  • William James, surveying the wreckage, declared that “truth” is simply “what works”, a sort of intellectual aspirin for the Enlightenment headache.

And economists, in a fit of professional humiliation, pared the rational subject down to a corpse on life support. Homo economicus became a creature who — at the very least, surely — wouldn’t choose to make himself worse off. But behavioural economics proved even that meagre hope to be a fantasy. People burn their wages on scratch tickets, sign up for exploitative loans, and vote themselves into oblivion because a meme told them to.

If even the “best specimens” never fully embodied the rational archetype, expecting Joe Everyman, who statistically struggles to parse a sixth-grade text and hasn’t cracked a book since puberty, to suddenly blossom into a mini-Kant is wishful thinking of the highest order.

The Dual Inertia

The real story isn’t progress through epochs; it’s the simultaneous drag of two kinds of inertia:

  • Premodern inertia: we still cling to sacred myths, national totems, and moral certainties.
  • Modern inertia: we still pretend the rational subject exists, because democracy, capitalism, and bureaucracy require him to.

The result isn’t a new epoch. It’s a cultural chimaera: half-superstitious, half-rationalist, shot through with irony. A mess, not a phase.

Arrow’s Mathematical Guillotine

Even if the Enlightenment dream of a rational demos were real, Kenneth Arrow proved it was doomed. His Impossibility Theorem shows that no method of aggregating individuals’ rankings over three or more options can produce a coherent “general will” while respecting a few basic fairness conditions. In other words, even a parliament of perfect Kants would deadlock when voting on dinner. The rational utopia is mathematically impossible.
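The flavour of the result is easy to taste without the full theorem. The classic Condorcet cycle shows three perfectly rational diners whose individually transitive rankings produce a collectively circular “will”. (A minimal sketch, not Arrow’s proof itself; the menu is my own invention.)

  # Three transitive (individually 'rational') rankings, best to worst.
  ballots = [
      ("pasta", "sushi", "tacos"),   # diner 1: pasta > sushi > tacos
      ("sushi", "tacos", "pasta"),   # diner 2: sushi > tacos > pasta
      ("tacos", "pasta", "sushi"),   # diner 3: tacos > pasta > sushi
  ]

  def majority_prefers(a, b):
      # True if a majority of ballots rank option a above option b.
      wins = sum(ballot.index(a) < ballot.index(b) for ballot in ballots)
      return wins > len(ballots) / 2

  for a, b in [("pasta", "sushi"), ("sushi", "tacos"), ("tacos", "pasta")]:
      print(f"majority prefers {a} over {b}: {majority_prefers(a, b)}")

  # All three print True: pasta beats sushi, sushi beats tacos, and
  # tacos beats pasta. Each diner is transitive; the group is a cycle.
  # There is no coherent 'general will' for majority rule to discover --
  # the parliament of Kants deadlocks over dinner.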

So when we are told that democracy channels Reason, we should hear it as a polite modern incantation, no sturdier than a priest blessing crops.

Equality and the Emperor’s Wardrobe

The refrain comes like a hymn: “All men are created equal.” But the history is less inspiring. “Men” once meant property-owning Europeans; later it was generously expanded to mean all adult citizens who’d managed to stay alive until eighteen. Pass that biological milestone, and voilà — you are now certified Rational, qualified to determine the fate of nations.

And when you dare to question this threadbare arrangement, the chorus rises: “If you don’t like democracy, capitalism, or private property, just leave.” As if you could step outside the world like a theatre where the play displeases you. Heidegger’s Geworfenheit makes the joke bitter: we are thrown into this world without choice, and then instructed to exit if we find the wallpaper distasteful. Leave? To where, precisely? The void? Mars?

The Pre-Modern lord said: Obey, or be exiled. The Modern democrat says: Vote, or leave. And the Post-Enlightenment sceptic mutters: Leave? To where, exactly? Gravity? History? The species? There is no “outside” to exit into. The system is not a hotel; it’s the weather.

Here the ghost of Baudrillard hovers in the wings, pointing out that we are no longer defending Reason, but the simulacrum of Reason. The Emperor’s New Clothes parable once mocked cowardice: everyone saw the nudity but stayed silent. Our situation is worse. We don’t even see that the Emperor is naked. We genuinely believe in the fineries, the Democracy™, the Rational Man™, the sacred textile of Progress. And those who point out the obvious are ridiculed: How dare you mock such fineries, you cad!

Conclusion: The Comfort of a Ghost

So here we are, defending the ghost of a phase we never truly lived. We cling to Modernity as if it were a sturdy foundation, when in truth it was always an archetype – a phantom rational subject, a Platonic ideal projected onto a species of apes with smartphones. We mistook it for bedrock, built our institutions upon it, and now expend colossal energy propping up the papier-mâché ruins. The unfit defend it out of faith in their own “voice,” the elites defend it to preserve their privilege, and the rest of us muddle along pragmatically, dosing ourselves with Jamesian aspirin and pretending it’s progress.

Metamodernism, with its marketed oscillation between sincerity and irony, is less a “new stage” than a glossy rebranding of the same old admixture: a bit of myth, a bit of reason, a dash of scepticism. And pragmatism – James’s weary “truth is what works” – is the hangover cure that keeps us muddling through.

Modernity promised emancipation from immaturity. What we got was a new set of chains: reason as dogma, democracy as ritual, capitalism as destiny. And when we protest, the system replies with its favourite Enlightenment lullaby: If you don’t like it, just leave.

But you can’t leave. You were thrown here. What we call “Enlightenment” is not a stage in history but a zombie-simulation of an ideal that never drew breath. And yet, like villagers in Andersen’s tale, we not only guard the Emperor’s empty wardrobe – we see the garments as real. The Enlightenment subject is not naked. He is spectral, and we are the ones haunting him.

Go Back Where You Came From (And Other Spells)

2–3 minutes

There’s a certain kind of rhetorical grenade people like to lob when their sense of ownership feels wobbly. You’ve heard it. You’ve maybe had it lobbed in your general direction.

It’s not an argument, of course. It’s a spell. A warding charm. The linguistic equivalent of hissing at a stray cat in the garden. The phrase carries the weight of assumed belonging: we are naturally here, you are obviously not. The incantation is meant to banish you with a puff of words.

The other day, I watched a black activist absorb this spell and toss it back with deadpan precision.

Cue awkward silence. The symmetry was perfect. Suddenly, the verbal hex had reversed polarity, exposing the hypocrisy built into the original curse. The power of the spell depends entirely on who gets to cast it. When it comes from the wrong mouth, the whole structure of “common sense” collapses into farce.

Another example: a Greek immigrant in my orbit, accent still clinging to every consonant, grumbling about a black family that had moved into his neighbourhood. Why didn’t they “go back to Africa”? This from a man who had never gone “back” anywhere himself, having happily abandoned his homeland for better wages and better weather. Colonialism is apparently a one-way ticket: Europeans roam the globe and call it destiny, but when others move into their postcode, it’s treated like an invasion.

I confess, I once flirted with the same nonsense. Years ago in Japan, in my more callow phase, I asked – half in jest, wholly in arrogance – why these people didn’t have the decency to speak my language. The difference, such as it is, lay in my awareness that I was being ridiculous. My Greek neighbour and the activist’s heckler had no such irony. They were dead serious.

That’s the grotesque comedy of racism: its logic isn’t logic at all. It’s ritual. A mantra recited to reassure oneself of belonging by denying it to others. It dresses itself in the robes of rationality – “go back where you came from” sounds like geography, after all – but it’s closer to medieval exorcism than reasoned debate.

And when the cursed simply whispers the incantation back? The spell collapses. The supposed “truth” reveals itself for what it always was: a desperate attempt to maintain the fiction that one kind of stranger is native and another will always be alien.

Every empire tells its children they were born at home, and tells everyone else they were born trespassing.

Butler versus Butler (on a bed of Beauvoir)

2–3 minutes

I’ve been reading Octavia Butler’s Dawn and find myself restless. The book is often lauded as a classic of feminist science fiction, but I struggle with it. My problem isn’t with aliens, or even with science fiction tropes; it’s with the form itself, the Modernist project embedded in the genre, which insists on posing questions and then supplying answers, like a catechism for progress. Sci-Fi rarely leaves ambiguity alone; it instructs.

Find the companion piece on my Ridley Park blog.

Audio: NotebookLM podcast summarising this topic.

Beauvoir’s Ground

Simone de Beauvoir understood “woman” as the Other – defined in relation to men, consigned to roles of reproduction, care, and passivity. Her point was not that these roles were natural, but that they were imposed, and that liberation required stripping them away.

Octavia Butler’s Lilith

Lilith Iyapo, the protagonist of Dawn, should be radical. She is the first human awakened after Earth’s destruction, a Black woman given the impossible role of mediating between humans and aliens. Yet she is not allowed to resist her role so much as to embody it. She becomes the dutiful mother, the reluctant carer, the compliant negotiator. Butler’s narration frequently tells us what Lilith thinks and feels, as though to pre-empt the reader’s interpretation. She is less a character than an archetype: the “reasonable woman,” performing the script of liberal Western femininity circa the 1980s.

Judith Butler’s Lens

Judith Butler would have a field day with this. For her, gender is performative: not an essence but a repetition of norms. Agency, in her view, is never sovereign; it emerges, if at all, in the slippages of those repetitions. Read through this lens, Octavia Butler’s Lilith is not destabilising gender; she is repeating it almost too faithfully. The novel makes her into an allegory, a vessel for explaining and reassuring. She performs the role assigned and is praised for her compliance – which is precisely how power inscribes itself.

Why Sci-Fi Leaves Me Cold

This helps me understand why science fiction so often fails to resonate with me. The problem isn’t the speculative element; I like the idea of estrangement, of encountering the alien. The problem is the Modernist scaffolding that underwrites so much of the genre: the drive to solve problems, to instruct the reader, to present archetypes as universal stand-ins. I don’t identify with that project. I prefer literature that unsettles rather than reassures, that leaves questions open rather than connecting the dots.

So, Butler versus Butler on the bedrock of Beauvoir: one Butler scripting a woman into an archetype, another Butler reminding us that archetypes are scripts. And me, somewhere between them, realising that my discomfort with Dawn is not just with the book but with a genre that still carries the DNA of the very Modernism it sometimes claims to resist.