Yaron Brook, ever Ayn Rand’s ventriloquist, insists students are customers. Education, in his frame, is no different from a gym membership; you pay to be made “uncomfortable.” Professors as personal trainers, universities as masochism boutiques. It’s an absurd metaphor that fits all too well in our consumerist age: education rebranded as a service industry, discomfort sold at premium prices.
Video: What is killing universities?
Catherine Liu cuts in sharply: I am not a service worker. And she’s right. Education is not concierge service; it is meant to disturb, dislodge, and disorient. Liu distinguishes “Leftist” universal reason from “Liberal” mushy inclusivity – nostalgic for Enlightenment rationality, perhaps – but her refusal to collapse education into hospitality is a rare moment of clarity.
Eric Kaufmann diagnoses the “new left” as a cult of the sacred, where identity is fetishised and offence policed. Liu nods; Brook flirts with Marxism for a minute; suddenly everyone seems to agree the university has lost its bearings.
Brook is not wrong that conservatives self-select out of higher ed. But let’s be clear: not because academia is too “left,” but because they crave catechism, not critique. They want ideological madrassas, not laboratories of doubt. In this sense, Brook’s consumer model is apt: conservatives want a product that validates their priors. That is indoctrination, not education.
Meanwhile, the universities collude in their own corruption. They market “education™” as networking, branding, and employability. At the top tier, the Ivies, Oxbridge, Grandes Écoles, you might still buy proximity to power. But below that? Snake oil. At best, you get nosebleed seats in the auditorium of influence. At worst, an obstructed view behind a pillar. For most, the ticket is counterfeit: a credential that promises access and delivers only debt.
And yet, the true thing still exists. Real education, the kind Liu gestured toward, doesn’t need oak-panelled halls or hedge-fund endowments. It can happen online, in a book, in a seminar, even here with ChatGPT. It’s the deliberate encounter with discomfort, with error, with reason itself. But snake oil sells better than hard truths, and so universities keep hawking tickets they don’t own.
When I was a child, the United States Supreme Court was still spoken of in hushed, reverent tones, as though nine robed sages in Washington were the Platonic guardians of justice. Impartiality was the word on everyone’s lips, and we were meant to believe that “the law” floated above the grubby realm of politics, as pure and crystalline as the Ten Commandments descending from Sinai.
Audio: NotebookLM podcast on this topic (MP3).
Even then, I didn’t buy it. The whole thing reeked of theatre. And the past few decades have proved that scepticism correct: the Court has become a pantomime. In this robed reality show, nine unelected lawyers cosplaying as oracles interpret the world for us, often by a razor-thin vote that splits exactly along partisan lines. Impartial? Please. A coin toss would be less predictable.
This is why I perked up when I heard Iain McGilchrist, in his recent interview with Curt Jaimungal, wax lyrical about rationality versus reasonableness. Schizophrenia, he tells us, is like a left hemisphere gone berserk, parsing the world in a literalist frenzy without the right hemisphere’s sense of context. The schizophrenic hears a voice in an empty room and, lacking the capacity for metaphor, deduces that it must be the neighbours whispering through the electrical socket. Rational, in its way, but absurd.
Video: Iain McGilchrist and Curt Jaimungal
McGilchrist’s corrective is “reasonableness,” which he casts as the quality of a wise judge: not a slave to mechanistic logic, but able to balance intuition, context, and experience. The problem, of course, is that “reasonable” is one of those delightful weasel words I keep writing about. It claims to be neutral – a universal standard, above the fray – but in practice, it’s just a ventriloquism act. “Reasonable” always turns out to mean what I, personally, consider obvious.
Enter Judge Judy, daytime television’s answer to jurisprudence. Watch her wag a finger and declare, “Any reasonable person would have kept the receipt!” And the studio audience – hand-picked to agree with her every twitch – erupts in applause. It’s reasonableness as spectacle, the mob dressed up as jurisprudence.
Now scale that performance up to SCOTUS. The “reasonable person” test is embedded deep in the common law tradition, but the reasonable person is not you, me, or anyone who has actually missed a bus, pawned a wedding ring, or heard a neighbour’s radio through thin walls. No, the reasonable person is an imaginary, well-groomed gentleman of property whose intuitions happen to dovetail nicely with the prejudices of the bench. The Court, like Judge Judy, insists it is Reason incarnate, when in truth it is reasonableness-by-consensus, a carefully curated consensus at that.
McGilchrist is right that rationality, stripped of context, can lead to absurdity. But in elevating “reasonableness” as if it were a transcendent virtue, he mistakes projection for philosophy. A judge is “reasonable” only when her intuitions rhyme with yours. And when they don’t? Suddenly, she’s a madwoman in robes, and her “reasonableness” is exposed as nothing more than taste disguised as universal law.
The “reasonable person” – whether invoked by the Supreme Court or by Judge Judy – is a ghost that conveniently resembles the speaker. We imagine we’re appealing to some objective standard, when in fact we’re gazing into a mirror. The tragedy of schizophrenia, as McGilchrist notes, is to take metaphor literally. The tragedy of law and politics is the opposite: to dress literal bias in metaphor, to call it “reason,” and to applaud ourselves for our wisdom while the stage set burns behind us.
I just read The Granton Star Cause in Irvine Welsh’s short story collection, The Acid House, and couldn’t help but read it against Kafka’s Metamorphosis.
Kafka gave us Gregor Samsa: a man who wakes up as vermin, stripped of usefulness, abandoned by family, slowly rotting in a godless universe. His tragedy is inertia; his metamorphosis grants him no agency, only deeper alienation.
Audio: NotebookLM podcast on this topic.
Welsh replies with Boab Coyle, a lad who is likewise cast off, rejected by his football mates, scorned by his parents, dumped by his girlfriend, and discarded by his job. Boab is surplus to every domain: civic, familial, erotic, and economic. Then he undergoes his own metamorphosis. And here Welsh swerves from Kafka.
Boab meets his “god.” But the god is nothing transcendent. It is simply Boab’s latent agency, given a mask – a projection of his bitterness and thwarted desires. God looks like him, speaks like him, and tells him to act on impulses long repressed. Where Kafka leaves Gregor to die in silence, Welsh gives Boab a grotesque theology of vengeance.
Through a Critical Theory lens, the contrast is stark:
Marx: Both men are surplus. Gregor is disposable labour; Boab is Thatcher’s lumpen. Alienated, both become vermin.
Nietzsche: Gregor has no god, only the absurd. Boab makes one in his own image, not an Übermensch, but an Über-fly – quite literally a Superfly – a petty deity of spite.
Foucault: Gregor is disciplined into passivity by the family gaze. Boab flips it: as a fly, he surveils and annoys, becoming the pest-panopticon.
Bataille/Kristeva: Gregor embodies the abjection of his family’s shame. Boab revels in abjection, weaponising filth as his new mode of agency.
The punchline? Boab’s new god-agency leads straight to destruction. His rage is cathartic, but impotent. The lumpen are permitted vengeance only when it consumes themselves.
So Kafka gave us the tragedy of stasis; Welsh provides us with the tragedy of spite. Both are bleak parables of alienation, but Welsh injects a theology of bad attitude: a god who licenses action only long enough to destroy the actor.
I identify strongly with Irvine Welsh’s characters in Trainspotting – the book, not the sanitised film version. Especially with Mark Renton, whose voice strips away illusions with a brutality that borders on honesty.
Audio: NotebookLM podcast on this topic.
Consider this passage from the chapter “Bang to Rites” (pp. 86–87), where Renton attends the funeral of his mate Billy. Billy joined the army to escape the dead-end life they all shared, only to be killed on duty in Northern Ireland. Renton’s verdict:
“He died a hero they sais. Ah remember that song: ‘Billy Don’t Be A Hero’. In fact, he died a spare prick in a uniform, walking along a country road Wi a rifle in his hand. He died an ignorant victim ay imperialism, understanding fuck all about the myriad circumstances which led tae his death. That wis the biggest crime, he understood fuck all about it. Aw he hud tae guide um through this great adventure in Ireland, which led tae his death, wis a few vaguely formed sectarian sentiments.
“The cunt died as he lived: completely fuckin scoobied.
“His death wis good fir me. He made the News at Ten. In Warholian terms, the cunt had a posthumous fifteen minutes ay fame. People offered us sympathy, n although it wis misguided, it wis nice tae accept anywey. Ye dinnae want tae disappoint folk.
“Some ruling class cunt, a junior minister or something, says in his Oxbridge voice how Billy wis a brave young man. [1] He wis exactly the kind ay cunt they’d huv branded as a cowardly thug if he wis in civvy street rather than on Her Majesty’s Service. [2] This fucking walking abortion says that his killers will be ruthlessly hunted down. So they fuckin should. Aw the wey tae the fuckin Houses ay Parliament.
“Savour small victories against this white-trash tool of the rich that’s no no no”
[1] Renton doesn’t let anyone off the hook. Not Billy, not the army, not the Oxbridge suits who polish the tragedy into something fit for the News at Ten. The uniform is a costume, a disguise: a working-class lad suddenly deemed “brave” only because he was wearing the right outfit when he died. Strip away the uniform, and he’d have been dismissed as a thug or a waster.
[2] Renton’s root-cause analysis is unsparing. Billy wasn’t killed by the man with the gun so much as by the machine that put him there – the state, the ruling classes, the ones who spin death into “sacrifice” while continuing to shuffle the poor like pawns across the board.
It’s this clarity that makes Welsh’s work more than a drug novel. Trainspotting isn’t just about needles and nods; it’s about seeing through the charade. Renton despises both establishment and rebellion because both are performance, both hollow. His cynicism is the closest thing to honesty in a world that would rather dress up corpses in borrowed dignity.
And maybe that’s why I feel the affinity: because subversion matters more than allegiance, and sometimes the only truthful voice is the one that refuses to be polite at the funeral.
Karl Popper’s Paradox of Intolerance has become a kind of intellectual talisman, clutched like a rosary whenever fascists start goose-stepping into the town square. Its message is simple enough: to preserve tolerance, one must be intolerant of intolerance. Shine enough sunlight on bad ideas, and – so the pious hope – they’ll shrivel into dust like a vampire caught out at dawn.
If only.
The trouble with this Enlightenment fairy tale is that it presumes bad ideas melt under the warm lamp of Reason, as if ignorance were merely a patch of mildew waiting for the bleach of debate. But bad ideas are not bacteria; they are weeds, hydra-headed and delighting in the sun. Put them on television, and they metastasise. Confront them with logic, and they metastasise faster, now with a martyr’s halo.
Audio: NotebookLM podcast on this topic.
And here’s the part no liberal dinner-party theorist likes to face: the people most wedded to these “bad ideas” often don’t play the game of reason at all. Their critical faculties have been packed up, bubble-wrapped, and left in the loft decades ago. They don’t want dialogue. They want to chant. They don’t want evidence. They want affirmation. The Socratic method bounces off them like a ping-pong ball fired at a tank.
But let’s be generous. Suppose, just for a moment, we had Plato’s dream: a citizenry of Philosopher Kings™, all enlightened, all rational. Would democracy then work? Cue Arrow’s Impossibility Theorem, that mathematical killjoy which proves that even under perfect conditions – omniscient voters, saintly preferences, universal literacy – no voting system can aggregate individual rankings into a collective one while satisfying even a short list of minimal fairness conditions. Democracy can’t even get out of its own way on paper.
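The flavour of the problem fits in a few lines. A toy sketch (mine, not Arrow’s proof) of the Condorcet cycle that motivates the theorem: three individually rational voters, one incoherent majority.

```python
# Toy illustration: the classic Condorcet cycle underlying Arrow's result.
ballots = [
    ["A", "B", "C"],  # voter 1's ranking, best first
    ["B", "C", "A"],  # voter 2
    ["C", "A", "B"],  # voter 3
]

def majority_prefers(x, y):
    """True if a strict majority of voters rank x above y."""
    wins = sum(1 for b in ballots if b.index(x) < b.index(y))
    return wins > len(ballots) / 2

for x, y in [("A", "B"), ("B", "C"), ("C", "A")]:
    print(f"majority prefers {x} over {y}: {majority_prefers(x, y)}")
# Every individual ranking is transitive, yet the majority relation
# cycles A > B > C > A: no coherent collective ranking exists.
```

Each voter is perfectly consistent; it is only the aggregate “will of the people” that chases its own tail.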
Now throw in actual humans. Not the Platonic paragons, but Brexit-uncle at the pub, Facebook aunt with her memes, the American cousin in a red cap insisting a convicted felon is the second coming. Suddenly, democracy looks less like a forum of reasoned debate and more like a lottery machine coughing up numbers while we all pretend they mean “the will of the people.”
Democracy is the worst form of government, except for all the others.
And this is where the Churchill quip waddles in, cigar smoke curling round its bowler hat: “Democracy is the worst form of government, except for all the others.” Ah yes, Winston, do please save us with a quip so well-worn it’s practically elevator music. But the problem is deeper than taste in quotations. If democracy is logically impossible (Arrow) and practically dysfunctional (Trump, Brexit, fill in your own national catastrophe), then congratulating ourselves that it’s “better than the alternatives” is simply an admission that we’ve run out of imagination.
Because there are alternatives. A disinterested AI, for instance, could distribute resources with mathematical fairness, free from lobbyists and grievance-mongers. Nursery schools versus nursing homes? Feed in the data, spit out the optimal allocation. No shouting matches, no demagoguery, no ballots stuffed with slogans. But here the defenders of democracy suddenly become Derrida in disguise: “Ah, but what does fair really mean?” And just like that, we are back in the funhouse of rhetorical mirrors where “fair” is a word everyone loves until it costs them something.
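The nursery-versus-nursing-home example hides the catch in plain sight. A minimal sketch (hypothetical numbers, invented criteria) of why “feed in the data, spit out the optimal allocation” begs the question:

```python
# Hypothetical figures: one budget, two defensible notions of "fair",
# two different allocations. The maths cannot choose between them.
budget = 100.0
groups = {
    "nurseries":     {"population": 60, "need": 20},
    "nursing_homes": {"population": 40, "need": 80},
}

def allocate(criterion):
    """Split the budget in proportion to the chosen criterion."""
    total = sum(g[criterion] for g in groups.values())
    return {name: budget * g[criterion] / total for name, g in groups.items()}

print("per-capita fair: ", allocate("population"))
print("needs-based fair:", allocate("need"))
# Both results are "mathematically fair"; they simply encode
# different value judgements about what fairness means.
```

The disinterested AI computes flawlessly either way; someone still has to smuggle in the definition of “fair” before it starts.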
So perhaps democracy doesn’t require an “educated populace” at all; that was always just sugar-paper wrapping. It requires, instead, a population sufficiently docile, sufficiently narcotised by the spectacle, to accept the carnival of elections as a substitute for politics. Which is why calling the devotees of a Trump, or any other demagogue, a gaggle of lemmings is both accurate and impolitic: they know they’re not reasoning; they’re revelling. Your contempt merely confirms the script they’ve already written for you.
Video: Short callout to Karl Popper and Hilary Lawson.
The philosopher, meanwhile, is left polishing his lantern, muttering about reason to an audience who would rather scroll memes about paedophile pizza parlours. Popper warned us that tolerance cannot survive if it tolerates its own annihilation. Arrow proved that even if everyone were perfectly reasonable, the maths would still collapse. And Churchill, bless him, left us a one-liner to make it all seem inevitable.
Perhaps democracy isn’t the worst form of government except for all the others. Perhaps it’s simply the most palatable form of chaos, ballots instead of barricades, polling booths instead of pitchforks. And maybe the real scandal isn’t that people are too stupid for democracy, but that democracy was never designed to be about intelligence in the first place. It was always about managing losers while telling them they’d “had their say.”
The Enlightenment promised us reason; what it delivered was a carnival where the loudest barker gets the booth. The rest of us can either keep muttering about paradoxes in the corner or admit that the show is a farce and start imagining something else.
I was a professional musician in the 1980s. I played guitar, but this was always a sideline to my real work as a recording engineer and producer. Competence, not virtuosity, was the coin of the realm in the studio, and I was competent. Still, I spent much of my time surrounded by musicians who left me slack-jawed: people who could sight-read Bach at breakfast and bash out Van Halen riffs after lunch without missing a beat. Next to them, I was, charitably, merely competent.
That’s the thing about competence: it doesn’t make you the star, but it keeps the machine running. I knew I wasn’t the flash guitarist or the prodigy bassist, but I could play my parts cleanly and hold a band together. When later groups already had lead guitarists, I played bass. Was I a bassist? No. But I was competent enough to lock in with the drummer and serve the ensemble. Nobody mistook me for a virtuoso, least of all me. I wasn’t an impostor; I was a cog in the machine, good enough to keep the show on the road. That was my ego attachment: not “musician” as identity, but member of a band.
The Hallucination of “Impostor Syndrome”
Much ink is spilt on impostor syndrome, that anxious whisper that one is a fraud who doesn’t belong. The polite story is that it’s just nerves: you are competent, you do belong, you’re simply holding yourself against impossible standards. Nonsense. The truth is darker. Most people are impostors.
The nervous tension is not a malfunction of self-esteem; it’s a moment of clarity. A faint recognition that you’ve been miscast in a role you can’t quite play, but are forced to mime anyway. The Peter Principle doesn’t kick in at some distant managerial plateau; it’s the basic law of organisational gravity. People rise past their competence almost immediately, buoyed not by skill but by connections, bluff, and HR’s obsession with “fit.”
The Consultant’s View from the Cheap Seats
As a Management Consultant™, I met countless “leaders” whose only discernible talent was staying afloat whilst already over their heads. Organisations, too blind or too immature to notice, rewarded them with raises and promotions anyway. Somebody’s got to get them, after all. HR dutifully signed the paperwork, called it “talent management,” and congratulated itself on another triumph of culture-fit over competence.
In music, incompetence is self-correcting: audiences walk out, bands dissolve, the market punishes mediocrity. In corporate life, incompetence metastasises. Bluffers thrive. Mediocrity is embalmed, padded with stock options, and paraded on stage at leadership summits.
Good Enough vs. Bluff Enough
Competence, though, is underrated. You don’t need to be the best guitarist or the savviest CEO. You need to be good enough for the role you’re actually playing, and honest enough not to mistake the role for your identity. In bands, that worked fine. In business and politics, it’s subversive. The whole edifice depends on people pretending to be more than they are, rehearsing confidence in lieu of competence.
No wonder impostor syndrome is rampant. It’s not a pathology; it’s the ghost of truth in a system of lies.
The antidote isn’t TED-talk therapy or self-affirmation mantras. It’s honesty: admit the limits of your competence, stop mistaking ego for ability, and refuse to play HR’s charade. Competence is enough. The rest is noise.
Let us disabuse ourselves of one of the workplace’s most cherished delusions: that Human Resources is there for the humans. HR is not your therapist, not your advocate, not your confessor. HR is an appendage of the organisation, and like all appendages, its nerve endings run straight back to the corporate brain. Its “concern” for your well-being is merely a prophylactic against lawsuits and productivity dips. The error is ours; we persist in mistaking the guard dog for a pet.
Audio: NotebookLM podcast on this topic.
Bal and Dóci’s 2018 paper in the European Journal of Work and Organizational Psychology (EJWOP) tears the mask off this charade. They demonstrate how neoliberal ideology has seeped, unseen, into both workplace practice and the very research that pretends to study it objectively. Through the lenses of political, social, and fantasmatic logics, they show that neoliberalism has convinced us of three dangerous fairy tales:
Instrumentality: people are not people but “resources,” as fungible as printer ink.
Individualism: you are not part of a collective but a lone entrepreneur of the self, shackled to your CV like a Victorian debtor.
Competition: you are locked in an endless cage fight with your colleagues, grinning through the blood as you “collaborate.”
These logics are then dressed up in fantasies to keep us compliant: the fantasy of freedom (“you’re free to negotiate your own zero-hours contract”), the fantasy of meritocracy (“you got that promotion because you’re brilliant, not because you went to the right school”), and the fantasy of progress (“growth is good, even if it kills you”).
Those of us with an interest in Behavioural Economics had naively hoped that the mythical homo economicus, that laughable caricature of a rational, utility-maximising automaton, would by now be filed under “anachronistic curiosities.” Yet in corporate domains, this zombie shuffles on, cosseted and cultivated by neoliberal ideology. Far from being discredited, homo economicus remains a protected species, as if the boardroom were some Jurassic Park of bad economics.
The brilliance – and the horror – is that even the academics meant to be studying work and organisations have been captured by the same ideology. Work and Organisational Psychology (WOP) too often frames employees as variables in a productivity equation, measuring “engagement” only in terms of its effect on shareholder value. The worker’s humanity is rendered invisible; the employee exists only insofar as they generate output.
So when HR offers you a mindfulness app or a “resilience workshop”, remember: these are not gifts but obligations. They are ways of making you responsible for surviving a system designed to grind you down. The neoliberal trick is to convince you that your suffering is your own fault, that if only you had been more proactive, more adaptable, more “employable”, you wouldn’t be so crushed beneath the wheel.
Bal and Dóci are right: the way forward is to re-politicise and re-humanise organisational studies, to see workers as humans rather than performance units. But until then, expect HR to keep smiling while sharpening its knives.
Absolute liberty means absolute liberty, but what if the liberty you seek is death? The moment you carve out exceptions – speech you can’t say, choices you can’t make, exits you can’t take – you’ve left the realm of liberty and entered the gated community of permission.
And here’s the test most self-styled liberty lovers fail: you’re free to skydive without a parachute, but try ending your life peacefully and watch how quickly the freedom brigade calls in the moral SWAT team.
If the right to live doesn’t come with the right to leave, it’s not a right. It’s a sentence.
I’m not his usual audience; I’m already in the choir, but this eight-minute clip by Philosopher Muse is worth your time. It’s a lucid walk through the ethical terrain mapped by Sarah Perry in Every Cradle Is a Grave, and it’s one of the better distillations of antinatalist thought I’ve seen for the general public. Perry’s libertarian starting point is straightforward: if you truly own your life, you must also have the right to give it up.
He threads in the dark-glimmer insights of Emil Cioran’s poetic despair, Thomas Ligotti’s existential horror, David Benatar’s asymmetry, and Peter Wessel Zapffe’s tragic consciousness. Together they point to an uncomfortable truth: autonomy that stops short of death isn’t autonomy at all; it’s a petting zoo of freedoms where the gate is locked.
What we do not own, we are enslaved to. If we cannot choose death, then we do not truly own our lives.
—Sarah Perry
I’ve said this before, but it bears repeating. I once had a girlfriend who hated her life but was too afraid of hell to end it. She didn’t “pull through.” She overdosed by accident. Loophole closed, I suppose. That’s what happens when metaphysical prohibitions are allowed to run the operating system.
And here’s where I diverge from the purist libertarians. I don’t believe most people are competent enough to have the liberty they think they deserve. Not because they’re all dribbling idiots, but because they’ve been marinated for generations in a stew of indoctrination. For thousands of years, nobody talked about “liberty” or “freedom” as inalienable rights. Once the notion caught on, it was packaged and sold – complete with an asterisk, endless fine print, and a service desk that’s never open.
We tell ourselves we’re free, but only in the ways that don’t threaten the custodians. You can vote for whoever the party machine serves up, but you can’t opt out of the game. You can live any way you like, as long as it looks enough like everyone else’s life. You can risk death in countless state-approved ways, but the moment you calmly choose it, your autonomy gets revoked.
Sometimes the most loving thing we can do is not to rescue but to listen – not to prevent, but to allow.
So yes, watch the video. Read Perry’s Every Cradle Is a Grave. Then ask yourself whether your liberty is liberty, or just a longer leash.
If liberty means anything, it means the right to live and the right to leave. The former without the latter is just life imprisonment with better marketing.
A response on another social media site got me thinking about another Sorites paradox. The notion just bothers me. I’ve long held that it is less a paradox than an intellectually lazy way to manoeuvre around language insufficiency.
<rant>
The law loves a nice, clean number. Eighteen to vote. Sixteen to marry. This-or-that to consent. As if we all emerge from adolescence on the same morning like synchronised cicadas, suddenly equipped to choose leaders, pick spouses, and spot the bad lovers from the good ones.
But the Sorites paradox gives the game away: if you’re fit to vote at 18 years and 0 days, why not at 17 years, 364 days? Why not 17 years, 363 days? Eventually, you’re handing the ballot to a toddler who thinks the Prime Minister is Peppa Pig. Somewhere between there and adulthood, the legislator simply throws a dart and calls it “science.”
To bolster this fiction, we’re offered pseudo-facts: “Women mature faster than men”, or “Men’s brains don’t finish developing until thirty.” These claims, when taken seriously, only undermine the case for a single universal threshold. If “maturity” were truly the measure, we’d have to track neural plasticity curves, hormonal arcs, and a kaleidoscope of individual factors. Instead, the state settles for the cheapest approximation: a birthday.
This obsession with fixed thresholds is the bastard child of Enlightenment rationalism — the fantasy that human variation can be flattened into a single neat line on a chart. The eighteenth-century mind adored universals: universal reason, universal rights, universal man. In this worldview, there must be one age at which all are “ready,” just as there must be one unit of measure for a metre or a kilogram. It is tidy, legible, and above all, administratively convenient.
Cue the retorts:
“We need something.” True, but “something” doesn’t have to mean a cliff-edge number. We could design systems of phased rights, periodic evaluations, or contextual permissions — approaches that acknowledge people as more than interchangeable cut-outs from a brain-development chart.
“It would be too complicated.” Translation: “We prefer to be wrong in a simple way than right in a messy way.” Reality is messy. Pretending otherwise isn’t pragmatism; it’s intellectual cowardice. Law is supposed to contend with complexity, not avert its gaze from it.
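To show the first retort isn’t hand-waving, here is what a phased right might look like in code – a sketch with thresholds entirely invented for illustration, not a policy proposal:

```python
# Sketch of a "phased right": instead of a cliff-edge birthday, an
# entitlement ramps in gradually. The 14-22 window below is invented
# purely for illustration; a real scheme could also fold in
# individual assessment rather than age alone.
def voting_weight(age_years: float) -> float:
    """Fractional ballot weight, phased in linearly between 14 and 22."""
    lo, hi = 14.0, 22.0
    if age_years <= lo:
        return 0.0
    if age_years >= hi:
        return 1.0
    return (age_years - lo) / (hi - lo)

for age in (13, 16, 18, 20, 23):
    print(age, round(voting_weight(age), 2))
```

A ramp is no less arbitrary in its endpoints than a cliff, but it at least stops pretending that 17 years and 364 days is categorically different from the next morning.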
And so we persist, reducing a continuous, irregular, and profoundly personal process to an administratively convenient fiction — then dressing it in a lab coat to feign objectivity. A number is just a number, and in this case, a particularly silly one.
</rant>
Yesterday, I suggested democracy is a mediocre theatre production where the audience gets to choose which mediocre understudy performs. Some readers thought I was being harsh. I wasn’t.
A mate recently argued that humans will always be superior to AI because of emergence, the miraculous process by which complexity gives rise to intelligence, creativity, and emotion. Lovely sentiment. But here’s the rub: emergence is also how we got this political system, the one no one really controls anymore.
Like the human body being mostly non-human microbes, our so-called participatory government is mostly non-participatory components: lobbyists, donors, bureaucrats, corporate media, careerists, opportunists, the ecosystem that is the actual organism. We built it, but it now has its own metabolism. And thanks to the law of large numbers, multiplied by the sheer number of political, economic, and social dimensions in play, even the human element is diluted into statistical irrelevance. At any rate, what remains of it has lost control – like the sorcerer’s apprentice.
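The dilution claim can be made concrete with a toy model (my sketch, not the essay’s): in a simple majority vote among independent coin-flip voters, the chance that any one ballot is decisive shrinks roughly like 1/√n.

```python
import math

# Toy model: 2n other voters each flip a fair coin. Your ballot is
# decisive only on an exact tie among them, which has probability
# C(2n, n) / 4^n ~ 1 / sqrt(pi * n) as n grows.
def pivotal_probability(n: int) -> float:
    return math.comb(2 * n, n) / 4**n

for n in (10, 1_000, 100_000):
    print(f"electorate ~{2 * n}: P(pivotal) = {pivotal_probability(n):.5f}")
```

Real electorates are not coin flips, but the qualitative point survives: scale alone grinds the individual contribution toward statistical irrelevance.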
People like to imagine they can “tame” this beast, the way a lucid dreamer thinks they can bend the dream to their will. But you’re still dreaming. The narrative still runs on the dream’s logic, not yours. The best you can do is nudge it; a policy tweak here, a symbolic vote there, before the system digests your effort and excretes more of itself.
A bad system beats a good person every time.
—W. Edwards Deming
This is why Deming’s line hits so hard: a bad system beats a good person every time. Even if you could somehow elect the Platonic ideal of leadership, the organism would absorb them, neutralise them, or spit them out. It’s not personal; it’s structural.
And yet we fear AI “taking over,” as if that would be a radical departure from the status quo. Newsflash: you’ve already been living under an autonomous system for generations. AI would just be a remodel of the control room, new paint, same prison.
So yes, emergence makes humans “special.” It also makes them the architects of their own inescapable political microbiome. Congratulations, you’ve evolved the ability to build a machine that can’t be turned off.