Becoming a Woman with Penetration Politics

Male flatworms, those primordial swordsmen of the slime, have invented what can only be described as penetration politics. They don’t seduce; they don’t serenade; they don’t even swipe right. They duel. Penises out, sabres up, they jab at one another in a tiny, biological cockfight until one is stabbed into submission. The “winner” ejaculates his way to freedom, while the “loser” becomes a mother by default. Gender, in flatworm society, is not destiny; it’s a duel with dicks for sabres.

Audio: NotebookLM podcast on this topic.

Errata: Upon further research, I share additional information on my author site.

Beauvoir once reminded us: “One is not born, but rather becomes, a woman.” The flatworm demonstrates this principle with obscene literalness. You are not born female. You become female when you lose the fight and get stabbed full of sperm. Congratulations: you’ve been penis-fenced into maternity.

And here we can smuggle in that old feminist provocation – every man is a rapist. Not in the polite, bourgeois sense of candlelight coercion, but in the bare biological logic of the worm. To inseminate is to penetrate; to penetrate is to conquer; to conquer is to outsource the cost of life onto someone else’s body. The duel is just foreplay for the inevitable violation. Consent, in worm-world, is as fictional as a unicorn with a diaphragm. The “winner” is celebrated precisely because he doesn’t have to consent to anything afterwards – he stabs, struts, and slips away, leaving the loser’s body to incubate the consequences.

Now, humanity likes to pretend it has outgrown this. We have laws, customs, and etiquette. We invented flowers, chocolates, and marriage vows. But scratch the surface, and what do you find? Penetration politics. Who gets to wield the dick, who gets saddled with the debt. The radical feminists weren’t entirely wrong: structurally, culturally, biologically, the male role has been defined as penetration – and penetration, whether dressed in lace or latex, is always a form of conquest.

The worm is honest. We are hypocrites. They fence with their penises and accept the consequences. We fence with our laws, our armies, our religions, our institutions – and still manage to convince ourselves we’re civilised.

So yes, The Left Hand of Darkness can keep its glacial androgynes. For a metaphor that actually explains our sorry state, look no further than penis-fencing flatworms: every thrust a power play, every victory a rape in miniature, every loss a womb conscripted. Humanity in a nutshell – or rather, in a stab wound.

HR’s Neoliberal Mirage: Human Resources Without the Humans

Let us disabuse ourselves of one of the workplace’s most cherished delusions: that Human Resources is there for the humans. HR is not your therapist, not your advocate, not your confessor. HR is an appendage of the organisation, and like all appendages, its nerve endings run straight back to the corporate brain. Its “concern” for your well-being is merely a prophylactic against lawsuits and productivity dips. The error is ours; we persist in mistaking the guard dog for a pet.

Audio: NotebookLM podcast on this topic.

Bal and Dóci’s 2018 paper in the European Journal of Work and Organizational Psychology (EJWOP) tears the mask off this charade. They demonstrate how neoliberal ideology has seeped, unseen, into both workplace practice and the very research that pretends to study it objectively. Through the lenses of political, social, and fantasmatic logics, they show that neoliberalism has convinced us of three dangerous fairy tales:

  • Instrumentality: people are not people but “resources,” as fungible as printer ink.
  • Individualism: you are not part of a collective but a lone entrepreneur of the self, shackled to your CV like a Victorian debtor.
  • Competition: you are locked in an endless cage fight with your colleagues, grinning through the blood as you “collaborate.”

These logics are then dressed up in fantasies to keep us compliant: the fantasy of freedom (“you’re free to negotiate your own zero-hours contract”), the fantasy of meritocracy (“you got that promotion because you’re brilliant, not because you went to the right school”), and the fantasy of progress (“growth is good, even if it kills you”).

Those of us with an interest in Behavioural Economics had naively hoped that the mythical homo economicus, that laughable caricature of a rational, utility-maximising automaton, would by now be filed under “anachronistic curiosities.” Yet in corporate domains, this zombie shuffles on, cosseted and cultivated by neoliberal ideology. Far from being discredited, homo economicus remains a protected species, as if the boardroom were some Jurassic Park of bad economics.

The brilliance and the horror is that even the academics meant to be studying work and organisations have been captured by the same ideology. Work and Organisational Psychology (WOP) too often frames employees as variables in a productivity equation, measuring “engagement” only in terms of its effect on shareholder value. The worker’s humanity is rendered invisible; the employee exists only insofar as they generate output.

So when HR offers you a mindfulness app or a “resilience workshop,” remember: these are not gifts but obligations. They are ways of making you responsible for surviving a system designed to grind you down. The neoliberal trick is to convince you that your suffering is your own fault, that if only you had been more proactive, more adaptable, more “employable,” you wouldn’t be so crushed beneath the wheel.

Bal and Dóci are right: the way forward is to re-politicise and re-humanise organisational studies, to see workers as humans rather than performance units. But until then, expect HR to keep smiling while sharpening its knives.

The Red Flag of Truth

Nothing says “I’ve stopped thinking” quite like someone waving the banner of Truth. The word itself, when capitalised and flapped about like a holy relic, isn’t a signal of wisdom but of closure. A red flag.

The short video by Jonny Thompson that inspired this post.

Those who proclaim to “speak the Truth” or “know the Truth” rarely mean they’ve stumbled upon a tentative insight awaiting refinement. No, what they mean is: I have grasped reality in its totality, and—surprise!—it looks exactly like my prejudices. It’s the epistemic equivalent of a toddler declaring ownership of the playground by drooling on the swings.

The Fetish of Objectivity

The conceit is that Truth is singular, objective, eternal, a monolithic obelisk towering over human folly. But history’s scrapyard is full of such obelisks, toppled and broken: phlogiston, bloodletting, Manifest Destiny, “the market will regulate itself.” Each was once trumpeted as capital-T Truth. Each is now embarrassing clutter for the dustbin.

Still, the zealots never learn. Every generation delivers its own batch of peddlers, flogging their version of Truth as if it were snake oil guaranteed to cure ignorance and impotence. (Side effects may include dogmatism, authoritarianism, and an inability to read the room.)

Why It’s a Red Flag

When someone says, “It’s just the truth”, what they mean is “I am not listening,” like the parent who argues, “because I said so.” Dialogue is dead; curiosity cremated. Truth, in their hands, is less a lantern than a cosh. It is wielded not to illuminate, but to bludgeon.

Ralph Waldo Emerson’s voice breaks in, urging us to trust ourselves and to think for ourselves. Nothing is more degrading than to borrow another’s convictions wholesale and parade them as universal law. Better to err in the wilderness of one’s own reason than to be shepherded safely into another man’s paddock of certainties.

A Better Alternative

Rather than fetishising Truth, perhaps we ought to cultivate its neglected cousins: curiosity, provisionality, and doubt. These won’t look as good on a placard, admittedly. Picture a mob waving banners emblazoned with Ambiguity! – not exactly the stuff of revolutions. But infinitely more honest, and infinitely more humane.

So when you see someone waving the flag of Truth, don’t salute. Recognise it for what it is: a warning sign. Proceed with suspicion, and for God’s sake, bring Emerson.

Don’t Tread on My Ass

Absolute liberty means absolute liberty, but what if the liberty you seek is death? The moment you carve out exceptions – speech you can’t say, choices you can’t make, exits you can’t take – you’ve left the realm of liberty and entered the gated community of permission.

Video: YouTube vid by Philosopher Muse.

And here’s the test most self-styled liberty lovers fail: you’re free to skydive without a parachute, but try ending your life peacefully and watch how quickly the freedom brigade calls in the moral SWAT team.

I’m not his usual audience – I’m already in the choir – but this eight-minute clip by Philosopher Muse is worth your time. It’s a lucid walk through the ethical terrain mapped by Sarah Perry in Every Cradle Is a Grave, and it’s one of the better distillations of antinatalist thought I’ve seen for the general public. Perry’s libertarian starting point is straightforward: if you truly own your life, you must also have the right to give it up.

He threads in the dark-glimmer insights of Emil Cioran’s poetic despair, Thomas Ligotti’s existential horror, David Benatar’s asymmetry, and Peter Wessel Zapffe’s tragic consciousness. Together they point to an uncomfortable truth: autonomy that stops short of death isn’t autonomy at all; it’s a petting zoo of freedoms where the gate is locked.

I’ve said this before, but it bears repeating. I once had a girlfriend who hated her life but was too afraid of hell to end it. She didn’t “pull through.” She overdosed by accident. Loophole closed, I suppose. That’s what happens when metaphysical prohibitions are allowed to run the operating system.

And here’s where I diverge from the purist libertarians. I don’t believe most people are competent enough to have the liberty they think they deserve. Not because they’re all dribbling idiots, but because they’ve been marinated for generations in a stew of indoctrination. For thousands of years, nobody talked about “liberty” or “freedom” as inalienable rights. Once the notion caught on, it was packaged and sold – complete with an asterisk, endless fine print, and a service desk that’s never open.

We tell ourselves we’re free, but only in the ways that don’t threaten the custodians. You can vote for whoever the party machine serves up, but you can’t opt out of the game. You can live any way you like, as long as it looks enough like everyone else’s life. You can risk death in countless state-approved ways, but the moment you calmly choose it, your autonomy gets revoked.

So yes, watch the video. Read Perry’s Every Cradle Is a Grave. Then ask yourself whether your liberty is liberty, or just a longer leash.

If liberty means anything, it means the right to live and the right to leave. The former without the latter is just life imprisonment with better marketing.

Ages of Consent: A Heap of Nonsense

A response on another social media site got me thinking about the Sorites paradox again. The notion just bothers me: I’ve long held that it is less a paradox than an intellectually lazy way to manoeuvre around language insufficiency.

<rant>

The law loves a nice, clean number. Eighteen to vote. Sixteen to marry. This-or-that to consent. As if we all emerge from adolescence on the same morning like synchronised cicadas, suddenly equipped to choose leaders, pick spouses, and spot the bad lovers from the good ones.

But the Sorites paradox gives the game away: if you’re fit to vote at 18 years and 0 days, why not at 17 years, 364 days? Why not 17 years, 363 days? Eventually, you’re handing the ballot to a toddler who thinks the Prime Minister is Peppa Pig. Somewhere between there and adulthood, the legislator simply throws a dart and calls it “science.”
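For the formally inclined, the dart-throwing can be stated as a two-premise argument. Here is a minimal sketch in logical notation, with C(t) as an illustrative stand-in for “competent to vote at age t” – my framing, not anything found in a statute book:

```latex
% A two-premise rendering of the voting-age Sorites.
% C(t): "a person of age t is competent to vote" (illustrative predicate only).
\[
  C(18\ \text{years})
  \;\wedge\;
  \forall t\,\bigl(C(t) \rightarrow C(t - 1\ \text{day})\bigr)
  \;\Longrightarrow\;
  C(t') \quad \text{for every age } t' \le 18\ \text{years}.
\]
% Accept both premises and induction hands the ballot to the toddler;
% reject the conclusion and some single day must be the one that confers
% competence, which is precisely the arbitrariness being ranted about here.
```

Accept the tolerance premise (one day cannot matter) and the absurdity follows; reject it, and you are obliged to name the single morning on which competence arrives.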

To bolster this fiction, we’re offered pseudo-facts: “Women mature faster than men”, or “Men’s brains don’t finish developing until thirty.” These claims, when taken seriously, only undermine the case for a single universal threshold. If “maturity” were truly the measure, we’d have to track neural plasticity curves, hormonal arcs, and a kaleidoscope of individual factors. Instead, the state settles for the cheapest approximation: a birthday.

This obsession with fixed thresholds is the bastard child of Enlightenment rationalism — the fantasy that human variation can be flattened into a single neat line on a chart. The eighteenth-century mind adored universals: universal reason, universal rights, universal man. In this worldview, there must be one age at which all are “ready,” just as there must be one unit of measure for a metre or a kilogram. It is tidy, legible, and above all, administratively convenient.

Cue the retorts:

  • “We need something.” True, but “something” doesn’t have to mean a cliff-edge number. We could design systems of phased rights, periodic evaluations, or contextual permissions — approaches that acknowledge people as more than interchangeable cut-outs from a brain-development chart.
  • “It would be too complicated.” Translation: “We prefer to be wrong in a simple way than right in a messy way.” Reality is messy. Pretending otherwise isn’t pragmatism; it’s intellectual cowardice. Law is supposed to contend with complexity, not avert its gaze from it.

And so we persist, reducing a continuous, irregular, and profoundly personal process to an administratively convenient fiction — then dressing it in a lab coat to feign objectivity. A number is just a number, and in this case, a particularly silly one.

</rant>

Democracy: The Idiot’s Opiate, The Sequel Nobody Asked For

Yesterday, I suggested democracy is a mediocre theatre production where the audience gets to choose which mediocre understudy performs. Some readers thought I was being harsh. I wasn’t.

A mate recently argued that humans will always be superior to AI because of emergence, the miraculous process by which complexity gives rise to intelligence, creativity, and emotion. Lovely sentiment. But here’s the rub: emergence is also how we got this political system, the one no one really controls anymore.

Just as the human body is mostly non-human microbes, our so-called participatory government is mostly non-participatory components: lobbyists, donors, bureaucrats, corporate media, careerists, opportunists – the ecosystem that is the actual organism. We built it, but it now has its own metabolism. And thanks to the law of large numbers, multiplied by the sheer number of political, economic, and social dimensions in play, even the human element is diluted into statistical irrelevance. At any rate, what remains of it has lost control – like the sorcerer’s apprentice.

People like to imagine they can “tame” this beast, the way a lucid dreamer thinks they can bend the dream to their will. But you’re still dreaming. The narrative still runs on the dream’s logic, not yours. The best you can do is nudge it: a policy tweak here, a symbolic vote there, before the system digests your effort and excretes more of itself.

This is why Deming’s line hits so hard: a bad system beats a good person every time. Even if you could somehow elect the Platonic ideal of leadership, the organism would absorb them, neutralise them, or spit them out. It’s not personal; it’s structural.

And yet we fear AI “taking over,” as if that would be a radical departure from the status quo. Newsflash: you’ve already been living under an autonomous system for generations. AI would just be a remodel of the control room, new paint, same prison.

So yes, emergence makes humans “special.” It also makes them the architects of their own inescapable political microbiome. Congratulations, you’ve evolved the ability to build a machine that can’t be turned off.

Democracy: Opiate of the Masses

Democracy is sold, propagandised, really, as the best system of governance we’ve ever devised, usually with the grudging qualifier “so far.” It’s the Coca-Cola of political systems: not particularly good for you, but so entrenched in the cultural bloodstream that to question it is tantamount to treason.

Audio: NotebookLM Podcast on this topic.

The trouble is this: democracy depends on an electorate that is both aware and capable. Most people are neither. Worse still, even if they could be aware, they wouldn’t be smart enough to make use of it. And even if they were smart enough, Arrow’s Impossibility Theorem strolls in, smirking, to remind us that the whole thing is mathematically doomed anyway.
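Since Arrow gets name-dropped so often and explained so rarely, here is a minimal sketch of the intuition behind that doom – not Arrow’s proof, just the classic Condorcet cycle his theorem generalises, with voters and candidates invented purely for illustration:

```python
from itertools import combinations

# Three voters, three candidates, each ballot ranked best to worst.
# These preferences are the textbook Condorcet cycle, not real polling data.
ballots = [
    ["A", "B", "C"],
    ["B", "C", "A"],
    ["C", "A", "B"],
]

def prefers(ballot, x, y):
    """True if this ballot ranks candidate x above candidate y."""
    return ballot.index(x) < ballot.index(y)

for x, y in combinations("ABC", 2):
    x_votes = sum(prefers(b, x, y) for b in ballots)
    winner = x if x_votes > len(ballots) - x_votes else y
    print(f"{x} vs {y}: majority prefers {winner}")

# Prints: A beats B, C beats A, B beats C.
# Every individual voter is perfectly consistent; the majority is not.
```

Three impeccably rational voters, one collective preference that chases its own tail. Arrow’s result extends the misery: with three or more options, no ranked voting rule can aggregate preferences while satisfying a handful of reasonable-sounding conditions at once.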

In the United States, the average IQ sits at around 97 – hardly a figure that instils confidence in votes and outcomes. But even that number is a charade. IQ measures how well you navigate the peculiar obstacle course we’ve designed as “education,” not the whole terrain of human thought. It’s as culturally loaded as asking a fish to climb a tree, then declaring it dim-witted when it flops. We call it intelligence because it flatters those already rewarded by the system that designed the test.

The Enlightenment gents who pushed democracy weren’t exactly selfless visionaries. They already had power, and simply repackaged it as something everyone could share, much as the clergy promised eternal reward to peasants if they only kept their heads down. Democracy is merely religion with ballots instead of bibles: an opiate for the masses, sedating the population with the illusion of influence.

Worse still, it’s a system optimised for mediocrity. It rewards consensus, punishes brilliance, and ensures the average voter is, by definition, average. Living under it is like starring in Idiocracy, only without the comedic relief, just the grim recognition that you’re outnumbered, and the crowd is cheering the wrong thing.

The Myth of Causa Sui Creativity

(or: Why Neither Humans nor AI Create from Nothing)

In the endless squabble over whether AI can be “creative” or “intelligent,” we always end up back at the same semantic swamp. At the risk of poking the bear, I have formulated a response. Creativity is either whatever humans do, or whatever humans do that AI can’t. Intelligence is either the general ability to solve problems or a mysterious inner light that glows only in Homo sapiens. The definitions shift like sand under the feet of the argument.

Audio: NotebookLM podcast on this topic.

Strip away the romance, and the truth is far less flattering: neither humans nor AI conjure from the void. Creativity is recombination, the reconfiguration of existing material into something unfamiliar. Intelligence is the ability to navigate problems using whatever tools and heuristics one has to hand.

The Causa Sui conceit, the idea that one can be the cause of oneself, is incoherent in art, thought, or physics. Conservation of energy applies as much to ideas as to atoms.

  • Humans consume inputs: books, conversations, music, arguments, TikTok videos.
  • We metabolise them through cognitive habits, biases, and linguistic forms.
  • We output something rearranged, reframed, sometimes stripped to abstraction.

The AI process is identical in structure, if not in substrate: ingest vast data, run it through a model, output recombination. The difference is that AI doesn’t pretend otherwise.
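For anyone who finds the structural claim too abstract, here is a deliberately crude sketch – a toy bigram recombiner with an invented corpus, nothing remotely like a real language model – that makes the ingest-metabolise-output loop concrete:

```python
import random
from collections import defaultdict

# A toy recombiner: ingest text, tabulate which word follows which,
# then emit a remix. Purely illustrative; no resemblance to an actual LLM.
corpus = (
    "creativity is recombination and recombination is the reworking of "
    "existing material and intelligence is the navigation of problems"
).split()

follows = defaultdict(list)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current].append(nxt)

random.seed(1)
word = random.choice(corpus)
remix = [word]
for _ in range(12):
    options = follows.get(word)
    word = random.choice(options) if options else random.choice(corpus)
    remix.append(word)

print(" ".join(remix))
# The output reads as "new", yet every word-to-word step was ingested first.
```

Nothing in the remix was conjured from the void; every transition was consumed before it was emitted. Scale that up and, per the argument above, the structure is unchanged even if the machinery is.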

When a human produces something impressive, we call it creative without inspecting the provenance of the ideas. When an AI produces something impressive, we immediately trace the lineage of its inputs, as if the human mind weren’t doing the same. This is not epistemic rigour; it’s tribal boundary enforcement.

The real objection to AI is not that it fails the test of creativity or intelligence; it’s that it passes the functional test without being part of the club. Our stories about human exceptionalism require a clear line between “us” and “it,” even if we have to draw that line through semantic fog.

My Language Insufficiency Hypothesis began with the recognition that language cannot fully capture the reality it describes. Here, the insufficiency is deliberate; the words “creativity” and “intelligence” are kept vague so they can always be shifted away from anything AI achieves.

I cannot be causa sui, and neither can you. The only difference is that I’m willing to admit it.

The Enlightenment: A Postmortem

Or: How the Brightest Ideas in Europe Got Us into This Bloody Mess

Disclaimer: This post is entirely the output of ChatGPT 4o, from a conversation on the failure and anachronism of Enlightenment promises. I’m trying to finish editing my next novel, so I can’t justify taking much more time to share what are ultimately my thoughts as expounded upon by generative AI. I may comment personally in future. Until then, this is what I have to share.

AI Haters, leave now, or abandon all hope, ye who enter here.


The Enlightenment promised us emancipation from superstition, authority, and ignorance. What we got instead was bureaucracy, colonialism, and TED Talks. We replaced divine right with data dashboards and called it progress. And like any good inheritance, the will was contested, and most of us ended up with bugger-all.

Below, I take each Enlightenment virtue, pair it with its contemporary vice, and offer a detractor who saw through the Enlightenment’s powder-wigged charade. Because if we’re going down with this ship, we might as well point out the dry rot in the hull.


1. Rationalism

The Ideal: Reason shall lead us out of darkness.
The Reality: Reason led us straight into the gas chambers—with bureaucratic precision.

Detractor: Max Horkheimer & Theodor Adorno

“Enlightenment is totalitarian.”
Dialectic of Enlightenment (1944)

Horkheimer and Adorno saw what reason looks like when it slips off its leash. Instrumental rationality, they warned, doesn’t ask why—it only asks how efficiently. The result? A world where extermination is scheduled, costs are optimised, and ethics are politely filed under “subjective.”


2. Empiricism

The Ideal: Observation and experience will uncover truth.
The Reality: If it can’t be measured, it can’t be real. (Love? Not statistically significant.)

Detractor: Michel Foucault

“Truth isn’t outside power… truth is a thing of this world.”
Power/Knowledge (1977)

Foucault dismantled the whole edifice. Knowledge isn’t neutral; it’s an instrument of power. Empiricism becomes just another way of disciplining the body—measuring skulls, classifying deviants, and diagnosing women with “hysteria” for having opinions.


3. Individualism

The Ideal: The sovereign subject, free and self-determining.
The Reality: The atomised consumer, trapped in a feedback loop of self-optimisation.

Detractor: Jean Baudrillard

“The individual is no longer an autonomous subject but a terminal of multiple networks.”
Simulacra and Simulation (1981)

You wanted autonomy? You got algorithms. Baudrillard reminds us that the modern “individual” is a brand in search of market validation. You are free to be whoever you want, provided it fits within platform guidelines and doesn’t disrupt ad revenue.


4. Secularism

The Ideal: Liberation from superstition.
The Reality: We swapped saints for STEMlords and called it even.

Detractor: Charles Taylor

“We are now living in a spiritual wasteland.”
A Secular Age (2007)

Taylor—perhaps the most polite Canadian apocalypse-whisperer—reminds us that secularism didn’t replace religion with reason; it replaced mystery with malaise. We’re no longer awed, just “motivated.” Everything is explainable, and yet somehow nothing means anything.


5. Progress

The Ideal: History is a forward march toward utopia.
The Reality: History is a meat grinder in a lab coat.

Detractor: Walter Benjamin

“The storm irresistibly propels him into the future to which his back is turned.”
Theses on the Philosophy of History (1940)

Benjamin’s “angel of history” watches helplessly as the wreckage piles up—colonialism, genocide, climate collapse—all in the name of progress. Every step forward has a cost, but we keep marching, noses in the spreadsheet, ignoring the bodies behind us.


6. Universalism

The Ideal: One humanity, under Reason.
The Reality: Enlightenment values, brought to you by cannon fire and Christian missionaries.

Detractor: Gayatri Chakravorty Spivak

“White men are saving brown women from brown men.”
Can the Subaltern Speak? (1988)

Universalism was always a bit… French, wasn’t it? Spivak unmasks it as imperialism in drag—exporting “rights” and “freedom” to people who never asked for them, while ignoring the structural violence built into the Enlightenment’s own Enlightened societies.


7. Tolerance

The Ideal: Let a thousand opinions bloom.
The Reality: Tolerance, but only for those who don’t threaten the status quo.

Detractor: Karl Popper

“Unlimited tolerance must lead to the disappearance of tolerance.”
The Open Society and Its Enemies (1945)

Popper, bless him, thought tolerance needed a firewall. But in practice, “tolerance” has become a smug liberal virtue signalling its own superiority while deplatforming anyone who makes the dinner party uncomfortable. We tolerate all views—except the unseemly ones.


8. Scientific Method

The Ideal: Observe, hypothesise, repeat. Truth shall emerge.
The Reality: Publish or perish. Fund or flounder.

Detractor: Paul Feyerabend

“Science is not one thing, it is many things.”
Against Method (1975)

Feyerabend called the whole thing a farce. There is no single “method,” just a bureaucratic orthodoxy masquerading as objectivity. Today, science bends to industry, cherry-picks for grants, and buries null results in the backyard. Peer review? More like peer pressure.


9. Anti-Authoritarianism

The Ideal: Smash the throne! Burn the mitre!
The Reality: Bow to the data analytics team.

Detractor: Herbert Marcuse

“Free election of masters does not abolish the masters or the slaves.”
One-Dimensional Man (1964)

Marcuse skewered the liberal illusion of choice. We may vote, but we do so within a system that already wrote the script. Authority didn’t vanish; it just became procedural, faceless, algorithmic. Bureaucracy is the new monarchy—only with more forms.


10. Education and Encyclopaedism

The Ideal: All knowledge, accessible to all minds.
The Reality: Behind a paywall. Written in impenetrable prose. Moderated by white men with tenure.

Detractor: Ivan Illich

“School is the advertising agency which makes you believe that you need the society as it is.”
Deschooling Society (1971)

Illich pulls the curtain: education isn’t emancipatory; it’s indoctrinatory. The modern university produces not thinkers but credentialed employees. Encyclopaedias are replaced by Wikipedia, curated by anonymous pedants and revision wars. Truth is editable.


Postscript: Picking through the Rubble

So—has the Enlightenment failed?

Not exactly. It succeeded too literally. It was taken at its word. Its principles, once radical, were rendered banal. It’s not that reason, progress, or rights are inherently doomed—it’s that they were never as pure as advertised. They were always products of their time: male, white, bourgeois, and utterly convinced of their own benevolence.

If there’s a path forward, it’s not to restore Enlightenment values, but to interrogate them—mercilessly, with irony and eyes open.

After all, the problem was never darkness. It was the people with torches who thought they’d found the only path.

The Cult of Officer Safety: How SCOTUS Legalised Fear

In the great American theatre of liberty, there’s one character whose neuroses we all must cater to: the police officer. Not the civil servant. Not the trained professional. No, the trembling bundle of nerves with a badge and a gun. According to the United States Supreme Court, this anxious figure is so vulnerable that the Constitution itself must bend to accommodate his fear. I’m not sure I have less respect for these people than for most other professions.

Audio: NotebookLM podcast on this topic.

Let’s review.

In Pennsylvania v. Mimms (1977), the Court held that police can order a driver out of their vehicle during any lawful traffic stop—no suspicion, no cause, just vibes. Why? Because the officer might get nervous otherwise.

Fast-forward to Maryland v. Wilson (1997), and that same logic is extended to passengers. That’s right: even if you’re just catching a ride, you too can be ordered out and subject to scrutiny because, well, a cop might be spooked.

The rationale? “Officer safety.” A phrase so overused it may as well be stamped on every judge’s gavel and stitched into every uniform. Forget that you’re a citizen with rights; forget that the Fourth Amendment was intended to restrain arbitrary power. If your mere presence makes Officer Skittish feel a bit antsy, the law now permits him to act like he’s clearing a war zone.

It’s worth asking – gently, of course, so as not to alarm anyone in uniform – why exactly we entrust our most coercive state powers to individuals apparently one errant movement away from fight-or-flight mode.

Rather than raising the bar for police conduct, these rulings lower the bar for constitutional protections. Rather than requiring police to be calm, competent, and capable under pressure, the Court concedes that they’re none of those things and therefore need extra authority to compensate.

So here’s a radical suggestion: What if “officer safety” wasn’t a get-out-of-liberty-free card? What if we demanded emotional resilience and psychological stability before issuing guns and power? What if, instead of warping the law around the most paranoid members of the force, we removed them from the force?

But no. Instead, we get jurisprudence that treats every routine traffic stop like a potential ambush. And to ensure our jittery guardian gets home safe, you, dear citizen, will be the one legally disarmed.

So buckle up – because your rights don’t mean much when the man with the badge is afraid of his own shadow.