Anniversary of Sorts

WordPress has just informed me that my blog is having an anniversary. Technically true, though a little misleading: this blog has been around since 1 January 2017, but I’ve been loitering on the platform since 2006. Before that I dabbled in the great blog diaspora of the early internet—Google, Yahoo! 360, Blogger, and a few others that have long since evaporated into the ether.

Each space had its own flavour. One I recall from around 2010 was devoted to an experiment in World of Warcraft: levelling a pacifist character. The premise was simple—no violence allowed. My Human Priest, suitably named Passivefist, managed to crawl his way to level 7 before stalling out. The challenge was never to attack NPCs, only to survive by gathering, healing, or sneaking through hostile terrain.

This was my grand opening statement back then:

I am creating this account to track my progress as a pacifist in World of Warcraft. Others have done this before me and are, in fact, way ahead of me. Nonetheless, it is the challenge I am setting. I have created a Human Priest on Kael’thas named Passivefist.

Of course, in later expansions Blizzard eventually added pacifist-friendly content, making my small crusade somewhat redundant.

As for this blog, it’s taken a different path. I’ve recently crossed the 100,000-word milestone—101.4K, to be precise. Not that I’ve been counting obsessively, but it’s a nice marker, even if much of my writing also leaks into other projects: other blogs, manuscripts, and workaday scribbling.

The intent here remains the same as when I started in 2017: to keep a space for philosophic musings, digressions, and the occasional provocation. I’ll continue publishing when I have something worth saying—or at least something worth testing out in public.

Here’s to the next 100K.

Impostors, Competence, and the HR Hall of Mirrors

I was a professional musician in the 1980s. I played guitar, but this was always a sideline to my real work as a recording engineer and producer. Competence, not virtuosity, was the coin of the realm in the studio, and I was competent. Still, I spent much of my time surrounded by musicians who left me slack-jawed: people who could sight-read Bach at breakfast and bash out Van Halen riffs after lunch without missing a beat. Next to them, I was, charitably, merely competent.

That’s the thing about competence: it doesn’t make you the star, but it keeps the machine running. I knew I wasn’t the flash guitarist or the prodigy bassist, but I could play my parts cleanly and hold a band together. When the groups I later joined already had lead guitarists, I played bass. Was I a bassist? No. But I was competent enough to lock in with the drummer and serve the ensemble. Nobody mistook me for a virtuoso, least of all me. I wasn’t an impostor; I was a cog in the machine, good enough to keep the show on the road. That was my ego attachment: not “musician” as identity, but member of a band.

The Hallucination of “Impostor Syndrome”

Much ink is spilt on impostor syndrome, that anxious whisper that one is a fraud who doesn’t belong. The polite story is that it’s just nerves: you are competent, you do belong, you’re simply holding yourself against impossible standards. Nonsense. The truth is darker. Most people are impostors.

The nervous tension is not a malfunction of self-esteem; it’s a moment of clarity. A faint recognition that you’ve been miscast in a role you can’t quite play, but are forced to mime anyway. The Peter Principle doesn’t kick in at some distant managerial plateau; it’s the basic law of organisational gravity. People rise past their competence almost immediately, buoyed not by skill but by connections, bluff, and HR’s obsession with “fit.”

The Consultant’s View from the Cheap Seats

As a Management Consultant™, I met countless “leaders” whose only discernible talent was staying afloat whilst in over their heads. Organisations, too blind or too immature to notice, rewarded them with raises and promotions anyway. Somebody’s got to get them, after all. HR dutifully signed the paperwork, called it “talent management,” and congratulated itself on another triumph of culture-fit over competence.

In music, incompetence is self-correcting: audiences walk out, bands dissolve, the market punishes mediocrity. In corporate life, incompetence metastasises. Bluffers thrive. Mediocrity is embalmed, padded with stock options, and paraded on stage at leadership summits.

Good Enough vs. Bluff Enough

Competence, though, is underrated. You don’t need to be the best guitarist or the savviest CEO. You need to be good enough for the role you’re actually playing, and honest enough not to mistake the role for your identity. In bands, that worked fine. In business and politics, it’s subversive. The whole edifice depends on people pretending to be more than they are, rehearsing confidence in lieu of competence.

No wonder impostor syndrome is rampant. It’s not a pathology; it’s the ghost of truth in a system of lies.

The antidote isn’t TED-talk therapy or self-affirmation mantras. It’s honesty: admit the limits of your competence, stop mistaking ego for ability, and refuse to play HR’s charade. Competence is enough. The rest is noise.

Ages of Consent: A Heap of Nonsense

A response on another social media site got me thinking about the Sorites paradox again. The notion just bothers me. I’ve long held that it is less a paradox than an intellectually lazy way to manoeuvre around the insufficiency of language.

<rant>

The law loves a nice, clean number. Eighteen to vote. Sixteen to marry. This-or-that to consent. As if we all emerge from adolescence on the same morning like synchronised cicadas, suddenly equipped to choose leaders, pick spouses, and tell the bad lovers from the good ones.

But the Sorites paradox gives the game away: if you’re fit to vote at 18 years and 0 days, why not at 17 years, 364 days? Why not 17 years, 363 days? Eventually, you’re handing the ballot to a toddler who thinks the Prime Minister is Peppa Pig. Somewhere between there and adulthood, the legislator simply throws a dart and calls it “science.”
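For the formally inclined, the dart-throw compresses into a two-premise induction. A minimal sketch, my own formalisation with illustrative numbers rather than anything found in a statute: let $F(n)$ mean “a person aged $n$ days is fit to vote.”

\[
F(6570), \qquad \forall n\,\bigl[\,F(n) \rightarrow F(n-1)\,\bigr] \;\vdash\; F(1000)
\]

Premise one is the law (18 years is roughly 6,570 days, leap years ignored); premise two is the innocent-looking claim that a single day cannot make the difference. Accept both and the toddler’s ballot follows by induction. The only escape is to deny premise two at some particular $n$, and choosing that $n$ is precisely the dart-throw.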

To bolster this fiction, we’re offered pseudo-facts: “Women mature faster than men” or “Men’s brains don’t finish developing until thirty.” These claims, when taken seriously, only undermine the case for a single universal threshold. If “maturity” were truly the measure, we’d have to track neural plasticity curves, hormonal arcs, and a kaleidoscope of individual factors. Instead, the state settles for the cheapest approximation: a birthday.

This obsession with fixed thresholds is the bastard child of Enlightenment rationalism — the fantasy that human variation can be flattened into a single neat line on a chart. The eighteenth-century mind adored universals: universal reason, universal rights, universal man. In this worldview, there must be one age at which all are “ready,” just as there must be one unit of measure for a metre or a kilogram. It is tidy, legible, and above all, administratively convenient.

Cue the retorts:

  • “We need something.” True, but “something” doesn’t have to mean a cliff-edge number. We could design systems of phased rights, periodic evaluations, or contextual permissions — approaches that acknowledge people as more than interchangeable cut-outs from a brain-development chart.
  • “It would be too complicated.” Translation: “We prefer to be wrong in a simple way than right in a messy way.” Reality is messy. Pretending otherwise isn’t pragmatism; it’s intellectual cowardice. Law is supposed to contend with complexity, not avert its gaze from it.

And so we persist, reducing a continuous, irregular, and profoundly personal process to an administratively convenient fiction — then dressing it in a lab coat to feign objectivity. A number is just a number, and in this case, a particularly silly one.

</rant>

The Purpose of Purpose

I’m a nihilist. Possibly always have been. But let’s get one thing straight: nihilism is not despair. That’s a slander cooked up by the Meaning Merchants – the sentimentalists and functionalists who can’t get through breakfast without hallucinating some grand purpose to butter their toast. They fear the void, so they fill it. With God. With country. With yoga.

Audio: NotebookLM podcast on this topic.

Humans are obsessed with function. Seeing it. Creating it. Projecting it onto everything, like graffiti on the cosmos. Everything must mean something. Even nonsense gets rebranded as metaphor. Why do men have nipples? Why does a fork exist if you’re just going to eat soup? Doesn’t matter – it must do something. When we can’t find this function, we invent it.

But function isn’t discovered – it’s manufactured. A collaboration between our pattern-seeking brains and our desperate need for relevance, where function becomes fiction, where language and anthropomorphism go to copulate. A neat little fiction. An ontological fantasy. We ask, “What is the function of the human in this grand ballet of entropy and expansion?” Answer: there isn’t one. None. Nada. Cosmic indifference doesn’t write job descriptions.

And yet we prance around in lab coats and uniforms – doctors, arsonists, firemen, philosophers – playing roles in a drama no one is watching. We build professions and identities the way children host tea parties for dolls. Elaborate rituals of pretend, choreographed displays of purpose. Satisfying? Sometimes. Meaningful? Don’t kid yourself.

We’ve constructed these meaning-machines – society, culture, progress – not because they’re real, but because they help us forget that they’re not. It’s theatre. Absurdist, and often bad. But it gives us something to do between birth and decomposition.

Sisyphus had his rock. We have careers.

But let’s not confuse labour for meaning, or imagination for truth. The boulder never reaches the top, and that’s not failure. That’s the show.

So roll the stone. Build the company. Write the blog. Pour tea for Barbie. Just don’t lie to yourself about what it all means.

Because it doesn’t mean anything.

Unwilling: The Neuroscience Against Free Will

Why the cherished myth of human autonomy dissolves under the weight of our own biology

We cling to free will like a comfort blanket—the reassuring belief that our actions spring from deliberation, character, and autonomous choice. This narrative has powered everything from our justice systems to our sense of personal achievement. It feels good, even necessary, to believe we author our own stories.

But what if this cornerstone of human self-conception is merely a useful fiction? What if, with each advance in neuroscience, our cherished notion of autonomy becomes increasingly untenable?

Audio: NotebookLM podcast on this topic.

I. The Myth of Autonomy: A Beautiful Delusion

Free will requires that we—some essential, decision-making “self”—stand somehow separate from the causal chains of biology and physics. But where exactly would this magical pocket of causation exist? And what evidence do we have for it?

Your preferences, values, and impulses emerge from a complex interplay of factors you never chose:

The genetic lottery determined your baseline neurochemistry and cognitive architecture before your first breath. You didn’t select your dopamine sensitivity, your amygdala reactivity, or your executive function capacity.

The hormonal symphony that controls your emotional responses operates largely beneath conscious awareness. These chemical messengers—testosterone, oxytocin, and cortisol—don’t ask permission before altering your perceptions and priorities.

Environmental exposures—from lead in your childhood drinking water to the specific traumas of your upbringing—have sculpted neural pathways you didn’t design and can’t easily rewire.

Developmental contingencies have shaped your moral reasoning, impulse control, and capacity for empathy through processes invisible to conscious inspection.

Your prized ability to weigh options, inhibit impulses, and make “rational” choices depends entirely on specific brain structures—particularly the dorsolateral prefrontal cortex (DLPFC)—operating within a neurochemical environment you inherited rather than created.

You occupy this biological machinery; you do not transcend it. Yet, society holds you responsible for its outputs as if you stood separate from these deterministic processes.

II. The DLPFC: Puppet Master of Moral Choice

The dorsolateral prefrontal cortex serves as command central for what we proudly call executive function—our capacity to plan, inhibit, decide, and morally judge. We experience its operations as deliberation, as the weighing of options, as the essence of choice itself.

And yet this supposed seat of autonomy can be manipulated with disturbing ease.

When researchers apply transcranial magnetic stimulation to inhibit DLPFC function, test subjects make dramatically different moral judgments about identical scenarios. Under different stimulation protocols, the same person arrives at contradictory conclusions about right and wrong without any awareness of the external influence.

Similarly, transcranial direct current stimulation over the DLPFC alters moral reasoning, especially regarding personal moral dilemmas. The subject experiences these externally induced judgments as entirely their own, with no sense that their moral compass has been hijacked.

If our most cherished moral deliberations can be redirected through simple electromagnetic manipulation, what does this reveal about the nature of “choice”? If will can be so easily influenced, how free could it possibly be?

III. Hormonal Puppetmasters: The Will in Your Bloodstream

Your decision-making machinery doesn’t stop at neural architecture. Your hormonal profile actively shapes what you perceive as your autonomous choices.

Consider oxytocin, popularly known as the “love hormone.” Research demonstrates that elevated oxytocin levels enhance feelings of guilt and shame while reducing willingness to harm others. This isn’t a subtle effect—it’s a direct biological override of what you might otherwise “choose.”

Testosterone tells an equally compelling story. Administration of this hormone increases utilitarian moral judgments, particularly when such decisions involve aggression or social dominance. The subject doesn’t experience this as a foreign influence but as their own authentic reasoning.

These aren’t anomalies or edge cases. They represent the normal operation of the biological systems governing what we experience as choice. You aren’t choosing so much as regulating, responding, and rebalancing a biochemical economy you inherited rather than designed.

IV. The Accident of Will: Uncomfortable Conclusions

If the will can be manipulated through such straightforward biological interventions, was it ever truly “yours” to begin with?

Philosopher Galen Strawson’s causa sui argument becomes unavoidable here: To be morally responsible, one must be the cause of oneself, but no one creates their own neural and hormonal architecture. By extension, no one can be ultimately responsible for actions emerging from that architecture.
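Strawson’s regress can be put schematically. This is a paraphrase of his Basic Argument in my own shorthand, not his notation: write $R(x)$ for “one is ultimately responsible for $x$.”

\[
R(\text{action}) \rightarrow R(\text{self}_0) \rightarrow R(\text{self}_1) \rightarrow \cdots
\]

Each arrow says that responsibility for an output requires responsibility for the self that produced it, which in turn requires responsibility for the earlier self that shaped that one. The chain must bottom out in genes and circumstances nobody chose, so the first link never finds its footing, and ultimate responsibility never gets off the ground.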

What we dignify as “will” may be nothing more than a fortunate (or unfortunate) biochemical accident—the particular configuration of neurons and neurochemicals you happened to inherit and develop.

This lens forces unsettling questions:

  • How many behaviours we praise or condemn are merely phenotypic expressions masquerading as choices? How many acts of cruelty or compassion reflect neurochemistry rather than character?
  • How many punishments and rewards are we assigning not to autonomous agents, but to biological processes operating beyond conscious control?
  • And perhaps most disturbingly: If we could perfect the moral self through direct biological intervention—rewiring neural pathways or adjusting neurotransmitter levels to ensure “better” choices—should we?
  • Or would such manipulation, however well-intentioned, represent the final acknowledgement that what we’ve called free will was never free at all?

A Compatibilist Rebuttal? Not So Fast.

Some philosophers argue for compatibilism, the view that determinism and free will can coexist if we redefine free will as “uncoerced action aligned with one’s desires.” But this semantic shuffle doesn’t rescue moral responsibility.

If your desires themselves are products of biology and environment—if even your capacity to evaluate those desires depends on inherited neural architecture—then “acting according to your desires” just pushes the problem back a step. You’re still not the ultimate author of those desires or your response to them.

What’s Left?

Perhaps we need not a defence of free will but a new framework for understanding human behaviour—one that acknowledges our biological embeddedness while preserving meaningful concepts of agency and responsibility without magical thinking.

The evidence doesn’t suggest we are without agency; it suggests our agency operates within biological constraints we’re only beginning to understand. The question isn’t whether biology influences choice—it’s whether anything else does.

For now, the neuroscientific evidence points in one direction: The will exists, but its freedom is the illusion.

Bullshit Jobs

I’ve recently decided to take a sabbatical from what passes for economic literature these days — out of a sense of self-preservation, mainly — but before I hermetically sealed myself away, I made a quick detour through Jorge Luis Borges’ The Library of Babel (PDF). Naturally, I emerged none the wiser, blinking like some poor subterranean creature dragged into the daylight, only to tumble headlong into David Graeber’s Bullshit Jobs.

This particular tome had been languishing in my inventory since its release, exuding a faint but persistent odour of deferred obligation. Now, about a third of the way in, I can report that Graeber’s thesis — that the modern world is awash with soul-annihilatingly pointless work — does resonate. I find myself nodding along like one of those cheap plastic dashboard dogs. Yet, for all its righteous fury, it’s more filler than killer. Directionally correct? Probably. Substantively airtight? Not quite. It’s a bit like admiring a tent that’s pitched reasonably straight but has conspicuous holes large enough to drive a fleet of Uber Eats cyclists through.

An amusing aside: the Spanish edition is titled Trabajos de mierda (“shitty jobs”), a phrase Graeber spends an entire excruciating section of the book explaining is not the same thing. Meanwhile, the French, in their traditional Gallic shrug, simply kept the English title. (One suspects they couldn’t be arsed.)

Chapter One attempts to explain the delicate taxonomy: bullshit jobs are fundamentally unnecessary — spawned by some black magic of bureaucracy, ego, and capitalist entropy — whilst shit jobs are grim, thankless necessities that someone must do, but no one wishes to acknowledge. Tragically, some wretches get the worst of both worlds, occupying jobs that are both shit and bullshit — a sort of vocational purgatory for the damned.

Then, in Chapter Two, Graeber gleefully dissects bullshit jobs into five grotesque varieties:

  1. Flunkies, whose role is to make someone else feel important.
  2. Goons, who exist solely to fight other goons.
  3. Duct Tapers, who heroically patch problems that ought not to exist in the first place.
  4. Box Tickers, who generate paperwork to satisfy some Kafkaesque demand that nobody actually reads.
  5. Taskmasters, who either invent unnecessary work for others or spend their days supervising people who don’t need supervision.

Naturally, real-world roles often straddle several categories. Lucky them: multi-classed in the RPG of Existential Futility.

Graeber’s parade of professional despair is, admittedly, darkly entertaining. One senses he had a great deal of fun cataloguing these grotesques — like a medieval monk illustrating demons in the margins of a holy text — even as the entire edifice wobbles under the weight of its own repetition. Yes, David, we get it: the modern economy is a Potemkin village of invented necessity. Carry on.

If the first chapters are anything to go by, the rest of the book promises more righteous indignation, more anecdotes from anonymous sad-sacks labouring in existential oubliettes, and — if one is lucky — perhaps a glimmer of prescription hidden somewhere amidst the diagnosis. Though, I’m not holding my breath. This feels less like an intervention and more like a well-articulated primal scream.

Still, even in its baggier moments, Bullshit Jobs offers the grim pleasure of recognition. If you’ve ever sat through a meeting where the PowerPoint had more intellectual integrity than the speaker or spent days crafting reports destined for the corporate oubliette marked “For Review” (translation: Never to Be Seen Again), you will feel seen — in a distinctly accusatory, you-signed-up-for-this sort of way.

In short: it’s good to read Graeber if only to have one’s vague sense of societal derangement vindicated in print. Like having a charmingly irate friend in the pub lean over their pint and mutter, “It’s not just you. It’s the whole bloody system.”

I’m not sure I’ll stick with this title either. I think I’ve caught the brunt of the message, and it feels like a diversion. I’ve also got Yanis Varoufakis’ Technofeudalism: What Killed Capitalism on the shelf. Perhaps I’ll spin this one up instead.

Lies as Shibboleth

Watching Sam Harris ruminate on the nature of political lies (still believing, poor lamb, that reason might one day triumph) reminds me of something more sinister: lies today are not attempts at persuasion. They are shibboleths — tribal passwords, loyalty oaths, secret handshakes performed in the broad light of day.

Video: Sam Harris tells us why Trump and his ilk lie.

Forget “alternative facts.” That charming euphemism was merely a decoy, a jangling set of keys to distract the infantile media. The real game was always deeper: strategic distortion, the deliberate blurring of perception not to deceive the outsider, but to identify the insider.

Audio: NotebookLM podcast on this topic.

When Trump — or any other post-truth demagogue — proclaims that penguins are, in fact, highly trained alien operatives from the Andromeda galaxy, the objective is not persuasion. The point is to force a choice: will you, standing before this glistening absurdity, blink and retreat into reason, stammering something about ornithology? Or will you step forward, clasp the hand of madness, and mutter, ‘Yes, my liege, the penguins have been among us all along’?

Those who demur, those who scoff or gasp or say ‘You’re an idiot’, have failed the loyalty test. They have outed themselves as enemy combatants in the epistemic war. Truth, in this brave new world, is not a destination; it is an allegiance. To speak honestly is to wage rebellion.

Orwell, who tried very hard to warn us, understood this dynamic well: the real triumph of Big Brother was not merely to compel you to lie but to compel you to believe the lie. Koestler, another battered prophet of the age, charted how political movements sink into ritualistic unreason, demanding not conviction but performance. Swift, for his part, knew it was all hilarious if you tilted your head just right.

The bigger the lie, the better the shibboleth. Claim that two and two make five, and you catch out the weak-willed rationalists. Claim that penguins are extraterrestrials, and you find the truly devoted, the ones willing to build altars from ice and sacrifice to their feathery overlords.

It’s no accident that modern political theatre resembles a deranged initiation ritual. Each day brings a new absurdity, a fresh madness to affirm: ‘Men can become women by declaration alone!’ ‘Billionaires are victims of systemic oppression!’ ‘The penguins are amongst us, plotting!’ Each claim a little more grotesque than the last, each compliance a little more degrading, a little more irreversible.

And oh, how eagerly the initiates rush forward! Clap for the penguins, or be cast out into the howling wilderness! Better to bend the knee to absurdity than be marked as an unbeliever. Better to humiliate yourself publicly than to admit that the Emperor’s penguin suit is just a costume.

Meanwhile, the opposition — earnest, naive — keeps trying to argue, to rebut, to point out that penguins are terrestrial flightless birds. How quaint. How pathetic. They do not understand that the moment they say, “You’re an idiot,” they’ve broken the spell, declared themselves apostates, and rendered themselves politically irrelevant.

The shibboleth, once uttered, divides the world cleanly: the believers, who will say anything, do anything, believe anything, provided it marks them safe from exile; and the infidels, who cling stupidly to reality.

The future belongs, not to the true, but to the loyal. Not to the rational, but to the ritualistic. The more extravagant the lie, the greater the proof of your faith.

So raise a glass to the penguins, ye of faint heart, and prepare your soul for abasement. Or stand firm, if you dare, and be prepared to be eaten alive by those who traded reason for the rapture of belonging.

After all, in the land of the blind, the one-eyed man is not king. He’s a heretic.


Flat-Earth Politics in a Cubic World

Audio: NotebookLM podcast on this topic.

History of Intelligence

I’ve made my way a couple of chapters into A Brief History of Intelligence: Evolution, AI, and the Five Breakthroughs That Made Our Brains by Max Bennett. My son recommended it last month, assuring me it was a delicious cocktail of Sapiens, Behave, and Superintelligence—all books I’ve rated highly, courtesy of Harari, Sapolsky, and Bostrom, respectively. So far, it’s digestible without being patronising, requiring no extensive background in the field.

Audio: Podcast conversation on this topic.

But this post isn’t about the book. It’s about what all good books should do: make you think.

If you’ve followed my writing over the years, you’ll know that I have little patience for psychology, which I regard as the astrology to neuroscience’s astronomy. Reading Fisher’s Capitalist Realism has only reinforced this perspective.

Frankly, I should do away with psychology altogether. Much of it—and no, not just the vacuous self-help drivel clogging the internet and bookstore shelves—is pseudoscience. To its credit, it did function as a stepping stone to neuroscience, but that’s like crediting alchemy for modern chemistry.

Psychology’s greatest sin? Missing the forest for the trees—or, more precisely, ignoring the structural forces that shape the so-called individual. Western capitalism, ever eager to monetise everything, finds it far easier (and more profitable) to blame the individual rather than the system. It’s like the old joke about the man searching for his lost keys under the streetlamp, not because that’s where he dropped them, but because that’s where the light is. It’s just more convenient that way.

Enter psychology: the perfect tool for a society steeped in narcissism and instant gratification. Feeling anxious? Depressed? Alienated? Just take a pill! Never mind the material conditions of your existence—your stagnant wages, your crushing debt, your eroding sense of community. No, the problem is you, and conveniently, there’s a profitable solution waiting on the pharmacy shelf.

Sure, psychology has made some strides in attributing behaviours to neurotransmitters—dopamine, serotonin, norepinephrine, and the rest of the usual suspects. And sure, pharmaceuticals can sometimes treat symptoms effectively. But they are just that: symptoms. The root cause? Often stressors imposed by the very society we refuse to scrutinise. And guess what rarely makes the diagnostic checklist? The system itself.

We need to zoom out and see the whole damn forest. We need to ask the hard questions—run the classic five whys to get to the root of the problem. And spoiler alert: the answer isn’t some chemical imbalance in your head.

It’s us. Collectively. Systemically. Structurally.

But sure, keep searching under that streetlamp.

“Your Triggers Aren’t My Problem!”

…except, sometimes they are.

This came across my feed, the laminated wisdom of our times: Your triggers are your responsibility. It isn’t the world’s obligation to tiptoe around you. A phrase so crisp, so confident, it practically struts. You can imagine it on a mug, alongside slogans like Live, Laugh, Gaslight. These are the language games I love to hate.

Now, there’s a certain truth here. Life is hard, and people aren’t psychic. We can’t reasonably expect the world to read our mental weather reports—50% chance of anxiety, rising storms of existential dread. In an adult society, we are responsible for understanding our own emotional terrain, building the bridges and detours that allow us to navigate it. That’s called resilience, and it’s a good thing.

Audio: NotebookLM Podcast on this topic.

But (and it’s a big but) this maxim becomes far less admirable when you scratch at its glossy surface. What does triggers even mean here? Because trigger is a shape-shifter, one of what I term Schrödinger’s Weasels. For someone with PTSD, a trigger is not a metaphor; it’s a live wire. It’s a flashback to trauma, a visceral hijacking of the nervous system. That’s not just “feeling sensitive” or “taking offence”—it’s a different universe entirely.

Yet, the word has been kidnapped by the cultural peanut gallery, drained of precision and applied to everything from discomfort to mild irritation. Didn’t like that movie? Triggered. Uncomfortable hearing about your privilege? Triggered. This semantic dilution lets people dodge accountability. Now, when someone names harm—racism, misogyny, homophobia, you name it—the accused can throw up their hands and say, Well, that’s your problem, not mine.

And there’s the rub. The neat simplicity of Your triggers are your responsibility allows individuals to dress their cruelty as stoic rationality. It’s not their job, you see, to worry about your “feelings.” They’re just being honest. Real.

Except, honesty without compassion isn’t noble; it’s lazy. Cruelty without self-reflection isn’t courage; it’s cowardice. And rejecting someone’s very real pain because you’re too inconvenienced to care? Well, that’s not toughness—it’s emotional illiteracy.

Let’s be clear: the world shouldn’t have to tiptoe. But that doesn’t mean we’re free to stomp. If someone’s discomfort stems from bigotry, prejudice, or harm, then dismissing them as “too sensitive” is gaslighting, plain and simple. The right to swing your fist, as the old adage goes, ends at someone else’s nose. Likewise, the right to be “brutally honest” ends when your honesty is just brutality.

The truth is messy, as most truths are. Some triggers are absolutely our responsibility—old wounds, minor slights, bruised egos—and expecting the world to cushion us is neither reasonable nor fair. But if someone names harm that points to a broader problem? That’s not a trigger. That’s a mirror.

So yes, let’s all take responsibility for ourselves—our pain, our growth, our reactions. But let’s also remember that real strength is found in the space where resilience meets accountability. Life isn’t about tiptoeing or stomping; it’s about walking together, with enough care to watch where we step.