Impostors, Competence, and the HR Hall of Mirrors

I was a professional musician in the 1980s. I played guitar, but this was always a sideline to my real work as a recording engineer and producer. Competence, not virtuosity, was the coin of the realm in the studio, and I was competent. Still, I spent much of my time surrounded by musicians who left me slack-jawed: people who could sight-read Bach at breakfast and bash out Van Halen riffs after lunch without missing a beat. Next to them, I was, charitably, merely competent.

That’s the thing about competence: it doesn’t make you the star, but it keeps the machine running. I knew I wasn’t the flash guitarist or the prodigy bassist, but I could play my parts cleanly and hold a band together. When later groups already had lead guitarists, I played bass. Was I a bassist? No. But I was competent enough to lock in with the drummer and serve the ensemble. Nobody mistook me for a virtuoso, least of all me. I wasn’t an impostor; I was a cog in the machine, good enough to keep the show on the road. That was my ego attachment: not “musician” as identity, but member of a band.

The Hallucination of “Impostor Syndrome”

Much ink is spilt on impostor syndrome, that anxious whisper that one is a fraud who doesn’t belong. The polite story is that it’s just nerves: you are competent, you do belong, you’re simply holding yourself against impossible standards. Nonsense. The truth is darker. Most people are impostors.

The nervous tension is not a malfunction of self-esteem; it’s a moment of clarity. A faint recognition that you’ve been miscast in a role you can’t quite play, but are forced to mime anyway. The Peter Principle doesn’t kick in at some distant managerial plateau; it’s the basic law of organisational gravity. People rise past their competence almost immediately, buoyed not by skill but by connections, bluff, and HR’s obsession with “fit.”

The Consultant’s View from the Cheap Seats

As a Management Consultant™, I met countless “leaders” whose only discernible talent was staying afloat whilst already in over their heads. Organisations, too blind or too immature to notice, rewarded them with raises and promotions anyway. Somebody’s got to get them, after all. HR dutifully signed the paperwork, called it “talent management,” and congratulated itself on another triumph of culture-fit over competence.

In music, incompetence is self-correcting: audiences walk out, bands dissolve, the market punishes mediocrity. In corporate life, incompetence metastasises. Bluffers thrive. Mediocrity is embalmed, padded with stock options, and paraded on stage at leadership summits.

Good Enough vs. Bluff Enough

Competence, though, is underrated. You don’t need to be the best guitarist or the savviest CEO. You need to be good enough for the role you’re actually playing, and honest enough not to mistake the role for your identity. In bands, that worked fine. In business and politics, it’s subversive. The whole edifice depends on people pretending to be more than they are, rehearsing confidence in lieu of competence.

No wonder impostor syndrome is rampant. It’s not a pathology; it’s the ghost of truth in a system of lies.

The antidote isn’t TED-talk therapy or self-affirmation mantras. It’s honesty: admit the limits of your competence, stop mistaking ego for ability, and refuse to play HR’s charade. Competence is enough. The rest is noise.

HR’s Neoliberal Mirage: Human Resources Without the Humans

Let us disabuse ourselves of one of the workplace’s most cherished delusions: that Human Resources is there for the humans. HR is not your therapist, not your advocate, not your confessor. HR is an appendage of the organisation, and like all appendages, its nerve endings run straight back to the corporate brain. Its “concern” for your well-being is merely a prophylactic against lawsuits and productivity dips. The error is ours; we persist in mistaking the guard dog for a pet.

Audio: NotebookLM podcast on this topic.

Bal and Dóci’s 2018 paper in the European Journal of Work and Organizational Psychology (EJWOP) tears the mask off this charade. They demonstrate how neoliberal ideology has seeped, unseen, into both workplace practice and the very research that pretends to study it objectively. Through the lenses of political, social, and fantasmatic logics, they show that neoliberalism has convinced us of three dangerous fairy tales:

  • Instrumentality: people are not people but “resources,” as fungible as printer ink.
  • Individualism: you are not part of a collective but a lone entrepreneur of the self, shackled to your CV like a Victorian debtor.
  • Competition: you are locked in an endless cage fight with your colleagues, grinning through the blood as you “collaborate.”

These logics are then dressed up in fantasies to keep us compliant: the fantasy of freedom (“you’re free to negotiate your own zero-hours contract”), the fantasy of meritocracy (“you got that promotion because you’re brilliant, not because you went to the right school”), and the fantasy of progress (“growth is good, even if it kills you”).

Those of us with an interest in Behavioural Economics had naively hoped that the mythical homo economicus, that laughable caricature of a rational, utility-maximising automaton, would by now be filed under “anachronistic curiosities.” Yet in corporate domains, this zombie shuffles on, cosseted and cultivated by neoliberal ideology. Far from being discredited, homo economicus remains a protected species, as if the boardroom were some Jurassic Park of bad economics.

The brilliance and the horror of it is that even the academics meant to be studying work and organisations have been captured by the same ideology. Work and Organisational Psychology (WOP) too often frames employees as variables in a productivity equation, measuring “engagement” only in terms of its effect on shareholder value. The worker’s humanity is rendered invisible; the employee exists only insofar as they generate output.

So when HR offers you a mindfulness app or a “resilience workshop,” remember: these are not gifts but obligations. They are ways of making you responsible for surviving a system designed to grind you down. The neoliberal trick is to convince you that your suffering is your own fault, that if only you had been more proactive, more adaptable, more “employable,” you wouldn’t be so crushed beneath the wheel.

Bal and Dóci are right: the way forward is to re-politicise and re-humanise organisational studies, to see workers as humans rather than performance units. But until then, expect HR to keep smiling while sharpening its knives.

The Red Flag of Truth

Nothing says “I’ve stopped thinking” quite like someone waving the banner of Truth. The word itself, when capitalised and flapped about like a holy relic, isn’t a signal of wisdom but of closure. A red flag.

The short video by Jonny Thompson that inspired this post.

Those who claim to “speak the Truth” or “know the Truth” rarely mean they’ve stumbled upon a tentative insight awaiting refinement. No, what they mean is: I have grasped reality in its totality, and—surprise!—it looks exactly like my prejudices. It’s the epistemic equivalent of a toddler declaring ownership of the playground by drooling on the swings.

The Fetish of Objectivity

The conceit is that Truth is singular, objective, eternal, a monolithic obelisk towering over human folly. But history’s scrapyard is full of such obelisks, toppled and broken: phlogiston, bloodletting, Manifest Destiny, “the market will regulate itself.” Each was once trumpeted as capital-T Truth. Each is now embarrassing clutter for the dustbin.

Still, the zealots never learn. Every generation delivers its own batch of peddlers, flogging their version of Truth as if it were snake oil guaranteed to cure ignorance and impotence. (Side effects may include dogmatism, authoritarianism, and an inability to read the room.)

Why It’s a Red Flag

When someone says, “It’s just the truth”, what they mean is “I am not listening”, like the parent who argues, “because I said so.” Dialogue is dead; curiosity cremated. Truth, in their hands, is less a lantern than a cosh. It is wielded not to illuminate, but to bludgeon.

Ralph Waldo Emerson’s voice breaks in, urging us to trust ourselves and to think for ourselves. Nothing is more degrading than to borrow another’s convictions wholesale and parade them as universal law. Better to err in the wilderness of one’s own reason than to be shepherded safely into another man’s paddock of certainties.

A Better Alternative

Rather than fetishising Truth, perhaps we ought to cultivate its neglected cousins: curiosity, provisionality, and doubt. These won’t look as good on a placard, admittedly. Picture a mob waving banners emblazoned with Ambiguity! – not exactly the stuff of revolutions. But infinitely more honest, and infinitely more humane.

So when you see someone waving the flag of Truth, don’t salute. Recognise it for what it is: a warning sign. Proceed with suspicion, and for God’s sake, bring Emerson.

Don’t Tread on My Ass

Absolute liberty means absolute liberty, but what if the liberty you seek is death? The moment you carve out exceptions – speech you can’t say, choices you can’t make, exits you can’t take – you’ve left the realm of liberty and entered the gated community of permission.

Video: YouTube clip by Philosopher Muse.

And here’s the test most self-styled liberty lovers fail: you’re free to skydive without a parachute, but try ending your life peacefully and watch how quickly the freedom brigade calls in the moral SWAT team.

I’m not his usual audience; I’m already in the choir, but this eight-minute clip by Philosopher Muse is worth your time. It’s a lucid walk through the ethical terrain mapped by Sarah Perry in Every Cradle Is a Grave, and it’s one of the better distillations of antinatalist thought I’ve seen for the general public. Perry’s libertarian starting point is straightforward: if you truly own your life, you must also have the right to give it up.

He threads in the dark-glimmer insights of Emil Cioran’s poetic despair, Thomas Ligotti’s existential horror, David Benatar’s asymmetry, and Peter Wessel Zapffe’s tragic consciousness. Together they point to an uncomfortable truth: autonomy that stops short of death isn’t autonomy at all; it’s a petting zoo of freedoms where the gate is locked.

I’ve said this before, but it bears repeating. I once had a girlfriend who hated her life but was too afraid of hell to end it. She didn’t “pull through.” She overdosed by accident. Loophole closed, I suppose. That’s what happens when metaphysical prohibitions are allowed to run the operating system.

And here’s where I diverge from the purist libertarians. I don’t believe most people are competent enough to have the liberty they think they deserve. Not because they’re all dribbling idiots, but because they’ve been marinated for generations in a stew of indoctrination. For thousands of years, nobody talked about “liberty” or “freedom” as inalienable rights. Once the notion caught on, it was packaged and sold – complete with an asterisk, endless fine print, and a service desk that’s never open.

We tell ourselves we’re free, but only in the ways that don’t threaten the custodians. You can vote for whoever the party machine serves up, but you can’t opt out of the game. You can live any way you like, as long as it looks enough like everyone else’s life. You can risk death in countless state-approved ways, but the moment you calmly choose it, your autonomy gets revoked.

So yes, watch the video. Read Perry’s Every Cradle Is a Grave. Then ask yourself whether your liberty is liberty, or just a longer leash.

If liberty means anything, it means the right to live and the right to leave. The former without the latter is just life imprisonment with better marketing.

Ages of Consent: A Heap of Nonsense

A response on another social media site got me thinking about the Sorites paradox again. The notion just bothers me. I’ve long held that it is less a paradox than an intellectually lazy way of manoeuvring around the insufficiency of language.

<rant>

The law loves a nice, clean number. Eighteen to vote. Sixteen to marry. This-or-that to consent. As if we all emerge from adolescence on the same morning like synchronised cicadas, suddenly equipped to choose leaders, pick spouses, and tell the bad lovers from the good ones.

But the Sorites paradox gives the game away: if you’re fit to vote at 18 years and 0 days, why not at 17 years, 364 days? Why not 17 years, 363 days? Eventually, you’re handing the ballot to a toddler who thinks the Prime Minister is Peppa Pig. Somewhere between there and adulthood, the legislator simply throws a dart and calls it “science.”
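
For those who want the trap laid bare, here is the standard Sorites schema applied to the voting age – a textbook formalisation added for illustration, not a claim about any particular statute (assumes amsmath and amssymb):

```latex
% Sorites schema applied to the voting age (standard formalisation).
% Fit(a) abbreviates "a person of age a is fit to vote".
\begin{align*}
&(1)\quad \mathit{Fit}(18\,\mathrm{y},\,0\,\mathrm{d})
    && \text{the statutory threshold}\\
&(2)\quad \forall a:\ \mathit{Fit}(a) \rightarrow \mathit{Fit}(a - 1\,\mathrm{d})
    && \text{a single day cannot plausibly matter}\\
&(3)\quad \therefore\ \mathit{Fit}(a)\ \text{for every age } a
    && \text{by induction: the toddler votes}
\end{align*}
```

The only way out is to deny premise (2) at one particular day – which is precisely the arbitrary dart-throw the legislator performs and then calls “science.”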

To bolster this fiction, we’re offered pseudo-facts: “Women mature faster than men”, or “Men’s brains don’t finish developing until thirty.” These claims, when taken seriously, only undermine the case for a single universal threshold. If “maturity” were truly the measure, we’d have to track neural plasticity curves, hormonal arcs, and a kaleidoscope of individual factors. Instead, the state settles for the cheapest approximation: a birthday.

This obsession with fixed thresholds is the bastard child of Enlightenment rationalism — the fantasy that human variation can be flattened into a single neat line on a chart. The eighteenth-century mind adored universals: universal reason, universal rights, universal man. In this worldview, there must be one age at which all are “ready,” just as there must be one unit of measure for a metre or a kilogram. It is tidy, legible, and above all, administratively convenient.

Cue the retorts:

  • “We need something.” True, but “something” doesn’t have to mean a cliff-edge number. We could design systems of phased rights, periodic evaluations, or contextual permissions — approaches that acknowledge people as more than interchangeable cut-outs from a brain-development chart.
  • “It would be too complicated.” Translation: “We would rather be wrong in a simple way than right in a messy way.” Reality is messy. Pretending otherwise isn’t pragmatism; it’s intellectual cowardice. Law is supposed to contend with complexity, not avert its gaze from it.

And so we persist, reducing a continuous, irregular, and profoundly personal process to an administratively convenient fiction — then dressing it in a lab coat to feign objectivity. A number is just a number, and in this case, a particularly silly one.

</rant>

Democracy: The Idiot’s Opiate, The Sequel Nobody Asked For

Yesterday, I suggested democracy is a mediocre theatre production where the audience gets to choose which mediocre understudy performs. Some readers thought I was being harsh. I wasn’t.

A mate recently argued that humans will always be superior to AI because of emergence, the miraculous process by which complexity gives rise to intelligence, creativity, and emotion. Lovely sentiment. But here’s the rub: emergence is also how we got this political system, the one no one really controls anymore.

Just as the human body is mostly non-human microbes, our so-called participatory government is mostly non-participatory components: lobbyists, donors, bureaucrats, corporate media, careerists, opportunists, the ecosystem that is the actual organism. We built it, but it now has its own metabolism. And thanks to the law of large numbers, multiplied by the sheer number of political, economic, and social dimensions in play, even the human element is diluted into statistical irrelevance. At any rate, what remains of it has lost control – like the sorcerer’s apprentice.

People like to imagine they can “tame” this beast, the way a lucid dreamer thinks they can bend the dream to their will. But you’re still dreaming. The narrative still runs on the dream’s logic, not yours. The best you can do is nudge it: a policy tweak here, a symbolic vote there, before the system digests your effort and excretes more of itself.

This is why Deming’s line hits so hard: a bad system beats a good person every time. Even if you could somehow elect the Platonic ideal of leadership, the organism would absorb them, neutralise them, or spit them out. It’s not personal; it’s structural.

And yet we fear AI “taking over,” as if that would be a radical departure from the status quo. Newsflash: you’ve already been living under an autonomous system for generations. AI would just be a remodel of the control room, new paint, same prison.

So yes, emergence makes humans “special.” It also makes them the architects of their own inescapable political microbiome. Congratulations, you’ve evolved the ability to build a machine that can’t be turned off.

Democracy: Opiate of the Masses

Democracy is sold, propagandised, really, as the best system of governance we’ve ever devised, usually with the grudging qualifier “so far.” It’s the Coca-Cola of political systems: not particularly good for you, but so entrenched in the cultural bloodstream that to question it is tantamount to treason.

Audio: NotebookLM Podcast on this topic.

The trouble is this: democracy depends on an electorate that is both aware and capable. Most people are neither. Worse still, even if they could be aware, they wouldn’t be smart enough to make use of it. And even if they were smart enough, Arrow’s Impossibility Theorem strolls in, smirking, to remind us that the whole thing is mathematically doomed anyway.
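
For the record, here is the canonical statement of Arrow’s result – a textbook formulation added for reference, claiming nothing beyond the standard theorem:

```latex
% Arrow's Impossibility Theorem (1951), standard statement.
% L(A) denotes the set of strict rankings of the alternatives A.
\textbf{Theorem (Arrow).} If there are three or more alternatives, no social
welfare function $f:\mathcal{L}(A)^n \to \mathcal{L}(A)$, aggregating $n$
voters' rankings into one collective ranking, satisfies all of:
\begin{itemize}
  \item \emph{Unrestricted domain}: $f$ accepts every profile of rankings;
  \item \emph{Pareto}: if every voter ranks $x$ above $y$, so does society;
  \item \emph{Independence of irrelevant alternatives}: the social ordering
        of $x$ and $y$ depends only on how voters rank $x$ and $y$;
  \item \emph{Non-dictatorship}: there is no voter whose ranking always wins.
\end{itemize}
```

Whatever voting rule you fancy, it must abandon at least one of those conditions – usually the one its advocates assumed was non-negotiable.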

Even IQ, the number we’d reach for to measure capability, is a charade. It measures how well you navigate the peculiar obstacle course we’ve designed as “education,” not the whole terrain of human thought. It’s as culturally loaded as asking a fish to climb a tree, then declaring it dim-witted when it flops. We call it intelligence because it flatters those already rewarded by the system that designed the test. In the United States, the average IQ stands at 97 – hardly a figure that instils confidence in votes and outcomes.

The Enlightenment gents who pushed democracy weren’t exactly selfless visionaries. They already had power, and simply repackaged it as something everyone could share, much as the clergy promised eternal reward to peasants if they only kept their heads down. Democracy is merely religion with ballots instead of bibles: an opiate for the masses, sedating the population with the illusion of influence.

Worse still, it’s a system optimised for mediocrity. It rewards consensus, punishes brilliance, and ensures the average voter is, by definition, average. Living under it is like starring in Idiocracy, only without the comedic relief, just the grim recognition that you’re outnumbered, and the crowd is cheering the wrong thing.

The Enlightenment: A Postmortem

Or: How the Brightest Ideas in Europe Got Us into This Bloody Mess

Disclaimer: This output is entirely ChatGPT 4o’s, from a conversation on the failure and anachronism of Enlightenment promises. I’m trying to finish editing my next novel, so I can’t justify taking much more time to share what are ultimately my thoughts as expounded upon by generative AI. I may comment personally in future. Until then, this is what I have to share.

AI Haters, leave now or abandon all hope, ye who remain.


The Enlightenment promised us emancipation from superstition, authority, and ignorance. What we got instead was bureaucracy, colonialism, and TED Talks. We replaced divine right with data dashboards and called it progress. And like any good inheritance, the will was contested, and most of us ended up with bugger-all.

Below, I take each Enlightenment virtue, pair it with its contemporary vice, and offer a detractor who saw through the Enlightenment’s powder-wigged charade. Because if we’re going down with this ship, we might as well point out the dry rot in the hull.


1. Rationalism

The Ideal: Reason shall lead us out of darkness.
The Reality: Reason led us straight into the gas chambers—with bureaucratic precision.

Detractor: Max Horkheimer & Theodor Adorno

“Enlightenment is totalitarian.”
Dialectic of Enlightenment (1944)

Horkheimer and Adorno saw what reason looks like when it slips off its leash. Instrumental rationality, they warned, doesn’t ask why—it only asks how efficiently. The result? A world where extermination is scheduled, costs are optimised, and ethics are politely filed under “subjective.”


2. Empiricism

The Ideal: Observation and experience will uncover truth.
The Reality: If it can’t be measured, it can’t be real. (Love? Not statistically significant.)

Detractor: Michel Foucault

“Truth isn’t outside power… truth is a thing of this world.”
Power/Knowledge (1977)

Foucault dismantled the whole edifice. Knowledge isn’t neutral; it’s an instrument of power. Empiricism becomes just another way of disciplining the body—measuring skulls, classifying deviants, and diagnosing women with “hysteria” for having opinions.


3. Individualism

The Ideal: The sovereign subject, free and self-determining.
The Reality: The atomised consumer, trapped in a feedback loop of self-optimisation.

Detractor: Jean Baudrillard

“The individual is no longer an autonomous subject but a terminal of multiple networks.”
Simulacra and Simulation (1981)

You wanted autonomy? You got algorithms. Baudrillard reminds us that the modern “individual” is a brand in search of market validation. You are free to be whoever you want, provided it fits within platform guidelines and doesn’t disrupt ad revenue.


4. Secularism

The Ideal: Liberation from superstition.
The Reality: We swapped saints for STEMlords and called it even.

Detractor: Charles Taylor

“We are now living in a spiritual wasteland.”
A Secular Age (2007)

Taylor—perhaps the most polite Canadian apocalypse-whisperer—reminds us that secularism didn’t replace religion with reason; it replaced mystery with malaise. We’re no longer awed, just “motivated.” Everything is explainable, and yet somehow nothing means anything.


5. Progress

The Ideal: History is a forward march toward utopia.
The Reality: History is a meat grinder in a lab coat.

Detractor: Walter Benjamin

“The storm irresistibly propels him into the future to which his back is turned.”
Theses on the Philosophy of History (1940)

Benjamin’s “angel of history” watches helplessly as the wreckage piles up—colonialism, genocide, climate collapse—all in the name of progress. Every step forward has a cost, but we keep marching, noses in the spreadsheet, ignoring the bodies behind us.


6. Universalism

The Ideal: One humanity, under Reason.
The Reality: Enlightenment values, brought to you by cannon fire and Christian missionaries.

Detractor: Gayatri Chakravorty Spivak

“White men are saving brown women from brown men.”
Can the Subaltern Speak? (1988)

Universalism was always a bit… French, wasn’t it? Spivak unmasks it as imperialism in drag—exporting “rights” and “freedom” to people who never asked for them, while ignoring the structural violence built into the Enlightenment’s own Enlightened societies.


7. Tolerance

The Ideal: Let a thousand opinions bloom.
The Reality: Tolerance, but only for those who don’t threaten the status quo.

Detractor: Karl Popper

“Unlimited tolerance must lead to the disappearance of tolerance.”
The Open Society and Its Enemies (1945)

Popper, bless him, thought tolerance needed a firewall. But in practice, “tolerance” has become a smug liberal virtue signalling its own superiority while deplatforming anyone who makes the dinner party uncomfortable. We tolerate all views—except the unseemly ones.


8. Scientific Method

The Ideal: Observe, hypothesise, repeat. Truth shall emerge.
The Reality: Publish or perish. Fund or flounder.

Detractor: Paul Feyerabend

“Science is not one thing, it is many things.”
Against Method (1975)

Feyerabend called the whole thing a farce. There is no single “method,” just a bureaucratic orthodoxy masquerading as objectivity. Today, science bends to industry, cherry-picks for grants, and buries null results in the backyard. Peer review? More like peer pressure.


9. Anti-Authoritarianism

The Ideal: Smash the throne! Burn the mitre!
The Reality: Bow to the data analytics team.

Detractor: Herbert Marcuse

“Free election of masters does not abolish the masters or the slaves.”
One-Dimensional Man (1964)

Marcuse skewered the liberal illusion of choice. We may vote, but we do so within a system that already wrote the script. Authority didn’t vanish; it just became procedural, faceless, algorithmic. Bureaucracy is the new monarchy—only with more forms.


10. Education and Encyclopaedism

The Ideal: All knowledge, accessible to all minds.
The Reality: Behind a paywall. Written in impenetrable prose. Moderated by white men with tenure.

Detractor: Ivan Illich

“School is the advertising agency which makes you believe that you need the society as it is.”
Deschooling Society (1971)

Illich pulls the curtain: education isn’t emancipatory; it’s indoctrinatory. The modern university produces not thinkers but credentialed employees. Encyclopaedias are replaced by Wikipedia, curated by anonymous pedants and revision wars. Truth is editable.


Postscript: Picking through the Rubble

So—has the Enlightenment failed?

Not exactly. It succeeded too literally. It was taken at its word. Its principles, once radical, were rendered banal. It’s not that reason, progress, or rights are inherently doomed—it’s that they were never as pure as advertised. They were always products of their time: male, white, bourgeois, and utterly convinced of their own benevolence.

If there’s a path forward, it’s not to restore Enlightenment values, but to interrogate them—mercilessly, with irony and eyes open.

After all, the problem was never darkness. It was the people with torches who thought they’d found the only path.

From Thesaurus to Thoughtcrime: The Slippery Slope of Authorial Purity

I had planned to write about Beauvoir’s Second Sex, but this has been on my mind lately.

There’s a certain breed of aspiring author, let’s call them the Sacred Scribes, who bristle at the notion of using AI to help with their writing. Not because it’s unhelpful. Not because it produces rubbish. But because it’s impure.

Like some Victorian schoolmarm clutching her pearls at the sight of a split infinitive, they cry: “If you let the machine help you fix a clumsy sentence, what’s next? The whole novel? Your diary? Your soul?”

The panic is always the same: one small compromise and you’re tumbling down the greased chute of creative ruin. It starts with a synonym suggestion and ends with a ghostwritten autobiography titled My Journey to Authenticity, dictated by chatbot, of course.

But let’s pause and look at the logic here. Or rather, the lack thereof.

By this standard, you must also renounce the thesaurus. Shun the spellchecker. Burn your dictionary. Forbid yourself from reading any book you might accidentally learn from. Heaven forbid you read a well-constructed sentence and think, “I could try that.” That’s theft, isn’t it?

And while we’re at it, no editors. No beta readers. No workshopping. No taking notes. Certainly no research. If your brain didn’t birth it in a vacuum, it’s suspect. It’s borrowed. It’s… contaminated.

Let’s call this what it is: purity fetishism in prose form.

But here’s the twist: it’s not new. Plato, bless him, was already clutching his tunic about this twenty-four centuries ago. In Phaedrus, he warned that writing itself would be the death of memory, of real understanding. Words on the page were a crutch. Lazy. A hollow imitation of wisdom. True knowledge lived in the mind, passed orally, and refined through dialogue. Writing, he said, would make us forgetful, outsource our thinking.

Sound familiar?

Fast forward a few millennia, and we’re hearing the same song, remixed for the AI age:
“If you let ChatGPT restructure your second paragraph, you’re no longer the author.”
Nonsense. You were never the sole author. Not even close.

Everything you write is a palimpsest, your favourite genres echoing beneath the surface, your heroes whispering in your turns of phrase. You’re just remixing the residue. And there’s no shame in that. Unless, of course, you believe that distilling your top five comfort reads into a Frankenstein narrative somehow makes you an oracle of literary genius.

Here’s the rub: You’ve always been collaborating.

With your past. With your influences. With your tools. With language itself, which you did not invent and barely control. Whether the suggestion comes from a friend, an editor, a margin note, or an algorithm, what matters is the choice you make with it. That’s authorship. Let’s not play the slippery slope game.

The slippery slope argument collapses under its own weight. No one accuses you of cheating when you use a pencil sharpener. Or caffeine. Or take a walk to clear your head. But involve a silicon co-author, and suddenly you’re the Antichrist of Art?

Let’s not confuse integrity with insecurity. Let’s not confuse control with fear.

Use the tool. Ignore the purists. They’ve been wrong since Plato, and they’ll still be wrong when your great-grandchildren are dictating novels to a neural implant while bathing in synthetic dopamine.

The future of writing is always collaborative. The only question is whether you’ll join the conversation or sit in the corner, scribbling manifestos by candlelight, declaring war on electricity.

The Heuristic Self: On Persona, Identity, and Character

“Man is least himself when he talks in his own person. Give him a mask, and he will tell you the truth.”
— Oscar Wilde

Identity is an illusion—but a necessary one. It’s a shortcut. A heuristic, evolved not for truth but for coherence. We reduce ourselves and others to fixed traits to preserve continuity—psychological, social, narrative.

Audio: NotebookLM podcast on this topic. (Direct)

Audio: NotebookLM podcast on this topic. (Spotify)

In the latest post on RidleyPark.blog, we meet Sarah—a woman who survives by splintering. She has three names, three selves, three economies of interaction. Each persona—Sarah, Stacey, and Pink—fulfils a role. Each protects her in a system that punishes complexity.

Identity Is Compression

Cognitive science suggests that we don’t possess a self—we perform one. Our so-called identity is assembled post-hoc from memory, context, and social cues. It’s recursive. It’s inferred.

We are not indivisible atoms of identity. We are bundled routines, personae adapted to setting and audience.

From Performance to Survival

In Needle’s Edge, Sarah doesn’t use aliases to deceive. She uses them to survive contradictions:

  • Stacey is desirable, stable, and profitable—so long as she appears clean and composed.
  • Pink is a consumer, invisible, stripped of glamour but allowed access to the block.
  • Sarah is the residue, the name used by those who once knew her—or still believe they do.

Each persona comes with scripts, limitations, and permissions. Sarah isn’t being dishonest. She’s practising domain-specific identity. This is no different from how professionals code-switch at work, or how people self-edit on social media.

The Literary Echo

In character development, we often demand “depth,” by which we mean contradiction. We want to see a character laugh and break. Love and lie. But Sarah shows us that contradiction isn’t depth—it’s baseline reality. Any singular identity would be a narrative failure.

Characters like Sarah expose the poverty of reduction. They resist archetype. They remind us that fiction succeeds when it reflects the multiple, the shifting, the incompatible—which is to say, the real.

What Else Might We Say?

  • That authenticity is a myth: “Just be yourself” presumes you know which self to be.
  • That moral judgment often stems from a failure to see multiple selves in others.
  • That trauma survivors often fracture not because they’re broken, but because fracturing is adaptive.
  • That in a capitalist framework, the ability to fragment and role-play becomes a survival advantage.
  • That fiction is one of the few spaces where we can explore multiple selves without collapse.

The Missing Link

For a concrete, narrative reflection of these ideas, this post on RidleyPark.blog explores how one woman carries three selves to survive three worlds—and what it costs her.