Embracing Your Anti-Self

Lessons from Keats on the Art of Self-Creation

I don’t believe in notions of ‘self’ or identity, but the idea makes for a nice thought experiment.

Imagine, just for a moment, that somewhere on this planet, there is someone who is your opposite in every conceivable way. They live as you do not. If you are kind, they are cruel. If you revel in the thrill of running through a rainstorm, they are the kind who sit comfortably by the fire, dreading the mere thought of a brisk step outdoors. If you drink to toast life’s joys, they abstain, unwilling to let a drop pass their lips. They are your anti-self—an inversion of who you are, lacking everything that you have and yet possessing everything that you do not.

As strange as it seems, this image is more than idle speculation. According to the Romantic poet John Keats, holding such an image of your anti-self is an essential part of the process of creating your own identity. The elusive art of true self-creation lies, paradoxically, in our capacity to hold in our minds those lives and feelings that are utterly different from our own. To truly grow, we must encounter the other—whether that other is someone we know or a shadowy, imagined version of who we could have been if only we’d chosen differently. This exercise is more than an intellectual indulgence; it is at the core of what Keats called ‘soul-making.’

Keats believed in the concept of the ‘chameleon poet’—the idea that writers, and indeed all human beings, must cultivate the ability to lose themselves in the perspectives of others. It is not enough to gaze upon the world through the singular lens of our own experience; to truly create, we must dissolve our egos and embrace a kaleidoscope of possibilities. A woman might explore the life of a soldier, writing deeply about a battle she’s never fought. A contented parent might dare to delve into the unimaginable grief of losing a child. Fiction writers, poets, artists—they all do this: they shed their own skin, assume another’s, and, in doing so, broaden the horizons of their own soul.

But Keats’ lesson here isn’t limited to the domain of poets and storytellers; it’s a practice that should extend to all of us. In what he evocatively called ‘the vale of soul-making,’ Keats posited that life offers each of us the raw materials to forge a soul, but we must engage imaginatively with all the lives we might lead, all the people we could be. We must dare to envision every possible road before us, not as a commitment but as an act of creation—enriching ourselves with the essence of each path before deciding which one we wish to tread.

And therein lies the heartbreak of it all. When we choose one possible life, we necessarily burn the others. In the very act of committing, we close other doors. We must set ablaze all our imagined lives just to make room for the one we decide to live. This thought is thrilling but also terrifying. Unlike a poet, who can glide into and out of fictional worlds, we must choose where we stand and stay there. We are not chameleons. We cannot flit endlessly between possibilities. We cannot write a library of books. We must write the one, and we must write it well.

Keats understood that the art of imagining one’s anti-self wasn’t about living vicariously forever in a land of could-have-beens. The exercise is in acknowledging these spectres of other lives, learning from them, and then committing, knowing full well what is lost in the process. Self-creation means being both the builder of one’s house and the one who tears down all the others, brick by potential brick. It means knowing who you could have been and yet, resolutely stepping into who you choose to be.

In a world obsessed with keeping every option open, Keats offers us the wisdom of finality. Burn off your possible lives and focus on writing the best version of the one that remains. Embrace the anti-self, learn from it, and commit once you have glimpsed all the possible worlds you might inhabit.

That is the paradoxical art of soul-making—of becoming whole while knowing you could have been anyone else. The beauty lies in the commitment, not in the drifting dream of endless potentiality. There is a deep satisfaction in choosing, in writing your own story, in saying, ‘This is who I am,’ even though you could have been another. And for that, we have John Keats to thank, the poet who understood that our anti-selves are not merely an idle game of imagination but the fuel for becoming fully human—the forge in which the soul is made.

Language Insufficiency, Rev 3

I’m edging ever closer to finishing my book on the Language Insufficiency Hypothesis. It’s now in its third pass—a mostly subtractive process of streamlining, consolidating, and hacking away at redundancies. The front matter, of course, demands just as much attention, starting with the Preface.

The opening anecdote—a true story that merely sounds apocryphal—dates back to 2018, which is evidence of just how long I’ve been chewing on this idea. It involves a divorce court judge, a dose of linguistic ambiguity, and my ongoing scepticism about the utility of language in complex, interpretative domains.

At the time, my ex-wife’s lawyer was petitioning the court to restrict me from spending any money outside our marriage. This included a demand for recompense for any funds already spent. I was asked, point-blank: Had I given another woman a gift?

Seeking clarity, I asked the judge to define ‘gift’. The judge was less than amused—a glare, a sneer, but no definition. Left to my own devices, I answered no, relying on my personal definition: something given with no expectation of return or favour. My reasoning, then as now, stemmed from a deep mistrust of altruism.

The court, however, didn’t share my taste for philosophical detours. The injunction came down: I was not to spend any money outside the marital arrangement. Straightforward? Hardly. At the time, I was also in a rock band and often brought meals for the group. Was buying Chipotle for the band now prohibited?

The judge’s response dripped with disdain. Of course, that wasn’t the intent, they said, but the language of the injunction was deliberately broad—ambiguous enough to cover whatever they deemed inappropriate. The phrase ‘don’t spend money on romantic interests’ would have sufficed, but clarity seemed to be a liability. Instead, the court opted for what I call the Justice Stewart Doctrine of Legal Ambiguity: I know it when I see it.

Unsurprisingly, the marriage ended. My ex-wife and I, however, remain close; our separation in 2018 was final, but our friendship persists. Discussing my book recently, I mentioned this story, and she told me something new: her lawyer had confided that the judge disliked me, finding me smug.

This little revelation cemented something I’d already suspected: power relations, in the Foucauldian sense, pervade even our most banal disputes. It’s why Foucault makes a cameo in the book alongside Nietzsche, Wittgenstein, Saussure, Derrida, Borges, and even Gödel.

This anecdote is just one straw on the poor camel’s back of my linguistic grievances, a life filled with moments where language’s insufficiency has revealed itself. And yet, I found few others voicing my position. Hence, a book.

I aim to self-publish in early 2025—get it off my chest and into the world. Maybe then I can stop wittering on about it. Or, more likely, I won’t.

The Insufficiency of Language in an Agile World

I wrote and published this article on LinkedIn. I even recycled the cover image. Although it is about the particular topic of Agile, it relates to the Language Insufficiency Hypothesis, so I felt it would be apt here as well. It demonstrates how to think about a concrete case of language insufficiency through that framework.

Agile in Name Only

For over two decades, I’ve been immersed in Agile and its myriad interpretations. One refrain has persisted throughout: Agile™ is “just about agility,” a term that anyone can define as they see fit. The ambiguity raises the question: What does it really mean?

On its face, this sounds inclusive, but it never passed my intuitive sniff test. I carried on, but as I reflected on my broader work concerning the insufficiency of language, this persistent fuzziness started to make sense. Agile’s conceptual murkiness can be understood through the lens of language and identity—particularly through in-group and out-group dynamics.

Otherness and the Myth of Universality

To those who truly understand agility, no elaborate definition is required. It’s instinctive, embedded in their DNA. They don’t need to label it; they simply are agile. Yet, for the out-group—the ones who aspire to the status without the substance—Agile™ becomes a muddy abstraction. Unable to grasp the core, they question its very existence, claiming, “Who really knows what Agile means?”

The answer is simple: Everyone but those asking this question.

The Agility Crisis

This disconnect creates a power shift. The in-group, small and focused, operates with quiet competence. Meanwhile, the out-group, larger and louder, hijacks the conversation. What follows is an inevitable dilution: “Agile is dead,” “Agile doesn’t work,” they declare. But these proclamations often reflect their own failures to execute or evolve, not flaws inherent to agility itself.

This pattern follows a familiar playbook: create a strawman—define Agile™ as something it’s not—then decry its inability to deliver. The result? Performative agility, a theatre of motion without progress, where the players confuse activity for achievement and rely on brittle, inextensible infrastructures.

Agile Beyond the Label

Ironically, the true practitioners of agility remain unbothered by these debates. They adapt, innovate, and thrive—with or without the label. Agile™ has become a victim of its own success, co-opted by those who misunderstand it, leading to a paradox: the louder the chorus claiming “Agile doesn’t work,” the more it underscores the gap between those who do agility and those who merely wear its name.

The lesson here is not just about Agile™ but about language itself. Words, when untethered from their essence, fail. They cease to communicate, becoming tools of obfuscation rather than clarity. In this, Agile™ mirrors a broader phenomenon: the insufficiency of language in the face of complexity and its misuse by those unwilling or unable to engage with its deeper truths.

Guns, Germs, and Steel

I am reading Jared Diamond’s Guns, Germs, and Steel: The Fates of Human Societies, the first and likely most famous of an informal trilogy. I thought I had already read it, but I suspect I only saw the PBS series. Having recently finished Josephine Quinn’s How the World Made the West, I wanted to revisit this perspective. The two books are written in different styles and from different perspectives, but they seem to be complementary.

Where Diamond focuses on environmental factors (an emphasis that draws frequent criticism), Quinn focuses on human agency.

Diamond takes a bird’s-eye view, looking for universal patterns and systemic explanations, whilst Quinn adopts a granular, specific approach, highlighting the fluidity and contingency of history.

Diamond deconstructs European dominance by attributing it to environmental luck, but his narrative risks sidelining the agency of colonised peoples. Quinn critiques the very idea of Western dominance, arguing that the concept of the West itself is a myth born of appropriation and exchange.

Rather than being wholly opposed, Diamond and Quinn’s approaches might be seen as complementary. Diamond provides the structural scaffolding – the environmental and geographic conditions that shape societies – whilst Quinn fills in the cultural and human dynamics that Diamond often glosses over. Together, they represent two sides of the historiographical coin: one focusing on systemic patterns, the other on the messiness of cultural particularities.

Quinn’s approach is more aligned with The Dawn of Everything: A New History of Humanity, co-authored by anthropologist David Graeber and archaeologist David Wengrow, if you can use that as a reference point.

The Narcissist’s Playbook

I’ve lived in Los Angeles a couple of times for a sum total of perhaps 15 years. The first time, I loved it. The next time, I was running on fumes. The first time, I was in my twenties – the second time in my forties. What a difference perspective and ageing make. In my twenties, I was a pretty-boy punk-ass who owned the club scene on the Strip. In my forties, I was a wage slave.

Audio: NotebookLM podcast on this topic.

This morning, I heard a country song on Insta with a line claiming ‘there are nines and dimes in all 50’, and it reminded me of a phrase we used when I lived in Los Angeles – ‘LA 7’. It’s built on the egotistical, sexist notion that if you were a 10, you’d have already moved to LA. If you still lived in, say, Iowa and were considered a 10, the exchange rate to LA would make you a 7.

Then, I thought about the LA-NYC rivalry and wrote this article with some help from ChatGPT.

How L.A. and NYC Became the Centres of the Universe (According to Them)

It is a truth universally acknowledged that Los Angeles and New York City—those bickering siblings of American exceptionalism—believe themselves to be the sun around which the rest of us drearily orbit. Each is utterly convinced of its centrality to the human experience, and neither can fathom that people outside their borders might actually exist without yearning to be them. This is the essence of the ‘Centre of the Universe Complex,’ a condition in which self-importance metastasises into a full-blown cultural identity.

Let us begin with Los Angeles, the influencer of cities. L.A. doesn’t merely think it’s the centre of the universe; it believes it’s the universe, replete with its own atmosphere of smog-filtered sunlight and an economy powered entirely by dreams, green juice, and Botox. For L.A., beauty isn’t just a priority—it’s a moral imperative. Hence the concept of the ‘L.A. 10,’ a stunningly arrogant bit of mathematics whereby physical attractiveness is recalculated based on proximity to the Pacific Coast Highway.

Here’s how it works: a ‘10’ in some picturesque-but-hopelessly-provincial state, say Nebraska, is automatically downgraded to a ‘7’ upon arrival in Los Angeles. Why? Because, according to L.A.’s warped arithmetic, if she were a real 10, she’d already be there, lounging by an infinity pool in Malibu and ignoring your DMs. This isn’t just vanity—it’s top-tier delusion. L.A. sees itself as a black hole of good looks, sucking the beautiful people from every corner of the earth while leaving the ‘merely pretty’ to languish in flyover country. The Midwest, then, isn’t so much a place as it is an agricultural waiting room for future Angelenos.

But don’t be fooled—New York City is no better. Where L.A. is obsessed with beauty, NYC worships hustle. The city doesn’t just believe it’s important; it believes it’s the only place on earth where anything important happens. While L.A. is out perfecting its tan, NYC is busy perfecting its reputation as the cultural and intellectual capital of the world—or, at least, its part of the world, which conveniently ends somewhere in Connecticut.

This mindset is best summed up by that sanctimonious mantra, ‘If you can make it here, you can make it anywhere.’ Translation: if you survive the daily humiliation of paying $4,000 a month for a shoebox apartment while dodging both rats and an existential crisis, you’ve unlocked the secret to life itself. New York isn’t about looking good; it’s about enduring bad conditions and then boasting about it as if suffering were an Olympic sport. In this worldview, the rest of the world is simply an unworthy understudy in NYC’s perpetual Broadway production.

And here’s the thing: neither city can resist taking cheap shots at the other. L.A. dismisses NYC as a grim, grey treadmill where fun goes to die, while NYC scoffs at L.A. as a vapid bubble of avocado toast and Instagram filters. It’s brains versus beauty, grit versus glamour, black turtlenecks versus Lululemon. And yet, in their relentless need to outshine one another, they reveal a shared truth: both are equally narcissistic.

This mutual self-obsession is as exhausting as it is entertaining. While L.A. and NYC bicker over who wears the crown, the rest of the world is quietly rolling its eyes and enjoying a life unencumbered by astronomical rent or the constant pressure to appear important. The people of Iowa, for example, couldn’t care less if they’re an ‘LA 7’ or if they’ve “made it” in New York. They’re too busy living comfortably, surrounded by affordable housing and neighbours who might actually help them move a sofa.

But let’s give credit where it’s due. For all their flaws, these two cities do keep the rest of us entertained. Their constant self-aggrandisement fuels the cultural zeitgeist: without L.A., we’d have no Kardashians; without NYC, no Broadway. Their rivalry is the stuff of legend, a never-ending soap opera in which both cities play the lead role.

So, let them have their delusions of grandeur. After all, the world needs a little drama—and nobody does it better than the cities that think they’re the centre of it.

Blinded by Bias: The Irony of Greed and Self-Perception

Greed is a vice we readily recognise in others but often overlook in ourselves. This selective perception was strikingly evident during a recent conversation I had with a man who was quick to condemn another’s greed while remaining oblivious to his own similar tendencies. I told him about the escalating greed of certain companies that profit handsomely from selling their own brands of printer ink and toner. I’ll spare you that history for now. This encounter underscores the powerful influence of fundamental attribution bias on our judgments and self-awareness.

Exploring Greed

Greed can be defined as an intense and selfish desire for something, especially wealth, power, or food. Psychologically, it is considered a natural human impulse that, when unchecked, can lead to unethical behaviour and strained relationships. Societally, greed is often condemned, yet it persists across cultures and histories.

We tend to label others as greedy when their actions negatively impact us or violate social norms. However, when we aggressively pursue our interests, we might frame it as ambition or resourcefulness. This dichotomy reveals a discrepancy in how we perceive greed in ourselves versus others.

Understanding Fundamental Attribution Bias

Fundamental attribution bias, or fundamental attribution error, is the tendency to attribute others’ actions to their character while attributing our own actions to external circumstances. This cognitive bias allows us to excuse our behaviour while holding others fully accountable for theirs.

For example, if someone cuts us off in traffic, we might think they’re reckless or inconsiderate. But if we cut someone off, we might justify it by claiming we were late or didn’t see them. This bias preserves our self-image but distorts our understanding of others.

The Conversation

Our conversation centred on an HP printer that had shown a ‘low ink – please replace’ message ever since the cartridge was first installed. I recounted the history of the ink and toner industry. HP had a monopoly on ink for their products, a situation that earned them substantial profit margins. Upstarts entered the marketplace, which started an escalating arms race. HP spent R&D dollars trying to defend their margins, with nil benefit to the consumers of their products; in fact, it kept costs artificially high. Competitors who wanted a slice of those fat margins found ways around these interventions. Eventually, HP installed chips on their toner cartridges. Unfortunately, the chips have a bug – or is it a feature? If you install a cartridge and then remove it, the printer assumes you’re up to something shady, so it spawns this false alert. Some people take the alert at face value and replace a perfectly good cartridge, so HP benefits twice.
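For the curious, here is a minimal sketch of the behaviour described above – hypothetical logic for illustration only, not HP’s actual firmware – showing how a chip that merely counts installations could raise a ‘low ink’ alert on a nearly full cartridge:

    # A minimal, hypothetical sketch of the cartridge-chip behaviour described
    # above. Names and logic are invented for illustration; this is not HP's
    # actual firmware.
    from dataclasses import dataclass

    @dataclass
    class CartridgeChip:
        ink_level: float        # fraction of ink remaining, 0.0 to 1.0
        install_count: int = 0  # how many times the cartridge has been seated

        def on_install(self) -> None:
            self.install_count += 1

        def status_message(self) -> str:
            if self.ink_level < 0.1:
                return "low ink - please replace"  # genuine warning
            if self.install_count > 1:
                # The quirk described above: a removed-and-reseated cartridge
                # is treated as suspect, so the alert fires regardless of how
                # much ink actually remains.
                return "low ink - please replace"
            return "ink OK"

    cartridge = CartridgeChip(ink_level=0.95)
    cartridge.on_install()             # first installation
    print(cartridge.status_message())  # ink OK
    cartridge.on_install()             # removed and reinstalled
    print(cartridge.status_message())  # low ink - please replace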

If this bloke had worked for HP and had been responsible for revenue acquisition and protection, he would have swooned over the opportunity. Have no doubt. Yet at arm’s length, he recognised it as a sleazy, unethical business practice.

This conversation revealed how easily we can fall into the trap of judging others without reflecting on our own behaviour. His indignation seemed justified to him, yet he remained unaware of how his actions mirrored those he criticised.

Biblical Reference and Moral Implications

This situation brings to mind the biblical passage from Matthew 7:3-5:

“Why do you look at the speck of sawdust in your brother’s eye and pay no attention to the plank in your own eye? … You hypocrite, first take the plank out of your own eye, and then you will see clearly to remove the speck from your brother’s eye.”

The verse poignantly captures the human tendency to overlook our flaws while magnifying those of others. It calls for introspection and humility, urging us to address our shortcomings before passing judgment.

The Asymmetry of Self-Perception

Several psychological factors contribute to this asymmetry:

  • Self-Serving Bias: We attribute our successes to internal factors and our failures to external ones.
  • Cognitive Dissonance: Conflicting beliefs about ourselves and our actions create discomfort, leading us to rationalise or ignore discrepancies.
  • Social Comparison: We often compare ourselves favourably against others to boost self-esteem.

This skewed self-perception can hinder personal growth and damage relationships, as it prevents honest self-assessment and accountability.

Overcoming the Bias

Awareness is the first step toward mitigating fundamental attribution bias. Here are some strategies:

  1. Mindful Reflection: Regularly assess your actions and motivations. Ask yourself if you’re holding others to a standard you’re not meeting. Riffing on ancient moral dictates, ask yourself whether this is how you would want to be treated; more formally, adopt Kant’s categorical imperative as a framework.
  2. Seek Feedback: Encourage honest input from trusted friends or colleagues about your behaviour.
  3. Empathy Development: Practice seeing situations from others’ perspectives to understand their actions more fully.
  4. Challenge Assumptions: Before making judgments, consider external factors that might influence someone’s behaviour.

By actively recognising and adjusting for our biases, we can develop more balanced perceptions of ourselves and others.

Conclusion

The irony of condemning in others what we excuse in ourselves is a common human pitfall rooted in fundamental attribution bias. The adage ‘Know thyself’ comes to mind here. We can overcome these biases by striving for self-awareness and empathy, leading to more authentic relationships and personal integrity.

Exploring Antinatalist Philosophies

A Comparative Analysis of Sarah Perry, Emil Cioran, and Contemporaries

In a world where procreation is often celebrated as a fundamental human aspiration, a group of philosophers challenges this deeply ingrained belief by questioning the ethical implications of bringing new life into existence. Antinatalism, the philosophical stance that posits procreation is morally problematic due to the inherent suffering embedded in life, invites us to reexamine our assumptions about birth, existence, and the value we assign to life itself.

Audio: Podcast related to the content on this page

Central to this discourse are thinkers like Sarah Perry, whose work “Every Cradle is a Grave: Rethinking the Ethics of Birth and Suicide” intertwines the ethics of procreation with the right to die, emphasizing personal autonomy and critiquing societal norms. Alongside Perry, philosophers such as Emil Cioran, David Benatar, Thomas Ligotti, and Peter Wessel Zapffe offer profound insights into the human condition, consciousness, and our existential burdens.

This article delves into the complex and often unsettling arguments presented by these philosophers, comparing and contrasting their perspectives on antinatalism. By exploring their works, we aim to shed light on the profound ethical considerations surrounding birth, suffering, and autonomy over one’s existence.

The Inherent Suffering of Existence

At the heart of antinatalist philosophy lies the recognition of life’s intrinsic suffering. This theme is a common thread among our featured philosophers, each articulating it through their unique lenses.

Sarah Perry argues that suffering is an unavoidable aspect of life, stemming from physical ailments, emotional pains, and existential anxieties. In “Every Cradle is a Grave,” she states:

“Existence is imposed without consent, bringing inevitable suffering.”

Perry emphasises that since every human will experience hardship, bringing a new person into the world exposes them to harm they did not choose.

Similarly, David Benatar, in his seminal work “Better Never to Have Been: The Harm of Coming into Existence,” presents the asymmetry argument. He posits that coming into existence is always a harm:

“Coming into existence is always a serious harm.”

Benatar reasons that while the absence of pain is good even if no one benefits from it, the absence of pleasure is not bad unless there is someone for whom this absence is a deprivation. Therefore, non-existence spares potential beings from suffering without depriving them of pleasures they would not miss.

Emil Cioran, a Romanian philosopher known for his profound pessimism, delves deep into the despair inherent in life. In “The Trouble with Being Born,” he reflects:

“Suffering is the substance of life and the root of personality.”

Cioran’s aphoristic musings suggest that life’s essence is intertwined with pain, and acknowledging this is crucial to understanding our existence.

Thomas Ligotti, blending horror and philosophy in “The Conspiracy Against the Human Race,” portrays consciousness as a cosmic error:

“Consciousness is a mistake of evolution.”

Ligotti argues that human awareness amplifies suffering, making us uniquely burdened by the knowledge of our mortality and the futility of our endeavours.

Peter Wessel Zapffe, in his essay “The Last Messiah,” examines how human consciousness leads to existential angst:

“Man is a biological paradox due to excessive consciousness.”

Zapffe contends that our heightened self-awareness results in an acute recognition of life’s absurdities, causing inevitable psychological suffering.



Ethics of Procreation

Building upon the acknowledgement of life’s inherent suffering, these philosophers explore the moral dimensions of bringing new life into the world.

Sarah Perry focuses on the issue of consent. She argues that since we cannot obtain consent from potential beings before birth, procreation imposes life—and its accompanying suffering—upon them without their agreement. She writes:

“Procreation perpetuates harm by introducing new sufferers.”

Perry challenges the societal norm that views having children as an unquestioned good, highlighting parents’ moral responsibility for the inevitable pain their children will face.

David Benatar’s asymmetry argument extends this ethical concern by suggesting that non-existence is preferable. He explains that while the absence of pain is inherently good, the absence of pleasure is not bad because no one is deprived of it. Therefore, bringing someone into existence who will undoubtedly experience suffering is a moral harm.

Emil Cioran questions the value of procreation given the futility and despair inherent in life. While not explicitly formulating an antinatalist argument, his reflections imply scepticism about the act of bringing new life into a suffering world.

Peter Wessel Zapffe proposes that refraining from procreation is a logical response to the human condition. By not having children, we can halt the perpetuation of existential suffering. He suggests that humanity’s self-awareness is a burden that should not be passed on to future generations.

The Right to Die and Autonomy over Existence

A distinctive aspect of Sarah Perry’s work is her advocacy for the right to die. She asserts that just as individuals did not consent to be born into suffering, they should have the autonomy to choose to end their lives. Perry critiques societal and legal barriers that prevent people from exercising this choice, arguing:

“Autonomy over one’s life includes the right to die.”

By decriminalizing and destigmatizing suicide, she believes society can respect individual sovereignty and potentially alleviate prolonged suffering.

Emil Cioran contemplates suicide not necessarily as an action to be taken but as a philosophical consideration. In “On the Heights of Despair,” he muses:

“It is not worth the bother of killing yourself, since you always kill yourself too late.”

Cioran views the option of ending one’s life as a paradox that underscores the absurdity of existence.

While Benatar, Ligotti, and Zapffe acknowledge the despair that can accompany life, they do not extensively advocate for the right to die. Their focus remains on the ethical implications of procreation and the existential burdens of consciousness.

Coping Mechanisms and Societal Norms

Peter Wessel Zapffe delves into how humans cope with the existential angst resulting from excessive consciousness. He identifies four defence mechanisms:

  1. Isolation: Repressing disturbing thoughts from consciousness.
  2. Anchoring: Creating or adopting values and ideals to provide meaning.
  3. Distraction: Engaging in activities to avoid self-reflection.
  4. Sublimation: Channeling despair into creative or intellectual pursuits.

According to Zapffe, these mechanisms help individuals avoid confronting life’s inherent meaninglessness.

Thomas Ligotti echoes this sentiment, suggesting that optimism is a psychological strategy to cope with the horror of existence. He writes:

“Optimism is a coping mechanism against the horror of existence.”

Sarah Perry and Emil Cioran also critique societal norms that discourage open discussions about suffering, death, and the choice not to procreate. They argue that societal pressures often silence individuals who question the value of existence, thereby perpetuating cycles of unexamined procreation and stigmatizing those who consider alternative perspectives.

Comparative Insights

While united in their acknowledgement of life’s inherent suffering, these philosophers approach antinatalism and existential pessimism through varied lenses.

  • Sarah Perry emphasises personal autonomy and societal critique, advocating for policy changes regarding birth and suicide.
  • Emil Cioran offers a deeply personal exploration of despair, using poetic language to express the futility he perceives in existence.
  • David Benatar provides a structured, logical argument against procreation, focusing on the ethical asymmetry between pain and pleasure.
  • Thomas Ligotti combines horror and philosophy to illustrate the bleakness of consciousness and its implications for human suffering.
  • Peter Wessel Zapffe analyzes the psychological mechanisms humans employ to avoid confronting existential angst.

Critiques and Counterarguments

Critics of antinatalism often point to an overemphasis on suffering, arguing that it neglects the joys, love, and meaningful experiences that life can offer. They contend that while suffering is a part of life, it is not the totality of existence.

In response, antinatalist philosophers acknowledge the presence of pleasure but question whether it justifies the inevitable suffering every person will face. Benatar argues that while positive experiences are good, they do not negate the moral harm of bringing someone into existence without their consent.

Regarding the right to die, opponents express concern over the potential neglect of mental health issues. They worry that normalizing suicide could prevent individuals from seeking help and support that might alleviate their suffering.

Sarah Perry addresses this by emphasizing the importance of autonomy and the need for compassionate support systems. She advocates for open discussions about suicide to better understand and assist those contemplating it rather than stigmatizing or criminalizing their considerations.

Societal and Cultural Implications

These philosophers’ works challenge pro-natalist biases ingrained in many cultures. By questioning the assumption that procreation is inherently positive, they open a dialogue about the ethical responsibilities associated with bringing new life into the world.

Sarah Perry critiques how society glorifies parenthood while marginalizing those who choose not to have children. She calls for reevaluating societal norms that pressure individuals into procreation without considering the ethical implications.

Similarly, Emil Cioran and Thomas Ligotti highlight how societal denial of life’s inherent suffering perpetuates illusions that hinder genuine understanding and acceptance of the human condition.

Conclusion

The exploration of antinatalist philosophy through the works of Sarah Perry, Emil Cioran, and their contemporaries presents profound ethical considerations about life, suffering, and personal autonomy. Their arguments compel us to reflect on the nature of existence and the responsibilities we bear in perpetuating life.

While one may not fully embrace antinatalist positions, engaging with these ideas challenges us to consider the complexities of the human condition. It encourages a deeper examination of our choices, the societal norms we accept, and how we confront or avoid the fundamental truths about existence.

Final Thoughts

These philosophers’ discussions are not merely abstract musings but have real-world implications for how we live our lives and make decisions about the future. Whether it’s rethinking the ethics of procreation, advocating for personal autonomy over life and death, or understanding the coping mechanisms we employ, their insights offer valuable perspectives.

By bringing these often-taboo topics into the open, we can foster a more compassionate and thoughtful society that respects individual choices and acknowledges the full spectrum of human experience.

Encouraging Dialogue

As we conclude this exploration, readers are invited to reflect on their own beliefs and experiences. Engaging in open, respectful discussions about these complex topics can lead to greater understanding and empathy.

What are your thoughts on the ethical considerations of procreation? How do you perceive the balance between life’s joys and its inherent suffering? Share your perspectives and join the conversation.


References and Further Reading

  • Perry, Sarah. Every Cradle is a Grave: Rethinking the Ethics of Birth and Suicide. Nine-Banded Books, 2014.
  • Benatar, David. Better Never to Have Been: The Harm of Coming into Existence. Oxford University Press, 2006.
  • Cioran, Emil. The Trouble with Being Born. Arcade Publishing, 1973.
  • Ligotti, Thomas. The Conspiracy Against the Human Race. Hippocampus Press, 2010.
  • Zapffe, Peter Wessel. “The Last Messiah” (1933). English translation published in Philosophy Now, 2004.

For more in-depth analyses and reviews, consider exploring the following blog posts:

  • Book Review: Better Never to Have Been (Link)
  • Book Review: The Conspiracy Against the Human Race (Link)
  • Reading ‘The Last Messiah’ by Peter Zapffe (Link)

Note to Readers

This ChatGPT o1-generated article aims to thoughtfully and respectfully present the philosophical positions on antinatalism and existential pessimism. The discussions about suffering, procreation, and the right to die are complex and sensitive. If you or someone you know is struggling with such thoughts, please seek support from mental health professionals or trusted individuals in your community.

Next Steps

Based on reader interest and engagement, future articles may delve deeper into individual philosophers’ works, explore thematic elements such as consciousness and suffering, or address counterarguments in more detail. Your feedback and participation are valuable in shaping these discussions.

Let us continue this journey of philosophical exploration together.

The Relativity of Morality: A Penguin’s Tale

I recently watched The Penguin on HBO Max, a series set in DC’s Batman universe. Ordinarily, I avoid television – especially the superhero genre – but this one intrigued me. Less spandex, more mob drama. An origin story with a dash of noir. I’ll spare you spoilers, but suffice it to say that it was an enjoyable detour, even for someone like me who prefers philosophy over fistfights.

This post isn’t a review, though. It’s a springboard into a larger idea: morality’s subjectivity – or, more precisely, its relativity.

Audio: Spotify podcast related to this topic.

Morality in a Vacuum

Morality, as I see it, is a social construct. You might carry a private moral compass, but without society, it’s about as useful as a clock on a desert island. A personal code of ethics might guide you in solitary moments, but breaking your own rules – eating that forbidden biscuit after vowing to abstain, for instance – doesn’t carry the weight of a true moral transgression. It’s more akin to reneging on a New Year’s resolution. Who’s harmed? Who’s holding you accountable? The answer is: no one but yourself, and even then, only if you care.

The Social Contract

Introduce a second person, and suddenly, morality gains traction. Agreements form – explicit or tacit – about how to behave. Multiply that to the level of a community or society, and morality becomes a kind of currency, exchanged and enforced by the group. Sometimes, these codes are elevated to laws. And, ironically, the act of adhering to a law – even one devoid of moral content – can itself become the moral thing to do. Not because the act is inherently right, but because it reinforces the structure society depends upon.

But morality is neither universal nor monolithic. It is as fractured and kaleidoscopic as the societies and subcultures that create it. Which brings us back to The Penguin.

Crime’s Moral Code

The Penguin thrives in a criminal underworld where the moral compass points in a different direction. In the dominant society’s eyes, crime is immoral. Robbery, murder, racketeering – all “bad,” all forbidden. But within the subculture of organised crime, a parallel morality exists. Honour among thieves, loyalty to the family, the unspoken rules of the game – these are their ethics, and they matter deeply to those who live by them.

When one criminal praises another – “You done good” – after a successful heist or a precise hit, it’s a moral judgement within their own framework. Outside that framework, society condemns the same actions as abhorrent. Yet even dominant societies carve out their own moral exceptions. Killing, for instance, is broadly considered immoral. Murder is outlawed. But capital punishment? That’s legal, and often deemed not only acceptable but righteous. Kant argued it was a moral imperative. Nietzsche, ever the cynic, saw this duality for what it was: a power dynamic cloaked in self-righteousness.

In The Penguin, we see this dichotomy laid bare. The underworld isn’t without morals; it simply operates on a different axis. And while the larger society might disdain it, the hypocrisy of their own shifting moral codes remains unexamined.

Final Thoughts on the Series

I’ll save other philosophical musings about The Penguin for another time – spoilers would be unavoidable, after all. But here’s a quick review: the series leans into drama, eschewing flashy gimmicks for a grittier, more grounded tone. The writing is generally strong, though there are moments of inconsistency – plot holes and contrivances that mar an otherwise immersive experience. Whether these flaws stem from the writers, director, or editor is anyone’s guess, but the effect is the same: they momentarily yank the viewer out of the world they’ve built.

Still, it’s a worthwhile watch, especially if you’re a fan of mob-style crime dramas. The final episode was, in my estimation, the best of the lot – a satisfying culmination that leaves the door ajar for philosophical ruminations like these.

Have you seen it? What are your thoughts – philosophical or otherwise? Drop a comment below. Let’s discuss.

Dukkha, the Path of Pain, and the Illusion of Freedom: Buddhism, Antinatalism, and the Lonely Road of Individuation

The First Noble Truth of Buddhism—the notion that life is suffering, or dukkha—is often misinterpreted as a bleak condemnation of existence. But perhaps there’s something deeper here, something challenging yet quietly liberating. Buddhism doesn’t merely suggest that life is marred by occasional suffering; rather, it proposes that suffering is woven into the very fabric of life itself. Far from relegating pain to an exception, dukkha posits that dissatisfaction, discomfort, and unfulfilled longing are the baseline conditions of existence.

This isn’t to say that life is an unending stream of torment; even in nature, suffering may seem the exception rather than the rule, often concealed by survival-driven instincts and primal ignorance. But we, as conscious beings, are haunted by awareness. Aware of our mortality, our desires, our inadequacies, and ultimately, of our impotence to escape this pervasive friction. And so, if suffering is indeed the constant, how do we respond? Buddhism, antinatalism, and Jungian psychology each offer their own, starkly different paths.

The Buddhist Response: Letting Go of the Illusion

In Buddhism, dukkha is a truth that urges us not to look away but to peer more closely into the nature of suffering itself. The Buddha, with his diagnosis, didn’t suggest we simply “cope” with suffering but rather transform our entire understanding of it. Suffering, he argued, is born from attachment—from clinging to transient things, ideas, people, and identities. We build our lives on desires and expectations, only to find ourselves caught in a cycle of wanting, attaining, and inevitably losing. It’s a form of existential whiplash, one that keeps us bound to dissatisfaction because we can’t accept the impermanence of what we seek.

The Buddhist approach is both radical and elusive: by dissolving attachment and breaking the cycle of clinging, we supposedly dissolve suffering itself. The destination of this path—Nirvana—is not a state of elation or contentment but a transcendence beyond the very conditions of suffering. In reaching Nirvana, one no longer relies on external or internal validation, and the violence of social judgment, cultural obligation, and personal ambition falls away. This may seem austere, yet it offers a powerful antidote to a world that equates happiness with accumulation and possession.

Antinatalism: Opting Out of Existence’s Violence

Where Buddhism seeks liberation within life, antinatalism takes an even more radical stance: why bring new beings into an existence steeped in suffering? For antinatalists, the suffering embedded in life renders procreation ethically questionable. By creating life, we induct a new being into dukkha, with all its attendant violences—society’s harsh judgments, culture’s rigid impositions, the bureaucratic machinery that governs our daily lives, and the inescapable tyranny of time. In essence, to give birth is to invite someone into the struggle of being.

This perspective holds that the most humane action may not be to mend the suffering we encounter, nor even to accept it as Buddhism advises, but to prevent it altogether. It sees the cycle of life and death not as a majestic dance but as a tragic spiral, in which each generation inherits suffering from the last, perpetuating violence, hardship, and dissatisfaction. Antinatalism, therefore, could be seen as the ultimate recognition of dukkha—an extreme empathy for potential beings and a refusal to impose the weight of existence upon them.

Jungian Individuation: The Lonely Path of Becoming

Jung’s concept of individuation offers yet another approach: to delve deeply into the self, to integrate all aspects of the psyche—the conscious and the unconscious—and to emerge as a fully realised individual. For Jung, suffering is not to be escaped but understood and incorporated. Individuation is a journey through one’s darkest shadows, a confrontation with the parts of oneself that society, culture, and even one’s own ego would rather ignore. It is, in a way, an anti-social act, as individuation requires the courage to step away from societal norms and embrace parts of oneself that might be seen as disturbing or unconventional.

But individuation is a lonely road. Unlike the Buddhist path, which seeks to transcend suffering, individuation requires one to face it head-on, risking rejection and alienation. Society’s judgment, a kind of violence in itself, awaits those who deviate from accepted roles. The individuated person may, in effect, be punished by the very structures that insist upon conformity. And yet, individuation holds the promise of a more authentic existence, a self that is not a mere amalgam of cultural expectations but a reflection of one’s truest nature.

The Delusions That Keep Us Tethered to Suffering

Yet, for all their starkness, these paths can seem like philosophical abstractions that don’t fully capture the reality of living within the constraints of society, culture, and self. Human beings are armed with powerful psychological mechanisms that obscure dukkha: self-delusion, cognitive dissonance, and hubris. We fabricate beliefs about happiness, purpose, and progress to protect ourselves from dukkha’s existential weight. We convince ourselves that fulfilment lies in achievements, relationships, or material success. Cognitive dissonance allows us to live, without being paralysed, in a world that we know, on some level, will disappoint us.

It’s worth noting that even those who acknowledge dukkha—who glimpse the violence of existence and the illusory nature of happiness—may still find themselves clinging to these mental defences. They are shields against despair, the comforting armours that allow us to navigate a world in which suffering is the baseline condition. This is why Buddhism, antinatalism, and individuation require such rigorous, often painful honesty: they each ask us to set down these shields, to face suffering not as a solvable problem but as an intrinsic truth. In this light, psychological defences are seen not as failures of awareness but as survival strategies, albeit strategies that limit us from ever fully confronting the nature of existence.

Finding Meaning Amidst the Violence of Being

To pursue any of these paths—Buddhist enlightenment, antinatalism, or Jungian individuation—one must be prepared to question everything society holds dear. They are radical responses to a radical insight: that suffering is not accidental but foundational. Each path offers a different form of liberation, whether through transcendence, abstention, or self-integration, but they all require a certain fearlessness, a willingness to look deeply into the uncomfortable truths about life and existence.

Buddhism calls us to renounce attachment and embrace impermanence, transcending suffering by reshaping the mind. Antinatalism challenges us to consider whether it is ethical to bring life into a world marked by dukkha, advocating non-existence as an escape from suffering. And individuation asks us to become fully ourselves, embracing the loneliness and alienation that come with resisting society’s violence against the individual.

Perhaps the most realistic approach is to accept that suffering exists, to choose the path that resonates with us, and to walk it with as much awareness as possible. Whether we seek to transcend suffering, avoid it, or integrate it, each path is a confrontation with the violence of being. And maybe, in that confrontation, we find a fleeting peace—not in the absence of suffering, but in the freedom to choose our response to it. Dukkha remains, but we may find ourselves less bound by it, able to move through the world with a deeper, quieter understanding.