Good Enough

As I approach my sixty-second year on earth, having almost expired in March, I’ve been a bit more reflective and introspective. One realisation is categorical: I’ve been told over the years that I am ‘good’ or that I ‘excel’ at such and such, but I always know someone better—even on a personal level, not just someone out in the world. We can all assume we won’t be the next Einstein or Picasso, but I am talking about something closer to home than that.

During my music career, I was constantly surrounded by people better than me. I spent most of my time on the other side of a mixing console, where I excelled. Even still, I knew people who were better for this or another reason. In this realm, I think of two stories. First, I had the pleasure and good fortune to work on a record with Mick Mars and Mötley Crüe in the mid-’80s. We had a chat about Ratt’s Warren DeMartini, and Mick told me he knew that Warren and a spate of seventeen-year-olds could play circles around him, but success in the music business is not exclusively based on talent. He appreciated his position.

In this vein, I remember an interview with Tom Morello of Rage Against the Machine. As he was building his chops, he came to realise that he was not going to be the next shredder or Eddie Van Halen, so he focused on creating his own voice, the one he’s famous for. I know plenty of barely competent musicians who make it, and I know some virtual virtuosos who don’t. But it involves aesthetics and a fickle public, so all bets are off anyway.

As I reflect on myself, I consider art and photography. Always someone better. When I consider maths or science, there’s always someone better. Guitar, piano? Same story.

Even in something as vague and multidimensional as business, I can always name someone better. I will grant that in some instances there literally is no one better at some level—just different—so I sought refuge and solace in those positions. Most of them involved herding cats, but I took what I could.

Looking back, I might have been better off ignoring that someone was better. There’s a spot for more than the best guitarist or singer or artist, or policeman for that matter. As a musician, I never thrived financially—that’s why I was an engineer—but I could have enjoyed more moments and taken more opportunities.

When I was 18, I was asked to join a country music band. I was a guitarist and they needed a bass player. I didn’t like country music, so I declined—part ego, part taste. Like I said, aesthetics.

As I got older and started playing gigs, I came to realise that just playing was its own reward. I even played in cover bands, playing songs that were either laughably bad or insultingly easy, but they were still fun. I’m not sure how that would have translated into playing country music exclusively, day after day, but I still think I might have enjoyed myself—at least until I didn’t. And the experience would still have been there.

I was a software developer from the nineties to the early aughts. I was competent, but not particularly great. As it turns out, I wasn’t even very interested in programming on someone else’s projects. It’s like being a commercial artist. No, thank you. It might pay the bills, but at what emotional cost?

I was a development manager for a while, and that was even worse, so I switched focus to business analysis and programme management, eventually transitioning to business strategy and management consulting. I enjoyed these more, but I still always knew someone better.

On one hand, whilst I notice the differences, it’s lucky that I don’t care very much. Not everyone can be a LeBron James or a Ronaldo, but even the leagues are not filled with this talent. I’m not suggesting that a ten-year-old compete at this level, but I am saying if you like it, do it. Temper this with the advice inscribed at the Oracle of Delphi: know thyself. But also remember that you might never be the best judge of yourself, so take this with a grain of salt. Sometimes, ‘good enough’ is good enough.

Illusions of Self: Evanescent Instants in Time

In the realm of existential contemplation, the notion of the ‘self’ is akin to a fleeting present moment. It flits into existence for a fraction of an attosecond, vanishing before our grasp. Much like the illusory present, the ‘self’ manifests briefly and then fades into the annals of the past, a mere concatenation of temporal slices.

When we traverse the corridors of time, we effortlessly speak of the ‘past,’ stringing together these slices into a continuous narrative. This amalgamation serves our language and thought processes, aiding idiomatic expression. Yet, it remains a construct, a fiction we collectively weave. It is akin to the frames of a movie, where the illusion of movement and coherence is crafted by arranging individual frames in rapid succession.

The ‘self’ follows a similar illusionary trajectory. It exists only inasmuch as we christen it, attributing a name to a fleeting instance of being. However, this existence is as fleeting and ephemeral as a mirage. We name it, we perceive it, but it dissolves like smoke upon closer inspection.

This existential musing reminds one of a fictional entity: the unicorn. We can name it, describe it, and even envision it, yet its tangible existence eludes us. The ‘self’ aligns itself with this enigmatic unicorn, an abstract concept woven into the fabric of human understanding.

In this dance of philosophical thought, published works echo similar sentiments. Renowned thinkers like Nietzsche, in his exploration of eternal recurrence, or Camus, delving into the absurdity of life, have grappled with the transient nature of the ‘self.’ Their writings form a canvas, painting the portrait of an existence that flits through time, leaving only traces of memory and illusion in its wake.

In conclusion, the ‘self’ is a fleeting enigma, a temporal wisp that vanishes as quickly as it appears. Like a raindrop in the river of time, it merges and dissipates, leaving behind an evanescent trace of what we conceive as ‘I’. The philosophical gaze peers through the mist, challenging the very essence of this ephemeral entity, inviting us to question the very fabric of our perceived reality.

Identity as Fiction: You Do Not Exist

Identity is a fiction; it doesn’t exist. It’s a contrivance, a makeshift construct, a label slapped on to an entity with some blurry amalgam of shared experiences. But this isn’t just street wisdom; some of history’s sharpest minds have said as much.

Think about Hume, who saw identity as nothing more than a bundle of perceptions, devoid of any central core. Or Nietzsche, who embraced the chaos and contradictions within us, rejecting any fixed notion of self.

Edmond Dantès chose to become the Count of Monte Cristo, but what choice do we have? We all have some control over our performative identities, a concept that Judith Butler would argue isn’t limited to gender but applies to the very essence of who we are.

But here’s the kicker, identities are a paradox. Just ask Michel Foucault, who’d say our sense of self is shaped not by who we are but by power, society, and external forces.

You think you know who you are? Well, Erik Erikson might say your identity’s still evolving, shifting through different stages of life. And what’s “normal” anyway? Try to define it, and you’ll end up chasing shadows, much like Derrida’s deconstruction of stable identities.

“He seemed like a nice man,” how many times have we heard that line after someone’s accused of a crime? It’s a mystery, but Thomas Metzinger might tell you that the self is just an illusion, a by-product of the brain.

Nations, they’re the same mess. Like Heraclitus’s ever-changing river, a nation is never the same thing twice. So what the hell is a nation, anyway? What are you defending as a nationalist? It’s a riddle that echoes through history, resonating with the philosophical challenges to identity itself.

If identity and nations are just made-up stories, what’s all the fuss about? Why do people get so worked up, even ready to die, for these fictions? Maybe it’s fear, maybe it’s pride, or maybe it’s because, as Kierkegaard warned, rationality itself can seem mad in a world gone astray.

In a world where everything’s shifting and nothing’s set in stone, these fictions offer some solid ground. But next time you’re ready to go to the mat for your identity or your nation, take a minute and ask yourself: what the hell am I really fighting for? What am I clinging to?

Small Town Sentiments

Servile compliance and vigilante justice are the core messages underlying Small Town. Comply, or else…

Watching the video, Try That in a Small Town by Jason Aldean, I was left pondering: Are there no convenience store robberies in small towns, or are petrol station and liquor store robberies exempt from scrutiny? Most mass school shootings happen in small towns. Am I missing something through the bravado?

I guess the works of the likes of Truman Capote and Flannery O’Connor are lost on this generation, and the message of the Borg has faded into history.

Don’t dare be different or speak your mind about anything meaningful. Sure, serve ham over turkey on Thanksgiving. Be a rebel, but don’t complain about low wages or political subjugation…unless it’s what the local consensus believes.

But tightly-knit small towns will make sure that justice prevails even if it’s the extra-judicial flavour.

This video divides the country as a whole for the sake of some small-town jingoism.

Oh, and don’t even think of burning that flag.

Whitewashing Spoken English

An AI startup is facing allegations of racism and discrimination after being accused of manipulating non-American accents to sound “more white.” The company uses speech recognition technology to change the user’s accent in near-real time. (Source)

Friction is an impediment to a perfect customer experience. Removing this friction is always welcome, but homogenisation by a dominant culture is a bit more sketchy. It’s laudable that someone aims to remove friction from communication. Raze that tower of Babel—or does it need constructing? I’m no biblical scholar. I’m all for fostering communication, but this control should be an option for the customer receiving the call, not the sender—press 1 if you don’t wish to hear a foreign accent.

When it comes down to it, translation services have the same challenge. Which accent comes out the other end? (I’ll guess it is similar to this one.)

And what American accent is being represented? The neutral accent of the flyover states, the Texas drawl, or the non-rhotic accent of Harvard Yard? I’m guessing it’s not California cool or urban Philadelphia or down on the bayou. Press 7 for Canadian English, eh?

It’s bad enough that US English, despite having a minority of speakers, is running roughshod over World English spelling and pronunciation, colonising the world via streaming services and infestation on the internet.

The BBC relaxed its RP requirements in 1989 for the purpose of regional cultural inclusiveness. Which direction do we want to go?

In the end, this is another example of businesses being more concerned with business than customers and the human experience.

As for me, I prefer an accent I don’t have to work so hard to discern. But at the same time, I’ve worked with many people whose first language is not English, and though it does take a bit more effort, it’s really not that difficult. Besides, I’ve heard native English speakers with regional accents and dialects that are just as taxing.

I sent a survey a month or so ago asking which regional accent people preferred. As it turned out—unsurprisingly—people preferred the English they are used to hearing. Continental Indians preferred continental English; Americans wanted neutral American English; Jamaicans preferred Jamaican English; and British speakers preferred modern RP. And so it goes.

What’s your take?

Anatomy of a Social Media Challenge

As a Social Justice Warrior, I tend to favour diversity and inclusion as a principle. As such, I follow some people who share this interest. In fact, most of these people expend much more energy toward this end than I do. The challenge I am about to convey is that some people don’t read beyond the subject line, and don’t even attempt to assess the underlying claim, let alone the issue at hand.

I recently engaged in a nonsensical interaction that I am sharing and dissecting. It started with this share, an image of the border outline of Nigeria with an overlay caption that reads: “Nigeria becomes the first country to ban white and British models in all advertising”.

I’d like to point out two items in particular. Firstly, the caption is fabricated. I’ll get to the source reference presently. Secondly, the re-poster aptly corrects the caption when he shared it: “Well, all foreign models, but HELL YEAH!”

Nigeria recently passed a law that essentially assesses a tariff or levy on advertising content using non-Nigerian talent. There is no mention of ‘white’ models, though British models would fall under this umbrella. This protectionist law stems from nationalism. I’d guess that ‘white’ people comprise less than one per cent of the Nigerian national population, but I could be wrong. This is well outside my area of expertise.

My response was to say “Down with Nationalism and the Promotion of Otherism.”
I may be misinterpreting myself, but it feels to me that this is denouncing racism and other forms of otherism.

Sabrina responds, “Why is not having white models in advertising a bad thing?” and “Isn’t the whole point of advertising [for] people to…see themselves…?”
In response, I should have pointed out that the initiative had nothing to do with skin colour. Instead, I responded to the second question: the point of advertising is to sell product. Full stop. If people see themselves with the product, then great. Clearly, this comprises a fraction of successful adverts. More common is to make a connection to what they aspire to. It’s not about making a social statement—unless, of course, that social statement will sell more product. If an ad with a white model will sell more product, a business would be derelict not to employ one; conversely, if white models result in lower sales, a business would be foolish not to switch to the more successful vector.

Sabrina really goes off the reservation with her reply, somehow conflating Nigeria with the African continent. Attention to detail is not her forte.

At this point, I feed into her laziness and send her a link to an Al-Jazeera article addressing the law.

She leaves with a parting shot, and I quote: “Have you ever thought about the harm you might cause by playing devil’s advocate and ‘creating an argument’?”

She’s off course and then attempts to diminish my point by calling it ‘playing devil’s advocate’ rather than admitting that she hadn’t even considered the rationale and possible ramifications. She didn’t even grasp the main point, so I suppose I should forgive her for not noticing secondary and edge cases.

At this point, Dr Perkins adds her voice. Her initial question is valid, and as I responded, the answer is “No”. The race card was introduced by some narrator who didn’t know what game he was broadcasting. But then she goes on to “applaud Nigeria for making a [decision] centering [on] Blackness”; suffice it to say, that was not what prompted the decision.

Notice, too, that other people “Liked” these comments, a testament to the bystanders’ principle of least effort.

I recognise that the original post anchored the conversation off the actual topic, but it was also very easy to track down the reference and note the content discrepancy. Granted, this takes time and effort, but so does responding on a thread and then escalating commitment to a non-cause. For one tilting at windmills to toss around accusations of playing devil’s advocate is not a good sign.

But wait, there’s more. I commented on this post on a second thread.

In this case, Dr Anderson suggests that this is just “a country celebrating its own citizens by recognizing their beauty and knowing they can move product just as good, and probably better than white women”, to which I responded that this is a testable hypothesis. It’s either true that on balance white models sell more product or black models do. Again, don’t miss the point that none of this is about white versus black models.

Somehow, LinkedIn can’t seem to keep their threads in order, but Ms Rice takes my hypothesis testing point as a support for racism before precipitating to full-on troll mode.

It scares me to see two academic doctors participating in this thread, neither displaying attention to detail nor even a fundamental pursuit of evidence.

This is why it is difficult to engage with social media. You have no idea what level a commenter is coming in on. And even when spoon-fed information, they refuse to alter their position. In fact, they tend to double down on their wrongness.
Moving on…

Book Review: Conspiracy Against the Human Race

The Conspiracy against the Human Race is a work of non-fiction by horror author Thomas Ligotti. There is an audio podcast version and a YouTube video version. Feel free to leave comments in the space below or on YouTube.

Transcript

In this segment, I’ll be reviewing a book by Thomas Ligotti, The Conspiracy Against the Human Race: A Contrivance of Horror.

I haven’t done any book reviews, but since I tend to read a lot of books, I figure why not share my take and see how it’s received? If you like these reviews, click the like button and I’ll consider creating more.

Let’s get started.

First, I’ll be providing a little background, and then I’ll summarise some of the content and main themes. I’ll close with my review and perspective.

The author is Thomas Ligotti. He is a published writer in the horror genre in the vein of Lovecraft’s atmospheric horror. I’ve not read any of his work and haven’t read much fiction in ages.

The Conspiracy Against the Human Race is Ligotti’s first work of non-fiction. The book was originally published in 2010. I read the 2018 paperback version published by Penguin Books.

The Conspiracy Against the Human Race falls into the category of ethics and moral philosophy, in a subcategory of pessimism. The main thesis of this book is that humans ought never to have been born. Following in the footsteps of anti-natalist David Benatar, who published Better Never to Have Been in 2006, Ligotti doubles down on Benatar’s position on the harm of coming into existence and argues that humans should just become extinct. Moreover, we should take out life in general.

In the book, Ligotti posits that consciousness was a blunder of nature and is the root of all suffering. He argues a position derived from the Buddhist notion of dukkha, which roughly translates as ‘life is suffering’. He establishes that most people are aware of this fact, but that we are nonetheless wired to be biased toward optimism through delusion and what a psychoanalyst might call repressed memories. Moreover, pessimists are a cohort not tolerated by a society that doesn’t want its delusions shattered.

Philosophically, Ligotti is a determinist. I’ve created content on this topic, but in a nutshell, determinism is the belief that all events are caused by antecedent events, leading to a chain of causes and effects stretching back to the beginning of time and bringing us to where we are now. If we were able to rewind time and restart the process, we would necessarily end up in the same place, and all future processes will unfold in a like manner.

Ligotti likes the metaphor of puppets, which he employs in two manners. Firstly, being the determinist he is, he reminds us that we are meat puppets with no free will. Our strings are controlled by something that is not us. This something ends up being Schopenhauer’s Will, reminding us that we can want what we will, but we can’t will what we will. This Will is the puppeteer. Secondly, puppets are soulless, lifeless homunculi that are employed in the horror genre to create unease by means of an uncanny association.

He cites the work and philosophy of Norwegian author Peter Zapffe, who also elucidates human existence as a tragedy. Humans are born with one and only one right—the right to die. And death is the only certainty. The knowledge of this causes unnecessary suffering.

Quoting Ligotti,

Stringently considered, then, our only natural birthright is a right to die. No other right has ever been allocated to anyone except as a fabrication, whether in modern times or days past. The divine right of kings may now be acknowledged as a fabrication, a falsified permit for prideful dementia and impulsive mayhem. The inalienable rights of certain people, on the other hand, seemingly remain current: somehow we believe they are not fabrications because hallowed documents declare they are real.

Ligotti reminds us that consciousness is a mystery. We don’t really know what it is or what causes it other than it exists and we seem to have it, to be cursed with it. He adopts Zapffe’s position that consciousness is also responsible for the false notion of the self.

As all life is, humans are the result of an evolutionary process. Consciousness was just the result of an evolutionary blunder. He cites Zapffe and conveys that “mutations must be considered blind. They work, are thrown forth, without any contact of interest with their environment.”

Whilst pessimists view consciousness as a curse, optimists such as Nicholas Humphrey think of it as a marvellous endowment.

He summarises the reason humans have it worse than the rest of nature:

For the rest of the earth’s organisms, existence is relatively uncomplicated. Their lives are about three things: survival, reproduction, death—and nothing else. But we know too much to content ourselves with surviving, reproducing, dying—and nothing else. We know we are alive and know we will die. We also know we will suffer during our lives before suffering—slowly or quickly—as we draw near to death. This is the knowledge we “enjoy” as the most intelligent organisms to gush from the womb of nature. And being so, we feel shortchanged if there is nothing else for us than to survive, reproduce, and die. We want there to be more to it than that, or to think there is. This is the tragedy: Consciousness has forced us into the paradoxical position of striving to be unself-conscious of what we are—hunks of spoiling flesh on disintegrating bones.

I’ll repeat that: Consciousness has forced us into the paradoxical position of striving to be unself-conscious.

He cites Zapffe’s four principal strategies for minimising our consciousness: isolation, anchoring, distraction, and sublimation.

  1. Isolation is compartmentalising the dire facts of being alive. So, he argues, a coping mechanism is to push our suffering out of sight, out of mind, shoved back into the unconscious so we don’t have to deal with it.
  2. Anchoring is a stabilisation strategy by adopting fictions as truth. We conspire to anchor our lives in metaphysical and institutional “verities”—God, Morality, Natural Law, Country, Family—that inebriate us with a sense of being official, authentic, and safe in our beds.
  3. Distraction falls into the realm of manufactured consent. People lose themselves in their television sets, their government’s foreign policy, their science projects, their careers, their place in society or the universe, et cetera. Anything not to think about the human condition.
  4. Sublimation. This reminds me of Camus’ take on the Absurd. Just accept it. Embrace it and incorporate it into your routine. Pour it into your art or music. Ligotti invokes Camus’ directive that we must imagine Sisyphus happy, but he dismisses the quip as folly.

Ligotti underscores his thesis by referencing the works of other authors from David Benatar to William James.

Interestingly, he suggests that people who experience depression are actually in touch with reality and that psychology intervenes to mask it again with the preferred veil of delusion and self-deception. Society can’t operate if people aren’t in tune with the masquerade. Citing David Livingstone Smith’s Why We Lie: The Evolution of Deception and the Unconscious Mind, Ligotti writes:

“Psychiatry even works on the assumption that the ‘healthy’ and viable is at one with the highest in personal terms. Depression, ‘fear of life,’ refusal of nourishment and so on are invariably taken as signs of a pathological state and treated thereafter.”

Ligotti returns to the constructed notion of the self and presents examples of how a lack of self is an effective horror trope, citing John Carpenter’s The Thing and Invasion of the Body Snatchers.

He spends a good amount of time on ego-death and the illusion of self, a topic I’ve covered previously. He mentions Thomas Metzinger and his writings in several places, including his Being No One, published in 2004, ostensibly reinforcing a rejection of naïve realism: things are not knowable as they really are in themselves, something every scientist and philosopher knows.

He delves into Buddhism as a gateway to near-death experiences, where people have dissociated their sense of self, illustrating the enlightenment by accident of U. G. Krishnamurti, who after some calamity “was no longer the person he once was, for now he was someone whose ego had been erased. In this state, he had all the self-awareness of a tree frog. To his good fortune, he had no problem with his new way of functioning. He did not need to accept it, since by his report he had lost all sense of having an ego that needed to accept or reject anything.” Krishnamurti had become a veritable zombie. He also cited the examples of Tem Horwitz, John Wren-Lewis, and Suzanne Segal, but I won’t elaborate here.

Russian author Leo Tolstoy, famous for War and Peace and Anna Karenina, was another pessimist. He noticed four coping approaches his associates had employed to deal with their mortality.

  1. Ignorance is the first. As the saying goes, ignorance is bliss. For whatever reason, these people are simply blind to the inevitability of their mortal lives. As Tolstoy said, these people just did not know or understand that “life is an evil and an absurdity”.
  2. Epicureanism comes next. The tactic here is to understand that we’re all in this together and no one gets out alive, so we might as well make the best of it and adopt a hedonistic lifestyle.
  3. Following Camus’ cue, or rather Camus following Tolstoy and Schopenhauer, he suggests the approach of strength and energy, by which he means the strength and energy to commit suicide.
  4. Finally, one can adopt the path of weakness. This is the category Tolstoy finds himself in, writing “People of this kind know that death is better than life, but not having the strength to act rationally—to end the deception quickly and kill themselves—they seem to wait for something.”

The last section of the book feels a bit orthogonal to the rest. I won’t bother with details, but essentially he provides the reader with examples of how horror works by exploring some passages, notably Radcliffe’s The Mysteries of Udolpho, Conrad’s Heart of Darkness, Poe’s The Fall of the House of Usher, and Lovecraft’s The Call of Cthulhu, and contrasting Shakespeare’s Macbeth and Hamlet.

This has been a summary of Thomas Ligotti’s The Conspiracy Against the Human Race. Here’s my take. But first, some background, as it might be important to understand where I am coming from.

I am a nihilist. I feel that life has no inherent meaning, but people employ existentialist strategies to create a semblance of meaning, much akin to Zapffe’s distraction theme or perhaps anchoring.

This said, I feel that, similar to anarchism, people don’t understand nihilism. Technically, it’s considered a pessimistic philosophy because people are acculturated to expect meaning, but I find it liberating. People feel that without some constraints of meaning, chaos will ensue as everyone adopts Tolstoy’s Epicureanism or falls into despair and suicide. What they don’t know is that they’ve already fabricated some narrative and have adopted one of Zapffe’s first three offerings: isolation, which is to say repression; anchoring on God or country; or distracting themselves with work, sports, politics, social media, or reading horror stories.

Because of my background, I identify with Ligotti’s position. I do feel the suffering and anguish that he mentions, and perhaps I am weak and rationalising, but I don’t feel that things are so bad. I may be more sympathetic to Benatar’s anti-natalism than to advocate for a mass extinction event, though I feel that humans are already heading down that path. Perhaps this could be psychoanalysed as collective guilt, but I won’t go there.

I recommend reading this. I knocked it out in a few hours, and you could shorten this by skipping the last section altogether. If you are on the fence, I’d suggest reading David Benatar’s Better Never to Have Been. Perhaps I’ll review that if there seems to be interest. If you’ve got the time, read both.

So there you have it. That’s my summary and review of Thomas Ligotti’s The Conspiracy Against the Human Race.

Before I end this, I’ll share a personal story about an ex-girlfriend of mine. Although she experienced some moments of happiness and joy, she saw life as a burden. Because she had been raised Catholic and embodied the teachings, she was afraid that committing suicide would relegate her to hell. In fact, on one occasion, she and her mum had been robbed at gunpoint, and her mum stepped between my girlfriend and the gun. They gave the gunmen what they wanted, so the situation came to an end.

My girlfriend laid into her mother that if she ever did something like that again and took a bullet that was her ticket out, she would never forgive her. As it turned out, my girlfriend died as collateral damage during the Covid debacle. She became ill, but because she was living with her elderly mum, she didn’t want to go to hospital and bring something back. One early morning, she was writhing in pain and her mum called the ambulance. She died later that morning in hospital, having waited too long.

For me, I saw the mercy in it all. She got her ticket out and didn’t have to face the hell eventuality. Not that I believe in any of that, but she was able to exit in peace. Were it not for the poison of religion, she could have exited sooner. She was not, in Tolstoy’s words, weak, so much as she had been a victim of indoctrination. I feel this indoctrination borders on child abuse, but I’ll spare you the elaboration.
So, what are your thoughts on this book? Is there a conspiracy against humanity? Are optimists ruining it for the pessimists? What do you think about anti-natalism or even extinction of all conscious beings or the extreme case of all life on earth? Is Ligotti on to something or just on something?

Share your thoughts in the comments below.

The Meaning of Life for Sisyphus

In pursuit of my travail intellectuel, I stumbled on a thought experiment proposed by Richard Taylor regarding an old crowd favourite, Sisyphus.

Of course, Albert Camus famously published his Myth of Sisyphus essay (PDF), portraying Sisyphus’s life as analogous to that of the workaday human, absurdly plodding through existence like rinse-and-repeat clockwork—same gig on a different day.

Given my perspective on human agency and the causa sui argument, I felt commenting on Taylor’s essay, The Meaning of Life (PDF), would be apt.

The story of Sisyphus finds the namesake character fated by the gods to push a stone up a hill each day, only for it to roll back down, for him to push it back up again, ad infinitum. Camus leaves us with the prompt, ‘One must imagine Sisyphus happy’. But must we?

As Taylor puts it,

Sisyphus, it will be remembered, betrayed divine secrets to mortals, and for this he was condemned by the gods to roll a stone to the top of a hill, the stone then immediately to roll back down, again to be pushed to the top by Sisyphus, to roll down once more, and so on again and again, forever. Now in this we have the picture of meaningless, pointless toil, of a meaningless existence that is absolutely never redeemed.

Taylor wants us to consider an amended Sisyphus. He writes,

Let us suppose that the gods, while condemning Sisyphus to the fate just described, at the same time, as an afterthought, waxed perversely merciful by implanting in him a strange and irrational impulse; namely, a compulsive impulse to roll stones.

This significantly alters the dynamic. In this scenario, Sisyphus is not toiling; rather, he is pursuing his passion—following his heart. This is the athlete, artist, politician, or mass murderer following their passion. In fact, one might say that he is being his authentic self. He has no control over his self or his desire to roll stones, but he is in his element.

Taylor’s ultimate point is that in either case, the life of Sisyphus is just as devoid of meaning. Ostensibly, nothing can provide meaning. The best one can do is to have the perception of meaning. He writes,

Sisyphus’ existence would have meaning if there were some point to his labors, if his efforts ever culminated in something that was not just an occasion for fresh labors of the same kind. But that is precisely the meaning it lacks.

Although we cannot control what is within, contentment and happiness derive from perception. As we might be reminded by the quip attributed to Schopenhauer,

We can do what we will,
but we can’t will what we will.

In the end, Taylor wants us to know that nothing out there can make us happy.

The meaning of life is from within us, it is not bestowed from without, and it far exceeds in both its beauty and permanence any heaven of which men have ever dreamed or yearned for.


I’ve subsequently read some critiques of Taylor’s position, but I don’t want to take the time to rejoin them. Suffice it to say that I find them to be weak and wanting.

Institutionalised

Jordan Peterson is decidedly not my cup of tea. I can tolerate Pinker and Haidt. I agree with much of what they have to say, but in this video, the dissonance finally dawns on me. Interestingly, I can tolerate Peterson within the scope of this discussion.

I don’t agree with much of what these three are saying, but it is refreshing to hear Peterson outside of a philosophical domain, a place where he has no place. And although I don’t agree with him here, it is on the basis of his argumentation rather than his abject ineptitude.

I disagree with this trio. This video reveals these three as Institutionalists. Peterson may be a political Conservative versus Pinker’s and Haidt’s enlightened Liberalism, but institutionalism is a common core value they defend with escalating commitment. Typically, we find these camps to be polar opposites, but here they share a common enemy: not necessarily anti-institutionalists or anarchists, but people who don’t understand venerable institutions and thereby risk tipping the apple cart or toppling the Jenga tower because they just don’t understand. Not like them. Besides institutionalism, the common thread is Paternalism. They may disagree on the specifics, but one thing holds: We know more than you, and this knowledge is embedded in the sacred institutions. If only the others understood.

In this video, we hear these three commiserate about the diversity and inclusion forces at universities today, and where this movement is off base.

Video: No-Self, Self, and Selves

First, this is an extension of sorts of a prior post on No-Self, Selves & Self, but I wanted to create a short video for my YouTube channel to establish somewhat of a foundation for my intended video on the causa sui argument. Related content can be found in one of the Theseus posts.

This video is under 8 minutes long and provides some touch points. I had considered making it longer and more comprehensive, but since it is more of a bridge to a video I feel is more interesting, I cut some corners. This leaves openings for more in-depth treatment down the road.

As has become a routine, I share the transcript here for convenience and SEO relevance.

Transcript

In this segment of free will scepticism, we’ll establish some perspectives on the notion of the self.
Most of us in the West are familiar with the notion of the self. What’s your self? It’s me. For the more pedantic crowd, it is I.

We’re inundated with everything from self-help to self-awareness to self-esteem to selfies and self-love. We’ve got self-portraits, self-image, and self-harm. We’ve got self-ish and self-less.
We’ve even got self-oriented psychological disorders like narcissism. Attending to the self is a billion-dollar industry.

And whilst psychology and pop-psychology seem to consider the self to be a nicely wrapped package fastened tightly with a bow, it’s a little more contentious within philosophy. But there are other perspectives that don’t include the self, from no-self to slices of discontiguous selves. Let’s shift gears and start from the notion of having no self, what Buddhism calls no-self.

No-Self

Buddhism is an Eastern discipline, so it does not have the same foundations as the West. According to this system of belief, the notion of a personal identity is delusional, so there is no self at all. Obsession with and clinging to this delusional self is a major cause of suffering.

the notion of a personal identity is delusional,
so there is no self at all.

In this view, all is one and indivisible, but self-deception leads us to believe we are individuals, each with a discrete self. In fact, the Buddhist notion of Enlightenment—as opposed to the Western notion of Enlightenment—is precisely this realisation that there is only one self, and this is the collective self. But, to be fair, except for the times when the self has yet to be developed—we’ll get to this in a bit—, this notion of no-self is aspirational in the sense of losing one’s self in order to reduce suffering.

The concept of selflessness exists in language, but this is more aimed at sublimating the self in favour of a greater collective good.

Self

The self is the central feature of many personality theories, from Sigmund Freud and Carl Jung to Rollo May and Abraham Maslow. From individuation to self-actualisation. The self is self-referenced as I and me. Historically, the self had been considered to be synonymous with some metaphysical soul. Nowadays, psychology has taken the reins on definitions.

One version of the self can be thought of as a single thread connecting beads of experience through time, time-slices of experience. We’ll come back to this. This sense of self extends backwards in time until now and contains aspirations projected forward in time as viewed from the perspective of now.

This sense of self extends backwards in time until now and contains aspirations projected forward in time as viewed from the perspective of now.

Whilst we use the terms ‘person’, ‘self’, and ‘individual’ somewhat synonymously, they each have different meanings: ‘individual’ is a biological term; ‘person’ is sociological or cultural; ‘self’ is psychological. Although the default position in the West is the adoption of the psychological notion, where each person has a self, there is also a philosophical notion. The perspective of a self is so ubiquitous, with people accepting it as obvious, that it feels like I shouldn’t even spend time producing content to fill this space. But for a sense of completeness, I shall.

Psychologist William James distinguished between the ‘I’ and ‘me’ sense of the self, but let’s not parse this and consider each a stand-in for the self as experienced by the self. In this view, the self is generally considered to be the aggregation of continuous phenomenological moments and how we interpret them into a sense of ‘identity’.

In the West, the notion of having a self is imposed by convention. To feel otherwise is considered to be a sign of mental illness. As much as I want to share Foucault’s perspective on how delineating mental illness operates to the benefit of power structures, let’s just consider this out of scope. The Diagnostic and Statistical Manual of Mental Disorders, DSM-5, notes that a key symptom of borderline personality disorder, BPD, is a ‘markedly and persistently unstable self-image or sense of self’. Become selfless at your own peril.

There are challenges with the notion of self even in psychology. In developmental psychology, the self—differentiating one’s self into an identity separate from the world—, is not acquired until about the age of 18 months. Lacan had suggested that this so-called mirror stage developed at around 5 months as part of ego formation, but further research disputes this.

Although I won’t go into detail, individualist cultures experience the self differently than collectivist cultures. The origin of the concept of the individualistic view of self can be traced to early Christianity. In American culture, Protestantism seems to be a primary driver of the individualistic view of self. Let’s continue.

Selves

Heraclitus quipped, ‘No man ever steps in the same river twice, for it’s not the same river and he’s not the same man’. This is a nod to the impermanence of the self. Instead, there are selves.

No man ever steps in the same river twice, for it’s not the same river and he’s not the same man

Heraclitus

Galen Strawson proposes that although he understands intellectually what others mean when they use the word self, he doesn’t share this experience emotionally. Unlike the phenomenological slices connected by a thread, he doesn’t feel he has a thread. Contra the prevailing sense of narrativity, he posits that he experiences life episodically, without continuity.

A typical view of the self is that one feels narratively connected to past slices—the 5-year-old self with the 20-year-old self and with the 50-year-old self, whether that 50-year-old self is in the past, present, or future. Even though we are not the same person, there is some felt affinity.

My View

As for me, I consider the self to be a constructed fiction that serves a heuristic function. I don’t feel as disconnected as it seems Strawson does, but I don’t feel very connected to my 7 or 8-year-old self. And I can’t even remember before that. I’m not even sure I’ve got one data point for each year between 8 and 12, and it doesn’t get much better until 18 or 20. From there, I may be able to cobble together some average of a dozen or so per year without prompting, but I don’t even feel like the same person. Many of my views and perspectives have changed as well.

I don’t even feel like the same person

I was in the military until I quit as a Conscientious Objector. During that time, I became aware of Buddhism, and I doubled down on my musical interests. I worked in the Entertainment industry until I became an undergrad student, transitioning to become a wage slave whilst also attending grad school until I graduated. I’ve had several career foci since then. With each change, I’ve had a different self with a different outlook.

Can I connect the dots? Sort of. But I can also create a thematic collage out of magazine clippings or create art with found objects. I can tell a disjointed story of how I transitioned from X to Y to Z. It may even contain some elements of truth. Given how memory operates, who can tell?

In any case, what about you? In the next segment, I’m going to be discussing why we may not have free will owing to a lack of agency based on a causa sui argument.

Do you feel like you have a self? Does your sense of self have any gaps or inconsistencies? Do you feel you don’t have a self at all?

Let me know in the comments below.