When Suspension of Disbelief Escapes the Page

Welcome to the Age of Realism Fatigue

Once upon a time — which is how all good fairy tales begin — suspension of disbelief was a tidy little tool we used to indulge in dragons, space travel, talking animals, and the idea that people in rom-coms have apartments that match their personalities and incomes. It was a temporary transaction, a gentleman’s agreement, a pact signed between audience and creator with metaphorical ink: I know this is nonsense, but I’ll play along if you don’t insult my intelligence.


This idea, famously coined by Samuel Taylor Coleridge as the “willing suspension of disbelief,” was meant to give art its necessary air to breathe. Coleridge’s hope was that audiences would momentarily silence their rational faculties in favour of emotional truth. The dragons weren’t real, but the heartbreak was. The ghosts were fabrications, but the guilt was palpable.

But that was then. Before the world itself began auditioning for the role of absurdist theatre. Before reality TV became neither reality nor television. Before politicians quoted memes, tech CEOs roleplayed as gods, and conspiracy theorists became bestsellers on Amazon. These days, suspension of disbelief is no longer a leisure activity — it’s a survival strategy.

The Fictional Contract: Broken but Not Forgotten

Traditionally, suspension of disbelief was deployed like a visitor’s badge. You wore it when entering the imagined world and returned it at the door on your way out. Fiction, fantasy, speculative fiction — they all relied on that badge. You accepted the implausible if it served the probable. Gandalf could fall into shadow and return whiter than before because he was, after all, a wizard. We were fine with warp speed as long as the emotional logic of Spock’s sacrifice made sense. There were rules — even in rule-breaking.

The genres varied. Hard sci-fi asked you to believe in quantum wormholes but not in lazy plotting. Magical realism got away with absurdities wrapped in metaphor. Superhero films? Well, their disbelief threshold collapsed somewhere between the multiverse and the Bat-credit card.

Still, we always knew we were pretending. We had a tether to the real, even when we floated in the surreal.

But Then Real Life Said, “Hold My Beer.”

At some point — let’s call it the twenty-first century — the need to suspend disbelief seeped off the screen and into the bloodstream of everyday life. News cycles became indistinguishable from satire (except that satire still had editors). Headlines read like rejected Black Mirror scripts. A reality TV star became president, and nobody even blinked. Billionaires declared plans to colonise Mars whilst democracy quietly lost its pulse.

We began to live inside a fiction that demanded that our disbelief be suspended daily. Except now, it wasn’t voluntary. It was mandatory. If you wanted to participate in public life — or just maintain your sanity — you had to turn off some corner of your rational mind.

You had to believe, or pretend to, that the same people calling for “freedom” were banning books. That artificial intelligence would definitely save us, just as soon as it was done replacing us. That social media was both the great democratiser and the sewer mainline of civilisation.

The boundary between fiction and reality? Eroded. Fact-checking? Optional. Satire? Redundant. We’re all characters now, improvising in a genreless world that refuses to pick a lane.

Cognitive Gymnastics: Welcome to the Cirque du Surréalisme

What happens to a psyche caught in this funhouse? Nothing good.

Our brains, bless them, were designed for some contradiction — religion’s been pulling that trick for millennia — but the constant toggling between belief and disbelief, trust and cynicism, is another matter. We’re gaslit by the world itself. Each day, a parade of facts and fabrications marches past, and we’re told to clap for both.

Cognitive dissonance becomes the default. We scroll through doom and memes in the same breath. We read a fact, then three rebuttals, then a conspiracy theory, then a joke about the conspiracy, then a counter-conspiracy about why the joke is state-sponsored. Rinse. Repeat. Sleep if you can.

The result? Mental fatigue. Not just garden-variety exhaustion, but a creeping sense that nothing means anything unless it’s viral. Critical thinking atrophies not because we lack the will but because the floodwaters never recede. You cannot analyse the firehose. You can only drink — or drown.

Culture in Crisis: A Symptom or the Disease?

This isn’t just a media problem. It’s cultural, epistemological, and possibly even metaphysical.

We’ve become simultaneously more skeptical — distrusting institutions, doubting authorities — and more gullible, accepting the wildly implausible so long as it’s entertaining. It’s the postmodern paradox in fast-forward: we know everything is a construct, but we still can’t look away. The magician shows us the trick, and we cheer harder.

In a world where everything is performance, authenticity becomes the ultimate fiction. And with that, the line between narrative and news, between aesthetic and actuality, collapses.

So what kind of society does this create?

One where engagement replaces understanding. Where identity is a curated feed. Where politics is cosplay, religion is algorithm, and truth is whatever gets the most shares. We aren’t suspending disbelief anymore. We’re embalming it.

The Future: A Choose-Your-Own-Delusion Adventure

So where does this all end?

There’s a dark path, of course: total epistemic breakdown. Truth becomes just another fandom and reality a subscription model. But there’s another route — one with a sliver of hope — where we become literate in illusion.

We can learn to hold disbelief like a scalpel, not a blindfold. To engage the implausible with curiosity, not capitulation. To distinguish between narratives that serve power and those that serve understanding.

It will require a new kind of literacy. One part media scepticism, one part philosophical rigour, and one part good old-fashioned bullshit detection. We’ll have to train ourselves not just to ask “Is this true?” but “Who benefits if I believe it?”

That doesn’t mean closing our minds. It means opening them with caution. Curiosity without credulity. Wonder without worship. A willingness to imagine the impossible whilst keeping a firm grip on the probable.

In Conclusion, Reality Is Optional, But Reason Is Not

In the age of AI, deepfakes, alt-facts, and hyperreality, we don’t need less imagination. We need more discernment. The world may demand our suspension of disbelief, but we must demand our belief back. In truth, in sense, in each other.

Because if everything becomes fiction, then fiction itself loses its magic. And we, the audience, are left applauding an empty stage.

Lights down. Curtain call.
Time to read the footnotes.

Blinded by Bias: The Irony of Greed and Self-Perception

Greed is a vice we readily recognise in others but often overlook in ourselves. This selective perception was strikingly evident during a recent conversation I had with a man who was quick to condemn another’s greed while remaining oblivious to his own similar tendencies. I told him about the escalating greed of certain companies that profit handsomely from selling printer ink and toner. I’ll return to that history below. This encounter underscores the powerful influence of fundamental attribution bias on our judgments and self-awareness.

Exploring Greed

Greed can be defined as an intense and selfish desire for something, especially wealth, power, or food. Psychologically, it is considered a natural human impulse that, when unchecked, can lead to unethical behaviour and strained relationships. Societally, greed is often condemned, yet it persists across cultures and histories.

We tend to label others as greedy when their actions negatively impact us or violate social norms. However, when we aggressively pursue our interests, we might frame it as ambition or resourcefulness. This dichotomy reveals a discrepancy in how we perceive greed in ourselves versus others.

Understanding Fundamental Attribution Bias

Fundamental attribution bias, or fundamental attribution error, is the tendency to attribute others’ actions to their character while attributing our own actions to external circumstances. This cognitive bias allows us to excuse our behaviour while holding others fully accountable for theirs.

For example, if someone cuts us off in traffic, we might think they’re reckless or inconsiderate. But if we cut someone off, we might justify it by claiming we were late or didn’t see them. This bias preserves our self-image but distorts our understanding of others.

The Conversation

Our conversation centred on an HP printer that had shown a ‘low ink – please replace’ message since the cartridge was first installed. I recounted the history of the ink and toner industry. HP had a monopoly on ink for their products, a situation that earned them substantial margins. Upstarts entered the marketplace, and an escalating arms race began. HP spent R&D dollars trying to defend their profit margins, with nil benefit to the consumers of their products; in fact, it kept prices artificially high. Competitors who wanted a slice of those fat margins found ways around these interventions. Eventually, HP installed chips on their toner cartridges. Unfortunately, they have a bug – or is it a feature? If you install a cartridge and then remove it, the chip assumes you’re up to something shady and spawns this false alert. Some people take the alert at face value and replace the cartridge, so HP benefits twice.
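To make the alleged mechanism concrete, here is a purely hypothetical sketch (in Python) of how such a reinsertion check might behave. It is not HP’s firmware; the class, the counter, and the threshold are invented for illustration only.

```python
# Hypothetical sketch of a cartridge reinsertion check.
# This is NOT vendor firmware; names, counters, and logic are invented for illustration.

class Cartridge:
    def __init__(self):
        self.install_count = 0   # imagined as persisted on the cartridge's chip

    def on_insert(self):
        self.install_count += 1

    def status_message(self):
        # A cartridge seen more than once is treated as suspect, so the printer
        # raises a 'low ink' warning regardless of the actual ink level.
        if self.install_count > 1:
            return "low ink - please replace"
        return "ok"


cart = Cartridge()
cart.on_insert()              # first install
print(cart.status_message())  # ok
cart.on_insert()              # removed and reinstalled
print(cart.status_message())  # low ink - please replace
```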

If this bloke had worked for HP and been responsible for revenue acquisition and protection, he would have swooned over the opportunity. Have no doubt. At arm’s length, though, he recognised it as a sleazy, unethical business practice.

This conversation revealed how easily we can fall into the trap of judging others without reflecting on our own behaviour. His indignation seemed justified to him, yet he remained unaware of how his actions mirrored those he criticised.

Biblical Reference and Moral Implications

This situation brings to mind the biblical passage from Matthew 7:3-5:

“Why do you look at the speck of sawdust in your brother’s eye and pay no attention to the plank in your own eye? … You hypocrite, first take the plank out of your own eye, and then you will see clearly to remove the speck from your brother’s eye.”

The verse poignantly captures the human tendency to overlook our flaws while magnifying those of others. It calls for introspection and humility, urging us to address our shortcomings before passing judgment.

The Asymmetry of Self-Perception

Several psychological factors contribute to this asymmetry:

  • Self-Serving Bias: We attribute our successes to internal factors and our failures to external ones.
  • Cognitive Dissonance: Conflicting beliefs about ourselves and our actions create discomfort, leading us to rationalise or ignore discrepancies.
  • Social Comparison: We often compare ourselves favourably against others to boost self-esteem.

This skewed self-perception can hinder personal growth and damage relationships, as it prevents honest self-assessment and accountability.

Overcoming the Bias

Awareness is the first step toward mitigating fundamental attribution bias. Here are some strategies:

  1. Mindful Reflection: Regularly assess your actions and motivations. Ask yourself if you’re holding others to a standard you’re not meeting. Riffing on ancient moral dictates, ask yourself whether this is how you would want to be treated, or adopt Kant’s categorical imperative as a framework.
  2. Seek Feedback: Encourage honest input from trusted friends or colleagues about your behaviour.
  3. Empathy Development: Practice seeing situations from others’ perspectives to understand their actions more fully.
  4. Challenge Assumptions: Before making judgments, consider external factors that might influence someone’s behaviour.

By actively recognising and adjusting for our biases, we can develop more balanced perceptions of ourselves and others.

Conclusion

The irony of condemning in others what we excuse in ourselves is a common human pitfall rooted in fundamental attribution bias. The ancient adage ‘Know thyself’ comes into view here. We can overcome these biases by striving for self-awareness and empathy, leading to more authentic relationships and personal integrity.

Morality: The Mirage of Subjectivity Within a Relative Framework

Morality, that ever-elusive beacon of human conduct, is often treated as an immutable entity—a granite monolith dictating the terms of right and wrong. Yet, upon closer inspection, morality reveals itself to be a mirage: a construct contingent upon cultural frameworks, historical conditions, and individual subjectivity. It is neither absolute nor universal but, rather, relative and ultimately subjective, lacking any intrinsic meaning outside of the context that gives it shape.


Nietzsche: Beyond Good and Evil, Beyond Absolutes

Friedrich Nietzsche, in his polemical Beyond Good and Evil and On the Genealogy of Morality, exposes the illusion of objective morality. For Nietzsche, moral systems are inherently the products of human fabrication—tools of power masquerading as eternal truths. He describes two primary moralities: master morality and slave morality. Master morality, derived from the strong, values power, creativity, and self-affirmation. Slave morality, by contrast, is reactive, rooted in the resentment (ressentiment) of the weak, who redefine strength as “evil” and weakness as “good.”

Nietzsche’s critique dismantles the notion that morality exists independently of cultural, historical, or power dynamics. What is “moral” for one era or society may be utterly abhorrent to another. Consider the glorification of war and conquest in ancient Sparta versus the modern valorisation of equality and human rights. Each framework exalts its own virtues not because they are universally true but because they serve the prevailing cultural and existential needs of their time.

The Myth of Monolithic Morality

Even viewed through a relativistic lens—and despite the protestations of Immanuel Kant or Jordan Peterson—morality is not and has never been monolithic. The belief in a singular, unchanging moral order is, at best, a Pollyanna myth or wishful thinking, perpetuated by those who prefer their moral compass untroubled by nuance. History is not the story of one moral narrative, but of a multiplicity of subcultures and countercultures, each with its own moral orientation. These orientations, while judged by the dominant moral compass of the era, always resist and redefine what is acceptable and good.

If the tables are turned, so is the moral compass reoriented. The Man in the High Castle captures this truth chillingly. Had the Nazis won World War II, Americans—despite their lofty self-perceptions—would have quickly adopted the morality of their new rulers. The foundations of American morality would have been reimagined in the image of the Third Reich, not through inherent belief but through cultural osmosis, survival instincts, and institutionalised pressure. What we now consider abhorrent might have become, under those circumstances, morally unremarkable. Morality, in this view, is not timeless but endlessly pliable, bending to the will of power and circumstance.

The Case for Moral Objectivity: Kantian Ethics

In contrast to Nietzsche’s relativism, Immanuel Kant offers a vision of morality as rational, universal, and objective. Kant’s categorical imperative asserts that moral principles must be universally applicable, derived not from cultural or historical contingencies but from pure reason. For Kant, the moral law is intrinsic to rational beings and can be expressed as: “Act only according to that maxim whereby you can, at the same time, will that it should become a universal law.”

This framework provides a stark rebuttal to Nietzsche’s subjectivity. If morality is rooted in reason, then it transcends the whims of power dynamics or cultural specificity. Under Kant’s system, slavery, war, and exploitation are not morally permissible, regardless of historical acceptance or cultural norms, because they cannot be willed universally without contradiction. Kant’s moral absolutism thus offers a bulwark against the potential nihilism of Nietzschean subjectivity.

Cultural Pressure: The Birthplace of Moral Adoption

The individual’s adoption of morality is rarely a matter of pure, autonomous choice. Rather, it is shaped by the relentless pressures of culture. Michel Foucault’s analysis of disciplinary power in works such as Discipline and Punish highlights how societies engineer moral behaviours through surveillance, normalisation, and institutional reinforcement. From childhood, individuals are inculcated with the moral codes of their culture, internalising these norms until they appear natural and self-evident.

Yet this adoption is not passive. Even within the constraints of culture, individuals exercise agency, reshaping or rejecting the moral frameworks imposed upon them. Nietzsche’s Übermensch represents the apotheosis of this rebellion: a figure who transcends societal norms to create their own values, living authentically in the absence of universal moral truths. By contrast, Kantian ethics and utilitarianism might critique the Übermensch as solipsistic, untethered from the responsibilities of shared moral life.

Morality in a Shifting World

Morality’s subjectivity is its double-edged sword. While its flexibility allows adaptation to changing societal needs, it also exposes the fragility of moral consensus. Consider how modern societies have redefined morality over decades, from colonialism to civil rights, from gender roles to ecological responsibility. What was once moral is now abhorrent; what was once abhorrent is now a moral imperative. Yet even as society evolves, its subcultures and countercultures continue to resist and reshape dominant moral paradigms. If history teaches us anything, it is that morality is less a fixed star and more a flickering flame, always at the mercy of shifting winds.

Conclusion: The Artifice of Moral Meaning

Morality, then, is not a universal truth etched into the fabric of existence but a subjective artifice, constructed by cultures to serve their needs and adopted by individuals under varying degrees of pressure. Nietzsche’s philosophy teaches us that morality, stripped of its pretensions, is not an arbiter of truth but a symptom of human striving—one more manifestation of the will to power. In contrast, Kantian ethics and utilitarianism offer structured visions of morality, but even these grapple with the tensions between universal principles and the messy realities of history and culture.

As The Man in the High Castle suggests, morality is a contingent, situational artefact, liable to be rewritten at the whim of those in power. Its apparent stability is an illusion, a construct that shifts with every epoch, every conquest, every revolution. To ignore this truth is to cling to a comforting, but ultimately deceptive, myth. Morality, like all human constructs, is both a triumph and a deception, forever relative, ever mutable, yet persistently contested by those who would impose an impossible order on its chaos.

Dukkha, the Path of Pain, and the Illusion of Freedom: Buddhism, Antinatalism, and the Lonely Road of Individuation

The First Noble Truth of Buddhism—the notion that life is suffering, or dukkha—is often misinterpreted as a bleak condemnation of existence. But perhaps there’s something deeper here, something challenging yet quietly liberating. Buddhism doesn’t merely suggest that life is marred by occasional suffering; rather, it proposes that suffering is woven into the very fabric of life itself. Far from relegating pain to an exception, dukkha posits that dissatisfaction, discomfort, and unfulfilled longing are the baseline conditions of existence.

This isn’t to say that life is an unending stream of torment; even in nature, suffering may seem the exception rather than the rule, often concealed by survival-driven instincts and primal ignorance. But we, as conscious beings, are haunted by awareness. Aware of our mortality, our desires, our inadequacies, and ultimately, of our impotence to escape this pervasive friction. And so, if suffering is indeed the constant, how do we respond? Buddhism, antinatalism, and Jungian psychology each offer their own, starkly different paths.

The Buddhist Response: Letting Go of the Illusion

In Buddhism, dukkha is a truth that urges us not to look away but to peer more closely into the nature of suffering itself. The Buddha, with his diagnosis, didn’t suggest we simply “cope” with suffering but rather transform our entire understanding of it. Suffering, he argued, is born from attachment—from clinging to transient things, ideas, people, and identities. We build our lives on desires and expectations, only to find ourselves caught in a cycle of wanting, attaining, and inevitably losing. It’s a form of existential whiplash, one that keeps us bound to dissatisfaction because we can’t accept the impermanence of what we seek.

The Buddhist approach is both radical and elusive: by dissolving attachment and breaking the cycle of clinging, we supposedly dissolve suffering itself. The destination of this path—Nirvana—is not a state of elation or contentment but a transcendence beyond the very conditions of suffering. In reaching Nirvana, one no longer relies on external or internal validation, and the violence of social judgment, cultural obligation, and personal ambition falls away. This may seem austere, yet it offers a powerful antidote to a world that equates happiness with accumulation and possession.

Antinatalism: Opting Out of Existence’s Violence

Where Buddhism seeks liberation within life, antinatalism takes an even more radical stance: why bring new beings into an existence steeped in suffering? For antinatalists, the suffering embedded in life renders procreation ethically questionable. By creating life, we induct a new being into dukkha, with all its attendant violences—society’s harsh judgments, culture’s rigid impositions, the bureaucratic machinery that governs our daily lives, and the inescapable tyranny of time. In essence, to give birth is to invite someone into the struggle of being.

This perspective holds that the most humane action may not be to mend the suffering we encounter, nor even to accept it as Buddhism advises, but to prevent it altogether. It sees the cycle of life and death not as a majestic dance but as a tragic spiral, in which each generation inherits suffering from the last, perpetuating violence, hardship, and dissatisfaction. Antinatalism, therefore, could be seen as the ultimate recognition of dukkha—an extreme empathy for potential beings and a refusal to impose the weight of existence upon them.

Jungian Individuation: The Lonely Path of Becoming

Jung’s concept of individuation offers yet another approach: to delve deeply into the self, to integrate all aspects of the psyche—the conscious and the unconscious—and to emerge as a fully realised individual. For Jung, suffering is not to be escaped but understood and incorporated. Individuation is a journey through one’s darkest shadows, a confrontation with the parts of oneself that society, culture, and even one’s own ego would rather ignore. It is, in a way, an anti-social act, as individuation requires the courage to step away from societal norms and embrace parts of oneself that might be seen as disturbing or unconventional.

But individuation is a lonely road. Unlike the Buddhist path, which seeks to transcend suffering, individuation requires one to face it head-on, risking rejection and alienation. Society’s judgment, a kind of violence in itself, awaits those who deviate from accepted roles. The individuated person may, in effect, be punished by the very structures that insist upon conformity. And yet, individuation holds the promise of a more authentic existence, a self that is not a mere amalgam of cultural expectations but a reflection of one’s truest nature.

The Delusions That Keep Us Tethered to Suffering

Yet, for all their starkness, these paths might seem almost abstract, philosophical abstractions that don’t fully capture the reality of living within the constraints of society, culture, and self. Human beings are armed with powerful psychological mechanisms that obscure dukkha: self-delusion, cognitive dissonance, and hubris. We fabricate beliefs about happiness, purpose, and progress to protect ourselves from dukkha’s existential weight. We convince ourselves that fulfilment lies in achievements, relationships, or material success. Cognitive dissonance allows us to live in a world that we know, on some level, will disappoint us without being paralysed by that knowledge.

It’s worth noting that even those who acknowledge dukkha—who glimpse the violence of existence and the illusory nature of happiness—may still find themselves clinging to these mental defences. They are shields against despair, the comforting armours that allow us to navigate a world in which suffering is the baseline condition. This is why Buddhism, antinatalism, and individuation require such rigorous, often painful honesty: they each ask us to set down these shields, to face suffering not as a solvable problem but as an intrinsic truth. In this light, psychological defences are seen not as failures of awareness but as survival strategies, albeit strategies that limit us from ever fully confronting the nature of existence.

Finding Meaning Amidst the Violence of Being

To pursue any of these paths—Buddhist enlightenment, antinatalism, or Jungian individuation—one must be prepared to question everything society holds dear. They are radical responses to a radical insight: that suffering is not accidental but foundational. Each path offers a different form of liberation, whether through transcendence, abstention, or self-integration, but they all require a certain fearlessness, a willingness to look deeply into the uncomfortable truths about life and existence.

Buddhism calls us to renounce attachment and embrace impermanence, transcending suffering by reshaping the mind. Antinatalism challenges us to consider whether it is ethical to bring life into a world marked by dukkha, advocating non-existence as an escape from suffering. And individuation asks us to become fully ourselves, embracing the loneliness and alienation that come with resisting society’s violence against the individual.

Perhaps the most realistic approach is to accept that suffering exists, to choose the path that resonates with us, and to walk it with as much awareness as possible. Whether we seek to transcend suffering, avoid it, or integrate it, each path is a confrontation with the violence of being. And maybe, in that confrontation, we find a fleeting peace—not in the absence of suffering, but in the freedom to choose our response to it. Dukkha remains, but we may find ourselves less bound by it, able to move through the world with a deeper, quieter understanding.

Multiple Stupidities

A mate of mine since grade four recently shared an article with me. We’ve been acquainted since the early ’70s and have remained in touch on and off along the way. He ended up at university taking a degree in Political Science whilst I took the Economics route. Not only are our world views different, but they were also different then, and they are differently different today. Still, we respect each other’s differences and know where we converge—our love of music and the socio-political sphere—and where we diverge—which music and which aspects of the socio-political sphere. This has no material impact on this post, but he is more of a pragmatic optimist whilst I lean toward pragmatic realism—whatever that even means. Perhaps I’ll share our political courses in a future post.

This friend shared with me an article on the five universal laws of human stupidity. I gave him a short response, but even as I responded, I had more I wanted to articulate, and this place is reserved for musings of just this sort.

The article establishes the premise that people generally underestimate human stupidity. I am pretty sure I don’t underestimate human stupidity. Yet I question whether this perspective is misanthropic or just good old-fashioned realism. To voice it is to be accused of being a misanthrope. Within the perspective of the ternary chart I’ve been developing, the answer differs depending on whether one is Modern or Postmodern. And to be clear, Moderns have at times claimed to be avowed Humanists, yet I often hear how stupid this or that person is—or how stupid people are in general—counterbalanced with some hope for humanity, for humans as a viable species.

Before tackling the issue of stupidity, let’s establish a frame. I tend to accept the theory of multiple intelligences. Perhaps I don’t wholly agree, or even feel the model captures the domain entirely, but conceptually, I feel that what we term intelligence can be dimensionalised. Whether these dimensions can be measured is a separate story—and my answer is no—but it can be conceptualised. Some have argued that all the theory of multiple intelligences does is name dimensions already accounted for in a grand intelligence model. Although I agree that these dimensions could be aggregated into weighted measures, I disagree that this is what is actually occurring, and I am sceptical as to whether it can be accomplished meaningfully.

However one couches it, if we believe that intelligence is a thing and that it can be dimensionalised, this also leaves the door open to the inverse position. If we have a rating scale between 0 and 100 representing intelligence, where at some point an entity is considered functionally intelligent, with gradations of increasingly superior intelligence above that, then we can also run the scale in the other direction: 100 minus the intelligence value.

In practice, this is how the old IQ system worked. On the upside, we get average to genius; on the downside, we’ve got imbeciles, morons, and idiots—and of course, we’ve got the more general category of stupid. And if we allow for multiple intelligences, we get the contrary situation of multiple stupidities.
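As a toy illustration only, assuming a made-up 0 to 100 scale and invented dimension names and scores, the complementary ‘stupidity’ score is simply the scale run in reverse:

```python
# Toy illustration of dimensionalised intelligence and its complement.
# The dimensions and scores are invented; this is not a real psychometric model.

SCALE_MAX = 100

profile = {
    "rote_learning": 85,
    "interpersonal": 70,
    "logical_numeric": 40,
    "musical": 15,
}

# Run the scale in the other direction: stupidity = 100 minus intelligence.
stupidity = {dim: SCALE_MAX - score for dim, score in profile.items()}

for dim in profile:
    print(f"{dim}: intelligence {profile[dim]}, stupidity {stupidity[dim]}")

# Excelling in a couple of dimensions still leaves one deficient in the rest.
strong = [d for d, s in profile.items() if s >= 70]
weak = [d for d, s in profile.items() if s < 50]
print("excels in:", strong)
print("deficient in:", weak)
```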

Standard multiple intelligence theory proposes that intelligence can be assessed along nine dimensions. Even if we excel in a few of the nine proposed dimensions, we are still left deficient in the rest.

There have been studies in which the multiple intelligences of medical professionals were assessed. In aggregate, these people typically excel marginally in rote learning and (believe it or not) interpersonal skills but can’t necessarily balance a chequebook, and they are notably deficient in the rest. To add insult to injury, many of them overcompensate by feigning interest in matters of culture.

I am fully aware that this is a sweeping generalisation, but the point remains that one can excel in two or three dimensions yet still be stupid in the remaining six or seven. If you consider the so-called progress of human civilisation, it has ‘advanced’ because of the intellectual contributions of very few: there are only so many Newtons and Einsteins among us—and Rembrandts and Picassos, or Beethovens and Mozarts. We debate when AI will reach the singularity and insist that AI can never be a Shakespeare, but we fail to note that the best we ourselves can amass is some homoeopathic quantity of such people.

Given the opportunity, I wouldn’t have hired some 90-plus per cent of my university and grad school classmates. They graduated alongside me, and although they technically passed the course material, they were, as is the topic at hand, stupid. These are normal, ordinary people. They have jobs, families, relationships, and hobbies and activities they excel at. Still, on balance, stupid sums up their totalities.

On LinkedIn, every now and again I read posts on Imposter Syndrome and how you are not an imposter. Not to be politically incorrect, but you are an imposter. Take comfort, though: so is everyone else. This is what Judith Butler means by performativity. This is Sartre’s waiter. Stay in your lane, and you’ll be fine. This is the Modern world. It’s also why Moderns have such a problem with Postmoderns who point out these things. In short and in sum: people are ostensibly stupid. Get over it. It could be worse.

Omnipotence and hubris are strong cognitive defences against cognitive dissonance. We may be familiar with the Dunning-Kruger chart depicting how people overestimate their topical knowledge, but we may not be aware that this overestimation is not limited to neophytes.

American Exceptionalism

It’s sometimes difficult living in such a narcissistic place. I’ve lived in and out of the US, but I seem to have settled here for now. I’ve lived on each coast, in the Southwest, and in the Midwest. I’ve visited all but four states—notably, Wyoming, Montana, and North and South Dakota, so you might recognise the trend.

Currently, I reside in Delaware, but my office is in Manhattan. As a consultant, I am most often wherever my client is. Combined, I’ve lived in LA for well over a decade; my earliest youth was spent in and around Boston. In my 20s, in the 1980s, I spent my formative years in Los Angeles, then the centre of the music industry, where I was a recording engineer and musician. I had left my roots in Boston, with various pitstops along the way, to settle in LA, but I returned to Boston in the late ’80s to attend university and grad school. In Boston, I was married and then divorced, an event that gave me leave to return to Los Angeles, where I married again and relocated to Chicago, where I also spent over a decade. Divorced again, I relocated near Philadelphia for work and settled into rural northern Delaware.

To the uninitiated, the US have two cosmopolitan cities, NYC and LA. Chicago, the third largest by population, is an oversized farm town. It qualifies as a city on the numbers, and it’s not a bad place to be, but it lacks the cultural diversity and buzz of a NY or LA. There is none of that in Philadelphia and even less in Delaware.

Donut

The United States are like Australia. It’s ostensibly like a doughnut—empty in the middle, except that the top and bottom don’t offer much either. This is not to say that there aren’t valuable things, lessons, and people in these other areas, but by and large, even with the Internet and social media, they are still a decade or more behind.

When I lived in Japan—and I realise that I am coming off as some sort of culture snob—I was taken aback at how far behind they seemed relative to my frame of reference, having come from an affluent, white, East Coast family. On the one hand, their technology was off the charts and, owing to the exchange rate, it was cheap. Beyond the exchange rate, the mark-up was enormous. Americans have no sense of value, and so, as much as they exploit other countries, the last laugh is on them.

Americans are not some monolithic entity. There are many dimensions and divisions. To say Americans [fill in the blank] would be disingenuous. To listen to the politicians—especially the ones on the Right, and not just the fringe—you might be left thinking that Americans are all narcissistic assholes. In fact, this is the same cohort that leaves you feeling that the US have never left the Dark Ages with their religious superstitions.

Much of the country is actually in the twenty-first century, but when you try to assess some average sentiment, this vocal minority makes it seem as though we live in the fifteenth.

[Image: Roman General Lucius (John Cusack), from the film Dragon Blade]

Even behind these anachronisms, there is still a sense of American exceptionalism—or perhaps, in some bout of nostalgia, a sense that there was a time when they were exceptional. You can cherry-pick some dimensions and claim to rank high on the scale, but any exceptionalism only happens by adopting a frame. Many who conclude that the US are or were exceptional tend to fetishise Ancient Greece and Rome as well. In my opinion, it’s indoctrination, but there has to be more to it than this. There needs to be a certain gullibility gene that creates the propensity to believe these narratives. Without going off the rails, it might be fair to say that this genetic predisposition is the reason humans have evolved this far. I’d like to think it’s merely vestigial, but I’ll presume that this is only wishful thinking.

Americans, like most people, have a sense of identity, whether personal or tied to groups. And like the personae we project as individuals, we have myriad group personae as well. Perhaps there is already a term for this; if not, I’m not going to coin one now.

Like individual identity, people defend their notion of group identity, and they tend to overestimate it. More than half of people consider themselves better than average in looks, intelligence, and so on. Clearly, this defies statistics, but it is not merely an attempt to assuage some cognitive dissonance; you can arrive at a defensible position by picking some attributes on which you might excel (on some subjective aesthetic scale) and then overvaluing those attributes relative to the entire domain. Perhaps a person is taller than average and has been told s/he has beautiful eyes. It would be easy to discount other factors and place oneself in a higher rank on account of these two. It works the same way for national identity.
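Here is a minimal sketch of that arithmetic, with invented attributes, percentile scores, and weights. Equal weighting leaves the composite near the middle; overweighting the two flattering attributes pushes it well above average.

```python
# Illustrative only: how overweighting flattering attributes inflates a composite rank.
# Attribute scores are percentiles (50 = average); names and numbers are invented.

attributes = {
    "height": 75,   # taller than average
    "eyes": 80,     # "beautiful eyes"
    "fitness": 30,
    "wit": 35,
    "style": 40,
}

def composite(scores, weights):
    """Weighted average of attribute percentiles."""
    total = sum(weights.values())
    return sum(scores[k] * weights[k] for k in scores) / total

equal = {k: 1 for k in attributes}
self_serving = {"height": 4, "eyes": 4, "fitness": 1, "wit": 1, "style": 1}

print("equal weighting:", round(composite(attributes, equal), 1))                # ~52
print("self-serving weighting:", round(composite(attributes, self_serving), 1))  # ~66
```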

In the US, they will focus on some economic indicator, argue that it is more important and broader in coverage than it really is, and then reference it as proof of exceptionalism. Meanwhile, the population is being indoctrinated into accepting this narrative, and much effort is spent trying to convince the wider world that this attribute matters.

In this Age of MAGA, Make America Great Again, it’s helpful to remember that America never was and never will be great. And that’s OK. It’s also helpful to remember that the ‘good ole days’ are rarely as we remember them in the rearview mirror.

Defending Democracy


Indeed, it has been said that democracy is the worst form of Government except for all those other forms that have been tried from time to time. 

Sir Winston Churchill

I am not a defender of or apologist for Democracy. Any system is only as strong as its weakest link, but save for its rhetorical promises, Democracy is nothing but weak links. Turtles all the way down. It’s another failed Enlightenment experiment. Sure, you can argue that the Ancient Greeks invented democracy—or at least implemented it at any scale—but it was specious Enlightenment ideals that pushed it into the mainstream.

The Achilles’ heel of Democracy is the principal-agent problem, the same one that separates management (CEOs) from owners (shareholders). The incentives are different.


Plato published his solution in the Republic, but the proposal was naive at best. The notion that meritocracy is something real, or that we can adequately understand the dimensions and measures needed to create the right incentives, is another weak link.


We see the same problem in controlling elected officials. Time and again, we elect them, and time and again, they disappoint. We, the People, are the principals, and the elected are our agents. People in the US (and in other so-called ‘democratic’ societies) have the vote, and yet—per the oft-cited definition of insanity—they perform the same action and continue to expect different results; in fact, they are always surprised. At its core, it’s an incentive and accountability problem.

Kenneth Arrow gave us the Impossibility Theorem, proving mathematically that no ranked voting system can aggregate individual preferences into a consistent collective ranking while satisfying a few basic fairness criteria. Democracy is cursed with mediocrity. We like to soft-pedal the notion of mediocrity with the euphemism of compromise, another Ancient Greek legacy of moderation. If this makes you feel better, who am I to break the delusion? Cognitive dissonance is a powerful palliative.
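Arrow’s proof is beyond the scope of a blog post, but the flavour of the problem is easy to show with the Condorcet paradox that motivates it. The sketch below uses an invented three-voter preference profile and plain pairwise majority rule; the resulting majorities form a cycle, so no consistent collective ranking exists.

```python
from itertools import combinations

# The Condorcet paradox: a toy preference profile under pairwise majority rule.
# Ballots rank candidates best-first; the profile below is invented for illustration.
ballots = [
    ["A", "B", "C"],
    ["B", "C", "A"],
    ["C", "A", "B"],
]

def majority_prefers(x, y):
    """True if more voters rank x above y than y above x."""
    x_wins = sum(1 for b in ballots if b.index(x) < b.index(y))
    return x_wins > len(ballots) - x_wins

for x, y in combinations("ABC", 2):
    if majority_prefers(x, y):
        print(f"majority prefers {x} over {y}")
    else:
        print(f"majority prefers {y} over {x}")

# Prints: A over B, C over A, B over C - a cycle, so majority rule alone
# cannot settle on a collective ranking.
```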

μηδέν άγαν

Do Nothing in Excess, Delphic Oracle Inscription

Interestingly enough, many people clamour for term limits (a subversion of democracy) because they can’t stop themselves from voting for the same shit politicians over and over again. They rationalise it by saying it is a defence against the other guy’s vote, since they’d never have voted for shit representation themselves.

This is often couched as ‘save me from myself’, but it is just as aptly cast as ‘save me from democracy’. I suppose a heroin addict might have the same thoughts.