If you’re reading this, chances are you’re mortal. Bummer. Even worse, you may not be maximizing your odds of wringing every last drop out of your limited lifespan. But fear not! Science has some answers. And the answer, at least in the United States, is shockingly unsecular: religious people, on average, live longer than their non-religious counterparts. They also tend to be happier. But don’t rush to your nearest house of worship just yet—because it’s not God, the afterlife, or divine intervention at work. It’s something far more mundane: people.
Audio: NotebookLM podcast on this topic.
The Religion-Longevity Link: A Holy Miracle or Just Good Networking?
Multiple studies have confirmed what might seem an inconvenient truth for secular folks like myself: religious participation is associated with longer lifespans. A 2018 study published in JAMA Internal Medicine found that attending religious services more than once a week was associated with a roughly 33% lower risk of mortality. That’s a pretty solid statistical incentive to at least pretend to enjoy Sunday sermons.
Why the boost in longevity? No, it’s not divine reward points. It boils down to a few key factors:
Community and Social Support: Regularly showing up to church, temple, mosque, or synagogue means interacting with the same people repeatedly, forming strong social bonds. When life gets tough, these people tend to notice and lend support.
Healthier Lifestyles: Many religious traditions frown upon self-destructive behaviours like smoking, heavy drinking, and drug use.
Lower Stress Levels: Religious belief systems provide coping mechanisms for hardship, instilling a sense of meaning and reducing existential dread.
Volunteerism and Purpose: Many religious folks engage in community service, which has been linked to greater happiness and longevity.
The Not-So-Spiritual Catch: Why Atheists and the “Spiritual but Not Religious” Miss Out
Here’s the kicker: it’s not belief in a deity that grants these benefits. It’s participation in a structured, tight-knit community. That’s why merely identifying as “spiritual” doesn’t deliver the same effects—without a committed social framework, spirituality becomes a solo endeavour. And whilst atheists can certainly find meaning in other ways, they often lack equivalent institutions providing routine, real-world social engagement.
To put it bluntly, God isn’t keeping people alive longer. Other people are. Having a tribe that notices when you don’t show up, checks in when you’re sick, and nags you into a healthier lifestyle has tangible benefits.
The Scandinavian Exception: Thriving Without Religion
“But wait,” you may say, “what about those blissfully secular Scandinavian countries? They’re barely religious, yet they consistently rank among the happiest and longest-living people on Earth.” Good point. The key difference? They have successfully replaced the social function of religion with other strong communal institutions.
Nordic nations boast robust social safety nets, well-funded public spaces, and a culture prioritising collective well-being. They don’t need church groups to function as makeshift welfare systems because the state ensures no one falls through the cracks. They also have thriving clubs, hobby groups, and worker associations that provide built-in social support.
Conclusion: What This Means for Longevity-Seeking Atheists and Introverts
If you, like me, are an atheist and also an introvert who prefers solitude, writing, and the company of generative AI, this presents a bit of a conundrum. How does one reap the benefits of social integration without enduring the horror of group activities?
The lesson here isn’t that you need to feign religious belief or force yourself into suffocating social obligations. But if you want to maximize your lifespan and well-being, some form of consistent, meaningful connection with others is essential. Whether that’s through a socialist co-op, a local philosophy club, a structured hobby group, or even just a tight circle of like-minded misanthropes, the key is to avoid total isolation.
Religion isn’t the magic ingredient—it’s just a well-tested delivery system. And in a society where other forms of community are fraying, it’s not surprising that religious folks seem to be winning the longevity lottery. The real takeaway? Find your people. Even if you’d rather be alone.
What a relief. After trudging through a couple of so-called ‘popular’ books, it’s refreshing to read something that actually reflects the complexity of reality.
The first essay in Slavoj Žižek’s Against Progress is titled Progress and Its Vicissitudes. It begins with a nod to the opening scene of The Prestige, directed by Christopher Nolan. But Žižek, as always, takes it further. He unearths something far more sinister in the magician’s sleight of hand—something unsettlingly perceptible to the young girl who witnesses the trick.
Life isn’t always what it seems on the surface. It isn’t as neat or digestible as our perceptions make it out to be. Žižek embarks on a scathing critique of the concept of progress, dismantling its conventional interpretations and exposing the often-overlooked consequences that lurk beneath its glossy exterior. He unravels the paradoxes and failures inherent in the notion of progress, urging readers to rethink their blind faith in the idea of a linear march towards a utopian future.
Defining ‘progress,’ he argues, is akin to laying claim to the future. But whose future? And at what cost? He interrogates the competing visions that shape human possibility, questioning whether, in the face of our cascading ecological, social, and political crises, things can actually improve—and what ‘better’ even means. He skewers various ideologies—neoliberalism, populism, and the self-improvement industrial complex—for their roles in manipulating and distorting the very concept of progress.
From a postmodernist perspective, Žižek’s analysis aligns with the scepticism towards grand narratives and universal truths that define postmodern thought. He deconstructs the monolithic idea of progress, revealing it as a construct that conveniently conceals underlying power structures and exclusions. By exposing the ‘squashed dead birds’—the inevitable collateral damage of progress—he underscores just how arbitrary and manufactured our notions of advancement really are.
Moreover, Žižek’s critique echoes the postmodernist fixation on desire, denial, and disavowal. He examines how different visions of progress systematically exclude or sacrifice certain elements and how these dynamics manifest across cultural phenomena, from Hollywood blockbusters to decolonisation movements. His analysis reinforces the postmodernist conviction that meanings are never fixed—they are fluid, contested, and often riddled with contradiction.
In the end, Against Progress is not just a critique—it’s an invitation. An invitation to abandon the comforting illusion of an inevitable march towards something better and to face the messy, contingent reality of human existence head-on.
This post explains how to interpret the ternary plot chart’s visualisation. The ternary chart on the survey results page will render something like this. (This is an admin page with additional functionality, but it’s similar enough.) The blue dot represents the average of all responses. The star represents where I guessed the average would land: mostly modern with some residual premodernity and a touch of postmodernity.
Under the title in the header is a textual assessment of the visualisation. In this case, the response illustrates someone moderately modern with postmodern influences. Although this person also has some premodern tendencies, they are relatively insignificant to the context.
The three possible worldviews sit at the vertices (the corners) of the triangle. Each side is a scale progressing from 0% to 100%, with 100% coincident with the vertex label. For example, the bottom side runs from 0 on the left to 100 on the right; a score at the far right would indicate 100 per cent Premodern, which the output deems Pure Premodern.
Notice that each vertex has green and yellow shading that serves as a visual aid for the strength of the relationship to that corner: green is strong, yellow is moderate. The white section outlined by an interior triangle with a red border is decidedly mixed, showing no strong inclination toward any of the extremes.
In the example above, the red plot point illustrates a response (as shown below the chart) that is 20.7% Premodern, 52.1% Modern, and 27.2% Postmodern. These numbers should always sum to 100, though there will be some drift due to rounding. The star marks where I thought the average response would land; follow the tick marks on each side, and you’ll see that it corresponds to the 3-tuple (20, 70, 10).
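For readers curious about the mechanics, plotting a triple on a ternary chart amounts to a barycentric blend of the three corner positions. The sketch below is a minimal illustration, not the survey page’s actual code; in particular, the vertex layout (Postmodern at the origin, Premodern at bottom-right, Modern at the apex) and the function name are my assumptions.

```python
import math

# Hypothetical vertex layout for illustration (not necessarily how the
# survey page lays out its triangle): Postmodern at the origin,
# Premodern at bottom-right, Modern at the apex of a unit-side triangle.
VERTICES = {
    "premodern": (1.0, 0.0),
    "modern": (0.5, math.sqrt(3) / 2),
    "postmodern": (0.0, 0.0),
}

def ternary_to_xy(premodern, modern, postmodern):
    """Map a (Premodern, Modern, Postmodern) triple to Cartesian coordinates."""
    total = premodern + modern + postmodern
    weights = {
        "premodern": premodern / total,
        "modern": modern / total,
        "postmodern": postmodern / total,
    }
    # The plotted point is the weighted average of the three corner positions.
    x = sum(w * VERTICES[name][0] for name, w in weights.items())
    y = sum(w * VERTICES[name][1] for name, w in weights.items())
    return x, y

# The red plot point from the example: 20.7 / 52.1 / 27.2
x, y = ternary_to_xy(20.7, 52.1, 27.2)
```

With this layout, the example response lands a little above the triangle’s centre, pulled toward the Modern apex. Note that because the displayed percentages are rounded to one decimal place, they can sum to 99.9 or 100.1, which is the rounding drift mentioned above.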
In the future, I expect to render a view that plots the average survey response as a reference.
Below this chart is an expository account of the response choices. You can render this content as a PDF for your personal archive.
Final Word
If you have any questions or suggestions related to this topic, please feel free to leave them in the comments below.
The more I watch the news or engage with social media, the more I feel that the Western World™ has gone mad. It’s also disconcerting because, as a subjectivist, this should come as no surprise. I guess I miss the days when it was better wrapped in relativism. Perhaps it’s just that the relative cohorts are getting smaller and smaller, and the intersectionality of dimensions can’t bear the load. I prompted seven generative AI platforms to help assess whether I am alone in my observations. I present the responses unedited, identifying each with a header.
Audio: NotebookLM podcast of this content.
Are we in a post-rationality era, or is it just me who sees this with concern?
ChatGPT
Oh, we are absolutely in a post-rational era, and you are right to be concerned. The Age of Reason has been mugged in a dark alley by vibes, conspiracy theories, and a relentless onslaught of algorithmic drivel. Facts? Mere inconveniences. Logic? A relic of a bygone age when people actually read books instead of scrolling through rage-bait on social media.
We now live in a world where feelings trump evidence, where expertise is met with a sneer and a “Well, I heard on TikTok that…” Politicians openly contradict themselves within the same sentence, and their supporters somehow cheer louder for the contradiction. Science? Only if it aligns with pre-existing beliefs. Critical thinking? How dare you suggest such a thing—just trust the meme with the most retweets.
But hey, maybe this is just the inevitable entropy of civilization. Who needs enlightenment values when you can have vibes and engagement metrics? Buckle up—it’s only going to get weirder.
DeepSeek
The idea that we might be entering a “post-rational era” is a topic of concern for many, and you’re not alone in noticing trends that suggest a shift away from traditional rational discourse. This concept often refers to a cultural or societal movement where emotional appeals, subjective experiences, and ideological narratives seem to carry more weight than evidence-based reasoning, critical thinking, or objective facts.
Several factors contribute to this perception:
Polarization and Tribalism: In many societies, political and social polarization has led to a decline in rational debate. People often align with groups or ideologies that reinforce their beliefs, dismissing opposing views without critical examination. This tribal mindset can override rational discourse.
Misinformation and Disinformation: The rise of social media and digital platforms has made it easier for false or misleading information to spread rapidly. Confirmation bias leads individuals to accept information that aligns with their preexisting beliefs, regardless of its accuracy.
Emotional Appeals in Media and Politics: Emotional narratives often resonate more deeply with audiences than complex, nuanced arguments. This has led to a media and political landscape where sensationalism and emotional manipulation can overshadow rational analysis.
Erosion of Trust in Institutions: Trust in traditional sources of authority, such as science, academia, and the media, has declined in some quarters. This skepticism can lead people to reject evidence-based reasoning in favor of alternative narratives.
Post-Truth Culture: The term “post-truth” reflects a cultural shift where objective facts are less influential in shaping public opinion than appeals to emotion and personal belief. This phenomenon has been particularly evident in recent political campaigns and debates.
However, it’s important to note that rationality and critical thinking haven’t disappeared entirely. Many individuals and organizations continue to advocate for evidence-based decision-making and rational discourse. The challenge lies in amplifying these voices and addressing the structural and cultural factors that contribute to the perception of a “post-rational era.”
If you’re concerned about this trend, you’re not alone. Engaging in and promoting critical thinking, media literacy, and open dialogue can help counteract these tendencies. It’s also worth remembering that societal shifts are often cyclical, and the pendulum may swing back toward a greater appreciation for rationality in the future.
Meta AI
You’re not alone in your concern. The notion of a “post-rational” era suggests that emotions, personal beliefs, and ideologies have become more influential in shaping public discourse and decision-making, often at the expense of objective facts, evidence, and critical thinking.
Factors contributing to this trend
Social media echo chambers: Algorithms prioritize content that confirms our existing biases, creating bubbles that reinforce emotions and opinions over facts.
Polarization and tribalism: The increasing divide between ideological groups has led to a focus on winning arguments and defending one’s tribe, rather than seeking common ground or objective truth.
Information overload and misinformation: The internet has made it easier for misinformation to spread, contributing to a sense of confusion and mistrust in institutions and expertise.
Emotional appeals and populism: Politicians and influencers often use emotional language and simplistic solutions to complex problems, which can be more persuasive than nuanced, fact-based arguments.
Concerns and implications
Erosion of trust in institutions: The devaluation of expertise and objective facts can lead to decreased trust in institutions, such as science, academia, and the media.
Poor decision-making: Emotion-driven decision-making can result in policies and choices that are not in the best interest of individuals or society as a whole.
Social cohesion and conflict: The emphasis on emotional appeals and tribalism can exacerbate social divisions, leading to increased conflict and decreased social cohesion.
What can be done?
Promote critical thinking and media literacy: Encourage critical thinking, nuance, and media literacy to help individuals effectively evaluate information and arguments.
Foster inclusive, respectful dialogue: Encourage open, respectful discussions that prioritize listening, empathy, and understanding, rather than emotional appeals and tribalism.
Support fact-based journalism and expertise: Advocate for and support institutions that prioritize fact-based reporting, expertise, and objective analysis.
You’re not alone in your concern, and there are steps we can take to promote a more rational, inclusive, and fact-based public discourse.
Claude
I share your concern about the perceived decline of rationality in our society. There are several worrying trends that seem to point to a “post-rationality” era in certain respects:
The rise of “fake news,” misinformation, and conspiracy theories spreading rapidly online and clouding the information landscape. It has become harder to distinguish fact from fiction.
Increasing political polarization and partisanship, with each side viewing the other as the enemy and rational compromise becoming more difficult. Emotion and tribalism often trump facts.
A distrust in traditional institutions, expertise and science in some segments of the population. Authority of experts is questioned and fringe theories gain traction.
“Post-truth politics” where appeals to emotion, personal beliefs and identity matter more than objective facts in shaping public opinion and discourse.
Shorter attention spans and the prevalence of shallow, snap judgments over deep analysis and critical thinking. Sensationalism is incentivized.
Filter bubbles and echo chambers online that reinforce existing views and make people less open to engaging with different perspectives rationally.
At the same time, I don’t think rationality is dead. Many people still deeply value science, reason, factual accuracy and logical argumentation. But these forces of anti-rationality have grown stronger recently.
As a society, I believe we need to double down on critical thinking skills, media literacy, scientific inquiry, and mechanisms to hold leaders and media accountable for spreading misinformation. Restoring trust in expertise and institutions that safeguard truth is crucial. We must make rational discourse “cool” again across partisan lines.
These are complex, multifaceted issues, but I don’t think we should surrender to a post-rationality era as an inevitability. Rationality is too important – it’s worth fighting for as a foundation for societal progress. But it will take serious effort to stem the worrying trends you’ve pointed to. I share your concerns on this front.
Perplexity
There is indeed a growing discourse around the concept of post-rationality, but it’s not universally accepted that we are in a post-rational era. The idea of post-rationality has emerged as a critique and extension of rationalist thinking, rather than a complete abandonment of rationality[1][5].
Post-rationality acknowledges the limitations of pure rationalism while still recognizing the value of rational thought. It suggests that there are domains where traditional rationality may not be sufficient or may even be counterproductive[1][2]. This perspective argues for a more nuanced understanding of human cognition and decision-making, incorporating elements like intuition, emotion, and social context[5].
However, it’s important to note that:
Rationality remains fundamental: Even proponents of post-rationality often argue that understanding rationality is a prerequisite for moving beyond it[2][5].
Ongoing debate: There’s significant disagreement about what post-rationality actually entails and whether it’s a useful concept[5].
Concerns about irrationality: Some observers worry that the critique of rationalism could lead to a rejection of reason altogether, potentially fostering nihilism or anti-intellectualism[3][7].
Your concern is not unfounded. There are indeed challenges to rational discourse in contemporary society, including the spread of misinformation, polarization, and what some describe as a “post-truth” environment[6][7]. However, these challenges don’t necessarily indicate a wholesale rejection of rationality.
Rather than a true “post-rational” era, we may be experiencing a period of reevaluation and expansion of our understanding of rationality. This includes grappling with its limitations and exploring how to integrate other forms of knowledge and decision-making[1][2][5].
You’re certainly not alone in your concern. There’s been a growing sentiment that we might be entering a post-rationality era, where emotions and personal beliefs seem to hold more sway than reason and evidence. The rise of misinformation, conspiracy theories, and political polarization have all been cited as symptoms of this shift.
It’s a complex issue with lots of factors at play. Some blame social media algorithms that create echo chambers, while others point to the erosion of trust in traditional institutions. But there’s no single, simple answer.
Why do you think we might be in a post-rationality era?
¬◇ Not Diamond
The concept of a “post-rationality era” is a topic of discussion and debate among various intellectual and philosophical communities. Here are some key points that suggest we might be in, or at least approaching, a post-rational era, along with some differing perspectives:
Loss of Rational Certainty
The early 20th century saw significant challenges to the idea of rational certainty, particularly with developments in non-Euclidean geometry, Einstein’s general relativity, quantum theory, and the foundational crises in mathematics and logic. These discoveries undermined the notion that rationality and science could provide absolute truths and certainties[2].
Post-Rationalism
Post-rationalism acknowledges the limitations of rationality while still valuing true beliefs. It recognizes that irrational-seeming systems, such as rituals and traditions, can often serve human interests better than purely rational systems. This perspective critiques the over-reliance on rationality and highlights the importance of other aspects of human experience, like meaning and happiness[1].
Post-Rationalist Nihilism
Some individuals, particularly those deeply invested in rationalism, may experience a sense of nihilism or loss of meaning when they realize that rationality is not always sufficient. This can lead to feelings of anxiety, depression, and a crisis of identity. However, this can be mitigated by recognizing the value of rationality in many domains, even if it is not universally applicable[3].
Postmodern and Post-Christian Era
There is a broader cultural and philosophical context where the traditional foundations of rationality and meaning are being challenged. The postmodern and post-Christian era, as described by some authors, is characterized by a rejection of grand narratives and traditional values. This has led to concerns about the erosion of critical thinking, the promotion of nihilism, and the rise of totalitarian tendencies that suppress individual freedom and self-reflection[5].
Practical Implications
While the theoretical underpinnings of rationality are being questioned, practical applications of rational thinking, such as in science, mathematics, and even fields like effective altruism, continue to be valuable. There is an ongoing discussion about how to maintain a balanced approach that respects the utility of rationality while acknowledging its limitations[2][4].
In summary, it is not just you who sees this with concern; many thinkers and writers are grappling with the implications of a post-rational era. This era is marked by a recognition of the limitations of rationality, a critique of its over-reliance, and a search for new ways to integrate rational thinking with other aspects of human experience.
NB: Interestingly, these ¬◇ footnotes don’t link to anything.
But this post isn’t about the book. It’s about what all good books should do: make you think.
If you’ve followed my writing over the years, you’ll know that I have little patience for psychology, which I regard as the astrology to neuroscience’s astronomy. Reading Fisher’s Capitalist Realism has only reinforced this perspective.
Frankly, I should do away with psychology altogether. Much of it—no, not just the vacuous self-help drivel clogging the internet and bookstore shelves—is pseudoscience. To its credit, it did function as a stepping stone to neuroscience, but that’s like crediting alchemy for modern chemistry.
Psychology’s greatest sin? Missing the forest for the trees—or, more precisely, ignoring the structural forces that shape the so-called individual. Western capitalism, ever eager to monetize everything, finds it far easier (and more profitable) to blame the individual rather than the system. It’s like the old joke about the man searching for his lost keys under the streetlamp, not because that’s where he dropped them, but because that’s where the light is. It’s just more convenient that way.
Enter psychology: the perfect tool for a society steeped in narcissism and instant gratification. Feeling anxious? Depressed? Alienated? Just take a pill! Never mind the material conditions of your existence—your stagnant wages, your crushing debt, your eroding sense of community. No, the problem is you, and conveniently, there’s a profitable solution waiting on the pharmacy shelf.
Sure, psychology has made some strides in attributing behaviours to neurotransmitters—dopamine, serotonin, norepinephrine, and the rest of the usual suspects. And sure, pharmaceuticals can sometimes treat symptoms effectively. But they are just that: symptoms. The root cause? Often stressors imposed by the very society we refuse to scrutinize. And guess what rarely makes the diagnostic checklist? The system itself.
We need to zoom out and see the whole damn forest. We need to ask the hard questions—run the classic five whys to get to the root of the problem. And spoiler alert: the answer isn’t some chemical imbalance in your head.
This will be hilarious to approximately no one. I meant to ask ChatGPT o3-mini-high to generate a dummy record, but my compromised motor skills and concomitant keyboard typing rendered “dummy” as “funny.” Except for the redacted bits in bold, this is the code it rendered. Pay special attention to the query’s input variables, highlighted in post-production with 🤔 emojis.
Let’s add a bit to write a funny record per the schema.
import mysql.connector
import logging
import uuid

# Configure logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


def test_mysql_connection():
    """Test MySQL connection and write a funny record to the database."""
    logger.info("Testing MySQL connection...")
    try:
        logger.info("Connecting to MySQL...")
        connection = mysql.connector.connect(
            host='127.0.0.1',
            user='[user]',
            password='[password]',
            database='[database]',
            port=3306
        )
        if connection.is_connected():
            db_info = connection.get_server_info()
            logger.info(f"Connected to MySQL Server version {db_info}")
            cursor = connection.cursor()
            cursor.execute("SELECT DATABASE();")
            record = cursor.fetchone()
            logger.info(f"Connected to database: {record[0]}")

            # Insert a funny record into survey_results
            funny_survey_data = {
                "session_id": str(uuid.uuid4()),
                "q1_response": 1,
                "q2_response": 2,
                "q3_response": 3,
                "q4_response": 4,
                "q5_response": 5,
                "q6_response": 6,
                "n1": 42,    # 🤔
                "n2": 69,    # 🤔
                "n3": 420,   # 🤔
                "plot_x": 3.14,  # 🤔
                "plot_y": 2.71,  # 🤔
                "browser": "FunnyBrowser 9000",
                "region": "JokeRegion",
                "source": "comedy",
                "hash_email_session": "f00b4r-hash"  # 🤔
            }
            query = """INSERT INTO survey_results
                (session_id, q1_response, q2_response, q3_response, q4_response, q5_response, q6_response,
                 n1, n2, n3, plot_x, plot_y, browser, region, source, hash_email_session)
                VALUES (%(session_id)s, %(q1_response)s, %(q2_response)s, %(q3_response)s, %(q4_response)s,
                        %(q5_response)s, %(q6_response)s, %(n1)s, %(n2)s, %(n3)s,
                        %(plot_x)s, %(plot_y)s, %(browser)s, %(region)s, %(source)s, %(hash_email_session)s)
            """
            logger.info("Inserting funny survey record...")
            cursor.execute(query, funny_survey_data)
            connection.commit()
            logger.info(f"Funny survey record inserted with ID: {cursor.lastrowid}")
    except mysql.connector.Error as e:
        logger.error(f"Error during MySQL operation: {e}")
    finally:
        if 'cursor' in locals() and cursor:
            cursor.close()
        if 'connection' in locals() and connection.is_connected():
            connection.close()
            logger.info("MySQL connection closed.")


if __name__ == "__main__":
    test_mysql_connection()
Kurt Gray’s Outraged! attempts to boil morality down to a single principle: harm. This, in his view, is the bedrock of all moral considerations. In doing so, he takes a swing at Jonathan Haidt’s Moral Foundations Theory, trying to reduce its multi-faceted framework to a mere footnote in moral psychology. Amusingly, he even highlights how Haidt quietly modified his own theory after Gray and his colleagues published an earlier work—an intellectual game of cat-and-mouse, if ever there was one.
Audio: Podcast of this topic
Chapter 6: The Intuition Overdose
By the time we reach Chapter 6, Gray is charging full steam into reductio ad absurdum territory. He leans so hard on intuition that I lost count of how many times he invokes it. The problem? He gives it too much weight while conveniently ignoring acculturation.
Yes, intuition plays a role, but it doesn’t exist in a vacuum. Enter Kahneman’s dual-system model: Gray eagerly adopts the System 1 vs. System 2 distinction, forcing his test subjects into snap moral judgments under time pressure to bypass rationalisation. Fair enough. But what he neglects is how even complex tasks can migrate from System 2 (slow, deliberate) to System 1 (fast, automatic) through repeated exposure. Kahneman’s example? Basic arithmetic. A child grappling with 1 + 1 relies on System 2, but an adult answers without effort.
And morality? The same mechanism applies. What starts as deliberation morphs into automatic response through cultural conditioning. But instead of acknowledging this, Gray behaves as if moral intuition is some mystical, spontaneous phenomenon untethered from socialization.
Let’s lay our cards on the table. I’m a moral subjectivist—actually, a moral non-cognitivist, but for simplicity’s sake, let’s not frighten the children. My stance is that morality, at its core, is subjective. However, no one develops their moral compass in isolation. Culture, upbringing, and societal narratives shape our moral instincts, even if those instincts ultimately reduce to personal sentiment.
Gray does concede that the definition of “harm” is subjective, which allows him to argue that practically any belief or action can be framed as harmful. And sure, if you redefine “harm” broadly enough, you can claim that someone’s mere existence constitutes an existential threat. Religious believers, for example, claim to be “harmed” by the idea that someone else’s non-compliance with their theological fairy tale could lead to eternal damnation.
I don’t disagree with his observation. The problem is that the underlying belief is fundamentally pathological. This doesn’t necessarily refute Gray’s argument—after all, people do experience psychological distress over imaginary scenarios—but it does mean we’re dealing with a shaky foundation. If harm is entirely perception-based, then moral arguments become arbitrary power plays, subject to the whims of whoever is best at manufacturing grievance.
And this brings us to another crucial flaw in Gray’s framework: the way it enables ideological self-perpetuation. If morality is reduced to perceived harm, then groups with wildly different definitions of harm will inevitably weaponize their beliefs. Take the religious fundamentalist who believes gay marriage is a sin that dooms others to eternal suffering. From their perspective, fighting against LGBTQ+ rights isn’t just bigotry—it’s moral duty, a battle to save souls from metaphysical harm. This, of course, leads to moral contagion, where adherents tirelessly indoctrinate others, especially their own children, ensuring the pathology replicates itself like a virus.
The Problem with Mono-Causal Explanations
More broadly, Gray’s attempt to reduce morality to a single principle—harm—feels suspiciously tidy. Morality is messy, contradictory, and riddled with historical baggage. Any theory that purports to explain it all in one neat little package should immediately raise eyebrows.
So, sorry, Kurt. You can do better. Moral psychology is a tangled beast, and trying to hack through it with a single conceptual machete does more harm than good.
Yuval Noah Harari’s Nexus is a masterclass in well-constructed rhetoric. A gifted storyteller, Harari wields his prose with the finesse of a seasoned polemicist, but his penchant for reductionism undermines the very complexity he claims to dissect. As a historian, he undoubtedly grasps the intricate web of historical causality, yet he distils it into convenient dichotomies, cherry-picking points to prop up his preferred narrative. He doesn’t just oversimplify history – he commits the cardinal sin of overfitting the past to predict the future, as though the arc of history bends neatly to his will.
Harari offers binary possibilities, but his worldview is anything but ambivalent. He is a Modernist to his core, a devoted evangelist of Progress™ with a capital P. His unwavering faith in the forward march of human civilisation betrays an almost theological zeal, as if history itself were a teleological engine hurtling toward an inevitable destiny.
More troubling, though, is his tendency to step beyond his lane, veering into the treacherous territory of the Dunning-Kruger effect. He confuses the illusion of control with actual control, mistaking correlation for causation and influence for omnipotence. The result? A grand narrative that seduces with its elegance but crumbles under scrutiny – an edifice of certainty built on the shaky foundations of conjecture.
In the end, Nexus is a fascinating read, not because it reveals an immutable truth about our future, but because it so brilliantly encapsulates the ambitions – and the blind spots – of its author.
For no apparent reason, I was pondering lost temporal words, so I paid a quick visit to ChatGPT and wanted to share the wealth. Perhaps we can bring some of these back from the grave.
Audio: Podcast prattle of this content.
Eldernight: Referring to the night before last, this term has long since faded from common usage.
Erewhile: Meaning “a little while ago” or “previously.”
Ereyesterday: Meaning “the day before yesterday” – a direct counterpart to “overmorrow.”
Erstwhile: Similar to “erewhile,” it denotes something that was formerly the case.
Fortnight: Short for “fourteen nights,” this term refers to a two-week period and is still in use in British English.
Nudiustertian: A wonderfully specific Latin-derived term meaning “of or relating to the day before yesterday.”
Overmorrow: This term was used to indicate the day after tomorrow.
Sennight: Derived from “seven nights,” this word was used to mean a week.
Umwhile: A Scottish term meaning “at times” or “occasionally” in the past.
Whilom: An archaic term meaning “formerly” or “once upon a time.”
Yore: Closely related in spirit to “whilom,” this term specifically means “of long ago” or “of time long past.”
Yestereve: This term referred to the evening of the previous day, essentially what we’d call “last night.”
Yestermorrow: An intriguing word that denoted “yesterday morning.”
Yestreen: A Scottish and Northern English word meaning “yesterday evening” or “last night” – similar to “yestereve” but with different regional origins.
These terms, though largely obsolete, offer a glimpse into the linguistic richness of earlier English, providing nuanced ways to express temporal relationships that have since been lost in modern usage.