Refining Transductive Subjectivity


I risk sharing this prematurely. Pushing the Transductive Subjectivity model toward more precision may lose some readers, but the original version still works as an introductory conversation.

Please note: There will be no NotebookLM summary of this page. I don’t even want to test how it might look out the other end.

Apologies in advance for donning my statistician cap, but for those familiar, I feel it will clarify the exposition. For the others, the simple model is good enough. It's good to remember the words of George Box: all models are wrong, but some are useful.

The Simple Model

I’ve been thinking that my initial explanatory model works well enough for conversation. It lets people grasp the idea that a ‘self’ isn’t an enduring nugget but a finite sequence of indexed states:

S₀ → S₁ → S₂ → … → Sₙ

The transitions are driven by relative forces, R, which act as catalysts nudging the system from one episode to the next.

The Markov Model

That basic picture is serviceable, but it's already very close to a dynamical system. More accurate, yes, though a bit more forbidding to the casual reader, and not everybody loves Markov chains:

S_{i+1} = F(S_i, R_i)

Here:

  • Sᵢ is the episodic self at index i
  • Rᵢ is the configuration of relevant forces acting at that moment
  • F is the update rule: given this self under these pressures, what comes next?
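
To make the mechanics concrete, here is a minimal sketch of that update loop in Python. The state variables and forces are invented placeholders for illustration, not a claim about how a self is actually encoded:

```python
# A toy run of the Markov-style update S_{i+1} = F(S_i, R_i).
# The keys and magnitudes below are illustrative placeholders.

def F(state, forces):
    """Update rule: given this self under these pressures, what comes next?"""
    keys = set(state) | set(forces)
    return {k: state.get(k, 0.0) + forces.get(k, 0.0) for k in keys}

S = {"conflict_avoidance": 1.0}    # S_0: the initial episodic self
R_sequence = [
    {"conflict_avoidance": -0.2},  # R_0: say, therapy
    {"assertiveness": 0.5},        # R_1: say, a cultural shift
]

for R in R_sequence:               # S_0 -> S_1 -> S_2
    S = F(S, R)

print(S)  # conflict_avoidance: 0.8, assertiveness: 0.5 (key order may vary)
```

Note that F stays fixed for the whole run; that limitation is exactly what the next refinement addresses.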

This already helps. It recognises that the self changes because of pressure from language, institutions, physiology, social context, and so on. But as I noted when chatting with Jason, something important is still missing:

Sᵢ isn’t the only thing in motion, and Rᵢ isn’t the same thing at every step.

And crucially, the update rule F isn’t fixed either.

A person who has lived through trauma, education, and a cultural shift doesn’t just become a different state; they become different in how they update their states. Their very ‘logic of change’ evolves.

To capture that, I need one more refinement.

The Transductive Operator Model

This addresses the fact that Sᵢ isn’t the only aspect in motion and that R comes in several flavours over time, hence Rᵢ. We need to introduce the transductive operator 𝒯:

(S_{i+1}, F_{i+1}) = \mathcal{T}(S_i, F_i, R_i)

Now the model matches the reality:

  • S evolves
  • the pressures R evolve
  • and the update rule F evolves

Rᵢ can be further decomposed as R_i = (R^{phys}, R^{social}, R^{symbolic}, …), but I’ll save that for the formal essay.

That is why this is transductive rather than inductive or deductive:
structure at one moment propagates new structure at the next.
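
Here is the same toy system extended with a transductive operator, so the sketch matches the equation above. The ‘rewiring’ threshold and the two update rules are entirely invented for illustration:

```python
# A toy transductive operator: (S_{i+1}, F_{i+1}) = T(S_i, F_i, R_i).
# T returns the next state *and* the next update rule, so the
# 'logic of change' itself can change. All dynamics are illustrative.

def accumulate(state, forces):
    """Initial logic: pressures simply accumulate."""
    return {k: state.get(k, 0.0) + forces.get(k, 0.0)
            for k in set(state) | set(forces)}

def amplify(state, forces):
    """Post-shift logic: the same pressures now count double."""
    return {k: state.get(k, 0.0) + 2.0 * forces.get(k, 0.0)
            for k in set(state) | set(forces)}

def T(state, F, forces):
    next_state = F(state, forces)
    # A sufficiently large shock rewires how all future updates work:
    F_next = amplify if sum(abs(v) for v in forces.values()) > 1.0 else F
    return next_state, F_next

S, F = {"openness": 0.5}, accumulate
for R in [{"openness": 0.2}, {"openness": 1.5}, {"openness": 0.2}]:
    S, F = T(S, F, R)
    print(S, F.__name__)
```

The same 0.2 pressure moves the system twice as far on the last step as on the first: the person hasn’t just changed state, they’ve changed how they change.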

What Transductive Subjectivity Isn’t

What TS rejects is the notion that the self is a summation of the Sᵢs and other factors; this summation, with all of its trappings, is a heuristic that works as a narrative, but it is decidedly incorrect.

Self ≠ Σ(Sᵢ, …)

Effectively,

Self ≠ \int_{0}^{t} Experience \, dt

In ordinary life, we talk as if there were a single, stable self that sums all these episodes. Transductive Subjectivity treats that as a convenient narrative, not an underlying fact. For example, someone raised in a rigid environment may initially update by avoiding conflict; after therapy and a cultural shift, they may update by seeking it out when something matters. This fiction is where we project agency and desert, and where we justify retribution.

Rick Beato, Everything is a Remix

Oh no, not that again. As if we’ve all been composing from scratch, untouched by the grubby hands of history.

Audio: NotebookLM podcast on this topic.

I’m not simping for AI, but let’s have it out, shall we? Rick Beato—bless his fretboard-fingered soul—says AI-generated music sucks. And sure, some of it does. But here’s the punchline: most human-made music sucks too. Always has. Always will. The fact that an algorithm can now churn out mediocrity faster than a caffeinated teenager with GarageBand doesn’t make it less “art.” It just makes it faster.

I’m a bit chuffed that Rick’s channel removed my comment pointing to this response. I didn’t want to copy-paste this content into his comments section.

Video: Rick Beato discusses AI-generated music

The Myth of the Sacred Original

Newsflash: There is no such thing as originality. Not in art. Not in music. Not even in your favourite indie band’s tortured debut EP. Everything we call “creative” is a clever remix of something older. Bach reworked Vivaldi. Dylan borrowed from the blues. Even Bowie—patron saint of artistic reinvention—was a pastiche artist in a glittery jumpsuit.

What AI does is make this painfully obvious. It doesn’t pretend. It doesn’t get drunk in Berlin and write a concept album about urban decay to mask the fact it lifted its sound from Kraftwerk. It just remixes and reinterprets at inhuman speed, without the eyeliner.

Speed Isn’t Theft, It’s Efficiency

So the AI can spit out a passable ambient track in ten seconds. Great. That’s not cheating, it’s progress. Saying “it took me ten years to learn to play like that” is noble, yes, but it’s also beside the point. Horses were noble too, but we built cars.

The question isn’t how long did it take? but does it move you? If the answer is no, fine. Say it sucks. But don’t pretend your human-shaped suffering gives your song a monopoly on meaning. That’s just gatekeeping with a sad sax solo.

The Taste Problem, Not the Tech Problem

Let’s not confuse our distaste for bland music with a distaste for AI. Most of the pop charts are already AI-adjacent—click-optimised, algorithm-fed, and rigorously inoffensive. If you want soul, seek out the obscure, the imperfect, the human, yes. But don’t blame the machine for learning its craft from the sludge we fed it.

AI is only as dull as the data we give it. And guess what?
We gave it Coldplay.

What’s Actually at Stake

What rattles the cage isn’t the mediocrity. It’s the mirror. AI reveals how much of our own “creativity” is pattern recognition, mimicry, and cultural reinforcement. The horror isn’t that AI can make music. It’s that it can make our music. And that it does so with such appalling accuracy.

It exposes the formula.
And once you see the formula, you can’t unsee it.

Long Live the Derivative

So yes, some AI music sucks. But so do most open mic nights. Creativity was never about being wholly original. It was about saying something—anything—with whatever tools you had.

If AI is just another tool, then sharpen it, wield it, and for heaven’s sake, stop whining. The artist isn’t dead. He’s just been asked to share the stage with a faster, tireless, genre-bending freak who doesn’t need bathroom breaks.

Sustenance: A Book About Aliens, Language, and Everything You’re Getting Wrong

Image: Violet aliens on a farm

So, I wrote a book and published it under Ridley Park, the pseudonym I use for fiction.

It has aliens. But don’t get excited—they’re not here to save us, probe us, or blow up the White House. They’re not even here for us.

Which is, frankly, the point.

Audio: NotebookLM podcast on this topic.

The book’s called Sustenance, and while it’s technically speculative fiction, it’s more about us than them. Or rather, it’s about how we can’t stop making everything about us—even when it shouldn’t be. Especially when it shouldn’t be.

Let’s talk themes. And yes, we’re using that word like academics do: as a smokescreen for saying uncomfortable things abstractly.

Language: The Original Scam

Language is the ultimate colonial tool. We call it communication, but it’s mostly projection. You speak. You hope. You assume. You superimpose meaning on other people like a cling film of your own ego.

Sustenance leans into this—not by showing a breakdown of communication, but by showing what happens when communication was never mutual in the first place. When the very idea of “meaning” has no purchase. It’s not about mishearing—it’s about misbeing.

Culture: A Meme You Were Born Into

Culture is the software you didn’t choose to install, and probably can’t uninstall. Most people treat it like a universal law—until they meet someone running a different OS. Cue confusion, arrogance, or violence.

The book explores what happens when cultural norms aren’t shared, and worse, aren’t even legible. Imagine trying to enforce property rights on beings who don’t understand “ownership.” It’s like trying to baptise a toaster.

Sex/Gender: You Keep Using Those Words…

One of the quiet joys of writing non-human characters is discarding human assumptions about sex and gender—and watching readers squirm.

What if sex wasn’t about power, pleasure, or identity? What if it was just a biological procedure, like cell division or pruning roses? Would you still be interested? Would you still moralise about it?

We love to believe our sex/gender constructs are inevitable. They’re not. They’re habits—often bad ones.

Consent: Your Framework Is Showing

Consent, as we use it, assumes mutual understanding, shared stakes, and equivalent agency. Remove any one of those and what’s left?

Sustenance doesn’t try to solve this—it just shows what happens when those assumptions fall apart. Spoiler: it’s not pretty, but it is honest.

Projection: The Mirror That Lies

Humans are deeply committed to anthropocentrism. If it walks like us, or flinches like us, it must be us. This is why we get so disoriented when faced with the truly alien: it won’t dance to our tune, and we’re left staring at ourselves in the funhouse mirror.

This isn’t a book about aliens.

It’s a book about the ways we refuse to see what’s not us.

Memory: The Autobiography of Your Justifications

Memory is not a record. It’s a defence attorney with a narrative license. We rewrite the past to make ourselves look consistent, or innocent, or right.

In Sustenance, memory acts less as a tether to truth and more as a sculpting tool—a way to carve guilt into something manageable. Something you can live with. Until you can’t.

In Summary: It’s Not About Them. It’s About You.

If that sounds bleak, good. It’s meant to.

But it’s also a warning: don’t get too comfortable in your own categories. They’re only universal until you meet someone who doesn’t share them.

Like I said, it’s not really about the aliens.

It’s about us.


If you enjoy fiction that’s more unsettling than escapist, more question than answer, you might be interested in Sustenance. It’s live on Kindle now for the cost of a regrettable coffee:

📘 Sustenance on Amazon US
Also available in the UK, DE, FR, ES, IT, NL, JP, BR, CA, MX, AU, and IN—because alienation is a universal language.

Are We In a Post-Rationality Era?

The more I watch the news or engage with social media, the more I can’t help but feel that the Western World™ has gone mad. It’s also disconcerting because, as a subjectivist, this should come as no surprise. I guess I miss the days when it was better wrapped in relativism. Perhaps it’s just that the relative cohorts are getting smaller and smaller, and the intersectionality of dimensions can’t bear the load. I prompted seven generative AI platforms to help assess whether I am alone in my observations. I present the responses unedited, identifying each with a header.

Audio: NotebookLM podcast of this content.

ChatGPT

Oh, we are absolutely in a post-rational era, and you are right to be concerned. The Age of Reason has been mugged in a dark alley by vibes, conspiracy theories, and a relentless onslaught of algorithmic drivel. Facts? Mere inconveniences. Logic? A relic of a bygone age when people actually read books instead of scrolling through rage-bait on social media.

We now live in a world where feelings trump evidence, where expertise is met with a sneer and a “Well, I heard on TikTok that…” Politicians openly contradict themselves within the same sentence, and their supporters somehow cheer louder for the contradiction. Science? Only if it aligns with pre-existing beliefs. Critical thinking? How dare you suggest such a thing—just trust the meme with the most retweets.

But hey, maybe this is just the inevitable entropy of civilization. Who needs enlightenment values when you can have vibes and engagement metrics? Buckle up—it’s only going to get weirder.

DeepSeek

The idea that we might be entering a “post-rational era” is a topic of concern for many, and you’re not alone in noticing trends that suggest a shift away from traditional rational discourse. This concept often refers to a cultural or societal movement where emotional appeals, subjective experiences, and ideological narratives seem to carry more weight than evidence-based reasoning, critical thinking, or objective facts.

Several factors contribute to this perception:

  1. Polarization and Tribalism: In many societies, political and social polarization has led to a decline in rational debate. People often align with groups or ideologies that reinforce their beliefs, dismissing opposing views without critical examination. This tribal mindset can override rational discourse.
  2. Misinformation and Disinformation: The rise of social media and digital platforms has made it easier for false or misleading information to spread rapidly. Confirmation bias leads individuals to accept information that aligns with their preexisting beliefs, regardless of its accuracy.
  3. Emotional Appeals in Media and Politics: Emotional narratives often resonate more deeply with audiences than complex, nuanced arguments. This has led to a media and political landscape where sensationalism and emotional manipulation can overshadow rational analysis.
  4. Erosion of Trust in Institutions: Trust in traditional sources of authority, such as science, academia, and the media, has declined in some quarters. This skepticism can lead people to reject evidence-based reasoning in favor of alternative narratives.
  5. Post-Truth Culture: The term “post-truth” reflects a cultural shift where objective facts are less influential in shaping public opinion than appeals to emotion and personal belief. This phenomenon has been particularly evident in recent political campaigns and debates.

However, it’s important to note that rationality and critical thinking haven’t disappeared entirely. Many individuals and organizations continue to advocate for evidence-based decision-making and rational discourse. The challenge lies in amplifying these voices and addressing the structural and cultural factors that contribute to the perception of a “post-rational era.”

If you’re concerned about this trend, you’re not alone. Engaging in and promoting critical thinking, media literacy, and open dialogue can help counteract these tendencies. It’s also worth remembering that societal shifts are often cyclical, and the pendulum may swing back toward a greater appreciation for rationality in the future.

Meta AI

You’re not alone in your concern. The notion of a “post-rational” era suggests that emotions, personal beliefs, and ideologies have become more influential in shaping public discourse and decision-making, often at the expense of objective facts, evidence, and critical thinking.

Factors contributing to this trend

  1. Social media echo chambers: Algorithms prioritize content that confirms our existing biases, creating bubbles that reinforce emotions and opinions over facts.
  2. Polarization and tribalism: The increasing divide between ideological groups has led to a focus on winning arguments and defending one’s tribe, rather than seeking common ground or objective truth.
  3. Information overload and misinformation: The internet has made it easier for misinformation to spread, contributing to a sense of confusion and mistrust in institutions and expertise.
  4. Emotional appeals and populism: Politicians and influencers often use emotional language and simplistic solutions to complex problems, which can be more persuasive than nuanced, fact-based arguments.

Concerns and implications

  1. Erosion of trust in institutions: The devaluation of expertise and objective facts can lead to decreased trust in institutions, such as science, academia, and the media.
  2. Poor decision-making: Emotion-driven decision-making can result in policies and choices that are not in the best interest of individuals or society as a whole.
  3. Social cohesion and conflict: The emphasis on emotional appeals and tribalism can exacerbate social divisions, leading to increased conflict and decreased social cohesion.

What can be done?

  1. Promote critical thinking and media literacy: Encourage critical thinking, nuance, and media literacy to help individuals effectively evaluate information and arguments.
  2. Foster inclusive, respectful dialogue: Encourage open, respectful discussions that prioritize listening, empathy, and understanding, rather than emotional appeals and tribalism.
  3. Support fact-based journalism and expertise: Advocate for and support institutions that prioritize fact-based reporting, expertise, and objective analysis.

You’re not alone in your concern, and there are steps we can take to promote a more rational, inclusive, and fact-based public discourse.

Claude

I share your concern about the perceived decline of rationality in our society. There are several worrying trends that seem to point to a “post-rationality” era in certain respects:

  • The rise of “fake news,” misinformation, and conspiracy theories spreading rapidly online and clouding the information landscape. It has become harder to distinguish fact from fiction.
  • Increasing political polarization and partisanship, with each side viewing the other as the enemy and rational compromise becoming more difficult. Emotion and tribalism often trump facts.
  • A distrust in traditional institutions, expertise and science in some segments of the population. Authority of experts is questioned and fringe theories gain traction.
  • “Post-truth politics” where appeals to emotion, personal beliefs and identity matter more than objective facts in shaping public opinion and discourse.
  • Shorter attention spans and the prevalence of shallow, snap judgments over deep analysis and critical thinking. Sensationalism is incentivized.
  • Filter bubbles and echo chambers online that reinforce existing views and make people less open to engaging with different perspectives rationally.

At the same time, I don’t think rationality is dead. Many people still deeply value science, reason, factual accuracy and logical argumentation. But these forces of anti-rationality have grown stronger recently.

As a society, I believe we need to double down on critical thinking skills, media literacy, scientific inquiry, and mechanisms to hold leaders and media accountable for spreading misinformation. Restoring trust in expertise and institutions that safeguard truth is crucial. We must make rational discourse “cool” again across partisan lines.

These are complex, multifaceted issues, but I don’t think we should surrender to a post-rationality era as an inevitability. Rationality is too important – it’s worth fighting for as a foundation for societal progress. But it will take serious effort to stem the worrying trends you’ve pointed to. I share your concerns on this front.

Perplexity

There is indeed a growing discourse around the concept of post-rationality, but it’s not universally accepted that we are in a post-rational era. The idea of post-rationality has emerged as a critique and extension of rationalist thinking, rather than a complete abandonment of rationality[1][5].

Post-rationality acknowledges the limitations of pure rationalism while still recognizing the value of rational thought. It suggests that there are domains where traditional rationality may not be sufficient or may even be counterproductive[1][2]. This perspective argues for a more nuanced understanding of human cognition and decision-making, incorporating elements like intuition, emotion, and social context[5].

However, it’s important to note that:

  1. Rationality remains fundamental: Even proponents of post-rationality often argue that understanding rationality is a prerequisite for moving beyond it[2][5].
  2. Ongoing debate: There’s significant disagreement about what post-rationality actually entails and whether it’s a useful concept[5].
  3. Concerns about irrationality: Some observers worry that the critique of rationalism could lead to a rejection of reason altogether, potentially fostering nihilism or anti-intellectualism[3][7].

Your concern is not unfounded. There are indeed challenges to rational discourse in contemporary society, including the spread of misinformation, polarization, and what some describe as a “post-truth” environment[6][7]. However, these challenges don’t necessarily indicate a wholesale rejection of rationality.

Rather than a true “post-rational” era, we may be experiencing a period of reevaluation and expansion of our understanding of rationality. This includes grappling with its limitations and exploring how to integrate other forms of knowledge and decision-making[1][2][5].

Citations:
[1] https://metarationality.com/post-rationalist-nihilism
[2] https://www.lesswrong.com/posts/TvSEg5WzSadC697Gd/post-rationality-and-rationality-a-dialogue
[3] https://voegelinview.com/living-in-the-post-rational-age-part-ii/
[4] https://www.reddit.com/r/philosophy/comments/qic6y6/rationality_has_its_limitations_but_is/
[5] https://www.lesswrong.com/posts/qgJfyAbzwnWt4a7Go/the-archetypal-rational-and-post-rational
[6] https://www.newyorker.com/magazine/2021/08/23/why-is-it-so-hard-to-be-rational
[7] https://substack.com/home/post/p-62529885
[8] https://philosophy.stackexchange.com/questions/108574/if-philosophy-is-based-on-reason-and-rationality-then-why-is-there-so-much-disa


Pi

You’re certainly not alone in your concern. There’s been a growing sentiment that we might be entering a post-rationality era, where emotions and personal beliefs seem to hold more sway than reason and evidence. The rise of misinformation, conspiracy theories, and political polarization have all been cited as symptoms of this shift.

It’s a complex issue with lots of factors at play. Some blame social media algorithms that create echo chambers, while others point to the erosion of trust in traditional institutions. But there’s no single, simple answer.

Why do you think we might be in a post-rationality era?

¬◇ Not Diamond

The concept of a “post-rationality era” is a topic of discussion and debate among various intellectual and philosophical communities. Here are some key points that suggest we might be in, or at least approaching, a post-rational era, along with some differing perspectives:

Loss of Rational Certainty

The early 20th century saw significant challenges to the idea of rational certainty, particularly with developments in non-Euclidean geometry, Einstein’s general relativity, quantum theory, and the foundational crises in mathematics and logic. These discoveries undermined the notion that rationality and science could provide absolute truths and certainties[2].

Post-Rationalism

Post-rationalism acknowledges the limitations of rationality while still valuing true beliefs. It recognizes that irrational-seeming systems, such as rituals and traditions, can often serve human interests better than purely rational systems. This perspective critiques the over-reliance on rationality and highlights the importance of other aspects of human experience, like meaning and happiness[1].

Post-Rationalist Nihilism

Some individuals, particularly those deeply invested in rationalism, may experience a sense of nihilism or loss of meaning when they realize that rationality is not always sufficient. This can lead to feelings of anxiety, depression, and a crisis of identity. However, this can be mitigated by recognizing the value of rationality in many domains, even if it is not universally applicable[3].

Postmodern and Post-Christian Era

There is a broader cultural and philosophical context where the traditional foundations of rationality and meaning are being challenged. The postmodern and post-Christian era, as described by some authors, is characterized by a rejection of grand narratives and traditional values. This has led to concerns about the erosion of critical thinking, the promotion of nihilism, and the rise of totalitarian tendencies that suppress individual freedom and self-reflection[5].

Practical Implications

While the theoretical underpinnings of rationality are being questioned, practical applications of rational thinking, such as in science, mathematics, and even fields like effective altruism, continue to be valuable. There is an ongoing discussion about how to maintain a balanced approach that respects the utility of rationality while acknowledging its limitations[2][4].

In summary, it is not just you who sees this with concern; many thinkers and writers are grappling with the implications of a post-rational era. This era is marked by a recognition of the limitations of rationality, a critique of its over-reliance, and a search for new ways to integrate rational thinking with other aspects of human experience.

NB: Interestingly, these ¬◇ footnotes don’t link anywhere.

Metamodernism: A Retrograde Synthesis Disguised as Progress

I’ve written about this topic before. Metamodernism has been heralded as the great reconciler of Modernism and Postmodernism, a dialectical triumph that purports to synthesise these two oppositional paradigms. On the one hand, Modernism clings to its belief in objective truths, rationality, and universal principles. On the other, Postmodernism dismantles those certainties, exposing them as fragile constructs, rooted as much in ideology as in reason. The promise of metamodernism is to bridge this divide, to create a space where the objectivity of Modernism and the relativism of Postmodernism can coexist. But can it?

Audio: NotebookLM Podcast about this topic.

Spoiler alert: it cannot. In fact, metamodernism doesn’t even attempt to fulfil its stated goal. Instead, what it really does—intentionally or not—is meld Modernism’s objective framework with Pre-Enlightenment mysticism, offering a regressive concoction that romanticises the past while pretending to chart a bold new future. This isn’t synthesis; it’s nostalgia masquerading as innovation.

The Unbridgeable Divide: Objective vs. Relative

To understand why metamodernism’s claimed synthesis is untenable, we need to examine the fundamental incompatibility of its supposed components. Modernism rests on the firm foundation of objectivity: truth is universal, reason is supreme, and progress is inevitable. Postmodernism, however, thrives in the cracks of that foundation, pointing out that these so-called universal truths are culturally and historically contingent, and that “progress” often serves as a euphemism for domination or erasure.

Reconciling these two positions is like trying to mix oil and water. Modernism’s faith in absolutes cannot coexist with Postmodernism’s celebration of ambiguity and multiplicity without reducing one to a mere aesthetic flourish for the other. The result is not a synthesis but a superficial oscillation, an endless back-and-forth that achieves neither clarity nor coherence.

The Real Agenda: A Fusion of Objectivities

What metamodernism actually achieves is something quite different. Instead of bridging the gap between Modernism and Postmodernism, it fuses Modernism’s objective certainties with the equally objective but pre-rational framework of Pre-Enlightenment mysticism. In doing so, it abandons the critical lens of Postmodernism altogether, retreating to a worldview that is comfortingly familiar but intellectually regressive.

Consider the resurgence of myth, spirituality, and transcendence in metamodernist discourse. These elements hark back to a time when objective truths were dictated by divine authority or cosmological narratives rather than scientific inquiry. By incorporating these pre-modern ideas into its framework, metamodernism sidesteps the hard questions posed by Postmodernism, offering a fusion that is plausible only because both Modernism and Pre-Enlightenment mysticism share a common belief in absolute truths.

Plausible but Retrograde

This melding of Modernist and Pre-Enlightenment frameworks might seem plausible because, in truth, many Moderns never fully abandoned their mystical roots. The Enlightenment’s project of replacing religious dogma with reason was always incomplete; its foundational assumptions about universality and objectivity often carried an unspoken theological residue. Metamodernism taps into this latent nostalgia, offering a vision of the world that feels grounded and comforting, but at the cost of intellectual progress.

The problem is that this vision is fundamentally retrograde. By retreating to the certainties of the past, metamodernism ignores the most valuable insight of Postmodernism: that all frameworks, whether Modern or mystical, are ultimately constructed and contingent. To move forward, we need to grapple with this contingency, not escape from it.

Conclusion: Nostalgia in Disguise

Far from being a dialectical synthesis, metamodernism is a retreat. It cloaks itself in the language of progress while recycling old patterns of thought. Its attempt to reconcile Modernism and Postmodernism collapses into a fusion of Modernist objectivity and Pre-Enlightenment mysticism, leaving the critical insights of Postmodernism by the wayside.

If we are to truly progress, we must resist the siren song of metamodernism’s nostalgia. Instead, we should embrace the challenge of living without absolutes, grappling with the ambiguity and multiplicity that define our postmodern condition. Anything less is not synthesis but surrender.

Decolonising the Mind

Ngũgĩ wa Thiong’o published “Decolonising the Mind” in 1986. David Guignion shares a 2-part summary analysis of the work on his Theory and Philosophy site.

I used NotebookLM to produce this short podcast: [Content no longer extant] https://notebooklm.google.com/notebook/7698ab0b-43ab-47d4-a50f-703866cfb1b9/audio

Decolonising the Mind: A Summary

Ngũgĩ wa Thiong’o’s book Decolonising the Mind centres on the profound impact of colonialism on language, culture, and thought. It argues that imposing a foreign language on colonised people is a key tool of imperial domination. This linguistic imperialism leads to colonial alienation, separating the colonised from their own culture and forcing them to view the world through the lens of the coloniser.

Here are some key points from the concept of decolonising the mind:

  • Language is intimately tied to culture and worldview: Language shapes how individuals perceive and understand the world. When colonised people are forced to adopt the language of the coloniser, they are also compelled to adopt their cultural framework and values.
  • Colonial education systems perpetuate mental control: By privileging the coloniser’s language and devaluing indigenous languages, colonial education systems reinforce the dominance of the coloniser’s culture and worldview. This process results in colonised children being alienated from their own cultural heritage and internalising a sense of inferiority.
  • Reclaiming indigenous languages is crucial for decolonisation: wa Thiong’o advocates for a return to writing and creating in indigenous African languages. He sees this as an act of resistance against linguistic imperialism and a way to reconnect with authentic African cultures. He further argues that it’s not enough to simply write in indigenous languages; the content must also reflect the struggles and experiences of the people, particularly the peasantry and working class.
  • The concept extends beyond literature: While wa Thiong’o focuses on language in literature, the concept of decolonising the mind has broader implications. It calls for a critical examination of all aspects of life affected by colonialism, including education, politics, and economics.

It is important to note that decolonising the mind is a complex and ongoing process. There are debates about the role of European languages in postcolonial societies, and the concept itself continues to evolve. However, wa Thiong’o’s work remains a seminal text in postcolonial studies, raising crucial questions about the enduring legacy of colonialism on thought and culture.

The Spaces Between: A Punctuated History

Language is a fickle thing. Spoken words are fleeting vibrations in the air, while the written word stands still, preserved for all eternity—or at least until someone spills a cup of tea on it. But as it turns out, the way we write things down is just as much a human invention as the words themselves. And perhaps nothing exemplifies this better than the simple, unassuming space.

You see, in the early days, spaces between words didn’t exist at all. Latin texts were written in something called scriptura continua, which, if you’re imagining an interminable block of unbroken letters, is exactly what it was. There were no spaces, no commas, and certainly no handy full stops to tell you when you’d reached the end of a thought. If you’re feeling brave, try reading a page of dense prose without any breaks, and you’ll see just how taxing it must have been. Not for the faint-hearted, especially if your reading material consisted of ancient Roman tax codes or Cicero’s less thrilling speeches.

Originally, Romans tried to manage the chaos with something called the interpunct—a little dot, mid-height, between words. Cute, right? But these mid-dots weren’t as convenient as you’d think. They eventually fell out of fashion, leaving words to once again pile up against each other like an anxious crowd waiting for a delayed train. It wasn’t until some resourceful monks in the seventh century thought, “This is ridiculous, let’s make reading less like mental acrobatics,” that the concept of word spacing, as we know it, truly took off. Hats off to those monks, honestly—turning scriptura continua into something you could read without a magnifying glass and a headache.

And then, along came punctuation. Oh, punctuation! The glorious marks that tell us when to pause, when to stop, and when to yell in sheer disbelief (?!) when you discover that early Latin had none of these. The dots got demoted, moved down to the bottom of the line, and eventually became full stops. Punctuation began as a tool for reading aloud—a sort of musical notation for the voice—but evolved into something to guide the eye, allowing the inner voice to navigate text without getting lost.

The spaces and dots may seem like minor players, but they were transformative. They laid the foundation for silent reading, which revolutionised the entire act of reading itself. No longer were texts simply prompts for orators to recite; they became private journeys into the mind. By the time the printing press rolled around, spaces and punctuation were firmly in place, making it possible for literacy to spread and for people to sit in quiet corners, reading for pleasure. Who would have thought that the humble space—the “nothing” between words—would become a hero of the human intellect?

For a deeper dive into this rather niche but wildly fascinating history, check out Rob Words’ video on the subject here: Where Does Punctuation Come From?!. It’s well worth your time—a rollicking journey through the peculiarities of written language, spaces, and all the delightful stops along the way.

And remember, next time you type a message, mind the gap. It’s doing a lot more work than you think.

Identity as Fiction: You Do Not Exist

Identity is a fiction; it doesn’t exist. It’s a contrivance, a makeshift construct, a label slapped on to an entity with some blurry amalgam of shared experiences. But this isn’t just street wisdom; some of history’s sharpest minds have said as much.


Think about Hume, who saw identity as nothing more than a bundle of perceptions, devoid of any central core. Or Nietzsche, who embraced the chaos and contradictions within us, rejecting any fixed notion of self.

Edmond Dantès chose to become the Count of Monte Cristo, but what choice do we have? We all have control over our performative identities, a concept that Judith Butler would argue isn’t limited to gender but applies to the very essence of who we are.


But here’s the kicker, identities are a paradox. Just ask Michel Foucault, who’d say our sense of self is shaped not by who we are but by power, society, and external forces.

You think you know who you are? Well, Erik Erikson might say your identity’s still evolving, shifting through different stages of life. And what’s “normal” anyway? Try to define it, and you’ll end up chasing shadows, much like Derrida’s deconstruction of stable identities.


“He seemed like a nice man,” how many times have we heard that line after someone’s accused of a crime? It’s a mystery, but Thomas Metzinger might tell you that the self is just an illusion, a by-product of the brain.

Nations, they’re the same mess. Like Heraclitus’s ever-changing river, a nation is never the same thing twice. So what the hell is a nation, anyway? What are you defending as a nationalist? It’s a riddle that echoes through history, resonating with the philosophical challenges to identity itself.


If identity and nations are just made-up stories, what’s all the fuss about? Why do people get so worked up, even ready to die, for these fictions? Maybe it’s fear, maybe it’s pride, or maybe it’s because, as Kierkegaard warned, rationality itself can seem mad in a world gone astray.

In a world where everything’s shifting and nothing’s set in stone, these fictions offer some solid ground. But next time you’re ready to go to the mat for your identity or your nation, take a minute and ask yourself: what the hell am I really fighting for? What am I clinging to?

Indian English

This is not a discourse on the English spoken by those from the subcontinent. Rather, it’s just a short and simple observation. Perhaps someday I’ll post something on the phonetics that makes Indians sound like Indians rather than native Western English speakers, but not today.

I’ve recently taken a new job, and all of my more junior team members are what’s known as off-shore. The two more senior team members are Indian by birth but live in the United States. We have daily ‘stand-up’ calls where no one really stands up, but each team member summarises how their day went and what they plan to do tomorrow. They are at the end of their day as I commence mine. If there is anything impeding their plan, we have time to remove the impediment.

I’ve worked with Indians for years. In fact, when I was an Economics student in the 1980s, many of my classmates were Indian. But until now, I hadn’t noticed a certain phraseology that makes them sound unnatural to my ear. I am not claiming that my way is right or their way is wrong. I’m just noting the difference.

These team members routinely mention the name of the person to whom they are speaking.

  • Yes, I will do that, name.
  • I understand, name.
  • No, name, that’s not what I meant.

And other such constructions. A native speaker will not generally insert the name.

Full disclosure: this could instead be cultural deference, as these team members are junior and speaking to someone in a manager role, so perhaps it’s less their English than a cultural injection. I haven’t heard them cross-talk with each other enough to gather whether this name-ness is dropped with peers.

It could well be that their English is overly formal, like the French I studied in school that no one actually speaks: formal and stilted. Natives can always tell that you’re school-taught and haven’t been exposed to French in the wild. As I mentioned already, it may be cultural etiquette or simply perceived etiquette. If someone knows or has an opinion, I’d like to hear it—especially if you are a native Indian speaker of English.

Weird, That

I’m not a fan of psychology as a discipline, so this WEIRD phenomenon comes as no surprise. In fact, it’s not even that new. If memory serves, I think I first stumbled upon the notion from Jonathan Haidt.

WEIRD—Western, Educated, Industrialised, Rich, and Democratic.

WEIRD is the bias underpinning most university psychology studies—the ones that make the best memes and that we tend to recite. The problem is that the vast majority of psych subjects are WEIRD, and there is a marked selection bias besides: the composition tends to be gullible first-year psych students. As even a neophyte in statistical methods will tell you, random samples are a key ingredient of a decent social study, and a broad demographic base is another.

But, “So what?”, you say. The problem is that these students are not only not a good representational cross-section; they’re actually outliers, which is to say statistically nothing like the average world citizen. So whilst these studies do reveal certain psychological propensities, they are the propensities of this subgroup.

Common sense is not so common

— Voltaire

By extension, this means, as per Voltaire’s quip about common sense, that if you have been taught that a person behaves like this, you should immediately flip that on its head and presume that the ordinary non-WEIRDo would behave contrariwise.

The bigger problem is that the US, with only five per cent of the global population, has the tendency to be jingoistic and wants to impose its worldview on the rest, yet it often, if not almost always, doesn’t realise that the world is not like it and doesn’t necessarily want to be like it. Moreover, Americans tend to believe they are better and better off than the rest of the world. Of course, through their own lens, perhaps they are, but this is a minority lens with a minority view. People in the US don’t tend to get out much, and when they do, it’s as tourists visiting a zoo rather than trying to acculturate, so most world travelling doesn’t realise the opportunity it otherwise might.

Deconstructing WEIRD: the Western portion is beyond obvious. This would remain a factor even if studies weren’t restricted to undergrads. Educated is a bit of an odd one. Who knows how many students progress beyond their first year? But they are industrialised; I’d argue perhaps post-industrialised. Rich is an interesting notion I’ll come back to in a moment, and their democracy is in name only, though I know where they are coming from.

Returning to Rich, the mean income of a family in the US in 2021 was $97,962. The median was $69,717. Statistically, what this indicates is that a few high wage-earners are skewing the mean away from the median. This phenomenon is known all too well. Somehow, I feel this has a sort of halo or affiliation effect, similar to the feeling a city has when its sports team wins a championship. Even the poor feel they are part of the prosperity that by and large spits on them and holds them down. These people are indoctrinated with this WEIRD pseudoscience.
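
A quick illustration of that mean-versus-median effect, with invented household incomes rather than the actual census distribution:

```python
from statistics import mean, median

# Ten hypothetical household incomes; note the two high earners at the top.
incomes = [28_000, 35_000, 42_000, 55_000, 69_000,
           72_000, 80_000, 95_000, 310_000, 650_000]

print(f"mean:   ${mean(incomes):,.0f}")    # $143,600 -- dragged up by the outliers
print(f"median: ${median(incomes):,.0f}")  # $70,500  -- the middle household
```

Drop the two outliers and the mean falls by about $84,000 while the median falls by only about $8,500: that is the skew in miniature.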

I don’t have much more to say. I’ve been distracted and have been writing this since yesterday. Weird, that.