Midjourney Boundaries

I promise that this will not become a hub for generative AI. Rather than return to editing, I wanted to test more of Midjourney’s boundaries.

It turns out that Midjourney is selective about the nudity it renders. I was denied a render because of cleavage, but full-on topless – no problem.

Both of these videos originate from the same source image, but they take different paths. There is no accompanying audio content. The setup features three women in the frame with a mechanical arm. I didn’t prompt for it. I’m not even sure of its intent. It’s just there, shadowing the women nearest to it. I don’t recall prompting for the oversized redhead in the foreground, though I may have.

In both images, note the aliasing of the tattoos on the blonde, especially on her back. Also, notice that her right arm seems shorter than it should be. Her movements are jerky, as if rendered in a video game. I’m not sure what ritual the two background characters are performing, but notice in each case the repetition. This seems to be a general feature of generative AI. It gets itself into loops, almost autistic.

Notice a few things about the top render.

Video: Midjourney render of 3 females and a mechanical arm engaging in a ritual. (9 seconds)

The first video may represent an interrogation. The blonde woman on the left appears to be a bit disoriented, but she is visually tracking the woman on the right. She seems to be saying something. Notice when the woman on the right stands. Her right foot lands unnaturally. She rather glitches.

The camera’s push and pull, and then push, seems to be an odd directorial choice, but who am I to say?

Video: Midjourney render of 3 females and a mechanical arm engaging in a ritual. (12 seconds)

The second video may represent taunting. The woman on the left still appears to be a bit disoriented, but she checks the redhead in the foreground with a glance. Notice the rocking of the two background characters, as well as the mech arm, which sways in sync with the woman on the right. This is the repetition glitch I mentioned above.

Here, the camera seems to have a syncopated relationship with the characters’ sway.

Summary

The stationary objects are well-rendered and persistent.

Assignment

Draft a short story or flash fiction using this as an inspirational prompt. I’m trying to imagine the interactions.

  • The ginger seems catatonic or drugged. Is she a cis female? What’s with her getup?
  • The blonde seems only slightly less out of it. Did she arrive this way? Did they dress her? Why does she appear to still have a weapon on her back? Is it a weapon or a fetter? Why is she dressed like that? Is she a gladiatrix readying for a contest? Perhaps she’s in training. What is she saying? Who is she talking to? What is her relationship to the redhead? Are they friends or foes – or just caught up in the same web?
  • What is the woman wearing the helmet doing? She appears to have the upper hand. Is she a cyborg, or is she just wearing fancy boots? What’s with her outfit? What’s with her Tycho Brahe prosthetic nose piece?
  • What is that mechanical hand? Is it a guard? A restraint? Is it hypnotising the ginger? Both of them? Is it conducting music that’s not audible?
  • What’s it read on the back wall? The two clips don’t share the same text. Call the continuity people.

Conscious of Consciousness

Let us begin with the heresy: consciousness is not a thing. It is not a light bulb switched on in the mind. It is not a theatre with a little homunculus watching the play unfold. It is not a ghost in the machine, nor even a particularly welcome tenant. Consciousness is a conjuring trick – one so convincing that even the conjurer forgets it is an act.

Video: Related Topic: IAI Joscha Bach on Consciousness

If that unsettles you, good. Welcome to the simulacrum.

The Wetness of Mind

We often hear that consciousness is “emergent,” but the term is used so promiscuously that it risks becoming decorative. So let us be specific. Consciousness, if it is emergent, is emergent as wetness is from H2O: not in the hydrogen or the oxygen, but in their relationship when bonded just so. Joscha Bach and others argue that consciousness arises not from the bits, but from the dance – the recursive feedback loops and predictive models running atop the neural substrate.

Audio: NotebookLM podcast on this topic.

In this view, the self is not the pilot but the dashboard. It is the user interface the brain conjures to coordinate action, interpret input, and maintain internal coherence. Not because it’s real, but because it’s useful. You are a GUI with delusions of grandeur.

The Cast of Theorists

Let us now parade the usual suspects:

  • Joscha Bach: Consciousness is a virtual self-model, emergent from recursive, computational feedback. Not the product of neurons firing per se, but of their ability to simulate a stable identity across time.
  • Thomas Metzinger: There is no self. Only a Phenomenal Self-Model (PSM) which becomes phenomenally transparent when the system no longer recognises it as a model. Consciousness is the experience of this hallucinated self.
  • Daniel Dennett: Dismantles the notion of a “central experiencer” with his Multiple Drafts Model. Consciousness is a narrative, a distributed process where drafts of experience compete, are edited, and retroactively interpreted.
  • David Chalmers: Waves his flag at the Hard Problem of consciousness. You can explain behaviour, memory, attention—but not experience itself. He flirts with dualism and panpsychism while insisting there’s a gap science cannot yet close.
  • Giulio Tononi: Gives us Integrated Information Theory (IIT) and the elusive metric Φ (phi). Consciousness is the degree to which information is unified within a system. Your brain is conscious because its parts can’t be reduced without losing coherence.
  • Karl Friston: The prophet of Free Energy Minimisation. Consciousness is an emergent property of systems that seek to reduce prediction error. The brain is a Bayesian engine, and the self is its best guess about how to survive.
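
Friston’s actual mathematics is formidable, but the core intuition in that last bullet, a system nudging its internal model to shrink prediction error, can be caricatured in a few lines of Python. This is a toy sketch built on invented assumptions (a single scalar belief, Gaussian sensory noise, a fixed learning rate), not the Free Energy Principle proper:

```python
import random

# Toy sketch (an illustrative assumption, not Friston's formalism):
# an agent holds one scalar belief about a hidden quantity and nudges
# it after each noisy observation to reduce its prediction error.
random.seed(0)

hidden_state = 5.0      # the "world" the agent is modelling
belief = 0.0            # the agent's internal best guess
learning_rate = 0.1

errors = []
for _ in range(200):
    observation = hidden_state + random.gauss(0, 0.5)  # noisy sensory input
    prediction_error = observation - belief
    belief += learning_rate * prediction_error         # error-minimisation step
    errors.append(abs(prediction_error))
```

Run it and the belief drifts from 0 toward the hidden value of 5, with the per-step errors shrinking along the way: prediction-error minimisation in miniature, and the self, on Friston’s picture, is just a grander version of `belief`.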

So What Is Consciousness?

A hallucination. A recursive illusion. A predictive dashboard. A statistical artefact. A phi score. A phenomenally transparent model. Take your pick.

None of these theories fully agree, but most converge on one elegant horror: you are not what you think you are. The sense of being a continuous, stable, indivisible “I” is a construction. A simulation. The dream from which there is no waking because waking is part of the dream.

This is not despair; it is clarity. Just as wetness does not cry when told it is not a substance, the self need not mourn its own illusion. It is a marvellous fiction, worth inhabiting.

Conclusion: Through the Mirror

To be conscious of consciousness is to stand in the hall of mirrors and realise none reflect the original—because there is no original. The mirror is the thing.

But if the theatre is empty, the play goes on. Scripts are written, models simulated, selves performed. And perhaps, in this strange recursion, we find not meaning, but the possibility of coherence.

So raise a glass to the illusion. May your predictive model stay optimised, your narrative stay plausible, and your hallucinated self remain just this side of transparent.


For further hallucinatory episodes, consult your local philosopher, neuroscientist, or AI researcher. Side effects may include derealisation, epistemic vertigo, and mild enlightenment.

Semantic Drift: When Language Outruns the Science

Science has a language problem. Not a lack of it – if anything, a surfeit. But words, unlike test tubes, do not stay sterile. They evolve, mutate, and metastasise. They get borrowed, bent, misused, and misremembered. And when the public discourse gets hold of them, particularly on platforms like TikTok, it’s the language that gets top billing. The science? Second lead, if it’s lucky.

Semantic drift is at the centre of this: the gradual shift in meaning of a word or phrase over time. It’s how “literally” came to mean “figuratively,” how “organic” went from “carbon-based” to “morally superior,” and how “theory” in science means robust explanatory framework but in the public square means vague guess with no homework.

In short, semantic drift lets rhetoric masquerade as reason. Once a word acquires enough connotation, you can deploy it like a spell. No need to define your terms when the vibe will do.

Audio: NotebookLM podcast on this topic.

When “Vitamin” No Longer Means Vitamin

Take the word vitamin. It sounds objective. Authoritative. Something codified in the genetic commandments of all living things. (reference)

But it isn’t.

A vitamin is simply a substance that an organism needs but cannot synthesise internally, and must obtain through its diet. That’s it. It’s a functional definition, not a chemical one.

So:

  • Vitamin C is a vitamin for humans, but not for dogs, cats, or goats. They make their own. We lost the gene. Tough luck.
  • Vitamin D, meanwhile, isn’t a vitamin at all. It’s a hormone, synthesised when sunlight hits your skin. Its vitamin status is a historical relic – named before we knew better, and now marketed too profitably to correct.

But in the land of TikTok and supplement shelves, these nuances evaporate. “Vitamin” has drifted from scientific designation to halo term – a linguistic fig leaf draped over everything from snake oil to ultraviolet-induced steroidogenesis.

The Rhetorical Sleight of Hand

This linguistic slippage is precisely what allows the rhetorical shenanigans to thrive.

In one video, a bloke claims a burger left out for 151 days neither moulds nor decays, and therefore, “nature won’t touch it.” From there, he leaps (with Olympic disregard for coherence) into talk of sugar spikes, mood swings, and “metabolic chaos.” You can almost hear the conspiratorial music rising.

The science here is, let’s be generous, circumstantial. But the language? Oh, the language is airtight.

Words like “processed,” “chemical,” and “natural” are deployed like moral verdicts, not descriptive categories. The implication isn’t argued – it’s assumed, because the semantics have been doing quiet groundwork for years. “Natural” = good. “Chemical” = bad. “Vitamin” = necessary. “Addiction” = no agency.

By the time the viewer blinks, they’re nodding along to a story told by words in costume, not facts in context.

The Linguistic Metabolism of Misunderstanding

This is why semantic drift isn’t just an academic curiosity – it’s a vector. A vector by which misinformation spreads, not through outright falsehood, but through weaponised ambiguity.

A term like “sugar crash” sounds scientific. It even maps onto a real physiological process: postprandial hypoglycaemia. But when yoked to vague claims about mood, willpower, and “chemical hijacking,” it becomes a meme with lab coat cosplay. And the science, if mentioned at all, is there merely to decorate the argument, not drive it.

That’s the crux of my forthcoming book, The Language Insufficiency Hypothesis: that our inherited languages, designed for trade, prayer, and gossip, are woefully ill-equipped for modern scientific clarity. They lag behind our knowledge, and worse, they often distort it.

Words arrive first. Definitions come limping after.

In Closing: You Are What You Consume (Linguistically)

The real problem isn’t that TikTokers get the science wrong. The problem is that they get the words right – right enough to slip past your critical filters. Rhetoric wears the lab coat. Logic gets left in the locker room.

If vitamin C is a vitamin only for some species, and vitamin D isn’t a vitamin at all, then what else are we mislabelling in the great nutritional theatre? What other linguistic zombies are still wandering the scientific lexicon?

Language may be the best tool we have, but don’t mistake it for a mirror. It’s a carnival funhouse – distorting, framing, and reflecting what we expect to see. And until we fix that, science will keep playing second fiddle to the words pretending to explain it.

Elites Ruined It For Everyone

David Brooks and the Hollowing Out of Conservatism

David Brooks is the quintessential old-school Conservative—the kind who once upheld a semblance of ideological coherence. He belongs to the pre-Reagan-Thatcher vintage, a time when Conservatism at least had the decency to argue from principles rather than blind tribalism. We could debate these people in good faith. Those days are gone. The current incarnation of Conservatism contains only homoeopathic traces of its Classical™ predecessor—diluted beyond recognition.

The Degeneration of Conservatism

The rot set in with Reagan, who caught it from Thatcher. Greed and selfishness were laundered into virtues, repackaged as “individual responsibility,” and the party’s intellectual ballast began to erode. By the time Bush II’s administration rolled in, Neo-Conservatism had replaced any lingering Burkean ethos, and by Trump’s tenure, even the pretence of ideology was gone. Conservatism-in-Name-Only—whatever Trump’s brand of reactionary nihilism was—swallowed the party whole. Do they even call themselves Conservatives anymore, or has that ship sailed along with basic literacy?

Click here to take the worldview survey

To be fair, this didn’t go unnoticed. Plenty of old-school Republicans recoiled in horror when Trump became their figurehead. Before the 2016 election, conservative pundits could barely contain their disdain for his incompetence, lack of moral compass, and general buffoonery. And yet, once they realised he was the party’s golden goose, they clambered aboard the Trump Train with the enthusiasm of lottery winners at a payday loan office. His staunchest critics became his most obsequious apologists. What does this tell us about their value system? Spoiler: nothing good.

Brooks’ Lament

Which brings us back to Brooks, who now bemoans the death of Conservative values. On this, we agree. Where we part ways is on whether those values were worth saving. Say you’re boarding a train from New York to Los Angeles. Conservatism might argue that a Miami-bound train is still a train, so what’s the problem? It’s the same vehicle, just going somewhere else. Except, of course, Conservatism has always insisted on the slow train over the fast train—because urgency is unseemly, and progress must be rationed.

If I’m an affluent middle-classer, I might prefer Conservatism’s careful incrementalism—it keeps my apple cart stable. Admirable, if you enjoy tunnel vision. Progressives, by contrast, recognise that some people don’t even have apple carts. Some are starving while others hoard orchards. To the Conservative, the poor just aren’t trying hard enough. To the Progressive, the system is broken, and the playing field needs a serious re-levelling. Even when Conservatives acknowledge inequality, their instinct is to tiptoe toward justice rather than risk disrupting their own affluence.

The Fallacy of Objective Reality

Leaving politics for philosophy, Brooks predictably rails against Postmodernism, decrying relativism in favour of good old-fashioned Modernist “reality.” He’s horrified by subjectivism, as though personal interpretation weren’t the foundation of all human experience. Like Jordan Peterson, he believes his subjective truth is the objective truth. And like Peterson, he takes umbrage at anyone pointing out otherwise. It feels so absolute to them that they mistake their own convictions for universal constants.

As a subjectivist, I accept that reality is socially mediated. We interpret truth claims based on cognitive biases, cultural conditioning, and personal experience. Even when we strive for objectivity, we do so through subjective lenses. Brooks’ Modernist nostalgia is touching but delusional—akin to demanding we all agree on a single flavour of ice cream.

The Existential Problem

And so, I find myself in partial agreement with Brooks. Yes, there is an existential crisis. The patient has a broken leg. But our prescriptions differ wildly. I won’t offer a metaphor for that—consider it your homework as a reader.

Brooks is likely a better writer than a public speaker, but you may still find yourself nodding along with some of his arguments. If you’re a “true” Christian Conservative—if you still believe in something beyond crass self-interest—he may well be preaching to the choir. But let’s be honest: how many in that choir are still listening?

Video: Modernity Worldview Survey Intro

I’ve finally had time to create some video content for the Modernity Worldview Survey. This content is a cursory overview and serves as an introduction to deeper content planned for the future.

The video runs just short of seven minutes, so it offers only a brief outline of the worldviews and the questions. I opted not to produce a single comprehensive video so the material could arrive sooner. The content is bookmarked, though this is likely overkill for such a short video.

A permanent page about the survey is always available on this blog.

I’m still accumulating responses, but the survey is available here if you haven’t taken it. Apologies in advance for the fact that it renders best on a larger monitor or tablet rather than a mobile phone. It doesn’t render at all on a landline, so there’s that.

The Greatest Loss in American History: Donald Trump Wins (And Loses) the Presidency

I couldn’t resist posting this unedited from ChatGPT. Here’s what led to it: I just finished Mark Fisher’s Capitalist Realism. Fisher mentioned the 1974 Warren Beatty film The Parallax View. I generally don’t connect well with film or video as media, but I stashed the title in my memory. A few hours later, I read an article mentioning the film, with the claim, “Of everything lost in the age of streaming, no film genre casualty is as regrettable — or as ripe for a comeback — as the paranoid political thriller.”

Rather than trying to enjoy the film as a whole, I decided to concentrate on the plot and meta-messaging. As expected, I didn’t like the movie, but I understand why Fisher mentioned it. It renders rather like a weak-tea version of a Kafka story. Unlike more contemporary fare, it’s a classic work of postmodernism, having no resolution.

After watching it, I consulted Rotten Tomatoes and IMDb for reviews and context before prompting ChatGPT.

Asking about the film itself, this was the tail end of its response:

Then I asked what Žižek thought about it, but it turns out that he had written a book with the same title. When I asked about Fisher, it responded:

Finally, I posed this query: I recently read Capitalist Realism. Although penned during the 2007-08 financial meltdown and released in 2009, it feels as relevant today as then. In fact, it seems to have foreshadowed the time between.

Its reply is this post—and without further comment…


And so, history lurches forward with all the grace of a three-legged cow on roller skates. Donald J. Trump, the billionaire reality TV star turned political apocalypse, has once again won the presidential vote of the United States of America—only to lose the presidency to Kamala Harris. If that sentence gave you an aneurysm, congratulations, you’re now officially in 2025.

How did we get here? More importantly, why does this make absolutely no sense? Ah, but sense, dear reader, is an antiquated relic of a bygone era, much like democracy, coherent election results, and the idea that facts matter.

A Triumph for the Ages (Sort of)

Let’s be clear: Donald Trump won the popular vote. That’s what his supporters will scream into the void—or at least into their social media echo chambers where dissent is met with a virtual Second Amendment response. The red-capped faithful, still clutching their tattered “Stop the Steal” banners from the last electoral circus, are convinced that this was finally their moment of redemption.

Except, well… he lost.

Enter Kamala Harris, the political equivalent of cold toast, somehow managing to slide into the Oval Office despite Trump’s “win.” The courts, the states, the Electoral College, and whatever eldritch horror lurks beneath the Capitol all conspired—again!—to keep The Donald out of power. Or so the narrative goes.

The Electoral College Strikes Again

Ah, the Electoral College. America’s favourite 18th-century fever dream. Once again, this labyrinthine system of delegate-wrangling has managed to produce a result that defies logic, mathematics, and possibly the laws of physics. Trump, against all odds (and against, presumably, some very sweaty legal advisors), has pulled off the impossible: winning while losing.

Some claim voter suppression, others cry fraud, and a few brave souls are out there trying to explain complex election mechanics to an audience that still thinks “covfefe” was a divine prophecy. But the reality is simpler: Trump, like a political Schrödinger’s cat, is simultaneously victorious and defeated. He has transcended the normal bounds of electoral outcomes, achieving a state of quantum presidency, neither fully here nor fully gone.

What Happens Next?

Riots? Lawsuits? A new line of Trump-branded commemorative “I Won Again!” hats? Place your bets, because at this stage, America is basically one large, over-budget reality show and no one knows what the next episode holds.

For Kamala Harris, the challenge is clear: govern a nation where half the country believes she stole the election, and the other half is still googling “How does the Electoral College work?” As for Trump, he will do what he does best—declare victory, launch a thousand lawsuits, and, inevitably, turn the entire thing into a business opportunity.

And so, dear reader, the United States stumbles forward, democracy battered but still standing, a house divided but too stubborn to fall. Until next time, buckle up—it’s going to be a hell of a ride.

The Narcissist’s Playbook

I’ve lived in Los Angeles a couple of times for a sum total of perhaps 15 years. The first time, I loved it. The next time, I was running on fumes. The first time, I was in my twenties – the second time in my forties. What a difference perspective and ageing make. In my twenties, I was a pretty-boy punk-ass who owned the club scene on the Strip. In my forties, I was a wage slave.

Audio: NotebookLM podcast on this topic.

This morning, I heard a country song on Insta with a line claiming ‘there are nines and dimes in all 50’, and it reminded me of a phrase we used when I lived in Los Angeles – ‘LA 7’. This is constructed on the egotistical, sexist notion that if you were a 10, you’d have already moved to LA. If you still lived in, say, Iowa and were considered a 10, the exchange rate to LA would be a 7.

Then, I thought about the LA-NYC rivalry and wrote this article with some help from ChatGPT.

How L.A. and NYC Became the Centres of the Universe (According to Them)

It is a truth universally acknowledged that Los Angeles and New York City—those bickering siblings of American exceptionalism—believe themselves to be the sun around which the rest of us drearily orbit. Each is utterly convinced of its centrality to the human experience, and neither can fathom that people outside their borders might actually exist without yearning to be them. This is the essence of the ‘Centre of the Universe Complex,’ a condition in which self-importance metastasises into a full-blown cultural identity.

Let us begin with Los Angeles, the influencer of cities. L.A. doesn’t merely think it’s the centre of the universe; it believes it’s the universe, replete with its own atmosphere of smog-filtered sunlight and an economy powered entirely by dreams, green juice, and Botox. For L.A., beauty isn’t just a priority—it’s a moral imperative. Hence the concept of the ‘L.A. 10,’ a stunningly arrogant bit of mathematics whereby physical attractiveness is recalculated based on proximity to the Pacific Coast Highway.

Here’s how it works: a ’10’ in some picturesque-but-hopelessly-provincial state, say Nebraska, is automatically downgraded to a ‘7’ upon arrival in Los Angeles. Why? Because, according to L.A.’s warped arithmetic, if she were a real 10, she’d already be there, lounging by an infinity pool in Malibu and ignoring your DMs. This isn’t just vanity—it’s top-tier delusion. L.A. sees itself as a black hole of good looks, sucking the beautiful people from every corner of the earth while leaving the ‘merely pretty’ to languish in flyover country. The Midwest, then, isn’t so much a place as it is an agricultural waiting room for future Angelenos.
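
If you prefer your satire executable, the ‘exchange rate’ can be codified. The function below is entirely hypothetical (a flat three-point tariff, clamped to the usual 1-to-10 scale), which is roughly the warped arithmetic in question:

```python
def la_rating(hometown_rating: float) -> float:
    """Hypothetical 'LA exchange rate': a flat three-point tariff on any
    out-of-town score, clamped to the canonical 1-to-10 scale."""
    return max(1.0, min(10.0, hometown_rating - 3.0))
```

By this rule, `la_rating(10)` returns 7.0: the Nebraska 10 steps off the plane at LAX as a mere 7.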

But don’t be fooled—New York City is no better. Where L.A. is obsessed with beauty, NYC worships hustle. The city doesn’t just believe it’s important; it believes it’s the only place on earth where anything important happens. While L.A. is out perfecting its tan, NYC is busy perfecting its reputation as the cultural and intellectual capital of the world—or, at least, its part of the world, which conveniently ends somewhere in Connecticut.

This mindset is best summed up by that sanctimonious mantra, If you can make it here, you can make it anywhere. Translation: if you survive the daily humiliation of paying $4,000 a month for a shoebox apartment while dodging both rats and an existential crisis, you’ve unlocked the secret to life itself. New York isn’t about looking good; it’s about enduring bad conditions and then boasting about it as if suffering were an Olympic sport. In this worldview, the rest of the world is simply an unworthy understudy in NYC’s perpetual Broadway production.

And here’s the thing: neither city can resist taking cheap shots at the other. L.A. dismisses NYC as a grim, grey treadmill where fun goes to die, while NYC scoffs at L.A. as a vapid bubble of avocado toast and Instagram filters. It’s brains versus beauty, grit versus glamour, black turtlenecks versus Lululemon. And yet, in their relentless need to outshine one another, they reveal a shared truth: both are equally narcissistic.

This mutual self-obsession is as exhausting as it is entertaining. While L.A. and NYC bicker over who wears the crown, the rest of the world is quietly rolling its eyes and enjoying a life unencumbered by astronomical rent or the constant pressure to appear important. The people of Iowa, for example, couldn’t care less if they’re an ‘LA 7’ or if they’ve “made it” in New York. They’re too busy living comfortably, surrounded by affordable housing and neighbours who might actually help them move a sofa.

But let’s give credit where it’s due. For all their flaws, these two cities do keep the rest of us entertained. Their constant self-aggrandisement fuels the cultural zeitgeist: without L.A., we’d have no Kardashians; without NYC, no Broadway. Their rivalry is the stuff of legend, a never-ending soap opera in which both cities play the lead role.

So, let them have their delusions of grandeur. After all, the world needs a little drama—and nobody does it better than the cities that think they’re the centre of it.

The Scientist’s Dilemma: Truth-Seeking in an Age of Institutional Constraints

In an idealised vision of science, the laboratory is a hallowed space of discovery and intellectual rigour, where scientists chase insights that reshape the world. Yet, in a reflection as candid as it is disconcerting, Sabine Hossenfelder pulls back the curtain on a reality few outside academia ever glimpse. She reveals an industry often more concerned with securing grants and maintaining institutional structures than with the philosophical ideals of knowledge and truth. In her journey from academic scientist to science communicator, Hossenfelder confronts the limitations imposed on those who dare to challenge the mainstream — a dilemma that raises fundamental questions about the relationship between truth, knowledge, and institutional power.

I’ve also created a podcast to discuss Sabine’s topic. Part 2 is also available.

Institutionalised Knowledge: A Double-Edged Sword

The history of science is often framed as a relentless quest for truth, independent of cultural or economic pressures. But as science became more institutionalised, a paradox emerged. On the one hand, large academic structures offer resources, collaboration, and legitimacy, enabling ambitious research to flourish. On the other, they impose constraints, creating an ecosystem where institutional priorities — often financial — can easily overshadow intellectual integrity. The grant-based funding system, which prioritises projects likely to yield quick results or conform to popular trends, inherently discourages research that is too risky or “edgy.” Thus, scientific inquiry can become a compromise, a performance in which scientists must balance their pursuit of truth with the practicalities of securing their positions within the system.

Hossenfelder’s account reveals the philosophical implications of this arrangement: by steering researchers toward commercially viable or “safe” topics, institutions reshape not just what knowledge is pursued but also how knowledge itself is conceptualised. A system prioritising funding over foundational curiosity risks constraining science to shallow waters, where safe, incremental advances take precedence over paradigm-shifting discoveries.

Gender, Equity, and the Paradoxes of Representation

Hossenfelder’s experience with gender-based bias in her early career unveils a further paradox of institutional science. Being advised to apply for scholarships specifically for women, rather than being offered a job outright, reinforced a stereotype that women in science might be less capable or less deserving of direct support. Though well-intentioned, such programs can perpetuate inequality by distinguishing between “real” hires and “funded outsiders.” For Hossenfelder, this distinction created a unique strain on her identity as a scientist, leaving her caught between competing narratives: one of hard-earned expertise and one of institutionalised otherness.

The implications of this dilemma are profound. Philosophically, they touch on questions of identity and value: How does an individual scientist maintain a sense of purpose when confronted with systems that, however subtly, diminish their role or undercut their value? And how might institutional structures evolve to genuinely support underrepresented groups without reinforcing the very prejudices they seek to dismantle?

The Paper Mill and the Pursuit of Legacy

Another powerful critique in Hossenfelder’s reflection is her insight into academia as a “paper production machine.” In this system, academics are pushed to publish continuously, often at the expense of quality or depth, to maintain their standing and secure further funding. This structure, which rewards volume over insight, distorts the very foundation of scientific inquiry. A paper may become less a beacon of truth and more a token in an endless cycle of academic currency.

This pursuit of constant output reveals the philosopher’s age-old tension between legacy and ephemerality. In a system driven by constant publication, scientific “advancements” are at risk of being rendered meaningless, subsumed by an industry that prizes short-term gains over enduring impact. For scientists like Hossenfelder, this treadmill of productivity diminishes the romantic notion of a career in science. It highlights a contemporary existential question: Can a career built on constant output yield a genuine legacy, or does it risk becoming mere noise in an endless stream of data?

Leaving the Ivory Tower: Science Communication and the Ethics of Accessibility

Hossenfelder’s decision to leave academia for science communication raises a question central to contemporary philosophy: What is the ethical responsibility of a scientist to the public? When institutional science falters in its pursuit of truth, perhaps scientists have a duty to step beyond its walls and speak directly to the public. In her pivot to YouTube, Hossenfelder finds a new audience, one driven not by academic pressures but by genuine curiosity.

This shift embodies a broader rethinking of what it means to be a scientist today. Rather than publishing in academic journals read by a narrow circle of peers, Hossenfelder now shares her insights with a public eager to understand the cosmos. It’s a move that redefines knowledge dissemination, making science a dialogue rather than an insular monologue. Philosophically, her journey suggests that in an age where institutions may constrain truth, the public sphere might become a more authentic arena for its pursuit.

Conclusion: A New Paradigm for Scientific Integrity

Hossenfelder’s reflections are not merely the story of a disillusioned scientist; they are a call to re-evaluate the structures that define modern science. Her journey underscores the need for institutional reform — not only to allow for freer intellectual exploration but also to foster a science that serves humanity rather than merely serving itself.

Ultimately, the scientist’s dilemma that Hossenfelder presents is a philosophical one: How does one remain true to the quest for knowledge in an age of institutional compromise? As she shares her story, she opens the door to a conversation that transcends science itself, calling us all to consider what it means to seek truth in a world that may have forgotten its value. Her insights remind us that the pursuit of knowledge, while often fraught, is ultimately a deeply personal, ethical journey, one that extends beyond the walls of academia into the broader, often messier realm of human understanding.

Choice and Blame: Why We Forgive Some and Condemn Others

A recent parody video making the rounds on social media shows a man at a kitchen table, his girlfriend, and their cat. In a desperate attempt to gain his girlfriend’s attention, he knocks a cup off the table. The moment it hits the floor, she turns on him, scolding him for his clumsiness. Quick to deflect, he blames the cat, and suddenly her anger dissipates. She shifts from reprimanding him to lavishing affection on the supposedly guilty feline. The tension lifts—until he sheepishly confesses that it was, in fact, his doing all along. Her response? An incredulous, “Are you kidding me?”

What’s fascinating about this skit isn’t the comedy of the man’s mischief or even the cat’s unknowing role in the charade. It’s the girlfriend’s starkly different reactions to the same act, depending on who she believes committed it. The cat, in her eyes, can do no wrong; the boyfriend, however, is immediately culpable. It’s easy to laugh at the scenario’s absurdity, but the dynamic it portrays is familiar and, dare I say, quite telling about human behaviour.

The Double Standard of Blame

Why is it that we’re quick to exonerate some and just as quick to indict others? The phenomenon is more than a quirk of personality; it reveals our deeper, often unconscious, biases. While it’s understandable that the girlfriend might think the cat incapable of intentional mischief, her reaction also suggests a predisposition to forgive certain actors—whether due to perceived innocence, attachment, or simply habit.

This dynamic isn’t limited to pets and partners. In families, workplaces, and social groups, we often see a similar pattern. One person becomes the perennial scapegoat, bearing the brunt of blame for any and all misdeeds, while another enjoys a seemingly unshakeable immunity. Think of the “golden child” and the “black sheep” within a family. One can rarely put a foot wrong, while the other’s every move is scrutinised, questioned, or condemned.

Beyond the Blame: Motivations and Consequences

The reasons behind these imbalances can be complex. Sometimes, they stem from past behaviour: if someone has repeatedly erred, we may be primed to expect the worst from them, even if they’ve reformed. Other times, they arise from emotional bonds or biases: we excuse those we love or admire because acknowledging their faults would cause us discomfort or cognitive dissonance.

This phenomenon isn’t just about playing favourites; it can have significant psychological consequences. For the person perpetually cast as the villain, the burden of unwarranted blame can lead to feelings of resentment, anxiety, or self-doubt. Meanwhile, those consistently exonerated may internalise a skewed perception of their own infallibility, which can be equally damaging.

A Broader Reflection on Accountability

Returning to the video’s context, the girlfriend’s swift switch from reproach to indulgence once she believed the cat was at fault, and her subsequent anger when the truth was revealed, invites us to question our own responses to perceived transgressions. Are we, too, guilty of selectively assigning blame based on who we think is responsible? How often do we let our preconceptions shape our judgments, favouring one actor over another without truly weighing the evidence?

The parody is amusing, no doubt, but it also serves as a subtle reminder: our reactions often reveal more about our biases and expectations than about the actions themselves. The next time we find ourselves quick to blame or forgive, it’s worth pausing to ask: are we reacting to the act, or to the actor?

In a world increasingly marked by polarised opinions and knee-jerk reactions, cultivating this kind of self-awareness is crucial. We need to be vigilant not only about how we judge others but also about why we do so. For, in the end, it’s not just about who knocked the cup off the table—it’s about who we believe deserves to be scolded for it.

What’s Wrong with Utilitarianism

Full disclosure: all normative moral frameworks are seriously flawed. Consequentialism and its redheaded stepchild, Utilitarianism, may be among the worst, or at least in the top ten.

In this video, I’m introduced to Tommy Curry, who makes a strong point about Western imperialism. Any imperialism, really, though the West seems to do it more and “better” (if better means worse for the world at large). One can’t claim the moral high ground after all but genociding counter-opinions. As he notes, when the proto-United States “accidentally” murdered ninety-five per cent of the Indigenous population and then applied majority rule, the “good of the people” rule, that is bad faith at its worst.

To be fair, the world has a long history of killing off and disappearing counter-voices and then voting on the issues they opposed. Rinse and repeat until you become the majority. No wonder genocide is so popular. Israel has adopted this approach as a perpetrator after its predecessors escaped a similar fate in the 1940s. They accused Nazi Germany of being evil. I guess it rubbed off. Who knew genocide was contagious?

Peter Singer comments on the full video, a symposium on land ownership and hypocrisy, which can be found here or by following the IAI link from the video above. Eventually, you’ll hit a paywall. Apologies in advance.

I’d love to write more, as this is a topic in which I have a passionate interest. Unfortunately, I am otherwise indisposed and will settle for sharing this video content for now. I’d welcome your thoughts.