Conscious of Consciousness

Let us begin with the heresy: consciousness is not a thing. It is not a light bulb switched on in the mind. It is not a theatre with a little homunculus watching the play unfold. It is not a ghost in the machine, nor even a particularly welcome tenant. Consciousness is a conjuring trick – one so convincing that even the conjurer forgets it is an act.

Video: Related Topic: IAI Joscha Bach on Consciousness

If that unsettles you, good. Welcome to the simulacrum.

The Wetness of Mind

We often hear that consciousness is “emergent,” but the term is used so promiscuously that it risks becoming decorative. So let us be specific. Consciousness, if it is emergent, is emergent as wetness is from H2O: not in the hydrogen or the oxygen, but in their relationship when bonded just so. Joscha Bach and others argue that consciousness arises not from the bits, but from the dance – the recursive feedback loops and predictive models running atop the neural substrate.

Audio: NotebookLM podcast on this topic.

In this view, the self is not the pilot but the dashboard. It is the user interface the brain conjures to coordinate action, interpret input, and maintain internal coherence. Not because it’s real, but because it’s useful. You are a GUI with delusions of grandeur.

The Cast of Theorists

Let us now parade the usual suspects:

  • Joscha Bach: Consciousness is a virtual self-model, emergent from recursive, computational feedback. Not the product of neurons firing per se, but of their ability to simulate a stable identity across time.
  • Thomas Metzinger: There is no self. Only a Phenomenal Self-Model (PSM) which becomes phenomenally transparent when the system no longer recognises it as a model. Consciousness is the experience of this hallucinated self.
  • Daniel Dennett: Dismantles the notion of a “central experiencer” with his Multiple Drafts Model. Consciousness is a narrative, a distributed process where drafts of experience compete, are edited, and retroactively interpreted.
  • David Chalmers: Waves his flag at the Hard Problem of consciousness. You can explain behaviour, memory, attention—but not experience itself. He flirts with dualism and panpsychism while insisting there’s a gap science cannot yet close.
  • Giulio Tononi: Gives us Integrated Information Theory (IIT) and the elusive metric Φ (phi). Consciousness is the degree to which information is unified within a system. Your brain is conscious because its parts can’t be reduced without losing coherence.
  • Karl Friston: The prophet of Free Energy Minimisation. Consciousness is an emergent property of systems that seek to reduce prediction error. The brain is a Bayesian engine, and the self is its best guess about how to survive.

So What Is Consciousness?

A hallucination. A recursive illusion. A predictive dashboard. A statistical artefact. A phi score. A phenomenally transparent model. Take your pick.

None of these theories fully agree, but most converge on one elegant horror: you are not what you think you are. The sense of being a continuous, stable, indivisible “I” is a construction. A simulation. The dream from which there is no waking because waking is part of the dream.

This is not despair; it is clarity. Just as wetness does not cry when told it is not a substance, the self need not mourn its own illusion. It is a marvellous fiction, worth inhabiting.

Conclusion: Through the Mirror

To be conscious of consciousness is to stand in the hall of mirrors and realise none reflect the original—because there is no original. The mirror is the thing.

But if the theatre is empty, the play goes on. Scripts are written, models simulated, selves performed. And perhaps, in this strange recursion, we find not meaning, but the possibility of coherence.

So raise a glass to the illusion. May your predictive model stay optimised, your narrative stay plausible, and your hallucinated self remain just this side of transparent.


For further hallucinatory episodes, consult your local philosopher, neuroscientist, or AI researcher. Side effects may include derealisation, epistemic vertigo, and mild enlightenment.

Semantic Drift: When Language Outruns the Science

Science has a language problem. Not a lack of it – if anything, a surfeit. But words, unlike test tubes, do not stay sterile. They evolve, mutate, and metastasise. They get borrowed, bent, misused, and misremembered. And when the public discourse gets hold of them, particularly on platforms like TikTok, it’s the language that gets top billing. The science? Second lead, if it’s lucky.

Semantic drift is at the centre of this: the gradual shift in meaning of a word or phrase over time. It’s how “literally” came to mean “figuratively,” how “organic” went from “carbon-based” to “morally superior,” and how “theory” in science means robust explanatory framework but in the public square means vague guess with no homework.

In short, semantic drift lets rhetoric masquerade as reason. Once a word acquires enough connotation, you can deploy it like a spell. No need to define your terms when the vibe will do.

Audio: NotebookLM podcast on this topic.

When “Vitamin” No Longer Means Vitamin

Take the word vitamin. It sounds objective. Authoritative. Something codified in the genetic commandments of all living things.

But it isn’t.

A vitamin is simply a substance that an organism needs but cannot synthesise internally, and must obtain through its diet. That’s it. It’s a functional definition, not a chemical one.

So:

  • Vitamin C is a vitamin for humans, but not for dogs, cats, or goats. They make their own. We lost the gene. Tough luck.
  • Vitamin D, meanwhile, isn’t a vitamin at all. It’s a hormone, synthesised when sunlight hits your skin. Its vitamin status is a historical relic – named before we knew better, and now marketed too profitably to correct.

But in the land of TikTok and supplement shelves, these nuances evaporate. “Vitamin” has drifted from scientific designation to halo term – a linguistic fig leaf draped over everything from snake oil to ultraviolet-induced steroidogenesis.

The Rhetorical Sleight of Hand

This linguistic slippage is precisely what allows the rhetorical shenanigans to thrive.

In one video, a bloke claims a burger left out for 151 days neither moulds nor decays, and therefore, “nature won’t touch it.” From there, he leaps (with Olympic disregard for coherence) into talk of sugar spikes, mood swings, and “metabolic chaos.” You can almost hear the conspiratorial music rising.

The science here is, let’s be generous, circumstantial. But the language? Oh, the language is airtight.

Words like “processed,” “chemical,” and “natural” are deployed like moral verdicts, not descriptive categories. The implication isn’t argued – it’s assumed, because the semantics have been doing quiet groundwork for years. “Natural” = good. “Chemical” = bad. “Vitamin” = necessary. “Addiction” = no agency.

By the time the viewer blinks, they’re nodding along to a story told by words in costume, not facts in context.

The Linguistic Metabolism of Misunderstanding

This is why semantic drift isn’t just an academic curiosity – it’s a vector. A vector by which misinformation spreads, not through outright falsehood, but through weaponised ambiguity.

A term like “sugar crash” sounds scientific. It even maps onto a real physiological process: postprandial hypoglycaemia. But when yoked to vague claims about mood, willpower, and “chemical hijacking,” it becomes a meme with lab coat cosplay. And the science, if mentioned at all, is there merely to decorate the argument, not drive it.

That’s the crux of my forthcoming book, The Language Insufficiency Hypothesis: that our inherited languages, designed for trade, prayer, and gossip, are woefully ill-equipped for modern scientific clarity. They lag behind our knowledge, and worse, they often distort it.

Words arrive first. Definitions come limping after.

In Closing: You Are What You Consume (Linguistically)

The real problem isn’t that TikTokers get the science wrong. The problem is that they get the words right – right enough to slip past your critical filters. Rhetoric wears the lab coat. Logic gets left in the locker room.

If vitamin C is a vitamin only for some species, and vitamin D isn’t a vitamin at all, then what else are we mislabelling in the great nutritional theatre? What other linguistic zombies are still wandering the scientific lexicon?

Language may be the best tool we have, but don’t mistake it for a mirror. It’s a carnival funhouse – distorting, framing, and reflecting what we expect to see. And until we fix that, science will keep playing second fiddle to the words pretending to explain it.

“Trust the Science,” They Said. “It’s Reproducible,” They Lied.

—On Epistemology, Pop Psychology, and the Cult of Empirical Pretence

Science, we’re told, is the beacon in the fog – a gleaming lighthouse of reason guiding us through the turbulent seas of superstition and ignorance. But peer a bit closer, and the lens is cracked, the bulb flickers, and the so-called lighthouse keeper is just some bloke on TikTok shouting about gut flora and intermittent fasting.

Audio: NotebookLM podcast on this topic.

We are creatures of pattern. We impose order. We mistake correlation for causation, narrative for truth, confidence for knowledge. What we have, in polite academic parlance, is an epistemology problem. What we call science is often less Newton and more Nostradamus—albeit wearing a lab coat and wielding a p-hacked dataset.

Let’s start with the low-hanging fruit—the rotting mango of modern inquiry: nutritional science, which is to actual science what alchemy is to chemistry, or vibes are to calculus. We study food the way 13th-century monks studied demons: through superstition, confirmation bias, and deeply committed guesswork. Eat fat, don’t eat fat. Eat eggs, don’t eat eggs. Eat only between the hours of 10:00 and 14:00 under a waxing moon while humming in Lydian mode. It’s a cargo cult with chia seeds.

But why stop there? Let’s put the whole scientific-industrial complex on the slab.

Psychology: The Empirical Astrological Society

Psychology likes to think it’s scientific. Peer-reviewed journals, statistical models, the odd brain scan tossed in for gravitas. But at heart, much of it is pop divination, sugar-dusted for mass consumption. The replication crisis didn’t merely reveal cracks – it bulldozed entire fields. The Stanford Prison Experiment? A theatrical farce. Power poses? Empty gestural theatre. Half of what you read in Psychology Today could be replaced with horoscopes and no one would notice.

Medical Science: Bloodletting, But With Better Branding

Now onto medicine, that other sacred cow. We tend to imagine it as precise, data-driven, evidence-based. In practice? It’s a Byzantine fusion of guesswork, insurance forms, and pharmaceutical lobbying. As Crémieux rightly implies, medicine’s predictive power is deeply compromised by overfitting, statistical fog, and a staggering dependence on non-replicable clinical studies, many funded by those who stand to profit from the result.

And don’t get me started on epidemiology, that modern priesthood that speaks in incantations of “relative risk” and “confidence intervals” while changing the commandments every fortnight. If nutrition is theology, epidemiology is exegesis.

The Reproducibility Farce

Let us not forget the gleaming ideal: reproducibility, that cornerstone of Enlightenment confidence. The trouble is, in field after field—from economics to cancer biology—reproducibility is more aspiration than reality. What we actually get is a cacophony of studies no one bothers to repeat, published to pad CVs, p-hacked into publishable shape, and then cited into canonical status. It’s knowledge by momentum. We don’t understand the world. We just retweet it.

What, Then, Is To Be Done?

Should we become mystics? Take up tarot and goat sacrifice? Not necessarily. But we should strip science of its papal robes. We should stop mistaking publication for truth, consensus for accuracy, and method for epistemic sanctity. The scientific method is not the problem. The pretence that it’s constantly being followed is.

Perhaps knowledge doesn’t have a half-life because of progress, but because it was never alive to begin with. We are not disproving truth; we are watching fictions expire.

Closing Jab

Next time someone says “trust the science,” ask them: which bit? The part that told us margarine was manna? The part that thought ulcers were psychosomatic? The part that still can’t explain consciousness, but is confident about your breakfast?

Science is a toolkit. But too often, it’s treated like scripture. And we? We’re just trying to lose weight while clinging to whatever gospel lets us eat more cheese.

What’s Missing? Trust or Influence

Post-COVID, we’re told trust in science is eroding. But perhaps the real autopsy should be performed on the institution of public discourse itself.

Since the COVID-19 crisis detonated across our global stage—part plague, part PR disaster—the phrase “trust in science” has become the most abused slogan since “thoughts and prayers.” Every public official with a podium and a pulse declared they were “following the science,” as if “science” were a kindly oracle whispering unambiguous truths into the ears of the righteous. But what happened when those pronouncements proved contradictory, politically convenient, or flat-out wrong? Was it science that failed, or was it simply a hostage to an incoherent performance of authority?

Audio: NotebookLM podcast discussing this topic.

Two recent Nature pieces dig into the supposed “decline” of scientific credibility in the post-pandemic world, offering the expected hand-wringing about public opinion and populist mistrust. But let’s not be so credulous. This isn’t merely a crisis of trust—it’s a crisis of theatre.

“The Science” as Ventriloquism

Let’s begin by skewering the central absurdity: there is no such thing as “The Science.” Science is not a monolith. It’s not a holy writ passed down by lab-coated Levites. It’s a process—a messy, iterative, and perpetually provisional mode of inquiry. But during the pandemic, politicians, pundits, and even some scientists began to weaponise the term, turning it into a rhetorical cudgel. “The Science says” became code for “shut up and comply.” Any dissent—even from within the scientific community—was cast as heresy. Galileo would be proud.

A Nature Human Behaviour paper (van der Linden et al., 2025) identifies four archetypes of distrust: distrust in the message, the messenger, the medium, and the motivation. What the authors fail to ask is: what if all four were compromised simultaneously? What if the medium (mainstream media) served more as a stenographer to power than a check upon it? What if the message was oversimplified into PR slogans, the messengers were party apparatchiks in lab coats, and the motivations were opaque at best?

Trust didn’t just erode. It was actively incinerated in a bonfire of institutional vanity.

A Crisis of Influence, Not Integrity

The second Nature commentary (2025) wrings its hands over “why trust in science is declining,” as if the populace has suddenly turned flat-Earth overnight. But the real story isn’t a decline in trust per se; it’s a redistribution of epistemic authority. Scientists no longer have the stage to themselves. Influencers, conspiracy theorists, rogue PhDs, and yes—exhausted citizens armed with Wi-Fi and anxiety—have joined the fray.

Science hasn’t lost truth—it’s lost control. And frankly, perhaps it shouldn’t have had that control in the first place. Democracy is messy. Information democracies doubly so. And in that mess, the epistemic pedestal of elite scientific consensus was bound to topple—especially when its public face was filtered through press conferences, inconsistent policies, and authoritarian instincts.

Technocracy’s Fatal Hubris

What we saw wasn’t science failing—it was technocracy failing in real time, trying to manage public behaviour with a veneer of empirical certainty. But when predictions shifted, guidelines reversed, and public health policy began to resemble a mood ring, the lay public was expected to pretend nothing happened. Orwell would have a field day.

This wasn’t a failure of scientific method. It was a failure of scientific messaging—an inability (or unwillingness) to communicate uncertainty, probability, and risk in adult terms. Instead, the public was infantilised. And then pathologised for rebelling.

Toward a Post-Scientistic Public Sphere

So where does that leave us? Perhaps we need to kill the idol of “The Science” to resurrect a more mature relationship with scientific discourse—one that tolerates ambiguity, embraces dissent, and admits when the data isn’t in. Science, done properly, is the art of saying “we don’t know… yet.”

The pandemic didn’t erode trust in science. It exposed how fragile our institutional credibility scaffolding really is—how easily truth is blurred when science is fed through the meat grinder of media, politics, and fear.

The answer isn’t more science communication—it’s less scientism, more honesty, and above all, fewer bureaucrats playing ventriloquist with the language of discovery.

Conclusion

Trust in science isn’t dead. But trust in those who claim to speak for science? That’s another matter. Perhaps it’s time to separate the two.

Defying Death

I died in March 2023 — or so the rumour mill would have you believe.

Of course, given that I’m still here, hammering away at this keyboard, it must be said that I didn’t technically die. We don’t bring people back. Death, real death, doesn’t work on a “return to sender” basis. Once you’re gone, you’re gone, and the only thing bringing you back is a heavily fictionalised Netflix series.

Audio: NotebookLM podcast of this content.

No, this is a semantic cock-up, yet another stinking exhibit in the crumbling Museum of Language Insufficiency. “I died,” people say, usually while slurping a Pumpkin Spice Latte and live-streaming their trauma to 53 followers. What they mean is that they flirted with death, clumsily, like a drunk uncle at a wedding. No consummation, just a lot of embarrassing groping at the pearly gates.

And since we’re clarifying terms: there was no tunnel of light, no angels, no celestial choir belting out Coldplay covers. No bearded codgers in slippers. No 72 virgins. (Or, more plausibly, 72 incels whining about their lack of Wi-Fi reception.)

There was, in fact, nothing. Nothing but the slow, undignified realisation that the body, that traitorous meat vessel, was shutting down — and the only gates I was approaching belonged to A&E, with its flickering fluorescent lights and a faint smell of overcooked cabbage.

To be fair, it’s called a near-death experience (NDE) for a reason. Language, coward that it is, hedges its bets. “Near-death” means you dipped a toe into the abyss and then screamed for your mummy. You didn’t die. You loitered. You loitered in the existential equivalent of an airport Wetherspoons, clutching your boarding pass and wondering why the flight to Oblivion was delayed.

As the stories go, people waft into the next world and are yanked back with stirring tales of unicorns, long-dead relatives, and furniture catalogues made of clouds. I, an atheist to my scorched and shrivelled soul, expected none of that — and was therefore not disappointed.

What I do recall, before the curtain wobbled, was struggling for breath, thinking, “Pick a side. In or out. But for pity’s sake, no more dithering.”
In a last act of rational agency, I asked an ER nurse — a bored-looking Athena in scrubs — to intubate me. She responded with the rousing medical affirmation, “We may have to,” which roughly translates to, “Stop making a scene, love. We’ve got fifteen others ahead of you.”

After that, nothing. I was out. Like a light. Like a minor character in a Dickens novel whose death is so insignificant it happens between paragraphs.

I woke up the next day: groggy, sliced open, a tube rammed down my throat, and absolutely no closer to solving the cosmic riddle of it all. Not exactly the triumphant return of Odysseus. Not even a second-rate Ulysses.

Here’s the reality:
There is no coming back from death.
You can’t “visit” death, any more than you can spend the afternoon being non-existent and return with a suntan.

Those near-death visions? Oxygen-starved brains farting out fever dreams. Cerebral cortexes short-circuiting like Poundland fairy lights. Hallucinations, not heralds. A final, frantic light show performed for an audience of none.

Epicurus, that cheerful nihilist, said, “When we are, death is not. When death is, we are not.” He forgot to mention that, in between, people would invent entire publishing industries peddling twaddle about journeys beyond the veil — and charging $29.99 for the paperback edition.

No angels. No harps. No antechamber to the divine.
Just the damp whirr of hospital machinery and the faint beep-beep of capitalism, patiently billing you for your own demise.

If there’s a soundtrack to death, it’s not choirs of the blessed. It’s a disgruntled junior surgeon muttering, “Where the hell’s the anaesthetist?” while pawing desperately through a drawer full of out-of-date latex gloves.

And thus, reader, I lived.
But only in the most vulgar, anticlimactic, and utterly mortal sense.

There will be no afterlife memoir. No second chance to settle the score. No sequel.
Just this: breath, blood, occasional barbed words — and then silence.

Deal with it.

The Church of Pareto: How Economics Learned to Love Collapse

—or—How the Invisible Hand Became a Throttling Grip on the Throat of the Biosphere

As many frequent visitors know, I am a recovering economist. I tend to view economics through a philosophical lens. Here, I consider the daft nonsense of Pareto optimality.

Audio: NotebookLM podcast of this content.

There is a priesthood in modern economics—pious in its equations, devout in its dispassion—that gathers daily to prostrate before the altar of Pareto. Here, in this sanctum of spreadsheet mysticism, it is dogma that an outcome is “optimal” so long as no one is worse off. Never mind if half the world begins in a ditch and the other half in a penthouse jacuzzi. So long as no one’s jacuzzi is repossessed, the system is just. Hallelujah.
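
To see how little the criterion demands, here is a minimal, hypothetical sketch in Python. The numbers are invented; the point is only that a grotesquely lopsided allocation passes the Pareto test, because any transfer from penthouse to ditch leaves the penthouse worse off.

```python
def pareto_improvement(before, after):
    """True if nobody is worse off and at least one person is better off."""
    no_one_worse = all(b >= a for a, b in zip(before, after))
    someone_better = any(b > a for a, b in zip(before, after))
    return no_one_worse and someone_better

# A hypothetical two-person economy: one penthouse, one ditch.
status_quo = [1_000_000, 1]        # utilities of the rich and the poor
redistribution = [999_000, 1_001]  # transfer 1,000 from rich to poor

print(pareto_improvement(status_quo, redistribution))  # False: the rich lose
# No transfer counts as a Pareto improvement, so the lopsided status quo is
# already "Pareto optimal". Efficiency, so defined, says nothing about fairness.
```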

This cult of cleanliness, cloaked in the language of “efficiency,” performs a marvellous sleight of hand: it transforms systemic injustice into mathematical neutrality. The child working in the lithium mines of the Congo is not “harmed”—she simply doesn’t exist in the model. Her labour is an externality. Her future, an asterisk. Her biosphere, a rounding error in the grand pursuit of equilibrium.

Let us be clear: this is not science. This is not even ideology. It is theology—an abstract faith-based system garlanded with numbers. And like all good religions, it guards its axioms with fire and brimstone. Question the model? Heretic. Suggest the biosphere might matter? Luddite. Propose redistribution? Marxist. There is no room in this holy order for nuance. Only graphs and gospel.

The rot runs deep. William Stanley Jevons—yes, that Jevons, patron saint of unintended consequences—warned us as early as 1865 that improvements in efficiency could increase, not reduce, resource consumption. But his paradox, like Cassandra’s prophecy, was fated to be ignored. Instead, we built a civilisation on the back of the very logic he warned would destroy it.
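
Jevons’s mechanism is easy to show with invented numbers. The sketch below assumes a constant-elasticity demand curve (an elasticity of 1.5, chosen purely for illustration): make an engine twice as efficient, and driving becomes cheap enough that total fuel burned rises rather than falls.

```python
def fuel_used(efficiency_km_per_litre, price_per_litre=1.0, elasticity=1.5, k=100.0):
    """Toy constant-elasticity demand: cheaper travel means more travel."""
    cost_per_km = price_per_litre / efficiency_km_per_litre   # cost of driving one km
    km_demanded = k * cost_per_km ** -elasticity              # demand rises as cost falls
    return km_demanded / efficiency_km_per_litre              # litres actually burned

print(round(fuel_used(efficiency_km_per_litre=10), 1))  # ~316 litres with the old engine
print(round(fuel_used(efficiency_km_per_litre=20), 1))  # ~447 litres with one twice as efficient
```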

Then came Simon Kuznets, who—bless his empirically addled soul—crafted a curve that seemed to promise that inequality would fix itself if we just waited politely. We called it the Kuznets Curve and waved it about like a talisman against the ravages of industrial capitalism, ignoring the empirical wreckage that piled up beneath it like bones in a trench.

Meanwhile, Pareto himself, that nobleman of social Darwinism, famously calculated that 80% of Italy’s land was owned by 20% of its people—and rather than challenge this grotesque asymmetry, he chose to marvel at its elegance. Economics took this insight and said: “Yes, more of this, please.”

And so the model persisted—narrow, bloodless, and exquisitely ill-suited to the world it presumed to explain. The economy, it turns out, is not a closed system of rational actors optimising utility. It is a planetary-scale thermodynamic engine fuelled by fossil sunlight, pumping entropy into the biosphere faster than it can be absorbed. But don’t expect to find that on the syllabus.

Mainstream economics has become a tragic farce, mouthing the language of optimisation while presiding over cascading system failure. Climate change? Not in the model. Biodiversity collapse? A regrettable externality. Intergenerational theft? Discounted at 3% annually.

We are witnessing a slow-motion suicide cloaked in the rhetoric of balance sheets. The Earth is on fire, and the economists are debating interest rates.

What we need is not reform, but exorcism. Burn the models. Salt the axioms. Replace this ossified pseudoscience with something fit for a living world—ecological economics, systems theory, post-growth thinking, anything with the courage to name what this discipline has long ignored: that there are limits, and we are smashing into them at speed.

History will not be kind to this priesthood of polite annihilation. Nor should it be.

The Hard Problem of Consciousness

If you are reading this, you are likely familiar with David Chalmers’ idea of the Hard Problem of Consciousness—the thorny, maddeningly unsolvable question of why and how subjective experience arises from physical processes. If you’re not, welcome to the rabbit hole. Here, we’ll plunge deeper by examining the perspective of Stuart Hameroff, who, like a philosophical magician, reframes this conundrum as a chicken-and-egg problem: what came first, life or consciousness? His answer? Consciousness. But wait—there’s a slight snag. Neither “life” nor “consciousness” has a universally agreed-upon definition. Oh, the joy of philosophical discourse.

Video: Professor Stuart Hameroff and others promote the idea that consciousness pre-dates life. A fuller version is available at IAI.
Audio: Podcast on this topic.

For the uninitiated, Hameroff’s stance is heavily flavoured with panpsychism—the idea that consciousness is a fundamental feature of the universe, like space or time. In this worldview, consciousness predates life itself. From this vantage, Hameroff’s proposition seems inevitable, a tidy solution that fits neatly into a panpsychistic framework. But let me stop you right there because I’m not signing up for the panpsychism fan club, and I’m certainly not prepared to let Hameroff’s intellectual sleight of hand go unchallenged.

To make his case, Hameroff engages in a curious manoeuvre: he defines both life and consciousness in ways that conveniently serve his argument. Consciousness, for him, is not limited to the complex phenomena of human or even animal experience but is a fundamental property of the universe, embedded in the very fabric of reality. Life, meanwhile, is a secondary phenomenon—something this universal consciousness eventually orchestrates itself into. With these definitions, his argument clicks together like a self-serving jigsaw puzzle. It’s clever, I’ll grant him that. But cleverness isn’t the same as being correct.

This is the philosophical equivalent of marking your own homework. By defining the terms of debate to fit his narrative, Hameroff ensures that his conclusion will satisfy his fellow panpsychists. The faithful will nod along, their priors confirmed. But for those outside this echo chamber, his framework raises more questions than it answers. How does this universal consciousness work? Why should we accept its existence as a given? And—here’s the kicker—doesn’t this just punt the problem one step back? If consciousness is fundamental, what’s the mechanism by which it “pre-exists” life?

Hameroff’s move is bold, certainly. But boldness isn’t enough. Philosophy demands rigour, and redefining terms to suit your argument isn’t rigorous; it’s rhetorical trickery. Sure, it’s provocative. But does it advance our understanding of the Hard Problem, or does it merely reframe it in a way that makes Hameroff’s preferred answer seem inevitable? For my money, it’s the latter.

The real issue is that panpsychism itself is a philosophical Rorschach test. It’s a worldview that can mean just about anything, from the claim that electrons have a rudimentary kind of awareness to the idea that the universe is a giant mind. Hameroff’s take lands somewhere in this spectrum, but like most panpsychist arguments, it’s long on metaphysical speculation and short on empirical grounding. If you already believe that consciousness is a fundamental aspect of reality, Hameroff’s arguments will feel like a revelation. If you don’t, they’ll feel like smoke and mirrors.

In the end, Hameroff’s chicken-and-egg problem might be better framed as a false dichotomy. Perhaps life and consciousness co-evolved in ways we can’t yet fully understand. Or perhaps consciousness, as we understand it, emerges from the complexity of life, a byproduct rather than a prerequisite. What’s clear is that Hameroff’s solution isn’t as tidy as it seems, nor as universally compelling. It’s a clever sleight of hand, but let’s not mistake cleverness for truth.

What is Information?

I question whether reviewing a book chapter by chapter is the best approach. It feels more like a reaction video because I am trying to suss things out as I go. I also question the integrity and allegiance of the author, a point I often make clear. Perhaps ‘integrity’ is too harsh, as he may have integrity relative to his worldview. It just happens to differ from mine.

Chapter 1 of Yuval Noah Harari’s Nexus, ironically titled “What is Information?”, closes not with clarity but with ambiguity. Harari, ever the rhetorician, acknowledges the difficulty of achieving consensus on what ‘information’ truly means. Instead of attempting a rigorous definition, he opts for the commonsense idiomatic approach—a conveniently disingenuous choice, given that information is supposedly the book’s foundational theme. To say this omission is bothersome would be an understatement; it is a glaring oversight in a chapter dedicated to unpacking this very concept.

Audio: Podcast related to this content.

Sidestepping Rigour

Harari’s rationale for leaving ‘information’ undefined appears to rest on its contested nature, yet this does not excuse the absence of his own interpretation. While consensus may indeed be elusive, a book with such grand ambitions demands at least a working definition. Without it, readers are left adrift, navigating a central theme that Harari refuses to anchor. This omission feels particularly egregious when juxtaposed against his argument that information fundamentally underlies everything. How can one build a convincing thesis on such an unstable foundation?

The Map and the Terrain

In typical Harari fashion, the chapter isn’t devoid of compelling ideas. He revisits the map-and-terrain analogy, borrowing from Borges to argue that no map can perfectly represent reality. While this metaphor is apt for exploring the limitations of knowledge, it falters when Harari insists on the existence of an underlying, universal truth. His examples—Israeli versus Palestinian perspectives, Orthodox versus secular vantage points—highlight the relativity of interpretation. Yet he clings to the Modernist belief that events have an objective reality: they occur at specific times, dates, and places, regardless of perspective. This insistence feels like an ontological claim awkwardly shoehorned into an epistemological discussion.

Leveraging Ambiguity

One can’t help but suspect that Harari’s refusal to define ‘information’ serves a rhetorical purpose. By leaving the concept malleable, he gains the flexibility to adapt its meaning to suit his arguments throughout the book. This ambiguity may prove advantageous in bolstering a wide-ranging thesis, but it also risks undermining the book’s intellectual integrity. Readers may find themselves wondering whether Harari is exploring complexity or exploiting it.

Final Thoughts on Chapter 1

The chapter raises more questions than it answers, not least of which is whether Harari intends to address these foundational gaps in later chapters. If the preface hinted at reductionism, Chapter 1 confirms it, with Harari’s Modernist leanings and rhetorical manoeuvres taking centre stage. “What is Information?” may be a provocative title, but its contents suggest that the question is one Harari is not prepared to answer—at least, not yet.

Top 5 Books Read 2024

These are my favourite books I read in 2024. Only one was first published this year, so it seems I was playing catch-up and rereading. Two are about history; two are about the philosophy of science; and one is about biological free will or the lack thereof.

5

Against Method (2010)
Philosophy of Science

Against Method is a re-read for me. It makes the list on the coattails of a higher-ranked book. Feyerabend makes a compelling case against the Scientific Method™. To complete the set, I’d also recommend Bruno Latour’s We Have Never Been Modern.

4

Determined: A Science of Life without Free Will (2023)
Neuroscience, Philosophy

Determined arrives on the heels of Sapolsky’s Behave, another classic that I’d recommend even more, but I read it in 2018, so it doesn’t make the cut. In Determined, Sapolsky makes the case that there is no room or need for free will to explain human behaviour.

3

Guns, Germs, and Steel: The Fates of Human Societies (1998)
History

As with Against Method, Guns, Germs, and Steel makes the list only to complement my next choice. It views history through an environmental lens. To fill out the historical perspective, I recommend David Graeber’s The Dawn of Everything: A New History of Humanity (with David Wengrow). I’d recommend Yuval Noah Harari’s Sapiens: A Brief History of Humankind, but it occupies a different category and is more about a plausible broad narrative than the detail explored in the others listed.

2

How the World Made the West: A 4,000 Year History (2024)
History

Quinn makes history approachable as she questions the uniformity of civilisations pushed by orthodoxy. Read this in context with the aforementioned historical accounts for a fuller perspective.

1

The Structure of Scientific Revolutions: 50th Anniversary Edition (1962/2012)
Philosophy of Science

I was born in 1961. This should have been bedtime reading for me. I’d heard of this work, but one really has to read it. It’s less Modernist than I had presumed—though not to the extent of Feyerabend or Latour mentioned above. Again, reading all three provides a robust perspective on the philosophy of science.

As with Quinn, the writing is approachable. I had expected it to be stilted. It is academic, and it may boost your vocabulary, but give it a gander. It also works well in audiobook format if you are so inclined.

This about closes out 2024. What do you think about these choices? Agree or disagree? What are your top recommendations?

Required Reading: Science

The Structure of Scientific Revolutions was published in 1962. Written by Thomas Kuhn, it introduced the world to the concept of paradigm shifts in science — and, as it turns out, elsewhere. As I mentioned recently, I experienced a mishap, confounding it with Paul Feyerabend’s Against Method, first published in 1975. Both of these should be required reading for year 10 – or at least taught in summary.

I had read Feyerabend years ago but was only familiar with Kuhn from a distance. I’m glad we’ve become more intimate. These authors take different approaches yet at times arrive in the same place. Kuhn takes a Modernist approach that he critiques and modifies. Feyerabend takes a Postmodernist path that sometimes crosses Kuhn’s.

Ah, the delightful dance of paradigms and anarchism in the hallowed halls of science! Let’s delve deeper into the intellectual pas de deux between Thomas Kuhn and Paul Feyerabend, those audacious thinkers who dared to challenge the sanctity of scientific methodology.

Kuhn’s Paradigm Shifts: The Scientific Waltz

Thomas Kuhn, in his seminal work The Structure of Scientific Revolutions, introduced us to the concept of paradigm shifts—a term now so overused that even corporate PowerPoint presentations aren’t spared. Kuhn posited that science doesn’t progress through a linear accumulation of knowledge but rather through a series of revolutionary upheavals. These upheavals occur when the prevailing scientific framework, or “paradigm,” becomes as outdated as last season’s fashion, unable to account for emerging anomalies. In Kuhn’s view, the scientific community clings to its paradigms with the tenacity of a dog to its bone, until the weight of anomalies forces a collective epiphany, leading to a paradigm shift. This cyclical process propels scientific advancement, albeit in a manner reminiscent of a drunken sailor’s stagger rather than a straight path.

Feyerabend’s Epistemological Anarchism: The Punk Rock of Science

Enter Paul Feyerabend, the enfant terrible of the philosophy of science, with his provocative manifesto Against Method. Feyerabend gleefully dismantled the notion of a universal scientific method, advocating for “epistemological anarchism.” He argued that the rigid adherence to methodological rules is about as useful as a chocolate teapot, stifling creativity and hindering progress. In Feyerabend’s anarchic utopia, “anything goes” in the pursuit of knowledge, and the scientific method is more of a loose suggestion than a strict protocol. His critique was not just a call for methodological diversity but a full-blown rebellion against the tyranny of scientific dogmatism.

A Comparative Analysis: Method to the Madness

While Kuhn and Feyerabend both challenged the orthodox views of scientific progress, their approaches were as different as chalk and cheese. Kuhn’s analysis was rooted in historical case studies, portraying scientific revolutions as communal shifts in perspective, akin to a collective midlife crisis. Feyerabend, on the other hand, took a more radical stance, suggesting that the very idea of a fixed scientific method is as mythical as unicorns. Where Kuhn saw periods of “normal science” punctuated by revolutionary shifts, Feyerabend saw a chaotic free-for-all, where progress is made not by following rules but by breaking them.

Implications for Scientific Practice: Order in Chaos

The implications of their critiques are profound. Kuhn’s work suggests that scientists should remain open to paradigm shifts, lest they become as obsolete as Betamax in a Netflix era. Feyerabend’s anarchism, while controversial, serves as a reminder that innovation often requires the audacity to defy convention. Together, they paint a picture of science not as a monolithic quest for truth but as a dynamic, often tumultuous, human endeavour.

Conclusion: The Legacy of Intellectual Rebellion

In conclusion, the works of Kuhn and Feyerabend invite us to view science through a more sceptical lens, questioning the sanctity of its methods and the rigidity of its paradigms. Their critiques serve as a clarion call for intellectual flexibility, urging us to embrace the chaos and complexity inherent in the pursuit of knowledge. After all, in the grand theatre of science, it’s often the most unconventional performances that leave a lasting impact.