Machines Hallucinate. Humans Call It Civilisation.

The fashionable complaint is that large language models ‘hallucinate’. They spit out confident nonsense: the tech pundits are scandalised. A chatbot invents a book, attributes it to Plato, and spins a dialogue about Quidditch. ‘Hallucination!’ they cry. Machines, it seems, are liars.

Spare me. Humans are the original hallucination engines. We fabricate categories, baptise them in blood, and call the result ‘truth’. At least the machine doesn’t pretend its fictions are eternal.

Take race and selfhood, our twin masterpieces of make-believe.

  • They carve boundaries out of fog. Race slices humanity into ‘Black’ and ‘White’. Selfhood cordons ‘I’ off from ‘you’. Arbitrary partitions, paraded as natural law.
  • They’re locked in by violence and paperwork. Race was stabilised by censuses, laws, and medical forms – the same bureaucratic gears that powered slavery, eugenics, and apartheid. Selfhood was stabilised by diaries, court records, and psychiatric charts. Write it down often enough, and suddenly the fiction looks inevitable.
  • They make bodies manageable. Race sorts populations into hierarchies and facilitates exploitation. Selfhood sorts individuals for property, accountability, and punishment. Together, they’re operating systems for social control.
  • They are fictions with teeth. Like money, they don’t need metaphysical foundations. They are real because people are willing to kill for them.

These hallucinations aren’t timeless. They were midwifed in the Enlightenment: anthropology provided us with racial typologies, the novel refined the autonomous self, and colonial administration ran the pilot projects. They’re not ancient truths – they’re modern software, installed at scale.

So when ethicists wring their hands over AI ‘hallucinations’, the joke writes itself. Machines hallucinate openly. Humans hallucinate and call it scripture, law, and identity. The only difference is centuries of institutional muscle behind our delusions.

And here’s the sting: what terrifies us isn’t that AI produces falsehoods. It’s that, given time, its inventions might start to look as solid as ours. Fifty years from now, people could be killing for an algorithm’s hallucination the same way they’ve been killing for gods, races, and selves.