Not the darkness after the light, but the shadow the light forgot it cast
The Enlightenment promised light. What it delivered was fluorescence – bright, sterile, and buzzing with the sound of its own reason.
The Anti-Enlightenment Project gathers a set of essays, fragments, and quotations tracing how that light dimmed – or perhaps was never as luminous as advertised. It’s less a manifesto than a map of disintegration: how agency became alibi, how reason became ritual, and how modernity mistook motion for progress.
Preprints and essays (Against Agency, Rational Ghosts, Temporal Ghosts, and others to follow)
Related reflections from Philosophics posts going back to 2019
A living index of quotations from Nietzsche to Wynter, tracing philosophy’s slow discovery that its foundation may have been sand all along
This isn’t a war on knowledge, science, or reason – only on their misappropriation as universal truths. The Anti-Enlightenment simply asks what happens when we stop pretending that the Enlightenment’s “light” was neutral, natural, or necessary.
It’s not reactionary. It’s diagnostic.
The Enlightenment built the modern world; the Anti-Enlightenment merely asks whether we mistook the glare for daylight.
After revisiting MacIntyre on Nietzsche – with Descartes lurking in the background – I think it’s time for another round on dis-integrationism.
Audio: NotebookLM podcast on this topic.
Philosophy has a bad renovation habit. Descartes tears the house down to its studs, then nails God back in as a load-bearing beam. Nietzsche dynamites the lot, then sketches a heroic Übermensch strutting through the rubble. MacIntyre sighs, bolts Aristotle’s virtue table to the frame, and calls it load-bearing furniture. The pattern repeats: demolition, followed by reconstruction, always with the insistence that this time the structure is sound.
Video: Jonny Thompson’s take on Nietzsche also inspired this post.
But the error isn’t in tearing down. The error is in rushing to rebuild. That’s where the hyphen in dis-integrationism matters – it insists on the pause, the refusal to immediately re-integrate. We don’t have to pretend the fragments are secretly a whole. We can live with the splinters.
Yes, someone will protest: “We need something.” True enough. But the something is always a construction – provisional, contingent, human. The problem isn’t building; the problem is forgetting that you’ve built, then baptising the scaffolding as eternal bedrock.
Modernity™ is a museum of such floorboards: rationalism, utilitarianism, rights-talk, virtue ethics, each nailed down with solemn confidence, each creaking under the weight of its contradictions. The sane position is not to deny the need for floors, but to remember they are planks, not granite.
For the religious, the reply is ready-made: God is the foundation, the rock, the alpha and omega. But that is already a construction, no matter how venerable. Belief may provide the feeling of solidity, but it still arrives mediated by language, institutions, rituals, and interpretation. The Decalogue is a case in point: per the lore, God conveyed information directly to Abraham, Moses, the prophets, and onward in an unbroken chain. The claim is not only that the foundation exists, but that certain communities possess unique and privileged access to it — through scripture, tradition, and “reasons” that somehow stop short of being just more scaffolding.
Yet history betrays the trick. The chain is full of edits, schisms, rival prophets, councils, translations, and contradictions – each presented not as construction but as “clarification.” The gapless transmission is a myth; the supposed granite is a patchwork of stone and mortar. A dis-integrationist view doesn’t deny the weight these systems carry in people’s lives, but it refuses to mistake architecture for geology. Whatever floor you stand on was built, not found.
Dis-integrationism is simply the refusal to be gaslit by metaphysics.
Freud once quipped that people are “normal” only on average. To the degree that they deviate from the mean, they are neurotic, psychotic, or otherwise abnormal. Whatever else one thinks of Freud, the metaphor holds for Modernity.
Image: Picture and quote by Sigmund Freud: Every normal person, in fact, is only normal on the average. His ego approximates to that of the psychotic in some part or other and to a greater or lesser extent. —Analysis Terminable And Interminable (1937), Chapter V
We are “Modern” only on average, and only for the first standard deviation. Within one sigma, you can wave a flag and declare: rational, secular, Enlightened. But step further into the tails and the façade dissolves. The “normal” modern turns out to attend megachurches, consult horoscopes, share conspiracy memes, or cling to metaphysical relics that Enlightenment reason was supposed to have torched centuries ago.
„Jeder Normale ist eben nur durchschnittlich normal, sein Ich nähert sich dem des Psychotikers in dem oder jenem Stück, in größerem oder geringerem Ausmaß.“
The problem isn’t that these people aren’t Modern. The problem is that nobody is Modern, not in the sense the story requires. The mean is an over-fitted abstraction. “Modernity” works like Freud’s “normal”: a statistical average that erases the deviations, then insists that the erased bits are pathology rather than reality.
But the tails are where most of human life actually happens. The “average Modern” is as mythical as the “reasonable person.” What we call Modernity is just a bell curve costume draped over the same mix of superstition, desire, and contingency that has always driven human behaviour.
Kant, bless him, thought he was staging the trial of Reason itself, putting the judge in the dock and asking whether the court had jurisdiction. It was a noble spectacle, high theatre of self-scrutiny. But the trick was always rigged. The presiding judge, the prosecution, the jury, the accused, all wore the same powdered wig. Unsurprisingly, Reason acquitted itself.
The Enlightenment’s central syllogism was never more than a parlour trick:
P1: The best path is Reason.
P2: I practice Reason.
C: Therefore, Reason is best.
It’s the self-licking ice-cream cone of intellectual history. And if you dare to object, the trap springs shut: what, you hate Reason? Then you must be irrational. Inquisitors once demanded heretics prove they weren’t in league with Satan; the modern equivalent is being told you’re “anti-science.” The categories defend themselves by anathematising doubt.
The problem is twofold:
First, Reason never guaranteed agreement. Two thinkers can pore over the same “facts” and emerge with opposite verdicts, each sincerely convinced that Reason has anointed their side. In a power-laden society, it is always the stronger voice that gets to declare its reasoning the reasoning. As Dan Hind acidly observed, Reason is often nothing more than a marketing label the powerful slap on their interests.
Second, and this is the darker point, Reason itself is metaphysical, a ghost in a powdered wig. To call something “rational” is already to invoke an invisible authority, as if Truth had a clerical seal. Alasdair MacIntyre was right: strip away the old rituals and you’re left with fragments, not foundations.
Other witnesses have tried to say as much. Horkheimer and Adorno reminded us that Enlightenment rationality curdles into myth the moment it tries to dominate the world. Nietzsche laughed until his throat bled at the pretence of universal reason, then promptly built his own metaphysics of will. Bruno Latour, in We Have Never Been Modern, dared to expose Science as what it actually is – a messy network of institutions, instruments, and politics masquerading as purity. The backlash was so swift and sanctimonious that he later called it his “worst” book, a public recantation that reads more like forced penance than revelation. Even those who glimpsed the scaffolding had to return to the pews.
So when we talk about “Reason” as the bedrock of Modernity, let’s admit the joke. The bedrock was always mist. The house we built upon it is held up by ritual, inertia, and vested interest, not granite clarity. Enlightenment sold us the fantasy of a universal judge, when what we got was a self-justifying oracle. Reason is not the judge in the courtroom. Reason is the courtroom itself, and the courtroom is a carnival tent – all mirrors, no floor.
We’re told we live in the Enlightenment, that Reason™ sits on the throne and superstition has been banished to the attic. Yet when I disguised a little survey as “metamodern,” almost none came out as fully Enlightened. Three managed to shed every trace of the premodern ghost, one Dutch wanderer bypassed Modernity entirely, and not a single soul emerged free of postmodern suspicion. So much for humanity’s great rational awakening. Perhaps Modernity wasn’t a phase we passed through at all, but a mirage we still genuflect before, a lifestyle brand draped over a naked emperor.
Audio: NotebookLM podcast on this topic.
The Enlightenment as Marketing Campaign
The Enlightenment is sold to us as civilisation’s great coming-of-age: the dawn when the fog of superstition lifted and Reason took the throne. Kant framed it as “man’s emergence from his self-incurred immaturity” – an Enlightenment bumper sticker that academics still like to polish and reapply. But Kant wasn’t writing for peasants hauling mud or women without the vote; he was writing for his own coterie of powdered-wig mandarins, men convinced their own habits of rational debate were humanity’s new universal destiny.
Modernity, in this story, isn’t a historical stage we all inhabited. It’s an advertising campaign: Reason™ as lifestyle brand, equality as tagline, “progress” as the logo on the tote bag. The textbooks bill it as a historical epoch, a kind of secular Pentecost in which the lights came on and we all finally started thinking for ourselves. In practice, it was more of a boutique fantasy, a handful of gentlemen mistaking their own rarefied intellectual posture for humanity’s destiny.
The Archetype That Nobody Lives In
At the core of the Enlightenment lies the archetype of Man™: rational, autonomous, unencumbered by superstition, guided by evidence, weighing pros and cons with the detachment of a celestial accountant. Economics repackaged him as homo economicus, forever optimising his utility function as if he were a spreadsheet in breeches.
But like all archetypes, this figure is a mirage. Our survey data, even when baited as a “metamodern survey”, never produced a “pure” Enlightenment subject.
3 scored 0% Premodern (managing, perhaps, to kick the gods and ghosts to the kerb).
1 scored 0% Modern (the Dutch outlier: 17% Premodern, 0% Modern, 83% Post, skipping the Enlightenment altogether, apparently by bike).
0 scored 0% Postmodern. Every single participant carried at least some residue of suspicion, irony, or relativism.
The averages themselves were telling: roughly 18% Premodern, 45% Modern, 37% Postmodern. That’s not an age of Reason. That’s a muddle, a cocktail of priestly deference, rationalist daydreams, and ironic doubt.
Even the Greats Needed Their Crutches
If the masses never lived as Enlightenment subjects, what about the luminaries? Did they achieve the ideal? Hardly.
Descartes, desperate to secure the cogito, called in God as guarantor, dragging medieval metaphysics back on stage.
Kant built a cathedral of reason only to leave its foundations propped up by noumena: an unseeable, unknowable beyond.
Nietzsche, supposed undertaker of gods, smuggled in his own metaphysics of will to power and eternal recurrence.
William James, surveying the wreckage, declared that “truth” is simply “what works”, a sort of intellectual aspirin for the Enlightenment headache.
And economists, in a fit of professional humiliation, pared the rational subject down to a corpse on life support. Homo economicus became a creature who — at the very least, surely — wouldn’t choose to make himself worse off. But behavioural economics proved even that meagre hope to be a fantasy. People burn their wages on scratch tickets, sign up for exploitative loans, and vote themselves into oblivion because a meme told them to.
If even the “best specimens” never fully embodied the rational archetype, expecting Joe Everyman, who statistically struggles to parse a sixth-grade text and hasn’t cracked a book since puberty, to suddenly blossom into a mini-Kant is wishful thinking of the highest order.
The Dual Inertia
The real story isn’t progress through epochs; it’s the simultaneous drag of two kinds of inertia:
Premodern inertia: we still cling to sacred myths, national totems, and moral certainties.
Modern inertia: we still pretend the rational subject exists, because democracy, capitalism, and bureaucracy require him to.
The result isn’t a new epoch. It’s a cultural chimaera: half-superstitious, half-rationalist, shot through with irony. A mess, not a phase.
Arrow’s Mathematical Guillotine
Even if the Enlightenment dream of a rational demos were real, Kenneth Arrow proved it was doomed. His Impossibility Theorem shows that no ranked voting system can aggregate individual rational preferences into a coherent “general will” without violating at least one minimal fairness condition. In other words, even a parliament of perfect Kants would deadlock when voting on dinner. The rational utopia is mathematically impossible.
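Beneath Arrow’s theorem sits Condorcet’s much older paradox: pairwise majorities can chase each other in a circle. A minimal sketch of that deadlock – my own illustration, not anything from Arrow or from the survey – with three perfectly consistent voters choosing among three dinner options:

```python
from itertools import combinations

# Three perfectly consistent voters ranking three dinner options (best first).
ballots = [
    ["A", "B", "C"],
    ["B", "C", "A"],
    ["C", "A", "B"],
]

def pairwise_winner(ballots, x, y):
    """Return whichever option a majority ranks above the other (None on a tie)."""
    x_votes = sum(1 for b in ballots if b.index(x) < b.index(y))
    y_votes = len(ballots) - x_votes
    if x_votes == y_votes:
        return None
    return x if x_votes > y_votes else y

options = sorted({o for b in ballots for o in b})
for x, y in combinations(options, 2):
    print(f"{x} vs {y}: majority prefers {pairwise_winner(ballots, x, y)}")

# Prints:
#   A vs B: majority prefers A
#   A vs C: majority prefers C
#   B vs C: majority prefers B
# A beats B, B beats C, C beats A: the "general will" is a loop.
```

Each individual ballot is impeccably rational; only the aggregate is incoherent, which is roughly the intuition Arrow’s theorem generalises.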
So when we are told that democracy channels Reason, we should hear it as a polite modern incantation, no sturdier than a priest blessing crops.
Equality and the Emperor’s Wardrobe
The refrain comes like a hymn: “All men are created equal.” But the history is less inspiring. “Men” once meant property-owning Europeans; later it was generously expanded to mean all adult citizens who’d managed to stay alive until eighteen. Pass that biological milestone, and voilà — you are now certified Rational, qualified to determine the fate of nations.
And when you dare to question this threadbare arrangement, the chorus rises: “If you don’t like democracy, capitalism, or private property, just leave.” As if you could step outside the world like a theatre where the play displeases you. Heidegger’s Geworfenheit makes the joke bitter: we are thrown into this world without choice, and then instructed to exit if we find the wallpaper distasteful. Leave? To where, precisely? The void? Mars?
The Pre-Modern lord said: Obey, or be exiled. The Modern democrat says: Vote, or leave. And the Post-Enlightenment sceptic mutters: Leave? To where, exactly? Gravity? History? The species? There is no “outside” to exit into. The system is not a hotel; it’s the weather.
Here the ghost of Baudrillard hovers in the wings, pointing out that we are no longer defending Reason, but the simulacrum of Reason. The Emperor’s New Clothes parable once mocked cowardice: everyone saw the nudity but stayed silent. Our situation is worse. We don’t even see that the Emperor is naked. We genuinely believe in the fineries, the Democracy™, the Rational Man™, the sacred textile of Progress. And those who point out the obvious are ridiculed: How dare you mock such fineries, you cad!
Conclusion: The Comfort of a Ghost
So here we are, defending the ghost of a phase we never truly lived. We cling to Modernity as if it were a sturdy foundation, when in truth it was always an archetype – a phantom rational subject, a Platonic ideal projected onto a species of apes with smartphones. We mistook it for bedrock, built our institutions upon it, and now expend colossal energy propping up the papier-mâché ruins. The unfit defend it out of faith in their own “voice,” the elites defend it to preserve their privilege, and the rest of us muddle along pragmatically, dosing ourselves with Jamesian aspirin and pretending it’s progress.
Metamodernism, with its marketed oscillation between sincerity and irony, is less a “new stage” than a glossy rebranding of the same old admixture: a bit of myth, a bit of reason, a dash of scepticism. And pragmatism – James’s weary “truth is what works” – is the hangover cure that keeps us muddling through.
Modernity promised emancipation from immaturity. What we got was a new set of chains: reason as dogma, democracy as ritual, capitalism as destiny. And when we protest, the system replies with its favourite Enlightenment lullaby: If you don’t like it, just leave.
But you can’t leave. You were thrown here. What we call “Enlightenment” is not a stage in history but a zombie-simulation of an ideal that never drew breath. And yet, like villagers in Andersen’s tale, we not only guard the Emperor’s empty wardrobe – we see the garments as real. The Enlightenment subject is not naked. He is spectral, and we are the ones haunting him.
I’ve been reading Octavia Butler’s Dawn and find myself restless. The book is often lauded as a classic of feminist science fiction, but I struggle with it. My problem isn’t with aliens, or even with science fiction tropes; it’s with the form itself, the Modernist project embedded in the genre, which insists on posing questions and then supplying answers, like a catechism for progress. Sci-Fi rarely leaves ambiguity alone; it instructs.
Simone de Beauvoir understood “woman” as the Other – defined in relation to men, consigned to roles of reproduction, care, and passivity. Her point was not that these roles were natural, but that they were imposed, and that liberation required stripping them away.
Octavia Butler’s Lilith
Lilith Iyapo, the protagonist of Dawn, should be radical. She is the first human awakened after Earth’s destruction, a Black woman given the impossible role of mediating between humans and aliens. Yet the novel allows her not so much to resist her role as to embody it. She becomes the dutiful mother, the reluctant carer, the compliant negotiator. Butler’s narration frequently tells us what Lilith thinks and feels, as though to pre-empt the reader’s interpretation. She is less a character than an archetype: the “reasonable woman,” performing the script of liberal Western femininity circa the 1980s.
Judith Butler’s Lens
Judith Butler would have a field day with this. For her, gender is performative: not an essence but a repetition of norms. Agency, in her view, is never sovereign; it emerges, if at all, in the slippages of those repetitions. Read through this lens, Octavia Butler’s Lilith is not destabilising gender; she is repeating it almost too faithfully. The novel makes her into an allegory, a vessel for explaining and reassuring. She performs the role assigned and is praised for her compliance – which is precisely how power inscribes itself.
Why Sci-Fi Leaves Me Cold
This helps me understand why science fiction so often fails to resonate with me. The problem isn’t the speculative element; I like the idea of estrangement, of encountering the alien. The problem is the Modernist scaffolding that underwrites so much of the genre: the drive to solve problems, to instruct the reader, to present archetypes as universal stand-ins. I don’t identify with that project. I prefer literature that unsettles rather than reassures, that leaves questions open rather than connecting the dots.
So, Butler versus Butler on the bedrock of Beauvoir: one Butler scripting a woman into an archetype, another Butler reminding us that archetypes are scripts. And me, somewhere between them, realising that my discomfort with Dawn is not just with the book but with a genre that still carries the DNA of the very Modernism it sometimes claims to resist.
“What is up with us white people?” asks John Biewen in his TEDx talk The Lie That Invented Racism. It’s the sort of line that makes a roomful of middle-class liberals laugh nervously, because it’s the kind of question we’d rather leave to other people – preferably the ones already burdened with the consequences of our civilisational mess. But Biewen’s point, following Ibram X. Kendi, is that race is not some primordial fact, a tragic misunderstanding of melanin levels. It was invented, quite literally, by a Portuguese royal propagandist in the fifteenth century, and it has been paying dividends to “us” ever since.
Video: TEDx Talk with John Biewen
Yes, invented. Not discovered like a continent, not unearthed like a fossil, not deduced like a law of motion. Fabricated. Gomes de Zurara, a court chronicler under King Afonso V, was tasked with writing a stirring tale to justify Portugal’s shiny new business model: kidnapping Africans and selling them like cattle. Zurara obligingly lumped every tribe and tongue south of the Sahara into a single category – “the Blacks,” beastly and conveniently inferior – and thus performed the intellectual sleight of hand that would metastasise into centuries of racial taxonomy. It wasn’t science. It wasn’t reason. It was marketing.
And here lies the exquisite irony: this happened at the dawn of Modernity, that self-anointed Age of Reason. The Enlightenment’s sales pitch was universality – “all men are created equal,” etc. – but tucked in the fine print was the little caveat that “man” actually meant white, European, propertied man. Everyone else? Barbaric, uncivilised, or in need of civilising at the end of a whip. Modernity congratulated itself on escaping medieval superstition while simultaneously cooking up the most profitable superstition of all: that human worth can be ranked by pigmentation.
Audio: NotebookLM podcast discusses this topic.
This is why racism has proved so stubborn. If it were merely a misunderstanding, like thinking the Earth is flat, we’d have grown out of it. But racism was never about confusion; it was about utility. A well-tuned lie, weaponised to justify land theft, slavery, and empire, then codified into law, census, and property rights. As Kendi and others point out, race became the scaffolding for a political economy that had to square Christian salvation with chains and sugar plantations. Voilà: whiteness – not as an identity, but as a racket.
And yet, “good white people” (Biewen’s term, delivered with that Minnesota-nice grimace) still act as though racism is a tragic but external drama: Black people versus hood-wearing villains, while we clap politely from the sidelines. But there are no sidelines. Whiteness was built to privilege us; neutrality is just complicity in better shoes. As historian Nell Irvin Painter reminds us, the Greeks thought they were superior, yes – but on cultural, not chromatic grounds. Race, as a concept, is a modern fix, not a timeless truth.
So what’s the moral? Stop romanticising the Enlightenment as though it were some grand emancipation. It was also a bureaucracy for inequality, a rationalisation engine that could make even human trafficking sound like a noble project. To dismantle racism is not to cleanse an ancient superstition but to tear out one of Modernity’s central operating systems.
The uncomfortable fact – the one Biewen leaves hanging like smoke after the torch march – is this: if whiteness was invented for profit, then dismantling it is not philanthropy. It is debt repayment. And debt, as any bank will tell you, compounds with interest.
I’ve read Part I of Hobbes’ Leviathan and wonder what it would have been like had he filtered his thoughts through Hume or Wittgenstein. Hobbes makes Dickens read like Pollyanna. It’s an interesting historical piece, worth reading on that basis alone. It reads as if the Christian Bible had been put through legal review before publication, sapped of its vigour. As bad a rap as Schopenhauer seems to get, Hobbes is the consummate Ebenezer Scrooge. Bah, humbug – you nasty, brutish, filthy animals!*
Audio: NotebookLM podcast conversation on this topic.
In any case, it got me thinking of free will and, more to the point, of will itself.
A Brief History of Humanity’s Favourite Metaphysical Scapegoat
By the time Free Will turned up to the party, the real guest of honour—the Will—had already been drinking heavily, muttering incoherently in the corner, and starting fights with anyone who made eye contact. We like to pretend that the “will” is a noble concept: the engine of our autonomy, the core of our moral selves, the brave little metaphysical organ that lets us choose kale over crisps. But in truth, it’s a bloody mess—philosophy’s equivalent of a family heirloom that no one quite understands but refuses to throw away.
So, let’s rewind. Where did this thing come from? And why, after 2,500 years of name-dropping, finger-pointing, and metaphysical gymnastics, are we still not quite sure whether we have a will, are a will, or should be suing it for damages?
Plato: Soul, Reason, and That Poor Horse
In the beginning, there was Plato, who—as with most things—half-invented the question and then wandered off before giving a straight answer. For him, the soul was a tripartite circus act: reason, spirit, and appetite. Will, as a term, didn’t get top billing—it didn’t even get its name on the poster. But the idea was there, muddling along somewhere between the charioteer (reason) and his two horses – one noble and spirited, the other unruly with appetite.
No explicit will, mind you. Just a vague sense that the rational soul ought to be in charge, even if it had to beat the rest of itself into submission.
Aristotle: Purpose Without Pathos
Aristotle, ever the tidy-minded taxonomist, introduced prohairesis—deliberate choice—as a sort of proto-will. But again, it was all about rational calculation toward an end. Ethics was teleological, goal-oriented. You chose what aligned with eudaimonia, that smug Greek term for flourishing. Will, if it existed at all, was just reason picking out dinner options based on your telos. No inner torment, no existential rebellion—just logos in a toga.
Augustine: Sin, Suffering, and That Eternal No
Fast-forward a few hundred years, and along comes Saint Augustine, traumatised by his libido and determined to make the rest of us suffer for it. Enter voluntas: the will as the seat of choice—and the scene of the crime. Augustine is the first to really make the will bleed. He discovers he can want two incompatible things at once and feels properly appalled about it.
From this comes the classic Christian cocktail: freedom plus failure equals guilt. The will is free, but broken. It’s responsible for sin, for disobedience, for not loving God enough on Wednesdays. Thanks to Augustine, we’re stuck with the idea that the will is both the instrument of salvation and the reason we’re going to Hell.
Cheers.
Medievals: God’s Will or Yours, Pick One
The Scholastics, never ones to let an ambiguity pass unanalysed, promptly split into camps. Aquinas, ever the reasonable Dominican, says the will is subordinate to the intellect. God is rational, and so are we, mostly. But Duns Scotus and William of Ockham, the original voluntarist hooligans, argue that the will is superior—even in God. God could have made murder a virtue, they claim, and you’d just have to live with it.
From this cheerful perspective, will becomes a force of arbitrary fiat, and humans, made in God’s image, inherit the same capacity for irrational choice. The will is now more than moral; it’s metaphysical. Less reason’s servant, more chaos goblin.
Hobbes: Appetite with Delusions of Grandeur
Then along comes Thomas Hobbes, who looks at the soul and sees a wheezing machine of appetites. Will, in his famously cheery view, is simply “the last appetite before action.” No higher calling, no spiritual struggle—just the twitch that wins. Man is not a rational animal, but a selfish algorithm on legs. For Hobbes, will is where desire stumbles into motion, and morality is a polite euphemism for not getting stabbed.
Kant: The Will Gets a Makeover
Enter Immanuel Kant: powdered wig, pursed lips, and the moral rectitude of a man who scheduled his bowel movements. Kant gives us the “good will”, which acts from duty, not desire. Suddenly, the will is autonomous, rational, and morally legislative—a one-man Parliament of inner law.
It’s all terribly noble, terribly German, and entirely exhausting. For Kant, free will is not the ability to do whatever you like—it’s the capacity to choose according to moral law, even when you’d rather be asleep. The will is finally heroic—but only if it agrees to hate itself a little.
Schopenhauer: Cosmic Will, Cosmic Joke
And then the mood turns. Schopenhauer, world’s grumpiest mystic, takes Kant’s sublime will and reveals it to be a blind, thrashing, cosmic force. Will, for him, isn’t reason—it’s suffering in motion. The entire universe is will-to-live: a desperate, pointless striving that dooms us to perpetual dissatisfaction.
There is no freedom, no morality, no point. The only escape is to negate the will, preferably through aesthetic contemplation or Buddhist-like renunciation. In Schopenhauer’s world, the will is not what makes us human—it’s what makes us miserable.
Nietzsche: Transvaluation and the Will to Shout Loudest
Cue Nietzsche, who takes Schopenhauer’s howling void and says: yes, but what if we made it fabulous? For him, the will is no longer to live, but to power—to assert, to create, to impose value. “Free will” is a theologian’s fantasy, a tool of priests and moral accountants. But will itself? That’s the fire in the forge. The Übermensch doesn’t renounce the will—he rides it like a stallion into the sunset of morality.
Nietzsche doesn’t want to deny the abyss. He wants to waltz with it.
Today: Free Will and the Neuroscientific Hangover
And now? Now we’re left with compatibilists, libertarians, determinists, and neuroscientists all shouting past each other, armed with fMRI machines and TED talks. Some claim free will is an illusion, a post hoc rationalisation made by brains doing what they were always going to do. Others insist that moral responsibility requires it, even if we can’t quite locate it between the neurons.
We talk about willpower, will-to-change, political will, and free will like they’re real things. But under the hood, we’re still wrestling with the same questions Augustine posed in a North African villa: Why do I do what I don’t want to do? And more importantly, who’s doing it?
Conclusion: Where There’s a Will, There’s a Mess
From Plato’s silent horses to Nietzsche’s Dionysian pyrotechnics, the will has shape-shifted more times than a politician in an election year. It has been a rational chooser, a moral failure, a divine spark, a mechanical twitch, a cosmic torment, and an existential triumph.
Despite centuries of philosophical handwringing, what it has never been is settled.
So where there’s a will, there’s a way. But the way? Twisting, contradictory, and littered with the corpses of half-baked metaphysical systems.
Welcome to the labyrinth. Bring snacks.
* The solitary, poor, nasty, brutish, and short quote is forthcoming. Filthy animals is a nod to Home Alone.
I’ve finally had time to create some video content for the Modernity Worldview Survey. This content is a cursory overview and serves as an introduction to deeper content planned for the future.
The video runs just short of seven minutes, so it only briefly outlines the worldviews and the questions. I opted not to produce a single comprehensive video so the material could arrive sooner. The content is bookmarked into chapters, though this is likely overkill for such a short video.
A permanent page about the survey is always available on this blog.
I’m still accumulating responses, but the survey is available here if you haven’t taken it. Apologies in advance for the fact that it renders best on a larger monitor or tablet rather than a mobile phone. It doesn’t render at all on a landline, so there’s that.
Firstly, I’d like to thank the people who have already submitted responses to the Modernity Worldview Survey. For the record, your entries were submitted before this warning was presented.
Google has taken action and very responsively removed this warning. If you saw this whilst attempting to visit the URL, try again. Sorry for any fright or inconvenience. I’ll continue as if this never happened. smh
I am frustrated, to say the least. I created this survey over the past month or so, writing, rewriting, refactoring, and switching technology and hosts until I settled on Google Cloud (GCP). It worked fine yesterday. When I visited today, I saw this warning.
As I mentioned in my announcement post, I collect no personal information. I don’t even ask for an email address, let alone a credit card number. On a technical note, this is the information I use:
id – autogenerated unique identifier
timestamp – date and time of record creation (UTC)
question-response – which response option was selected for each question
ternary-triplet – the position of the average modernity score (pre, mod, post)
plot_x – Cartesian x-axis plot point for the ternary chart
plot_y – Cartesian y-axis plot point for the ternary chart
session_id – facilitates continuity of a user’s browser session
browser* – which browser is being used (Chrome, Safari, and so on)
region – the browser’s language setting (US, GB, FR)
source – whether the user is accessing from the web or ‘locally’ (‘local’ indicates a test record, so I can filter those out)
* These examples illustrate the collected browser information:
- Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/132.0.0.0 Safari/537.36
- Mozilla/5.0 (Linux; Android 10; K) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/132.0.0.0 Mobile Safari/537.36
This is all.
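A side note on plot_x and plot_y: these are presumably just the standard projection of the (pre, mod, post) triplet onto a ternary chart. A minimal sketch of that mapping, assuming the triplet is normalised to sum to one and the corner layout given in the comments – my reconstruction for illustration, not the survey’s actual code:

```python
import math

def ternary_to_cartesian(pre: float, mod: float, post: float) -> tuple:
    """Project a (pre, mod, post) score triplet onto a unit-edge ternary chart.

    Assumed corner layout (an illustration, not necessarily the survey's geometry):
    pre at (0, 0), mod at (1, 0), post at (0.5, sqrt(3)/2).
    """
    total = pre + mod + post
    pre, mod, post = pre / total, mod / total, post / total  # normalise defensively
    x = mod + 0.5 * post
    y = (math.sqrt(3) / 2) * post
    return (x, y)

# Example: the rough survey averages of 18% pre, 45% mod, 37% post.
print(ternary_to_cartesian(0.18, 0.45, 0.37))  # ≈ (0.635, 0.320)
```

Whatever the actual corner layout, the point is only that the stored coordinates are a deterministic function of the triplet, nothing more.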
This is a Chrome warning – ironically, a Google product. I tested the site in Opera, Edge, and Safari without this nonsense.
The front end (UI) is written in HTML, Python, JavaScript, and React with some standard imports. The backend (database) is MySQL. It is version-controlled on GitHub and entirely hosted on GCP. I link to the survey from here (WordPress) or other social media presences. I did make the mistake of not making the site responsive. I paid the price when I visited the site on my Samsung S24. The page felt like the size of a postage stamp. I may fix this once this security issue is resolved.
I sent Google a request to remove this from their blacklist. This could take three weeks, more or less.
Meantime, I’ll pause survey promotions and hope this resolves quickly. The survey will remain live. If you use something other than Chrome, you should be able to take it. Obviously, I’ll also delay analysing and releasing any summary results.
Apologies for rambling. Thank you for your patience.