Myselves

Disappointed from the start, I was hoping to have coined a neologism in myselves, but I’ve been beaten to the punch. Although my spell-check doesn’t appear to agree, myselves is a legitimate albeit nonstandard term.

Followers of my content will recognise that I don’t fully subscribe to notions of self or identity, so, being a philosopher and linguaphile, I am constantly searching for another way to describe my reality.

Galen Strawson — What Are Selves?

I became aware of Galen Strawson through Daniel Dennett, and I engage with both of their perspectives in a recent post, Testudineous Agency. In an attempt to better understand his position, I resorted to a Google search and unearthed some first-person narratives. I find I share a certain affinity with him.

Ostensibly, Strawson feels that free will and moral responsibility don’t exist. But he goes deeper. He acknowledges that not only do the concepts of free will and moral responsibility not have shared meaning for unequivocal communication, but even if we parse the terms more fully into free, will, moral, and responsibility, we still don’t come to accordance. More on this later.

In the case of myselves, one of my first reactions was to consider the anti-plural-pronoun application-as-singular-object-reference cohort: It’s not proper to refer to he or she as they and him or her as them—or for that matter, his or hers for their.

As for me—the me interacting with this keyboard in this moment—the idea of thin-slicing my differentiated selves intrigues me: nanosecond by nanosecond, picosecond by picosecond—or by femtoseconds or attoseconds. Or why not Planck-time slices?

Just a short post for now. I’ll see where it ends up.

Testudineous Agency

In chapter 71, Ultimate Responsibility, in Intuition Pumps and Other Tools for Thinking, author and philosopher Daniel Dennett presents a counterargument to the notion that an agent—a person—is not absolutely responsible for their actions. He questions some premises in the ‘way you are’ line of argumentation, but I question some of his questions.

Here is a nice clear version of what some thinkers take to be the decisive argument. It is due in this form to the philosopher Galen Strawson (2010):
1. You do what you do, in any given situation, because of the way you are.
2. So in order to be ultimately responsible for what you do, you have to be ultimately responsible for the way you are—at least in certain crucial mental respects.
3. But you cannot be ultimately responsible for the way you are in any respect at all.
4. So you cannot be ultimately responsible for what you do.

Dennett, Daniel C. Intuition Pumps And Other Tools for Thinking (p. 395). W. W. Norton & Company. Kindle Edition.
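Setting premise (1) aside as stage-setting, steps (2) through (4) reduce to a simple modus tollens. A minimal sketch in my own notation (the labels R and W are mine, not Strawson’s):

```latex
% R := you are ultimately responsible for what you do
% W := you are ultimately responsible for the way you are
\[
\text{(2) } R \to W, \qquad
\text{(3) } \neg W, \qquad
\therefore\ \text{(4) } \neg R
\]
```

Since modus tollens is a valid inference form, the conclusion stands or falls entirely with the premises, and premise (3) is where the action is.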

Dennett continues.

The first premise is undeniable: “the way you are” is meant to include your total state at the time, however you got into it. Whatever state it is, your action flows from it non-miraculously.

Dennett and I are in agreement with Strawson. There is not much to see here. It’s akin to saying the now is the result of all past events until now. This is “the way you are”.

The second premise observes that you couldn’t be “ultimately” responsible for what you do unless you were “ultimately” responsible for getting yourself into that state—at least in some regards.

This second premise asserts that one cannot be responsible for any action that one had no part in performing. Two scenarios come immediately to mind.

First, you are not responsible for being born. As Heidegger notes, we are all thrown into this world. We have no say in when or where—what country or family—or what circumstances.

Second, if one is hypnotised or otherwise incapacitated, and then involved in a crime, one is merely a cog and not an agent, so not responsible in any material sense.

But according to step (3) this is impossible.

Whilst Dennett fixates on the absolute aspect of the assertion, I’d like to be more charitable and suggest that we still end up with a sorites paradox. Dennett will return to this one, and so shall I.

So step (4), the conclusion, does seem to follow logically. Several thinkers have found this argument decisive and important. But is it really?

As Dennett invalidates step (3), he insists that the conclusion is also invalid. He asserts that the notion of absolute responsibility is a red herring, and I argue that Dennett doesn’t get us much further, perhaps redirecting us with a pink herring.

I’ve created an image with tortoises to make my point. There are actually two points I wish to make. The first point is to determine where the responsibility is inherited. This point is meant to articulate that the world need not be strictly deterministic for one still to lack significant agency. The second point is that culpability is asserted as a need, and acceptance of this assertion is the problem.

Testuditude

The image depicts an evolution of an agent, with time progressing from left to right. The tortoise on the right is a product of each of the recursive tortoises to its left. The image means to convey that each subsequent tortoise is a genetic and social product of each tortoise prior. Of course, this is simplified: tortoises require pairs, so feel free to imagine each preceding tortoise as representing a pair, or to add that level of diagrammatic complexity.

This is not meant to distinguish between nature and nurture. Instead, the claim is that one is a product of both of these. Moreover, as genetic, epigenetic, and mimetic influences are transmitted in family units, they also occur through social interaction and the environment, as represented by the orange and green tortoises.

…if one is a product of genetic and mimetic forces, how much agency remains for culpability?

The point here is that if one is a product of genetic and mimetic forces, how much agency remains for culpability? Each person is an emergent unit—autonomous, yes, and yet highly programmed.

If I programme a boobytrap to kill or maim any intruder, the boobytrap has no agency. I assert, further, that the maker of that boobytrap has no more responsibility than the killing device.

The old hand grenade wired to a doorknob boobytrap trick

But who do we blame? you ask, and that’s precisely the problem. That a question can be asked doesn’t mean it has an answer; presuming one is a logical fallacy compounded by cognitive bias. This heuristic leaves us with faulty jurisprudence systems. Humans seem hardwired, as it were, to blame. Humans need to believe in the notion of free will because they need to blame because they need to punish because vengeance is part of human nature, to the extent there is human nature. There seems to be a propensity to frame everything as a causal relationship. Dennett calls this the intentional stance. To borrow a passage from Dennett…

This instinctual response is the source in evolution of the invention of all the invisible elves, goblins, leprechauns, fairies, ogres, and gods that eventually evolve into God, the ultimate invisible intentional system.

Dennett, Daniel C. Intuition Pumps And Other Tools for Thinking (p. 374). W. W. Norton & Company. Kindle Edition.
Fire Trap in Home Alone

Sins of the Fathers (and Mothers)

Let’s wrap this up with a sorites paradox. As I’ve already said, I agree with Dennett that the absolute aspect is unnecessary and undesired. The question remains how much agency™ does a person have once we account for the other factors? Is it closer to 90 per cent or 10 per cent? Apart from this, what is the threshold for culpability? Legal systems already have arbitrary (if not capricious) thresholds for this, whether mental capacity or age, which basically distils back to the realm of capacity.

I have no basis to even venture a guess, but that’s never stopped me before. I’d argue that the agency is closer to zero than to one hundred per cent of the total, and I’d propose that 70 per cent feels like a reasonable threshold.

I could have sworn I’d posted a position on this after I read Robert Sapolsky’s Behave. Perhaps it’s never made it out of drafts.

In closing, I don’t think we need to settle the question of determinism versus free will to recognise that, even without strict determinism, personal agency is still severely limited. And just as our political systems presume a level of rationality that is not apparent, so do legal systems presume a level of agency not present.

Instead Laugh

I promise not to make this blog about child development anecdotes. The house toddler exhibits an interesting adaptation.


I’ve raised a few toddlers in my day, and this one is different in the most delightful way. All other toddlers I’ve encountered—especially the pre-verbal variety—cry to solicit attention. Particularly, if they are napping in another room and awaken, eventually they beckon for attention. Where all others I’ve known—mine or those of strangers—have cried or fussed, this one laughs.

I can explain it in cognitive terms. She somehow connected laughter with attention, which was further reinforced by attention to her laughter. Having seen Joaquin Phoenix’s Joker, I am hoping she doesn’t go there. (Joking). It’s interesting. I wonder how common this is. I’d be interested in knowing if any readers have experienced this.

She still cries and fusses at frustration or discomfort—hair washing or brushing, falling, or wet diapers—but not when it comes to attention.

OneSide Zero, Instead Laugh

I leave with an old favourite tune…

Kid Speak

A toddler lives with me. She’s been on the brink of verbal language for the past few months, and I am sharing some observations.


Juicy Shoes

Juice (in a sippy cup) and shoe(s)

Two objects that play a large part in her verbal life are juice and shoes. As I hear her employ these words, they are virtually indistinguishable. I may actually misperceive her, and she could be uttering the same morphemes. I’ve captioned the images above with IPA renderings of how I perceive her.

In English (in IPA), juice is transcribed / d͡ʒˈuːs / and shoes is / ˈʃuːz /. She simplifies ‘juice’ by not voicing the leading alveolar plosive ‘d’ and by shortening the long vowel. For ‘shoes’, she similarly shortens the vowel sound and enunciates a voiceless rather than voiced alveolar fricative.
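These simplifications can be rendered as a toy rule set. This is purely illustrative; the stripped-down transcriptions (no stress marks) and the ordering of the rules are my own sketch of what I hear, not anything from a phonology reference:

```python
# Toy sketch of the toddler's simplifications described above.
# Transcriptions are simplified IPA (no stress marks) and are my own guesses.
ADULT_FORMS = {"juice": "dʒuːs", "shoes": "ʃuːz"}

def toddler_form(ipa: str) -> str:
    """Apply the three simplifications noted above, in order."""
    out = ipa.replace("dʒ", "tʃ")  # devoice the onset affricate
    out = out.replace("uː", "u")   # shorten the long vowel
    out = out.replace("z", "s")    # devoice the final fricative
    return out

for word, ipa in ADULT_FORMS.items():
    print(word, ipa, "->", toddler_form(ipa))
```

Run through the rules, ‘juice’ comes out as tʃus and ‘shoes’ as ʃus, differing only in the onset stop, which is why the two end up near-indistinguishable to my ear.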

Awhr

Left to Right, Top to Bottom: Windscreen Ice Scraper, Silicone Dinosaur Hand Puppet, Stick, Cap Shaper

Can you guess the common thread the above objects share?

Spoiler Alert: They are each signified by the same signifier— Awhr, which I believe may be transliterated as Rawr. Bear with me.

She loves dinosaurs. Seeing one, she reflexively roars onomatopoeically, awhr. I know. You are thinking to yourself, that’s a no-brainer. Dinosaurs roar. At least in the modern-day mythos. I don’t speak dinosaur, and perhaps dinos had regional dialects or species nuances. She’s just a toddler, so ‘awhr’ is representative of all dinos.

But, you are thinking, these other things not only don’t roar, but they’re also inanimate. Sure, you tell yourself, the dinosaur is a puppet, but you can envisage the connexion. That’s a toy dinosaur, but these other things are an ice scraper, a stick, and who knows what that last thing is? It’s a hat shaper. It was an insert to a cap—like a baseball cap or trucker cap. It was removed for the cap to be wearable, and the insert is one of her favourite toys. All of these rank high on her list of preferred toys.

As far as I can tell, she envisages the cap insert as teeth—resembling, perhaps, the teeth of a dinosaur—hence ferocity, hence a roar. I believe the roar-stick connexion has an aetiology that involves the ice scraper, so I’ll share that origin story.

Whilst shopping for an ice scraper in winter, we were in the automotive aisle. As she was interested in the variety of air fresheners, I parked her trolley and surveyed the aisle looking for scrapers. Finding one—and, for reasons unknown to me, perhaps the ‘teeth’ on the back of it—I presented it as a claw and produced my own roar. The impression was made. It’s been months, and whenever she interacts with it, she roars as if it were a dinosaur.

By extension, I believe, the stick is a simulacrum. We’ve travelled from the signified to the first-level signifier (puppet) to a second-level signifier (scraper) to a third-level signifier (stick). Absent the causal narrative, one would be hard-pressed to suss out why a child might be representing a stick with a roar. And now you know.

But wait. There’s more. I was so busy geeking out that I almost forgot the story that prompted this post in the first place. We were driving somewhere. She sits in a car seat that, by design, restricts her movement, and sometimes her playthings go out of reach, where ‘sometimes’ means ‘almost invariably’.

As we were heading wherever, I heard the cue, awhr. A quick glance in the mirror caught the dinosaur puppet. I reached back and handed it to her. Crisis averted. She played with the puppet for a bit, and that leads to a brief diversion from the narrative. Her roars have two noted contexts. The first is playing, awhr. She finds herself amused to be a dino ventriloquist, and she bursts out laughing if you acknowledge her playing. The second is serious business, awhr. Laughter is not the expected reaction. Anything less than feigned terror will get you the look. This is no play dino. This is a dino incarnate. But I digress.

A minute or so later, awhr. This roar is neither play nor serious. She wanted something else. A quick survey of available candidates, and I sussed out the cap stretcher cum teeth. Crisis averted. But only for a minute. After satisfactorily animating the stretcher, another roar. Another glance back. No good candidates. But there was a stick. What do I have to lose?

Awhr. Yep. This is what she had in mind. She animated the stick for another few minutes. And that brings this chapter to an end.

Moar

Though her vocabulary is so far quite constrained, she does have a few more words available. More was pretty early on her list, almost invariably accompanied by a gesture—thrusting an empty juice cup or just generically declaring that she wanted more of whatever it was that she’d just had. A nice general-purpose word, for sure.

Thank You

Thank you was another early entry. She pronounces it like the German, danke, but with a cut schwa sound at the end and perhaps more: / ˈtaŋk ə /. Schwa is already a short unstressed vowel, and I don’t know how to represent it shorter. In musical notation, I might have opted for a staccato-pianississimo combo in an attempt to capture the dynamics. Alas…

Hello, Goodbye

Other phatic and still enthusiastic utterances are hi and bye with attendant waving gestures.

Enfin

She does have words for dad and mum, respectively da and ma or maman, comme en français, and she employs nods and headshakes to communicate yes and no. And she uses sulking body posture to convey disappointment. Finally, she says ow to any number of things to indicate frustration.

I’m sure she’ll be adding many more words soon, and quickly at that.

Post-Postmodernism

I happened upon an article noting that the postmodern label is now 50-odd years old, so what’s next? Just a short response: the label never made sense, for several reasons.

First, the prefix post suggests a new era or paradigm. In and of itself, this is not a problem. The challenge is the root: modern.

Effectively, modern means now, the current era, in the same manner as today sits between yesterday and tomorrow. The problem is that we employ the term postmodern as if it were tomorrow, yet today. Of course, except in jest, tomorrow is never simultaneously today. The notion reminds me of the sentiment captured in the quip: asked ‘When will you do this task?’, one answers ‘I’ll do it tomorrow’. Queried the next day, ‘Why have you not yet done this task?’, the response is again ‘I’ll do it tomorrow’, ad infinitum.

I’ll caption this tomorrow

Modern derives from the Latin meaning ‘just now‘. People have been labelling themselves as modern since at least 1585 when it meant ‘of or pertaining to present or recent times‘. As early as 1500, it meant ‘now existing‘, so more toward ‘extant‘.

My point is that one might retroactively reference post-X in relation to X, but to name something duratively as post-X simply makes no sense. Add to this the complication, as Latour notes, that we’ve never been modern, or the further connotation that privileges the term’s adopter over others. Namely, whilst the West is modern at time-zero, being the height of modernity, some contemporaneous other does not qualify. The United States is modern—just not Appalachia and certainly not Bangladesh. In a temporal sense, premodern takes on a similar meaning, e.g. Aztec or Mayan civilisations.

Besides the unfortunate naming, ‘postmodern‘ attempts to envelop many schools of thought. As I’ve mentioned before, it is most typically used pejoratively.

Whilst I attempt to align myself with certain so-called postmodern figures, and I use the term myself because it still has some referential value, I do so with reservations and the understanding that it’s a nonsensical notion from the start. Perhaps I’ll suggest a new solution tomorrow.

Righteous Mind

Preamble

All too often, I’ll read or listen to a book and place bookmarks with the best of intentions to revisit and comment, yet I either never return or return unable to recall the context and unwilling to reread to regain it. I am going to attempt to document my reaction to Jonathan Haidt’s book, The Righteous Mind: Why Good People Are Divided by Politics and Religion. If you’ve read some posts here, you’ll understand that I am not a moralist, so I don’t expect to like the book or agree with it. I’ve already read the front matter, so I’ll return to comment on that before I get too far ahead. I have done this before at university, and it is decidedly slow progress and can chase one down rabbit holes—this one, anyway.

I have a habit of abandoning books in favour of others, including dropping them outright. This is one of 16 I have in progress at the moment, some commenced as many as five years ago. To be fair to myself, many of those books are substantially completed. I feel I got the intended message—or at least got what I wanted out of them—and I just haven’t read the final few chapters. In some cases, the book is an anthology, and I have been slogging my way through it. A few books I’ve read before and am reabsorbing, so I may decide not to re-read cover to cover. I just pulled one such second reading off the list to get from 17 to 16.

I have striven not to laugh at human actions, not to weep at them, not to hate them, but to understand them.

— Baruch Spinoza, Tractatus Politicus, 1676

Introduction

“Can we all get along?” — Rodney King

“Please, we can get along here. We all can get along. I mean, we’re all stuck here for a while. Let’s try to work it out.”

Born to be Righteous

I could have titled this book The Moral Mind to convey the sense that the human mind is designed to “do” morality, just as it’s designed to do language, sexuality, music, and many other things described in popular books reporting the latest scientific findings.

Emphasis mine

Straight away, I have a contention. The human mind is not designed to do anything. It has evolved and performs functions. Perhaps this is just a matter of semantics, but it puts me on guard. Moreover, that it does morality doesn’t evaluate the relative benefit or whether it should even be done. Without going down the aforementioned rabbit hole, language is a perfect example. We use language to communicate, but language as a social mechanism may be a secondary or tertiary function. As I’ve argued—even quite recently—this is a reason I feel that language is insufficient for conveying abstract concepts like, for example, morals and morality.

But I chose the title The Righteous Mind to convey the sense that human nature is not just intrinsically moral, it’s also intrinsically moralistic, critical, and judgmental.

A primary function of the brain is as a difference engine. This is what allows us to discern friend from foe, edible versus poison, and so on. Reflecting on Kahneman and Tversky, most (if not ostensibly all) of this is a heuristic System 1 process, which is good enough but only at a distance. Morals allow us to create in-group and out-group distinctions.

I want to show you that an obsession with righteousness (leading inevitably to self-righteousness) is the normal human condition. It is a feature of our evolutionary design, not a bug or error that crept into minds that would otherwise be objective and rational.

To my first point—not only his insistence on a design metaphor, but his doubling down and declaring it not a bug or an error—this is disconcerting. It may well be a normal human condition, but so is cancer. The appeal to nature isn’t winning me over.

Our righteous minds made it possible for human beings—but no other animals—to produce large cooperative groups, tribes, and nations without the glue of kinship.

Agreed.

What Lies Ahead

Part I is about the first principle: Intuitions come first, strategic reasoning second.

If you think that moral reasoning is something we do to figure out the truth, you’ll be constantly frustrated by how foolish, biased, and illogical people become when they disagree with you. But if you think about moral reasoning as a skill we humans evolved to further our social agendas—to justify our own actions and to defend the teams we belong to—then things will make a lot more sense.

Haidt and I are much aligned on these points.

Keep your eye on the intuitions, and don’t take people’s moral arguments at face value. They’re mostly post hoc constructions made up on the fly, crafted to advance one or more strategic objectives.

Not buying the ‘go with your intuitions‘ advice. Moving on.

…the mind is divided, like a rider on an elephant, and the rider’s job is to serve the elephant … I developed this metaphor in my last book, The Happiness Hypothesis.

I’m not sure I am going to like this dualism, and I haven’t read The Happiness Hypothesis, so I’ll just have to see where he takes it. It seems like Haidt is a hardcore Traditionalist.

Part II is about the second principle of moral psychology, which is that there’s more to morality than harm and fairness.

This feels about right.

The central metaphor of these four chapters is that the righteous mind is like a tongue with six taste receptors.

OK. Let’s see where this goes.

Part III is about the third principle: Morality binds and blinds.

I like this pair.

…human beings are 90 percent chimp and 10 percent bee.

Did he say bee? I agree with the chimp reference. Maybe this won’t be as bad as I thought.

(A note on terminology: In the United States, the word liberal refers to progressive or left-wing politics, and I will use the word in this sense. But in Europe and elsewhere, the word liberal is truer to its original meaning—valuing liberty above all else, including in economic activities. When Europeans use the word liberal, they often mean something more like the American term libertarian, which cannot be placed easily on the left-right spectrum. Readers from outside the United States may want to swap in the words progressive or left-wing whenever I say liberal.)

Decent advice.

Why do you see the speck in your neighbor’s eye, but do not notice the log in your own eye? … You hypocrite, first take the log out of your own eye, and then you will see clearly to take the speck out of your neighbor’s eye.

— MATTHEW 7:3–5

I do find myself, probably too often, parroting this passage.

PART I

Intuitions Come First, Strategic Reasoning Second

Central Metaphor: The mind is divided, like a rider on an elephant, and the rider’s job is to serve the elephant.

Where Does Morality Come From?

A family’s dog was killed by a car in front of their house. They had heard that dog meat was delicious, so they cut up the dog’s body and cooked it and ate it for dinner. Nobody saw them do this.

A man goes to the supermarket once a week and buys a chicken. But before cooking the chicken, he has sexual intercourse with it. Then he cooks it and eats it.

TBD

The Origin of Morality

Quick reaction for now. Details to follow…

I’m not quite buying into Haidt’s attempt to parse the nature versus nurture argument into three segments—nativism, empiricism, and rationalism—insomuch as rationalism is seen by many as ambiguous and not a mutually exclusive option. It feels as though he’s throwing up a rationalist strawman to take down. We’ll see where it leads.

Nativism
the theory that concepts, mental capacities, and mental structures are innate rather than acquired by learning.

Empiricism
the theory that all knowledge is derived from sense-experience.

Rationalism
the theory that reason rather than experience is the foundation of certainty in knowledge.

Let’s pick up on this later. I knew this would take a lot longer.

Revolutionary Reformer

A social connection posted a piece on Humberto Maturana’s idea of “aesthetic seduction”. I found it interesting, so I wanted to understand more. Performing a Google search, I landed on The Edge, where I found an interesting comment by Dan Dennett. I share it in its entirety.

Daniel C. Dennett

Philosopher; Austin B. Fletcher Professor of Philosophy, Co-Director, Center for Cognitive Studies, Tufts University; Author, From Bacteria to Bach and Back

Post hoc ergo propter hoc! “After this, therefore because of this.” Francisco Varela is a very smart man who, out of a certain generosity of spirit, thinks he gets his ideas from Buddhism. I’d like him to delete the references to Buddhist epistemology in his writings. His scientific work is very important, and so are the conclusions we can draw from the work. Buddhist thinking has nothing to do with it, and bringing it in only clouds the real issues.

There are striking parallels between Francisco’s “Emergent Mind” and my “Joycean Machines.” Francisco and I have a lot in common. In fact, I spent three months at CREA, in Paris, with him in 1990, and during that time I wrote much of Consciousness Explained. Yet though Francisco and I are friends and colleagues, I’m in one sense his worst enemy, because he’s a revolutionary and I’m a reformer. He has the standard problem of any revolutionary: the establishment is — must be — nonreformable. All its thinking has to be discarded, and everything has to start from scratch.

We’re talking about the same issues, but I want to hold on to a great deal of what’s gone before and Francisco wants to discard it. He strains at making the traditional ways of looking at things too wrong.

Dennett’s response is a critique of Francisco Varela, which is not the part that interests me. What caught my eye is his distinction between revolutionary and reformer. And it dawned on me—perhaps re-dawned might be a better verb; or say it illuminated, intensified, shone a light.

I consider myself to be introspective, and times like these allow me to be self-critical. I view myself as a revolutionary as far as expectations go. This makes me impatient, with little tolerance for the marginal changes attendant to reformism.

Being a revolutionary doesn’t make one a Utopian seeking perfection—a common critique. From my perspective, when things are so far off course or misaligned, incremental changes don’t seem to be enough.

Moreover, reform is a political misdirection tactic I am leery of, so, irrespective of core beliefs, I feel even a reformist should be wary of it. In politics, new ideas sometimes arise that are not in concert with the prevailing orthodoxy but are building mass. The goal is to retain the status quo as much as possible. The tactic is to find the smallest, least disruptive sliver and integrate it in a manner that diminishes the building mass whilst permitting a claim of concordance.

The first example that pops into my mind is the Affordable Care Act (AKA Obamacare) in the United States, which is not exactly affordable nor all that caring, though it is a reformist act. Even the main alternative of universal single-payer insurance wasn’t that revolutionary, making the dilution of the solution and the adopted approach all the more disappointing.

Other industrial and post-industrial countries have solved this problem, so it’s not revolutionary—unless one considers being over a hundred years late to the party particularly impressive. Moreover, there are programmes in the United States, i.e. Medicare, that are ostensibly single-payer. In fact, one approach suggested was to expand Medicare to include everyone. This was dubbed Medicare Part E.

What this exposes is that the Reform–Revolution debate is a sorites challenge. The reformers considered the Medicare Part E proposal radical or revolutionary, whilst I viewed it as a couple more millimetres beyond the original Obamacare promises. Since the status quo started from such a limited position, what they ended up with is a milquetoast implementation.

To me, the debate is about paradigm shift versus glacial change. As for me, when I regard the battle between the Democrats and Republicans in the United States, I am not satisfied with any solution that sees these parties still standing post-solution. As a revolutionary thinker, I don’t need to toss out the proverbial baby with the bathwater, but let’s lose the bathwater and at least the sieve of a tub. Of course, I argue against the direction the so-called Enlightenment has taken the Western world, an argument different to that made by prior traditionalists, so I can see a lot of room for change—revolutionary change. In implementing Enlightenment beliefs, they took the idea of revolution a bit more literally than was perhaps necessary, but since it was more about a power grab than some broader promise of freedom, I suppose it was necessary. Meet the new boss, same as the old boss.

“Equality of rights under the law shall not be denied or abridged by the United States or by any state on account of sex.”

The United States doesn’t constitutionally protect women. This is where reformism gets you. Per Wikipedia,

The Equal Rights Amendment (ERA) is a proposed amendment to the United States Constitution designed to guarantee equal legal rights for all American citizens regardless of sex. Proponents assert it would end legal distinctions between men and women in matters of divorce, property, employment, and other matters. The first version of an ERA was written by Alice Paul and Crystal Eastman and introduced in Congress in December 1923.

Wikipedia — Equal Rights Amendment

If you read 1923 and wonder if that’s a typo, it’s not. It’s been almost 100 years and women still have no guarantee of equal rights. Women had only been granted voting rights with the ratification of the 19th Amendment to the Constitution on August 18, 1920.

For a country founded on the principle that all people are created equal, this feels like it should be a redundant act…

My bad, the US Declaration of Independence reads “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness”. This is just men. And when this was written, BIPOC did not fully qualify as men. Reformists have almost got that sorted out by now, right? 1776 seems like almost yesterday. Change comes slowly.

By now, I’m rambling semi-coherently, so I’ll close this down. Keep in mind the foundation of your interlocutor. Is s/he a reformist or a revolutionary? Determine where on the scale s/he falls. You might save yourself a lot of time. Time Is on My Side is only a song, not a recipe for living.

Ten Women

My last post had me reflecting on some women who’ve influenced me—especially my thinking and worldview. Unlike Mitt Romney, I don’t have binders full of women.

Positive

Simone de Beauvoir (Philosopher)

Beauvoir is brilliant. I consider her to be the first feminist; the women before her, I rather consider to be proto-feminists. I was immediately gripped when I read her book, The Second Sex, and her idea that women are not born that way; nor is it even a simple ageing-maturity function. It’s a performative role.

Addendum: Though perhaps not as famous, she’s at least as apt an Existentialist philosopher as Sartre. I still have a fondness for Existentialism in the same way I like Pragmatism: although life has no inherent meaning, if making one up gets you through your life, by all means, conjure up a meaning.

Judith Butler (Philosopher, Gender Theorist)

Butler taught me the perspective of gender expression and performativity. Whilst Beauvoir described gender more as a role, Butler extends the notion to that of performative speech acts: to adopt the identity is to declare, I am a woman.

Ruth Shore (English Professor)

This was my undergraduate Critical Writing professor. All but one of the assignments were by female authors, and the only male author appeared in an assignment to critique, compare, and contrast articles by Gloria Steinem and Thorstein Veblen. I don’t remember Steinem’s article (though I recall coming down hard on her for citing sources that did not tie back to her position). Veblen’s work was Conspicuous Consumption, an essay from The Theory of the Leisure Class: An Economic Study of Institutions, which is worth the read even today.

To be completely honest, I’m not sure I’ve remembered her name correctly. I believe that’s it, but I’m not great with names. As evidence: I once turned in a final essay for one of my undergraduate English Literature courses. The professor’s trip was American fiction authors and interpreting literature by understanding the life, time, and place of the author—clearly not into Barthes or Derrida. In this case, on the cover page, I had typed the course name and Professor David Grace (or some such). It was returned by post a week or so later with two remarks. The first: ‘I’ll miss your sardonic humour‘; the second: My name is whatever it was [sorry again], not David Grace. I’m pretty sure David Grace was one of my maths professors, but don’t hold me to that. As far as I know, not-David doesn’t identify as a woman, but I do recall spending time on Edgar Allan Poe (also not a woman) and Donald Barthelme (still not a woman), the Postmodern Absurdist who, as fate would have it, died a couple of months after I graduated.

Hannah Arendt (Philosopher)

Arendt gave me the concepts of the banality of evil and totalitarianism. My first exposure was through Eichmann in Jerusalem, where she discusses the banality of evil, and through which my postmodern roots became more ossified. Sadly, her The Origins of Totalitarianism has been too relevant for comfort these past few decades in the West.

Sunera Thobani (Sociologist, Feminist)

Thobani’s post-colonial feminism is a newer influence, but she really drives home the point that the Western perspective is privileged and intervention in other cultures to ‘save the women’ from oppression is imposing the privileged perspective in a colonial manner.

Elinor Ostrom (Economist)

I was inspired by her work showcasing that coöperation prevails over tired competitive models. She was also the first woman to receive the Nobel Memorial Prize in Economics, in 2009, a few years before her death.

Margaret Atwood (Author)

Before The Handmaid’s Tale was a Hulu series, it was a book. When I read this genre-establishing speculative fiction in the 1980s, I took notice. That it remains relevant is cause for trepidation.

Ursula K. Le Guin (Author)

Le Guin’s story, The Ones Who Walk Away from Omelas, truly unveils the Utilitarian, Consequentialist narrative of the greater good. It is speculative fiction in the vein of Margaret Atwood, though Le Guin’s usual domain is science fiction. As often as I’ve tried, science fiction narratives don’t typically resonate with me, so I haven’t engaged her longer works. But the impact this story had, and continues to have, on me should suffice.

Herculine Adélaïde Barbin (Intersex Case Study)

I believe I was introduced to Alexina through Michel Foucault. Her case helps shine a spotlight on how arbitrary identity really is. Perhaps not capricious, but definitely arbitrary. A few years back, I even created a short video about her that, as I recall, commenced my YouTube channel.

Negative

I had originally intended to make this a post on the positive influences of women, but as I was searching my memories, a couple of reprehensible influences came to mind. Thatcher topped that list.

Margaret Thatcher (Politician)

In the US, Liberals and we Leftists demonise Ronald Reagan as the beginning of the end of cordial US politics. Whilst this is true to a point, Thatcher predates Reagan’s destructive national policies by a couple of years. If not for the path she paved, we might never have taken it. Granted, the Clintons made sure to drive nails into the bipartisan coffin to seal the pact, and perhaps Thatcher and Reagan were just symptoms, not causes, in the manner that Johnson, Trump, and Biden are more expressions and conduits than catalysts…or at least generators.

In any case, she’s left a lot of destruction in her wake.

Ayn Rand (‘Philosopher’)

Rand is another woman I love to hate. Her so-called Objectivism has given permission to so many looking for an excuse or justification for their assholery.

To be fair, when I read Atlas Shrugged as an impressionable youth, I was taken in. I decided to read it after having heard her speak in an interview. I fell into her storyline without critical examination. I even tried to adopt it as a frame or lens. And then I read The Fountainhead, which added nothing.

Fast-forward to the late ’90s: I was reevaluating my vantage and perspectives, so I decided to re-engage Atlas Shrugged as an audiobook. It was embarrassingly bad writing with one-dimensional characters, which is appropriate because one-dimensional people adopt this worldview. Apologies for the ad hominem, but I include myself in this cohort. I hope I’ve actually come to evolve additional dimensions rather than simply swap them, but Rand is a hack writer who has helped make the world a more toxic place to live. Not a fan.

Honourable Mention

Sophie Germain (Mathematician)

I named my second daughter Sophie Germain Surname, hoping it would be aspirational. Although it wasn’t, she is still proud to point out her legacy.

Harriet Tubman (Abolitionist)

Rosa Parks (Civil Rights Activist)

Jane Austen (Author)

Tori Amos (Musician)

PJ Harvey (Musician)

Sarah McLachlan (Musician)

Fiona Apple (Musician)

Kate Bush (Musician)

Laurie Anderson (Musician, Performance Artist)

More Women, But No

I could laundry-list a bunch of women I am aware of, but I can’t really claim they influenced me in a way I can grasp, so I won’t bother.

What are women?

I stumbled on Lily Alexandre’s What Are Women vid on YouTube. And despite already being in the midst of a dozen other things, I decided to watch it. Well, I’d been up all night and was super tired, so after ten minutes I listened in bed until the end. After a few minutes, I felt compelled to respond on her channel. And then I was awake, so I figured I’d comment here as well—even though 2 or 3 of the dozen things I’ve got going on are draft posts here.

Lily presented her points well, and save for a few nits, I agreed fully. Getting the nits out of the way: I feel she took some shortcuts by (admittedly) overgeneralising the historical record of European gender history and anarcho-Communist hunter-gatherer or hunter-horticultural roots. I don’t disagree with the story point, but it’s a disservice to play the same game as the promoters of the primary narratives. Just say something along the lines that there is more about the historical record that we don’t know than we do, but there is evidence of X, Y, and Z. I recommended David Graeber and David Wengrow’s The Dawn of Everything: A New History of Humanity. Moving on.

I recommend listening to her piece directly, as I am going to editorialise rather than fully recount it. Where she ended up is where I want to start. Adopting a Foucauldian perspective, the definition of woman is only important to those who want to employ it to control women, to gain power over them. Any definition of woman is going to exclude some who identify as women and include some who don’t.

A quick aside: When I was in my young twenties, I loathed being called sir, the polite title. It wasn’t the maleness that this suggested; rather, I didn’t identify with the maturity aspect it conveyed. Whilst I identified as male, I identified as neither a boy nor a man. Sir tried to impose this on me. At least when someone attempted to label me a gentleman, I could retort that I wasn’t wearing a top hat and tails. Gentlemen, I viewed as Rich Uncle Milburn Pennybags, AKA the Monopoly Man—monocle and all. Did Mr Monopoly wear a monocle, or was that Mister Peanut? No matter.

As anyone who’s read a few of my posts knows, I don’t really buy into the whole notion of identity. I’m not much of a fan of ranks and titles either, in case you wanted to know.

As I was listening, Lily got to where woman is defined in three words: adult human female. In my head, I was already arguing against it, like when watching a horror-suspense movie: Don’t go in there! Alas, Lily then shot it down as well. Each of these words is arbitrary. Admittedly, all words are arbitrary by definition, but these words have their own challenges.

Adult

In turn, adulthood is defined differently depending on time and cultural place. Nowadays, in the West, 18 is probably the arbitrary cutoff most used. This is the age of majority as far as entering into legal contracts is concerned—though people can’t drink alcohol or buy cigarettes until they are 21. And the brain continues to develop past 30. It may actually never stop, though it does shrink after 45, so there’s that. We could opt for a less legalistic litmus in favour of a naturalistic approach. As she points out, we could argue this happens at the onset of menses—but that’s a slippery slope on several counts. Firstly, some females are precocious and might commence their cycle as early as 12 or 10 or even 8. We’re going to need to return to this litmus for the definition of female, so let’s continue.

Human

As she points out, human is ill-defined, and we’ve got a history of dehumanising people. Don’t get me started on Negroes and Indigenous Americans. This allows legal systems to simply rescind one’s human card. That’s no woman; she’s an animal—blah, blah.

Female

And we arrive at female—the synonym we’ve so far managed to kick down the kerb. Lily didn’t spend too much time here, but this is attempting to tee up a cis defence—a genetic double-X defence. We’ve already touched on the arbitrary categorisation. The intent here is to exclude. This is Beauvoir’s otherness, Derrida’s subordinate pair to the dominant male term. But we’re not discussing intent at the moment. Let’s regard the definition:

Female / ˈfiːmeɪl / noun

  1. a person bearing two X chromosomes in the cell nuclei and normally having a vagina, a uterus and ovaries, and developing at puberty a relatively rounded body and enlarged breasts, and retaining a beardless face; a girl or woman.
  2. an organism of the sex or sexual phase that normally produces egg cells.

Here, we see the double-X defence, but what about XXY and so on?

We get stuck in a circular logic loop at some point because the definition of female concedes that it is synonymous to girl or woman. A woman is a female who is a woman who is a female who is a woman who is a female who is a woman who is a female who is a woman who is a female who is a…

Normally having a vagina, a uterus and ovaries may not intentionally be trying to exclude transgender females. Rather, some XX females may have a genetic anomaly, and, more probably, some women have had their uterus and/or ovaries removed for medical reasons.

In closing

Words have use, but if the intent of object words is to do more than describe, beware an agenda. As for gender words, I have no use for them. As for sex terms, I don’t really have a use for them either. Detouring to Saussure for a moment, we’ve got female, the signifier noun, and the signified.

Parental Advisory

There is one and only one situation where I have any concern about the genital manifest, and that’s when I am performing some sex act—we’re talking Crying Game here. I’ll even admit that this is my own shortcoming, but I live with it. Your mileage may vary. Other than this extremely limited scope* of events, it really doesn’t matter.

Anyhoo, this impromptu post has run its course. Watch the vid yourself, and tell me or Lily or both of us what you feel—perhaps even what you think.

* Limited scope of events: Come on now. Don’t be judgy. It’s not that limited.

Trustwise

The lamb spends all its time worrying about the wolf and ends up being eaten by the shepherd.

— Unknown

I think one could look at this from several perspectives or through different lenses.

We worry about the wrong things.

At some level, this is about trust.

We trust the wrong people. Those whom we most entrust do us in. But I feel this is contextual.

One might feel this shepherd is Capitalism or the State or organised religion. Perhaps it’s culture or identity cohorts. Or all of these or none of these.

On another level, it recalls the inevitability of death. This shepherd reaper is always waiting in the wings whether or not one worries.

In the words of RATM, Know Your Enemy.