u n d e r v e r s e

Wednesday, September 17, 2014

"Bam, Bam, Bam."

I’m going to go out on a limb and say that it is self-evident to Sam Harris and Richard Dawkins that they are feminists. This would explain how Harris could write off the response to his statement that "angry atheism" is less attractive to women because it lacks a “nurturing estrogen vibe” as mere “pointless controversy.” Pointless, of course, because it is known to Sam Harris that he is a non-bigot. This is a truth in his heart. And why would you not accept the contents of the heart of a good man at faith—excuse me, I mean face—value?

Likewise, Richard Dawkins finds it patently absurd that his pal Michael Shermer would be a serial sexual predator, because after all, he’s a good guy, and “What decent person is NOT a feminist”?

From this vantage point—the presumption of one’s own obvious decency—it is easy to see why Harris and Dawkins and so many others like them (and like myself from time to time and probably you too, dear reader) are so quick to interpret any dissent as vituperative. To imagine themselves as the victim of “thought police” and “witch hunts.” In his own defense, Harris goes to great lengths to demonstrate his love and respect of women*, as though that should be enough to settle the matter for all to see that what is in his heart is good. And yet he cannot, apparently, recognize the extremely elemental problem with his “estrogen vibe” analysis: that it permits no role for structural sexism. The “I guess women are just different from men” argument is inherently victim-blaming.

Dawkins has been famously victim-blaming women since the “Dear Muslima” affair, and has recently taken to subtweeting former comrade PZ Myers for calling him out on his rather spotty command of feminist principles. For his part, Myers is hardly known for his fondness for kid gloves—most recently he wrote that Dawkins had been devoured by “brain parasites.” But mockery and name-calling are just the medicine that the New Atheists have insisted from the outset would rid the world of delusion and superstition. Unfortunately it would seem that to resist this medicine one only need inoculate oneself with the image of one’s own reasonable good-heartedness. Armed with such a self-image, no one need engage anyone else’s ideas or experiences with anything more generous than condescension and self-righteous pity.

***

The global all-time winner of the victim-blaming gladiatorial games has got to be Jerry Coyne, who wrote in 2011 that Anne Frank would not have perished in the Holocaust if only she were not Jewish. But here's another victim-blamer from today's headlines. Sean Hannity, in a segment defending Adrian Peterson's whipping of his young son, said on his TV show yesterday, "I got hit with a strap. Bam, bam, bam. And I have never been to a shrink. I will tell you that I deserved it." A heartbreaking statement.

It is a remarkable coincidence that those who have no time for what other people find afflicting so often must downplay--if not outright defend--their own childhood abuse.  Before I read that Hannity quote today, I wrote on Twitter that it was getting harder and harder to tell Dawkins and Hannity apart, twinned as they seem to be in their incapacity for self-reflection.

And indeed, Dawkins has written publicly of his own abuse multiple times. The most widely-known remark made the rounds a year ago when he told an interviewer that he thought being groped by his boarding school teacher did him "no lasting harm." Maybe it didn't. But such a denial is the exact form we would expect a defense mechanism to take in a child that age. Overgeneralizing just a bit, children are too emotionally invested in the rectitude of adult authority figures to attribute serious evil to them. When seriously wronged, they tend to either downplay the harm done them, or justify it through their own wrongdoing. The alternative--that the adults who take care of them might be cruel, capricious, or incapable--is too much for the young psyche to take on.

Here's a Dawkins quote that got a little less attention at the time, tucked away as it was in the pages of his comment forum:
I remember spending a lot of time at my [kindergarten] trying to call down supernatural forces to protect me from bullies. I had a distinct mental image of a large black cloud with a human face, which would swoop down out of the sky and deal with the bully.
It's facile to say that children who are bullied grow up to become bullies themselves, but that they often do is, again, no coincidence. I see the seeds of so much of Dawkins' disdain and contempt for religion in this memory, as though the only way he can bear the disappointment and terror of not being saved from bullies by "supernatural forces" is to make sure that everyone else who believes in the supernatural is never, ever allowed to forget how foolish they are. Just as Sean Hannity can only bear the memory of having been violently beaten by the man who should have been his protector by spewing venomous contempt for the weak and ill-favored, night after night, to an audience of millions.

When we ally ourselves with that kind of hatred and dismissal, when we take ritual glee in the stupidity and cravenness of others, we invariably declare war on parts of ourselves, forcing these parts to live in darkness, without a champion, finding expression only through violence. Sean Hannity was caught in his own trap last night -- "as a woodcock to his own springe." Maybe the strange reenactment of his own child-beating will break the spell for some viewers. Maybe too the poisoned-sword combatants of the atheosphere will be prompted to piece together that the mockery and demonization directed at many of them now by their erstwhile deacons and elders were just as shallow, myopic, and reactionary when they were directed at the common enemy of religious adherents. Maybe maybe.

[*Update: As Amanda Marcotte observes, all the women he cites as examples are in subordinate or subservient roles.]

Thursday, August 21, 2014

On Where The Time Goes

Let's face it, I don't blog here much, and when I do, I don't write the kinds of things I used to write.

The basic reason is that in 5 years I had run out the dilettante leash as far as it would go. I needed to either metamorphose into a full-fledged thoughtful, informed commentator, or get out of the game. Anyone who has followed me here over the last few years knows I chose the latter, even taking the extra step of archiving a lot of my earlier posts for being simply too deficient in circumspection, erudition, and reasoning to be of much value to anyone.

I don't think I'm done with blogging or essaying for good, but lately other projects have won the campaign for my time. Most notably, this strange musical project I've hitched my wagon to: Theater Oobleck's Baudelaire In A Box, for which I've written and performed something around 30 songs by now, from my own translation of poems from Charles Baudelaire's The Flowers of Evil.

I had thought there wouldn't be much crossover between the audience for that project and my audience here, but I was shown the wrongness of that yesterday when I came upon this wonderful review of the Baudelaire project at Ordinary Times. It was written by "Chris" --no last name -- who I'm going to presume is the erstwhile writer at Mixing Memory. Chris, if I knew how to get in touch, I'd thank you more directly.

The rest of you, if you want to keep up with what I'm up to, I'm pretty good about keeping my Twitter feed active. I'll still post here when I can.

Tuesday, July 29, 2014

On Ethnic Cleansing In Gaza (A Pipe Dream)

In my dream, someone smart writes "A Real and True Historie of Human Pathology," a genealogy of fuckedupness right down to each woman, man, and child in recorded history. Nate Silver and all the other data geeks make charts out of it, tracing patterns across time, geography, political history, fashion, and pop culture, and it is laid bare that all hurt and hate and fear and resulting dehumanization is a defense against a trauma whose origins are long forgotten.

Little pocket epiphanies begin to emerge among those minds minimally prepared by acts of ordinary kindness and small sacrifice, and when they do, they make that popcorn-popper lip-smacking sound from the Seinfeld theme. Seinfeld himself is awakened to the true nature of suffering, which is such an extraordinary event that all the network and cable news desks devote the full evening broadcast to covering it.

Even people whose minds are clouded by hate and greed and self-righteousness are able to recognize that something important is happening, even if they can't understand it. They try to pass it off as a distraction ginned up by whichever Outgroup they most despise, but are not themselves as fully convinced by the maneuver as they are accustomed, and an edgy malaise begins to take hold in their glands and fascia.

Some very angry people are made so uncomfortable that they commit acts of violence against members of the hated Outgroup, or just innocent passers-by, but there has been just enough of a shift in consciousness among law enforcement, the media, and the chattering classes that these outbursts are instantly seen as part of a long human chain of inhospitality reaching back millennia. This recognition deepens and spreads the epiphany, until more and more it becomes the standard frame for understanding cruelty, revenge, and demagoguery.

Social scientists resign their posts in droves to atone for having contributed to such an erroneous public understanding of human behavior. Hamas and the Israeli Knesset both dissolve themselves without ceremony and call for new elections. Security walls are torn down, but there is such an overwhelming sense of courtesy and respect for the other's sovereignty and space that each population has to practically beg the other to join them in cafes, homes, hospitals, wedding banquets and funerals.

New peace talks are immediately opened, and just as immediately put on hiatus due to the nearly constant bouts of weeping over the other's misfortunes. It is decided that the negotiations will be replaced in the short term by a period of mourning, with weekly grief ceremonies spontaneously taking over all the official government buildings throughout Israel, the West Bank, and Gaza.

The notion spreads. Soon, Greeks and Turks are weeping, Shia and Sunni. Even the Russians are weeping. Even the Russians.

Then I wake up.

Wednesday, December 18, 2013

There Are No Santa Deniers In Foxholes

[My piece for the War on Christmas edition of Write Club ("The Yulening"), at The Hideout, December 17th, 2013. The bout was Santa vs. Jesus.]

Dear Margaret,

Tomorrow I ship out for my 5th tour of duty in the War on Christmas. I’ll be stationed somewhere near a place called Altoona, Pennsylvania. I doubt I could even find it on a map. Ha ha.

I’m being assigned to a creche removal and neutralization unit, and we’re being told that we shouldn’t expect to see too much heavy action. Which is good, because my PTSD definitely isn’t getting any better. It takes just a single sleigh bell on a car commercial for me to break into a cold sweat, as my hand instinctively reaches for my service revolver. They said this war would be a cakewalk, that every grinch and scrooge would come out of the woodwork and greet us with chocolate and flowers. Instead we got candy canes and boughs of holly, and some of the toughest fighting any of us have ever seen.

On my last tour, one of the guys in my unit was telling me that there used to be just twelve days of Christmas. Just Twelve Days! You put your wreath on the door on Christmas Eve, and you took it down on something called “Epiphany.” Then life went back to normal, I guess. They called it Christmastide, and they had a big feast on each day. At first I found all this really comforting. I loved that the word for the Christmas season was “tide” — it made me think of listening to the surf coming in and out at that cottage I used to rent on Nasketucket Bay. I’ll tell you, that’s a nice memory to have when you are pinned down in a damp foxhole for days on end. And I got to thinking how time really is like a tide, how one event flows into another, like day turns into night, and how there’s really a time for everything. And that made me realize that the idea of having a War on Christmas was a horrible mistake, that it was basically like having a war against time itself, and that Christmas was really just like an infection that would go away on its own. And that idea was really comforting.

But one thing war gives you is a lot of time alone with your thoughts, and that can be a dangerous thing. It didn’t take long for me to remember that we don’t have twelve days of Christmas anymore, we have—well, I can’t even count them. It used to be that Thanksgiving was a bulwark against Christmas’s terrible insatiability, but now with all the big box stores opening at midnight on Thanksgiving eve, it’s like nothing can stand in Christmas’s way anymore. Instead of Christmas-tide it’s like we have a Christmas tsunami. And that just scares the hell out of me.

This same Sergeant in my unit who told me about the twelve days of Christmas also told me that in England in the 17th century, the Puritans had their own war against Christmas. Cromwell even succeeded in having Christmas criminalized in 1647, which is far more than we’ve been able to get Congress to do. When the Royalists took power again in 1660, Christmas was restored, but the sense that all that merrymaking was too uncouth for true Christianity never really went away. And by now you had Puritans fleeing to America by the boatload.

By the early 19th century, Christmas had just run out of steam, at least according to my Sergeant. His theory was that the industrial revolution made twelve days of feasting impractical. "Dark Satanic Mills have no patience for the liturgical calendar," he told me one chilly October morning, as we warmed ourselves by a bonfire of plastic reindeer we had just seized from a group of singing children. Three days later he was killed by an improvised explosive device made out of discarded tree ornaments. They found a little wooden Tyrolean elf wearing lederhosen lodged in his medulla oblongata. Death was instantaneous.

I think the point Sarge was trying to make was that Christmastide was traditionally just a big two-week party, with drinking and feasting, the Lord of Misrule, and all that. Once the logic of industrial capitalism took all that away, there just wasn’t enough substance in the Nativity story to pick up the slack. I mean, think about it, once you get past the virgin birth business, there’s just not that much to talk about.

At the same time, you have this mythologized folk version of Saint Nicholas floating around the periphery of the culture in Dutch New York. It’s right after the American Revolution, and people are desperately searching for a cultural heritage that is not British. Introduce St. Nick in the mass media at just the right time, and you have the perfect vehicle to transform Christmas from a rowdy Bacchanal to a wholesome, pastoral children’s holiday. And that’s just what happened. You have these propaganda pieces that start showing up—Washington Irving giving St. Nicholas a major role in the “Knickerbocker’s History of New York,” and Clement Clarke Moore, who was this slave-holding real estate baron from Chelsea, adding the reindeer in “A Visit From Saint Nicholas” — that’s the one that starts “Twas The Night Before Christmas.” And all of a sudden, Santa is off to the races.

When I look at everything that Santa has been able to accomplish that Jesus never could—it’s like when Lincoln replaced General McClellan with Ulysses S. Grant. Just, game over. Santa Claus is scalable in a way that Jesus never could be. Jesus’s big weakness is that he’s just too sacred to be commodified. Like McClellan, Jesus never really changed his tactics in 2,000 years. Get born, lie down in a manger, get visited by the magi. Santa is constantly changing his tactics. He starts with just stockings, then, over the next few decades he adds the reindeer, the chimney, the elves, the List. The List! In all of military history, no one who has kept a list has ever lost a war.

Well, it’s getting late, Margaret, and I should probably wrap this up. I’ve got a long journey to Altoona ahead of me in the morning. I hope I haven’t darkened your spirits too much. Sarge could be kind of a crackpot, frankly, and I guess we should take what he said with a grain of salt. All I know is that we’ve lost a lot of good men in this war, and Christmas just keeps getting bigger.

Sunday, December 08, 2013

The Gene Supernatural

The neo-Darwinian old guard has come out hard against David Dobbs' (admittedly inflammatory) article in Aeon, "Die, Selfish Gene, Die," which summarizes some recent and not-so-recent objections to gene-centric "modern synthesis" evolutionary theory. Evolutionary biologist Jerry Coyne devoted two posts to demonstrating how "muddled" Dobbs' piece was, and Richard "Selfish Gene" Dawkins himself responded on his site that nothing in Dobbs' article contradicted the theory that he laid out in his landmark book The Selfish Gene. Steven Pinker went so far as to use the opportunity to characterize all science journalists as "congenitally" sensationalist.

But PZ Myers, erstwhile ally of the aforementioned gentlemen and scholars in their struggle against theism, has come out with two posts at Pharyngula strongly defending Dobbs' basic argument that the gene-centric "modern synthesis" is no longer fully supported by molecular biology.

Five-odd years ago, I was engaged with Myers in a brief but bitter squabble, after I dubbed his "Courtier's Reply" argument the "Lout's Complaint." He replied that I was "clueless" and that he was "proud to be a hooligan." I got a flurry of comments from his readers calling me soft-headed, and then it was over.

When it comes to religion, Myers is still a hooligan. (Sorry, Paul!). But I was extremely pleased to see him strenuously and heterodoxically critique the gene-determinism that has erupted among some of his celebrated colleagues.

Dawkins' "selfish gene" is a beautifully elegant theory. It is near-impossible to argue with the logic of its central premise. To read it is to be utterly convinced that nothing but the gene could ever possibly be the unit of selection, and it is extremely valuable in helping to overthrow popular misconceptions of natural selection, such as that "traits survive for the good of the species."

The problem, as Myers shows in relentless detail in his post, is that genes don't operate in anywhere near the idealized fashion that Dawkins describes. I would propose that this is because in nature, "genes" don't actually exist. We can get a sense of this by observing the way that defenders of gene-centric theory alternate between incongruous definitions whenever their theory comes under attack. In Chapter 3 of The Selfish Gene, Dawkins famously defines the gene as any portion of chromosomal material that persists long enough to serve as a unit of heredity. A little bit later he expands on this when he tells us not to worry that a complicated trait (like the mimesis pattern on a butterfly) seems too complex to be controlled by a single gene: we can just redefine the gene as whatever cluster of DNA is responsible for the pattern. (He later proclaims the triumph of his tautology: "What I have done now is define the gene in such a way that I cannot help being right.")

Then, rather astonishingly, he goes on to say that his concept of the gene is not definitive in an absolute yes-or-no way, like an electron or an elephant or a comet, but relative, like size or age. A gene may be more or less gene-y, compared to other genes. Dawkins is very explicit that his mission here is to rescue Mendelian genetics by expanding the concept of the particulate unit of heredity to whatever scale it needs to be for the theory to work.

Perhaps ironically, this is the same slipperiness that gives fits to anti-theists whenever the topic of God's causality comes up. What is the nature of God? Whatever it needs to be to explain the perceivable world around us. What is a (Dawkinsian) gene? Whatever it needs to be to explain the transmission of a corresponding phenotypic trait. Little surprise then, when critics poke holes in the theory, drawing on recent (and not so recent) findings in molecular biology, that Dawkins is able to reply, "Why, my definition of the gene can account for that too!"

Among molecular biologists, the gene was for many years typically defined as the portion of the genetic code (also called a cistron) that carries instructions for the manufacture of an individual enzyme. In Crick's phrase: "DNA makes RNA, RNA makes protein, and protein makes us." The mechanics here are much easier to observe than in the Dawkinsian usage, but even here the definition is not as clear as it would seem. DNA sequences often need a lot of "editing" before they are converted into RNA sequences, and there is in fact no one-to-one correspondence between cistrons and proteins. Some proteins get built from an RNA sequence that has no equivalent in the DNA. This presents some pretty tough challenges for any theory that proposes that heredity is strictly a "genetic" phenomenon, unless we are prepared to count as "genes" any number of factors that are not stored in DNA, and whose manner of hereditary transmission, if it exists at all, is unknown. (Note how Jerry Coyne--who Dawkins calls his "go-to guru on population genetics"--bases his entire rebuttal on the notion that regulatory factors "must" reside in the DNA, which seems to indicate that theoretical population genetics has become seriously unmoored from molecular biology.)

There's much more to the story: epigenetics, evo-devo, genetic assimilation, and genetic redundancy, much of which you can read about by clicking on the Pharyngula links above. The point I want to make is that we can go one further than David Dobbs and the scientists whose work he summarizes. It's not just that "Selfish Gene" biology is overly gene-centric and deterministic. It's that the central metaphor of that paradigm is based on a spook. The "gene" is barely even a coherent concept, let alone a natural entity that could have causative powers. For a century it has convoluted the way we think about morphology and heredity. If we were feeling especially uncharitable, we might even be tempted to call it ... a Delusion.

Sunday, August 11, 2013

On Ears Of Tin

Speaking of quasi-racists, Richard Dawkins' new career as Twitter troll is progressing brilliantly. Brandon Watson at Siris says pretty much everything I would be tempted to say about Dawkins' latest moustache-twirling, but I want to add a couple of additional comments.

As with the Hedy Weiss flap in my prior post, so much depends on the notion of race. After his original tweet on the topic of Muslim Nobel Prizes, Dawkins defended himself against charges of racism by observing, correctly, that "Muslims are not a race." This is not a particularly satisfying response. Jews are likewise not a race, but this fact did not prevent numerous historical attempts to banish or exterminate them on grounds of "racial" purity. But the fallacy goes much deeper than this. Not only does racism not require a "race" to operate upon, but in fact it is required to operate in the absence of one, given that there is no such thing as race. Race is an outdated, pseudoscientific 19th-century concept with about as much scientific validity as the four humors (probably less).

Someone must have raised this point with Dawkins after he tweeted about it, because in an FAQ-style collection of "calm reflections," he admits that race is a "controversial" topic, and then, in response to the objection that race is a sociological, not a biological phenomenon, begs to differ:
I have a right to choose to interpret “race” (and hence “racism”) according to the dictionary definition: “A limited group of people descended from a common ancestor”. 
Of course this just further obscures the fact that no such group exists on earth.  Shall we pause here to recall that Dawkins is regarded as one of the world's most prominent biologists? He continues...
Sociologists are entitled to redefine words in technical senses that they find useful, but they are not entitled to impose their new definitions on those of us who prefer common or dictionary usage. (my emphasis)
I don't know how to read that sentence except as a defense of folk lexicography over actual scholarship: "Yes, I realize that there is broad consensus in both the humanities and the biological sciences that race is a cultural, not biological phenomenon, but look here in the dictionary!"

The word "racism" remains useful to us, despite the non-existence of a biological substrate on which to rest it, because all the substitutes available to us seem too watered down: bigoted, ethnocentric, prejudiced--all sins, to be sure; but only "racism" conjures the requisite degree of wild, animal hatred. Without going too deep in the weeds, we can safely redefine racism as tribalism, with all the paranoid fantasy that attends to it: those people, the ones who are colored differently than us, who dress differently, talk differently, who keep to themselves, those people are not to be trusted.

So, while it's true that Muslims are not a "race," neither are "blacks," neither are Amerindians, Rom, Arabs, Hispanics, or Asians. (Neither are Caucasians, or "Aryans.") Where does that leave us? We can still, in all those cases, and so many more, project complex traits onto these socially-defined groups based solely on adherence to the tribe: stupid, lazy, thieving, murderous, warlike, fanatical. And indeed, Dawkins does veer closely to this kind of characterization when talking about Muslims (though he's nowhere near as bad as his comrade Sam Harris, who has literally stated "you just can't reason with these people.")

Look, for example, at the original tweet on Muslim Nobel laureates:
All the world's Muslims have fewer Nobel Prizes than Trinity College, Cambridge. They did great things in the Middle Ages, though.
In case the context weren't clear enough, Dawkins elaborates in his calm reflections:
I certainly didn’t, and don’t, imply any innate inferiority of intellect in those people who happen to follow the Muslim religion. But I did intend to raise in people’s minds the question of whether the religion itself is inimical to scientific education.
What jumps out right away is that the hypothesis is instantly self-refuting. Until around the 13th century these very same Muslims led the world in scientific exploration. There are many competing theories for why this embrace of reason did not persist, but it's clear that the only way we could blame the religion itself for the decline would be to infer that 11th century Islam was significantly more enlightened than the variant practiced today. Any takers?

So the "question of whether the religion itself is inimical to scientific education is," at best, staggeringly ignorant. At worst, it's race-baiting. I can't peer into Dawkins' heart to say where he falls on that continuum, but there are no commendable options.

Brandon writes that Dawkins' bafflement (at how anyone could characterize his discourse as less than perfectly reasonable) seems perfectly genuine. This is not incidental to the question of how damaging his remarks may be. Like the old patriarch at the dinner table who doesn't know, or can't accept, that it's no longer acceptable to call women "skirts," and opens old wounds with each utterance despite his insistence he "didn't mean anything by it," at a certain point innocence becomes a mask for a lack of empathy, which all-too-predictably slips into a narcissistic martyrdom. "You're the real racist!" (Yes, he said this.)

And is it really just a "tin ear" that adds insult to injury by following his calm reflections with a tweet musing on why Jews are so disproportionately represented by the Nobel Foundation, then linking to Steven Pinker's 2005 address to the YIVO Institute for Jewish Research on studies claiming that Ashkenazi Jews have a higher IQ than other groups? Whether the studies have merit or not, they can only be relevant at all by undermining Dawkins' earlier insistence that "Jews are not a race" (at least when it comes to Ashkenazim), but no apparent cognitive dissonance ensues. (And surely it is just an unconscious slip when he invokes the image of a global cabal in a follow-up tweet that says "I want to know their secret in case we can copy it.")


Thursday, August 08, 2013

While we're being honest

Chicago theater critic Hedy Weiss has responded to critiques of her support of racial profiling in her review of Silk Road Rising's Invasion. The short version: "Hey, I was just being honest." (Read the whole thing at Jim Romenesko's blog.)

My reply is below.


Ms Weiss,

You are quite correct to point out that we bring the world with us every time we enter a theater. That fact makes it all the more incumbent upon those of us with a wide and public readership to be perspicacious when musing upon that world, making sure it is indeed the whole world and not just the provincial byways of our well-worn comfort zones. It is not enough to be "honest," if this honesty entails no more than a venting of one's unexamined biases, especially when one has been called upon to justify one's remarks by those hurt by them. And that is what is at stake here: the pain of those who wonder why you would call it a "necessity" that they be treated with suspicion because they share a name, a skin tone, or a mode of dress with a very small number of deranged religious fanatics.

You state this "necessity" as a bald fact, after it has been pointed out to you that profiling is not in fact an effective mode of law enforcement--was, in fact, not effective in the prevention of the Boston Marathon bombing. After it has been pointed out that the State Department alert you cite is a travel advisory for Americans traveling abroad, not for attacks on US soil, making the question of ethnic profiling entirely irrelevant. After it has been pointed out, should you need reminding, that countless innocent people have suffered from racial profiling in the last decade, many of them gravely. You had an opportunity to collect your thoughts and add some fine shading to your earlier remarks. Perhaps we misunderstood you? Perhaps you had some little-known facts supporting your controversial position? Perhaps you had developed a novel and nuanced moral argument that showed your critics to be overly hasty?

Alas, no. What you gave us instead was a restatement of your original remarks in all their crudity, without the barest attempt at justification, and without a glimmer of possibility that you had any real sense of why Arabs, Muslims, and South Asians (and, as long as we're being honest, just about anyone with a smattering of human empathy) would be distressed by your support of such a barbarous, unjust and counterproductive practice. Your clarification here brings to mind letters to the editor in small town papers, insisting that everybody knows that the races shouldn't intermingle, or that girls who dress provocatively get what they are asking for, invariably adding "Hey, I'm just being honest," as if that provided some kind of defense against bigotry or misogyny.

I suspect that's not the effect you are shooting for, and so I urge you to, please, the next time you feel a wave of honesty coming on, consider whether your "visceral reactions" might include opinions best left to your more private social interactions, until you are truly prepared to have the elevated conversations warranted by their tendency to inflammation and harm.


[Cross-posted in comments at Romenesko's.]

Saturday, March 30, 2013

Finish

[Originally written for the September 18th 2012 edition of Write Club, where I soliloquized on “Finish” against Ian Belknap’s “Start.” Mine was the moral victory. In any case, a fitting post for Easter.]

When I was a child of 11 or 12 I was given, by my parents, the soundtrack of Jesus Christ Superstar for Christmas. Even though I had already decided by this young age that there was no god or heaven, I was still obsessed by one particular section, very near the end: Jesus is moaning on the cross, his senses bewildered by all sorts of buzzes and cackles and demonic chanting, until finally he says “It is finished. Father, into your hands, I commend my spirit.” And there the track abruptly ended, buzzes and cackles and all. In the sudden silence it felt a little as though the whole world had ended. I was fascinated and terrified by the magical finality of this ending. He said those words, and then ceased to be. I would lie awake at night convinced that if I too were to utter those same words, then I too would cease to be. My non-existent soul would be claimed by this non-existent Father, just as non-existent Jesus’s was. I was even a little afraid I might say the words by accident. In hindsight, it was probably a bit of wish fulfillment, as most fears are.

When we talk about being finished, we’re talking about being dead. Or not being dead, rather—you can’t actually be dead; to be dead is to not be. There is no aspect or quality of “being” called deadness. You can’t exist in a deadish fashion, deadily. Our grammar just breaks down if we try. We can’t even say that “so and so died.” Dying isn’t something that you can do, because it’s the end of “you.” By the time you get to the end of the sentence the subject is already gone. You start out with an Abbott and Costello routine—Who died? And you end up babbling like Vinny Barbarino: What… Where … I’m so confused!

We are compelled by language to think of death as just some new state of extreme inactivity. I’ll sleep when I’m dead, we say, when our death doesn’t actually seem so close. I will miss you, we say when it does. We just can’t get it through our dumb dying heads that there will be no I or we to sleep or miss or even to not sleep and not miss. We will be finished, except no we won’t because if you are, if you are being, then you are not finished. It’s called grammar. Just go ahead and try to argue with it.

Meanwhile, for every single thing except for us—except for you and me and everyone else that couldn’t be here today—we have this law of conservation of matter and energy, so that nothing ever ceases to be, it just turns into something else. For every single thing except for us, nothing is ever finished. The story of every single thing except for us can never end; there’s always something more to say, some story within a story. For a while it looked like the universe was ultimately headed toward a state of entropy or heat-death, but now we have these “multiverses”, an infinite number of worlds—each with its own conditions of suspended disbelief. And that makes even our heat death universe just a little bit more suspenseful. Because maybe we’re actually in the universe where everything crawls to a complete standstill for eons and eons, and then one day a bunch of balloons and crepe streamers fall from the sky celebrating our one trillionth millennium of the perfectly distributed stasis of all matter and energy. I mean you just can’t know.

Speaking of crepe streamers, I am reminded that roughly around the same time I was lying awake making sure not to accidentally Commend My Spirit, I was attending an elementary school whose students carried on a tradition of flying crepe streamers out the window of the bus on the last day of school. Not the last day of school—there’s still such a thing as school, school still exists—but the last day of the school year. In all my life I have probably never participated so fully in a ritual as I did flying those streamers out the window of that school bus. To this day crepe streamers have a mystical quality to me, archetypal and primordial in their perfect, tightly coiled state of origin. The dry rustle as we unfurled them out the window to catch the mild June afternoon breeze seemed to be an involuntary gasp of anticipatory joy, and for the entire bus ride home the streamers’ fluttering, like Tibetan prayer flags, seemed to liberate us from time and everyday reality. The whole bus, and all of us in it, had become a benevolent dragon from some Madeleine L’Engle book. School was over and what lay before us was the eternal forever of summer.

And then summer ended and school returned, and there were a few rituals for that too, new school clothes, new school books, new sharpened pencils for writing on clean white new school pages, we were starting over, being reborn, but not entirely convincingly. It nagged a little that what we had so triumphantly put behind us three months earlier was back, unvanquished after all, undead, like that Jesus guy. School, like that Jesus guy, had something more to say, and it was good news only in the way that Brussels sprouts were good, or good manners were good, which is to say very, very bad.

Only we can ever really end, only you and me and everyone else that couldn’t be here today. Everything else just goes on, or turns into something else, is reminded of some new important thing to do or be or say, and it’s a fearsome thing, to be finished with something that’s not finished with you. This is what maybe was so memorable about that scene from Jesus Christ Superstar—the dude just winked out like a light. And it was really finished, except no it wasn’t! he came back, and did more stuff, and according to John of Patmos at least, he’s going to do even more stuff later. And we have to keep hearing about it. It is so manifestly not finished…

At this point Scheherezade lapsed into silence. Her sister Dunyazade said to her, “What an unusual and entertaining story, sister. If you are not too sleepy, will you tell us what became of this strange, unsatisfied man and his oratory contest?” “With the greatest pleasure,” said Scheherezade. “But this story is nothing compared to the one I will next relate: The tale of the three wise judges…”

Adelaide's Fork

[I read this piece at the Ray's Tap Reading Series on March 16, 2013. The evening's theme was "Manners, Please."]

It is said that at meals the Holy Roman Empress Adelaide would hold aloft her fork midway between palate and plate. It was the custom in 10th century Europe for those dining with the King and Queen to stop eating when their Sacred Imperial Majesties set down their utensils. Adelaide, who had the appetite of a sparrow, knew her guests would starve if she indicated she was done eating prematurely, so she would pantomime in this manner, her fork extended in the air, as though posing for a painting.


Adelaide died in the year 999, and thus was not able to attend--or strike a pose at--Judy Chicago's 1979 installation piece, The Dinner Party, now on permanent exhibit at the Brooklyn Museum, a piece featuring place settings for 39 women from history and folklore: Sappho, Judith, Ishtar, Emily Dickinson, Margaret Sanger, and 34 others. Adelaide did, however, somehow connive to have her name inscribed on a porcelain tile on the floor on which the banquet tables rest, the “Heritage Floor,” bearing the names of 999 women in all, also drawn from history and folklore, and chosen to contextualize and support the 39 guests of honor.

Adelaide was canonized by Pope Urban II in 1097, but before she was Saint Adelaide, or even Empress Adelaide, she was just plain old "Addie from the Burg," Adelaide of Burgundy, named for the Kingdom where she was born―and, not coincidentally, of which her Dad, Rudolph II, was King. Judy Chicago was born Judith Sylvia Cohen in 1939. It was the custom at that time for children to take the family name of their fathers. In 1959, she took the name Judy Gerowitz, upon marrying her first husband in California, where it was the custom for women to take the family name of their husbands. When she remarried in 1965, instead of taking, this time, the name of her new husband, she took, in defiance of custom, the name of her city of origin.




It is the custom in the city of Chicago in 2013 not to deceive those who have entrusted us with the privilege of delivering our orations unto them. Great was the gnashing of teeth and rending of raiments when performance artist Mike Daisey tried to use "poetic license" to justify bending the facts for rhetorical purposes in his one-man show The Agony and the Ecstasy of Steve Jobs. (The title is a play on Irving Stone's 1961 biographical novel of the life of Michelangelo.) The popular radio program This American Pledge Drive devoted an entire hour to atoning for the social transgression of excerpting the show, crooked facts and all. So I would like to pause here to acknowledge that it was not actually a fork that Adelaide extended in the air, but a knife. Though the table fork was used in the 10th century by nobles in Persia and the Middle East, it was not customary in Northern Europe until as late as the 18th century. To eat using anything but one's fingers in the time of Adelaide was just in bad taste. I said fork, earlier, because it fits our social sense of what the facts should be better than a knife would have. But I want you to know, it was not a fork. It was a knife.

*** 

The custom of naming children according to the paternal family name is a troublesome one, and--second wave feminism notwithstanding--it would be no less troublesome if the convention were exchanged for its matrilineal equivalent, since in either case the family name of one of the parents must be effaced from history. We speak of names and bloodlines as though they were the same, but blood and words do not follow the same rules. Biology and culture do not follow the same rules. Some might say this is why we have culture in the first place―to liberate us from the shackles of biology, of the blood, that realm wherein nothing can be named, but only experienced, in a pulsing crush of now-ness. Each parent gives each child half of its genetic confabulation, but this does us little good when it comes to saying who each of us is. Sure, we're all children of the mitochondrial Eve, but try writing that in your next artist's bio. Try telling that to the person behind the counter when you renew your driver's license.

It is the custom today in certain parts of Brooklyn, New York, to try to circumvent this problem by naming one's child after both parents. So Sally Smith and Tom Jones have a baby girl, whom they name Eliza Smith-Jones. But what happens when Eliza Smith-Jones grows up and marries Ebenezer White-Brown? According to custom, their child will take the name Smith-Jones-White-Brown, and when little Jedediah Smith-Jones-White-Brown grows up and marries Bryce Miller-Rodriguez-Anderson-Sanchez, well, you can see where this goes. This is a custom that, to quote my good friend, Michael of Brooklyn, just doesn't scale.

We can easily reconcile the conflict of blood and bloodline, of culture and nature, as Judy Chicago did, by simply abandoning the whole concept of genealogy, a concept that may have outlived its social usefulness. After all, who really cares who David Bowie's parents were? Or Amiri Baraka's? Or Marilyn Monroe's, or Madeline Kahn's, or Louis CK's? However, my purpose here is not to solve your problems but rather to exacerbate them. I seek to raise blisters. In the realm of ideas--which is to say in the realm of customs, of manners--we accomplish very little if we are not prepared to exaggerate. A name cannot convey everything that we are. It conveys one thing only, with terrifying reduction. If we're feeling clever, we can employ a portmanteau, whose secret code points in two directions, like Texarkana, or spork, or sexcapade. But only two directions. Sometimes three, as with flounder, which is a collision of flounce and blunder, with an echo of founder for good measure. But even here, pointing in three directions, we are talking about a number so many fewer than infinity. In fact, if my math is right, three is infinity fewer than infinity.

You see the pickle we're in, the predicament, the predicklement. We cannot stop the world without names. We cannot transcend the muteness of the ever-flowing now until we implant something in it with just a little staying power, lodging it into the endometrium of the eternal ever-flowing now. And from this implantation grows everything we have ever known and everything we will ever know: language, law, economics, ethics, art, religion, theater, ads on buses, book jacket blurbs, facebook memes, reading series, mustaches... which is to say all the different kinds of manners our species can devise. But just how real can any of it be? Compared to the infinitely ever-flowing now, just how profound, how true, can something as one-dimensional as an identity, as a name, ever hope to be?

 *** 


Bronzino's famous painting “Venus, Cupid, Folly, and Time” is notable for, among other things, the use of the figura serpentinata, that twisted and extended presentation of the human form into a spiral pose. Long before cubism, Bronzino and the other mannerist painters used this and other exaggerated techniques to show that which could not be represented with the techniques of classical naturalism. Torsos with both breasts and buttocks simultaneously vectored toward the viewer. Background figures with no fealty to classical perspective or unity of light and shadow. The mannerist painters knew that the word “grotesque” has its roots in the Greek word kryptē, the hidden; the concealed. In order to display what was real and true, they had to portray those things which would never emerge in the natural, phenomenal world, not even given a thousand eternities. Things that may only come to be when they are contorted, extended, stretched, embroidered, masked, mocked.

That story I told of Adelaide, with first her fork, then her knife, I really don't know what utensil she used, don't know if that bit about having to stop eating when the Queen stopped is true, don't know if when she held her knife or fork aloft in the air, she also twisted her torso so her breasts and buttocks were pointing in the same direction, don't know, don't much care. I do care, just a little, that someone just told you a story about it, and that someone happened to be me, whoever that is.

Monday, December 31, 2012

Constitution: A Ghost Story

[Written for Lucky Pierre's America/n, a "13-hour Election Day discussion/performance of the Constitution" at Chicago's Defibrillator Gallery, November 6th, 2012. All of the presentations from the event were later published in book form by Half Letter Press. ]


So many of the basic concepts associated with our history were presented to us at such a young age that it can be very difficult for us to see them afresh. For example: Who were the authors of the Constitution and the Declaration of Independence? Some fellows called the “Founding Fathers,” we reflexively utter. To the extent we give it any thought at all, most of us take this term to indicate those men who founded, built, or established, a new nation, conceived in liberty, and so on and so forth.
But I’m afraid we have fallen victim here to a bit of folk etymology. Sometimes the obvious definition is not the correct one. For instance, our word “to buttonhole” is a misrendering of “button-hold,” a little loop that holds down a button on a garment. And in the context of pinning someone down with your scintillating conversation, it makes much more sense this way, despite the etymological corruption. So too in the case of these men we call “Founders”: Washington, Jefferson, Hamilton, Franklin, Sam Adams, John Jay, Roger Sherman, Patrick Henry—the whole lot were actually foundlings, abandoned by their mothers and left to die of exposure, only to be rescued and raised by wolves.
Happily, the proper recovery of this term can give us important new insights to help understand this most essential of foundational documents, from which so much of our national philosophy, psychology, and jurisprudence springs forth.
***
One of the main functions of a constitution is to locate sovereignty. We’ve deposed the Prince, the traditional residence of sovereign power, necessitating a new home for it. In searching for this home, the first question we may ask is whether our sovereignty is  unitary or federal. Unitary sovereignty is centralized; federal sovereignty is distributed among states or provinces. (This can be a little confusing to those of us who paid attention in history class, because the original Federalists—people like Alexander Hamilton and James Madison--actually opposed the “federalist” model, which they felt was inadequate to the job of effective governance. The first American government, promulgated under the Articles of Confederation, was too decentralized, they argued, while the Anti-Federalists—in other words those who supported the federalist model—argued that placing too much power in a centralized unitary government would only lead to a resurgence in the kind of tyrannical oppression the Revolution had just thrown off. Monarchy again, in all but name.)
Where, then, is our sovereignty located?  Is it in “The People,” in the several states, in the Federal government, in the foundling document itself?
***
The typical wolf litter is around 5 to 6 pups. A female wolf has around 8 to 12 breasts. It is a rare thing in nature for a wolf litter to exceed the number of its mother’s breasts. But it is also a rare thing in nature for a group of human infant boys to simultaneously be abandoned by their mothers and left to die of exposure, only to be rescued and suckled by wolves until they are strong enough to fend for themselves. We owe our origin as a nation to a singular historical event, precipitated by Oracular pronouncements that these infant boys would cause great upheaval (as they did). We don’t know exactly how many She-Wolves there were on hand to suckle the Foundling Fathers; that has been lost to history. We do know that at a certain point, the feeding of the Foundling Fathers was supplemented by woodpeckers and other birds. Nothing in the historical record indicates that any of the Foundling Fathers were lost to malnutrition or starvation. But it seems fair enough to surmise that—at least at first—the Foundlings experienced a great deal of anxiety over the impression that there were just not enough nipples to go around—a pathology universally glossed over in the many myths and fairy tales of Foundling heroes.
We can see the remnants of this anxiety reflected in the debate, in the pages of the Federalist Papers, and later at the Constitutional Convention itself, over whether or not to enumerate a Bill of Rights. Hamilton felt that the presence of enumerated rights would imply that any unenumerated rights would be presumed not to apply, which would lead to Tyranny. Anti-Federalists, in turn, argued that enumerating no rights whatsoever would guarantee Tyranny from the start. In both cases it is important, for our present purpose, to mentally substitute for the word rights, the word nipples; and for the word tyranny, a Deprivation of Nipples.
This is our founding document. We should know the minds of the men who wrote it, what their concerns, preoccupations, and even obsessions were. What we discover is that they were so fixated on whether or not there were going to be enough nipples that they never really got around to solving the problem of where sovereignty resided in our system of governance.
Orthodox historians will tell you that the lack of a clear solution owes to a stalemate between the opposing philosophical views of the Federalists and Anti-Federalists, but this view overlooks what all the Foundling Fathers had in common—that they were foundlings! It is much more parsimonious to suggest that they were spending so much energy on nipple anxiety that there wasn’t enough left over to creatively solve the problem of where sovereign power lies. So they fudged it, as in Amendment 10:

The powers not delegated to the United States by the Constitution, nor prohibited by it to the States, are reserved to the States respectively, or to the people.

In other words, some sovereignty is delegated to the States, or possibly to the citizens of those states (to whatever extent these constitute a separate political entity), except that more powers may be granted back to the central government at any time by constitutional amendment, which may be proposed by either Congress or a majority of states. Are we clear?

At the heart of The Constitution and Bill of Rights is this paradox: If Sovereignty resides in the People (“popular sovereignty”), then what do we even need government for? Isn’t the need for structured legislative, executive, and judicial branches all the evidence you would require for the absence of sovereign power? A mob is not sovereign, nor is any random collection of people in a subway car. On the other hand, if Sovereignty resides in the Government, then what do we need Democracy for? Why should the “will of the people” be any more germane to our welfare than it was to the divine emperors of China or the Holy Roman Empire?
A corollary paradox: if the People who reside in the Several States are the same People who reside in the United States—as they must be—and Sovereignty resides in the People, then how can the states and the Federal government be at cross purposes? After all, each is the political expression of the same sovereignty. Why even have a debate over Federalism at all, if we are taking any of this seriously? Furthermore, if there really is such a thing as “The People;” if we are, as the Preamble says, a unitary “We the People” and not a collection of many peoples or persons, then again I ask what the point of Democracy is. One People, One Vote? 
In that one first phrase in the Preamble, “We the People,” are so many confusions sown. Being united is, you will notice, something that we can’t stop talking about. We’re obsessed with submerging our selfhood into a greater whole, like reverse mitosis. To be united, after all, is more than to merely be allied, or in league, or in solidarity; it is to be fused, like two neighboring vertebrae that have insufficient cartilage between them to continue to function independently. Good fences make good neighbors. “United We Stand, Divided We Fall” is the motto of the codependent family, terrified above all that one member will stand up for herself in a healthy way, disclose the family secrets on Wikileaks, expose the damage, call for accountability. “We must all hang together,” said Benjamin Franklin, “or we will assuredly hang separately.” Well, speak for yourself, Ben.  
This is just the kind of confused pathology one would expect to emerge out of the trauma of being abandoned and left to die, then being suckled by an indeterminate number of she-wolves, and fed by woodpeckers who never seemed to come around often enough, but it never seems to be the right time to bring this up, even now, 233 years after the fact. There’s always some emergency, and if it’s not being abandoned and left to die it’s being taxed without representation by the Imperial British monarchy, or being attacked by Indians, or Spaniards, or the Kaiser, or the bomb-throwers in Haymarket Square, or, Somebody just blew up the USS Maine, or launched Sputnik, or embargoed our oil, or, Violent extremists have taken over the Civil Rights movement, they want to steal your car radio and screw your daughter, or, Somebody just tried to detonate his underwear. There is always some kind of urgent crisis.
And so we eternally fail to confront the fact that we are living in an incomplete, unidimensional political landscape. It all sounds good until you get outside the bubble and start to realize how much doesn’t add up, how much is missing. We got the One For All part, but we left out the All For One part. We got “From each according to his abilities” but we left out “To each according to his needs.” And this makes it inordinately difficult to see actual hardship, privation, or injustice when it resides in an individual citizen or household. We can’t see the trees for the forest; the persons for The People.
***
Foundlings need to survive, and to keep from going crazy they often need to make up elaborate fantasies. But once these fantasies serve their purpose, they tend to just get in the way. At a certain point, these fantasies become useless fictions. Ghosts. 
One of the ways you can tell that the Constitution is a ghost-filled place is that the Supreme Court is always trying to have séances with it. It’s common practice when the Court convenes for Justice Scalia to actually drag out a Ouija board and try to contact the Spirit of the Original Intent of the Words of the Constitution. Scalia, like all originalists, believes that the We The People Ghost of 1789 is real, and treads the earth in chains, like poor Jacob Marley. (Little-known fact: Before every session of the Supreme Court, Scalia makes sure to have an undigested bit of beef, a blot of mustard, a crumb of cheese, and a fragment of underdone potato for dinner the night before.)
As easy as it is to expose this position to the mockery it deserves, let’s not forget that the opposite interpretation, the “Living Constitution” of the loose constructionists, is just as spooky and supernatural. In 1920, Oliver Wendell Holmes wrote that the words of the Constitution
have called into life a being the development of which could not have been foreseen completely by the most gifted of its begetters. It was enough for them to realize or to hope that they had created an organism… The treaty in question does not contravene any prohibitory words to be found in the Constitution. The only question is whether it is forbidden by some invisible radiation from the general terms of the Tenth Amendment. (my emphasis)

We have, on the one hand, the originalists communing with the One True Spirit who knows the Letter of the Law, and, on the other hand, the loose constructionists Kabbalistically poring over the letter of the law in hopes of raising the Spirit that resides there. We are, in each case, spooked, haunted by our Constitution, forgetting it is an artifact of our own imagination, forgetting it was written under extreme duress, bordering on madness. Like those letters we would write to our friends, fresh out of college, right after we got dumped by the love of our life, and we were heading to Wyoming to become fire watchers. There was some good stuff in those letters, some good, wise, courageous stuff that holds up even today. It’s a good thing we saved them! But--we forget at our peril--we, the writers of those letters, were bonkers. Just like those Foundling Fathers. We should humbly and sincerely thank them for what they have given us. But we should also consider that the custom of revering a political philosophy created by men raised by she-wolves and fed by woodpeckers may be due for gentle revision. We, the Parented, the Well-Fed, the Nurtured, the Sane, the Confident, the Hopeful, the Unhaunted.


Wednesday, May 30, 2012

On Empire

[Written for Write Club, a monthly reading series in Chicago that pits writers against each other cage wrestling style. In this bout I presented for "Empire," against "Revolution."]

Good evening, Ladies and Gentlemen. I am honored to welcome you to this convocation, and gratified that so many of you were able to make the long journey. As most of you arrived here tonight by means of secret underground tunnels, you may not be aware of the increasingly dire situation outside these very doors: a throng of humanity numbering in the thousands bearing torches and pitchforks. They await the outcome of these august proceedings.

The question of the hour, as testified by every broadsheet headline, every drawing room conversation, every sermon in every pulpit: should we shut down Write Club?

That is the resolution that stands before us. You all have your ballots. Now let me begin by saying that of all the many charges levied against Write Club: that it is uncouth, that it is lewd, that it is corrupting of morals, that it curdles milk, that it causes genital warts, that the Overlord is implicated in the illegal trade of rhinoceros horn--against all these charges I resolutely defend Write Club. But there remains one accusation that we must take seriously here tonight. That is the charge that Write Club is an instrument of Empire.

Before I move on to the formal charges, an aside, to that faction among you who are hoping for me to address the charge that Write Club is an instrument not of Empire but of Revolution: Let me dispense with your anxiety by assuring you that the two are in fact one and the same, in that both have aims that are total. Revolution is merely Empire dressed in rags. You can dispel this problem from your minds and be troubled by it no longer.

OK then, Exhibit A: Hegemonic expansion. 

Chicago, Atlanta, Athens, San Francisco, Los Angeles. The Overlord might be inclined to characterize these as “chapters” of a “consortium” or “federation” of Write Clubs. He may call them what he will. When the first of these chapters elects to experiment with an 8-minute bout, or when they instruct their audiences to tell six to nine friends about Write Club, well, we will be eyeing the Overlord's reaction carefully.

Exhibit B: The Loving Cup of Deathless Fucking Glory. 

The phrase is from Walter Scott's poem: “Soldier, wake ― thy harvest, fame/Thy study, conquest; war, thy game.”

War thy game.

That brings us to:

Exhibit C: Violence.

Day versus Night. Country versus City. Land versus Sea. Head versus Heart. Life versus Death. Man versus Machine. Pride versus Prejudice.

Philip K. Dick wrote that “Empire is the codification of derangement; it is insane and imposes its insanity on us by violence, since its nature is a violent one.” Let's look at one recent Write Club bout, staged this very evening at Chicago's Hideout Inn, less than 100 miles from our present location: Lock versus Key. Now in nature, you might observe that Lock and Key exist in a state of harmoniousness or complementarity. Keys exist that they may lock and unlock―without them locks are eternally fixed, functionless, ossified. And locks exist to consummate keys. A key without a lock that fits it is no key at all; it's just more idle detritus to clutter up some dish of mismatched buttons and old subway tokens on your bureau. To pit lock and key in combat against one another can only result in one of two equally futile outcomes: a world of lonely, petrified locks, or a world of lonely, useless keys. Which shall we have? It hardly matters.

Empire imposes its insanity upon us by violence. It is the essence of Empire to look around itself, observe everything that is other, and be filled with the relentless desire to replace that other with itself. And what it cannot replace with itself, it induces into combat by proxy― The Gladiatorial games. Bread and Circuses. It is momentarily cathartic, this discharge of tension between matched pairs, between foes, so-called “opposites.” But when it is over, the fallen are fallen forever, never to be re-animated. Among the corpses in Empire's long trail of dead, how many languages, how many species, how many songs, dances, visions, philosophies, how many men, women, children. It is discourse―conversation―that leads us to truth, but these corpses will never again speak.

Our way seems clear, then. By our love of truth, and dialogue, our love of multiplicity and diversity, we must oppose Write Club. And yet, this paradox. The very act of opposing an institution of opposition―of combat―constitutes a tacit endorsement. As Dick wrote, “whoever defeats a segment of the Empire becomes the Empire; it proliferates like a virus, imposing its form on its enemies. Thereby it becomes its enemies. To fight the Empire is to be infected by its derangement.”

And so, my fellow members of the secret Illuminati, Freemasons, Rosicrucians, Knights Templar―we would seem to be at an impasse. What can be the stance toward a rank evil which can be neither countenanced nor opposed? As a secret society, we have always been known by our deeds, not our words. Tonight we must do the same, taking our cue from the infinite Godhead itself, which permitted the creation of the cosmos only when it contracted its infinitude, allowing finite actuality to condense out of infinite potentiality. Only by withdrawing, making space for what is Other, can the world come into being. It is the only meaningful anti-Imperial act, to make space for what is Other. And in that spirit, I contract my remarks here short of my allotted time.




Friday, May 04, 2012

Susanne Langer III

Because the prime purpose of language is discourse, the conceptual framework that has developed under its influence is known as "discursive reason." Usually, when one speaks of "reason" at all, one tacitly assumes its discursive pattern. But in a broader sense any appreciation of form, any awareness of patterns in experience is "reason"; and discourse with all its refinements (e.g. mathematical symbolism, which is an extension of language) is only one possible pattern.
From Feeling and Form (1953)

Wednesday, May 02, 2012

Susanne Langer II

Physics did not begin with a clear concept of "matter"--that question is still changing rapidly with the advance of knowledge--but with the working notions of space, time, and mass, in terms of which the observed facts of the material world could be formulated. What we need for a science of mind is not so much a definitive concept of mind, as a conceptual frame in which to lodge our observations of mental phenomena.
From Mind: An Essay on Human Feeling, Volume One (1967)

Susanne Langer I

Any natural mechanisms we credit for the functions of life and try to trace from their simplest manifestations in a culture of Neurospora to human brains conceiving poetry, must be great enough to account for the whole spectrum of vital phenomena, i.e. for our genius as well as for the mold on our bread. Theories that make poetry "merely" an animal reaction, favored by "natural selection" as a somewhat complex way of getting a living, really prove, above all else, that our basic philosophical concepts are inadequate to the problems of life and mind in nature.
From Mind: An Essay on Human Feeling, Volume One (1967)

Saturday, April 28, 2012

Anti-intellectualism is Easy

Cosmologist Lawrence Krauss has "apologized" for some "off the cuff" statements he made to an interviewer for The Atlantic, disparaging the role of philosophy as it relates to physics and other sciences. Statements such as the following:
Philosophy used to be a field that had content, but then "natural philosophy" became physics, and physics has only continued to make inroads. Every time there's a leap in physics, it encroaches on these areas that philosophers have carefully sequestered away to themselves, and so then you have this natural resentment on the part of philosophers. 
 And:
Philosophy is a field that, unfortunately, reminds me of that old Woody Allen joke, "those that can't do, teach, and those that can't teach, teach gym."  
Pressed by the interviewer to defend such sweeping statements, which would seem to indict some pretty big names in analytic philosophy, Krauss dodges, now suggesting that many of those philosophers who did make important contributions weren't really doing philosophy. Wittgenstein, for example:
Formal logic is mathematics, and there are philosophers like Wittgenstein that are very mathematical, but what they're really doing is mathematics. (my emphasis)
 (This is false, by the way: math is a subset of formal logic, not the other way around. And while Wittgenstein may have a tendency to be very methodical in his analysis, in a way that could analogize to being mathematical, his primary occupation was with the way we use language. This is in no way "doing mathematics.")

And Russell:
Bertrand Russell was a mathematician. I mean, he was a philosopher too and he was interested in the philosophical foundations of mathematics... (my emphasis)
Some of Krauss's pals, whose careers fall directly in the lineage of analytic philosophy established by Russell and Wittgenstein, including a certain Daniel Dennett, apparently took exception to the implication that they were really just glorified mathematicians, inducing a bit of a walkback by Krauss on the Scientific American website yesterday, where he took pains to stress that the statements I have cited above (not to mention his tendency to repeatedly append the modifier "moronic" before mentions of philosophers) were not intended as a "blanket condemnation of philosophy as a discipline."

His defense of philosophy's value, however, is quickly dispensed with in a couple of short paragraphs, and Krauss devotes the rest of his considerable verbiage to defending the idea that philosophy of physics (his field) is best left to physicists, a defense which consists entirely of the argument that Krauss is only interested in the ideas he is interested in.

That's his right, of course, but the reason this particular cloud of cyber dust has been stirred up is that Krauss has also claimed to have answered, in his recent book A Universe From Nothing, the old and intractable philosophical problem "why is there something rather than nothing?" -- a question which dates back to Leibniz, but is also implied by the classic metaphysical distinctions between "being" and "becoming" dating back to Plato.

The question itself serves as a kind of shibboleth between curious and incurious minds. We cannot, on the one hand, inquire into the reasons for things (as the physical sciences do) and simultaneously cordon off some of those possible reasons as boring or moribund because physics cannot explicate them. Once we have decided that it is interesting to ask why certain things are the way they are, and use causal reasoning to provide answers, we are stuck with problems of infinite regress, which come with the territory. There is no logically intrinsic reason why some questions admit of answers (why is the universe expanding? why can't matter exceed the speed of light? what happened in the first 3 seconds of cosmic time?) and some do not (why does the universe have the properties it has, and not other properties? why does it have any properties at all?).

Sean Carroll, for example, writes in a 2007 blog post that the correct answer to the question why is there something instead of nothing is "Why not?" Can we possibly imagine him blithely presenting this answer to the question why are there galaxies, or why are the nuclei of atoms bound together?

Krauss's incuriosity is worse, though, because he explicitly claims (using Richard Dawkins as a proxy, in an afterword to A Universe From Nothing) that "Even the last remaining trump card of the theologian, 'Why is there something rather than nothing?' shrivels up before your eyes as you read these pages." That is, he claims (or at least endorses Dawkins' claim) to be answering not merely a scientific question of how the first instances of matter could have arisen from quantum fields, but the ontological question of why the universe is inherently structured to allow these fields. Why these fields, and not others? --or no fields at all?

When it is brought to Krauss' attention that he has addressed the former, but not the latter, he firmly denies he ever set out to do anything more than this. From the Atlantic interview:
I don't really give a damn about what "nothing" means to philosophers; I care about the "nothing" of reality. And if the "nothing" of reality is full of stuff, then I'll go with that.
So Leibniz' question stands, then? Wherefore then Dawkins' talk of trump cards? This is the point physicist and philosopher of science David Albert made in the New York Times (earning him the characterization of "moronic" from Krauss):
Relativistic-quantum-field-theoretical vacuum states — no less than giraffes or refrigerators or solar systems — are particular arrangements of elementary physical stuff. The true relativistic-quantum-field-­theoretical equivalent to there not being any physical stuff at all isn’t this or that particular arrangement of the fields — what it is (obviously, and ineluctably, and on the contrary) is the simple absence of the fields! The fact that some arrangements of fields happen to correspond to the existence of particles and some don’t is not a whit more mysterious than the fact that some of the possible arrangements of my fingers happen to correspond to the existence of a fist and some don’t. And the fact that particles can pop in and out of existence, over time, as those fields rearrange themselves, is not a whit more mysterious than the fact that fists can pop in and out of existence, over time, as my fingers rearrange themselves.
Interestingly, after dismissing any explanations that don't invoke empirical fact, Krauss concludes his apologia by offering just that, an angels-on-the-heads-of-pins type rationalization made of pure speculation:
If all possibilities—all universes with all laws—can arise dynamically, and if anything that is not forbidden must arise, then this implies that both nothing and something must both exist, and we will of necessity find ourselves amidst something.  A universe like ours is, in this context, guaranteed to arise dynamically, and we are here because we could not ask the question if our universe weren’t here.   
This is a muddle, logically, but the least we can say of it is that it begs the question, posed by Albert, of why "all universes with all laws can arise dynamically." It just gets worse from here:
If “something” is a physical quantity, to be determined by experiment, then so is ‘nothing’. 
I look forward to these experiments with great interest.

Again, there's no reason at all for Krauss to be interested in anything other than what he is interested in. His field of cosmology is rich with opportunities for him to remain very engaged for several lifetimes without ever venturing into other areas. It is his going out of his way to paint other people's concerns as "moronic," "sterile," "impotent," "useless," and "just noise," while demonstrating serious difficulties in even summarizing those concerns (Wittgenstein is "doing mathematics"), and while engaging in questionable ontology himself ("nothing" is a "physical quantity to be determined by experiment"), that makes Krauss, in this context at least, something of a reactionary boor. This is the same kind of defensive, know-nothing, tough-guy posturing we see all too often in the "hard" sciences, and it is completely unnecessary.





Sunday, April 22, 2012

Law of the Jungle (re-post)

[Here's a re-post from 2009. I'm posting it again because it gets at one of the big problems raised by the "hard-determinist" or "incompatibilist" view that free will is an illusion: If our thoughts are not something we actively and consciously engage in, but rather the pre-determined "effects" of prior genetic and environmental causes, then how are reason and morality even possible? I've made a few slight revisions to the original post, and elaborated on a couple of points that were earlier unclear.]

In working toward a definition of human nature and intelligence, Kant drew a distinction between the actual and the potential -- that is, the world as it is, and the world as it might be. Even allowing for some porosity between humanity and other species, it should not be controversial to suggest that such a distinction does not exist in any developed form in non-human intelligences. (I exclude so-called "artificial intelligences," which are really just extensions of human intelligence by other means). So far as we know, only humans have "oughts." To whatever extent non-human organisms choose their behavior, they do not do so by reasoning among choices, for this would require a symbolic thought process they do not possess. (An exception may be the cetaceans, but we'll leave that aside for now).

The appearance of humanity's faculty to envision potential alternatives to "what is" marks the origin of (among other things) morality. Without a system to order our possible choices as preferences, we would either be reduced to paralysis, or forced to return to the realm of pre-conscious behavior. This is to say that the world of actuality (as described by biology, for example) is, for humans, transected by a realm of thought not confined to its borders, and often in opposition to it. Indeed, one of the main functions of language is to discourse on things that do not, but might, exist. Highly ordered metaphysical schemes like those of Plato or of Christian theology are specific manifestations of this kind of transcendence, but no system of thought is completely free of it. Even supposedly amoral "anything goes" philosophies, like Nietzsche's or Sartre's, stand in opposition to an "actual" state of affairs they wish to disparage, such as traditional Christianity, or "bourgeois" values.

The question that emerges for an ethical system that purports to be "naturalistic" (that is, explained entirely in biological terms) is this (very old) one: Given our ability to imagine multiple possible worlds (if not, in fact, our inability to refrain from imagining them), what is to be our rationale for choosing among them? Any answer that appeals to biology alone will fail to account for moral reasoning (if not culture altogether), since the distinction between "is" and "might" cannot be found in genes or neurons. It is a property only of minds, which is to say of intelligence experienced subjectively. (We can dispense with the silly objections about dualism here, I hope.)

Before it is proposed that no moral philosophy would ever try to explicate an ethos in strictly biological terms, let's look at a famous article by the Australian philosopher J.L. Mackie (1917-1981), titled "The Law of the Jungle," and published in Philosophy in 1978. This paper was one of the first philosophical responses to Richard Dawkins' The Selfish Gene. Dawkins himself was careful in that book not to imply that biological "selfishness" (that is, the persistence of successful traits throughout time) justified psychological egoism. Mackie's take was far less cautious.

The main body of "Law of the Jungle" is a fairly innocuous exploration of a type of group selection that Dawkins overlooked in The Selfish Gene. But he closes with a palpably ethical conclusion:
What implications for human morality have such biological facts about selfishness and altruism? One is that the possibility that morality is itself a product of natural selection is not ruled out, but care would be needed in formulating a plausible speculative account of how it might have been favoured. Another is that the notion of an ESS may be a useful one for discussing questions of practical morality. (my emphasis)
ESS, as readers of The Selfish Gene know, stands for "Evolutionarily Stable Strategy," which is a type of biological homeostasis worked out by game theorists. ESS theory is called upon to demonstrate why "reciprocal" altruism exists in populations where we might expect a brute selfishness to prevail: Since a pugilistic stance is thought to require a huge outlay of energy (having constantly to defend oneself in fights), the smart strategy would be to lie low and live in harmony until that harmony is disrupted by another member of the population.

Following Dawkins, Mackie cites the example of bird grooming behavior, which Dawkins models as a contest among three strategies: Sucker, Cheat, and Grudger. The Sucker embodies the extreme of complete altruism, removing ticks from other birds without reservation. The Cheat embodies pure selfishness, allowing other birds to remove its ticks but never going out of its way to return the favor. The Grudger bridges the difference, grooming all other birds with the exception of those who don't reciprocate.

Game theory predicts that the Grudger "strategy" of reciprocal altruism will spread through a population, displacing the less sophisticated strategies of pure selfishness or pure altruism. And so it may. And we might pause to notice, as Mackie does, that there is an echo in this strategy of our own concept of fairness. ("Do unto others...")
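
(The model is simple enough to run for yourself. Below is a minimal sketch in Python of the dynamic just described; the payoff values, the number of encounters per pairing, and the crude update rule are all illustrative assumptions of mine, not figures from Dawkins or Mackie, so take it as a cartoon of the game theorists' result rather than a reconstruction of it.)

```python
# A toy version of the Sucker / Cheat / Grudger grooming game described above.
# The numbers and the update rule are illustrative assumptions, not Dawkins' or Mackie's.

BENEFIT = 5.0          # assumed value of having one's ticks removed
COST = 1.0             # assumed cost of spending time grooming a partner
ROUNDS = 10            # repeated encounters between the same pair of birds
SHIFT = COST * ROUNDS  # keeps the fitness weights non-negative below

def pair_payoffs(a, b):
    """Play ROUNDS encounters between strategies a and b; return their total payoffs."""
    score_a = score_b = 0.0
    grudge_a = grudge_b = False  # a Grudger withholds grooming once it has been refused
    for _ in range(ROUNDS):
        a_grooms = a == "sucker" or (a == "grudger" and not grudge_a)
        b_grooms = b == "sucker" or (b == "grudger" and not grudge_b)
        if a_grooms:
            score_a -= COST
            score_b += BENEFIT
        if b_grooms:
            score_b -= COST
            score_a += BENEFIT
        grudge_a = grudge_a or not b_grooms
        grudge_b = grudge_b or not a_grooms
    return score_a, score_b

def generation(freqs):
    """One crude fitness-proportional update of the strategy frequencies."""
    weights = {}
    for s in freqs:
        # expected payoff of s against a partner drawn at random from the population
        expected = sum(freqs[t] * pair_payoffs(s, t)[0] for t in freqs)
        weights[s] = freqs[s] * (expected + SHIFT)
    total = sum(weights.values())
    return {s: w / total for s, w in weights.items()}

freqs = {"sucker": 0.4, "cheat": 0.4, "grudger": 0.2}
for _ in range(60):
    freqs = generation(freqs)
print({s: round(f, 3) for s, f in freqs.items()})
```

With these made-up numbers the Cheats are driven to the edge of extinction and the Grudgers end up the most common strategy; a contingent of Suckers survives alongside them, but only because, once the Cheats are gone, nothing in the model can tell a Sucker from a Grudger.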

This is not a problem as far as it goes. Birds have been employed as symbols of justice and wisdom at least as far back as Athena's owl. We find the flock-as-jury in Farid Ud-Din Attar's allegorical poem The Conference of the Birds from the 12th century, and in Chaucer's Parlement of Fowles 200 years later. As valuable as modern ethology is, it is nothing new to demonstrate that we share with birds certain social norms.

But this is a far different thing than asserting, as Mackie does, that because some birds have evolutionarily developed behaviors which are "healthy in the long run" and which resemble our own notion of fairness, our notions of fairness are thereby justified. Other far less savory bird behaviors, such as eating the young in a neighboring nest, would appear to be just as "stable" as grooming behavior. Are they, too, to be adopted as preferred human behavior?

After the standard disclaimer that "there is no simple transition from ‘is’ to ‘ought,’ no direct argument from what goes on in the natural world and among non-human animals to what human beings ought to do," Mackie goes on to promote exactly that argument. After linking reciprocal altruism to our modern common-sense notions of fairness (although it bears a much closer resemblance to older modes of justice like the vendetta or blood feud--"An eye for an eye"), he associates the "Sucker" strategy with the philosophies of Jesus and Socrates, who advocated, he says, "repayment of evil with good." Then, switching back to ESS theory, he writes:
[A]s Dawkins points out, the presence of suckers endangers the healthy Grudger strategy. It allows cheats to prosper, and could make them multiply to the point where they would wipe out the grudgers, and ultimately bring about the extinction of the whole population. This seems to provide fresh support for Nietzsche’s view of the deplorable influence of moralities of the Christian type.
This alternation between discussions of biological stability and moral programs happens so quickly that it's easy to miss Mackie's move, in this paragraph, of using the "is" of biology to justify ("provide fresh support for") the "ought" of the Nietzschean moral structure. But it's there, in very clear terms: Always retaliate. It works for birds! (Note also there is no historical evidence that Nietzschean morality is more "evolutionarily stable" than Christian morality.)

(It's important to mention in passing that Mackie is wrong on the science too. According to ESS theory, if the cheats prosper and wipe out all the suckers and grudgers, we are left with a stable population of cheats (until the rise, through variation and selection, of new grudgers, which would re-dominate the population.) It would be no less "healthy," in biological terms, than an all-grudger population. We can hypothesize that there might be more sickness through tick infestation, but whether this is sufficient to threaten extinction is not captured in this particular model, which only measures the relative effectiveness of the three strategies within a closed system.

What this suggests is that Mackie has unwittingly added a moral dimension to the grudger strategy among birds where none belongs. At the same time he is employing the example of bird populations to demonstrate why Nietzschean morality is better than Christian morality, he is simultaneously using our own human concepts of fair play to valorize the behavior among birds. Everyone knows an all-cheat population would be "bad," after all.)
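
(For what it's worth, the same toy simulation sketched above, with the same assumed payoffs, bears this out: seed it with a nearly pure population of Cheats and the Cheats stay where they are, since a rare Grudger spends almost all its time meeting Cheats and never recoups the cost of that first trusting grooming. The snippet below reuses the definitions from that sketch rather than standing alone.)

```python
# Reusing pair_payoffs() and generation() from the sketch above (same assumed numbers).
freqs = {"sucker": 0.01, "cheat": 0.98, "grudger": 0.01}
for _ in range(60):
    freqs = generation(freqs)
print({s: round(f, 3) for s, f in freqs.items()})
# The Cheats remain near fixation: "stable," in the model's terms, though hardly thriving.
```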

Mary Midgley, in her famous response to Mackie, which kicked off her ongoing feud with Richard Dawkins (a "Grudger," in temperament, if there ever was one), points out the fairly obvious shortcomings of such a linkage between evolutionary stability and ethics. Like the birds in the game theorists' model, we appear already to be congenitally prepared by our genes to retaliate against transgressions against us. We need no special help from the world of ideas--the realm of the possible--to remember to do harm to our enemies when transgressed upon. As Midgley puts it, "The option of jumping on one’s enemies’ faces whenever possible has always been popular." She does not, however, follow Mackie's lead in suggesting that the only other option is to make a wholesale replacement of the strategy of retaliation with a strategy of saintly restraint. She suggests that the ethos of repaying evil with good arose as an intelligent, reasoned--not dogmatic--response to the limitations of our emotional makeup:
This disregard of the essential emotional context reappears in Mackie’s idea that the undiscriminating ‘sucker’ behaviour is one recommended by Socrates and Christ. Neither sage is recorded to have said ‘be ye equally helpful to everybody’. Both, in the passages he means, were talking about behaviour to one narrow class of people, with whom we are already linked, namely our enemies, and were talking about it because it really does present appalling problems. (my emphasis)
She goes on:
Of course charity and forgiveness have their drawbacks too, especially if they are unintelligently practised. As Mackie rightly says, there are problems about reconciling them with justice, and justice too has its roots in our emotional nature. There are real conflicts here as both Socrates and Christ realized. (my emphasis)
In other words, in the moral realm we are dealing with considerations far beyond the ability of game theory to effectively model. The issue now becomes one of flexibility and versatility, which are dramatically multiplied in the human capacity to represent things symbolically, and of intelligence--the ability to hold multiple variables in one's consciousness while working out a problem. There is simply no way for specific usages of complex reason to be genetically encoded or learned by rote: there are far too many unknown contingencies to account for. Game theory is unlikely to predict the Pythagorean theorem, the Critique of Judgement, the Theory of Evolution by Natural Selection, the Parable of the Talents, or even "The Law of the Jungle." (Mackie's article, not Kipling's maxim, though probably that too.) Fortunately, we need no more than what we already have: a formal, methodical study of reason and symbolic representation. As soon as we stop letting fears of "dualism" drive us into the arms of an untenable "naturalistic" understanding of human thought and behavior (which is really just a zombie version of that old hard-to-kill doctrine of Behaviorism), we can return to an actual, meaningful study of the human condition.