underverse

Wednesday, December 18, 2013

There Are No Santa Deniers In Foxholes

[My piece for the War on Christmas edition of Write Club ("The Yulening"), at The Hideout, December 17th, 2013. The bout was Santa vs. Jesus.]

Dear Margaret,

Tomorrow I ship out for my 5th tour of duty in the War on Christmas. I’ll be stationed somewhere near a place called Altoona, Pennsylvania. I doubt I could even find it on a map. Ha ha.

I’m being assigned to a creche removal and neutralization unit, and we’re being told that we shouldn’t expect to see too much heavy action. Which is good, because my PTSD definitely isn’t getting any better. It takes just a single sleigh bell on a car commercial for me to break into a cold sweat, as my hand instinctively reaches for my service revolver. They said this war would be a cakewalk, that every grinch and scrooge would come out of the woodwork and greet us with chocolate and flowers. Instead we got candy canes and boughs of holly, and some of the toughest fighting any of us have ever seen.

On my last tour, one of the guys in my unit was telling me that there used to be just twelve days of Christmas. Just Twelve Days! You put your wreath on the door on Christmas Eve, and you took it down on something called “Epiphany.” Then life went back to normal, I guess. They called it Christmastide, and they had a big feast on each day. At first I found all this really comforting. I loved that the word for the Christmas season was “tide” — it made me think of listening to the surf coming in and out at that cottage I used to rent on Nasketucket Bay. I’ll tell you, that’s a nice memory to have when you are pinned down in a damp foxhole for days on end. And I got to thinking how time really is like a tide, how one event flows into another, like day turns into night, and how there’s really a time for everything. And that made me realize that the idea of having a War on Christmas was a horrible mistake, that it was basically like having a war against time itself, and that Christmas was really just like an infection that would go away on its own. And that idea was really comforting.

But one thing war gives you is a lot of time alone with your thoughts, and that can be a dangerous thing. It didn’t take long for me to remember that we don’t have twelve days of Christmas anymore, we have—well, I can’t even count them. It used to be that Thanksgiving was a bulwark against Christmas’s terrible insatiability, but now with all the big box stores opening at midnight on Thanksgiving eve, it’s like nothing can stand in Christmas’s way anymore. Instead of Christmas-tide it’s like we have a Christmas tsunami. And that just scares the hell out of me.

This same Sergeant in my unit who told me about the twelve days of Christmas also told me that in England in the 17th century, the Puritans had their own war against Christmas. Cromwell even succeeded in having Christmas criminalized in 1647, which is far more than we’ve been able to get Congress to do. When the Royalists took power again in 1660, Christmas was restored, but the sense that all that merrymaking was too uncouth for true Christianity never really went away. And by now you had Puritans fleeing to America by the boatload.

By the early 19th century, Christmas had just run out of steam, at least according to my Sergeant. His theory was that the industrial revolution made twelve days of feasting impractical. “Dark Satanic Mills have no patience for the liturgical calendar,” he told me one chilly October morning, as we warmed ourselves by a bonfire of plastic reindeer we had just seized from a group of singing children. Three days later he was killed by an improvised explosive device made out of discarded tree ornaments. They found a little wooden Tyrolean elf wearing lederhosen lodged in his medulla oblongata. Death was instantaneous.

I think the point Sarge was trying to make was that Christmastide was traditionally just a big two-week party, with drinking and feasting, the Lord of Misrule, and all that. Once the logic of industrial capitalism took all that away, there just wasn’t enough substance in the Nativity story to pick up the slack. I mean, think about it, once you get past the virgin birth business, there’s just not that much to talk about.

At the same time, you have this mythologized folk version of Saint Nicholas floating around the periphery of the culture in Dutch New York. It’s right after the American Revolution, and people are desperately searching for a cultural heritage that is not British. Introduce St. Nick in the mass media at just the right time, and you have the perfect vehicle to transform Christmas from a rowdy Bacchanal to a wholesome, pastoral children’s holiday. And that’s just what happened. You have these propaganda pieces that start showing up—Washington Irving giving St. Nicholas a major role in the “Knickerbocker’s History of New York,” and Clement Clarke Moore, who was this slave-holding real estate baron from Chelsea, adding the reindeer in “A Visit From Saint Nicholas” — that’s the one that starts “’Twas the Night Before Christmas.” And all of a sudden, Santa is off to the races.

When I look at everything that Santa has been able to accomplish that Jesus never could—it’s like when Lincoln replaced General McClellan with Ulysses S. Grant. Just, game over. Santa Claus is scalable in a way that Jesus never could be. Jesus’s big weakness is that he’s just too sacred to be commodified. Like McClellan, Jesus never really changed his tactics in 2,000 years. Get born, lie down in a manger, get visited by the magi. Santa is constantly changing his tactics. He starts with just stockings, then, over the next few decades he adds the reindeer, the chimney, the elves, the List. The List! In all of military history, no one who has kept a list has ever lost a war.

Well, it’s getting late, Margaret, and I should probably wrap this up. I’ve got a long journey to Altoona ahead of me in the morning. I hope I haven’t darkened your spirits too much. Sarge could be kind of a crackpot, frankly, and I guess we should take what he said with a grain of salt. All I know is that we’ve lost a lot of good men in this war, and Christmas just keeps getting bigger.

Sunday, December 08, 2013

The Gene Supernatural

The neo-Darwinian old guard has come out hard against David Dobbs' (admittedly inflammatory) article in Aeon, "Die, Selfish Gene, Die," which summarizes some recent and not-so-recent objections to gene-centric "modern synthesis" evolutionary theory. Evolutionary biologist Jerry Coyne devoted two posts to demonstrating how "muddled" Dobbs' piece was, and Richard "Selfish Gene" Dawkins himself responded on his site that nothing in Dobbs' article contradicted the theory that he laid out in his landmark book The Selfish Gene. Steven Pinker went so far as to use the opportunity to characterize all science journalists as "congenitally" sensationalist.

But PZ Myers, erstwhile ally of the aforementioned gentlemen and scholars in their struggle against theism, has come out with two posts at Pharyngula strongly defending Dobbs' basic argument that the gene-centric "modern synthesis" is no longer fully supported by molecular biology.

Five-odd years ago, I was engaged with Myers in a brief but bitter squabble, after I dubbed his "Courtier's Reply" argument the "Lout's Complaint." He replied that I was "clueless" and that he was "proud to be a hooligan." I got a flurry of comments from his readers calling me soft-headed, and then it was over.

When it comes to religion, Myers is still a hooligan. (Sorry, Paul!) But I was extremely pleased to see him strenuously and heterodoxically critique the gene-determinism that has erupted among some of his celebrated colleagues.

Dawkins' "selfish gene" is a beautifully elegant theory. It is near-impossible to argue with the logic of its central premise. To read it is to be utterly convinced that nothing but the gene could ever possibly be the unit of selection, and it is extremely valuable in helping to overthrow popular misconceptions of natural selection, such as that "traits survive for the good of the species."

The problem, as Myers shows in relentless detail in his post, is that genes don't operate in anywhere near the idealized fashion that Dawkins describes. I would propose that this is because in nature, "genes" don't actually exist. We can get a sense of this by observing the way that defenders of gene-centric theory alternate between incongruous definitions whenever their theory comes under attack. In Chapter 3 of The Selfish Gene, Dawkins famously defines the gene as any portion of chromosomal material that persists long enough to serve as a unit of heredity. A little bit later he expands on this when he tells us not to worry that a complicated trait (like the mimesis pattern on a butterfly) seems too complex to be controlled by a single gene: we can just redefine the gene as whatever cluster of DNA is responsible for the pattern. (He later proclaims the triumph of his tautology: "What I have done now is define the gene in such a way that I cannot help being right.")

Then, rather astonishingly, he goes on to say that his concept of the gene is not definitive in an absolute yes-or-no way, like an electron or an elephant or a comet, but relative, like size or age. A gene may be more or less gene-y, compared to other genes. Dawkins is very explicit that his mission here is to rescue Mendelian genetics by expanding the concept of the particulate unit of heredity to whatever scale it needs to be for the theory to work.

Perhaps ironically, this is the same slipperiness that gives fits to anti-theists whenever the topic of God's causality comes up. What is the nature of God? Whatever it needs to be to explain the perceivable world around us. What is a (Dawkinsian) gene? Whatever it needs to be to explain the transmission of a corresponding phenotypic trait. Little surprise then, when critics poke holes in the theory, drawing on recent (and not so recent) findings in molecular biology, that Dawkins is able to reply, "Why, my definition of the gene can account for that too!"

Among molecular biologists, the gene was for many years typically defined as the portion of the genetic code (also called a cistron) that carries instructions for the manufacture of an individual enzyme. In Crick's phrase: "DNA makes RNA, RNA makes protein, and protein makes us." The mechanics here are much easier to observe than in the Dawkinsian usage, but even here the definition is not as clear as it would seem. DNA sequences often need a lot of "editing" before they are converted into RNA sequences, and there is in fact no one-to-one correspondence between cistrons and proteins. Some proteins get built from an RNA sequence that has no equivalent in the DNA. This presents some pretty tough challenges for any theory that proposes that heredity is strictly a "genetic" phenomenon, unless we are prepared to count as "genes" any number of factors that are not stored in DNA, and whose manner of hereditary transmission, if it exists at all, is unknown. (Note how Jerry Coyne--whom Dawkins calls his "go-to guru on population genetics"--bases his entire rebuttal on the notion that regulatory factors "must" reside in the DNA, which seems to indicate that theoretical population genetics has become seriously unmoored from molecular biology).

There's much more to the story: epigenetics, evo-devo, genetic assimilation, and genetic redundancy, much of which you can read about by clicking on the Pharyngula links above. The point I want to make is that we can go one further than David Dobbs and the scientists whose work he summarizes. It's not just that "Selfish Gene" biology is overly gene-centric and deterministic. It's that the central metaphor of that paradigm is based on a spook. The "gene" is barely even a coherent concept, let alone a natural entity that could have causative powers. For a century it has convoluted the way we think about morphology and heredity. If we were feeling especially uncharitable, we might even be tempted to call it ... a Delusion.








Sunday, August 11, 2013

On Ears Of Tin

Speaking of quasi-racists, Richard Dawkins' new career as Twitter troll is progressing brilliantly. Brandon Watson at Siris says pretty much everything I would be tempted to say about Dawkins' latest moustache-twirling, but I want to add a couple of additional comments.

As with the Hedy Weiss flap in my prior post, so much depends on the notion of race. After his original tweet on the topic of Muslim Nobel Prizes, Dawkins defended himself against charges of racism by observing, correctly, that "Muslims are not a race." This is not a particularly satisfying response. Jews are likewise not a race, but this fact did not prevent numerous historical attempts to banish or exterminate them on grounds of "racial" purity. But the fallacy goes much deeper than this. Not only does racism not require a "race" to operate upon, but in fact it is required to operate in the absence of one, given that there is no such thing as race. Race is an outdated, pseudoscientific 19th-century concept with about as much scientific validity as the four humors (probably less).

Someone must have raised this point with Dawkins after he tweeted about it, because in an FAQ-style collection of "calm reflections," he admits that race is a "controversial" topic, and then, in response to the objection that race is a sociological, not a biological phenomenon, begs to differ:
I have a right to choose to interpret “race” (and hence “racism”) according to the dictionary definition: “A limited group of people descended from a common ancestor”. 
Of course this just further obscures the fact that no such group exists on earth.  Shall we pause here to recall that Dawkins is regarded as one of the world's most prominent biologists? He continues...
Sociologists are entitled to redefine words in technical senses that they find useful, but they are not entitled to impose their new definitions on those of us who prefer common or dictionary usage. (my emphasis)
I don't know how to read that sentence except as a defense of folk-etymology lexicography over actual scholarship: "Yes, I realize that there is broad consensus in both the humanities and the biological sciences that race is a cultural, not biological phenomenon, but look here in the dictionary!"

The word "racism" remains useful to us, despite the non-existence of a biological substrate on which to rest it, because all the substitutes available to us seem too watered down: bigoted, ethnocentric, prejudiced--all sins, to be sure; but only "racism" conjures the requisite degree of wild, animal hatred. Without going too deep in the weeds, we can safely redefine racism as tribalism, with all the paranoid fantasy that attends to it: those people, the ones who are colored differently than us, who dress differently, talk differently, who keep to themselves, those people are not to be trusted.

So, while it's true that Muslims are not a "race," neither are "blacks," neither are Amerindians, Rom, Arabs, Hispanics, or Asians. (Neither are Caucasians, or "Aryans.") Where does that leave us? We can still, in all those cases, and so many more, project complex traits onto these socially-defined groups based solely on adherence to the tribe: stupid, lazy, thieving, murderous, warlike, fanatical. And indeed, Dawkins does veer close to this kind of characterization when talking about Muslims (though he's nowhere near as bad as his comrade Sam Harris, who has literally stated "you just can't reason with these people.")

Look, for example, at the original tweet on Muslim Nobel laureates:
All the world's Muslims have fewer Nobel Prizes than Trinity College, Cambridge. They did great things in the Middle Ages, though.
In case the context weren't clear enough, Dawkins elaborates in his calm reflections:
I certainly didn’t, and don’t, imply any innate inferiority of intellect in those people who happen to follow the Muslim religion. But I did intend to raise in people’s minds the question of whether the religion itself is inimical to scientific education.
What jumps out right away is that the hypothesis is instantly self-refuting. Until around the 13th century these very same Muslims led the world in scientific exploration. There are many competing theories for why this embrace of reason did not persist, but it's clear that the only way we could blame the religion itself for the decline would be to infer that 11th century Islam was significantly more enlightened than the variant practiced today. Any takers?

So the "question of whether the religion itself is inimical to scientific education" is, at best, staggeringly ignorant. At worst, it's race-baiting. I can't peer into Dawkins' heart to say where he falls on that continuum, but there are no commendable options.

Brandon writes that Dawkins' bafflement (at how anyone could characterize his discourse as less than perfectly reasonable) seems perfectly genuine. This is not incidental to the question of how damaging his remarks may be. Like the old patriarch at the dinner table who doesn't know, or can't accept, that it's no longer acceptable to call women "skirts," and opens old wounds with each utterance despite his insistence he "didn't mean anything by it," at a certain point innocence becomes a mask for a lack of empathy, which all-too-predictably slips into a narcissistic martyrdom. "You're the real racist!" (Yes, he said this.)

And is it really just a "tin ear" that adds insult to injury by following on the heels of his calm reflections with a tweet musing on why Jews are so disproportionately represented by the Nobel Foundation, then linking to Steven Pinker's 2005 address to the YIVO Institute for Jewish Research on studies claiming that Ashkenazi Jews have a higher IQ than other groups? Whether the studies have merit or not, they can only be relevant at all by undermining Dawkins' earlier insistence that "Jews are not a race" (at least when it comes to Ashkenazim), but no apparent cognitive dissonance ensues. (And surely it is just an unconscious slip when he invokes the image of a global cabal in a follow-up tweet that says "I want to know their secret in case we can copy it.")


Thursday, August 08, 2013

While we're being honest

Chicago theater critic Hedy Weiss has responded to critiques of her support of racial profiling in her review of Silk Road Rising's Invasion. The short version: "Hey, I was just being honest." (Read the whole thing at Jim Romenesko's blog.)

My reply is below.


Ms Weiss,

You are quite correct to point out that we bring the world with us every time we enter a theater. That fact makes it all the more incumbent upon those of us with a wide and public readership to be perspicacious when musing upon that world, making sure it is indeed the whole world and not just the provincial byways of our well-worn comfort zones. It is not enough to be "honest," if this honesty entails no more than a venting of one's unexamined biases, especially when one has been called upon to justify one's remarks by those hurt by them. And that is what is at stake here: the pain of those who wonder why you would call it a "necessity" that they be treated with suspicion because they share a name, a skin tone, or a mode of dress with a very small number of deranged religious fanatics.

You state this "necessity" as a bald fact, after it has been pointed out to you that profiling is not in fact an effective mode of law enforcement--was, in fact, not effective in the prevention of the Boston Marathon bombing. After it has been pointed out that the State Department alert you cite is a travel advisory for Americans traveling abroad, not for attacks on US soil, making the question of ethnic profiling entirely irrelevant. After it has been pointed out, should you need reminding, that countless innocent people have suffered from racial profiling in the last decade, many of them gravely. You had an opportunity to collect your thoughts and add some fine shading to your earlier remarks. Perhaps we misunderstood you? Perhaps you had some little-known facts supporting your controversial position? Perhaps you had developed a novel and nuanced moral argument that showed your critics to be overly hasty?

Alas, no. What you gave us instead was a restatement of your original remarks in all their crudity, without the barest attempt at justification, and without a glimmer of possibility that you had any real sense of why Arabs, Muslims, and South Asians (and, as long as we're being honest, just about anyone with a smattering of human empathy) would be distressed by your support of such a barbarous, unjust and counterproductive practice. Your clarification here brings to mind letters to the editor in small town papers, insisting that everybody knows that the races shouldn't intermingle, or that girls who dress provocatively get what they are asking for, invariably adding "Hey, I'm just being honest," as if that provided some kind of defense against bigotry or misogyny.

I suspect that's not the effect you are shooting for, and so I urge you to, please, the next time you feel a wave of honesty coming on, consider whether your "visceral reactions" might include opinions best left to your more private social interactions, until you are truly prepared to have the elevated conversations warranted by their tendency to inflammation and harm.


[Cross-posted in comments at Romenesko's].

Saturday, March 30, 2013

Finish

[Originally written for the September 18th 2012 edition of Write Club, where I soliloquized on “Finish” against Ian Belknap’s “Start.” Mine was the moral victory. In any case, a fitting post for Easter.]

When I was a child of 11 or 12 I was given, by my parents, the soundtrack of Jesus Christ Superstar for Christmas. Even though I had already decided by this young age that there was no god or heaven, I was still obsessed by one particular section, very near the end: Jesus is moaning on the cross, his senses bewildered by all sorts of buzzes and cackles and demonic chanting, until finally he says “It is finished. Father, into your hands, I commend my spirit.” And there the track abruptly ended, buzzes and cackles and all. In the sudden silence it felt a little as though the whole world had ended. I was fascinated and terrified by the magical finality of this ending. He said those words, and then ceased to be. I would lie awake at night convinced that if I too were to utter those same words, then I too would cease to be. My non-existent soul would be claimed by this non-existent Father, just as non-existent Jesus’s was. I was even a little afraid I might say the words by accident. In hindsight, it was probably a bit of wish fulfillment, as most fears are.

When we talk about being finished, we’re talking about being dead. Or not being dead, rather—you can’t actually be dead; to be dead is to not be. There is no aspect or quality of “being” called deadness. You can’t exist in a deadish fashion, deadily. Our grammar just breaks down if we try. We can’t even say that “so and so died.” Dying isn’t something that you can do, because it’s the end of “you.” By the time you get to the end of the sentence the subject is already gone. You start out with an Abbott and Costello routine—Who died? And you end up babbling like Vinny Barbarino: What… Where … I’m so confused!

We are compelled by language to think of death as just some new state of extreme inactivity. I’ll sleep when I’m dead, we say, when our death doesn’t actually seem so close. I will miss you, we say when it does. We just can’t get it through our dumb dying heads that there will be no I or we to sleep or miss or even to not sleep and not miss. We will be finished, except no we won’t because if you are, if you are being, then you are not finished. It’s called grammar. Just go ahead and try to argue with it.

Meanwhile, for every single thing except for us—except for you and me and everyone else that couldn’t be here today—we have this law of conservation of matter and energy, so that nothing ever ceases to be, it just turns into something else. For every single thing except for us, nothing is ever finished. The story of every single thing except for us can never end; there’s always something more to say, some story within a story. For a while it looked like the universe was ultimately headed toward a state of entropy or heat-death, but now we have these “multiverses”, an infinite number of worlds—each with its own conditions of suspended disbelief. And that makes even our heat death universe just a little bit more suspenseful. Because maybe we’re actually in the universe where everything crawls to a complete standstill for eons and eons, and then one day a bunch of balloons and crepe streamers fall from the sky celebrating our one trillionth millennium of the perfectly distributed stasis of all matter and energy. I mean you just can’t know.

Speaking of crepe streamers, I am reminded that roughly around the same time I was lying awake making sure not to accidentally Commend My Spirit, I was attending an elementary school whose students carried on a tradition of flying crepe streamers out the window of the bus on the last day of school. Not the last day of school—there’s still such a thing as school, school still exists—but the last day of the school year. In all my life I have probably never participated so fully in a ritual as I did flying those streamers out the window of that school bus. To this day crepe streamers have a mystical quality to me, archetypal and primordial in their perfect, tightly coiled state of origin. The dry rustle as we unfurled them out the window to catch the mild June afternoon breeze seemed to be an involuntary gasp of anticipatory joy, and for the entire bus ride home the streamers’ fluttering, like Tibetan prayer flags, seemed to liberate us from time and everyday reality. The whole bus, and all of us in it, had become a benevolent dragon from some Madeleine L’Engle book. School was over and what lay before us was the eternal forever of summer.

And then summer ended and school returned, and there were a few rituals for that too, new school clothes, new school books, new sharpened pencils for writing on clean white new school pages, we were starting over, being reborn, but not entirely convincingly. It nagged a little that what we had so triumphantly put behind us three months before was back, unvanquished after all, undead, like that Jesus guy. School, like that Jesus guy, had something more to say, and it was good news only in the way that Brussels sprouts were good, or good manners were good, which is to say very, very bad.

Only we can ever really end, only you and me and everyone else that couldn’t be here today. Everything else just goes on, or turns into something else, is reminded of some new important thing to do or be or say, and it’s a fearsome thing, to be finished with something that’s not finished with you. This is what maybe was so memorable about that scene from Jesus Christ Superstar—the dude just winked out like a light. And it was really finished, except no it wasn’t! he came back, and did more stuff, and according to John of Patmos at least, he’s going to do even more stuff later. And we have to keep hearing about it. It is so manifestly not finished…

At this point Scheherazade lapsed into silence. Her sister Dunyazade said to her, “What an unusual and entertaining story, sister. If you are not too sleepy, will you tell us what became of this strange, unsatisfied man and his oratorical contest?” “With the greatest pleasure,” said Scheherazade. “But this story is nothing compared to the one I will next relate: The tale of the three wise judges…”

Adelaide's Fork

[I read this piece at the Ray's Tap Reading Series on March 16, 2013. The evening's theme was "Manners, Please."]

It is said that at meals the Holy Roman Empress Adelaide would hold aloft her fork midway between palate and plate. It was the custom in 10th century Europe for those dining with the King and Queen to stop eating when their Sacred Imperial Majesties set down their utensils. Adelaide, who had the appetite of a sparrow, knew her guests would starve if she indicated she was done eating prematurely, so she would pantomime in this manner, her fork extended in the air, as though posing for a painting.


Adelaide died in the year 999, and thus was not able to attend--or strike a pose at--Judy Chicago's 1979 installation piece, The Dinner Party, now on permanent exhibit at the Brooklyn Museum, a piece featuring place settings for 39 women from history and folklore: Sappho, Judith, Ishtar, Emily Dickinson, Margaret Sanger, and 34 others. Adelaide did, however, somehow connive to have her name inscribed on a porcelain tile on the floor on which the banquet tables rest, the “Heritage Floor,” bearing the names of 999 women in all, also drawn from history and folklore, and chosen to contextualize and support the 39 guests of honor.

Adelaide was canonized by Pope Urban II in 1097, but before she was Saint Adelaide, or even Empress Adelaide, she was just plain old "Addie from the Burg," Adelaide of Burgundy, named for the Kingdom where she was born―and, not coincidentally, of which her Dad, Rudolph II, was King. Judy Chicago was born Judith Sylvia Cohen in 1939. It was the custom at that time for children to take the family name of their fathers. In 1959, she took the name Judy Gerowitz, upon marrying her first husband in California, where it was the custom for women to take the family name of their husbands. When she remarried in 1965, instead of taking, this time, the name of her new husband, she took, in defiance of custom, the name of her city of origin.




It is the custom in the city of Chicago in 2013 not to deceive those who have entrusted us with the privilege of delivering our orations unto them. Great was the gnashing of teeth and rending of raiments when performance artist Mike Daisey tried to use "poetic license" to justify bending the facts for rhetorical purposes in his one-man show The Agony and Ecstasy of Steve Jobs. (The title of which is a play on Irving Stone's 1961 biographical novel of the life of Michelangelo.) The popular radio program This American Pledge Drive devoted an entire hour atoning for the social transgression of excerpting the show, crooked facts and all. So I would like to pause here to acknowledge that it was not actually a fork that Adelaide extended in the air, but a knife. Though the table fork was used in the 10th century by nobles in Persia and the Middle East, it was not customary in Northern Europe until as late as the 18th century. To eat using anything but one's fingers in the time of Adelaide was just in bad taste. I said fork, earlier, because it fits our social sense of what the facts should be better than a knife would have. But I want you to know, it was not a fork. It was a knife.

*** 

The custom of naming children according to the paternal family name is a troublesome one, and--second wave feminism notwithstanding--it would be no less troublesome if the convention were exchanged for its matrilineal equivalent, since in either case the family name of one of the parents must be effaced from history. We speak of names and bloodlines as though they were the same, but blood and words do not follow the same rules. Biology and culture do not follow the same rules. Some might say this is why we have culture in the first place―to liberate us from the shackles of biology, of the blood, that realm wherein nothing can be named, but only experienced, in a pulsing crush of now-ness. Each parent gives each child half of its genetic confabulation, but this does us little good when it comes to saying who each of us is. Sure, we're all children of the mitochondrial Eve, but try writing that in your next artist's bio. Try telling that to the person behind the counter when you renew your driver's license.

It is the custom today in certain parts of Brooklyn, New York, to try to circumvent this problem by naming one's child after both parents. So Sally Smith and Tom Jones have a baby girl, whom they name Eliza Smith-Jones. But what happens when Eliza Smith-Jones grows up and marries Ebenezer White-Brown? According to custom, their child will take the name Smith-Jones-White-Brown, and when little Jedediah Smith-Jones-White-Brown grows up and marries Bryce Miller-Rodriguez-Anderson-Sanchez, well, you can see where this goes. This is a custom that, to quote my good friend, Michael of Brooklyn, just doesn't scale.

We can easily reconcile the conflict of blood and bloodline, of culture and nature, as Judy Chicago did, by simply abandoning the whole concept of genealogy, a concept whose social value may have outlived its usefulness. After all, who really cares who David Bowie's parents were? Or Amiri Baraka's? Or Marilyn Monroe's, or Madeline Kahn's, or Louis CK's? However, my purpose here is not to solve your problems but rather to exacerbate them. I seek to raise blisters. In the realm of ideas--which is to say in the realm of customs, of manners--we accomplish very little if we are not prepared to exaggerate. A name cannot convey everything that we are. It conveys one thing only, with terrifying reduction. If we're feeling clever, we can employ a portmanteau, whose secret code points in two directions, like Texarkana, or spork, or sexcapade. But only two directions. Sometimes three, as with flounder, which is a collision of flounce and blunder, with an echo of founder for good measure. But even here, pointing in three directions, we are talking about a number so many fewer than infinity. In fact, if my math is right, three is infinity fewer than infinity.

You see the pickle we're in, the predicament, the predicklement. We cannot stop the world without names. We cannot transcend the muteness of the ever-flowing now until we implant something in it with just a little staying power, lodging it into the endometrium of the eternal ever-flowing now. And from this implantation grows everything we have ever known and everything we will ever know: language, law, economics, ethics, art, religion, theater, ads on buses, book jacket blurbs, facebook memes, reading series, mustaches... which is to say all the different kinds of manners our species can devise. But just how real can any of it be? Compared to the infinitely ever-flowing now, just how profound, how true, can something as one-dimensional as an identity, as a name, ever hope to be?

 *** 


Bronzino's famous painting “Venus, Cupid, Folly, and Time” is notable for, among other things, the use of the figura serpentinata, that twisted and extended presentation of the human form into a spiral pose. Long before cubism, Bronzino and the other mannerist painters used this and other exaggerated techniques to show that which could not be represented with the techniques of classical naturalism. Torsos with both breasts and buttocks simultaneously vectored toward the viewer. Background figures with no fealty to classical perspective or unity of light and shadow. The mannerist painters knew that the word “grotesque” has its roots in the Greek word krypte, the hidden; the concealed. In order to display what was real and true, they had to portray those things which would never emerge in the natural, phenomenal world, not even given a thousand eternities. Things that may only come to be when they are contorted, extended, stretched, embroidered, masked, mocked.

That story I told of Adelaide, with first her fork, then her knife, I really don't know what utensil she used, don't know if that bit about having to stop eating when the Queen stopped is true, don't know if when she held her knife or fork aloft in the air, she also twisted her torso so her breasts and buttocks were pointing in the same direction, don't know, don't much care. I do care, just a little, that someone just told you a story about it, and that someone happened to be me, whoever that is.

Monday, December 31, 2012

Constitution: A Ghost Story

[Written for Lucky Pierre's America/n, a "13-hour Election Day discussion/performance of the Constitution" at Chicago's Defibrillator Gallery, November 6th, 2012. All of the presentations from the event were later published in book form by Half Letter Press. ]


So many of the basic concepts associated with our history were presented to us at such a young age that it can be very difficult for us to see them afresh. For example: Who were the authors of the Constitution and the Declaration of Independence? Some fellows called The “Founding Fathers,” we reflexively utter. To the extent we give it any thought at all, most of us take this term to indicate those men who founded, built, or established a new nation, conceived in liberty, and so on and so forth. 
But I’m afraid we have fallen victim here to a bit of folk etymology. Sometimes the obvious definition is not the correct one. For instance, our word to “buttonhole” is a misrendering of “button-hold,” a little loop that holds down a button on a garment. And in the context of pinning someone down with your scintillating conversation, it makes much more sense this way, despite the etymological corruption. So too in the case of these men we call “Founders”: Washington, Jefferson, Hamilton, Franklin, Sam Adams, John Jay, Roger Sherman, Patrick Henry—the whole lot of them were actually foundlings, abandoned by their mothers and left to die of exposure, only to be rescued and raised by wolves. 
Happily, the proper recovery of this term can give us important new insights to help understand this most essential of foundational documents, from which so much of our national philosophy, psychology, and jurisprudence springs forth.
***
One of the main functions of a constitution is to locate sovereignty. We’ve deposed the Prince, the traditional residence of sovereign power, necessitating a new home for it. In searching for this home, the first question we may ask is whether our sovereignty is  unitary or federal. Unitary sovereignty is centralized; federal sovereignty is distributed among states or provinces. (This can be a little confusing to those of us who paid attention in history class, because the original Federalists—people like Alexander Hamilton and James Madison--actually opposed the “federalist” model, which they felt was inadequate to the job of effective governance. The first American government, promulgated under the Articles of Confederation, was too decentralized, they argued, while the Anti-Federalists—in other words those who supported the federalist model—argued that placing too much power in a centralized unitary government would only lead to a resurgence in the kind of tyrannical oppression the Revolution had just thrown off. Monarchy again, in all but name.)
Where, then, is our sovereignty located?  Is it in “The People,” in the several states, in the Federal government, in the foundling document itself?
***
The typical wolf litter is around 5 to 6 pups. A female wolf has around 8 to 12 breasts. It is a rare thing in nature for a wolf litter to number higher than the total number of its mother’s breasts. But it is also a rare thing in nature for a group of human infant boys to simultaneously be abandoned by their mothers and left to die of exposure, only to be rescued and suckled by wolves until they are strong enough to fend for themselves. We owe our origin as a nation to a unique historical event, precipitated by Oracular pronouncements that these infant boys would cause great upheaval (as they did). We don’t know exactly how many She-Wolves there were on hand to suckle the Foundling Fathers; that has been lost to history. We do know that at a certain point, the feeding of the Foundling Fathers was supplemented by woodpeckers and other birds. Nothing in the historical record indicates that any of the Foundling Fathers were lost to malnutrition or starvation. But it seems fair enough to surmise that—at least at first—the Foundlings experienced a great deal of anxiety over the impression that there were just not enough nipples to go around—a pathology universally glossed over in the many myths and fairy tales of Foundling heroes.
We can see the remnants of this anxiety reflected in the debate, in the pages of the Federalist Papers, and later at the Constitutional Convention itself, over whether or not to enumerate a Bill of Rights. Hamilton felt that the presence of enumerated rights would imply that any unenumerated rights would be presumed not to apply, which would lead to Tyranny. Anti-Federalists, in turn, argued that enumerating no rights whatsoever would guarantee Tyranny from the start. In both cases it is important, for our present purpose, to mentally substitute for the word rights, the word nipples; and for the word tyranny, a Deprivation of Nipples.
This is our founding document. We should know the minds of the men who wrote it, what their concerns, preoccupations, and even obsessions were. What we discover is that they were so fixated on whether or not there were going to be enough nipples that they never really got around to solving the problem of where sovereignty resided in our system of governance.
Orthodox historians will tell you that the lack of a clear solution owes to a stalemate between the opposing philosophical views of the Federalists and Anti-Federalists, but this view overlooks what all the Foundling Fathers had in common—that they were foundlings! It is much more parsimonious to suggest that they were spending so much energy on nipple anxiety that there wasn’t enough left over to creatively solve the problem of where sovereign power lies. So they fudged it, as in Amendment 10:

The powers not delegated to the United States by the Constitution, nor prohibited by it to the States, are reserved to the States respectively, or to the people.

In other words, some sovereignty is delegated to the States, or possibly to the citizens of those states (to whatever extent these constitute a separate political entity), except that more powers may be granted back to the central government at any time by constitutional amendment, which may be proposed by either Congress or a majority of states. Are we clear?

At the heart of The Constitution and Bill of Rights is this paradox: If Sovereignty resides in the People (“popular sovereignty”), then what do we even need government for? Isn’t the need for a structured legislative, executive, and judicial branch all the evidence you would require for the absence of sovereign power? A mob is not sovereign, nor is any random collection of people in a subway car. On the other hand, if Sovereignty resides in the Government, then what do we need Democracy for? Why should the “will of the people” be any more germane to our welfare than it was to the divine emperors of China or the Holy Roman Empire?
A corollary paradox: if the People who reside in the Several States are the same People who reside in the United States—as they must be—and Sovereignty resides in the People, then how can the states and the Federal government be at cross purposes? After all, each is the political expression of the same sovereignty. Why even have a debate over Federalism at all, if we are taking any of this seriously? Furthermore, if there really is such a thing as “The People;” if we are, as the Preamble says, a unitary “We the People” and not a collection of many peoples or persons, then again I ask what the point of Democracy is. One People, One Vote? 
In that one first phrase in the Preamble, “We the People,” are so many confusions sown. Being united is, you will notice, something that we can’t stop talking about. We’re obsessed with submerging our selfhood into a greater whole, like reverse mitosis. To be united, after all, is more than to merely be allied, or in league, or in solidarity; it is to be fused, like two neighboring vertebrae that have insufficient cartilage between them to continue to function independently. Good fences make good neighbors. “United We Stand, Divided We Fall” is the motto of the codependent family, terrified above all that one member will stand up for herself in a healthy way, disclose the family secrets on Wikileaks, expose the damage, call for accountability. “We must all hang together,” said Benjamin Franklin, “or we will assuredly hang separately.” Well, speak for yourself, Ben.  
This is just the kind of confused pathology one would expect to emerge out of the trauma of being abandoned and left to die, then being suckled by an indeterminate number of she-wolves, and fed by woodpeckers who never seemed to come around often enough, but it never seems to be the right time to bring this up, even now, 233 years after the fact. There’s always some emergency, and if it’s not being abandoned and left to die it’s being taxed without representation by the Imperial British monarchy, or being attacked by Indians, or Spaniards, or the Kaiser, or the bomb-throwers in Haymarket Square, or, Somebody just blew up the USS Maine, or launched Sputnik, or embargoed our oil, or, Violent extremists have taken over the Civil Rights movement, they want to steal your car radio and screw your daughter, or, Somebody just tried to detonate his underwear. There is always some kind of urgent crisis.
And so we eternally fail to confront the fact that we are living in an incomplete, unidimensional political landscape. It all sounds good until you get outside the bubble and start to realize how much doesn’t add up, how much is missing. We got the One For All part, but we left out the All For One part. We got “From each according to his abilities” but we left out “To each according to his needs.” And this makes it inordinately difficult to see actual hardship, privation, or injustice when it resides in an individual citizen or household. We can’t see the trees for the forest; the persons for The People.
***
Foundlings need to survive, and to keep from going crazy they often need to make up elaborate fantasies. But once these fantasies serve their purpose, they tend to just get in the way. At a certain point, these fantasies become useless fictions. Ghosts. 
One of the ways you can tell that the Constitution is a ghost-filled place is that the Supreme Court is always trying to have séances with it. It’s common practice when the Court convenes for Justice Scalia to actually drag out a Ouija board and try to contact the Spirit of the Original Intent of the Words of the Constitution. Scalia, like all originalists, believes that the We The People Ghost of 1789 is real, and treads the earth in chains, like poor Jacob Marley. (Little known fact: Before every session of the Supreme Court, Scalia makes sure to have an undigested bit of beef, a blot of mustard, a crumb of cheese, and a fragment of underdone potato for dinner the night before.)
As easy as it is to expose this position to the mockery it deserves, let’s not forget that the opposite interpretation, the “Living Constitution” of the loose constructionists, is just as spooky and supernatural. In 1920, Oliver Wendell Holmes wrote that the words of the Constitution
have called into life a being the development of which could not have been foreseen completely by the most gifted of its begetters. It was enough for them to realize or to hope that they had created an organism… The treaty in question does not contravene any prohibitory words to be found in the Constitution. The only question is whether it is forbidden by some invisible radiation from the general terms of the Tenth Amendment. (my emphasis)

We have, on the one hand, the originalists communing with the One True Spirit who knows the Letter of the Law, and on the other hand, the loose constructionists Kabbalistically poring over the letter of the law in hopes of raising the Spirit that resides there. We are, in each case, spooked, haunted by our Constitution, forgetting it is an artifact of our own imagination, forgetting it was written under extreme duress, bordering on madness. Like those letters we would write to our friends, fresh out of college, right after we got dumped by the love of our life, and we were heading to Wyoming to become fire watchers. There was some good stuff in those letters, some good, wise, courageous stuff that holds up even today. It’s a good thing we saved them! But--we forget at our peril--we, the writers of those letters, were bonkers. Just like those Foundling Fathers. We should humbly and sincerely thank them for what they have given us. But we should also consider that the custom of revering a political philosophy created by men raised by she-wolves and fed by woodpeckers may be due for gentle revision. We, the Parented, the Well-Fed, the Nurtured, the Sane, the Confident, the Hopeful, the Unhaunted.


Wednesday, May 30, 2012

On Empire

[Written for Write Club, a monthly reading series in Chicago that pits writers against each other cage wrestling style. In this bout I presented for "Empire," against "Revolution."]

Good evening, Ladies and Gentlemen. I am honored to welcome you to this convocation, and gratified that so many of you were able to make the long journey. As most of you arrived here tonight by means of secret underground tunnels, you may not be aware of the increasingly dire situation outside these very doors: a throng of humanity numbering in the thousands bearing torches and pitchforks. They await the outcome of these august proceedings.

The question of the hour, as testified by every broadsheet headline, every drawing room conversation, every sermon in every pulpit: should we shut down Write Club?

That is the resolution that stands before us. You all have your ballots. Now let me begin by saying that of all the many charges levied against Write Club: that it is uncouth, that it is lewd, that it is corrupting of morals, that it curdles milk, that it causes genital warts, that the Overlord is implicated in the illegal trade of rhinoceros horn--against all these charges I resolutely defend Write Club. But there remains one accusation that we must take seriously here tonight. That is the charge that Write Club is an instrument of Empire.

Before I move on to the formal charges, an aside, to that faction among you who are hoping for me to address the charge that Write Club is an instrument not of Empire but of Revolution: Let me dispense with your anxiety by assuring you that the two are in fact one and the same, in that both have aims that are total. Revolution is merely Empire dressed in rags. You can dispel this problem from your minds and be troubled by it no longer.

OK then, Exhibit A: Hegemonic expansion. 

Chicago, Atlanta, Athens, San Francisco, Los Angeles. The overlord might be inclined to characterize these as “chapters” of a “consortium” or “federation” of Write Clubs.  He may call them as he will. When the first of these chapters elects to experiment with an 8-minute bout, or when they instruct their audiences to tell six to nine friends about Write Club, well, we will be eyeing the Overlord's reaction carefully.

Exhibit B: The Loving Cup of Deathless Fucking Glory. 

The phrase is from Walter Scott's poem: “Soldier, wake ― thy harvest, fame/Thy study, conquest; war, thy game.”

War thy game.

That brings us to:

Exhibit C: Violence.

Day versus Night. Country versus City. Land versus Sea. Head versus Heart. Life versus Death. Man versus Machine. Pride versus Prejudice.

Philip K. Dick wrote that “Empire is the codification of derangement; it is insane and imposes its insanity on us by violence, since its nature is a violent one.” Let's look at one recent Write Club bout, staged this very evening at Chicago's Hideout Inn, less than 100 miles from our present location: Lock versus Key. Now in nature, you might observe that Lock and Key exist in a state of harmoniousness or complementarity. Keys exist that they may lock and unlock―without them locks are eternally fixed, functionless, ossified. And locks exist to consummate keys. A key without a lock that fits it is no key at all; it's just more idle detritus to clutter up some dish of mismatched buttons and old subway tokens on your bureau. To pitch lock and key in combat against one another can only result in one of two equally futile outcomes: a world of lonely, petrified locks, or a world of lonely useless keys. Which shall we have? It hardly matters.

Empire imposes its insanity upon us by violence. It is the essence of Empire to look around itself, observe everything that is other, and be filled with the relentless desire to replace that other with itself. And what it cannot replace with itself, it induces into combat by proxy― The Gladiatorial games. Bread and Circuses. It is momentarily cathartic, this discharge of tension between matched pairs, between foes, so-called “opposites.” But when it is over, the fallen are fallen forever, never to be re-animated. Among the corpses in Empire's long trail of dead, how many languages, how many species, how many songs, dances, visions, philosophies, how many men, women, children. It is discourse―conversation―that leads us to truth, but these corpses will never again speak.

Our way seems clear, then. By our love of truth, and dialogue, our love of multiplicity and diversity, we must oppose Write Club. And yet, this paradox. The very act of opposing an institution of opposition―of combat―constitutes a tacit endorsement. As Dick wrote, “whoever defeats a segment of the Empire becomes the Empire; it proliferates like a virus, imposing its form on its enemies. Thereby it becomes its enemies. To fight the Empire is to be infected by its derangement.”

And so, my fellow members of the secret Illuminati, Freemasons, Rosicrucians, Knights Templar―we would seem to be at an impasse. What can be the stance toward a rank evil which can be neither countenanced nor opposed? As a secret society, we have always been known by our deeds, not our words. Tonight we must do the same, taking our cue from the infinite Godhead itself, which permitted the creation of the cosmos only when it contracted its infinitude, allowing finite actuality to condense out of infinite potentiality. Only by withdrawing, making space for what is Other, can the world come into being. It is the only meaningful anti-Imperial act, to make space for what is Other. And in that spirit, I contract my remarks here short of my allotted time.




Friday, May 04, 2012

Langer III

Because the prime purpose of language is discourse, the conceptual framework that has developed under its influence is known as "discursive reason." Usually, when one speaks of "reason" at all, one tacitly assumes its discursive pattern. But in a broader sense any appreciation of form, any awareness of patterns in experience is "reason"; and discourse with all its refinements (e.g. mathematical symbolism, which is an extension of language) is only one possible pattern.
From Feeling and Form (1953)

Wednesday, May 02, 2012

Susanne Langer II

Physics did not begin with a clear concept of "matter"--that question is still changing rapidly with the advance of knowledge--but with the working notions of space, time, and mass, in terms of which the observed facts of the material world could be formulated. What we need for a science of mind is not so much a definitive concept of mind, as a conceptual frame in which to lodge our observations of mental phenomena.
From Mind: An Essay on Human Feeling, Volume One (1967)

Susanne Langer I

Any natural mechanisms we credit for the functions of life and try to trace from their simplest manifestations in a culture of Neurospora to human brains conceiving poetry, must be great enough to account for the whole spectrum of vital phenomena, i.e. for our genius as well as for the mold on our bread. Theories that make poetry "merely" an animal reaction, favored by "natural selection" as a somewhat complex way of getting a living, really prove, above all else, that our basic philosophical concepts are inadequate to the problems of life and mind in nature.
From Mind: An Essay on Human Feeling, Volume One (1967)

Saturday, April 28, 2012

Anti-intellectualism is Easy

Cosmologist Lawrence Krauss has "apologized" for some "off the cuff" statements he made to an interviewer for The Atlantic, disparaging the role of philosophy as it relates to physics and other sciences. Statements such as the following:
Philosophy used to be a field that had content, but then "natural philosophy" became physics, and physics has only continued to make inroads. Every time there's a leap in physics, it encroaches on these areas that philosophers have carefully sequestered away to themselves, and so then you have this natural resentment on the part of philosophers. 
 And:
Philosophy is a field that, unfortunately, reminds me of that old Woody Allen joke, "those that can't do, teach, and those that can't teach, teach gym."  
Pressed by the interviewer to defend such sweeping statements, which would seem to indict some pretty big names in analytic philosophy, Krauss dodges, now suggesting that many of those philosophers who did make important contributions weren't really doing philosophy. Wittgenstein, for example:
Formal logic is mathematics, and there are philosophers like Wittgenstein that are very mathematical, but what they're really doing is mathematics. (my emphasis)
(This is false, by the way: math is a subset of formal logic, not the other way around. And while Wittgenstein could be very methodical in his analysis, in a way that might be likened to mathematics, his primary occupation was with the way we use language. This is in no way "doing mathematics.")

And Russell:
Bertrand Russell was a mathematician. I mean, he was a philosopher too and he was interested in the philosophical foundations of mathematics... (my emphasis)
Some of Krauss's pals, whose careers fall directly in the lineage of analytic philosophy established by Russell and Wittgenstein, including a certain Daniel Dennett, apparently took exception to the implication that they were really just glorified mathematicians. This induced a bit of a walkback by Krauss on the Scientific American website yesterday, where he took pains to stress that the statements I have cited above (not to mention his tendency to repeatedly append the modifier "moronic" to mentions of philosophers) were not intended as a "blanket condemnation of philosophy as a discipline."

His defense of philosophy's value, however, is quickly dispensed with in a couple of short paragraphs, and Krauss devotes the rest of his considerable verbiage to defending the idea that philosophy of physics (his field) is best left to physicists, a defense which consists entirely of the argument that Krauss is only interested in the ideas he is interested in.

That's his right, of course, but the reason this particular cloud of cyber dust has been stirred up is that Krauss has also claimed to have answered, in his recent book A Universe From Nothing, the old and intractable philosophical problem "why is there something rather than nothing?" -- a question which dates back to Leibniz, but is also implied by the classic metaphysical distinctions between "being" and "becoming" dating back to Plato.

The question itself serves as a kind of shibboleth between curious and incurious minds. We cannot, on the one hand, inquire into the reasons for things (as the physical sciences do), and on the other cordon off some of those possible reasons as boring or moribund because physics cannot explicate them. Once we have decided that it is interesting to ask why certain things are the way they are, and to use causal reasoning to provide answers, we are stuck with problems of infinite regress, which come with the territory. There is no logically intrinsic reason why some questions admit of answers (why is the universe expanding? why can't matter exceed the speed of light? what happened in the first three seconds of cosmic time?) and some do not (why does the universe have the properties it has, and not other properties? why does it have any properties at all?).

Sean Carroll, for example, writes in a 2007 blog post that the correct answer to the question "why is there something instead of nothing?" is "Why not?" Can we possibly imagine him blithely presenting this answer to the question of why there are galaxies, or why the nuclei of atoms are bound together?

Krauss's incuriosity is worse, though, because he explicitly claims (using Richard Dawkins as a proxy, in an afterword to A Universe From Nothing) that "Even the last remaining trump card of the theologian, 'Why is there something rather than nothing?' shrivels up before your eyes as you read these pages." That is, he claims (or at least endorses Dawkins' claim) to be answering not merely the scientific question of how the first instances of matter could have arisen from quantum fields, but the ontological question of why the universe is inherently structured to allow these fields. Why these fields, and not others--or no fields at all?

When it is brought to Krauss' attention that he has addressed the former, but not the latter, he firmly denies he ever set out to do anything more than this. From the Atlantic interview:
I don't really give a damn about what "nothing" means to philosophers; I care about the "nothing" of reality. And if the "nothing" of reality is full of stuff, then I'll go with that.
So Leibniz' question stands, then? Wherefore then Dawkins' talk of trump cards? This is the point physicist and philosopher of science David Albert made in the New York Times (earning him the characterization of "moronic" from Krauss):
Relativistic-quantum-field-theoretical vacuum states — no less than giraffes or refrigerators or solar systems — are particular arrangements of elementary physical stuff. The true relativistic-quantum-field-­theoretical equivalent to there not being any physical stuff at all isn’t this or that particular arrangement of the fields — what it is (obviously, and ineluctably, and on the contrary) is the simple absence of the fields! The fact that some arrangements of fields happen to correspond to the existence of particles and some don’t is not a whit more mysterious than the fact that some of the possible arrangements of my fingers happen to correspond to the existence of a fist and some don’t. And the fact that particles can pop in and out of existence, over time, as those fields rearrange themselves, is not a whit more mysterious than the fact that fists can pop in and out of existence, over time, as my fingers rearrange themselves.
Interestingly, after dismissing any explanations that don't invoke empirical fact, Krauss concludes his apologia by offering just such an explanation, an angels-on-the-head-of-a-pin rationalization made of pure speculation:
If all possibilities—all universes with all laws—can arise dynamically, and if anything that is not forbidden must arise, then this implies that both nothing and something must both exist, and we will of necessity find ourselves amidst something.  A universe like ours is, in this context, guaranteed to arise dynamically, and we are here because we could not ask the question if our universe weren’t here.   
This is a muddle, logically, but the least we can say of it is that it begs the question, posed by Albert, of why "all universes with all laws can arise dynamically." It just gets worse from here:
If “something” is a physical quantity, to be determined by experiment, then so is ‘nothing’. 
I look forward to these experiments with great interest.

Again, there's no reason at all for Krauss to be interested in anything other than what he is interested in. His field of cosmology is rich with opportunities for him to remain very engaged for several lifetimes without ever venturing into other areas. It is his going out of his way to paint other people's concerns as "moronic," "sterile," "impotent," "useless," and "just noise," while demonstrating serious difficulties in even summarizing those concerns (Wittgenstein is "doing mathematics") and while engaging in questionable ontology himself ("nothing" is a "physical quantity to be determined by experiment"), that makes Krauss, in this context at least, something of a reactionary boor. This is the same kind of defensive, know-nothing, tough-guy posturing we see all too often in the "hard" sciences, and it is completely unnecessary.





Sunday, April 22, 2012

Law of the Jungle (re-post)

[Here's a re-post from 2009. I'm posting it again because it gets at one of the big problems raised by the "hard-determinist" or "incompatibilist" view that free will is an illusion: If our thoughts are not something we actively and consciously engage in, but rather the pre-determined "effects" of prior genetic and environmental causes, then how are reason and morality even possible? I've made a few slight revisions to the original post, and elaborated on a couple of points that were earlier unclear.]

In working toward a definition of human nature and intelligence, Kant drew a distinction between the actual and the potential -- that is, the world as it is, and the world as it might be. Even allowing for some porosity between humanity and other species, it should not be controversial to suggest that such a distinction does not exist in any developed form in non-human intelligences. (I exclude so-called "artificial intelligences," which are really just extensions of human intelligence by other means). So far as we know, only humans have "oughts." To whatever extent non-human organisms choose their behavior, they do not do so by reasoning among choices, for this would require a symbolic thought process they do not possess. (An exception may be the cetaceans, but we'll leave that aside for now).

The appearance of humanity's faculty to envision potential alternatives to "what is" marks the origin of (among other things) morality. Without a system to order our possible choices as preferences, we would either be reduced to paralysis, or forced to return to the realm of pre-conscious behavior. This is to say that the world of actuality (as described by biology, for example) is, for humans, transected by a realm of thought not confined to its borders, and often in opposition to it. Indeed, one of the main functions of language is to discourse on things that do not, but might, exist. Highly ordered metaphysical schemes like those of Plato or of Christian theology are specific manifestations of this kind of transcendence, but no system of thought is completely free of it. Even supposedly amoral "anything goes" philosophies, like Nietzsche's or Sartre's, stand in opposition to an "actual" state of affairs they wish to disparage, such as traditional Christianity, or "bourgeois" values.

The question that emerges for an ethical system that purports to be "naturalistic" (that is, explained entirely in biological terms) is this (very old) one: Given our ability to imagine multiple possible worlds (if not, in fact, our inability to refrain from imagining them), what is to be our rationale for choosing among them? Any answer that appeals to biology alone will fail to account for moral reasoning (if not culture altogether), since the distinction between "is" and "might" cannot be found in genes or neurons. It is a property only of minds, which is to say of intelligence experienced subjectively. (We can dispense with the silly objections about dualism here, I hope.)

Before it is proposed that no moral philosophy would ever try to explicate an ethos in strictly biological terms, let's look at a famous article by the Australian philosopher J.L. Mackie (1917-1981), titled "The Law of the Jungle," and published in Philosophy in 1978. This paper was one of the first philosophical responses to Richard Dawkins' The Selfish Gene. Dawkins himself was careful in that book not to imply that biological "selfishness" (that is, the persistence of successful traits throughout time) justified psychological egoism. Mackie's take was far less cautious.

The main body of "The Law of the Jungle" is a fairly innocuous exploration of a type of group selection that Dawkins overlooked in The Selfish Gene. But Mackie closes with a palpably ethical conclusion:
What implications for human morality have such biological facts about selfishness and altruism? One is that the possibility that morality is itself a product of natural selection is not ruled out, but care would be needed in formulating a plausible speculative account of how it might have been favoured. Another is that the notion of an ESS may be a useful one for discussing questions of practical morality. (my emphasis)
ESS, as readers of The Selfish Gene know, stands for "Evolutionarily Stable Strategy," which is a type of biological homeostasis worked out by game theorists. ESS theory is called upon to demonstrate why "reciprocal" altruism exists in populations where we might expect brute selfishness to prevail: since a pugilistic stance is thought to require a huge outlay of energy (having constantly to defend oneself in fights), the smart strategy would be to lie low and live in harmony until that harmony is disrupted by another member of the population.

Following Dawkins, Mackie cites the example of bird grooming behavior, which ESS theory divides into three types: Sucker, Cheat, and Grudger. The Sucker embodies the extreme of complete altruism, removing ticks from other birds without reservation. The Cheat embodies pure selfishness, allowing other birds to remove its ticks but never going out of its way to return the favor. The Grudger bridges the difference, grooming all other birds with the exception of those who don't reciprocate.

Game theory predicts that the Grudger "strategy" of reciprocal altruism will spread through a population, displacing the less sophisticated strategies of pure selfishness or pure altruism. And so it may. And we might pause to notice, as Mackie does, that there is an echo in this strategy of our own concept of fairness. ("Do unto others...")
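For readers who want to see this dynamic rather than take it on faith, here is a minimal sketch in Python of the Sucker/Cheat/Grudger grooming game run through simple replicator dynamics. To be clear, this is a toy illustration and not Dawkins' or Mackie's actual model: the payoff values (a benefit B for being de-ticked, a cost C for grooming), the number of repeated encounters, and the starting frequencies are all assumptions chosen only to make the dynamics visible.

B, C = 5.0, 1.0      # assumed benefit of being de-ticked vs. cost of grooming another bird
ROUNDS = 10          # assumed number of repeated encounters between any two birds

def play(s1, s2):
    """Payoffs to strategies s1 and s2 over ROUNDS repeated grooming encounters."""
    p1 = p2 = 0.0
    grudge1 = grudge2 = False  # "has the other bird ever refused to groom me?"
    for _ in range(ROUNDS):
        groom1 = s1 == "Sucker" or (s1 == "Grudger" and not grudge1)
        groom2 = s2 == "Sucker" or (s2 == "Grudger" and not grudge2)
        if groom1:
            p1 -= C
            p2 += B
        if groom2:
            p2 -= C
            p1 += B
        grudge1 = grudge1 or not groom2
        grudge2 = grudge2 or not groom1
    return p1, p2

def generation(freqs):
    """One replicator-dynamics step: strategies grow in proportion to average payoff."""
    baseline = ROUNDS * C + 1.0  # shift payoffs positive before normalizing
    fitness = {s: sum(freqs[t] * play(s, t)[0] for t in freqs) for s in freqs}
    weights = {s: freqs[s] * (fitness[s] + baseline) for s in freqs}
    total = sum(weights.values())
    return {s: w / total for s, w in weights.items()}

freqs = {"Sucker": 0.45, "Cheat": 0.45, "Grudger": 0.10}
for _ in range(80):
    freqs = generation(freqs)
print({s: round(f, 3) for s, f in freqs.items()})
# With these assumed payoffs, Cheats briefly prosper by exploiting Suckers,
# and then Grudgers take over: the trajectory described above.

One caveat worth noting about this sketch: under these particular payoffs an all-Cheat population is itself evolutionarily stable, and Grudgers can re-invade it only once their share crosses a small threshold. How far that toy result generalizes is exactly the kind of question the model alone can't settle.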

This is not a problem as far as it goes. Birds have been employed as symbols of justice and wisdom at least as far back as Athena's owl. We find the flock-as-jury in Farid Ud-Din Attar's allegorical poem The Conference of the Birds from the 12th century, and in Chaucer's Parlement of Fowles 200 years later. As valuable as modern ethology is, it is nothing new to demonstrate that we share with birds certain social norms.

But this is a far different thing than asserting, as Mackie does, that because some birds have evolutionarily developed behaviors which are "healthy in the long run" and which resemble our own notion of fairness, our notions of fairness are thereby justified. Other far less savory bird behaviors, such as eating the young in a neighboring nest, would appear to be just as "stable" as grooming behavior. Are they, too, to be adopted as preferred human behavior?

After the standard disclaimer that "there is no simple transition from ‘is’ to ‘ought,' no direct argument from what goes on in the natural world and among non-human animals to what human beings ought to do," Mackie goes on to promote exactly that argument. After linking reciprocal altruism to our modern common-sense notions of fairness (although it bears a much closer resemblance to older modes of justice like the vendetta or blood feud--"an eye for an eye"), he associates the "Sucker" strategy with the philosophies of Jesus and Socrates, who advocated, he says, "repayment of evil with good." Then, switching back to ESS theory, he writes:
[A]s Dawkins points out, the presence of suckers endangers the healthy Grudger strategy. It allows cheats to prosper, and could make them multiply to the point where they would wipe out the grudgers, and ultimately bring about the extinction of the whole population. This seems to provide fresh support for Nietzsche’s view of the deplorable influence of moralities of the Christian type.
This oscillation between discussions of biological stability and moral programs happens so quickly that it's easy to miss Mackie's move, in this paragraph, of using the "is" of biology to justify ("provide fresh support for") the "ought" of the Nietzschean moral structure. But it's there, in very clear terms: Always retaliate. It works for birds! (Note also that there is no historical evidence that Nietzschean morality is more "evolutionarily stable" than Christian morality.)

(It's important to mention in passing that Mackie is wrong on the science too. According to ESS theory, if the cheats prosper and wipe out all the suckers and grudgers, we are left with a stable population of cheats (until the rise, through variation and selection, of new grudgers, which would re-dominate the population). It would be no less "healthy," in biological terms, than an all-grudger population. We can hypothesize that there might be more sickness through tick infestation, but whether this is sufficient to threaten extinction is not captured in this particular model, which only measures the relative effectiveness of the three strategies within a closed system.

What this suggests is that Mackie has unwittingly added a moral dimension to the grudger strategy among birds where none belongs. At the same time he is employing the example of bird populations to demonstrate why Nietzschean morality is better than Christian morality, he is simultaneously using our own human concepts of fair play to valorize the behavior among birds. Everyone knows an all-cheat population would be "bad," after all.)

Mary Midgley, in her famous response to Mackie, which kicked off her ongoing feud with Richard Dawkins (a "Grudger," in temperament, if there ever was one), points out the fairly obvious shortcomings of such a linkage between evolutionary stability and ethics. Like the birds in the game theorists' model, we appear already to be congenitally prepared by our genes to retaliate against transgressions against us. We need no special help from the world of ideas--the realm of the possible--to remember to do harm to our enemies when transgressed upon. As Midgley puts it, "The option of jumping on one’s enemies’ faces whenever possible has always been popular." She does not, however, follow Mackie's lead in suggesting that the only other option is to make a wholesale replacement of the strategy of retaliation with a strategy of saintly restraint. She suggests that the ethos of repaying evil with good arose as an intelligent, reasoned--not dogmatic--response to the limitations of our emotional makeup:
This disregard of the essential emotional context reappears in Mackie’s idea that the undiscriminating ‘sucker’ behaviour is one recommended by Socrates and Christ. Neither sage is recorded to have said ‘be ye equally helpful to everybody’. Both, in the passages he means, were talking about behaviour to one narrow class of people, with whom we are already linked, namely our enemies, and were talking about it because it really does present appalling problems. (my emphasis)
She goes on:
Of course charity and forgiveness have their drawbacks too, especially if they are unintelligently practised. As Mackie rightly says, there are problems about reconciling them with justice, and justice too has its roots in our emotional nature. There are real conflicts here as both Socrates and Christ realized. (my emphasis)
In other words, in the moral realm we are dealing with considerations far beyond the ability of game theory to effectively model. The issue now becomes one of flexibility and versatility, which are dramatically multiplied in the human capacity to represent things symbolically, and of intelligence--the ability to hold multiple variables in one's consciousness while working out a problem. There is simply no way for specific usages of complex reason to be genetically encoded or learned by rote: there are far too many unknown contingencies to account for. Game theory is unlikely to predict the Pythagorean theorem, the Critique of Judgement, the Theory of Evolution by Natural Selection, the Parable of the Talents, or even "The Law of the Jungle." (Mackie's article, not Kipling's maxim, though probably that too.) Fortunately, we need no more than what we already have: a formal, methodical study of reason and symbolic representation. As soon as we stop letting fears of "dualism" drive us into the arms of an untenable "naturalistic" understanding of human thought and behavior (which is really just a zombie version of that old hard-to-kill doctrine of Behaviorism), we can return to an actual, meaningful study of the human condition.


Thursday, April 19, 2012

Not Even A Little Bit!

Caltech physicist Sean Carroll has a post on his blog Cosmic Variance today, called "Jon Stewart Doesn’t Understand How Science Works Even a Little Bit," in which he takes Mr. Stewart to task for "misrepresenting science" in saying (in a 2010 interview with Marilynne Robinson) that a lot of it seems to rely on faith. Here is the offending passage:
I’ve always been fascinated that, the more you delve into science, the more it appears to rely on faith. You know, when they start to speak about the universe they say, well, actually, most of the universe is antimatter. Oh, really, where’s that? Well, you can’t see it. [Robinson: "Yes, exactly."] Well, where is it? It’s there. Can you measure it? We’re working on it. And it’s a very similar argument to someone who would say God created everything. Well where is he? He’s there. And I’m always struck by the similarity of the arguments at their core.
Carroll takes Stewart to have meant "dark matter," not antimatter, and he's upset that Jon doesn't appreciate the great deal of evidence that supports the existence of dark matter, which makes it a direct contrast to a hypothesis like Deism, for example. And indeed the evidence for dark matter is pretty good. There are gravitational fields in the Bullet Cluster, for example, where regular baryonic matter used to reside until it was pushed away by a collision with another cluster. Dark matter, which interacts with normal matter only through gravity, would not be pushed away by the collision, which would explain the gravitational fields in the cluster's old location. Not a slam dunk, but compelling evidence all the same.

But what if Stewart wasn't referring to dark matter--what if he meant dark energy instead? The names are similar, and both dark matter and dark energy have an influence on cosmic gravitation, so it's easy for a layperson to confuse them. Almost 3/4 of the universe is postulated to be "made of" dark energy, and we don't know anything about it, except that it "must" be there to explain the accelerating expansion of the universe. We don't even know what properties dark energy might have. It might be something called "quintessence," about which all we can say is that it acts as a kind of anti-gravity, pushing the universe apart.

If we take Jon Stewart's comparison at its most basic, replacing "God" with some kind of generic First Cause, it's not really so far fetched. The first cause is a logical construct that would explain the existence of the universe (though it does raise new logical problems, like what caused the first cause). And at present dark energy is not much more than a logical construct, either, the explanandum in this case being the slightly less sexy expansion, rather than creation, of the universe. Unlike someone like Bill O'Reilly, who ignorantly claimed that science can't explain the tides, or magnets, Stewart is essentially correct here. We don't know why the universe is speeding up, and the words "dark energy" comprise the name we give that ignorance.

Faith is often taken to mean something like "dogmatism." In this sense it has little value for either science or religion, but is present in each. Even in the best science, humans being what they are, it is never possible to apply the null hypothesis to all our tacit assumptions and motivations. We don't dwell on these excesses of scientific dogmatism, but historically they are many. Behaviorism, for one, which promoted a near-complete lack of affection in child-rearing. Or the prevalence of pre-frontal lobotomies in the mid-20th century.

A common response to this line is that what is incidental to science is essential to religion, which is centered upon acts of faith. This is where I think we need to introduce a complementary meaning of the word, connoting something more like dedication in the face of the unknown. Anyone who has read Marilynne Robinson knows she was not on the Daily Show to encourage religious dogmatism.

This kind of faith is essential to science. (It is also not unrelated to curiosity and play, though the stakes can be much higher in scientific contexts.) This is the kind of faith a scientist might have in thinking she might redeem the work of prior scientists, or in believing there is a rational explanation for an unexpected observation. These proceed without any certainty regarding the results whatsoever. The greatest discoveries must have required immense faith (in this second sense), so upsetting were they to our common understanding of things. Galileo's discovery of moons around Jupiter, Darwin's discovery of common descent: both men needed incredible commitment and dedication to see their work through, and both suffered terribly for it.

It is attractive to believe, perhaps, that religion has room only for the first, dogmatic type of faith, and I cannot and will not argue with the menacing prevalence of it. But to pretend that all of religion is just unreasoned adherence and appeals to authority, while all of science is merely evidentiary, betrays no understanding of how much we know about the sociology of science, or about the varieties of religious experience. The "conflict thesis" was once compelling, but given all the data we've been able to accumulate over the 150 years since White and Draper, there's no reason for any educated person to believe it.

Tuesday, April 17, 2012

Peter Singer, Concern Troll

I've often written here that utilitarianism, the view that "the morally right action is the action that produces the most good" (Stanford Encyclopedia of Philosophy), has a built-in bias toward status quo values and power relations. The fallacy that makes this possible was pointed out by G.E. Moore 100 years ago: when we start from the position that it is obvious what "the good" is, we elide the need for moral reasoning altogether. Utilitarianism cannot distinguish between what is desired and what ought to be desired. In Moore's words:
The fact is that “desirable” does not mean “able to be desired” as “visible” means “able to be seen.” The desirable means simply what ought to be desired or deserves to be desired; just as the detestable means not what can be but what ought to be detested.
Our ability to evaluate the differences between what is and what might be is what enables morality in the first place. Once we have worked out, through contemplation and dialogue, what we desire for ourselves and others, then all that remains are the policy questions. What distinguishes utilitarianism from other forms of moral philosophy is that it starts there, eschewing the difficult conversations about meaning and value altogether. Critics of utilitarianism don't object to "doing the math" once these conversations are over. Of course we all want the consequences of our actions to align with our moral preferences. What we object to is the refusal to allow debate on what our aims -- our "oughts" -- should be, as Moore described.

I could not have expected to come across so stark an example of the problem as in the recent half-hearted defense of colonialism in Africa by the renowned utilitarian philosopher Peter Singer, who remarked as follows in a BloggingHeads conversation (transcript here) with the libertarian economist Tyler Cowen, who had asked, "Do you think the end of colonialism was a good thing or a bad thing for Africa?"
That's a really difficult question. I think, clearly, there were lots of bad things about colonialism, but you would have to say that some countries were definitely better administered and that some people's lives, although they may have had some sort of humiliation, perhaps through not being independent, being ruled by people of a different race, in some ways they were better. It's hard, really, to draw that balance sheet. Independence has certainly not been the unmitigated blessing that people thought it would be at the time.
The dialogue continues, with Cowen honing down the question a little:
Cowen: Let's say we have the premise, that with colonialism there would not have been wars between African nations. It's not the case that a British ruled colony would have attacked a French colony, for instance. It's highly unlikely. So given just that millions have perished from wars alone, wouldn't the Utilitarian view, if you're going to take one, suggest that colonialism was essentially a good idea for Africa, it was a shame that we got rid of it, and that the continent would have been better off under foreign rule, European foreign rule.
Singer: I don't think we can be so sure that it would have continued to be peaceful. After all we did have militant resistance movements, we had the Mau Mau in Kenya, for example. We had other militant resistance movements. It may simply have been that the fact of white rule would have provoked not one colony going to war against another but civil war within some of those countries. If what you're asking is would colonialism, had it been accepted by the people there, without military conflict, would that have been better than some of the consequences we've had in some of these countries, you would have to say undoubtedly yes. But we can't go back and wind back the clock and say "how would it have been if" because we don't really know whether that relative stability and peace would have lasted.

Cowen: If we compare the Mau Mau, say, to the wars in Kenya and Rwanda, it seems unlikely that rebellions against colonial governments would have reached that scope, especially if England, France, other countries, would have been willing to spend more money to create some tolerable form of order. My guess is you would have had a fair number of rebellions but it's highly highly unlikely it would compare to the kind of virtual holocausts we've had in Africa as it stands.

Singer: I certainly agree that if you look at what's been happening in the Congo, just as one example, or countries like Sierra Leone or Liberia, yes, you could certainly think that it might have been better for those countries. (my emphasis)
This is the type of argument known now, in our internet age, as "concern trolling," in which the essayist or commenter expresses a token empathy for the principle at issue (in this case political liberty and self-determination) but sadly concludes that the time is just not right, or that other needs must take precedence. In most cases it is accompanied by a patronizing stance, tacitly casting all dissent as understandable but naive exuberance. (In political writing, the avatar of concern trolling is David Brooks, whose frequent appearances on left-leaning outlets like NPR and PBS, not to mention his regular gig at the Times, suggest that liberals can't get enough of his tough love--that concern trolling works.)

The first thing we must note is the utter historical illiteracy of the presumption of a simple dichotomy between the stability of white rule and the relative chaos and violence of self-determination. A great number of governments in post-colonial Africa were, for example, propped up by Western powers during the Cold War (Mobutu, Idi Amin), a continuation of colonialism by other means. Very few African governments actually followed the Pan-Africanist template of Ghana. We therefore cannot point to the impact of these regimes on ordinary citizens and use it to indict the notion of self-rule as untenable.

Similarly, a number of political events in post-colonial Africa trace back to causes that go deep within the colonial era. The Rwandan Civil War was not, as Cowen and Singer both imply, the result of ethnic tensions that had earlier been eased by Belgian colonial rule; rather it was in large part fomented by the Belgian encouragement of strict racial identity in Rwanda. This is not to say that there was no tension between Hutu and Tutsi before the Belgians arrived, but this was largely a social and economic rift, with little basis in tribal lineage. (In the Pre-Colonial Kingdom of Rwanda, "Hutu" was essentially a marker for low socioeconomic status.)

All of which makes the question of whether the end of colonialism was a good thing or a bad thing one built on false premises. The violence and disorder that we have witnessed in post-colonial Africa were in large part created by the very conditions with which Cowen and Singer would like to contrast them. Peaceable colonialism in our time is a fiction that can only be conjured if one neglects to consult history. (To be fair to Singer, he does not rule out that colonialism may actually promote war by proxy, as in fact it has--though it's interesting that someone so interested in the social conditions in the poorest parts of Africa would not know this for sure.)

All of this would be true whether or not Singer was offering a utilitarian solution. But here we go from bad to worse. In Singer's analysis it apparently goes without saying that social order and a modestly decent standard of living are in themselves more important "goods" than political self-determination. This is, of course, exactly the same argument mounted against abolitionism in the 19th century, and exactly the same argument being mounted today against feminism by the Republican party. In fact it's the standard reactionary response to almost any assertion of rights in our modern era: we simply can't afford the price of granting them.

Singer professes to adhere to "Preference Utilitarianism," which defines the greatest good as that which actual rational beings would choose for themselves, so it is interesting that he makes no allowance here for what actual Africans would prefer; it is simply assumed that independence has not been worth--could not be worth--the cost in lives and misery. This leads me to surmise that "Preference Utilitarianism" is the public relations version of the doctrine, which tends to fall away when one encounters any "preference" that might conflict with one's own ethical or metaphysical biases.

I don't mean to defame Singer by suggesting he actually supports a return to white rule in Africa; I'm sure he doesn't. The point is that his utilitarianism affords him no tools to defend against even so simple a transgression as human exploitation, provided a certain basic standard of living is maintained. This, of course, is a fact known well by the exploiters themselves, all but the most sociopathic of whom will offer some semblance of compensation for their dominion. But this is just sleight of hand, to distract from the real damage. Where, for example, is any mention in Singer and Cowen's discussion of the underlying reason for colonialism? It's not to dominate other social groups, but to enrich oneself off those groups' natural resources. The control (if not outright destruction) of social relations is the means, not the end. Surely this must play some role in a utilitarian consideration of whether Africans would benefit from continued colonialism. But it rates not a mention.

***

Nor can utilitarianism, it would seem, provide a ready defense against the logic of capital here at home. Look at the way that Cowen gets Singer to sign on for tax breaks for the rich in the US:
Cowen: If we give a greater tax break to charitable donations, and here I mean only true charity, not say a fancy art museum, disproportionately this will benefit wealthy people. Wealthy people have a lot of money. In essence you're cutting their taxes. They're giving more, they may not have a higher level of consumption, but would you be willing to raise your hand and say "I, Peter Singer, think that cutting taxes on the US wealthy is in fact one of the very best things we could do for the world's poor, if we do it the right way"? Yes or no?

Singer: Yes, if the tax break only goes to those of the wealthy who are giving to organizations that are effectively helping the poor, I'll raise my hand to that.
Of course, the truth is that very rich people have enough money to pay a higher rate in taxes and be charitable to the world's poor, both of which are direly needed (though not sufficient without major political and social changes on the part of the entire global community). By assenting to Cowen's either/or proposal, Singer is validating the libertarian idea that rich people already pay too much in taxes, and that the very rich cannot be incentivized to give up more of their fortunes. I doubt Singer actually believes this; it's a shame he couldn't spot the fallacy sooner.

We also might raise the objection that a much bigger problem resides in the way wealth is created in the first place. Cowen does not suggest, nor does Singer reply, that anything is wrong with rich investors' expectation of a certain rate of return. This expectation is one of the reasons why debt forgiveness, which would accomplish far, far more than individual charitable giving could ever hope to, is now taboo. Why make the libertarian assumption that economic decisions should be driven by investors rather than governments, especially when there is so much good evidence to the contrary? Why focus on the relative role of individual charitable giving when public policy would be much more effective? I suggest the main reason is that utilitarianism cannot actually propose even mildly radical social change, being far too reliant on normative ideas of "the good." This has the effect of making it, in essence, the ultimate vassal philosophy.