on rectitude

Music: Drive-By Truckers: Brighter Than Creation's Dark (2008)

In the New Testament Paul of Tarsus tells us that entry into the Kingdom of Heaven can be achieved by "righteousness," and as Robert Plant amends, "there are two paths you can go by": the Law of Moses (the Torah) and a faith in substitutionary atonement. I suppose, then, on this "holy day" it was fitting that This American Life was about the ten commandments and the subduing of "animal passions." I was particularly amused by the latter as just yesterday I was reading Wilhelm Reich's The Mass Psychology of Fascism, which asserts, of course, that the commonality between the Nazis and organized religion is "sexual repression." And then, as I was listening to Ira Glass's nasality achieve humorous highs as yet unheard, I was greeted in the newspaper by this full-page advertisement (to the left; a detailed PDF of the advertisement is here).

The ad is astonishing because it is multi-colored and large---quite an expensive advertisement, to say the least. Its sponsor is the Green family of Oklahoma (David and son Mart) and the Christian enterprise anchored by their stores, Hobby Lobby and Mardel. The Greens have been publishing large, religious advertisements for Easter and Christmas since 1997 (all the ads are available in an archive on the Hobby Lobby website). The Greens have consistently put their money where their mouths are: more recently they famously bailed out Oral Roberts University (Mart is the head trustee now). They donate significant money to charity, and were even embroiled in a lawsuit over an alleged hostile takeover of Feed the Children. Their businesses are described as a "ministry," which is aimed principally at putting bibles into the hands of every human being on earth.

There's a lot to say about the advertisement's content: although the image was probably selected in part because of its age (e.g., I suspect it's so old that it's in the public domain), notably the figure of Jesus is that stereotypical white guy. Aside from the ideology of Whiteness betokened by the image, the yellow-green color scheme is visually arresting, locating Christ as the source of light with rays radiating outward, while men (presumably apostles) cower in His grace. The image is a visual depiction of the route of rectitude, a righteousness that leads to salvation, or what it means to be "born again."

The rhetoric of this image is classically, even stereotypically, Evangelical. Mostly associated with U.S. protestant religious systems (rooted in the first and second Great Awakenings in the 18th and 19th centuries, respectively), evangelicalism is diffuse but nevertheless organized by four basic principles: the necessity of a conversion experience (being "born again"); the absolute authority of the bible; a conviction in the resurrection; and the necessity of proselytism. Indeed, the advertising campaigns are, pun intended, proselytism on a stick. Each one of these principles is tidily summarized by the components of the ad, with text and image reinforcing each of them. This is not simply a Christian message, but a very specific one grounded in a certain, contemporary version of righteousness.

I say "righteousness" here as a special concern, because what the term meant in my youth at Rockbridge Baptist Church is not what it appears to mean in contemporary religious/popular discourse. The righteousness of the Green "Alive" advertisement in the newspaper is different from what I was taught as a young Evangelical, which was more in keeping with the biblical exegetical tradition of the nineteenth century: righteousness was steering a course of correct moral behavior, living by biblical code. What I was taught is perhaps best captured by one of my favorite bands, the Drive-By Truckers:

"I don't know God but I fear his wrath/ I'm trying to keep focused on the righteous path." DBT really do capture the spirit of a certain Protestant way of thinking: I'll try to be a good person, with the understanding that Deity is "watching." The implication of the song is that I'm in charge of my salvation, and you're in charge of yours, and minding our own we do the best we can.

The "Alive" ad today, however, reflects the Green's charismatic convictions in Dominion Theology, which, again, is diffuse but organized around a core principle: the nation-state should be governed in accord with the law of God as expressed in scripture, to the detriment of secular law. That is to say, Dominion Theology is directly posed against the Jeffersonian doctrine of a strict separation of church and state. It advocates theocracy, without apologies. In the scant interviews Mart has granted, he discusses his Pentecostal upbringing and conviction in the absolute authority of the bible. That absolutism is plain, for example, in the "statement of purpose" of Hobby Lobby: "Honoring the Lord in all we do by operating the company in a manner consistent with Biblical principles." Unlike our Alabama truckers, these folks claim to know God, and intimately.

Increasingly in popular discourse we've been seeing the emergence of a new brand of righteousness that has lost its sense of humility and tacit admission of ignorance. To be sure, the certitude of prophetic rhetoric has been a motivator for the formation of the republic since its inception (as James Darsey has so eloquently argued), and Martin Luther King, Jr. was nothing if not convicted and evangelical in his righteousness. Still, it seems to me the full-frontal assault on the Jeffersonian doctrine that would stave off theocracy has been ramping up steadily from the 1970s (and the formation of the "religious right") forward. The "Alive" ad in today's paper is an apt condensation of contemporary religious righteousness: we alone know how you should be redeemed, we alone know the single path (remember, Paul specified two), and---as pastor Rick Warren said on ABC this morning---we alone are sentinels guarding the only way to salvation. The rest of you are damned.

on the hunger games

Music: Little Barrie: King of the Waves (2011)

Yesterday afternoon I screened a matinee of The Hunger Games, having become another victim of a sale endcap at Tar-jay. Last week I had only targeted the store for sugar free gum and Topo Chico (they didn't carry TC, by the way, which is a shame), but found myself carrying home the cheap hardcover and reading it that evening. My reasoning here was, first and foremost, wanting to remain "plugged" into mainstream popular culture, since that's what I teach about for a living, and it behooves me to teach to where my students are at. (I found the whole Harry Potter thing a snore, and after reading a couple of Collins' chapters I was delighted in comparison; Collins is a much better, if forgettable, writer.) Second, someone suggested the books are "science fiction" and might possibly work in the science fiction seminar I'm planning (I don't think so, however). Finally, someone asked if I wanted to see the film, and I declined because I said I'd like to "read the book first," thereby locking myself into the labor of the entertained. I read the book, but by the time I was done my friend had lost interest in the film.

So I've labored and I've watched. I enjoyed the novel; less so the film. What follows is just some meandering thoughts about both, but assumes knowledge of the series (that is to say, there are spoilers below).

As for the novel, while it still does nothing to compete with my favorite young adult writers (Madeleine L'Engle and C.S. Lewis), Collins writes well and cleanly, which makes reading enjoyable if a bit breezy. J.K. Rowling is a poor writer, but that poor writing, coupled with made-up words, forced the reader to read and reread passages, driving the story into one's memory. Collins' seemingly effortless prose lends itself well to, say, reading half-asleep, but I found myself repeatedly re-reading what I read the day before because I had forgotten it. Still, I'm thankful that a popular culture "young adult" author can actually write a sentence, which means the reader is focused on the story and "message."

I appreciate the politics of the novel, which Collins very cleverly submerges in the story (no lengthy, preachy monologues). If the genre is science fiction, then politics are par for the putt in general. In a sense, this is a post-feminist novel because the protagonist "gives in" to those who would demean her as a woman for survival---but she does, nevertheless, carry all the other characters. Katniss lets herself function as an "object of desire" (a point Haymitch makes during the pageant) so that she can focus on saving her family and her life. Katniss is a strong female character who proclaims frequently in the novel that she has no interest in the institution of marriage (although I do wince at her apparent lack of sexual desire), and her love interest---Gale---is presented in the book as her equal and companion, not a "soul mate." Indeed, the thing I most love about this novel written specifically for the Great Teen Age is that it puts serious holes into the Dream of Disney, of a rescuing Other that will make the drudgeries of life go away. That is to say, the politics of the novel are really about our contemporary fantasy of romance, masquerading as a critique of a fascist state. That fantasy is shot through with holes (or, er, arrows) by a prolonged and rather biting critique of class. All that is evil in the world of The Hunger Games is embodied in the elite, wealthy class that orchestrates deadly spectacle. In this sense, the class critique of the novel is as much about the logics of mainstream media as it is about the class that controls the continent's natural resources. The dystopia Collins paints is equal parts classic Marxism and Baudrillard, because the means of production concern both "natural" resources and mediated spectacle.

I found Collins' navigation of the logics of "reality television" particularly intriguing. The Hunger Games represent the ultimate version of CBS' Survivor legacy, but tied to a very Agamben-like conception of the sovereign whereby the State is the Mainstream Media is the State---an Orwellian/Huxleyan theme, but cleverly rewritten in terms of the enjoyment (in the Lacanian sense) of surveillance and control. Throughout the novel, Katniss struggles with how she represents herself to the "outside" (the presumption is that there is a camera following her at every moment) and her internal affective states; as she tries to affix meaning to how she is feeling, she frequently admits to confusion. For example, are her feelings for Peeta real or manufactured? Only after the games are over does Katniss start to realize that her staged romance did inspire feelings for Peeta, but that these feelings, while real, were forged into a meaning she doesn't really cotton to (namely, the soul mate or "star crossed lovers" fantasy). Cue my friend Dana's conception of the "irony bribe" in her savvy critique of The Bachelor.

The novel also has a number of moving moments, framed in the trappings of popular sentimentality but, nevertheless, effective. I think my favorite is the point when Katniss describes how she secured the goat for her sister, a love that is set in the immediate context of a manufactured romance with Peeta (they're exchanging stories, cowering in a cave).

That said, I must also say I'm not real fond of Katniss as a protagonist. Perhaps this is the double bind of writing a strong female lead (do I want it both ways?). I'm reminded here of Karlyn Campbell and Kathleen Jamieson's work on the impossible position of strong female politicians in the press. Anyhoo, the film does a better job of making her sympathetic; she's much more affectionate toward Peeta, Cinna, and Haymitch in the film. (As an aside, I envisioned Jim Broadbent's character in Moulin Rouge as Haymitch; I was more than surprised to see Woody Harrelson pop on screen).

As with most movie versions of novels, the film was not as evocative or as good as the novel. What the film does well is pattern itself after the book---very well, in fact, and Collins' screenplay did about as well as any I could imagine (they were smart to have her do it; she started her writing career this way in the first place). I was impressed with the economy of the film, which did a good job setting up the franchise; the "action" was sacrificed for development, but I think that was necessary. Effie is marvelous. Lenny Kravitz, by all accounts a beautiful man, wasn't queer enough. All the actors cast as the Tributes were, in my view, perfect.

I'm not quite sure what to make of the controversy surrounding the character of Rue, who is described in the novel as a nimble sprite. Collins seems to deliberately avoid racial identity in the novel---that is, race is just not an issue; she hammers on class---but Rue is described as black. Apparently a number of teen viewers expressed racist desires to see Rue as a blonde white girl (which baffles me; I did read the novel and Collins provides plenty of cues that the Tributes are racially diverse). The novel and film are very obviously secured by a logic of Whiteness (as is much of science fiction, with notable exceptions), which is problematic. But that young people would express such hostility at the deliberate attempt of the filmmakers to emphasize racial dynamics is surprising, even though as academics we're not supposed to be surprised by such reactions.

If I do have a quibble with the attempts of the filmmakers to feature race more prominently in the film, it's the brief scene and plot innovation evoking Watts: after Rue is killed, crowds in her home district 11 begin to riot. Police or "Peace Keepers" are brought out in formation and are shown hosing down the crowd to disperse them. In the novel gender is certainly at the fore, but each district is described in classed terms and united by common interests; racial and sexual identification is only incidental. In the film, the districts are subtly discordant with respect to racial identifications (district 11 is clearly cast as a black district, while Cato from district 2 is a blond-haired, blue-eyed hulk). I'm somewhat ambivalent about this decision. On the one hand, I can appreciate the attempt to portray how folks might react "realistically" to a situation (and certainly folks seeing the film are caused to think of the shooting of Trayvon Martin, an event the filmmakers could not have predicted, of course, but an event rooted in a well-known, cultural condition). On the other hand, MSM portrayals of race riots are almost always cast in a degrading, primitivist frame from a centering Whiteness of reasoned control (recall, for example, the overblown looting frames of Katrina coverage). That is to say, "the road to hell is paved . . . . "

Ultimately, however, both the film and novel versions of The Hunger Games are an intriguing and, I think, rewarding cultural phenomenon that is sparking discussion beyond "it's so good" and "I love it." I harbor no hopes about how the franchise will inspire discussions of gender, class, and race; that there is a frank discussion about racist reactions to the film is good, but I'm not so sure those discussions will go beyond the familiar finger pointing: "they're racist; I'm not." A better interrogation of racism will require folks to interrogate their own projective tendencies, brown and white alike, the racism that we all carry with us as members of this so-called American culture. The same goes, too, for sexism and homophobia. And there's a lot to critique in the novel and in the film, such as its post-feminist stance, the heterosexism that underwrites the whole narrative, and so on. Even so, unlike Harry Potter, which reasserts traditional Oedipal dynamics over and over and over and over (not to mention ports the impossible Cinderella fantasy onto boys), The Hunger Games does provide an entry for discussing issues of class in the classroom. The story also provides a number of ways to discuss the contemporary logics of publicity and how the advent of "reality television" has domesticated surveillance. Empirical research has shown, for example, that the majority of the current generation in college expects to be "famous" in one way or another---which is a startling expectation. The film and novel issue warnings about such desires (and their hero grounds her life in the mundane and the people around her, not her life on the screen). And The Hunger Games's clever critique of the soul mate fantasy---at least in the first book and film---is also a good reference for teaching college students about the ways in which ideology works, and how the motor of that labor is contradiction. It's easy to be cynical about every mass mediated phenomenon that comes down the pike and explodes on the mediated scene; here's one, however, that gives us something to talk about and a place to exercise our critical thinking.

one reason

Music: Beck: Sea Change (2002)

The soft palate quivers a bit as Beck sings, "that's what you thought love was for." Sea Change remains one of his most meaningful statements, and "Lost Cause" one of his best songs. For reasons well known by another blind man and only dimly sensed by the blinking one writing here, I thought to play it to sustain a mood. I learned my Masonic coach, in many ways a grand artificer, is succumbing to the good night, but heeding well Dylan Thomas. We are not supposed to write about such things so frankly, but we think them: he pushed bombs out of planes, and if that deafening sound didn't deaden the nerves, then working in the wind-tunnel did. Hearing aids as big as his ears, curving around them like cats, Cal nodded and read my lips: "Are you a Master Mason?"

"I am."

"How do you know yourself to be a Mason?"

Because I love you and am thankful for your time, even though you lived most of a life I only received as a series of colorful stories. Cal sat with me for countless hours rehearsing the liturgy, sometimes knee-to-knee, and I came to know better what it meant to be a good person, or rather, a person taught to recognize an inherent goodness faced with the cold, brick wall of our mutual mortality.

My friend Mirko sent me an intriguing essay by Darin Barney, a communication studies professor at McGill. Titled "Miserable Priests and Ordinary Cowards: On Being a Professor," the short essay details the familiar gutlessness of the professoriate, or rather, the complexity of the academy and the systemic comforts of risk-aversion. Barney sketches a familiar feeling, the realization that, say, some policy decision is forming that is not in the best interests of others (or oneself) but which one cannot seem to muster the energy to oppose. The labor of risk is exhausting but, the reader is told, this is the uncomfortable seat of the political. Citing Zizek's now-famous sketch of ideology, the motto goes: "I know very well what is happening at and to the university, but all the same, I am a professor."

I don't know. That is, I don't know what is happening at or to the university other than what is happening at Texas (insert neoliberal rhetoric of "accountability" here). And I don't know if the phrase "I am a professor" necessarily entails a "but." I have been a professor for ten years now, and I have sat in such storied meetings and have recognized that reticence in others and certainly in myself. In this stead, wisdom (phronesis) is not the taking of risks but an ethic, a knowing of when the risk-taking is truly done in a field of contingency, when things could very well be otherwise than they are. Contingency is the default where people are concerned, I know that, but something like trust---two people sitting, knee-to-knee---has to secure the risk for "things to be otherwise." Trust in this sense is not a sense of the predictable, which has become a metric of interpersonal relation in the academy, but an openness to difference. One can cite Badiou on the "event" all she wishes, but one cannot will the contingent.

Vagueries, again, I know---furies of obfuscation because public statements invite scrutiny. Public feelings, on the other hand, invite something other.

Here's what I can say: if there is one reason to become a professor it is the trust one can build and the friendships one can forge. For the past three weeks I have had the pleasure of hosting a series of guests, and we have dined together and talked together and drank together. As I said at a dinner party some weeks ago, surrounded by smart colleagues who exude affection, "this is the reason for doing what we do---making friendship." Unlike conversations with friends and acquaintances in other lines of work, having an intellectual conversation with a fellow academic is not going to make me any money; there is little "to gain" from my academic friends, no investments (other than love) are at stake, and I'm not going to contract a new deal. Certainly there is something like "social capital," in that more commonly discussed notion of "networking." But that term seems too instrumental to me to capture the welcome risk of stupidity: of not having to be "on," just needing to be "with" over a cocktail and some words and a handful of half-baked ideas.

Role modeling is and remains important.

I have really enjoyed "crashing" with friends at a conference, and then having friends "crash" here, these past few weeks. I've spent time with Jennifer, Dwayne, Rosa, Katie, Dana, Megan, Shaun, and Melissa. We've talked ideas, food, and especially music. So, on this glorious and beautiful Sunday, sitting outdoors, I'm going to put off the politics of the academy for a day and meditate on what really matters and the reason for doing what I do. In the end.

systemic ills and Afghan innocents

Music: Shearwater: Animal Joy (2012)

Friday the federal government revealed the identity of the American soldier who killed 16 Afghan civilians in a murderous rampage. The government has alleged that the 38-year-old Robert Bales, devoted father of two and husband to an apparently loving wife, went on the killing spree after drinking with other soldiers.

What I find interesting---well, what I always find interesting---are the printed speculations about the motives and reasons behind such an incident before official details are formally released. Initially, reports circulated that Bales' family life was strained and that he was disgruntled about being on a third tour. Other reports suggested a "brain injury" sustained in a previous attack may be to blame. Others speculated about possible psychosis, reaching for rationales that would reduce the cause of the violence to a tragic flaw or shortcoming of Bales' person.

These speculations have now receded as reports of Bales' personal and professional life have been released. His lawyer has actively denied the suggestion of an unstable marriage or that Bales suffered from any kind of mental illness. Reports from Bales' colleagues have been, thus far, universal in their assertions that Bales' behavior was very uncharacteristic of the man. In short, it's coming out that there was nothing really to come out: the man was previously "stable" and of good character. A good soldier. A good father. A good husband.

There are two un-goods that appear to remain: first, Bales was disappointed at having been passed over for a promotion, and second, his lawyer will be pursuing a PTSD defense, which "is common" in cases such as these.

The impulse of the mainstream media reporting machine, and to some extent those of us who read and watch the machine, is to "pin the tail on the donkey"---to find fault with Bales or identify some essential brokenness (or as is often the case, to suggest some harbored evil or racism or hatred). This projective tendency makes it easier to confront our enjoyment of the reportage of such atrocities---"oh, the horror!"---while overlooking, of course, the true locus of horror here: that all of us are capable of unspeakable things and subject to the influence of larger structures and the environment (if two days stranded at Dulles airport can almost "snap" me, I cannot possibly imagine what a combat zone would do). Of course, the military is suggesting alcohol . . . but why not the conditions of war? From Hollywood to personal life narratives, all of us have been exposed to the message that "war is hell" and that the stresses and pressures of a combat situation are almost incomprehensible (or are, as in the case of PTSD).

A murderous rampage is truly evil and repugnant, but it's also a kind of script---someone plays out the script on autopilot; it's the consequence of a flipped switch that turns the Other into a pixelated character in a "real life" video game. So what caused the shift to autopilot? What flipped the switch? Alcohol may have been a lubricant, but that's hardly the explanation. I think we have to face up to the fact that a Bales is, at some remove, the inevitable consequence of a well-run death machine. That machine is not new, but has been in operation for almost a century (I date modern warfare and its technical instrumentality to the first World War). There have been many Bales before. There will be many more.

Hasn't anyone seen Full Metal Jacket? Why is that "fictional" film not a better explanation for Bales' deed than the digging for personal dirt?

What is new is that this kind of information and atrocity gets reported almost instantaneously. What is new is that the secrecy that has been the cloak of the Nation-state dagger is increasingly difficult to wear. What's new, perhaps, is that we have to start thinking about atrocity systemically, not as the product of an individual pathology. PTSD is not a person's flipped switch, but the consequence of "snaps" or "breaks" produced by a Leviathan that, as reports about failures in Afghanistan continue to hit, is in more control of "us" than we are of it.

Again, as Jagger once sang: "I shouted out, 'who killed the Kennedys?'/ when after all, it was you and me."

a subtle vehemence

Music: Hammock: Chasing After Shadows . . . Living with the Ghosts (2010)

Perhaps violence concerns a frustration over telepathy?

Vanus: empty, without substance---the trouble with (the) vanity is that it's so isolationist, in (or at) the end. One can put two sinks in there but, you know, we all shave alone. ("This is life in the fall.")

But that's not the point of (a) vanity: there is the yearning for and provision of a mirror.

My friend and colleague and mentor and often teacher of life-important things Rosa Eberly departed many days ago, after visiting for a stint, but memory of her presence lingers. I spy water rings on the patio table. We ate and talked and processed the conference we had just attended on "symbolic violence." It was a strange topic to process together because our visit was so peaceful, downright joyful; one night we sat on the patio into the wee hours of the morning blowing smoke (rings). It's been a long time since I've had such a communion. We discussed her paper (and the marvelously fecund notion of a "vernacular cloud") and her response to Kevin DeLuca's plenary presentation concerning our world of "infinite violence." I've been thinking a lot about her response to DeLuca, which was a challenge, and not just because the bathroom didn't have two sinks.

DeLuca and only a handful of others at the conference had the notion that we needed to confront the affect of violence; we knew the tendency of our ilk, we/us academics, would be to abstract---too abstract---to stay on this side of the "symbolic" in "symbolic violence." DeLuca wanted to "go there" and confront us with our own (complicity in) barbarism. My co-presenter Claire Sisco King and I had the same idea. We all showed disturbing images and reacted, or asked for a reaction, to them. This was our go-to habit of confrontation: to display the monadic, invite the dyadic.

But in retrospect, and in the wake of Rosa's visit and (graciously softer) influence, I'm wondering about this impulse, that violence confronted demands a violence vehement---a violence experienced close to the source, a recon trauma. As I have been processing Rosa's response to DeLuca, which was subtle and careful and delicate, I second-guess. A/I a third.

Thirdness. Charles Sanders Peirce is in da house (in the membrane), laying down the "law of love." My dim memory of Peirce from graduate school is crusty with frustration on the edges. I remember firstness as the is-ness (the indeterminate thisness); secondness is the condition of alienation, the evagination of subject/object; and thirdness the mediation, the font of meaning and "sign," creativity. Peirce is often perceived (and rightly so, I feel) as a fierce logician but also as a self-styled "agapist." We may very well be the vocabularies that we inhabit, and given the promise of poetry, we very well may overwhelm Auschwitz with our words.

Still, "inhibition" or station or "the sink" is the problem. Rosa repeatedly characterized rhetoric as "processual," which is distinct from the "event" or period or flash or bang we would fashion for violence as such; we make it a thing because our retrospective sense-making periodizes "it" in the mourning that is meaning. Rosa ended her response with a reference to the unspoken (or at least muffled) conversation of voices circulating at Penn State (the movie!) and its interiors removed: a private home in which the words exchanged between Paterno and McQueary gleaned a violence processual, a violation of being yet-to-be. What violence is that, elongated and senseless but for the quiet (re)covery? What violence is that? indeed. I take note and bow my head. Yes, friend: there is that violence too.

I speak oblique but not without purpose or care. What slow violence do we but ignore, that unseen violation next door, the subtle vehemence that does as much if not more damage because it lacks recognition? To make this thinking concrete, but to abjure having to make a definitive point (which is the luxury of bloggishness), I reference an unsettling violence heard in recent weeks: a female voice calling from a neighboring apartment complex to "call the police! call the police!"

I called the police.

But that was not enough, I felt, deep down in "this." And this feeling is guilt. Unlike "giving a paper" or collecting a paycheck, the call to conscience is not merely machinic or programmed, the script "to be good" that we (my friends reading this) all know. It's something Other. The call, the visitation, is an invitation for rethinking violence as the condition of objecting-the-Other at all. That's not a "deep thought," of course, although I recognize (even delight in) the abstraction of words here at the same time as I worry about their reception ("oh, Josh is talkin' shit again"). I'm trying to say other-than-the-words-in-the-way that the words-in-the-way-say: violence is what happens when I make you a character, a figure in "my" movie. That much is inevitable, and that much is why I should remind myself you are out there, and that you are not like me. Perhaps "rhetoric" is simply another word for "guilt."

getting violent

Music: Robin Guthrie: Continental (2006)

I've recently returned from a small conference organized by Jim Aune (and gang) and hosted by the department of communication at Texas A&M University on the topic of "symbolic violence." About fifty to seventy people (it was hard to gauge, but I don't think more than seventy) attended for four days or so of panels and plenaries about a topic that, admittedly, could get one down after a while. The four of us who spoke on the last day---on a Sunday morning, no less---were delightfully surprised at the energy and resilience of the audience, still quite robust in size and not as exhausted as one would expect after three days of discussing Nazis, rape, torture, and other sorts of atrocity.

I mention the usual and expected topoi of "violence" because, while we all referenced these, a good amount of labor went into thinking about and getting a better handle on what, exactly, violence might be, and how the qualifier "symbolic" relates to it. Although we all agreed certain kinds of violence were, prima facie, bad, only a handful of folks seemed prepared to dismiss violence out of hand. Indeed, much work went into advancing various taxonomies of violence to better enable us to formulate questions about "it," or better, about "violences." In this respect, DJ Dee-Zee-Ski (a.k.a. Prof. David Zarefsky) kicked us off with a keynote set that methodically detailed a typology of violences (and refreshingly unanswered questions) that really did capture the tone of the conference to come. Prof. Pat Gehrke advanced an alternative typology that was equally provocative and helpful.

Most of the other discussions I heard were more case-study focused, but again, most seemed to take up quite seriously the realization that we may not actually know how to grapple with violence as a thing---or an event or structure. (By the way, Bourdieu and Zizek were perhaps the most cited theorists I heard.) I found this attitude or disposition really, really refreshing and productive, and I suspect we were all reacting, to some extent, to the way in which prefabricated responses to violence (or perceived violences) are so central to mediated ideology today. Everywhere you turn, one is supposed to respond to violence as an unquestioned evil and to respond . . . well, to respond in kind.

Zarefsky did not rule out violence as a possible necessity, but he did pose civility as the preferred default. Karlyn Kohrs Campbell responded by stating symbolic violence is a default mechanism of identity---although her focus was on the centrality of what I would term "projection" (and her example was Cuba). The next day, however, taking the position of the inevitability of multiple interlocking forms of violence (structural, objective, political, and so on), Gehrke posed the provocative question: "how do we do violence better?" That question stuck with me and many others for the rest of the conference. At lunch today my colleague and friend Jennifer Mease said she was quite taken with Erin Rand and Dan Brouwer's papers, which concerned a kind of "queer intimacy" or interaction. One of Rand's many biting insights was that the gay male suicides of recent years have produced a public figure of the lonely and isolated bullied gay boy that runs cover for the excruciating din or "cacophony of the social" (that is, suicide was seen as an escape from the social, not an isolation from it). Mease asked: "is queer intimacy a better violence?"---meaning not so much displays of public affection but a different comportment toward what Brouwer described as the "stream of life" (he described a scene in which a drag queen helped emergency personnel get through a crowd). Well, I cannot possibly replicate the nuanced context of Jen's intriguing question here---at the moment I'm almost brain dead and weary from the conference, but still: what a question!

For my part (and to some extent that of others, especially Chris Lundberg in his discussion of evangelical popular cultures), the goal was similarly to pose what we term "violence" on this side of language as a recognition of the drive central to the subject. Like Erin and Dan, Claire Sisco King and I deliberately crafted our papers as a conversation about how violence travels across different modalities of symbolicity. We examined violence in film, primarily (although Claire was careful to underscore all sorts of clear parallels in Western art and disturbing, iconic photography). So, for example: imagine a horizontal line labeled "language." Prior to this line is something we might label "compulsion" or "drive," which references a human tendency toward enjoyment. A full realization of that enjoyment would be "death"---not necessarily destruction, but something like stasis or equilibrium. After the line, we have various symbolic ways to channel the compulsion, which we might recognize in good or bad terms, like pain or pleasure. Now, my argument was that genre was one of the ways we organize these impulses or compulsions into regimes of meaning, and that historically, when genre is pushed into the service of destruction, that has been done over the body of woman (cue Laura Mulvey on the gaze, etc.).

None of these discussions were satisfying, especially mine. But that's not a criticism---quite the opposite. The point is that for four days a group of people zoned in on a very difficult social problem that concerns what we study (rhetoric) and were working toward a map or ground clearing for future thinking. I just really, really appreciated the general disposition of thinking aloud---of no one claiming to have a definitive map or answer, just thoughts about how we might coordinate future inquiry.

Aune did a good job today summarizing four (or five?) thematics that seemed to emerge from the conference. The one that seemed foremost, to me, was the "dialectic of civility and incivility," two buzzwords at the forefront of the popular imaginary today, especially in political discourse. What became very clear to me by the second day of the conference is that, just like "violence," there were many civilities. It seemed to me folks were working with three different but interrelated conceptions of civility: (1) procedural or functional; (2) aesthetic or tonal; and (3) publicity. Most everyone seemed to agree with the value of procedural or functional civility, which concerns a reasoned exchange of ideas. Things seemed to break down, a bit, with tonal civility and civility as publicity. A number of folks seemed confused by what I was calling civility as publicity, by which I meant the appearance of procedural and tonal civility (that is, PR). For example: the repeated and often passionate calls for more "civil" public discourse are often just an appeal for the appearance of procedural civility in a manifestly uncivil, disrespectful, or otherwise oppressive state of affairs. I need to get off to bed, but I'll sign off with the observation that I think we need to disarticulate these forms of civility if those particular discussions are going to do any sort of conceptual work. Too often we slide between them interchangeably, on the surface of things, in a way that masks what my colleague Dana Cloud termed the "violence of civility."

Well, this is an admittedly disorganized gloss on the conference and the ideas that circulated in my head as I made the two-hour drive home. And that is to say, this conference did what I wish every conference did: gave me a puzzle I actually find myself invested in working-through, even though it may never come together. Despite the troubling and at times depressing topic, what a rewarding and productive conference this was!

of senior fellows and sci-fi (sf/sf!)

Music: Kirlian Camera: Still Air (Aria Immobile) (2000)

As I detailed in the previous post, I've been thinking about a course I'm developing for next spring for the College of Communication Senior Fellows program, which is an honors program designed to give gifted upperclassfolks a more intimate and challenging classroom experience. I've never taught one, but I'm told the class is conducted like a graduate seminar with adjusted expectations. I'm sharing the description I developed, and then after that, some thoughts and concerns about the course (and content). Here goes:

___________________________________

Is Communication (Science) Fiction?

Whether we figure "communication" as the exchange of information, a form of symbolic inducement, a process of understanding, or the means by which we exchange, induce, and understand, each definition is informed by what John Durham Peters describes as a centuries-old "dream of communication as the mutual communion of souls." This philosophy of communication seminar grapples with the dream of communion through the idiom of science fiction. The goal is to help participants not only come up with their own answer to the titular question of the class, but perhaps more importantly, to help students toward a stronger understanding of what the question means.

As an idiom, science fiction references various attempts in the domain of popular culture to speculate about the future. Although for most of us "sci-fi" is synonymous with galactic battles, planetary exploration, and spaceships, a dominant theme is the possibility of communicating with extraterrestrial intelligences or "aliens." Read as social commentary, we can substitute the figure of the alien with "other people," such that the theme is also a question: can human beings communicate at all? The question is not as simple as it initially seems. The College of Communication itself is premised on a positive answer to this question; science fiction, however, asks us to consider the alternative.

In this seminar we take up the alternative by studying communication theory and speculative fiction in tandem. Participants will be introduced to the history of the study of communication in the United States, including the formation of the field in the early twentieth century and the history of the College of Communication at the University of Texas. Students will learn about how early scholars attempted to situate communication at the center of public education, as well as the assumptions made about the idea of communication in doing so. We will be comparing theories of elocution and public speaking, speech hygiene, general semantics, semiotics, and "communications" (e.g., broadcast technology) to the ideas advanced by speculative fiction writers such as Arthur C. Clarke, Ursula K. Le Guin, Bram Stoker, and Stanislaw Lem. We will also be viewing and discussing a number of "hard sci-fi" popular films, including 2001: A Space Odyssey and Solaris, to see if we can extract the implied theories of communication these advance. Special attention will also be given to government-sponsored attempts to communicate with intelligent life in the universe in the Search for Extra-Terrestrial Intelligence (SETI), Pioneer, and Voyager programs.

As an honors course, this is a "graduate-style" seminar designed to encourage critical thinking by engaging "big questions," setting aside the pursuit of specific or practical skills. Students should enroll expecting challenging reading and no definitive answers from the course material. Throughout the semester, students will be expected to formally share their responses and reactions to course material and to help organize class discussion. The seminar will culminate in a rigorous term paper in which each participant develops his or her own answer to the question, "is communication (science) fiction?"

___________________________________

Fortunately, there's time enough to tinker with the description, and I can think about what books I shall assign for the next . . . [Brief pause: I must remark on the beautiful sunset happening right now in Austin; the skies are overcast with purple clouds, a cheering orange glow peeps around the edges, getting brighter and then dimmer and illuminating this keyboard with a kind of slowly pulsating pink light---a sci-fi scene, indeed.] Where was I? Oh, yeah, I have seven months to think about the books to assign and read them.

Although it's still a bit hazy---and it probably should be---the goal of the course is nothing more than what I've already sketched. In part, one could describe the goal as the proverbial "pulling out the rug," as they say. Grads in our college have assumed, from day one, that communication is a thing, event, or process, that it's possible, and that it's mostly positive. Some colleagues teach the "dark side of communication" literature, as it's called, but even that work tends to assume (though not all of it) that communication occurs. I think it may be helpful for students (that is, students who want to) to question the fundamental assumptions of our curricula.

To what end?

Well, if I specified that, I suppose the course and its guiding question would not be a seminar, but something else. Even so, purposefully bracketing the point of questioning the assumptions behind the idea of communication, I'm not beyond questioning the point of my going to science fiction as an illustration. My chosen pulp-era magazine covers for this post are deliberate. I worry: to what extent is science fiction a masculine appeal? We know, for example, that the formative marketing of the genre in the States was aimed primarily at adolescent boys, and that choice has been in lock-step with Hollywood film. Regardless of the merits of the stuff we'll be reading (and there are many), should I worry about a skewed, male enrollment?

I've been thinking about the question of sci-fi's "male appeal" as I've been reading the "greatest hits" of science fiction from the 50s forward. Much of what I am reading, insofar as it doesn't aspire to art or philosophical reflection, advances a phallocentric and, frankly, racist ideology (and this is not surprising). As I read into the 60s, there's a more self-conscious effort to explore (and explode) gender and race categories, and I appreciate that. Even so, this "male-focused" sense I get reading science fiction is something beyond the marketing, certainly more than the themes of "xenogenesis." Woman is present, indeed, ever-present in her absence (think of Psycho or Fight Club here, but change out the protagonist with Ender or Lem's Prof. Hogarth in His Master's Voice). I reckon what it comes down to is this: why does this stuff appeal to me so much? And does that appeal have something to do with masculinity? I suspect it does.

Just thinking aloud, but I will need to figure out this masculinity issue for myself---beyond an all-too-easy or obvious critique of masculine chest-beating or anxiety (the adolescent male fear of, attraction to, and oppression by masculinity)---before I teach the course. It may very well be that there is no way to even approach the question of communication's possibility without seriously interrogating gender. Indeed, were I to teach this through a Lacanian lens, there would be no way out, only through (since communication would be, as it were, poised on sexuation---no desire/need for communion without difference). But this isn't a class from a psychoanalytic vantage.

Yet.

is communication (science) fiction?

Music: The Caretaker: Patience (After Sebald) (2011)

For those who know me outside of RoseChron, at first blush this titular question is a rhetorical one: understood as communion or a real connection between two symbol-using critters, communication is indeed a fiction (cf. John Durham Peters' Speaking Into the Air). Since I read it in grad school, I've always found Richard Rorty's explanation of communication as a kind of coordinated dance, or an attuning of behavior or a squaring of "squeaks" and "barks," rather persuasive. Communication might be better expressed as a coordination of behaviors via symbolic means (which would imply, obviously, that left-handed Tantra is not intercourse after all---ha ha ha). Yes, I think communication is a fiction from the standpoint of popular parlance. But if you get inside the question to ask, semantics aside, whether understanding---not in the abstract, but as a kind of open-sourcing of the Other---is possible, the question gets pretty interesting, and I recognize this is why all that dialectic-bashing is so appealing today for so many (with nods to Gilles). I tentatively qualify "fiction" with "science" here to point to that "interesting" aspect of the query, and my current reading of Stanislaw Lem's novel, His Master's Voice, has really got me thinking this weekend about the question and, by extension, the foundational promise of my chosen field and profession.

Lem's curious (and emotionally difficult) novel crystallizes a theme I've been encountering repeatedly in my recent attempt to survey and digest the most celebrated science fiction of the twentieth century: can we communicate with aliens? Of course, if anything, sci-fi is a philosophy of futurity, so this quixotic theme is really about whether humans should trouble with communicating with each other. Lem fascinates me because he is among the first authors I've read who is explicitly negative on this question; I've not finished His Master's Voice yet but, so far, the moral is something like, "humans cannot communicate with aliens because they cannot communicate with one another."

The novel is darkly comedic, which is to say, it is deathly serious. It concerns a renowned professor of mathematics, Peter Hogarth, and his involvement with a secret Pentagon project to decode an assumed neutrino blast (read: radio-like transmission) from extraterrestrials. As in just about every zombie story, the real plot here is not the transmission, but the imbecility of humans grappling with a constitutive outside. Lem skillfully paints, through a first person narrative by Hogarth, how impossible it is for humans to communicate with one another because of the power of projection: the narrator is so self-absorbed (and self-loathing) that anything approaching an openness to the "outside" seems impossible. So far the book reminds me of Sartre's Nausea; however, the melancholy is traded in for a kind of abject cynicism. The novel is fascinating and difficult to put down, even though Hogarth is so unpleasant. I don't know how the thing will end (so don't spoil it for me), but the hilarious account of how the "message" or "letter from the stars" was received---basically as the consequence of a kind of EVP get-rich-quick spectacle in popular culture---brings to mind, immediately, Konstantin Raudive's serious, well-intentioned 1971 study, Breakthrough: An Amazing Experiment in Electronic Communication With the Dead.

The fascination I have with Lem's work follows on my reading of Arthur C. Clarke's Childhood's End, 2001, and 2010, all of which concern the ability of humans to communicate with intelligences beyond their capacity to comprehend them. Like Lem, Clarke is something of a pessimist, but he also has a profound hope in the possibility of transcendence (while an avowed anti-religionist, religious themes flower all over the place in his books). The irony here is that, at least in a formal or compositional frame, Lem seems fixed on comedy, while Clarke is resolutely tragic. Clarke finds hope in failure; Lem finds comedy in hope.

Reading Clarke and Lem reminded me of my first encounter with the "Pioneer Plaques," which went out in the early seventies on Pioneer 10 and Pioneer 11, now billions of miles away in space somewhere. The golden plaques indicate hydrogen, our solar system, human beings and their limbs (apparently we are white), and so forth.

You visual rhetoric mavens will quickly discern why these images were controversial (and I don't mean the fact some folks got pissy the bipeds were nekkid!). The plaques represent the views of Carl Sagan, who was instrumental in their design, about communicating with extra-terrestrials (his wife, apparently, rendered the drawing). Perhaps even more intriguing was the creation of the Voyager Golden Records, which were launched in 1977. They are inscribed with sounds from earth and say a lot of something about our prior faith in analogics. Sagan thought, even though the likelihood these messages would reach aliens was low, they nevertheless represented the "hope" central to human being.

Wondering aloud: how has the character of that hope changed because of the apotheosis of the digital?

The longer I think about rhetoric, persuasion, and (the possibility of) communication, the more I am torn between Clarke and Lem's visions. For me, the most inspiring component of science fiction is that people turn outward, exploring together. That attitude toward the unknown is constantly threatened by the temptation to turn inward---to explore the innerspace of another human being as if to discern her inner mystery. Clarke is good with the exploration part, but Lem is much better at showing how too easily that becomes an interrogation of the Other, with terrible and comic consequences.

I'm also thinking about this question, "is communication (science) fiction?" because it looks increasingly probable I'll be teaching a course by this or a similar title for the honors college in 2013 (advanced undergraduates in a seminar-like setting). Which is to say, I guess, that the course is about "love" by another name.

crying (over---or at least for---you)

Music: Kate Bush: 50 Words for Snow (2011)

Hey girl.

I know you came here looking for my musical love---and a shared hatred for the commodification of human emotion.

But frankly, girl, I'm just not feeling it. I am feeling love for you, and certainly a profound distaste for cardboard hearts. But I somehow couldn't muster the energy to sit for hours in front of my mixing deck this year, as I have in years past. It's as if I didn't want to feel for such an extended period of time this weekend, with that familiar, enveloped intimacy of headphones. Next year, I'm sure I'll have the endurance (and, with luck, the inspiration).

I reference my aversion to enveloped intimacy with the admission that music is central to my life---it's constantly on at my house---and because of its remarkable ability to evoke affect. I watched the Grammys on Sunday, and the power of music to do this to us was celebrated, first, of course, with a number of tributes to the late (and tragic) Whitney Houston, and second with the deservedly lauded song-craft of Adele, whose album 21 swept no fewer than six awards. Although the song "Rolling in the Deep" was featured, just about everyone I know really falls apart with "Someone Like You."

If one hasn't already been deadened by the compulsion to repeat "Someone Like You" incessantly on the radio or television (good songs are frequently ruined this way), the tune is remarkably moving---so much so that Saturday Night Live lampooned its power to induce throat-lumps in a skit.

The humor of the skit doesn't simply trade on the song, but on our compulsiveness to feel and the enjoyment of yearning, the way music is a catalyst for what Jacques Lacan termed jouissance. I think Adele's song and the skit say something interesting about Valentine's Day: the embarrassment of wanting to feel and using something artificial---something contrived---to get there. We love to deride this commercial holiday because we enjoy feeling whatever it is "Someone Like You" seems to inspire, but we are embarrassed or register some kind of guilt for the fact that artifice can get us there, that something "artificial" can get us off.

The NPR show All Things Considered broadcast an intriguing story yesterday about the ability of music to inspire deep feelings. The spot featured music psychologist John Sloboda, who attributed such feelings to certain musical shifts he terms "appoggiatura" (Italian for "to lean"), a hard-to-define musical concept that refers to a sort of tension-release change-up in melodic structure. Sloboda isolates appoggiatura in the way in which Adele sings "you" in the chorus of "Someone Like You." "The music taps into this very primitive system that we have which identifies emotion on the basis of a violation of expectancy," Sloboda argues. "It's like a little upset which then gets resolved or made better in the chord that follows." The argument here, however, is familiar to rhetoricians as the (dis)pleasure of form, which Kenneth Burke defined as the creation and satisfaction of "appetites" in auditors; affect is heightened and the satisfaction is sweeter to the degree that satisfaction is frustrated. Appoggiatura would appear to be, then, a musical theory of foreplay.

Sloboda describes our affective response to Adele's crooning as "primitive" and "hard-wired"; however, Adele's co-writer Dan Wilson insists on the import of the song's lyrics, too: "With Adele, we wrote this song that was about a desperately heartbreaking end of a relationship, and she was really, really feeling it at the time, and we were imaginatively creating," Wilson says. "That walked her back through that experience. And when you and I listen to that song, we walk through her shoes through that heartbreaking experience---but it's in our imagination."

The NPR story thus casts our affective response to broken-heart ballads as a contest between the hard-wired brain and cultural fantasy, physiological response and narrative structure. Of course, the way music can deeply affect us is both of these---body and mind; the wedding of the two is why we cherish art so much. I think the music psychologist is on to something by pointing to cognitive processes and essential features of the song's chord structure, timbre, and melody; however, I think Wilson captures the suasive appeal of the song in terms of a cultural fantasy we all know: the music evokes feeling, but also meaning, and that meaning is the imaginative scenario Adele paints in the song. Someone whom we loved---or didn't know we loved---has "moved on." On the side of meaning, Adele expresses the yearning registered in the loss, not simply of a lover, but of the possibility of sharing a life with someone.

No one, as the saying goes, wants to die alone. Sadly, although we don't like to admit it, we also know that all of us will.

"It's just a song." Right?

The song "Someone Like You" manages to yoke the body to meaning in the sign of "the cry," which was Freud's term for the "satisfaction" that results from a (dis)comforting experience of powerful tensions and releases. Crying can (and often does) spout from sheer exhaustion, acting as a release from some unbearable tension. But tension cuts both ways, or at least is experienced as physical and psychological. Freud would agree with Sloboda that the affect is "primitive," or more to the point, infantile: the cry of the infant registers simultaneously the abject need or necessity of the Other (initially, the mother, to battle pain, satiate hunger, and so on) at the same time as it does a recognition of an irrevocable separation. The cry, as such, registers a kind of impossibility and a need to be delivered from and to that impossibility. That "Someone Like You" makes us cry is apropos of what the song actually is, in essence: a cry for love. The embarrassment we might feel tearing up to the song is that crying with it is an admission of a certain dependency at the core of our being. And perhaps most importantly, that cry for love is not for a "masturbatory concession," the recognition of sexual desire, but something much more, shall we say, existential.

However cheaply the SNL skit delivers us from the cry to the laugh (from the tragedy of being to the comedic), the deeper truth of human affective response is still there: the skit ends with all of those moved deeply by the song expressing a mutual need for togetherness as a solution to "the cry." (This kind of mutual recognition, by the way, is the "fellowship" Freemasonry and related fraternal organizations are self-consciously built upon.) It is, alas, a temporary solution or substitute satisfaction; full-price buffalo wings may be the best we can hope for. Such is the disappointment of a commercial holiday that skirts above the surface ruptures of abjection, as well as our cynical enjoyment of celebrating or deriding its contrived exotica.

"What's the whole point?" a friend exclaimed last week, only half in jest. "Am I supposed to fuck you more special or something? Such a silly holiday," she said. But I'm not so sure. I think the sexualization betokened by red hearts and flowers has nothing to do sexual pleasure, but rather a deeper frustration sexual "union" appears to represent, this "cry" that murmurs just below the surface. It is a cry that too many think can be stifled by plugging the hole. As Lacan puts it, "the big secret of psychoanalysis is that the sexual act does not exist."

I think that's the big secret of love, too. Some months ago I was talking with a different friend who remarked he would likely get back with his ex-partner, despite the fact they were not really attracted to one another in a sexual way. I had thought to remark, although I did hold my tongue, that love is not reducible to genital pleasure---that it's not reducible, period. Crying together is what it's about, not plumbing the depths of another's soul or interrogating their being for that secret something that makes the world shine in a different way.

Full-price buffalo wings.

The same friend who lamented that she did not have a date for this evening and that the holiday was silly nevertheless gets it. She decided to have a special dinner with another date-less friend. Her dinner date said that "we each have a 'get-out-of-jail-free' card for tonight. If either of us manages to snag a date with a boy," she reported, "we can cancel our dinner."

"That's more than I got," I said, smiling. "I'm poaching fish and buttering fennel and hanging out with the dog."

What I didn't say to my friend is that having a nice dinner with a best friend is precisely what celebrating this holiday should be about, and that a date with a boy would be much less fulfilling or enjoyable, in the end. Friendship. Everything else is coming up candy and roses, and too much of that eventually makes you sick.

One more, from our patron saint, for the road:

on free labor

Music: Gillian Welch: Time (the Revelator) (2001)

Growing up I sometimes accompanied my dad to "work." Although he is now retired (he still takes the occasional job), he was a professional photographer. He made most of his living doing photography---and later video---for companies and couples. I remember, mostly, the couples, the large southern weddings with dozens of family and friends variously arranged on some churchy dais, or scattered about under large trees in a park setting. I carried his "camera bags." I liked the time with my dad, although I worried (as did my mother) about how hard he worked (and how demanding some folks were and how poorly they would treat him) and . . . frankly, it was often boring to me as an onlooker.

Although I was off to college by the time the digital revolution hit, I noticed during the 1990s the gradual decline of the photography industry. Digital cameras got cheaper and cheaper and, soon, Uncle Bob could take thousands of photographs and have at least a dozen that were, by sheer chance, of professional quality. To compete, my dad shifted his business to focus on video and video editing (at the time the most lucrative work in the professional photography business, though that too is waning). He also started charging just for the photography itself; the monies paid by companies and happy couples were for burned discs or USB drives of images instead of "prints."

One would think that the steady march of DIY photography would have led to a revaluing of the photographer's art, as more and more people came to appreciate the skill it takes to compose (and now edit) a good shot. Instead, what happened was a thickening of an attitude among the general population about artistic endeavor: that it is easy and that it is, more or less, "inspired," that the creativity of the artist somehow springs from the head of Zeus and appears, like magic, on the page, screen, or stage. Musicians are keenly aware of this attitude and have been critical of it since the birth of the music industry at the turn of the 20th century, and it has really been at the center of the discussion of (Internet) piracy as of late.

The relatively recent emergence of "intellectual property," spurred onward by innovations in design by high-tech companies, has made us more aware of the problem. Still, it persists, and we see the attitude toward intellectual endeavor percolating in the debates over higher education too, particularly in discussions concerning "teacher accountability" and "education reform." Although the discussion has mostly been couched in terms of the well-known and well-documented failures of our educational system at all levels---primary, secondary, and higher---I think the root of the problem is still the generally shared attitude that mental exertion, thinking, is effortless, like magic.

Karl Marx recognized this attitude even way back during the emergence of industrialization, when sweat-labor ran the machines. In the first volume of Capital, when he describes "labor," he is always careful to discuss human productive capacity in terms of muscles and brains. I suspect, in part, he always included the life of the mind as a significant form of labor because, as is well known, he wrote for a living and did not make a very good living writing (his children sometimes didn't get to eat). Marx's predicament is echoed by humorist David Thorne, whose exchange with a businessperson about his "design" labor is as hilarious as it is depressing (my thanks to Shaun Treat for passing along this nugget of guffaws).

Today, mental labor---creative or intellectual, as if you can disentangle them---is devalued, and its devaluation is at the core of the now familiar cultural critiques of teachers and the professoriate. You see it in the high-stakes discussions of public education, in which teachers are chastised for not producing enough students who score well on exams, and yet these same teachers have "long breaks off" (which is not, of course, the reality of most teachers). You see it in the popular, cultural representation of the college professor, reclining in his or her leather reading chair in a lavish, book-lined office, pontificating to a curious, respectful student sitting anxiously at his or her knee. You see it in television commercials by various "for-profit" universities, such as the so-called University of Phoenix, which feature "professors" with "real world, practical experience" promising personal relationships with students across the Internet via computer screens ("real world, practical experience" is code for a certain brand of market-driven anti-intellectualism, of course).

And I want to say you see this attitude toward the supposed non-labor of intellectual work in the requests of non-academics for expertise. The attitude is most stark in mainstream media requests for opinions and statements about this or that cultural event or thing: reporters asking for opinions about this political candidate, or television producers asking for a sound-bite about, oh, the historical links between the Ancient Mysteries and Freemasonry (note: there are none). In recent years I have declined a number of "interviews" from journalists about this or that popular culture event, and declined to appear on this or that television program, because of the expectation that I would drop everything to take an hour-long phone call, or take a day off to tape a show or assist with a workshop, without compensation. The appeal is usually that doing this or that gig is "good publicity" or a nice line to add to my resume.

There is value to offering one's expertise for the good of a community or a welcome cause, I cannot deny this. And I also recognize the importance of publicity for one's work or a larger, important project. But even so, whence the expectation that offering explanation or opinion or expertise is not work?

I've thought about this in recent years---even discussed it with my shrink. My therapist, recognizing the way in which academics are "trained" to work for free, made me sit down and figure out what my baseline fee should be for all speaking engagements; she's held me to this figure and, so far, I have too. I confess that when I am approached by someone for a speaking engagement---especially if it is a friend---I sometimes feel guilty saying, "I'd love to, but I do have a minimum speaking fee . . . ." I'm trying to get over that sense of guilt, and I have a feeling a lot of folks in cognate fields---especially artists---really struggle with it, trying to balance the need for publicity for their "art" or intellectual labor with the actual expense of their time.

So where does this expectation of free intellectual/creative labor come from? In part, of course, we can explain it in reference to the way capitalism works and the basic logic of the wage Marx explained over a century ago. But what is the ideology and its fantastic face? I can only conclude it has to do with that soul-deep conviction in something called "inspiration," the recesses of the unconscious and the ways in which insight does often "spring forth" or "come out" in ways that, in retrospect, seem like possession. It's almost as if labor or work is not supposed to be enjoyable, and to the extent that creative or intellectual labor is transportative or fun---like a good teaching day, when the whole classroom seems alive with curiosity---one is supposed to accept it as a "gift," something for nothing. Why should one be compensated for something that is enjoyable?

Well, yes: whether work is enjoyable or unenjoyable, it is still work. Labor is labor.

One thing my father taught me as I was growing up---and I suspect he doesn't know this---is that you cannot give away your labor, however inspired, for free. He would sometimes do jobs for a good cause, or do portraits for free, just because of his generosity. As I watched his business develop and grow, I noticed him doing this less and less. I remember him saying, once, that the "free job" seemed like a good idea, but increasingly it created expectations that were undoing the business itself. Eventually, and painfully, I remember he came to the decision to stop photographing family events for free, or honoring requests by family members for free portraits. Well, not entirely. He still does this. But he did eventually come to the realization that he could not do it so much.

It's a hard reckoning, to be sure, but: love is money too.

Such sentiments are sung better by Gillian Welch than by me:

Everything is free now
That's what they say
Everything I ever done
Gonna give it away.
Someone hit the big score
They figured it out
They were gonna do it anyway
Even if it doesn't pay.

I can get a tip jar
Gas up the car
Try to make a little change
Down at the bar.
Or I can get a straight job
I've done it before
Never minded working hard
It's who I'm working for.

Everything is free now
That's what they say
Everything I ever done
Gotta give it away.
Someone hit the big score
They figured it out
They were gonna do it anyway
Even if it doesn't pay.

Every day I wake up
Humming a song
But I don't need to run around
I just stay home.
Sing a little love song
My love and myself
If there's something that you want to hear
You can sing it yourself.

'Cause everything is free now
That's what I said
No one's got to listen to
The words in my head.
Someone hit the big score
And I figured it out
That I'm gonna do it anyway
Even if it doesn't pay.

Oh, and then there's this delightful essay about the song and the way in which the issue of labor refers to loving, too.

inscriptions mécaniques sur la vie: un teaser

Music: Anna Calvi: self-titled (2011)

The pace of RoseChron has slowed, hasn't it? When I reflect on how much less I've shared on the blog over the past year, I realize that the trade-off is unquestionably professional "service." I have often used this blog as a space to work-through ideas, either on cultural issues of the time or in my own teaching and scholarship. When I'm not working-through here, it just means I'm doing work somewhere else. In recent years that somewhere else has been the space of others---graduate students especially, but also colleagues, as a blind reviewer of books or journal articles. When it's my own work, I don't have much trouble sharing "in public," but you know, when I'm working with others on their work, it's just not my place---this is not the place. For example, in a recent post I wrote about the work of others I had been reading---friends and students---and I almost didn't post it because I worried it wasn't my place (I decided it was ok because the focus was on my worry, not their work). In short: I've written less here as of late because I seem to have become more invested in collaboration, in various forms; or, put differently, I've written less here because I've been writing less of my own in general.

This week, however, I've been working on my own stuff and am at that point where I can share some half-baked blatherings. Blogging about things I'm working on is often very helpful because of the way writing here causes a sort of switching-of-gears: here, the audience is very different from the one I write for on the page intended for print, or the audience I imagine sitting in front of me at a conference. The change-up in audiencing (who you are and whom I imagine), in other words, is helpful for the processes or labors of invention.

So?

I was asked some months ago to share some work in progress with a panel of distinguished visiting scholars on the topic of technology, memory, and rhetoric, and that moment of sharing is swiftly approaching [insert panicked, muffled scream here]. I agreed knowing that it would give me a kick-in-the-pants to start drafting a chapter that only exists in the form of a lecture for the twelfth seminar in "the object" course. The title, "du mécanique plaqué sur du vivant," is a phrase from Henri Bergson's famous formula for human laughter as this funky intersection of the machinic and the human. Laughing "breaks the frame" when, for example, a person seems much too "rigid" for the situation, or we find ourselves or others behaving a bit too robotically, like when Eddie Murphy does his impression of "white people." Bergson's ruminations on the comic are elegantly written and just a delight to read. Although his 1900 examples no longer track with the structures feeling (up/of) our times (sex is now the orifice of the funny bone), his views nevertheless remain relevant: there is a very thin membrane between the hilarious and the uncanny, and total satisfaction on either side risks a deathly puncture (jouissance, of course). Faced with the proverbial ghost in the machine, if you are on the side of Bergson you laugh, and if you find yourself on the side of Freud you scream. The notion of "peals of laughter" captures both nicely.

Laughter indexes two levels of experience that, I think, we can figure as repetition and representation---incidentally, the two approaches to "rhetoric" that seem to be vying for dominance in scholarly circles in recent years. I'll be arguing a number of things on Friday (none of which can be developed in fifteen minutes), among them a certain "psychoanalytic" extension or version of Diane Davis's argument for "a rhetoric of laughter" in her brilliant book Breaking Up [at] Totality: that a rhetorical approach to persuasion, or suggestive assent, or whatever it is we decide it is that we study, is a kind of dialectical navigation or preservation that does not collapse on the side of the machinic or the classically humanist, but unsteadily and never finally reckons with both. As Judith Butler puts it somewhere in Gender Trouble (I'm too lazy to hunt down the page for a blog post), immanentist approaches to materiality or performativity as a concept should not disavow representation---as if we could do away with representation anyway. And I know the tension between representational and alternative forms of rhetorical studies is on a lot of folks' minds lately (perhaps it always was?); just today Nate Stormer said he was putting something together on the topic for our annual convention on the speech-side of rhetorical studies. Anyway, back in 2000 Diane posed laughter as a fecund object for thinking through the struggles of rhetorical studies to mourn the death of the humanist subject (she's always a decade ahead of the rest of us). Just let language "be," says Diane; stop trying to control it or make it do violence; let the laughter in, whatever it is, this "tropiate."

Of course, Davis doesn't recommend the kind of total-topple into difference-reveling either. It's too easy to advocate a party, and while I like a good party too, we all know the damn thing can be quite destructive (cue scenes of Woodstock's aftermath). Embracing laughter entails the risks run by many of those who embrace an affective pancreas (ignoring aggression, for example, can trend toward nihilism). I'm also quite taken with Alenka Zupančič's approach to the comedic as an interplay between repetition and representation, and while I'm still not quite clear on the finer distinctions between Deleuze and Lacan on repetition that she is careful to outline, I think Zupančič's lucid explanation of why Bergson errs too much on the side of humanism is compelling (Lacan is baby bear's porridge, you see, between a hot bowl of Deleuze that is difference all-the-way-down and a humanist's representational pudding that stops at a spine or a brain, or something like that).

The challenge for me is to think these issues through the concept of memory, which is assumed at the core of my current project but which I've also given short shrift. Bradford Vivian's work on public memory and repetition has been quite helpful to me today, as has Kendall Phillips's work on the topic. In a number of publications Kendall has explained how collective memory is fundamentally a rhetorical fashioning (rhetoric as re-membering), and Brad has helped me to make some connections with Deleuze. But what of laughter: aren't its seemingly automatic, spasm-like qualities associated with a kind of forgetting? Diane Davis suggests as much in Breaking Up, and to be certain there is a form of amnesia in our "laughing together" (especially when it's at the expense of something Other). Not that amnesia is all bad---or that we can do away with it.

Well, I'm sort of floundering around here, which is par for the course when mucking through a constellation of stars that I think I know but whose collective form (an animal? a god? a kitchen utensil?) I cannot quite make out. Without giving too much away, I think I'll be taking laughter to the archive with Derrida: all compulsions, from the encyclopedic enterprise that is now "social media" to uncontrollable laughing, drive toward death. Yeah, where repetition is concerned, the death drive churns and chafes against it. This, I think, is the skeleton key: I'm just not sure which way it turns quite yet, or if it's going to catch.

So, a parade of concepts: laughter, representation, repetition, jouissance, the (death) drive, memory, and the archive. It's a lot to cram into one short paper and, for the sake of sanity, I probably shouldn't. But these are the concepts of the larger chapter, and at its center is a fun-canny object. Here's a bit of the introduction I've been working on:

They probably found the Whistling Coon down by the Hudson, busking among the ferry-goers. For a small fee George Washington Johnson could whistle the popular tunes of the 1890s with alacrity and an uncanny accuracy. At that time New York was the seat of the entertainment industries, and gramophone peddlers were scrambling for those curious, cylindrical inscriptions that lured patrons to their coin-operated phonographs. Although Johnson was a black man, his vocalic abilities were novel, and minstrelsy was increasingly welcome as white Americans confronted their racial anxieties in popular entertainments. He was paid twenty cents for every two-minute song he recorded for the phonographers---a lucrative enterprise when you consider that at the time every recording made was a master: only three or four cylinders could be inscribed at once, the horns of the recording machines arranged around Johnson's resonant mouth. Within ten years technological innovation would enable the simultaneous inscription of multiple slave copies, even copies of copies, such that, gradually, the master's voice---the master's recorded voice---became autonomous, needing that seat of inspiration, the diaphragm, just the once for innumerable ears. At first they desperately needed Johnson all day, every day, and then they didn't need him at all. By 1905 Johnson's recording career was over.

This march of inscriptive technology maps, in an unintended way, Henri Bergson's formula for laughter: "something mechanical encrusted upon the living." With nods to the original French phrasing of Bergson's formula (I dare not try to pronounce it unless you need a good laugh), we can also render laughter as something lawful encrusted upon the living. My remarks today will orbit a number of ways in which we can imagine the mechanical or lawful coming to bear upon that nominal domain of the human spirit, rendered variously as the "life impulse" in Bergson's account and, as we will see, jouissance in the theories of Jacques Lacan.

Now, at first blush the mechanical encrusted upon the living human voice betokens that all-too-familiar dialectical tension between what Marx dubbed the relations and forces of production; that our livelihoods always seem beholden yet resistant to technological contradictions is a hopelessly familiar regularity. But there are the mechanics of respiration too, some autonomic, some purposefully labored, and the law that is figured between them as signification. I've really begun with Johnson's example because his first, best-selling recording was not fixated on his unusual talent for whistling but rather on his ability to laugh in tune. Phonographers thought the racist song the "Whistling Coon," coupled with the fact that Johnson was the first African American on record, would secure their riches. It turned out, however, that the companion song---or what we would term the "b-side" today---became the runaway hit: "Laughing Song" purportedly sold over 25,000 copies by 1894 and was among the most popular phonographic cylinders of the late nineteenth century . . . .