I just finished screening the first season of HBO's now-discontinued series, In Treatment. The series is about psychotherapist Dr. Paul Weston and four sets of patients who meet with him weekly for half-hour sessions. The show originally aired five nights a week on HBO, and judging from online reviews and stories about the show, it was fairly well received. Although each character with whom Weston interacts develops relatively independently, they are eventually woven together in three interrelated arcs that converge with Weston's personal life, particularly with respect to his strained marriage and his relationship with his children. The dramatic dynamo of the narrative, at least for the first season, is the difficulty Weston has maintaining a boundary between the personal and the professional because, of course, the professional is so deeply personal.
I enjoyed the show very much, although I have the predictable quibbles regarding poetic license. I find it somewhat implausible, for example, that a therapist would let the countertransference work him or her so thoroughly (Weston allows himself to fall in love with a patient)---although we know it happens, especially in the history of psychoanalysis. But this and related quibbles are just that---quibbles---and knowing I am watching a television show and a work of fiction made the show enjoyable. I especially appreciated how the show wove resistance to the therapeutic method into the plot, often at key moments (characters most resistant to therapy, such as Alex's father or Sophie's father, ended up falling most dramatically into naked confessions in a matter of minutes).
As I was watching the show, part of the enjoyment was the triangulation of the script, my own personal experience in therapy, and what I know and have been reading from an academic vantage. I would enjoy talking with practicing psychotherapists about their own perceptions of the show, if only because it's very unclear what "school" of thought Weston is coming from (my own therapist hasn't seen the show). For example: Weston almost never allows silence in a session, even though silence is an important tool of therapy. Of course, silence doesn't work very well for television (I can imagine, for example, how poorly an episode would rate if it really did depict a psychotherapeutic session in which the client didn't say anything for ten or twenty minutes, which has happened often in my own experience). Weston also interjects theories or interpretations when, I think, most therapists would remain silent or wait much longer to do so. The only analyst Weston ever mentions is Christopher Bollas, a well-known American-cum-British analyst, novelist, and cultural critic usually associated with the child psych/object relations school, but not with cottoning to a particular party line. I suspect this is deliberate on the part of the writers, but still, after the first season the question remains: to what ultimate end is analysis put?
Well, we know that end is "good television," and In Treatment is that. I'm anxious to see how the second and final seasons will play out. We cannot expect entertainment to cling to academic approaches or clinical experiences, of course, but it sure is fun watching something with that tacit promise. It's a little, I guess, like science fiction: the promise of a practice, but without the farts of innumerable butts on the couch.
I was in T.J. Marshalls today to pick up a nice All-Clad pan that I spied and should have bought yesterday, as someone else also spied it and bought it. On my way out I slummed through the clothes and was instantly annoyed: there was row after row of nice button-up pajama tops, but no matching pants. There were rows and rows of "lounge wear" bottoms, but no matching tops.
In recent years I have noticed various "designers"---if that's what you call folks who use underpaid children in countries across the globe to make their clothes---who now make only pajama bottoms and no matching tops. I suppose this is because, presumably, men do not like to lounge in matching pajamas.
This man, however, does. This man has even been known to iron his pajamas. Some of us lounge in matching style and smoke pipes and read books.
I suppose some dolt at Ralph Lauren's lower circle of hell (the one dedicated to making inferior Ralph Lauren products to sell directly to stores like T.J. Marshalls) decided that there would be a market for pajama tops without matching bottoms. Now, in what world does that really work? Who buys a plaid pajama top with long sleeves and then wears it around with sweatpants?
In a just and civilized world a man would not be forced to look so hard for matching pajamas.
Glass furniture troubles me. Glass table tops, especially. Glass is cold and, unlike wood, harbors the secret promise of a sharp break. Perhaps it's that promise that some mistake for the delicate character of elegance?
Elegance is another word for danger in disguise. And soap scum.
I am particularly unnerved by the idiot who thought a glass cutting board was a good idea. The only worse idea: concrete cutting boards. I have not seen these yet, but I'm sure someone at Bed, Bath, and Bad Idea is on the case.
Recently a dear friend and colleague asked me to look over and comment on an essay that was reviewed and rejected by a major academic journal in our shared field. She wanted to know if the essay she wrote was "as bad" as the reviewers seemed to suggest it was (I had an inkling it was not, of course). After spending some time with her manuscript and reading the reviewers' comments, I concluded the answer to her question was, assuredly, "NO." The two reviewers rejected the essay for different reasons, and the editor concurred with each set. Reading the reviews like not-so-obscure tea leaves, it seemed to me one reviewer rejected the essay for reasons that we might say reduce to disciplinary politics: s/he did not like the deployment of a certain theorist, whom some folks dismiss out-of-hand for various reasons (a familiar problem for me since I started working on psychoanalysis some eight years ago). The other reviewer, much more charitable, rejected the essay for reasonable reasons that had to do with the framing and contribution of the essay, not the heart or analysis of it. At most journals these reasons would not rise to a rejection, since it seems to me my friend only needed to spend a weekend thinking about and reframing the essay to address the issue of its contribution, and then it would be good to go. I suspect, however, this second reviewer passed on the essay because of the stature of the journal, which is among the most prestigious in my field and receives more submissions a year than any editor could possibly handle.
What struck me about my friend's rejection letter was the consensus of both reviewers and the editor: the essay was very well written. All three commented on the excellent prose and clarity of her argument, which, frankly, is a rare compliment to see in manuscript reviews. It is often the case, I think, that flawed but well-written essays are invited for revision and resubmission because such writing is an indicator of scholarly competence. When I review essays for publication, I will often recommend a revision if the manuscript is well composed. Not always, of course, but often, and for the reason I mentioned (good writing indicates the author is competent enough to consider the suggestions of reviewers and revise accordingly). Whether it is because my own writing is getting better, or because the writing of others in the field is getting worse, I am increasingly noticing writing problems (basic problems with grammar, punctuation, and spelling) in manuscripts that I review for journals. I have turned down blurbing a book in press because I thought the writing---and therefore the proofreading---was bad, reasoning that I would not want my approval to be associated with so many comma splices. Sometimes an essay may have an exciting, smart idea, but I will reject the essay because the writing is so convoluted the reader has to work too hard for comprehension.
It's with the background of these thoughts on scholarly writing that I worked through last semester's batch of graduate seminar papers, only recently returning them. I know, I'm terrible (and I don't typically do well on course evaluations for the item "returns work promptly"). On average, I spent three hours with each paper, using the "track changes" editing tool in MS Word to comment on arguments as well as make corrections to grammar, spelling, punctuation, and so forth. By the time I am done commenting on a graduate paper, I know it probably looks very daunting to the author, perhaps even dispiriting. But I find myself doing it while thinking about the challenges of the blind review process when one attempts to publish. My reasoning is that writing well gets one's foot in the door; only by writing something well can an author get a reviewer to consider her argument on its own terms. Reviewers can and often do reject manuscripts for basic writing errors.
I am blogging about this topic today, however, because I struggle with "grading" as a graduate educator. I suspect a number of students, when they get papers back from me, think I am mean, or cruel, or overly harsh---and I worry about that. I worry about crushing spirits, or discouraging their enthusiasm for a particular idea or project. Of course, I remember feeling discouraged many times after getting back a paper graded by my own mentors (we joked that one of my mentors liked to return "blood on the page" because she used red pen to make comments and corrections)---but, of course, I kept at it. My hope is that however blunt my comments on the writing of graduate students are, they will also keep at it, that they will not internalize criticism as a critique of their person, or suspect that somehow I am working out my own "issues" on their papers (a temptation, indeed).
I suspect I am not alone in my worrying here, that many of you who are charged with training graduate students struggle similarly with balancing the need to provide thoughtful encouragement with the disciplinary demand to prepare students for the toils of the scholarly life. I have never liked the idea that one is rough on students to "toughen them up" for the "market," which often seemed to me a rationalization for cruelty in the field. Stories abound, in fact, about programs in my field that were notorious for a kind of brutal, "tough love" approach to graduate training. At the same time, there is an imperative, a professional responsibility I think, to make sure graduate students understand where the general bar is set for writerly competence and to help them get their chins up there.
Finally, there are two related points here. First, over the years I've learned that my particular field (Communication Studies) is better than many, perhaps most, in attending to our graduates' writing. I often get the comment, almost always from graduate students who come from cognate fields, that they have never before received the kind of "feedback" we give to graduates in my department. That is a curious thing, and I'm not sure what to make of it yet. Second, part of the worry about grading toughly is the fact that many graduate students will not go on to a life of scholarly endeavor---and many don't want to, either! Even so, it seems to me I still need to comment on graduate work as if students will continue on to a research career, as if all students should be prepared for that particular end, even though we know, increasingly, a research-heavy career is not the end (desired or existential) toward which many of our students are headed. I often preface my comments on graduate work with a statement like, "you may not want a research career; however, insofar as we are a research-based department, it's my job to prepare you accordingly."
At the very least, the paternal and maternal observations often made by new parents also apply to graduate instructors: you never really appreciate what your own advisors and mentors did for you until you start doing it for others. "Blood on the page" often hurt to see, but here, a decade after I received my Ph.D., I celebrate the blood and am thankful and admiring.
Music: Manchester Orchestra: An Imaginary Country (2011)
This time every year I find myself struggling to decide what my favorite albums of the year have been. I have to dither among dozens of very worthy contenders, and I think 2011 has been one of the best years in popular music in a very long time---at least for folks whose tastes veer toward the 80s, postpunk, and goth, or who came of age during that fateful decade. Digital technology and the increasing speed of---and ease of access to---the InterTubes have made it possible for even the ramen-eating artist and bathroom dubber to distribute and circulate artwork widely, and I discovered a great many artists with no label support whom I absolutely love. It's really difficult for me to narrow things down this year!
I decided on a few criteria for my 2011 list, if only to help me cut down on how much I would need to write! First, only those artists whom I found myself playing over and over would make the list. This criterion eliminated a good number of albums I would say are "among the best," even worthy of the top ten, including Active Child's phenomenal, angelic debut You Are All I See and St. Vincent's delightfully angular Strange Mercy. Second, the artists could not be mainstream, meaning they didn't chart in Billboard or, at the very least, would not be played on commercial radio (indie radio airplay is ok). This criterion eliminated discussing what I think is the best album of the year, without question: Adele's 21. Besides, so much has been written about Adele that touting the virtues of this album---which is a masterpiece of soul-pop---is really superfluous. It's simply fantastic, and mark my words: Adele will sweep the Grammys this year, even though you can buy her album at the coffee shop check-out line. Finally, my third criterion was that I wanted to mention top albums that many of you may not have heard of. This is the hardest criterion to meet, since the albums that satisfy my first criterion (viz., heavy rotation) tend to be well known and thus in tension with obscurity, but I thought I'd veer toward making some happy introductions for you music-fiends like me if I could.
So, armed with those three criteria, here is my top ten of 2011 and, notably, I'm going to list them in order of awesomeness, beginning with "awesome" and ascending toward "most awesome." With Adele out of the way, who will be my number one choice? Read on, fellow pop junkie.
10. self-titled by Bon Iver: Justin Vernon's recent masterpiece of creative synth-dripping folkishness led me to an obscure musical memory: back in 1989 Eric Clapton helped to score the soundtrack to the low-budget film version of Whitley Strieber's bestselling alien abduction yarn, Communion. The film isn't terrible (although the alien scenes are pretty bad), and Christopher Walken's portrayal of Strieber is nothing short of unintentional hilarity; the most memorable aspect of the film, however, is the late-night bourbon-sipping musical motifs churned out by Clapton: lonesome, somewhat melodic riffs waft above filter sweeps of synth, as if laying down the last hurrah of the sanitized sound of the 80s, all punk sensibilities finally purged. Well, Vernon's sophomore effort as Bon Iver is Clapton's soundtrack unearthed, but with his charming falsetto singing old-school Michael Stipe lyrics (that is, word-images more than statements) over the top. Now, if you don't believe me, I'm going to ruin any thought you might have had that Vernon's self-titled album is original: compare Clapton's soundtrack to Vernon's beautifully strummed "Calgary." Hear what I mean? Now, either you're delighted with this comparison (as I am) or horrified. Either way, Bon Iver is a delightful retro-statement that reclaims the worst, most sanitized synth of the 80s and turns it into something gentle and, for me, breathtaking. It's a late night album meant for winter, but perfectly crafted and, given the raging success of his first album, a welcome risk-taking by Vernon---either folks will "get it," or they'll hate it. I, for one, think it's in my top five of the year (although I sunk it to ten because it's been nominated for a Grammy---yawn, I'm channeling my inner-hipster, beyatch).
9. The King is Dead by the Decemberists: Speaking of Michael Stipe, R.E.M. officially died this past year---and about two or three albums too late. Their retreat to "rock" on their last two records spoke more of desperation than wisdom, more "burn out" than "rock out." Their late work has been just a fuzzy mess. Strange, then, that Peter Buck would pick up a guitar and join Colin Meloy and company to help produce what is arguably the best Decemberists album of their career (if not the best R.E.M. album in a decade; the opening riff of "Calamity Song" is clearly a bald nod to the Athens giants). The lyrics are stripped of arcane literary references and pared down to simple, evocative expressions of raw emotion. The arrangements are jangly and tightly crafted, and Meloy has taken the band in a much more Americana direction (harmonicas, steel guitar, accordion (!!!!), and other folksy instruments abound). What declared the decline of R.E.M. was their deliberate purge of the Byrds; what declares the Decemberists the heirs of fantastic American pop is their embrace of that Jangly Phoenix. To be sure, the British influences are still heavy (notably late-80s-era Robyn Hitchcock this time), but they're paired with a distinctly folksy roots-music sensibility, making for a delightful marriage. Meloy's singing, by the way, couldn't be stronger---full throated and soaring at times, plenty of harmonies (not characteristic of their back catalog), and just great, straightforward lyrical expressions scrubbed of the British pretension that mars much of Meloy's earlier work. This is just a fantastic album that got pushed to the upper reaches of my rankings because it charted on Billboard. Still, it's unquestionably a masterpiece. Bon Iver's homage to cheesy 80s Clapton may not be everyone's cup of tea, but any self-respecting American pop fan cannot do without this album. For serious.
8. Within and Without by Washed Out: Bedroom crooner Ernest Greene gets the homeboy vote (Perry, Georgia in da house!) for producing the most lavish laptop album of the year. Searing, sweet-guy vocals float along sweeping synth and hip-hop beats in a stand-out "chillwave" spectacular. This stuff is downbeat electronic music, the sort that's ripe for television drama soundtracks---the kind of music 2011's spate of unsigned keyboard kinglets and queenlets used to replace the mousy-chick-with-a-guitar folk blather A Fine Frenzy made famous---but it's still different and nuanced enough to merit serious listening---and enjoyment. Ok, the cover of Greene's Sub Pop debut is also a bit too precious, and I suspect many would label this music similarly. Even so, I enjoy Greene's unrestrained singing and full-frontal embrace of 80s romantic synth pop (including the harmonies layered with himself, reminiscent of Camouflage's "The Great Commandment" or When in Rome's "The Promise"). What makes this album endearing is the obvious unrestraint---dude's just "going with it," making his own music, and if we like it, so much the better. Three electronic hand-claps for this out-of-work librarian and his bedroom recording studio!
7. Black Orchid: From Airlines to Lifelines by Ascii.Disko: I discovered Daniel Gerhard Holc's incarnation as "Ascii.Disko" over a decade ago on a dance floor. At that time, Holc was fusing infectious EBM grooves with the in-vogue rash of "electro-clash" retro-trash, barking commands in French and German, a sort of crunchy-slick combination of Nitzer Ebb and Legowelt. Unlike the acts that mostly crashed when the clash became "electro-house" (whatever happened to the Chicks on Speed?), Holc continued to experiment and mold his sound, keying his keyboards to the pulse of the ever-changing underground tastes of goths, rivetheads, and self-proclaimed label-defying hipsters. Black Orchid is an unexpected, trend-bucking, quasi-dancey low-tempo love fest, with hushed and whispered vocals and echo-chamber backing woos and moans. Melodic, postpunk guitar riffs and fuzz permeate almost every track, bringing in a darker, gloomier vibe than anything Ascii.Disko has committed to disk. For Holc, this stuff is minimalist, even understated, and the guitar work is definitely new territory for him. Fans of darkwave and postpunk ambles (especially mid-era Wolfgang Press) will love this album, which, regrettably, too few have heard.
6. Common Era by Belong: I've been following this New Orleans duo for some time; their sound began in ambient-drone soundscapes (comparisons abound to Tim Hecker's work, discussed below) under the aegis of "experimental." Common Era sounds a bit like throwing Pornography-era Cure and My Bloody Valentine's entire oeuvre into a sonic blender, then playing it back underwater. There's more reverb here than you can shake a whammy bar at (reverbarama!), making it difficult to tell whether a synthesizer or a guitar is churning out the melodic drones. Unlike their debut, this album has noticeable song structures and an emphasis on the plaintively sung male vocals (which you will not be able to understand). It sounds a bit, I guess, like overly processed synths with a drum machine and sweet, boyish mumbles over the top, yet it still has a mood that is strangely moving and perfectly appropriate for late night meditations. It's music to take Codeine to, really. And if you cannot have Codeine---because, you know, opiates are yummy but addictive---a close listen to this album can at least get you half there, maybe induce some sleep. But this is a good effect---a very good effect and good ear-chomping---and the album was my constant soundtrack on airplanes in 2011.
5. Hurry Up, We're Dreaming by M83: Mon dieu! French synth-gaze maestro Anthony Gonzalez has not only found his voice, but the symphonic gesture as well. Owing to my tastes (certainly the reader has noticed a trend in this year's top ten list? Reverb ahoy!), I've been following M83's ambient soundscapes since the turn of the century, the sucker for down-tempo synth melodies that I am. Unlike a lot of his contemporaries, what Gonzalez has done over the past decade is innovate and change, never losing his signature, overly processed sounds but still ever-exploring, trying new things, and never putting out the same boozy-synth album twice. By most accounts, this double album is Gonzalez's masterpiece, the ambling drones and melodies replaced with operatic pop and---mon dieu!---his very pleasant vocals! It was a rare event on previous albums to hear Gonzalez sing; on the pop-oriented gem Saturdays=Youth he farmed out many of the vocals; here, he alternates between a whiney scream and an understated, hushed croon on a dozen or so tracks. Admittedly, when I saw this was a double album I was worried: every M83 album has been a tight and coherent whole (with perhaps the exception of the debut); each has often been shorter than one would prefer, ending too quickly and leaving you wanting more. So, would this super-long foray into more standard pop songs be too much of a good thing? NO! It's a marvelous achievement of pop sensibility, lots of layered ooohs and ahhs and harmonics that bring to mind a Beach Boys sensibility and, dare I say it? Yes: the grand, operatic sweeps of Coldplay ballads. Yes, Coldplay. But whereas we love to hate Coldplay (their last album, even with Eno's help, is dreadful), in Gonzalez's hands those big gestures of symphonic, Phil Spector slabs of sound are fun, not pretentious.
There are also a lot of unexpected surprises, like the cheesy 80s sax riffs that appear in the break-out single, "Midnight City" (strangely, they don't sound cheesy in the context of the song). And weirdly, someone at Victoria's Secret's advertising firm latched on to this album early, as it's already featured in a commercial:
(Nothing wrong with getting paid, Gonzalez. Just don't start writing music FOR commercials and you're ok.) Another break-out track is "OK Pal," a lovely pop ditty carried along by a "boo-doo-wop" loop, with power-chords and, yes, more cheesy 80s-style synth washes that, again, work for some strange reason. This album is like the roller-skating companion to Bon Iver's re-port of Clapton's soundtrack for interdimensional aliens.
4. Aurora Lies by Work Drugs: And now for something sort-of different, except that 80s cheese is still in play (sort-of). This Philly-based duo describe their sound as "sedative wave" and "smooth-fi," but most musical cats would call it what it is: yacht pop. Take two parts Hall & Oates, one part Prince (primarily the vocalics, but without the grunts), and one part Steely Dan, shake, and you have an Aurora Lies cocktail, a deliberately playful, ambling, stoner kind of groove with frequent falsetto, a groove that demands a head-nod and slow dancing and another bong hit. It's completely derivative and totally new at the same time; I've had a hard time explaining why I like this album so much---I've plum worn it out. I've become such a fan I ordered a band t-shirt, which I wear around the house and sometimes, occasionally, I give myself a nipple-pinch through it when one of their soulful, castrati choruses hits the sweet spot of my inner ear. These boys are neither on a major label nor widely distributed, so ya gotta get the album direct from the source. Get stoned. Put this on. You're welcome.
Now, for the record, I do not smoke dope. And even then, I recognize the merit of a nice, stoned groove . . . .
3. Ravedeath, 1972 by Tim Hecker: Of all the albums in my top ten this year, Hecker's Ravedeath, 1972 is the one I've listened to the most. It is an experimental/ambient album of a sustained mood, unquestionably achieved through the primary instrument woven throughout but most obvious in the first half: a pipe organ. I think it's my most played album because I can listen to it while working, trying to sleep, or wanting to relax. Some of the experimental elements (jarring, synthesized feedback) work against the purpose of relaxing; however, even when the album is at its most rowdy (part three of "In the Fog," in which the growling feedback flanges into an abrasive up-top wash), there's a subtle wave of calming organ that keeps the mood anchored in a kind of melancholic peace. The album starts quite strong in a repetitive cycle reminiscent of Philip Glass's Organ Works and moves to muted clarinet melodies and Mellotron drones. If you're into ambient music---or more adventuresome, non-cheesy new-age-ish stuff without percussion---you'll love this album, which is phenomenal, headphone bliss from beginning to end. If you're not, I feel bad for you. Go watch an action film or something.
2. Blessed by Lucinda Williams: Williams's 2010 Austin City Limits appearance was nothing short of embarrassing, and damn near unlistenable. When I caught it I was astonished by how bad it was; she was so drunk she looked like she couldn't stand up (and reports from those at the taping said it took many takes to cobble together something they could broadcast . . . although I think the final edit is still questionable). But if Van Morrison can put the booze away long enough to produce musical brilliance, I had to have the same hope for Lucinda. And she came through with an astonishing record, brilliantly penned and magnificently executed. The opener is a high-octane, Texas-boozy send-off, no question, but goddamn the woman can write and slur articulately: "Good luck finding your buttercup," sings Williams, with a fiery snarl at whoever this poor soul is (he sounds like a real loser). After a spirited opening Williams launches into a sad, steel-guitar-driven ballad about a lost lover (heart-wrenching), then back to a more up-tempo ditty, and so the whole album goes. On the whole, this is not a happy record. One of my favorite tracks, apparently about a suicide---"Seeing Black"---is about as devastating as Texas rock and roll can get, with Hammond organ swirls helping to build a slow-boiling anger as Williams chastises a friend or lover for taking the "easy way out." This is the kind of powerfully expressed, earnestly written record you put on while driving across the country for uninterrupted hours so that no one can see you tear up. Williams just poured this stuff out. It stands right up there with Car Wheels on a Gravel Road (heck, I like it much better). The initial pressing came with a bonus disk of demos which, frankly, I don't recommend you listen to---they're raw versions, missing the talented intensity of the musicians Williams assembled to create the album proper.
This album came out early in 2011 to lots of initial praise, then airplay and discussion seemed to fade. That's too bad. This album should be on a lot of top ten lists for the year, and I'm surprised that it's not. It's really, really good, coming straight from the heart and gut.
1. Skying by the Horrors: I've followed the Horrors since their 2007 debut album, Strange House, got the attention of the British press as the new goth garage band flavor of the month. I liked the album, but it's an acquired taste with more in common with The Birthday Party than the Cure (to which they were erroneously compared). The innovative organ work was the hook: it was like the stadium organ at a baseball game, but with crunchy guitars and screaming. I liked the follow-up, Primary Colours, much better; it veered into post-punk territory, much less angry and certainly less "spooky." The restraint paid off, and the shoegaze territory they were exploring (think Jesus and Mary Chain, still angry, just not as much) was promisingly pleasant. Skying is definitely the sweet spot: the guitars are still there, as is the frenetic strumming at times, but these emos have discovered Spandau Ballet and My Bloody Valentine and mixed 'em together. The result is an unquestionably British brew, but, er, happy. Faris Badwan has traded in his screaming for a surprisingly pleasant crooning, even "la la las" at times ("I Can See Through You"); Rhys Webb and Tom Cowan are now playing with Mellotrons and eighties-style sweeps (gee, I know I sound like a broken record now), and the gang has discovered the brass family. It works. It really works well. Skying is an upbeat, poppy, post-punk thing with an unflinching embrace of dream pop---pop with minor chords and drenched in reverb, but dream pop naked and glorious nonetheless. The stand-out track, which I have played over and over and over, is "Moving Further Away," which weaves a melodic synth cycle through an eight-minute epic of soaring choruses over a Stone Roses-style percussive amble (it harkens, frankly, to Happy Mondays-era drug-a-delic deliciousness). I couldn't be more delighted with a pop album (the good, asymmetrical hairstyles of Badwan and company are a bonus)---which is why Skying is my top album of 2011.
This post is about the contemporary character of electoral politics, with the recent event of the Iowa primary in mind. But it will take a number of paragraphs to get there---bear with me---and when I do it will be only in peripheral vision. As it should be.
In 1809 Goethe published his third novella, Elective Affinities, the title of which has often captured my thinking about the political process in the United States. In the novella Goethe draws on an increasingly popular theory of "chemical affinity," basically, that some chemicals combine with others based on an underlying, "natural" attraction. Goethe drew on the theory as a metaphor to explore the institution of marriage and question the role of "chemistry" in the choice of life partners. Interpretations of what Goethe believed vary, but the thread that unites each is that human relationships, while forged in the vocabulary of choice, are often irrational compounds.
Today we know there is an empirical reality to what is often termed "chemistry," and research in both the social sciences and humanities has been pushing toward materialist accounts of decision-making. Still, many of these accounts make a similar, dissatisfying assumption concerning "mate choice," "attraction," "influence," and so forth: chemicals, biology, genetics, affects, and so on come first, and then thought "rationalizes" a kind of pre-symbolic selection process, much like Goethe's novel seems to suggest. Although I tend to agree with both Goethe and Freud that we humans are motivated by desires that are largely unconscious and far removed from rational processes, I also question the imposition of a certain temporality---or chronology---that abides these observations. In the theoretical humanities, most of us have long abandoned the classic, Marxist position that the base determines the superstructure because of the crude determinism implied. What makes this position crude is not simply that it is, "in the last instance," non-dialectical (that is, that the base and superstructure are somehow separable in thought, or that there is not, at least, an interanimation between the two, a la Walter Benjamin's observations), but that the 1 --> 2 logic of causation depends on a linear notion of temporality that doesn't really capture the recursive character of thinking or the (probably) timelessness of the in-itself (whatever that is).
I don't think it is possible to think without the necessary guide of linear time, or at least not in a way that is meaningful (not much can be accomplished pragmatically, outside of a literary mode, in thinking that dwells in its own recursivity, such as in meditation or a psychedelic trip). The notions of "retrojection" or "retroaction" seem to be the better tools for capturing the way in which temporality works in thinking, even though I think a form of synchronicity or simultaneity (what Benjamin termed "now time") better captures the immanence of human agencies from a material perspective. Still, on this side of language meaning is "after the fact," always after, although I also think meaningful structures get us to the point of leaping from a fact to a retro-fact over some unconscious and material abyss often experienced as feeling. Goethe was perplexed by this abyss and sought to explore, in Elective Affinities, where the "elective" was in the kindred.
Which is, of course, a good question to ask of contemporary politics: where is the elective among the kindred?
I think the answer depends on how the question is made to mean outside of a pleasing polysemy (my favorite way to write, to the frustration of many readers). One meaning of the question is: "In our current political system, is choice truly possible?" Another form of the question is: "In a group of like-minded individuals, does choice really occur?" And another way to frame the question is: "To what extent are our political choices reasoned?" or "do we choose our political candidates on the basis of feelings and then rationalize those feelings retroactively?"
Notably, I cannot seem to make the question meaningful without recourse to a linear temporality. Such is the trap of language. Also, I recognize such questioning is not properly analytical, because terms like "choice" and "reason" smuggle into meaning those inescapable assumptions about thought as an un-feeling enterprise (the legacy of Western thought, into which most of us are inculcated). Still, in every iteration the question is a rhetorical one, since most of y'all will know how I want to answer: in our general and everyday lives, no, choice is not possible, no, choice does not occur, and yes, we feel first and think later.
But I could not write at all if I thought thinking/feeling otherwise---that which I would designate as the true possibility of "election" or choice, free thought, and therefore "the radical" properly construed---wasn't a possibility. I have been powerfully influenced by the thought of Theodor Adorno, whose hard thinking attempts to pry open cracks in the edifice of language-meaning-borg toward hope (I am often baffled by the critics of critical theory who dismiss the project as pessimistic, because to me the enterprise concerns the possible). Lately I've also been quite taken by the work of Chris Hedges, whose recent, devastating interview on C-SPAN eloquently crystallizes many of my own thoughts/feelings about contemporary politics; but even Hedges, despite his doom and gloom, promotes hope and the possibility of choice.
Both Adorno and Hedges argue that political choice in our time is an illusion or myth generated by the Matrix, the apt psy-fi metaphor for the way in which global corporations churn out feel-good myths of prosperity to distract---as opposed to obscure, which is no longer possible---from the realities of violation, exploitation, oppression, violence, and injustice (peddled here as Oprah, peddled there as neoliberalism, peddled here as Obama, peddled there as Romney, and so on). As I watch the politainment industries "track" the nomination of the next Republican candidate for the presidency, I cannot help but slide, both smugly and with pangs of sadness, into a sandpit of pessimism. It's difficult for me not to get swayed by the notion of a Leviathan that needs a beheading---that the Matrix is a behemoth brain of instrumental rationality that must be obliterated like a video-game boss at the end of Super-Contra. And while I know there is no such entity, that Jagger is right ("after all, it was you and me"), I often wonder if this kind of rhetoric, like progressive temporality, is avoidable.
Probably not; whether we dub it "culture industry" (Adorno) or "corporate governance" (Hedges), an agency must be named despite the systemicity of the trouble.
What both Adorno and Hedges share is a conviction in psychoanalysis, or rather, in a fundamental assumption of psychoanalytic thinking: people are not driven by rational choice, but motivated by desires and feelings that are wrestled into something "rational" via punditry, spin, and often as a sense of (moral) superiority (KMFDM sings: "Now is the time/to get on the right side/ you'll be godlike!"). David Hume argued similarly, although he left much more room for reason, understood as the radicality of choice: reason informs a decision, and reason makes meaning of a decision past, but either is a linguistic edge of a yawning abyss. One crosses it, says Hume, via "sentiment"---a dispositional judgment that is at its core, nevertheless, a kind of leaping.
As a scholar of rhetoric, what I've been trying to think about this past year is the character of that leaping and, if possible, of that abyss. Traditionally, I think, rhetoricians have argued their art is situated on the cliff prior (invention, arrangement, style, memory, delivery). More recently, at least in the last century, we have moved toward the retroactive cliff after the leap, often couched in terms of "the rhetorical situation" or the magical moment of contingency. Some have argued for a secret bridge: Goethe suggested chemistry, and more recently scholars in my field have wanted to investigate biology, or genetics, or other "hard wired" dimensions of the human experience that, like it or lump it, operate on the assumptions of predictability.
The assumption of predictability (and behind it, progressive temporality) seems to guide our thinking about the political and, more recently, the affective dimension of "influence." With my relatively recent interest in the political, I'm wondering what happens if we locate rhetoric at the place of the abyss, or even if it's possible to do so. My colleague Diane Davis explores this possibility in her book Inessential Solidarity, and in a way that I think potentially redefines how we think about rhetoric (Levinas inspires an examination of the "non-appropriative relation" prior to or despite the symbolic/representation). Other colleagues have been moving in this direction similarly but with different vocabularies: Chris Lundberg has been plumbing Lacan's theory of rhetoric to suggest the unconscious as the abyss (and tropology as having some scientific purchase); Barb Biesecker's work on "eventual rhetoric" goes there; and Ron Greene's Deleuzian approach to rhetoric also examines the rupture (although in a way that trades-out the subject for the apparatus).
I'm not sure where I want to go in my thinking of rhetoric and politics, but I do know this: those of us interested in the abyss or rupture or aporia on either side of judgment are (a) increasingly invested in thinking-through affect; and (b) not satisfied with accounts of the political that collapse the dimension of power into the reason/unreason or thinking/feeling binary. And speaking only for myself, it's very clear that mainstream political discourse is hopeless, or pushes toward the demise of hope---of a radical otherwise---in the name of choice and faith in an illusory prosperity. In the name of "elective" and "choice" mainstream political discourse only promotes the kindred, the intolerance of difference in the Name of the Same. Irrespective of the lesser evil we will end up "voting" for (something I wholly promote, however pessimistic my disposition, since it's the only sliver of contingency we have outside of, well, the outside), the fact remains none of us has a real "choice" in the upcoming presidential election: Romney and Obama may be discrete biological entities, but they are scripted characters from the same neoliberal-corporate-matrix-behemoth fantasy of change-via-consumption. There's a reason that, every summer blockbuster season, Hollywood films often appear in pairs with different but nevertheless homologous plots. Manufacturing is no longer industrial, but serial.
The Occupy Movement should not promote a candidate and should not adumbrate a series of demands; how long the movement can remain off-script in the ob-scene is the question of the elective among our social-ist kind.
The nature boy said that the hardest thing one will ever know is not a singular surrender, but a certain form of exchange. He didn't mean it in the way Mauss described the gift as "total prestation," or representative of a total way of communal being and relations based on a muted demand of reciprocity (he traces this back to agonistic display between tribes and so forth). But nature boy was on to something quite profound nonetheless, for implicated in the love relationship is the whole of the world of social being, a reflection of a certain cultural way of interaction. If one wishes to understand a culture, then one need look no further than the gift which, fundamentally, is a negotiation and gesture of love. Objects of exchange and display are reductions of bodies: family bodies and singular bodies, but bodies nonetheless; in mythic (and lived) schemes, these bodies historically have been those of women ("don't forget to bring your wife," or "who gives this woman?") and children.
A scented candle in a gift bag is nothing so weighty, one thinks. But were that true, Martha Stewart would not have a job.
Many of us have mastered the art of giving, but that is not knowledge (necessarily)---certainly not a consequence of learning, but an (almost) automatic response to the crying. One can give too much, especially of oneself, as in the destructions of hysterical self-effacement, or the faux-effacing sadism of sharing one's private pain (often to as large a receivership as possible).
Too often the gifts of love are offered "for our own good." Learning to accept these gifts, or at least recognize them as intended gestures of affection, is sometimes difficult, and part of that difficulty is Mauss' astute observation that there is no such thing as a free gift (or lunch), that a form of reciprocity is implied with just about any gift, however unwittingly so. Some frequent gift-givers (I do not exempt myself here) would be horrified to realize that their gestures of affection and care were at some level demands (at the very least, for recognition). Perhaps that horror is abated, somewhat, by the realization that this demand is not lodged at the level of the personal (although it may be for some) but at the level of culture. We are just emerging from "the holiday season" and every news story, for months now, has been about the retail purchase and return of "gifts." Presumably our "economy," now synonymous with U.S. culture, sits precariously balanced on the gesture of the gift. In other words, you cannot escape this "social fact": to opt out entails consequences (Grinch, Scrooge, etc.) and for many risks the possibility of love.
There are ways to navigate this, and thankfully, we still teach our children that "homemade gifts"---construction paper art, finger paintings, flowering weeds from the side of the road---are "just as good" as an iPad. I saw this message repeatedly in the mass media, and was encouraged. If you cannot opt out, one can at least participate in the cultural exchange and acknowledge the social fact in meaningful ways.
With each year's shedding it has become easier for me to accept tokens of esteem (what another would have you wear, or read, or believe), to see beyond the object of the gift to the person who bears it, to see the gift as a vehicle for something else: a relationship, cultural reproduction, necessary forms of loving.
"Thank you son," my father said when he received his gift from me on Christmas day. "You thought about this, I know you thought about it."
"I know it's not much," I said, "but I hope the fact I puzzled over what I might get you makes up for that."
"It does. Papa used to give me a twenty, and that was that. It's nice to get something you thought about." He said this at least three times during my visit, and it made me feel good.
My father's mention of his own father's gifts was a pregnant moment, to queer a metaphor, because the cross one cannot bear and the bridge one cannot cross often concerns fathers and sons. My relationship with my father is a working-through of a relationship with his father, something I've known since a very young age and, I suspect, something familiar to many of us "sons" out there. Closely related to these complexities---how I'm not quite sure---is my father's passion for guns. I have never understood that passion (I'm scared of most weapons, with perhaps the exception of knives), and I'm always careful when I open a drawer in the house not to explore or fumble with my hands unaided by the reconnaissance of the eye. As much as it troubles me to feed my father's fascination with things that kill, I gifted my father a weaponry magazine subscription and an encyclopedia of firearms for Christmas. There is, admittedly, a perverse pleasure in knowing this is a gift of which Jesus would likely not approve (unless, of course, it is the Jesus of a certain Georgian culture that I'm quite familiar with, the kind in which the Holy Bible is stored in the armrest of a Ford F150 with "monster" tires and a Confederate flag sunscreen applique right behind the gun rack).
I never knew, by the way, Gram Parsons was serious when he wrote that Jesus was just alright with him. He was a fool, not a grievous angel, not to pursue Emmylou instead of death. But I regress . . . .
There is a tall, nondescript, stone Confederate soldier with a moustache toting a rifle in the middle of the city square in Monroe, Georgia. I spent some time in Monroe over the holiday. I learned the city government gifts the community every year with a "live nativity" at the Walton County courthouse in the middle of the small, downtown area. Monroe is the type of small town that had died and then returned with a struggling but "revitalized" town square. The old hardware store is still in operation and, despite having been closed for two days, the owners left a flank of nice, handmade rocking chairs on the sidewalk. No one would think to take one unless she was visiting from out of town (yes, I thought to take one, although I would never actually do such a thing). The other vibrant business is, apparently, a tattoo parlor, which is in a small flat above what is dubbed a "Family Billiard Hall" (reminiscent, I think, of the stylings of Hooters, which my friend Rob humorously describes as a "family titty bar"). I had an inedible salad, which I ate, in a nice restaurant with a best friend near the town square on Christmas Eve, and after we said our goodbyes, I enjoyed a secret cigar as I toured the town square alone. The "nativity" scene on the courtyard struck me as an odd gift. What odd "prestation" was this?
When I came upon the manger scene, there were no live people animating it; for that I would need to come back in a few hours, after nightfall (and I would have, but most things were closed and there was nothing to do and I didn't bring a book and I am not yet brave enough to get those tattoos I've often fantasized about getting). In their stead was a series of plywood cutouts painted to resemble the New Testament nativity action figures. Mary, Joseph, and Jesus had rosy, pink flesh and wore some fetching pastel garments. The three wise men were deliberately depicted as the raced Other come to honor the white baby Jesus, one of them painted in very dark hues (the African wise man, you know), all dressed in fancy garb. I had seen such a scene thousands of times in my youth, but what puzzled me at the moment was the strange way in which a racial history was negotiated in this holy re-presentation. Rising just twenty feet behind the scene, between the makeshift hay shelter housing the plywood Christ and the steps of the courthouse, stood a thirty-foot monument dedicated to fallen Confederates who gave their lives to protect the city of Monroe and Walton County from the War of Northern Aggression. It's a common monument to see in the rural-ish south (apparently hundreds of replicas were made and dispersed to widows' groups in the early twentieth century). Standing at the corner of the square was the nativity, and rising above it, the stone soldier with rifle, and just above it, the modest edifice of the courthouse erected in 1884.
"Where do I locate the mediation?" I wondered. Is the black wise man mediating the relation between the Confederate monument and Christ? Or, was Plywood Jesus mediating the relation between the wise man and the Confederacy? Given the white Holy Family gathered in the hay to the right of the wise man, one could easily conclude the stiffs were created and painted by a white man (no doubt well intentioned and probably donated as a gift some years ago). It's also true the courtyard scene overdetermined an obvious, intended meaning: through Christ there is racial harmony. One is challenged to call this kind of gift anything other than tough love, since the economy of racial harmony is not one of exchange, but assignation. Oh, and assassination, lest we forget. It's a difficult love to accept and we are right to examine the horse's mouth very closely when the gift is to know our proper place.
Then again, isn't the gift about one's station? Prestation, indeed.
With this familiar, southern scene there is also a widely acknowledged, cultural pedagogy: this child is God's gift to humanity and you should learn to accept it and be grateful. The gift of the nativity scene is a pedagogy of receiving a gift, of knowing how to receive love. And the wise men, of course, brought gifts of their own in a proper reciprocity, following yonder star.
I would have liked to have seen the live nativity. I would have enjoyed observing how visitors reacted, or seeing if the actors moved. Would they have broken into spontaneous song? And what would they sing? And what would the racial diversity of the crowd have been?
I don't ask these questions cynically. My "reading" of the scene as a gifting-zone and site of social reproduction is clear, I think. But that does not mean folks assembled there would recognize this reading at all; I think, perhaps, they would experience what Mauss said of the exchange of gifts in general, that it is a quasi-spiritual experience in which the community refashions the image of itself.
While I was watching television with my mother and father on the last night of my visit, my mother, a ridiculously early riser, dozed off by 7:30 p.m. A story appeared on television about corporate greed, and my father began complaining about the wealthy, corporate jet set (embodied, I think, by Michael Douglas as Gekko in Wall Street). Last week I reviewed an interesting book in which the author made an intriguing (and at times very funny) analogy between the transnational corporation and a cyborg in order to explain how these entities of Capital are structured in such a way as to make individual responsibility impossible to identify. I explained the analogy to my dad, and shared the author's example of the disastrous BP oil spill in the Gulf of Mexico (it's hard to hold any one person at BP accountable because the spill was a beautiful storm of instrumental reason and hive-like decision making). "That's an interesting way to see it, and it makes a lot of sense," my father said. "But tell me, would you rather not know that you know that? Now that you are an educator, do you ever sometimes think that 'ignorance is bliss?'"
I thought about the fabled role of the Receiver of Memory and how teaching in the humanities has become the realization of the Giver. Children's literature is the new pornography, the new threat of the repressed returned. I also thought about where the ignorance was lodged: that the true state of ignorance was thinking there was someone, or a group of specific someones, to hold individually responsible. The problem, as the socialist "we" would say, is the system.
"No, I'm glad I know it," I said.
My father responded that the amount of energy it takes to think complexly about "the world" seemed exhausting, and the reward was often depressing. I thought immediately about Adorno and his insights about the culture industry revisited (or as I put it to my students, borrowing an example from my buddy Laura borrowed from The Matrix: "do you want the steak?"). Channeling Adorno in my folks' living room, however, was not a good idea at the moment.
I remembered earlier in the day my father had asked about Socrates, so I went with an example from Plato: "In one of his plays about rhetoric," I said, "Plato has Socrates win an argument with this dude about knowledge. He gets everyone to agree that knowledge is good, and even more astonishingly, that to know the good is to do the good. I don't know about that, since capitalism seems to prove the opposite is true. Anyway, as a side note, Socrates argues that it's better to suffer evil than commit it. Or something like that."
My dad thought the claim was interesting. I said I did not believe a lot of what Plato seemed to believe (though we really don't know), but I agreed with that particular ethical truth. And the observation that evil seems to entail some degree of knowledge, of knowing someone will suffer and doing it anyway.
There's really no moral to these ramblings other than the obvious one: the only true gift is the thoughtful one, and the only bad gift is the one that one gives knowing it's a bad gift. The worst gift is the one that punishes or puts others in their "place," either because it is "good for them" or what you or someone else thinks they "need."
Sitting in the middle of a buffering, hydrocodone haze, in that unfamiliar echo chamber of the muted screeches of chronic pain, this post comes. The pain is neither wicked nor the consequence of some misdeed, but rather, and complexly, a result of a genetic disposition finally flowering.
In my foot.
I'm enduring my second bout of gout in as many years, a largely genetic form of arthritis caused by a build-up of uric acid in the blood. "Gout," interestingly derived from the medieval Latin gutta meaning "drop," got its name from the not-too-far-off idea that the pain was caused by the blood depositing disease in the body's joints. The disease was thought to afflict only the wealthy (and obese), who could afford the kind of rich foods that contribute to the build-up of uric acid (red meat, shellfish, and other foods high in purines). When I was first diagnosed---after mistakenly thinking I broke my big toe---I thought somehow my diet was to blame, at that time a high protein, low-carb attempt to maintain my weight (obesity is also a family issue stretching way back on my father's side). But I've since learned from my doctors that diet is not the causal factor in most of the afflicted: it's genetically predisposed in a majority of the cases, exacerbated by diets rich in purines. I've had a relatively healthy diet since that time (mostly eat fish and chicken as my proteins these days, with a steak every four months or so) and exercise daily, but still, that doesn't mean I have avoided the affliction. Once you have an attack, you have the disease and are prone to "flare ups" for the rest of your life. My mother reports that my father used to get it pretty bad (managed well today), but I don't remember that. My grandfather had it much worse; when I was a kid, I can remember his swollen elbows and his immobile posture on the couch watching reruns of Matlock . . . .
After enduring the pain for four weeks, and after various meds, some helpful some not-so-much, I finally had enough and went in today to get a cortisone shot directly in the toe to speed the healing. Apparently the pain from this kind of injection is almost unbearable, but one has no choice but to bear it for the benefits to come in a couple of days (they give you a "bullet" to bite on). Sitting in the doctor's office today, feeling dopey and playing the new version of Bejeweled on my phone (a horribly addictive game on painkillers), I got to thinking about the experience of pain and the limits of language. We know of a rather large number of experiences that are incommunicable, love and ecstasy among them. Love inspires all kinds of words (I can write long letter after letter in swoon), and spiritual experience, much the same. These ineffable affects also motivate all kinds of positive behaviors. Pain, on the other hand, invokes one conscious desire and narrows the need for expression to one simple demand: make it go away. Or, to pull all the powerful affective experiences of life into one well-worn phrase, "Oh, for the love of god, please make it go away!"
Because what ties together most of my intellectual interests today is, more or less, the experience of the ineffable and our attempts to talk about that experience, I have predictably started thinking about "the rhetoric of pain" as a future avenue of research. I started my academic career writing and teaching about the experience and uses of popular music; for me, the "rhetoric of music" concerns the ways in which folks speak and write about the musical experience (that, in fact, is the topic of the third book I plan to write). Currently I'm working on a manuscript about cultural or collective mourning and the role of human speech in that process, and I've recently written about talk about "love," whatever that is. Pain, however, confounds a lot of my assumptions about ineffable experience, the key among them: that we want to talk about it, that we want to share our experiences of the ineffable in the proximity of its immediacy. In pain, sharing is not as much of a deep desire---one just wants it to stop. Maybe we want to talk about it later (like I am now, with the pain held at bay), but in the strange, elongated experience of pain, sharing is not even a secondary impulse.
Of course, I don't mean psychological pain (mourning, depression, anguish, regret, hurt, and so on). I mean here physiological pain. I realize, when you push the concept, distinguishing between the physiological and psychical is not so easy (and, in a sense, one of the founding challenges of psychoanalysis)---and in one's conscious life the two are often conflated. Thinking pragmatically, however, I do think the distinction is helpful because the validating discourse of medicine lends legitimacy to the physiological and has trouble contending with the psychical, which is where the body butts up against "culture." The logics of distinction between the two are fundamentally rhetorical themselves, and interesting to think about (for that is where ideology is, more or less, naked for all to see, like the emperor).
I started to think about the rhetoric of pain when I was in the hospital a few years ago with acute pain in my chest. I had a virus that attacked the lining of my heart, and until that point in my life, I had never felt such pain. It was truly breathtaking---debilitating to the point I couldn't breathe with each pang. I remember the nurse pointing to a chart on the wall with happy faces and increasingly sadder faces, asking me which face (or number, from 1-10) most accurately indexed my pain. I thought the request was absurd and, of course, I never pointed to the most pained face or highest number when I should have. In retrospect, my pain at times was a solid 12 on a 10 point scale, but I said "uh, six?" Why did I say that? What forces were at work to get me to impose a modesty on my physiological experience? What cultural filters were at work, there? What rhetoric of pain was I channeling?
Recently a dear friend had a child, and she hinted at the pain (and her husband's abject terror at seeing her in such pain). There is an experience---childbirth---that is routinely described as the most unbearable pain a (female) human being can experience. And yet, it is literally "the way of the world." Much has been written about that experience, and to some extent the pain of labor, but it's often couched in terms of the tremendous pain endurance of women.
Pain is a very curious and fecund area of rhetorical study, I think, and perhaps one place at which those in the social sciences and those in the humanities might collaborate. Those who work with numbers and those who work with words are faced with the very same impossibility: measuring and describing the experience of pain. So much cultural work has to be conducted at that intersection, the place where the body and language meet up, in pain.
Fascinating.
I'm gearing up to watch the season finale of American Horror Story tonight, which is rooted in the experience of pain. So much of the show works to suture psychological and physical pain together; Violet, the goth-ish teenager of the show, is a "cutter." The narrative of her cutting has been that the physical pain she imposes on herself concretizes the otherwise ineffable psychical pain she experiences (which would indicate it is not "depression," at least in the clinical sense, but something else). Last week's episode was about the pain of childbirth. For some characters, death is a relief from pain, but for most of the ghosts, death is merely an other-worldly preservative: it not only concretizes pain in material finality, but spiritualizes it to the point that it "haunts" the living (and each other; the television show is interesting because the ghosts are very complex characters). Pain is the place at which the living and the dead come in contact, and the locus at which the living can commune with the dead. (Such a belief, I'm reminded, is central to exorcism; one encounters the demonic via his or her own pain---it's where the demon gets in, and it's where the exorcist encounters the demons of the one s/he's exorcizing, through the exorcist's own pain.)
When you suffer chronic pain, your senses for it in other places---Others' experiences---are heightened. You start to notice the "rhetoric of pain" everywhere; you start to see it as one of the central experiences of life, all these things we do to contemplate pain, and all these things we do to make it go away. It could be argued (as Freud and others have) that the experience of pain is sometimes courted, because there is some pleasure mixed up with it. I would say, however, that this pleasure is "rhetoric" getting in there, somehow, the word lodging between the body and its representation. Words create a metaphorical distance; the death of the word---the dead word---is a strange palliative, a version of Plato's pharmakon, if you want.
Is the hydrocodone typing? One wonders. Paging Timothy Leary . . .
I am normally not one to get "addicted" to television programs, largely because I don't have any "premium" channel subscriptions like HBO. Owing to my love of (psychological) horror, however, this fall I was glued to two series: The Walking Dead and American Horror Story. The former was not very good (except for the last episode of the "first half" of this season), so I'll reserve judgment. AHS, however uneven, has been, as a whole, a delight to watch: it's very different and one of the most perverted shows I've ever seen on television. Because of travels I missed a couple of episodes; as of last night, however, I'm all caught up and ready for the season finale on Wednesday. Readers who have not kept up with the show or who want to start watching it (you can stream it on the Hulu and FX websites) should stop reading now, as there are some spoilers below.
First, let me talk about the perverted aspect, because it's an intriguing connection with haunting that I have not made before---or, at least, not in the context of televisual horror. Longtime Rosechron readers will remember that perversion, from a Lacanian vantage, refers to a peculiar subject position (a brief recap of the Lacanian take on perversion is here) in which the subject has undergone castration, thereby experiencing alienation, but has refused or is incapable of separation. The idea here is that the pervert knows very well the paternal law forbids a continued union with mama, yet still remains within the maternal zone, refusing to acknowledge "Mother" is totally other (or more technically, that she lacks something). What's so interesting about AHS is that this is the default condition of the "ghost"---the ghost is someone who has been killed (or killed herself), yet has not left the murder house. So, the "cut" from the corporeal has happened, but each ghost cannot separate, stuck in this strange liminal space that both knows "the law" is there and has been announced, but refuses to acknowledge it. There is a continuum of perversion too: some ghosts want to go but have to stay, while others seem content to haunt (Tate and Hayden---the most perverted pair). Nifty.
The show---what with the "rubber suit" logo and all that---deliberately confuses haunting with perversion, such that "horror" represents that point of "no return" when the law has been clearly transgressed (but with no "safe" word). Last night, one of the ghosts even mentioned that haunting is "perverted" . . . .
I'm wondering what you AHS watchers thought about last night's episode, the show itself, and particularly your predictions for the season finale and season two. Here's my thinking (spoiler alert spoiler alert spoiler alert):
LAST WEEK'S EPISODE: I sorta figured Vivian might die (since they seemed to be writing her "down," as it were), but I don't know in what capacity she will last now, since . . . well, she's the least sad of the bunch.
Last week we had a weird wedding of Rosemary's Baby and The Omen: the birth scene was like the RB rape scene (same blurry techniques), and Tate is sort of like the soul of the antichrist, whom Violet is told is, you know, evil (dis)incarnate.
And Constance---the campiest of all---emerges as a racist and homophobic bigot. Well, duh. She just gets better and better.
Now, if we go the Rosemary's Baby route, Ben will simply accept it and convince himself to stay with the ghost wife and raise the antichrist . . . well, I'm not exactly sure. But Ben won't kill himself until season 2---if he does at all. They've painted him as the most narcissistic character on the show, so he will be the last to go.
The show will conclude with Ben's decision to "take care" of the house and become its living protector or whatever.
FINALE PREDICTIONS: The "folk" ritual worked, but not on Zach; Teddy got it.
The stillborn child wasn't really stillborn. The ether-addicted doctor gave it to his wife; they were putting on an act or something. Probably Ben's kid, but who knows. Still, baby ain't dead (yet). Likely scenario: Vivian will get the stillborn/newborn, which will be the ghostly counterpart to the living child. Perhaps the innovation here is that the ghost baby will similarly age, mirroring the living evil baby. Maybe the ghost-baby will emerge as a sort of savior in ghost-land.
One of the twins (presumably Tate's) is the antichrist or whatever, and the Pope thread will resume in the season finale. This is where the Omen thread will pick up, Tate being the temporary body of Satan who must work through ghosts and shadows because direct contact with the living is not possible (so the living kid is Satan's son, a life for a life). The show will take a sort of religious twist, à la REC 2.
SEASON TWO PREDICTIONS: The second season will be about rearing the Antichrist/Satan in that house and about Vivian and/or Ben's indecision about whether the kid is evil or if he can be saved or whatever.
Connie Britton will get increasingly uncomfortable with how absurd the writing is going to get, and eventually they'll write her character out. I don't predict she will last past season 2.
Ben will go insane next. But he won't kill himself, since his narcissism is abject.
Constance will have yet more surprises to reveal. Of the cast, only she has let on that she knows what is going on (like when she screamed at Tate: "Do you know what you've done!"). Someone made a deal with the devil, I predict. Constance did, you know, to preserve Addy or what have you---and she is cursed, thereby, and so all her children go one by one or something. She is immortal or something---the price she has to pay for the bargain.
I'm taking a very brief break to report I head out tomorrow to the headquarters of a very, very, very large U.S. retailer to deliver a presentation on Halloween as a cultural practice. I'll share with leaders and executives what scholars know about the history of the "holiday"; however, the bulk of my focus will be on the cultural and ritual elements of celebration. Some of what we'll be discussing I cannot talk about for confidentiality reasons (not sure what at the moment, but I'll be debriefed); still, I look forward to writing about it upon my return and sharing with my readers (most of you are buddies) my observations about the experience, since this is a pretty unique opportunity. Most scholars hang out on the "reception" end of the culture industries from a critical vantage; this is a rare opportunity to get a peek at the production side.
Reviewing films, television programs, and popular texts (papers and magazines) from the last century on the holiday has been very interesting. We see, from the 1920s to the present, a number of transformations in how the holiday is celebrated, enjoyed, and experienced. Halloween used to be a kind of "independent culture" for young people to vent aggression (especially class-based aggression); it transformed post-war in the 1950s into a family-focused mediation (of adulthood and death); in the late 60s and 70s Halloween started to evolve into what I'll simply term "two cultures": the innocence-focused child-and-family culture, and the gory/sexy adult culture. Although my mind is not completely made up, it seems to me that today (based on what little empirical research there is and personal experience) these two cultures are in pretty stark tension: Halloween is increasingly becoming, on the one hand, an amplification of carnival (think Mardi Gras) for adults, and, on the other, a forceful battleground for arguments about what constitutes a family. We see the latter beginning in the 1970s and reflected, not coincidentally, in Spielberg's E.T., which chronicles the successful "one parent" family vis-à-vis Halloween. The "trunk or treat" parties hosted by family-friendly groups (e.g., churches) also reflect the reinscription of the family concept as a protectorate; in 1982 it was still a "kids left to themselves" activity. Today, not so much.
I've noticed a lot of changes in my lifetime, too. I was a rabid monster fanatic, and Halloween was always (and remains) a big holiday for me. It seems like the "adult" culture has deepened substantially as I've grown older. These pop-up stores like "Spirit" did NOT exist when I was a kid, and if they had, I would have lived in them . . . .
I'm curious---as most of you are my age---whether you've also noticed a similar transformation. Gore and sexy are on the rise, while "cute" has similarly ramped up for the child-centric culture. Halloween was always a big holiday for me, but culturally it seems to have grown in significance for the U.S. It's also becoming somewhat of a "battleground" for the culture wars---particularly in respect to "family values" or larger, socio-economic transformations of what a family "is."
Perhaps with age the rest of us can develop an organ for detail, a thing to help process the kind of observations that a born artist can whip up at a very young age (in word or image or dream). Even a facility with detailed observation, however, doesn't mean one can self-monitor with the same, "natural" or hard-won skill. This is the blindness of the artist and the nudity of the aged. One of us tends to put on more clothes, to dress more conservatively or, to echo a mentor of mine, to channel an advisor in gesture and deed.
[I put my hands to my chest, to relay a funny story, like my advisor does, and the gesture was a deliberate homage.]
Borrowed patterns are sure, even loving. Dresses made of meat are not.
Sort-of.
In our present posty predicament, there is little insight in the observation that we cannot occupy another's interior monologue without invitation. As a culture, we've learned "the Respect." You know? Yeah, you do. But I worry this kind of recognition, the respect of that non-psychical occupation, has created a strange brand of over-projection---that the more we have refrained from prying into the personal in "meat space," the more we start projecting the missing information, the information we crave, the very human information of intimacy, onto our stranger peeps in ways that fashion them into unwitting mirrors.
In other words: in Being and Nothingness, Sartre describes how the cultural fantasy of romance was invested in "the look," and paints a disturbing scenario of two hypnotists battling it out in a padded room. That much is the Hell of the Other, the ceaseless drive to "know" the mystery of the other, the stranger . . . . In postmodernity, we have sluggishly (if not cynically, though I do not yet think we are cynical, or maybe I'm prepared now to argue we have never been cynical)---we have sluggishly given up the drive to understand mystery for "mad props."
How many Facebook friends do you have? Why?
Coupled with the almost complete evaporation of secrecy as the horizon of intimacy and the increasing necessity of distortion or honest deception (e.g., social networking personas), "the Respect" is becoming almost a kind of entitled self-reflection ("in your eyes" is no longer a wedding day metaphor for the depth of conviction in a stranger, but rather, where I see myself in the pupils).
I've been reading R. Crumb's illustrated Book of Genesis, which is delightfully disturbing, so God is on the brain (a very hairy God that looks like Heston, with a beard down to his knees and who has seen every B-movie). And I think God perhaps gets the worst of it: "Thank you so much, to the Firm, for voting me in for this award; I'd also like to thank God, who makes all things possible . . . ."