As everyone has probably heard in the news earlier this week, Pharrell Williams and Robin Thicke were sued for copyright infringement because their song “Blurred Lines” borrows so heavily from Marvin Gaye’s classic hit “Got to Give It Up.” The court ruled in favor of Gaye’s family. My friend and colleague Leigh over at the ReadMoreWriteMoreThinkMoreBeMore blog quickly and adroitly responded to this event by reflecting on the conditions of popular music. She somewhat disagrees with the court’s decision and with the public shaming of Pharrell and Thicke that followed, pointing out that all pop music involves appropriation of one kind or another. Therefore, she argues, we should be focusing our attention on the felony crime of piracy rather than wagging our self-righteous fingers at the misdemeanor of petty thievery. The more serious question for the courts is how new forms of digital distribution don’t always compensate artists for their work. I encourage you to follow the link to her argument, since I don’t think I’m fully doing justice to Leigh’s nuance and sincere love of music.
And I encourage you to read her argument because I am about to disagree with it, so we’re going to have a bit of a blog-o-sphere throw down.
But before I go on the attack (as is my Scorpio nature), I want to give some respect and agree with a lot of what she’s saying. I was immediately reminded of Public Enemy’s song “Caught, Can We Get a Witness?” which begins:
Caught, now in court ’cause I stole a beat.
This is a sampling sport, but I’m giving it a new name.
What you hear is mine, PE you know the time…
Like hey, I’m on a mission, I’m talking about conditions….
And he goes on. And of course, what I like about Leigh’s blog post is that — like Chuck D — she is also talking about conditions. I would agree that piracy is the more serious issue and that appropriation is inherent in all pop music. The point is not just valid but essential. And in the case of hip hop, which is the most innovative form of appropriation and re-articulation, the conditions in the 1980s, when Public Enemy produced that song, included urban poverty, the brutality of Reaganomics, and a street music scene (so who could pay royalties anyway?)
But I think Leigh’s argument conflates different conditions and misses a key distinction. One of her examples is the many versions of “Mustang Sally,” which she cites to show how musical innovation occurs, but in that case, everyone knows they are versions of the same song. So, obviously not a case of copyright infringement. Likewise, when hip hop DJs sample, everyone knows they are sampling. That’s why they are called DJs. Hip hop is something of a feedback loop, the new song bringing renewed appreciation for — or even critical thought about — the old song. Similarly, when John Coltrane released his album My Favorite Things, what was innovative (a term I prefer over “original” to describe art) about his jazz was that it changed how we think about older songs. Such is the nature of jazz, which may confirm Leigh’s point about appropriation and innovation, except that Coltrane is open about what and how he is appropriating. As for Sam Smith’s borrowing from Tom Petty’s “I Won’t Back Down” for his song “Stay With Me,” which has been cited as a case comparable to the Thicke-Williams appropriation of Gaye, it seems to me that there is a lot of manipulation of the audio in that YouTube mash-up in order to make the case (and to my ears, the songs don’t really sound all that similar). More importantly, Smith openly acknowledged his debt and respected Petty. The important thing about all the examples I just cited is that these artists show respect and acknowledge their debt. And here we might recall Jacques Derrida’s conceptualization of “debt” for philosophy, as I have in previous blog posts about my own professor Marshall Grossman and about the Caribbean poet and philosopher Édouard Glissant. Derrida points out that we read, write, think, and be through the labor of others, and that we are partially, but always incompletely, constituted by this relation.
The case of Pharrell and Thicke is significantly different from all of Leigh’s examples, because the issue is that they didn’t acknowledge their debt, even though they well knew that they were borrowing the riffs. A key point of evidence legally speaking is that they were fully aware of what they were doing, and Pharrell, of all people, should know better. They didn’t respect the industry and the culture. They blurred the lines. We can even contrast the case of “Blurred Lines” to Pharrell’s other hit song “Get Lucky” which is somewhat derivative of Stevie Wonder’s style (though not of any song in particular.) Hence, the highlight moment of last year’s Grammys was when Pharrell and Stevie played together. Jedi Masters of music both of them, keeping it real, showing due deference and mad respect, and it was glorious. That’s how you get lucky. Not by blurring the lines.
We can broaden this theoretical conversation beyond the legal issue of copyright to think about music culture more generally. Consider the difference between Vanilla Ice and Eminem. Both white guys, borrowing riffs. What killed Vanilla Ice’s career? It wasn’t simply that he stole a beat without acknowledgement. Rather, he didn’t show deference to the black culture he was appropriating. In contrast, Eminem (like the Beastie Boys before him) is always careful to pay his respects to the culture he is working within, even as he works to radically transform what hip hop can be. And so the older generation of black artists gives him respect in return. Similarly, why are people pissed at Iggy Azalea? It’s not because she’s a blond white girl from Australia appropriating African-American culture. Contrary to what the popular media seems to think, her racial identity is not the issue. The issue is that she doesn’t acknowledge her debt or show any respect.
To get even more theoretical, we can look (again) at Michel Foucault’s famous essay, “What is an Author?” where he introduces the term discourse to talk about how the work of “Marx” goes far beyond any authorial intention or originality of Marx and becomes a discourse unto itself, within which other innovators can say and do things that are “Marxist” even if we might imagine the actual human being named Karl Marx disagreeing with those Marxists. (And since Marx tended to disagree with just about everyone around him, I think we can easily imagine that scenario.) More relevant to the case of Pharrell/Thicke/Gaye, I am reminded of an interview with Foucault included in the book Power/Knowledge where he was asked why he doesn’t cite Marx in his many writings. His response was that his work was so totally situated within the tradition of Marxist discourse that he didn’t feel the need to cite him — it would be like citing Newton about gravity. No scientist publishing a paper today would feel the need to put Newton or Einstein in their Works Cited page, because their work is foundational for the whole academic discipline. But there was something a little disingenuous about Foucault’s flip comment when the stakes of the Marxist tradition were so politically fraught at the time he was speaking. His comment in the interview doesn’t quite square with his own analysis of the author function and his conceptualization of discourse, which includes the changing conditions of copyright law, the circumstances of writing, and the institutional structures within which the writing event happens as well as the ideologies and politics of the situation.
All of these things are what Leigh calls in her blog post the “conditions” of artistic and intellectual production. However, to quote Kenny Rogers (or rather Teddy Hill and the Southern Soul, or rather the songwriter Mickey Newbury, to give credit where credit is due), since I am kinda dropping in to participate in a conversation that she initiated: “I just dropped in to see what condition my condition was in.” Did Pharrell Williams and Robin Thicke?
I wonder what it’s like to be a prominent scholar with a name that so lends itself to puns that it almost serves as evidence for his own philosophical innovation. When I say “fish” out loud in the market, one might think “dinner” (and perhaps Friday dinner if one is Catholic), but when I say “Fish” in a class on literary theory, one might think, “Oh, that guy I read for my class, Stanley Fish.” And the avid reader of the New York Times might even think, “Oh, is he the guy who writes those snarky columns about higher education and famously pisses off both the right and the left?” If you’re a reader of my blog, you may recall my own angry reaction to a column he wrote way back in 2009, so for you, Fish might signify “that guy that Steve Thomas both teaches and makes fun of at the same time.” And the fact that the word F/fish might conjure up so many different ideas serves as a somewhat silly illustration of Fish’s philosophical point that all communication happens within specific contexts that guide interpretation. The contexts are social — and actively so — and therefore he dubs them “interpretive communities.”
But how does this notion of “interpretive communities” affect how we read? I usually wouldn’t waste space on the blog-o-sphere with a discussion of Fish’s theory, but it’s a snow day and classes have been cancelled, so I’m putting a few snippets of my usual Fish lecture on the blog today.
We can — and should — begin with hilarity.
The first time I ever taught Fish’s “How to Recognize a Poem When You See One,” way back in 2006, the NY Times and other newspapers were giggling over a column published in the right-wing magazine The National Review about the “Top 50 Conservative Songs.” Since the list included songs by rock groups that were well known to be anything but conservative (e.g., The Who, The Beatles, U2, etc.), the list looked absurd, and many considered it a desperate attempt by conservatives to reconcile their ideological beliefs with their taste in music. The fallout in the press reminded many people of Ronald Reagan’s attempt to use Bruce Springsteen in his re-election campaign and the many times liberal and left-leaning rock singers have sued politicians for misusing their songs for political purposes. Democrats snickered to themselves that Republicans were obviously “bad readers” who didn’t understand the lyrics of the songs they listened to. We might jokingly title a response to The National Review, “How to Recognize a Conservative Song When Nobody Else Sees One.”
But this raises the question of how one reads a song in the first place. So how do we? For example, Springsteen’s “Born in the U.S.A.” might seem to be a patriotic song if you only listen to the refrain, which is why it has been used so often in political campaigns to rally patriotic feeling. However, the rest of the lyrics tell the sad story of a Vietnam War veteran who can’t get a job — a chilling and dark critique of American culture. The meaning of the refrain is obviously ironic (tragically so) in light of the verses, and it is the juxtaposition of the anthemic chorus with the dark lyrics that makes the poetry of the song work. One might say the same thing about John Mellencamp’s “Pink Houses,” where he sings, “Ain’t that America, home of the free, little pink houses for you and me,” the sarcastic muttering about pink houses questioning the patriotism of the previous line. Hence, if one were to perform the sort of formalist “new critical” reading of the songs that I have just performed, which focuses entirely on the literary devices that create ironic tension, one would not be reading the songs ideologically. The liberalism or conservatism of the song wouldn’t be the point. But there are other ways of reading. Those who were merely die-hard fans of the artists interpreted the songs in an entirely different way, as part of the biographical record of the artists’ lives, so that each song was read through what the fan knew of the “life and times” of the artist. Knowing the political commitments of the artists means you know the message of the song. Their response to the National Review article was simply “how dare you say that about our beloved… [whoever]… whom we know so well.”
I have questioned the ideological reading of pop songs elsewhere in a blog post a few years ago about Nicki Minaj and Fun, but my real point here is to contrast “new critical reading” that emphasizes the rules of poetry and Fish’s reading that situates them within “interpretive communities” and social contexts. Because, contrary to liberal snickering at Republicans for so totally misunderstanding rather simple rock songs, I’d like to suggest that the “Top 50 Conservative Songs” might have a point if one begins with a different interpretive framework for evaluating songs. In the wake of the New Deal and an increase in government programs for the poor, the definition of conservative would include an opposition to government, taxes, etc. Any song that repeated that political “refrain” (pun intended) was furthering the conservative agenda, and therefore within this interpretive framework, the intentions and biography of the individual artist are irrelevant and so are the subtle ironies of each song conjured up by the dialectical counterpoint between the thesis of the chorus and the antithesis of the verse. The “refrains” of the songs repeat the “refrain” of a conservative political agenda, and in the social context of a party or political rally, that’s good enough. In fact, it might be the liberal attachment to a nostalgia about 60s social movements that interferes with their reading of a specific song’s content, and perhaps they are also ignoring how most people actually experience music (i.e., they experience it in a context.)
I think it’s easy to see different modes of reading when we are talking about pop songs — modes of reading that are new critical, biographical, ideological, and socially situated — and hopefully it’s not too hard to see that similar modes of reading might influence how we read poetry in a classroom. The point of Stanley Fish, however, is not to argue a relativist position where any interpretation of the song is as good as any other. Rather, he is suggesting that each “interpretive community” uses a set of rules for evaluating the poem, and within that community, reasonable people can revise and adjust their position, so that upon hearing further evidence, one might change one’s mind, so long as the new evidence fits the rules of interpretation. For instance, one might easily change one’s mind about Mellencamp’s “Pink Houses” being a conservative song (given the sarcastic irony of the lines), but maintain that his other song “Small Town” is conservative, whatever Mellencamp’s intentions may or may not have been. It is important to remember that Fish allows for people to realize that they might have been wrong and to revise their interpretations in the face of evidence. After all, most conservatives would readily admit the folly of that list of 50 songs in the National Review but still enjoy the songs.
How does this change our own critical strategies for reading a poem? Recognizing the social context within which poems are written, read, performed, and quoted can help us listen carefully to how a poet might play with and tease that context to produce new results. If we agree that interpretive communities carry with them certain expectations and cultural norms, then we can also agree that a poet might intentionally toy with those expectations in order to produce emotional and political transformation. They might actively address and seek to affect “interpretive communities” that are complex and include multiple viewpoints. Two things, in fact, are missing from Fish’s fishy theory: (1) how the norms of any social situation may be self-contradictory and multiple, and (2) how and why change happens (change of culture, change of mind, change of feeling, etc.) As for the first, every college student knows that the expectations of being a college student are to have fun and to study hard, and that these two expectations conflict with each other, so that interpreting daily situations becomes a negotiation between such conflicting demands. Thinking about this in a larger social context, a Marxist would note the inherent ideological contradictions within the interpretive strategies of much literary criticism. As for the second, from a historical perspective, Fish never gives us an analytical tool for accounting for why the poetry of the twentieth century looks so different from that of the nineteenth or the eighteenth (and this is precisely the sort of question that motivates scholars such as Raymond Williams to look for the political, economic, and social realities that push poetry in different directions.)
Let’s come back to Fish’s argument and see how his theory might help us actually read a poem, which is something he never actually does in his essay entitled “How To Recognize a Poem When You See One.” Instead, he begins with something that’s not a poem but just a list of names on a chalkboard, and he is able to get his students to perform a reading of that list as if it were a poem. For him, this proves his point about interpretive communities (which embeds the reading of literature in social contexts — an approach that makes sense to me, since I would agree with Fish that no poem in the history of the world has ever been composed or read outside of such a context.) But what if we began with a poem that kinda-sorta isn’t a poem, except that it is? Let’s take one of the most famous “poems” in American literature, by William Carlos Williams, which, when you read it out loud, sounds more like a note that your roommate or spouse left for you on the kitchen table:
This Is Just To Say

I have eaten
the plums
that were in
the icebox

and which
you were probably
saving
for breakfast

Forgive me
they were delicious
so sweet
and so cold
The poem follows no conventions of poetic verse — no rhyme or meter, no condensation of speech into dense literary devices such as metaphor, synecdoche, allusion, or irony. Some critics have tried to impose a meaning onto the text through their own idiosyncratic interpretive lens, but in many ways the poem resists such attempts at reading a deeper meaning into it — it is hardly a poem at all, but rather a statement simply arranged on the page to look somewhat like a poem. And yet the total lack of poetic devices and the plainness of the language are surprising in their richness. What I might suggest is that what makes this poem so enjoyable is precisely that it playfully pushes us to revise our sense of why we like poetry. The poem also conjures up a somewhat idyllic domestic situation that draws attention to the hidden beauty of ordinary, middle-class life. The almost anti-poetic quality of the lines, which seems to belie our expectations for poetry, plays with a desire for simplicity. The language mirrors the content. Such desire for the plain and simple could be read in either a political context (i.e., the complexity of class conflict, considering the poem was composed in the midst of the Great Depression) or a classroom context (i.e., students’ exhaustion with the complex style of modernist poetry at the end of a semester.) My point being that, however one might criticize the poem, it is clear that one can only read and appreciate it in light of the context of literary history and the socially conditioned practice of reading poems with which it plays.
My interpretation of the poem in some ways follows the approach of Stanley Fish to attend to the social context in which a poem is read — a social context that includes a consciousness of literary history — but revises Fish’s approach somewhat to focus on the playfulness of writing and the possibility for it to deconstruct itself in a way that is transformative of the interpretive community and our culture. After all, poets are fishing for readers, and the hook that catches us may be precisely the thing that pulls us out of the static that we feel is so much a part of our social context.
Two weeks ago, somewhere up in the air, mid-flight from Sri Lanka to the Philippines, Pope Francis announced his intention to canonize Junipero Serra. You can read a transcript of his in-flight statements on the Vatican’s website. As all Californians well know, Father Serra was the Franciscan priest who established the many missions along the west coast, after which many of California’s cities are now named. The purpose of the missions was to evangelize the diverse Native American nations that densely populated the region while, at the same time, Spain conquered the territory. As a child growing up in southern California, I remember visiting the mission at San Juan Capistrano, famous for the many swallows that nest in its quaint and well-preserved colonial-era buildings. I also remember, some time around the third or fourth grade, my entire class had to build little mission models out of playdough. Serra’s legacy is quite controversial, since the European occupation of California meant enslavement, displacement, rape, and death on a mass scale for the indigenous peoples there. The Catholic Church’s official position admits the atrocities committed during the colonial era but maintains that Serra’s role was benevolent. Others argue that the missions were instruments of colonial violence. The Pope’s declaration surprised even Serra’s supporters, provoking a range of opinions, from the happy response in the Catholic News to the angry responses from Native Americans published both in the online Native American magazine Indian Country and in the Los Angeles Times.
One remark in the Los Angeles Times op-ed struck me, and in this blog I want to think a bit more on its point. The author, Karin Klein, wrote, “Because the missions mixed different Native American groups together and forced all of them to give up much of their cultural identity, many of these groups cannot meet the requirements of continuous cultural and geographical identity required to be federally recognized tribes, with the many benefits such recognition bestows.” The key question here is the politics of identity and the complex legal apparatus that has grown up around it. The Bureau of Indian Affairs insists on the distinctness of separate Indian tribes or nations and the criteria for identity that include blood quantum, but the practice of colonial violence had the opposite effect, either codifying “Indians” all together as a single racial category or attempting their total erasure through genocidal strategies. The situation for some Native Americans is a paradoxical double-bind involving contradictory policies and articulations of cultural identity (e.g., on the one hand racial, but on the other hand tribal/national.) In this context, the question of identity and the politics of Serra’s legacy are very complex.
Coincidentally, about a week or so before the Pope’s announcement, during my winter break, I happened to read two things relevant to the op-ed in the L.A. Times and the debate about Father Serra’s canonization. One is a PhD dissertation in psychology on “Tribal Enrollment, Blood Quantum, and Identity among the Confederated Salish and Kootenai Tribe of Western Montana,” written by Kimberly Nenemay, who is a member of the Salish and a practicing psychologist in New York. The other is a new work of cultural theory, Habeas Viscus: Racializing Assemblages, Biopolitics, and Black Feminist Theories of the Human, published last August by Alexander Weheliye, a professor of African American studies at Northwestern University.
For her study, Dr. Nenemay interviewed nineteen women, all tribal members and all of them mothers, about how they felt about their families, their tribal affiliation, their fluency in the Salish language, their experience on the reservation, their commitment to both Christian and Native religious practices, and ultimately their feelings about their identity. The interviews and individual stories of these women provide considerable evidence for the point made all too briefly by Klein in the L.A. Times. Significantly, while the study was taking place, the Confederated Salish and Kootenai Tribes were in the midst of an intense debate about the criteria for membership, which they put to a formal vote in January of 2003. At that time, and still now, one must demonstrate at least “one quarter blood quantum” to be legally considered a member. At issue in the debate were families who are “split” — one sibling eligible for membership while another sibling is not. Complicating the issue further was the question of inheritance, since in some cases the children of these women were not eligible for tribal membership, in spite of how active the individuals may be in the culture of the community. Since membership confers several privileges and benefits, such as the ownership of tribal land, access to health care, and participation in tribal governance, individuals often had complex thoughts and feelings not only about their own identities as tribal members but also about the identities of their children. As the study quoted the nineteen women at length and narrated their responses to a variety of questions, it richly illustrated the diversity of views that exist on the reservation and, most strikingly, the ways in which some women were thoughtful and self-conscious about holding contradictory ideas and conflicting emotions.
Some women considered “blood quantum” to be in some ways an important and in other ways an unimportant aspect of their identity; some also noted its effect on their relationships with potential husbands and how they raised their children. Almost all had experienced some form of discrimination and questioning of their identity both on and off the reservation as they navigated the various opportunities and challenges presented by their multiple identities. One example of this discrimination that came up repeatedly was the abuse and condescension from the Catholic missionaries and the white teachers at the mission schools. Complicating the question of identity still further were individuals whose ancestry may include different Native American tribes, so while they may be more than half Indian in terms of blood quantum, they are less than a quarter for any specific tribe and consequently do not qualify for membership.
In terms of what all this means both for clinical practice and for public policy, the conclusions Nenemay draws from the study are disappointingly brief and incomplete, but important and deserving of more critical thought. Psychologically, we see that identity is a complex assemblage of legal definitions about blood quantum, access to benefits, cultural practices, and family relationships, so that each individual may have a very different experience of their identity. (The term “assemblage” is mine, not Nenemay’s, but it agrees with her argument about the social construction of identity, and I will explain its implications further below.) Politically, we see the effects not only on individuals but also on the social fabric of the community, and so Nenemay suggests that, however the tribe might ultimately vote on blood quantum, they should also address the psychological and social effects of this ruling, particularly with regard to the feeling of solidarity among both members and non-members as well as to the sense of security and personal options. Historically speaking, we see the contradictions of U.S. government policy, most especially the Dawes Act of 1887. After forcing Native Americans onto reservations, this act both undermined tribal governance and opened up that land to the possibility of white settlement by insisting that plots of land be privately owned by individuals rather than owned communally — and therefore able to be bought and sold. The obvious intent of the Act was not only to encourage individuals to sell their land to white settlers but also to undermine political solidarity and create the economic conditions of land scarcity. It is this condition of scarcity (later including not just land but also health care and other government-subsidized programs) that is one of the motivating factors behind the “blood quantum” rule.
Significantly, before the Dawes Act, Native Americans understood their identity primarily in cultural terms, and it was common for tribes to adopt individuals from other tribes and communities (including individuals of European, African, and Asian descent.) But since the 1930s, the federal government has mandated that tribes define their membership in terms of blood.
It seems to me that, in a curious and significant way, the blood-quantum mandate actually correlates with the fungible-property mandate in the Dawes Act by linking blood to soil and by creating a condition that would motivate the confederated tribes to protect their limited assets by limiting membership even as individuals within the tribe might sell their land. From all of this, we can see the contradictions of American ideology and the conflicting demands of its policies. The American ideology of a universal and inclusive democracy of citizens grounded in private property is undermined by a land-grabbing territorialization of Native land that racializes “Indianness” as the antithetical “Other” of its own universal democracy and in doing so strictly defines the tribe in exclusive terms according to blood quantum. As a result, because the integrity of Native communities is threatened (literally “under siege” and in effect deterritorialized from its earlier forms of life), tribes have resorted to blood quantum and other political and cultural tactics to preserve and reterritorialize their identities in new, innovative ways.
Here I want to argue that the racializing assemblage of American policy is not simply something we can address through some sort of sensitivity to cultural difference. Rather, the implications of Nenemay’s study go far beyond what she is willing to say, because what we see between the lines of her many interviews are the ways in which there is a profoundly deep and paradoxical relation between liberal capitalism and the celebration of “cultural difference” that is the hallmark of multiculturalism. At the heart of the conundrum of “blood quantum” and tribal identity, as we see from the effects of the Dawes Act and the machinery of the Bureau of Indian Affairs, is the paradox of capitalist property relations (including the many opportunities for jobs off the reservation as well as the artificial scarcity of resources on the reservation) and the identity politics around which minority populations galvanize in their struggle to survive. In other words, the conquering nation (America) can position itself as (theoretically) inclusive (anyone can be American), but the oppressed peoples invent new forms of exclusivity as a tactic of self-defense, and as such, Native culture shifted over the course of the twentieth century from an adoptive, welcoming one to an exclusive blood-line. Beneath this dynamic, we can surmise that both the American and the Indian articulations of cultural identity are in some ways a mystification of the reality of property ownership, power relations, and the commercial expropriation of value from bodies.
Coming back to the Pope’s canonization, the question of whether Junipero Serra was a good guy or a bad guy is clearly too simplistic, but he most definitely was a cog in the violent, territorializing colonial machine and its racializing assemblage. However, his function was not simply vertical (the territorial conquest and deterritorialization of the Indians) but also horizontal (the innovation of new forms of life and community, a reterritorialization of Indian identity, and a hybridized Indian-Catholic culture.) Such horizontal connections are in some ways empowering for some Native Americans, giving them the tools to survive in the new political order, even if such connections are part of the same vertical structure that is oppressive. How might these terms give us a framework for analyzing the Pope’s decision to make Father Serra a member of the canon of saints? Normally a saint is canonized for performing miracles and introducing something new (a new “cult” or culture) to the church, but Father Serra did not perform miracles and is considered in church law to be a case of “equivalent canonization,” which means that the Pope is acknowledging in hindsight something that already exists (an enduring “cult” or culture). In Serra’s case, I can only guess, the Pope is acknowledging the long-term and lasting cultural effects of his evangelism. It is, of course, also precisely these enduring effects that many (including myself) find so troubling.
As you might have guessed already from the words that I am putting in bold italics, I want to make the case in my blog for an engagement with the concepts developed by the professor of philosophy Gilles Deleuze and the practicing psychoanalyst Felix Guattari in their work of creative philosophy, A Thousand Plateaus. This is one of the works of philosophy that Dr. Weheliye both uses and critiques in Habeas Viscus: Racializing Assemblages, Biopolitics, and Black Feminist Theories of the Human.
Weheliye’s project in Habeas Viscus is to engage in a full-frontal critique of western philosophy by taking on three of the most influential philosophers for his discipline of cultural studies: Michel Foucault, Giorgio Agamben, and the dynamic duo Deleuze and Guattari. His argument is that the work and methodologies of these philosophers have been adopted by the American academy as if they are universally applicable to all situations. As he glibly remarks, “judging from the writings of Deleuzians, once you’ve had D&G, you never go back” (referencing the well-known sexual joke that once you’ve had black, you never go back.) In contrast, the work by black feminist scholars such as Hortense Spillers and Sylvia Wynter has been viewed as “particular” (rather than universal) and as such has been academically ghettoized as pertaining only to “black studies” or “women’s studies.” Weheliye’s argument is that the “racializing assemblage” is not peripheral or secondary to the work of Foucault, Agamben, and D&G, but actually central to it. What’s more, their work ought to be critically challenged and revised through the work of Spillers and Wynter to better account for the ways in which the marginalized and oppressed have productively engaged in the sort of cultural innovation and politically alternative forms of life called for — but never fully articulated or imagined — by Foucault, Agamben, and D&G. If you want to listen to him talking about his book, click here for the podcast on the Archipelago Project.
His book is a tough read, and at this point, I don’t have time to work out the sections of his book that critique Foucault’s concept of “biopower” or Agamben’s concepts of “homo sacer” and “bare life,” so I will focus on D&G’s concept of the “assemblage” and how this all relates back to Nenemay’s study and the Pope’s decision to canonize the missionary Junipero Serra. For D&G, in chapter four of A Thousand Plateaus, which critiques structuralist linguistics, the “assemblage” (or agencement in French) is the relational arrangement of bodies, things, actions, representations, and speech in which different elements coalesce and recede in both productive and destructive ways. From my summary of Nenemay’s study, I think one can see how “identity” is actually an assemblage of legal, familial, cultural, and political structures that manifests itself in multiple ways. But merely pointing out that identity is culturally constructed through such assemblages doesn’t offer much insight into the contradictory ways in which that construction emerges and what effects it produces. D&G further analyze the working of this assemblage. Looking at the “horizontal” (or non-hierarchical and somewhat anarchic) connectivity of such assemblages and how they work in productive ways, D&G invent the phrases “machinic assemblages” (the intermingling of bodies, actions, and desires) and the “collective assemblage of enunciation” (the speech acts and representational communication produced by and yet also a part of the machinic assemblages). Their revision of structuralist linguistics posits language not as a superstructural representation of things via metaphor and metonymy but rather as one segment in a series of other segments and relations. But such assemblages are also subject to “vertical” (hierarchical) orderings and even violence, which they call territorialization, deterritorialization, and reterritorialization.
As I aimed to illustrate earlier, one could consider Father Junipero Serra’s evangelism in relation to both the vertical territorial conquest and the horizontal openings of new identities and hybrid forms of culture.
Working from the perspective of black feminism, Weheliye approaches D&G in two ways, first by observing how essential and problematic the question of race is in their work — something that is often overlooked by the academy, he argues — and second by criticizing their all-too-easy celebration of racial mixture and innovative hybrid forms of culture, which doesn’t fully account for the structures of racial violence. (As a side criticism of Weheliye, one thing I want to briefly mention about the chapter of A Thousand Plateaus in which D&G analyze the assemblage, which is also the chapter that Weheliye quotes from the most, is that the two primary examples in that chapter are the work of Kafka, a Czech-speaking Jew writing in German, and the ways in which “Black English” has completely transformed mainstream English. It is somewhat strange that Weheliye ignores these examples, since they would seem to be the perfect test cases for his critique of racializing assemblages, but perhaps he leaves those examples out deliberately because acknowledging them might risk deflating the excitement over his rhetorical gesture that his book offers something new and that D&G don’t talk as much about race as they should. Whatever his motivation, the absence is odd.) Hence, by critiquing the “machinic assemblage” concept as a metaphor that is somewhat too fast (and perhaps too sci-fi), as if segments of society can so easily attach and detach from other segments of society like a machine, Weheliye revises D&G’s machinic assemblage into what he calls a “viscous assemblage,” a term that emphasizes its slow viscosity and its fleshy aspects. The title of his book, Habeas Viscus, literally translates as “you shall have the flesh” (in contrast to the legal doctrine of habeas corpus — you shall have the body).
The flesh (viscus) bears the scar tissue that develops from the wounds of centuries of racial violence, but it also names the ways in which flesh organizes itself: for example, the ways in which the mothers interviewed in Nenemay’s study might find meaningful connection with each other and their children that are perhaps more complex and more fleshy (habeas viscus) than the territorializing terms of multiculturalism and even the study itself, which, as a series of interviews with individuals rather than groups, individualized the subjects, deterritorialized them from their subcultural connectivity, and insisted on a territorial mapping of social relations in terms of multicultural identity.
If I had time, I would now try to offer a conclusion or say something clever about a novel or a poem by a Native American, perhaps Sherman Alexie’s famous novel, Reservation Blues, or the critically acclaimed movie based on Alexie’s writings, Smoke Signals (considering that the Spokane Tribe that Alexie belongs to and writes about is also Salish speaking, like Nenemay’s tribe), but my blog post is already a lot longer and more convoluted than I intended it to be, so I will stop here.
Earlier this week, the college where I teach honored me and one of my colleagues with “diversity action” awards in recognition of our contributions to campus life. My contributions this past year included, among other things, mentoring the black women student organization, helping to resolve a racially sensitive controversy on campus, and inviting guest speakers of Jamaican and Oromo ancestry to engage our students, not to mention, of course, the content of my scholarship and teaching. My colleague’s contribution was to raise awareness about the diversity of disability and multiple forms of intelligence by involving his students in collaborative projects with individuals with disability. I am humbled by the recognition, as I’m sure my colleague is too, but at the same time, it provoked me to think more deeply about the meaning of the word “diversity” and whether or not that’s how I conceptualize my aspirations. Given that I fancy myself a provocateur, wielding my weapons of sarcasm, irony, and dialectical reasoning to get my students to think outside the box, I came up with some other words to help me imagine an alternative to diversity, such as alterity and adversity, and even made up a few words that don’t actually exist, such as obversity and subversity. These made-up words might be translated as challenging, counterpoint, oppositional, dissenting, or insurrectionary.
When did the word “diversity” become so much a part of American culture that it became the unquestioned ideal — so unquestioned that it is now an essential component of college curricula throughout the country (the “D” requirement)? It certainly wasn’t the case in 1980 when Congress was still debating whether to celebrate Martin Luther King, Jr. day and when most colleges required students to take not just one, but two, classes on the so-called “classics” of “western civilization.” A radical transformation took place in the 1990s, and consequently now most colleges require some form of “diversity” in their curricula instead of the old “western civ.” At the same time, corporations such as General Mills and Target began making “diversity” part of their corporate missions. Similarly, the Republican Party’s campaign strategy that once upon a time focused on what it called the “silent majority” (code for “white men”) under Presidents Nixon and Reagan changed significantly under President George W. Bush to become more ethnically “diverse,” especially with his famous attempt at a campaign speech in Spanish. Perhaps the most extreme example of what I’m talking about is something I just heard on the radio yesterday morning, that even some Ku Klux Klan members have recently begun re-branding their organization as “diverse” (according to this story on NPR.) Listening to the radio story, one can’t help but hold one’s chin in hand and wonder, if the Klan is no longer “racist,” then who is?… Is it not totally bizarre that two such different organizations as a liberal arts college and a white-supremacist organization might appropriate the same word to describe their aspirations? Language is a funny thing, as the meaning of words sometimes changes, used metaphorically to apply to so many different phenomena that they almost seem to lose meaning altogether.
But words matter. As teachers and college administrators, we use them to organize our classes. Museum curators use them to organize their exhibits. Corporations use them to organize their advertising as well as their hiring practices. Politicians use them to organize their campaigns. If diversity is the name of the game, then one can expect an effort to reach out to different… um… different what? A long list of identity categories, both the sort of identities that I might proudly give to myself as well as the identities that others might maliciously or ignorantly give to me.
To illustrate my point, let’s take something really ordinary and commonplace such as the phrase that’s on the back of our dollar bills, e pluribus unum (translation: “out of many, one.”) The phrase might be interpreted by some to indicate that Americans have always celebrated their diversity, but this would be false. The phrase originally indicated the unity (not the diversity) of several states (the original 13 colonies) coming together to achieve an independent federal government with a centralized bank that could issue the paper money upon which the slogan is printed. As everyone knows from their high school history class, the unity of the “unum” in the e pluribus unum and the banking system that it subsequently unleashed was intensely debated in the Federalist Papers and the Anti-Federalist Papers. But since then, it came to be used to mean something else, something vaguely about who and what America is. Some have interpreted the e pluribus unum through the metaphor of the melting pot, which is a curious phrase, often applied so widely that the metaphorical sense of “melting” is scarcely noted. After all, it takes considerable heat to melt something. The origin of this well-known phrase is chapter three of the American literary classic, Letters from an American Farmer, written in 1782 by a French traveler to America, J. Hector St. John de Crevecoeur, who imagined how the American experience would melt the (metaphorical) metal of various European ancestries into a new alloy in which the flaws of European culture are burned away. Over a century later, in 1908, Israel Zangwill wrote the play The Melting Pot, which imagines all European immigrants giving up their old cultures and becoming completely Americanized. There is something a little violent in all this melting.
What I think is fascinating is that people often use the e pluribus unum and melting pot metaphor for diversity when in fact this metaphor actually meant the exact opposite. The opposite of diversity, after all, is unity and sameness. And in the melting pot, all difference is melted away to transform individuals into a new man with a common destiny. Hence, in response to the “melting pot” metaphor, the “salad bowl” metaphor was coined to imagine ways that everyone can keep their unique cultural identities but still be American or something, whatever the pot or bowl is assumed to be. For instance, if we take the pot or bowl to be the “university” where I work, then what assumptions do we have about the “uni” that unites the diversity of its students and its curricula?
What is apparent in the e pluribus unum formulation is a rather ironic dialectic of opposite meanings. On the one hand, it means celebrating our differences (you’re OK, I’m OK, hip hip hooray). On the other hand, it means abandoning what makes us different and embracing what makes us the same. If we question “diversity” by drawing attention to this dialectic, does the term begin to unravel as a contradictory ideology that says one thing (a celebration of difference) but actually, underneath, implies its opposite (an assumption of sameness, or, at least, the assumption of a patriotic nationalism)? What seems to enable a celebration of diversity is the belief deep down that our differences ultimately don’t matter (materially speaking) quite as much as the sameness that binds us (whether we like it or not) to our situation. For example, whether you are black or white, Christian or Buddhist, you still have to pay your taxes and buy your bread with money issued by the Federal Reserve Bank (rather than by bartering, for instance.)
Point being, the assertion of “the thesis of diversity” implies its own antithesis (or obverse). That’s the dialectic. In other words, when Walt Whitman hears America singing about all of our infinite multiplicity (to cite a famous poem), it’s possible that his poetic verses of universal diversity signify also its problematic obverse within a universe filled with adversity in such a way that might reverse the subversion originally intended by his revolutionary verses against (or versus) an oppressive and limited world…. Or so thought Langston Hughes in the subtle reversals that he made through his allusion to (or re-versification of) Whitman’s verse.
Can we think of other words to use? Perhaps alterity rather than diversity? When we think of cultural difference through the conceptual lens of alterity rather than the conceptual lens of diversity, we begin to imagine alternatives to the status quo and changing things for the better (revolution, transformation, transfiguration, or simply an openness to experience and to difference) — the goal being a better way to be.
What if we organized an award or a class curriculum or a museum around adversity rather than diversity… or around alterity rather than identity?
Beneath the dialectic of diversity and unity (or difference versus sameness) that structures the ideology of our educational system, perhaps we can think beyond the Freudian “id” of identity and its symbolic language of diversity. How can we subvert the hegemonic symbolic order that dominates our imagination in a way that reveals the true beauty of our reality? As we critically reflect on our roles as teachers and students, perhaps we can begin to imagine ourselves studying at a “subversity” rather than at a nominally diverse university.
The first provost of the subversity in my imagined historical drama might be the “sub-sub-librarian” who begins Herman Melville’s novel Moby Dick.
This past Wednesday, the Los Angeles Times interviewed anthropologist and author of Pink Globalization: Hello Kitty’s Trek Across the Pacific, Christine Yano, on the occasion of a new Hello Kitty exhibition at the Japanese American National Museum. In that interview, Dr. Yano remarked that the creator of Hello Kitty, Sanrio, was emphatic that Hello Kitty is not a cat; rather, she is a girl named Kitty White…. Who knew?!?!…. The next day, the blog-o-sphere was full of attempts at witty rejoinders such as the Huffington Post’s “Hello Kitty Is Not a Cat because Nothing Makes Sense Anymore” and Jezebel’s “Wait, Hello Kitty Isn’t a Cat?” And then the following days, there quickly appeared some humorous responses to those initial responses, such as The New Yorker magazine’s “The Truth about Hello Kitty,” Kotaku East’s “Don’t Be Silly, Hello Kitty Is a Cat” and a cartoon riffing off the famous “Ceci n’est pas une pipe” (This is not a pipe) painting by surrealist artist Rene Magritte that inspired a playful meditative essay by the philosopher Michel Foucault on the nature of language and representation.
But I guess I have a different take on the truth of Hello Kitty’s cattiness. It’s not just about anthropomorphism, personification, fetish, totems, etc., etc. These literary and anthropological terms certainly indicate what Hello Kitty is (the relative truth of her identity), but they don’t really get into what Hello Kitty does — the how and why of the happening, the adorable cuddles, the group-hug of commodified cute, and the truth of becoming a body without organs through the animal.
Here’s the thing: Kitty White has no mouth, and yet her existence is bilingual.
She is both Hello Kitty and haro kiti, an English girl represented by English words within a Japanese world. Her face, resembling a blank wall with two black holes for eyes, seems to utterly lack emotion or expression and yet at the same time also seems to be pure affect, pure emotional attachment. It is nostalgia for a memory of childhood that never existed, a childhood before mouths, before feet, before hands, and most of all before identity and definition. Paradoxically, it is through the identity of the cat that one achieves a blissful non-identity, transcending the Japanese-ness of the product and the English-ness of the backstory through the truth of Hello Kitty’s reproducibility and potentiality of desire. Children never want just one Hello Kitty product. They want a whole pack of them, and a whole pack of friends who have them too.
Hello Kitty was created in 1974, the same year my sister (who adored Hello Kitty when she was a child) was born, just one year after Foucault published This Is Not a Pipe, and just two years after Gilles Deleuze and Felix Guattari first theorized the body-without-organs in their critique of psychoanalysis, Anti-Oedipus. And we can easily imagine Foucault, Deleuze, and Guattari drinking wine together, watching their children, and taking a delight in Hello Kitty as a playful image that undoes its own meaning — neither the cat that reveals the cattiness of the human nor the human that reveals the humanity of the cat, but the body without organs that resists signification and opens up the possibility for other, global attachments, ways of configuring our desire, becoming part of a capitalist machinery. In their later book, A Thousand Plateaus, Deleuze and Guattari describe the body without organs as a becoming-animal, as the human experiments with the possibility of what a body can do not by animalizing their humanity or humanizing the animal, but by becoming an abstract machine with the potential to become… whatever. They also describe the most elementary drawing of a face as a “white wall/black hole system,” and anyone who has ever doodled knows this to be true. Hello Kitty’s face is not a form of Kitty White’s subjectivity or true, inner self-hood (whether a cat or a girl), but rather, just like Magritte’s pipe, it is an abstract signifying machine whose significance is its own redundancy.
Obviously, Hello Kitty is a machine.
After posting this blog post, I began doing my laundry, and doing my laundry always gets me to thinking about what I was doing before I started doing the laundry, and at that moment, I realized that I had forgotten to include in my blog this bit of evidence of the machinic assemblage of desire that is Hello Kitty and other cat-people (e.g., the DC comic book character Catwoman or the Japanese manga character Doraemon, the robot cat) — the recent popularity of Shark Cat.
Yesterday, the ubiquitous “Book of Faces” (FB) and other social media were all atwitter over something the well-known author and theorist bell hooks said at a three-day event about race, gender, and body-image hosted by The New School in New York — is everyone’s favorite pop star diva Beyoncé a terrorist?… A terrorist?!?!… Bloggers from wannabe-hipster gossip sites such as Gawker to fashion magazines such as Elle to feminist sites such as Jezebel immediately jumped on the bandwagon, probably hoping that the provocative headline would gain for them that ever-so-ethereal cyber audience.
Before I had a chance to actually read the blog-o-story, when I first saw the headline, I wrongly assumed that bell hooks was giving Beyoncé a compliment, as I immediately thought of Beyoncé’s video: “Run the World (Girls)“:
One might think of other videos of hot feminist badaaasssery such as Beyoncé’s “Superpower” and M.I.A.’s “Bad Girls” which all present images of women kicking ass and taking names, Pam-Grier style. So, this is what I thought bell hooks was talking about, and I was all, like, “hellz yeah!!!”
But then I started to read the articles, and apparently that’s not at all what bell hooks meant (oops, my bad!) Rather, she was raising questions about Beyoncé’s style of feminism and her tactical deployment of a hyper-sexualized body as potentially damaging to the self-image of young girls (specifically in reference to Time Magazine’s featuring her in a bathing suit on the cover of their issue about the world’s 100 most influential people.) In other words, what I think bell hooks was referring to is the very real “terror” that young girls feel when confronted with bullying from their peers regarding the way they look. I did not finish reading any of the blog posts because they were all so shallow and mean-spirited, and they all reminded me of the SNL skit about the government Beygency that hunts down a poor schmuck for committing the party foul of admitting he didn’t love everything about Beyoncé:
Can I admit that I don’t love everything about her either? But I do think she’s pretty awesome. And I also think bell hooks is awesome. So, given that I think both women are awesome, instead of reading the blog-o-crap any further, I skipped ahead to the video of the actual conversation (the “actual” always being so often remarkably different from what bloggers and journalists say to get a rise out of their audiences), and my wife and I had a very enjoyable evening listening to four awesome women — bell hooks, Janet Mock, Shola Lynch, and Marci Blackman — discuss among themselves and with a very engaged and intelligent audience a variety of complex and personal perspectives about gender, sexuality, race, violence, body image, and “what a body can do” (as the philosopher Judith Butler famously put it, in contrast to the essentializing pigeon-holing approach of traditional philosophy that asked “what a body is.”) You can all watch it too by clicking [here] or below:
The conversation among themselves and with the audience was full of humor and mutual respect as well as serious critical thinking, concern for the well-being of others, and deep personal involvement as they worked toward imagining an alternative to the sort of media imagery that objectifies black women’s bodies and presents impossible standards of beauty. They discussed the movie 12 Years a Slave, SNL comedy, and — most importantly — the work of the panelists, such as the movie Free Angela and All the Political Prisoners by Shola Lynch, the memoir Redefining Realness by Janet Mock, and the novel Po’ Man’s Child by Marci Blackman as well as their personal experiences as politically active, queer, transgender, and black women. I was so impressed that I began to imagine my teaching an entire course syllabus around this one panel event. One woman in the audience was in tears sharing her own experience as well as her gratitude to the panel for the support and safe space fostered by the event. Noticeably, in contrast to the blog-o-sphere and social media, not a single person in the audience during the 50 minutes of Q&A seemed the least bit concerned by bell hooks’s comment about Beyoncé. So, considering how her image and her style of feminism was discussed and debated in complex, thoughtful ways among the four panelists and the audience, it is interesting how the safe, supportive space of the panel discussion was transformed into bitchy nonsense by social media.
But beyond the confines of that singular event and its commodification by the blog-o-sphere, the question of Beyoncé’s feminism interests me, in part because as a teacher I find her extremely useful in the classroom for drawing students into debates about feminism, challenging their stereotypes about feminists as man-hating ugly women, and pushing students to think about why they enjoy what they enjoy. For example, because this semester I was teaching the famous Nigerian novelist Chimamanda Ngozi Adichie, I played for my class Adichie’s lecture “We Should All Be Feminists” as well as Beyoncé’s song “Flawless” that samples a full minute from that lecture (practically one fourth of the song):
The conversation among my students last month was similar to the one brought up by bell hooks, Janet Mock, and one of the members of the audience: how do we negotiate the positive work we see Beyoncé doing for feminism and women’s empowerment and the negative commodification of her body and the debilitatingly impossible standard of beauty it presents to young girls? The “Flawless” music video, it seems to me — rather than being naive or unaware of this dialectic of opposing ideas — very deliberately and self-consciously puts this dialectic in play for us to work out. The song and the video put the question back on us, for us to imagine ways to be authentically awesome.
But what do you all think? Thoughts?
Two things happened simultaneously on May 1st, both involving the U.S. State Department and its relation to Ethiopia. Thing one was the State Department’s news program, Voice of America, broadcasting its brief account of Ethiopian security forces firing upon student demonstrations the previous day (April 30) at three universities, resulting in 17 dead and many wounded. Thing two was the Secretary of State John Kerry in Ethiopia giving a speech full of praise for Ethiopia’s rapid economic development as well as the U.S.-Ethiopia partnership in addressing the violence against civilians in neighboring Sudan and Somalia. Apparently, Kerry was unaware that the day before, just a two-hour drive down the road from where he was speaking, America’s supposed partner, the Ethiopian government, had committed acts of violence against its citizens. In fact, thousands of individuals at universities and in cities across the Oromia region of Ethiopia had been protesting for days, and as the journalist Mohammed Ademo’s article for Think Africa pointed out on Tuesday (April 29), what they were protesting was precisely the consequences of the rapid economic development and foreign direct investment that Kerry praised in his speech — the eviction and displacement of tenant farmers and poor people due to the expansion of the capital city Addis Ababa into the Oromia region.
We might observe a contradiction here within the same State Department. While the State Department’s news program laments an event and clearly points to the root cause, the State Department’s secretary appears ignorant of the event and also strangely unable to discern the causes of ethnic unrest across Africa. An Al Jazeera op-ed responding to Kerry’s speech suggests that the United States fails to see the contradiction in its policy that talks about democracy and human rights but in practice emphasizes security for foreign direct investment (as per the State Department’s own report on such investment in Ethiopia published shortly before Kerry’s visit.) Noticeably, two contradictory ideas are coming out of the State Department simultaneously. What do we make of that contradiction?
Before I answer that question, I might add on to this strange state of affairs by pointing out that Kerry did criticize the Ethiopian government for using repressive tactics against its journalists — the famous Zone 9 bloggers — but what strikes me is that at the very moment that Kerry criticizes the state of journalism in Ethiopia, the mainstream American news outlets such as CNN, National Public Radio, and the NY Times have for a long time neglected to give any serious coverage of the issues within Ethiopia and in fact did not report on the student demonstrations. The only American media mention of the recent student demonstrations and deaths is a very brief Associated Press article that appeared the day after Kerry’s speech (May 2) and that article embarrassingly gets its facts wrong about what happened and why. Such poor journalism is increasingly perceived to be the norm of America’s once celebrated media whose many factual inaccuracies and lack of any genuine will to truth arguably contributed to the Iraq War back in 2003. Curiously, the only news organization in America that did its job (the VOA) is the news organization intended to serve communities outside of America. Moreover, the VOA is part of the very same “department” that Kerry heads. The quality of mainstream American media coverage might seem excusable if it weren’t for the fact that BBC covered these tragic events in Ethiopia reasonably well, first on its radio program immediately after the massacre (May 1st) and then more comprehensively on its website the following day. You can listen to the radio report below:
Since this is a theory blog, I want to advance some philosophical hypotheses that we can draw from the U.S. State Department’s conflicted relationship to this tragedy. We might excuse Kerry for just being diplomatic at a diplomatic event, except that this blind spot seems more pervasive and systemic both in U.S. foreign policy and its coverage by its most respected news outlets, CNN, NY Times, and National Public Radio. Rather, what is happening here is a globalized twenty-first century version of what the theorist Stuart Hall, borrowing from Antonio Gramsci, called the “manufacture of consent” — that process whereby, rather than through coercive force, hegemony is achieved through cultural means that slyly convince individuals of their place in the world order. In this case, the role of Voice of America is to provide peoples all over the world (e.g., in Ethiopia) with access to the sort of “free journalism” (or at least, dissenting journalism) that doesn’t exist in their own country. In many ways, Americans should be genuinely proud that the VOA is what it is — relatively free journalism in over fifty languages that is independent of the organization that funds it and can publish dissenting views. Hence, through the VOA, the American government promises a sort of deferred liberalism (democracy and human rights) that the people within countries like Ethiopia aspire to even though at the same time the actual practice of the American government contributes to the very repressive regime that stifles its own civil society. This is why the American government media (e.g., NPR and VOA) sometimes appears more “free” than the private corporate media which is so beholden to the corporate agenda of its stockholders — ironically the same corporate agenda that the State Department is in some ways also beholden to.
Similarly, one could speculate that the rise of global philanthropists and civic organizations staffed by young graduates of peace-studies programs at American liberal arts colleges presents a face of American liberalism that unintentionally undermines the very governments of the peoples they purport to help. This is why such philanthropy and civil society have suggestively been called by some cultural theorists the “Trojan Horse” of global capitalism, just as Christianity was once called the Trojan Horse of European imperialism.
But Gramsci and Hall’s theory of hegemony appears unable to fully account for the pervasive and ubiquitous violence and unrest that we see not only in Ethiopia but all over the world. In fact, ethnic conflict and violent expressions of dissent seem to have become the norm, so I might suggest we revise their theory to point to a “manufacture of dissent” in which such dissent and perpetual crisis are ironically the unintended consequences of America’s global hegemony. Moreover, such local dissent ironically functions in a way that maintains the hegemony of global financial institutions (as was pointed out in Hardt and Negri’s famous book, Empire). We might find the more recent philosophy of Giorgio Agamben on the “state of exception” somewhat useful here. Agamben focuses on how the normal functioning of law depends on an extra-legal assertion that establishes the law. A simple and innocent example of this might be somebody writing on the chalkboard in a classroom, “Do not write on the chalkboard.” The initial violation of the rule establishes the rule, and in this sense the rule’s own violation of itself is internal to the structure of the rule but externalized as its opposite. A more terrifying example of when the state of exception becomes the norm (rather than the exception) would be concentration camps, refugee camps, and “indefinite detention centers” (like that at Guantanamo Bay), in which individuals live in a non-legal state that is organized by the state itself in the name of preserving law and rights.
In this more global context of foreign direct investment, international human rights organizations, and ethnic groups fighting over increasingly scarce land dominated by multinational capital, we have a peculiar state of exception — a new and somewhat strange space between competing jurisdictions and competing senses of sovereignty. One example of this state of exception is the troubled border between an Addis Ababa rapidly developing from foreign direct investment secured by global interests and the Oromia State that supposedly represents (constitutionally) the “people” who live in it. Hence, the local ethnic conflict that Americans may imagine to be “outside” their own democratic system is actually structurally inside it; this explains the strange disconnect between the two offices of the U.S. State Department. My point here is not to play the blame game and look for a guilty party that we can then chastise and hold responsible. Rather, my point is (with Agamben) that noticing the structural gaps and impasses endemic to the system may enable the various constituencies to better address each other and work towards more ethical solutions.
There is more to say here, and admittedly, I am afraid my theorizing is a bit hard to follow. I am writing fast in order to participate in the very emotional conversation that has emerged about the issue over the past two days and so I have not yet fully worked out what I want to say. I will try to revisit this topic later when I have more time to reflect and work out what I like and dislike about Agamben’s theory. So I hope my readers will generously comment and help me think through this and figure out a way to express it more clearly.
Yesterday around lunchtime, I took a break from reading academic books about eighteenth- and nineteenth-century culture at one of my favorite places in New York, the New York Public Library, so that I could attend the March against Monsanto that was about to begin in Bryant Park, the lovely and popular little public park behind the library. This march was actually the second such protest to take place all over America and across the world; the first was on May 25. If you want to see some photos and YouTube clips of this worldwide protest, click [here]. I attended the march for a few reasons: one being simply that it was near where I already was, another being that I support most of its goals, and last but not least, because it closely relates to the book that all the first-year students at my college were asked to read over the summer, Raj Patel’s Stuffed and Starved: The Hidden Battle for the World Food System — a book that most of my students told me they found a little bit difficult and convoluted and therefore a lot boring. In my view, it’s an important and interesting book, so I’m hoping here to make that clear. The main idea of both the march and the book is that our food system is being monopolized by corporate interests in ways that are unhealthy both for the consumer and for the producer. Examples of this problem are the obesity epidemic as well as the high rates of suicide among small farmers struggling to maintain their farms in countries such as India. The march focused on the issue of genetically modified organisms (GMOs) that Monsanto creates and actively lobbies governments to promote in their countries’ agriculture.
Inexplicably, this worldwide movement has not been covered by the New York Times. It is hard to imagine why the Times doesn’t cover it, since it seems to me more interesting and more relevant to people’s lives than the article about the spending habits of a Catholic bishop in Germany or the article about the dentist whose clients sometimes pay her in works of art. In my opinion, something that takes place in more than 500 cities around the world at exactly the same time deserves at least a mention. We could speculate that American journalists are so focused on the supposed conflict between the Democrats and Republicans (e.g., the government shutdown) that it doesn’t occur to them to notice that most Americans have political viewpoints and ideas that are neither Democrat nor Republican.
One might raise the question of whether this march was in fact a failure, since the point of such marches is precisely to make the public aware of important issues by organizing an event that will attract media attention. So, since this event did not attract media attention, was it a failure, and if so, why? We might shift the blame to the newspapers themselves and accuse them of not wanting to upset the corporations that advertise in them, and I would agree it is important for the reading public to be critically aware of this possibility. Since the march was covered by the alternative media, such as the newly formed Al Jazeera America, this may be a reasonable suspicion, though difficult to prove. Or maybe Americans are so focused on the Tea Party opposition to President Obama that they fail to notice the opposition to Obama from the other side of the political spectrum, the so-called “left.” Or maybe the journalists mistakenly thought the march was part of the Comic Convention, since both featured people dressed up in costumes, hahaha. However, in this case, I also wonder about the self-presentation of the march itself. As I listened to the speeches, the march seemed to bring together a diverse array of concerns, including healthier food in public school cafeterias, the right of us consumers to know what we are eating, the long history of Monsanto’s dangerous and illegal business practices, and even a more spiritually fulfilling relationship to our food. The one thing uniting these diverse agendas was simply the evil of Monsanto, which served as something of a synecdoche for the world’s problems.
In a sense, the rather long list of various interests and feelings, as well as the hatred of Monsanto, somewhat obscured the two important issues that are actually before our government right now. The first issue is one that has received very little media attention even though it may revolutionize the world economy — something called the Trans-Pacific Partnership (or TPP) that has been under negotiation among countries from Japan to Chile since 2008. Proponents of the TPP argue that it would boost economic growth by encouraging trade, but critics argue that it would give power to large corporations and undermine any government’s ability to protect its labor force, the environment, and the health and safety of its food supply. Considering that President Obama has been both actively promoting the TPP and keeping the details of the agreement a secret, this could be one of those strange issues about which both the right-wing Tea Party and the left-wing Green Party and socialist parties could actually find common cause. Obama was hoping to fast-track this bill through Congress this month and thus avoid any substantive public debate (a hope that may have been derailed by the government shutdown, though I don’t know). My guess is that the planners of the march settled on mid-October long ago precisely to bring attention to an issue that they predicted would be rammed through Congress (little suspecting how dysfunctional Congress would be). The second issue is a more local affair: the bill currently before the New York state legislature requiring all GMO food to be labelled for consumers.
My own observation, just from listening to the speeches, looking at the signs, and also noticing how students responded to Raj Patel’s book, is that the emotional energy and rhetoric revolved around the rights of the consumer and some vague notion of authentic and pure food. In other words, the vague feeling is that GMO food is bad because it is not natural. Some speeches argued that we have a “right to know” what is in our food, thus calling attention to the fact that few of us actually have a clue what we are eating most of the time (despite labeling and the efforts of the Food and Drug Administration). The problem with this opposition of “real food” versus GMO food is that a lot of food that is genetically manipulated is not bad for us. Farmers have for centuries cross-bred plants and livestock. Thus the problem is not simply GMOs; rather, it’s the unsafe and aggressive manner in which Monsanto forces small farmers to use its products.
Don’t get me wrong here. As someone who just taught Upton Sinclair’s famous novel The Jungle, published in 1906, which inspired President Teddy Roosevelt to sign the Pure Food and Drug Act into law that same year, I certainly care about the role of the FDA and support the regulation of our food supply to ensure that it is healthy and safe. However, Sinclair’s novel was also about the plight of immigrants in Chicago at the turn of the twentieth century and about the exploitation of labor and the monopolization of food production by corporations. It is a somewhat well-known irony among teachers of literature that Sinclair’s intention was so totally misread. In other words, what people noticed in his novel were the long descriptions of the meat-processing factories, which were quite gross, and not the long descriptions of the oppression of workers. The book hence inspired the government to regulate the processing of meat to make it safe for consumers, but it did not (as Sinclair actually hoped it would) inspire the government to protect workers. As Sinclair himself joked, “I aimed at the public’s heart, and by accident I hit it in the stomach.”
I suspect the same thing is happening now that happened with The Jungle. The economy and the long-term effects of trade policy such as the TPP are hard to understand. Likewise, the argument of Raj Patel’s book is complex in drawing a connection between obesity in the United States, starvation in India, and migration from Mexico. Ultimately, Patel’s argument is about the political power of multinational corporations, which undermines the ability of farmers to make smart decisions and the ability of local communities to do what they think is in their best interests — and about how this affects all of us in various ways. However, what many students take away from this book, and what many of the protestors yesterday were focusing on, was some vague, nostalgic attachment to “real” food and some vague idea that we consumers should be able to get “real” food.
The law before the New York legislature right now is precisely the sort of law that focuses on the consumer — the supposed right to know what we are eating. At the rally, the proponents of the law argued that once we have GMO labels on our food, then the public will realize what they are eating and begin to buy non-GMO food, and this would so hurt Monsanto’s profit margin that… hmm… honestly, it wasn’t really clear to me what would be the outcome. I can’t imagine that Monsanto and the global food industry would be hurt so much that they’d change their business model. As the journalist Naomi Klein observed in her famous book, No Logo, such are the limits of political activism that focuses on the rights of the consumer rather than the means of production. Such also are the limits of political activism that focuses so intently on the evils of a single corporation that symbolically represents all that is wrong with the world rather than the trade policy that allows many such corporations to thrive. From the perspective of a literature professor such as myself, both the March against Monsanto and the bill against GMO food have a narrative that is full of symbols and what psychoanalysis calls “displacements” whereby complex political content is reduced to simpler emotional content.
Might the march have been more successful if it had focused on the actual issues — either the worldwide concerns about the TPP or the local GMO-labeling legislation, or (since they are related and timely) both?
Note: all the photographs in this post were taken by me, but I deliberately selected certain photos and cropped them so that there would be no faces. My intent is to protect individuals who might not want their face on the internet without their permission (especially considering the politically controversial stakes of the march.) An unintended consequence may be that readers of this blog will get the wrong impression that the march was a bunch of people in funny costumes, but actually, for the most part, it was a large crowd of ordinary people of diverse backgrounds, ethnicities, and ages.
Almost every week since 2011, American news corporations have reported on the non-violent grassroots democratic movements in Egypt and Tunisia and the violent, U.S.-supported movements in Syria and Libya — the so-called “Arab Spring.” However, the conditions for a viable democracy in Ethiopia are almost never reported, and even in those few reports about Ethiopia, such as this one, what remains missing is any account of the religious, ethnic, and ideological complexities of that country and the changing, multifaceted history of that region. In other words, what remains missing is precisely the information one might need to really understand what is happening. How do we understand human rights and democracy? I’d like to begin with the photograph here, taken on Thursday, August 8th, which quickly circulated on various forms of social media and was eventually posted on Al Jazeera last night along with some earlier photographs and Twitter feeds.
The picture is of a young man in the capital city of Addis Ababa, confronting Ethiopian police non-violently by kneeling in prayer before them. Some conversation began on Facebook and Twitter about the symbolic meaning of the photo, and what I’d like to suggest to the readers of my blog is that, for many Americans, the way “democracy” in other countries is understood is largely through images such as this one. It is worth thinking about such images because they often take on a symbolic significance that may be emotionally moving but also may obscure many of the political details and actual functioning of democratic social movements.
But before I continue with my questions about how we understand the images that come to symbolize democratic ideals and social movements, I should provide some context for the photograph. Last week, as the month of fasting for Ramadan came to a close and the feast day of Eid al-Fitr was celebrated across the world, Muslims in Ethiopia were protesting the government’s closing of some mosques and its arrest of Muslim community organizers and journalists. The Ethiopian government’s heavy-handed responses to those protests in various towns across the country and in the capital city of Addis Ababa left many dead and more injured. The government’s position is that these are violent Muslim extremists; against this view, the Muslim community organizers argue that they represent the moderate form of Islam that has existed in Ethiopia for over a thousand years and that their movement, which started in 2011, is non-violent. On Thursday, August 8th, in support of the Muslim protesters, Amnesty International filed this complaint against the Ethiopian government for human rights violations. Muslims make up about one third of the population of Ethiopia, but the state government has been dominated by Orthodox Christians since the incorporation of Muslim territory at the end of the nineteenth century. The entire history is a long one, and considering that the protest movement started about two years ago, I don’t want to dwell on all the details in this blog post; you can read or hear more about the past week’s conflict by following these links to OPride, BBC Africa, Reuters, and a United Nations brief. One frustrating thing is that the places where you won’t hear anything about these events are the major sources of information in the United States: the New York Times and National Public Radio.
Coincidentally, exactly when this conflict started in the Oromia region of Ethiopia, I was listening to Oromo intellectuals at the Oromo Studies Association conference at Howard University in Washington DC who were engaging in a debate about the complex historical relationship between religious organizations (namely Islam and protestant Christianity), cultural self-determination, and democratic movements. One of my students and I were at that conference to give presentations on a panel about international education, media and film along with OPride‘s editor and the Oromo-language journalist for Voice of America.
So, drawing on what I learned at that conference and what I had already learned before going to it, we can deepen the context for this single photo, going so far as to suggest a context of a thousand-year history of political involvement from Turkey, Portugal, England, France, Italy, the United States, and most recently Saudi Arabia, China, and India. The cultural divisions in Ethiopia are not merely religious but also ethnic, and this is complicated because the largest ethnic group in Ethiopia, the Oromo, are a mix of Christian, Muslim, and older indigenous religious practices. Earlier this year, on June 25, Al Jazeera became the first global television news network to focus on these issues in a segment that you can watch here. But there are other factors to consider, too, not mentioned in that Al Jazeera segment. From the 1960s to the early 1990s, both Christian and Islamic religious institutions participated with other organizations in broad-based revolutionary democratic movements that eventually led to the revolutions of 1974 and 1991, but since the 1990s, new forms of Christianity and Islam have emerged that claim to be fundamentalist but whose funding and ideology seem to come from outside the country. We might also consider that for almost a century Ethiopian law has prohibited religious practices (such as burial and marriage) that do not fall under the jurisdiction of sanctioned Christian or Muslim institutions (e.g., the Oromo’s traditional Waaqeffannaa), and these new forms of fundamentalism (not only Christian and Muslim fundamentalisms, but also western neoliberal fundamentalism) appear to be suppressing some of the older forms of ethnic culture that predate the adoption of the world religions, including older forms that give women some important kinds of agency in their communities (e.g., addoyyee and siiqqee).
So, now that I’ve summarized that context, let’s return to the photo. The non-violent gesture of the man engaging in “salat” (prayer) seems to have stopped the police officers. The image might remind us of other champions of non-violent action such as Mahatma Gandhi and Martin Luther King, Jr., who argued for the effectiveness of moral persuasion through non-violent action that exposes the hypocrisy of a ruling regime whose excessive use of force undermines the legitimacy of the state. The action of this man engaging in salat is not passive but firmly active non-violent practice. Noticeably, however, other forms of non-violent protest (e.g., marches and assemblies) did not have the same effect on the police. Two things seem special about this photo: first, that it is an act of prayer, and second, that it is a solitary individual putting his body at risk. This does two things. First, there is a bias in western media that tends to read Islamic practice and liberal human rights in opposition to each other, and indeed the Ethiopian government’s rhetoric to the outside world seems to deliberately capitalize on that bias in order to discredit its political opponents. But for Muslim Oromos living in the United States, Australia, and elsewhere, the meaning of this photo would seem to suggest that liberal human rights and Islamic practice can function together. Second, it foregrounds the decision of an individual to put himself at risk for the greater good, rather than a group identity or a mobilized mob. It creates a hero.
Thinking theoretically, and reflecting on this interesting question about the structural relationship between the practices of Islam and the idea of human rights, might all of this illustrate the anthropologist Arjun Appadurai’s inquiry into the nature of globalization? In his book Modernity at Large, he argues that various ethnoscapes, technoscapes, mediascapes, financescapes, and ideoscapes all play a role in social formations and local cultures — sometimes functioning together, but sometimes functioning in contradiction to each other. These global “scapes” are in tense dialectic with the local (i.e., the actual lived experience and social organization of communities.) My presentation at the Oromo Studies Association conference alluded to Appadurai’s theory to argue that today’s international education is very much enmeshed in these different “scapes.” In the case of the photo that is the subject of this blog, we see the ethnic identity of Oromos, the practice of Islam, the ideology of human rights, and the technologies of social media. The photo might seem to fuse these various “scapes” into a singular image that celebrates a global sense of local freedom.
However, what we do not see in this symbolic image, of course, is the economics, and this includes the distribution of wealth and Ethiopia’s GDP that Jawar Mohammed emphasizes in the interview with Al Jazeera, but also the daily labor of individuals that Dr. Ezekiel Gebissa talks about in his book on coffee and khat production, as well as the speculative labor of financial institutions (what Appadurai calls financescapes), and even more basically the home-making of families. What do we make of this absence? Might it be important for how we read the effectiveness of symbolic images that come to represent such ideologically loaded concepts of freedom and democracy for American consumers of media?
We might compare this image to another one, the famous photograph from the Tiananmen Square demonstrations in 1989, when a single individual stood in front of a column of military tanks and stopped their advance.
In fact, Oromos on social media (e.g., here) have explicitly compared the recent event in Addis Ababa in 2013 with that event in Beijing in 1989, and it is precisely the making of such comparisons between different movements that is the point of my blog post today, because in the media these images can become filtered through a western ideology of human rights that may not be fully attentive to some of the local cultural practices and understandings of what was happening. For instance, the American and European media all understood the Tiananmen Square demonstration to be a pro-democratic, anti-communist demonstration. What the media failed to appreciate is that communism and democracy are not inherently antithetical, and that one could protest the government for other reasons. An important book by Wang Hui, a participant in the Tiananmen demonstration, published by Harvard University Press in 2003 under the title China’s New Order, reveals just how incorrectly the western media understood this event when they filtered it through the global ideoscape of human rights and democracy. Wang Hui outlines the variety of economic and social issues that concerned the Chinese people and the demonstrators, and shows how these issues did not neatly fit under a single ideological perspective. Importantly, many of the demonstrators were not protesting communism; what they were actually protesting were the capitalist reforms, the opening of relations to American and European capital markets, and the “financescapes” being dictated by the government, which were causing forms of economic displacement (e.g., worsening working conditions) and general uncertainty. In other words, the movement was in some ways actually a conservative one, exactly the opposite of what the western media assumed.
So, what lessons do we learn from Arjun Appadurai and Wang Hui’s inquiries into the nature of democratic practice in a globalized world order? What further questions might we raise about this photograph of a man kneeling in prayer before police in riot gear? How might we untangle the tangled relationship between the Islamic practice of salat, the local demands of various religious and ethnic institutions, and the international ideology of human rights and non-violent political practice that the photograph seems to symbolically fuse?
One of Appadurai’s points in using the terms “ethnoscape” and “ideoscape” instead of the more ordinary terms “ethnic group” and “ideology” is that the neologistic “scape” alerts us to the ways that the meaning of ideas changes depending on context. For instance, African American civil rights activists in the 1960s, the U.S. government in the 1980s, and leaders of the democracy movement in Tunisia today might all use the same ideas of freedom, democracy, and human rights but mean slightly different things by them. Gandhi’s practice of non-violence is connected to a Hindu tradition, whereas Martin Luther King, Jr.’s is connected to a Christian one. Scholars of the civil rights movement in America have long expressed frustration about the way Martin Luther King, Jr.’s political message has been watered down in the popular media and high school history textbooks and grafted onto the ideology of American patriotism. Likewise, the Ethiopian government’s branding of opposition groups as “terrorists” appropriates the inflammatory rhetoric of U.S. president George W. Bush a decade ago, but does so for its own ends, and when Oromos speak of genocide and ethnic cleansing, they are using legal terms formulated by the United Nations in the context of the Jewish Holocaust in ways that may or may not differ slightly from the way a UN legal team would use them. Hence, we are dealing not with ideologies but with ideoscapes, whose very signifying power is supposed to be part of a universal language that everyone in the world can understand but is actually quite local and context-specific. Similarly, just as ideas are not pure and stable concepts, ethnicity is not a pure identity based merely on territory or authentic culture, because the lived experience of ethnicity and cultural practices has a dialectical relation to the global transformations and movements of peoples due to financial speculation, colonialism, etc.
For instance, a little over a century ago, the Oromo were a rather diffuse ethnicity of many tribes, kingdoms, religious practices, and dialects, who were forced to unify as a singular political liberation movement only after their rights and their land were threatened by a newly formed Ethiopian imperial state and global capitalism. Notably, an ethnic group’s right to self-determination is usually argued in terminology borrowed from the European Enlightenment’s discourse on “rights” but applied to local cultures that may have a different language for talking about such things. During the conference, one Oromo feminist community organizer said she preferred to think of women’s empowerment in terms of “social balance” and traditional Oromo culture rather than in terms of “rights” and western ideas. Hence, the lived experience of “ethnicity” changes depending on context and also depending on the ethnoscape’s relation to other “scapes.”
And so, in the case of this photo, we might need to think harder about what human rights and non-violent protest really mean in the context of Islamic practices within Ethiopia that are themselves undergoing a transformation due to various global forces such as the competing ideoscapes of religious fundamentalism and liberalism and also such as the ways in which finance capital transforms territory, the use of land, and a community’s access to natural resources such as water.
It happens almost like clockwork at the end of the spring semester and the beginning of the fall: the New York Times publishes another blog post lamenting something about college English education, usually by someone who only peripherally knows what they are talking about. One of my all-time favorites was Stanley Fish complaining about what was wrong with freshman composition classes based on offhand comments he overheard in the hall, and most recently Verlyn Klinkenborg sheds tears over the so-called “Decline and Fall of the English Major.” Since these are opinion pieces, they aren’t required to cite any actual data, but since they parade the veneer of insider expertise, their bitter commentary becomes somewhat dangerous as their misinformation is repeated and magnified so much as to almost seem like actual fact. Fortunately, professor-by-day, superhero-by-night Michael Bérubé is on the scene to correct this misinformation by citing actual numerical data in a recent op-ed in the Chronicle of Higher Education, “The Humanities, Declining? Not According to the Numbers.” Unlike Klinkenborg, who only teaches a few classes here and there as a “writer in residence” at various colleges (what might be analogous to having a guest-worker visa), former MLA president Michael Bérubé has access to real data. (Actually, all of us have access to this data; it’s just that journalists aren’t always motivated enough to go look at it.)
I pretty much agree with everything Bérubé says here, and love his article, but there are some important things missing — things important enough that I believe attending to them will change the conversation entirely, as you may have guessed from the title of my post. The upshot of Bérubé’s piece is that actually the number of English majors has neither increased nor decreased significantly since the 1970s. He notices that usually the lamentations carry with them an attack on theoretically rigorous scholarship and express some nostalgia for the olden days when we all knew which lines of poetry to quote at cocktail parties. He suggests that the real issue is neither the usefulness of the English major for the job market nor the quality of instruction (all of which are doing just fine); rather, the real issue is funding for education, the casualization of the labor force by hiring more adjunct instructors, etc. And of course, the constant specious attacks on our profession by the media and politicians aren’t so nice either.
So, what do I have to add to this conversation? Four things.
Thing one is the rise of new interdisciplinary programs. There was a useful study done by the MLA published way back in 2003 about the declining number of English majors. Anyone can see it, though apparently reading something more than a page long is too much work for NY Times bloggers. The strength of this study is that it surveys all colleges and universities (not just a few, as journalists tend to do), and it shows the trends for each and every year (not just the years that support some sort of dramatic conclusion that the journalists prefer to cite.) I’m guessing Bérubé got his information from this, but what he doesn’t mention are some of its findings. There are many, but the one I want to draw attention to is the observation of what students started majoring in instead of English. The assumption by many is that they have shifted to business, but although the business major has seen some increase, even more significant was the emergence of entirely new interdisciplinary programs: environmental studies, peace studies, gender studies, ethnic studies, film and media, and most importantly, communications. These programs are never mentioned in the journalistic lamentations, and they are important, because people who once upon a time might have become humanities majors are now opting for these programs — programs that often combine social sciences, humanities, and sometimes even a little technology or hard science. Historically, it’s not surprising that the creation of these programs happens at about the same time as the number of English majors declines. Moreover, they affect different schools differently, depending on the size of the school. In my view, these are all great programs, and what’s more, they are programs that the English department is often involved in and supports, or even, in some cases, leads.
And this is one reason why I title this blog post “The Rise and Change of the English Major”: often it is the literature professors who have taken leadership roles in creating these new and innovative programs that then later affect the constitution of the English major itself. One of the enduring challenges for English department faculty is how to maintain the traditional major, with all the timeless classics and literary history, while at the same time incorporating these new programs. It is not uncommon to find professors who have dual appointments in English and something else. In my view, English faculty need to be engaged and take leadership roles in interdisciplinary programs, but they also need to be clear about the expertise they bring.
There’s a lot more to say about this, but for the sake of keeping this blog post short, I will move on to thing two, and thing two is the fundamental importance of communication skills, analytical skills, and critical thinking skills for employers today. These skills were identified in a report by the National Association of Colleges and Employers about what employers want to see in college graduates. Moreover, corporations have been very clear that the kinds of things taught in business departments do not foster much critical thinking and writing, which is why a broad-based liberal arts program is important (see [here] and [here], for instance.) Point being, the English major has become more essential to colleges and universities than ever, and this is in part because of the interdisciplinary nature of the English department that I mentioned in the preceding paragraph. For instance, some schools even require their majors in other subjects to take business writing, tech writing, or something along those lines taught in the English department. Although the traditionalist may lament the fact that students aren’t walking around quoting Shakespeare and Keats on a regular basis (did they ever?), corporations may be happy to have a job candidate who has had the experience of working through the complexity of a poem, because this sort of exercise carries with it a lot of transferable skills such as careful reading and original thought. And the inherent use value of English classes is another reason why I mention the rise and change of the English major.
The recent and often cited Report by the Commission on the Humanities and Social Sciences clearly states the value of a liberal arts curriculum to employers for precisely the reasons I just stated. English majors have the skills that employers want. The report also mentions other important things such as cultural and civic awareness, which are doubly important if one considers the rapid increase in jobs that have to do with civic engagement, social responsibility, and intercultural issues. What is curious is that when NY Times pundits such as Klinkenborg cite this report, they actually say the opposite of what the report says. Klinkenborg says that the humanities faces declining enrollment because students think they can’t get a job with an English major. Actually, the report asserts the value of the humanities for employers, but notes the problem of funding and support for programs. In other words, it’s a political issue. Duh.
Related to thing two is thing three, and thing three is the growth of writing centers. Because of the skills identified by employers, writing centers have gained a prominent role in the college. Often they have close ties to career centers, since both help students prepare their job application materials, and often they have close ties to English departments. The relationship to the English department differs depending on the school, ranging from being directly run by the department to simply having a lot of English majors on staff as tutors. The history of writing centers is long and complex, and recently I’ve done a little reading about them, including such books as Neal Lerner’s The Idea of the Writing Laboratory and Michael Pemberton and Joyce Kinkead’s The Center Will Hold: Critical Perspectives on Writing Center Scholarship, so I am reluctant to give a simplified history. But one of the upshots is that the role of writing centers grew considerably after the 1970s, in part because a greater number of Americans were now attending college, and more importantly because a greater number of these college students spoke English as their second language or were first-generation college students. This means two things. First, far from the “decline” that the journalists moan about, English departments are actually more important, because they play a role in supporting the whole school. And second, the English department has changed a bit, because scholarship on the teaching of writing has developed considerably since the creation of various journals on writing and writing centers, and much of this growth has tended away from the poetic and toward cultural analysis.
Thing four is the new emphasis in schools on “global citizenship,” leadership, cultural sensitivity, etc., and noticeably, these are all skills that employers value, too. They are also skills that English departments excel at. After all, English is where postcolonial studies was invented, way back in the 1970s, to better address the new global situation of newly independent African, Asian, and Caribbean countries. Before college administrations became aware of the “global,” literature departments were already there.
So, in conclusion, my argument is that we have not seen a decline in English, but rather an expansion and a change. Good changes in my opinion, though growing pains are always par for the course. The challenge is how to make our case to administrations and the general public who don’t always seem to understand the importance of English departments or even understand what it is that we do.
Why is this? Why are English departments the discipline that the media loves to cry about, and why are English departments uniquely misunderstood? Now I move from the practical and the factual into the realm of theory. What’s also interesting to me is the way English professors are represented in Hollywood cinema, either as Shakespeare-quoting, bow-tie-wearing, obsessive, anti-social freaks or as lazy, lecherous alcoholics who sleep with their students. I’m not saying such colorful characters do not exist at all in real life, but they are the exception, not the rule. One possible explanation for all this misunderstanding and media hype is that, unlike with other disciplines, everyone thinks they know something about our discipline. I remember going to the doctor because I was sick and finding myself listening, through a haze of fever and congestion, to the doctor tell me about his favorite books and his view of literature. I was waiting for him to talk to me about my health, but he never did. I can’t imagine the opposite case of me lecturing the doctor about epidemiology if he came to my office to ask my advice about books. The fact is, most people don’t continue to study algebra, chemistry, and sociology after college, but they do continue to read books and even have strong feelings about them. Hence, history and English professors are often in the awkward position of talking to someone who thinks they know as much as we do about what we do, a position rarely experienced by the chemical engineer. This difference creates a psychological tension. Possibly the pure fantasy and irrational fear that we sometimes notice in the rhetoric about English departments in the mainstream media or in the speeches of politicians are an effect of this uncanny difference.
Note too the clearly contradictory nature of the lamentations about English. The same individual might complain first that English departments need to return to the classics by dead white males and stop teaching all this new-fangled theory and politically correct stuff, and then proceed to complain that English departments need to become more relevant to the “real world” (i.e., jobs and whatnot.) That these two desires contradict each other often goes unnoticed. That the English department has for a long time actually been doing both of those things — both the classics and the real-world stuff — and continues to do both also goes unnoticed. Sigh.
Still another disconnect is the strange notion that because English professors study metaphor and rhetoric, they must somehow be silly lovers of fanciful idioms rather than practical realists. To my way of thinking, it seems obvious that someone good at analyzing the use of metaphors and symbols would be expert at cutting through bullshit and seeing the facts for what they are. For instance, Bérubé’s article is a perfect example of such skills (as is, I hope, my own blog), in which he cites actual statistics and wonders why journalists keep repeating factually unsupported narratives. An English major would likewise quickly see through the rhetoric of those NY Times celebrity bloggers who seem to follow a rhetorical formula — the author relaying some cute anecdote that is supposed to make him sound like he knows what he is talking about and then coming to all sorts of unsupported conclusions. Columns by the NY Times superstar pundit Thomas Friedman are typical in this regard. Reading Friedman on the economy is like reading someone who recounts a nice time he had rowing a boat on the pond and then launches into opinions about the chemical composition of various plants he saw there, as if the one experience gave him the expertise for the analysis. There is a formal consistency to these op-ed pieces that is rather amusing and isn’t too hard to analyze.
However, I wonder if the fact of the growing importance and expansion of English for employers and colleges might be, paradoxically, the reason why they receive the sort of critical attention in the media. English departments are monstrous and scary — freakishly adaptable — the skills they teach lend themselves to almost every other discipline, since all disciplines require some sort of critical thinking and culturally situated communication. We are monstrous, and that is our strength.