Earlier this week, the college where I teach honored me and one of my colleagues with “diversity action” awards in recognition of our contributions to campus life. My contributions this past year included, among other things, mentoring the black women’s student organization, helping to resolve a racially sensitive controversy on campus, and inviting guest speakers of Jamaican and Oromo ancestry to engage our students, not to mention, of course, the content of my scholarship and teaching. My colleague’s contribution was to raise awareness about the diversity of disability and multiple forms of intelligence by involving his students in collaborative projects with individuals with disabilities. I am humbled by the recognition, as I’m sure my colleague is too, but at the same time, it provoked me to think more deeply about the meaning of the word “diversity” and whether or not that’s how I conceptualize my aspirations. Given that I fancy myself a provocateur, wielding my weapons of sarcasm, irony, and dialectical reasoning to get my students to think outside the box, I came up with some other words to help me imagine an alternative to diversity, such as alterity and adversity, and even made up a few words that don’t actually exist, such as obversity and subversity. These made-up words might be translated as challenging, counterpoint, oppositional, dissenting, or insurrectionary.
When did the word “diversity” become so much a part of American culture that it became the unquestioned ideal — so unquestioned that it is now an essential component of college curricula throughout the country (the “D” requirement)? It certainly wasn’t the case in 1980, when Congress was still debating whether to celebrate Martin Luther King, Jr. Day and when most colleges required students to take not just one, but two, classes on the so-called “classics” of “western civilization.” A radical transformation took place in the 1990s, and consequently most colleges now require some form of “diversity” in their curricula instead of the old “western civ.” At the same time, corporations such as General Mills and Target began making “diversity” part of their corporate missions. Similarly, the Republican Party’s campaign strategy, which once upon a time focused on what it called the “silent majority” (code for “white men”) under Presidents Nixon and Reagan, changed significantly under President George W. Bush to become more ethnically “diverse,” especially with his famous attempt at a campaign speech in Spanish. Perhaps the most extreme example of what I’m talking about is something I just heard on the radio yesterday morning: even some Ku Klux Klan members have recently begun re-branding their organization as “diverse” (according to this story on NPR). Listening to the radio story, one can’t help but hold one’s chin in hand and wonder: if the Klan is no longer “racist,” then who is?… Is it not totally bizarre that two such different organizations as a liberal arts college and a white-supremacist organization might appropriate the same word to describe their aspirations? Language is a funny thing: the meanings of words sometimes change, stretched metaphorically to cover so many different phenomena that they almost seem to lose meaning altogether.
But words matter. As teachers and college administrators, we use them to organize our classes. Museum curators use them to organize their exhibits. Corporations use them to organize their advertising as well as their hiring practices. Politicians use them to organize their campaigns. If diversity is the name of the game, then one can expect an effort to reach out to different… um… different what? A long list of identity categories, both the sort of identities that I might proudly give to myself and the identities that others might maliciously or ignorantly give to me.
To illustrate my point, let’s take something really ordinary and commonplace, such as the phrase on the back of our dollar bills, e pluribus unum (translation: “out of many, one”). The phrase might be interpreted by some to indicate that Americans have always celebrated their diversity, but this would be false. The phrase originally indicated the unity (not the diversity) of several states (the original 13 colonies) coming together to achieve an independent federal government with a centralized bank that could issue the paper money upon which the slogan is printed. As everyone knows from their high school history class, the unity of the “unum” in e pluribus unum and the banking system that it subsequently unleashed were intensely debated in the Federalist Papers and the Anti-Federalist Papers. But since then, the phrase came to be used to mean something else, something vaguely about who and what America is. Some have interpreted e pluribus unum through the metaphor of the melting pot, which is a curious phrase, often applied so widely that the metaphorical sense of “melting” is scarcely noted. After all, it takes considerable heat to melt something. The origin of this well-known phrase is chapter three of the American literary classic Letters from an American Farmer, written in 1782 by a French traveler to America, J. Hector St. John de Crèvecoeur, who imagined how the American experience would melt the (metaphorical) metal of various European ancestries into a new alloy in which the flaws of European culture are burned away. Over a century later, in 1908, Israel Zangwill wrote the play The Melting Pot, which imagines all European immigrants giving up their old cultures and becoming completely Americanized. There is something a little violent in all this melting.
What I think is fascinating is that people often use the e pluribus unum and melting pot metaphor for diversity when in fact this metaphor actually meant the exact opposite. The opposite of diversity, after all, is unity and sameness. And in the melting pot, all difference is melted away to transform individuals into a new man with a common destiny. Hence, in response to the “melting pot” metaphor, the “salad bowl” metaphor was coined to imagine ways that everyone can keep their unique cultural identities but still be American or something, whatever the pot or bowl is assumed to be. For instance, if we take the pot or bowl to be the “university” where I work, then what assumptions do we have about the “uni” that unites the diversity of its students and its curricula?
What is apparent in the e pluribus unum formulation is a rather ironic dialectic of opposite meanings. On the one hand, it means celebrating our differences (you’re OK, I’m OK, hip hip hooray). On the other hand, it means abandoning what makes us different and embracing what makes us the same. If we question “diversity” by drawing attention to this dialectic, does the term begin to unravel as a contradictory ideology that says one thing (a celebration of difference) but actually, underneath, implies its opposite (an assumption of sameness, or, at least, the assumption of a patriotic nationalism)? What seems to enable a celebration of diversity is the belief deep down that our differences ultimately don’t matter (materially speaking) quite as much as the sameness that binds us (whether we like it or not) to our situation. For example, whether you are black or white, Christian or Buddhist, you still have to pay your taxes and buy your bread with money issued by the Federal Reserve Bank (rather than by bartering, for instance.)
Point being, the assertion of “the thesis of diversity” implies its own antithesis (or obverse). That’s the dialectic. In other words, when Walt Whitman hears America singing about all of our infinite multiplicity (to cite a famous poem), it’s possible that his poetic verses of universal diversity signify also its problematic obverse within a universe filled with adversity in such a way that might reverse the subversion originally intended by his revolutionary verses against (or versus) an oppressive and limited world…. Or so thought Langston Hughes in the subtle reversals that he made through his allusion to (or re-versification of) Whitman’s verse.
Can we think of other words to use? Perhaps alterity rather than diversity? When we think of cultural difference through the conceptual lens of alterity rather than the conceptual lens of diversity, we begin to imagine alternatives to the status quo and changing things for the better (revolution, transformation, transfiguration, or simply an openness to experience and to difference) — the goal being a better way to be.
What if we organized an award or a class curriculum or a museum around adversity rather than diversity… or around alterity rather than identity?
Beneath the dialectic of diversity and unity (or difference versus sameness) that structures the ideology of our educational system, perhaps we can think beyond the Freudian “id” of identity and its symbolic language of diversity. How can we subvert the hegemonic symbolic order that dominates our imagination in a way that reveals the true beauty of our reality? As we critically reflect on our roles as teachers and students, perhaps we can begin to imagine ourselves studying at a “subversity” rather than at a nominally diverse university.
The first provost of the subversity in my imagined historical drama might be the “sub-sub-librarian” who begins Herman Melville’s novel Moby Dick.
I sometimes read the columns of John Feffer, one of the founders and co-directors of a think tank called Foreign Policy in Focus, whose informed analysis of world events I generally appreciate and respect. For the sake of full disclosure, I might also add that, about fourteen years ago, before this think tank existed and before I attended graduate school, both John and I lived next door to each other in Tokyo, Japan, when we both worked for Quaker organizations — I was teaching English at the Tokyo Friends School and he was working for the American Friends Service Committee. Since I usually agree with the stuff John writes, I was more than a little surprised when I read one of his recent blog posts entitled “Scram!” and found myself shouting angrily at the computer screen. In his blog, John pretends to give a college commencement speech, and following the conventions of that genre, he reflects upon the purpose of education and gives practical sage advice, and of course, in doing so, he makes several political statements about the state of education today. I have no problem with the main idea of the article, but in the process of articulating it, he makes some rather cynical and disparaging cracks at college culture. Of course, John is not the only one to use this seasonal moment as an opportunity to say something about the institution of higher education. The end of the school year has inspired both President Barack Obama and Mitt Romney to attack each other on various education policy issues in various speeches, including an actual commencement address Obama gave to Barnard College and Romney’s release of his education plan, which has been discussed and debated in the higher education newspapers [here] and [here]. Almost all conversations about college education today point out the rise of student debt at a time of economic insecurity and rising unemployment. 
This year, college graduation has become something of a political football, tossed and punted around by media pundits hoping to score political points.
In sorting through the various perspectives and statements on the state of higher education, and in coming to terms with my surprise and anger at some of John’s comments, I am reminded of the theoretical notion about ideology made by the influential cultural theorist Stuart Hall. In one essay [here], he argues that ideology is not such a simple correlation between a social class position and a way of thinking — e.g., the ideology of the proletariat vs. the bourgeois, Democrat vs. Republican, or college student vs. banker. Instead, he urges a theorization of difference in which we recognize “that there are different social contradictions with different social origins; that the contradictions that drive the historical process forward do not always appear in the same place, and will not always have the same historical effects.” In other words, even though John and I may basically agree on most things, we may come to our beliefs in different ways and occupy different social positions in relation to our ideology as well as in relation to the various contradictions in our ideology.
Let me try to explain what I mean about Hall’s complication of the concept of ideology by first acknowledging how John and I agree. His main idea and advice for college graduates is for them to go to a foreign country, get some experience, learn a foreign language, and come to a different understanding of the world. In light of the changing socio-economic conditions that many journalists and scholars (including myself) call “globalization” and in light of the recession that began in 2008, I have often given this same advice to my own students and have used almost the same language as John. “Get out of the country,” we both say. “Make some money to pay off your college loans and get some experience to help you figure out what you can do.” This doesn’t just mean teach English, as I did. It could mean some sort of service, as John did. This goes along with some advice I wrote to graduating students in this blog way back in 2009 [here], specifically for my English majors who were confused about their career options and graduate school. Of course, our viewpoint here is not so original. Colleges and universities themselves are increasingly promoting “global citizenship” and study-abroad programs, as such experiences and skills are increasingly seen as necessary in today’s competitive environment. So, there is really nothing especially controversial about this view, whether you are a Republican, Democrat, Libertarian, Marxist, or Green. There is, however, something troubling about this view if you are poor — since the version of “global citizenship” and the way such cultural experiences are credentialed is somewhat exclusive.
And here is where I get critical of how John arrives at his position, and my point here, relating to the theory of Stuart Hall, is that we may all agree about the value of international experience and skills, but we may arrive at this viewpoint differently and articulate the contradictions of our ideology differently. John makes some rather cynical statements about the value of college education, which he compares to a social club for rich people. Curiously, this comparison is part of his argument that contrasts the uselessness of college education with the usefulness of international experience. Hence, John’s argument relies upon a somewhat common (and, in my view, false and misleading) binary opposition between an exclusive ivory tower of spoiled brats and the “real world.” Ironically, of course (and here is an instance of the ideological contradiction I mentioned earlier), international experience is perhaps even more an exclusive opportunity for the rich than a college degree is. Many of my own students who come from poor families have remarked that a lot of the international and service opportunities (e.g., Peace Corps, AmeriCorps, various non-governmental organizations, etc.) with which their wealthier classmates pad their résumés are unavailable to them. They need to make money right away, and often do so with the less glamorous sorts of jobs.
This is not just a minor point, since the entire thrust of John’s argument relies on the contrast between the somewhat useless and exclusive nature of college education (useless in terms of any real skills or knowledge, according to him) and the usefulness, openness, and inclusiveness of “real” international experience. He claims, “college is more about socialization than about education.” Here his argument relies upon a binary opposition between socialization and knowledge, as if the two aren’t intimately related (as I demonstrated in my last blog post about intercultural competency). In fact, the way one goes about gathering data and interpreting it is an ethical act that requires students to be able to question their own assumptions. Later, making the assertion that the infamously exclusive Skull and Bones club is a microcosm of the whole college experience, John claims that “college is a lot like the club of advanced industrialized nations” where the elite cater to the elite and exclude others, and he makes this claim without any acknowledgement that international travel may be an even more elite activity than college.
Repeatedly in question in John’s blog is what college faculty and administrators do. Those of us who teach, interact with students each day, work hard to design curriculum, and work with administrators to address the problem of social inequalities among our students tend to think that we are making a genuine effort to accomplish exactly the opposite of what John claims the institution of higher education does. Most colleges aim for their campuses to be a “microcosm” of the whole society, not a microcosm of an exclusive club, and colleges support this as best they can with need-based financial assistance and programs to recruit first-generation students, immigrants, etc. However, what is disturbing here is that John’s stereotype of college culture is not supported by any evidence but is instead cloaked in the authority of a rather simplistic Marxism (exactly the sort that Stuart Hall is critiquing). His stereotypes sound plausible, as stereotypes so often do. However, they are not true. Students actually learn quite a lot. Speaking as someone who actually labors in the trenches of higher education, I can say the examples are numerous. In their first semester, students learn how to do research, evaluate sources, and compose a long research paper. This is very difficult for most of them when they arrive, but by the end of their first year, they can do it, and this is a very valuable skill, as I’ve argued elsewhere [here]. It is one of the joys of teaching when we see students accomplish something that they weren’t able to do before.
In addition to such skills (and there are many such skills), there is also a lot of knowledge and ways of processing knowledge. One of the things education abroad offices and college professors have discussed at length is how students learn (or sometimes don’t learn) from their experiences in other countries. Contrary to John’s assumption, most of the real learning happens not when the students are in the foreign country, but after they return to college, when they read, write, discuss, and process their experience. And this is especially important because what a middle-class white person (such as John and myself) usually experiences in a foreign country is a rather small piece of it. These travelers may not be aware of the socio-economic forces that produced their experience, and they may not ever see other aspects of the society, as I have argued in my blog about my trip to Japan with students [here]. Even someone whose job is to understand the whole society in which they work might be exposed to only a part of that society and come away from their “experience” with a rather ideologically warped understanding of where they were, as I have argued in my blog about my trip to Kenya with several of my colleagues [here]. There are things that a cultural historian, sociologist, and economist may reveal about a country that no personal experience will ever get, and although John seems to think that the various courses that students take during their four years of college have no connection to each other, actually the point of the liberal arts education is that they learn different ways of understanding the realities of the world that are usually invisible to us. College professors labor very hard revising and revising and revising college curriculum so that it is more inclusive and more effective at leading to exactly the sort of global understanding that John’s blog promotes. 
It is one of the joys of teaching when we see a student’s face light up with a transformative new understanding of the world they live in and when they make connections between what they learn in one class and another.
So, coming back to Stuart Hall’s point about ideology, we can see that while John and I might both agree that an international understanding of the world is important (a view with which almost all college administrations today also agree — so much so that it would seem both John and I reflect the very hegemonic ideology that we think we are critiquing), we arrive at that view very differently. Moreover, I would argue that the way he makes his politically leftist argument ironically has a lot in common with the arguments that the political far right makes about higher education. Many of their arguments assert that public universities are sites of liberal brainwashing and socialization, and not about real content. (I have written about this at length [here].) Their goal is clearly to discredit the scholarly work that professors do, and their goal is a financial one — to shift funding away from publicly funded, publicly managed education to privately funded, privately managed education. My own problem with this is that privately funded education is, of course, like privately funded think tanks, subject to the whims and biases of private money (often the interests of corporations and bankers). What worries me about John’s disparaging comments about higher education is how similar they are in effect to the beliefs of the very people John is most ideologically opposed to.
And of course, when it comes right down to it, the real issue is the money question. All of the hysterical assertions in the mainstream media about the quality of education, how little students are learning, the content of their curriculum, the usefulness of a college degree, and the effectiveness of the delivery mechanism (i.e., on-line education, lecture hall, group learning, etc.) are largely smoke and mirrors. The real issue is an economic one, and has to do with inflation, the cost of education, competition, etc. The often-cited graph is this one, in which the cost of education has gone up so much faster than the rate of inflation.
Although Romney and other Republicans argue that the cause of this rising tuition is the federal government giving loans to students (huh?) and that fostering competition between colleges will drive prices down, it is more likely that the heightened competition is exactly what has driven prices up, as colleges pour money into new facilities, special programs, etc. I can tell you that the increase in tuition dollars does not go into professors’ salaries or the classroom experience. Rather, it goes into all the special things that make a school attractive and competitive. We might laughingly speculate that one of the things that has driven up college tuition is their effort to become more “global” in exactly the way John prescribes. The real concern here is how we finance a broad-based, accessible liberal arts education in which students from all backgrounds are exposed to a lot of different ways of looking at the world and learn a lot of valuable skills. In my view, it is dangerous to fancifully imagine replacing the liberal arts college and public university with something else, as John seems to do, because what is most likely to replace it is a corporate-driven agenda that merely trains young people to do the things the stockholders and CEOs need them to do. This is not the road to real prosperity, and it is not the road to a just and equitable society. Given the obvious gaps and holes in our culture that will likely widen with such a corporate-driven agenda, we can anticipate what will fill those gaps — not the objective, hard scholarly work that happens at the research university but the biased and narrow agenda of sectarian groups that fight each other.
Last semester, I blogged extensively about this Xtranormal Video “So You Want To Get a PhD in the Humanities” that went viral on the internet at the end of October, 2010. My blog post about the original video is [here] and about the conservative reaction to it that soon followed [here]. It seems that since then, just this January, somebody else made a sequel to the original entitled “Nine Years Later.” I don’t have too much to say about it, but since I blogged about the first two, I figured I’d follow up with this one. Last semester, my argument was basically that the humor of the videos only makes sense in its specific political context, and I’d make the same argument about this new one, except that this new one isn’t all that funny to me. See for yourself here:
What is supposed to be funny is how obtuse her advisor is about the financial reality of her situation. This basically reverses the roles of the original video, in which the student was insistently obtuse and the professor frustrated. Of course, in reality, most advisors are quite aware of the problems facing academia today. The exigency (or timeliness and urgency) and what Aristotle called the kairos of this video is the recent budget cuts to higher education by state legislatures across the country. University presidents have actively protested these budget cuts, for example the president of the University of Minnesota [see here] and the president of Penn State University [see here]. Students have also protested in Albany, New York [see here], and you may remember that it was the State University of New York at Albany’s elimination of several departments that prompted the first “So You Want To Get a PhD…” video. The upshot of all this is that higher education in the United States appears to be downsizing, and this is scary.
Questions I could explore in this blog post include (1) Why isn’t the new video as funny as the first one? (2) What the heck is causing all this mess in the first place? (3) What intended and unintended effects might we expect to see in the future? And (4) does this video spur us to actually do something about it, and what might that something be?
But for the first time in the history of this blog, I’m not going to explore them. I’m too pissed off.
A few weeks ago, a bunch of my older students from classes past started their secondary-education student teaching, and so I decided to try and write something that might be useful for them. This blog post will attempt a fun and practical distillation of my advice for teachers, based on personal experience, with a bit of theory about ethics thrown in. In a nutshell, my advice is rather simple — a question of priorities. In order, my priorities are these:
(1) care for yourself
(2) care for your students
(3) care for your subject
This might seem pretty simple and obvious in the abstract, but in practice it’s not. Once upon a time, the subject matter was perceived to be priority number one, and even now, as everyone knows, a mastery of the subject is the primary goal of all the testing that the controversial No Child Left Behind Act mandated in 2001. And back in the olden days — before such things as computers, information highways, and even regular concrete interstate highways — classroom pedagogy tended to be centered on the lecture. Against this lecture model, starting in the 1960s and 70s, a variety of social and intellectual movements (including Paulo Freire’s famous Pedagogy of the Oppressed) prompted educators to imagine a more “learner-centered” approach to teaching. The learner-centered approach encourages students to take an active role in their own education and also stresses that people learn better by actively doing and creating than they do by passively consuming and regurgitating information. These days the learner-centered approach is pretty common, but it wasn’t always. So, we can see a movement from the priority being the subject to the priority being the student. But why am I suggesting that the priority should actually be the teacher?
Let me put it in practical, common-sense terms before I start bringing in the theory and reflecting on ethics. I’ll start with the bottom priority and work my way up. You ought to care about your subject. If you don’t, then you’ll be bored and so will the students. (This, by the way, is one of the problems of the No Child Left Behind Act, because its definition of what should be tested and how skills should be measured is so narrow and dreary.) So choose subjects that you personally think are interesting and worthwhile, not subjects that others think are interesting and worthwhile. And if you don’t have full control over your subject matter and find yourself having to teach things that the government, parent-teacher association, or school principal thinks are important, then find ways to relate what you have to teach to the stuff you really care about and love (or at least find amusing.)
But what’s more important is to care about the students. After all, it’s their education, right? Why should they be learning this stuff? If students suspect that you don’t like them, then they are less likely to learn from you no matter how much you care about the subject. (And this is doubly true if the subject you are teaching is politically controversial, e.g., the subject of teaching itself.) And sometimes something as simple as a smile or a laugh can go a long way. Students need to know that you are on their side even if they are wrong about something or if they are making some bad decisions or if they are pissing you off. And they especially need to know this if you happen to disagree with them about something reasonable people can disagree about (e.g., whether a senator or a short story is any good.) And really, you should be on their side, because it’s your job to help them learn and grow. Just because their personal qualities don’t coincide with your own expectations for the class doesn’t mean you can’t find pleasure in the qualities they do have. The students are more important than the class, after all.
However, if you are too much on their side, they will think they are the boss, when actually you are the boss. You are the one with a ton of knowledge at your disposal, not them. You are the one who has to answer to the principal of the school, not them. So, that’s why you should care about yourself most. And there’s an even more practical dimension as to why caring for yourself should be priority number one. If you are too tired and stressed out, then you won’t do as good of a job teaching. And if you get so stressed out that you start to get sick, then how is that good for the students? And if you are too worried about what the students think of you, how can you keep sight of what you know is right? (And the fact is, you never know what the students are really thinking.) So, get a good night’s sleep, exercise regularly, play sports, spend time with your family, hang out with your friends at a bar, watch a movie, make love to your significant other, write in your journal/blog… etc., etc…. In other words, do whatever you enjoy doing. If you don’t, you’ll start to hate your job and the students will sense that… and then…. Feh.
Easier said than done. It’s obviously easier to state these priorities than it is to actually juggle them. After all, preparing lessons and grading papers takes a lot of time, so it’s natural to prioritize finishing the work rather than (let’s say) working out at the gym or going to a movie with your friends. But if you don’t make time for yourself and insist on taking that time, then you’ll never have it, because the responsibilities of a teacher are endless. You can always devote more time to students, always make more of an effort to prepare for class, always learn more about your subject, always spend more time on students’ papers, always devote more time to your colleagues and the community around your school. It can feel overwhelming sometimes, and the giving of yourself to others can be exhausting. In fact, a common mistake many first-time teachers make (including myself) is over-preparing, and over-preparing often leads to frustrating disappointment if the meticulously planned lesson doesn’t go as planned. (And I am right now imagining my former students reading this paragraph and laughing, because they know that I don’t over-plan my lessons now!!!)
Now for the theory and ethics stuff. In the early 1980s, Michel Foucault explored different possibilities for what it might mean to be an ethical and free person in relation to the real world — in particular, check out his book The Care of the Self, his essay “Technologies of the Self,” and an interview “The Ethics of the Concern for the Self as a Practice of Freedom.” There, Foucault observed that the dominant tradition of Judeo-Christian ethics in particular and modern European ethics in general has tended to characterize virtue as caring for others. In this somewhat ascetic tradition, civic virtue and personal pleasure are understood as opposites unless one feels good about being selfless. Also in this tradition, philosophers have tended to emphasize ethical dilemmas, that is, choices between conflicting obligations or questions of which good work is the greatest good. This sort of ethics is almost self-punishing in demanding that we reflect on our own worth and our decision-making ability, measured against some imaginary, idealized standard of transcendent excellence. And of course first-time teachers often agonize over whether the students like them or whether they are even learning anything. This agony is natural and common, and everyone feels it, but it is ultimately a dead end.
Foucault argues against the Judeo-Christian mandates to know thy self and attend to others as the starting point of ethics; instead, he suggests that care for oneself is a better starting point, and then knowledge of self and others will follow from a genuine and reflective practice of self care. Foucault rediscovers this lost tradition of self care in ancient Greek texts about ethics and government, such as those of the philosopher Xenophon, who stated, “If you do not care for yourself you will make a poor ruler.” And for Foucault, this includes enjoying simple bodily pleasure, finding your source of empowerment, and using the technologies of self-government as a daily practice of freedom and ethics. Foucault goes further to argue that this self-empowerment is not just something that happens inside our souls. Rather, it is our relationship to technologies, tools, organizations, administrative bodies, social bodies, commercial businesses, etc., that will always channel our energies and regiment our lives. These technologies and social organizations can oppress us but they can also be resources of power and freedom. And for Foucault, the technologies and practices of self care include the technologies and practices of the body (e.g., the gym and sex). And hence, Foucault’s ethics differ greatly from the sort of ethics that emphasizes a denial of the body’s desires and an examination of one’s soul or intellect. So, while ethics traditionally conceived emphasizes big choices and our interior, bodiless selves, Foucault’s ethics emphasizes the small habits of everyday life, the ways we care for and enjoy our bodies, and our access to the technologies and social networks that empower us both psychologically and materially. In other words, you don’t have to go it alone; there are other people and technologies that will help you. True ethics is not an individual act, but a social, cultural activity.
So, coming back to teaching, there’s always more to being a good teacher than the teaching itself. Take care of yourself. It’s not just your labor and good works that are valuable. You are valuable. To give an example, I know I became more confident in the classroom after I started exercising regularly. Is there a rational connection between jogging and teaching? No, of course not, but there it is. And incidentally Benjamin Franklin in his Autobiography strongly agrees with me and Foucault on this point. And I remember when I started teaching I would worry about whether students liked and respected me, but as Xenophon suggests, how could anyone respect a ruler who doesn’t take care of himself or herself? How could anyone respect a teacher who has no life outside the classroom and no self respect? And how could students like a teacher who doesn’t like them back? And how could students like a teacher who doesn’t enjoy the subject?
Again, easier said than done.
What’s even better is when you can find lines of connection between what you enjoy (e.g., for me, writing this blog) and your job. This blog is a technology and a sort of playful fun space that helps me create myself and practice my writing, and I feel that it has made me a better teacher, a better writer, and a better person. One of the facts of life that Foucault (and also theorists Judith Butler and Simone de Beauvoir) helps us grapple with conceptually is that one is not born a teacher; rather, one gradually over time becomes a teacher. It usually takes a while; the first or second or even third year are not a true measure of the kind of teacher (or person) you might eventually become.
Occasionally, I bring my Pocket World In Figures to my classes to begin the hour with a few “fun facts.” I get this nifty little book every year through my subscription to The Economist magazine, and on those rare days when I remember to bring it to class, the students and I enjoy playing a guessing game for a few minutes before the real lesson. The first half of the book is rankings of various sorts, such as biggest producer of copper (Chile), highest education spending per person (Cuba), most consumption of beer per person (Czech Republic), and most people in jail (United States). The answers are sometimes surprising, and offer what we college professors like to call “teachable moments” because students will usually guess according to their stereotypes, and often the real data will contradict those stereotypes. For instance, they always guess Ireland to have the most beer consumption per capita, but it’s not even in the top 25. (The United States is 7th, and Ireland is actually 16th for wine consumption.) Also, the data will reveal very interesting things about current events, such as the fact that the ten largest companies in the world are all oil and automobile companies, with the exception of Wal-Mart at the number two spot. And the two countries taking care of the largest refugee populations from other countries in 2008 were not the United States and Canada, as my students always guess, but Iran and Pakistan (i.e., the two countries that border Iraq and Afghanistan.) Like I said, teachable moments. Sometimes the answers are somewhat obvious, but sometimes I’m just as surprised by the data as my students are.
The second half of the Pocket World In Figures — and the reason for this blog post today — is country-by-country profiles. So, if you want to quickly find out Germany’s population, biggest exports, unemployment rate, health-care spending, etc., this is where to go. Now, here is where my story really begins. Right as I was leaving my office to head over to the very last day of my class on Caribbean literature and theory, it occurred to me to bring this book and have a little fun. I hadn’t brought the book to this particular class all semester because it didn’t seem relevant, but on this last day doing a few “fun facts” about some Caribbean countries seemed like a good idea. Anyway, I headed to class curious about what we would discover, but I was very unpleasantly surprised when I discovered that not one single Caribbean country is included in the “Country Profiles” section. Not one!
And that’s the reason for my rather absurdly provocative newspaper-style headline for the title of this blog post.
So much for the fun facts game in class that day, but nevertheless, it was still a teachable moment. After all, the students themselves had already experienced this blind spot when their parents said to them, “Um… you’re taking a class on… what?!?!… I didn’t know there was such a thing as Caribbean literature and theory.” My students and I heard this kind of thing a lot over the course of the semester, despite the fact that the Caribbean can boast two Nobel prize winners (V. S. Naipaul and Derek Walcott), a few people who probably ought to win the Nobel prize (e.g., Edward Kamau Brathwaite and Maryse Condé), one of the hottest young authors in the world writing today (Edwidge Danticat), the most important pop musician of the twentieth century (Bob Marley), and some of the most influential and world-renowned anti-colonial political theorists of all time (e.g., Aimé Césaire, Frantz Fanon, C.L.R. James, etc.) Moreover, looking back in history, from the seventeenth through the nineteenth centuries, the Caribbean islands were by far the most economically profitable and productive colonies the European empires had. So, why are people so surprised that my course exists? And for sure this cultural blindness to the Caribbean is exactly one of the reasons I taught the course…. But it still raises the question, why this blind spot?
Now, to be fair to The Economist, clearly they can’t include all 192 countries, because the book would be too big. It is a “Pocket World” after all, not the whole world. But in this case, the book strangely excludes an entire geographic region from its world — no Jamaica, no Haiti, no Trinidad and Tobago… not even Cuba. Now, we also know that The Economist tends to be somewhat neocolonialist in its attitude towards the world, a bit racist at times, and almost always smugly chauvinistic in its tone when describing any culture not Anglo-Saxon. Perhaps, so far as The Economist is concerned, the islands in the Caribbean are not really separate countries at all, but just extensions of the United States, Britain, France, and the Netherlands. And technically, many of the islands really are under the formal dominion of the United States (e.g., Puerto Rico, U.S. Virgin Islands), Britain (e.g., Montserrat, Cayman Islands), France (e.g., Martinique, Guadeloupe), and the Netherlands (e.g., Aruba, Curacao), but most of the region is politically independent. However, as Édouard Glissant observes in his Poetics of Relation (an observation also made by Jamaica Kincaid in A Small Place and by the movie Life and Debt), nominally independent is not the same thing as really independent. Economically, they are still in many ways controlled primarily by the United States (which has tended to invade countries that didn’t obediently fall in line with its political and economic interests, e.g., multiple invasions of Haiti, Cuba, and the Dominican Republic over the course of the twentieth century as well as the invasion of Grenada.)
Culturally, this neocolonialist relationship that much of the Caribbean has with the United States and Europe can be seen in the images Americans tend to associate with the Caribbean. Typically, if you ask someone what images come to mind when you say the word “Caribbean,” they think of beautiful beaches, spiced rum, and Captain Jack Sparrow from Disney’s Pirates of the Caribbean movies (the most financially successful film series of the past decade.) In other words, in this cultural imagination, the Caribbean isn’t a real place; it’s just entertainment.
What I think my American students most enjoyed about our class is that the Caribbean gradually became a very real place to them, a place where ordinary people are born, grow, learn, and express themselves. The politics are complicated, the economics even more complicated, and if any one culture could be truly called a “world culture” it is the diverse and varied cultures of the Caribbean (as theorist Glissant implies in his complicated explication of the word “Creole” and as Tiphanie Yanique illustrates in her recently published book of wonderful short stories.) Perhaps The Economist magazine’s Pocket World in Figures doesn’t include profiles of Caribbean countries because the Caribbean is itself the whole world in microcosm. Or perhaps the editors of The Economist just need a spanking.
Almost exactly a month ago, I blogged [here] about the Xtranormal on-line cartoon “So you want to get a PhD in the humanities” that almost everyone in universities across the country (and outside the country) was talking about. That cartoon was so popular that it inspired a myriad of copycats about various other graduate school programs, including political science, physics, law school, film, business, and economics. It also inspired a reaction. One of my former students just e-mailed me this reactionary retort called “Yes, I want to get a Ph.D. in the Humanities” that was made a few days after my blog post about the original. In my post today, I want to briefly explain three things: first, why the copycats are not as funny as the original; second, the factual inaccuracies in the retort; and third, the ways in which the retort is a classic example of conservative, reactionary ideology. But before I do that, check it out:
First, why aren’t the copycats so funny? The copycats resemble the original in that they exaggerate the bitterness of the professor and the naïveté of the student in order to dramatize the ironic contradiction between what the student (and society at large) expects and wants to believe and what the professor fears to be true. But despite the basic, structural similarity, it’s clear if you fool around with Google searches that they haven’t had the popularity or buzz of the original. In my opinion, the copycats are less funny for a number of reasons. One reason is their lack of exigency — or what in Greek is called kairos. Essentially, the context for the original was the recent cuts in various humanities programs at universities across the country that occurred at the same time a political movement in the country was attacking humanities professors. This situation was unique to humanities programs. The cartoon about business school, at least, does include references to the recent recession and the reckless greed on Wall Street that produced it, but the others appear to float above recent historical circumstances. What most of the copycats do instead is simply dramatize the difference between the idealized version and the realistic version — except that because their realistic version has no anchor in true reality (i.e., historical events or even much truth at all), the result is simply two competing fantasies (one utopic, the other dystopic), both of which are absurd. A final reason why the copycats are less funny is that they seem to have misunderstood the multiple audiences of the original. As I wrote in my blog before, the original had two audiences, college students and college professors. My interpretation of the original is that it was more a satire on the contradictory state of universities today than it was a mockery of idealistic college students.
Since the copycats seem to miss the sophistication of the original (i.e., its multiple audiences and its historical depth), they come across as flippant and silly.
But what about the retort? The retort draws attention to all the privileges and perks enjoyed by graduate students and college professors, but its many obvious factual inaccuracies undermine its humor. First, it assumes all graduate students get free health care, don’t pay taxes, and get to live in an affluent neighborhood, but of course graduate students had to fight hard in order to get health care and don’t pay taxes because they barely make enough to pay rent. (Never mind that not all universities are in affluent neighborhoods.) The history of health care is too complicated to describe in my blog today, but it includes instances in the recent past of graduate students without health care in near-death situations. Second, few graduate students get to travel around Europe, because such scholarships are extremely competitive and have become even more competitive in recent years, especially when European language programs are being cut entirely. There is a huge disconnect between the realities of being a college professor and the idealized version in the video. Likewise, in the popular imagination, college professors make a lot of money, but few actually do. Unfortunately, people who serve on the boards of universities and who actually affect their administrative decisions often believe the popular media representations of the overpaid, underworked, overprivileged professor and fail to give their full attention to the very data (i.e., actual salaries and lists of responsibilities) that is supposed to be the basis for their decisions. In other words, unfortunately, the fantasy about academic life has the potential to produce as much of an effect on administrative decisions as real, statistical data. Moreover, the video assumes that the recession is equally tough for everyone, but this is false.
In fact it is statistically harder for Ph.D.’s in humanities to get jobs than for people with B.A.’s, and the reason for this is rather obvious — there is a wide variety of jobs someone with a B.A. can get (e.g., a bachelor’s degree in English can lead to a job in publishing, media, public relations, personnel, administration, teaching, law, medicine, politics, philanthropy, advocacy, etc., etc., etc.), but obviously a very limited pool of jobs for someone with a Ph.D. (e.g., being a professor or working in college administration.)
So, how does this video represent a reactionary, conservative ideology? There are obvious signs, such as the rather childish comment about postmodern theory and the irrelevant quotation by John Adams. The comment about theory is childish because empirical research was always the mainstay of English departments; and likewise theoretical questions have always been, and always will be, interesting to anyone with a brain wanting to do original research — theory simply inspires and directs new research questions. In fact, if anything, the highly empirical research models of Michel Foucault are now so entrenched that most graduate students are basically doing Foucauldian work even though they may not be fluent speakers of Foucault’s jargon or interested in his critical perspective (a state of affairs that many scholars observed about a decade ago.) The quote by John Adams is irrelevant since Adams lived two centuries ago and was expressing an ideal that he hoped the country would commit to, not describing the reality of what the country has in fact actually committed to. But beyond these two superficial signs of the anti-intellectual, reactionary ideology, we can go deeper. Conservative ideology values nostalgic, idealized imagery and represses data about actual working conditions. Whether one is talking about automobile workers, high school teachers, or farmers, the image the conservatives hold on to is one of happy, good people threatened by outsiders and subversives (i.e., dark-skinned theorists.) This image is deployed to undermine labor unions’ efforts to achieve basic rights such as minimum wage, safe working conditions, overtime pay, and realistic expectations about productivity. Hence, the point of the video is that we should all be content with the fantasy and suppress all of those who might point out the difference between the fantasy and reality.
That said, what I appreciate about the video is that it does remind professors that the job can be really sweet if one happens to be one of the lucky few to land in a good place. The hardest part about getting a Ph.D. in my experience is simply this — one has to commit oneself to a very uncertain future.
Last Monday, somebody made a funny cartoon called “So, You Want to get a Ph.D. in the Humanities” that quickly went viral on YouTube and Facebook. And the very next day someone else made a similar cartoon called “So You Want to get a Ph.D. in Political Science.” My buddies from graduate school especially appreciated the humor since most of us have somewhat recently suffered the slings and arrows of the infamously intimidating “job market,” and many are still searching for that treasured yet elusive tenure track job and at this very moment are anxiously sending out their applications. I want to do a quick reading of this rather bitter satirical video by putting it in its political context — the so-called “crisis in the humanities” — and attending to the different audiences who might be watching it. Here’s the video:
Some of my students might wonder if this video really reflects the secret inner thoughts of their professors whenever they are asked for a letter of recommendation…. No, not really. This is a classic case of satire through exaggeration that blends truth with untruth. So, the truth is that yes, the job market for humanities is very depressing and isn’t likely to improve much, that public universities face state-wide budget cuts, and that the reality of being a professor is very different from what many undergraduates imagine it to be. In fact, I have myself said exactly these things to so many students in my office on so many occasions that I eventually just wrote a summary of my “advice” in my blog [here] — and according to the nifty little calculating technology of my blog’s host WordPress, I know that this post is by far the most popular thing I have ever written, having been independently viewed more than two thousand times. The untruth of the video, of course, is its suggestion that professors think our students are misguided or “stupid” for wanting to pursue a Ph.D. in the humanities or for having the very same beloved ideals that we had when we started down that path; few professors actually think that. And likewise most students are far more sophisticated than the one in this cartoon. Reality is not so horrific, and it’s a pretty sweet job I have, all things considered; obviously, it’s the exaggeration-for-effect that makes the video funny…. Duh, that’s obvious, so what?
Actually, the real point of the cartoon and what makes it funny to me is not at all what professors may or may not think about their students or about their own jobs. If we shift the target audience of this video from students to other professors and administrators, its meaning changes a bit. (And, if you’ll excuse my “theory teacher” moment, this is why most introductions to literary theory stress to English majors that they pay attention to context and audience, as Roland Barthes suggested in his famous “Death of the Author” essay, as Stanley Fish suggested in his famous book Is There a Text in This Class?, and as countless other “reader response” and “new historicist” theorists have argued.) For professors and administrators today the real issue in this video is something the newspapers are calling the “crisis of the humanities.” For instance, see this NY Times op-ed by Stanley Fish from a few weeks ago.
So, in order to really appreciate what’s going on in this cartoon, let’s consider its political context. What precipitated the flurry of discussion in newspapers and the blog-o-sphere is exactly the event mentioned in the cartoon — the State University of New York at Albany’s decision to cut its French, Italian, Russian, classics, and theater departments. Now that’s some serious cutting, and a lot of already tenured professors will soon lose their jobs. You can read about that frightening decision [here]. It is, therefore, not surprising that this particular video was made at this particular moment in time, especially since many other colleges and universities across the country are also cutting back, albeit in less drastic measure. (Lucky for me, mine isn’t.)
Moreover, as the cartoon suggests, there is fear among academics that radical conservatives like the ranting Tea Party movement are out to screw us. This is not an entirely irrational fear considering that bills have actually been proposed in several state legislatures to control what professors are allowed to teach or say in class. See, for instance, [here]. In other words, if these bills passed and you happened to teach at a public university, the content of your class would be limited by the narrow agendas of state politicians. There was even talk (back in the scary post-9/11 days) of putting professors who didn’t support George Bush’s war in Iraq under surveillance by Homeland Security. Fortunately, none of these bills passed.
Unfortunately, the so-called “crisis of the humanities” is neither something new nor what I would call a crisis. You can see this American Scholar article from last year, this Inside Higher Ed article from 2007, and Michael Bérubé published this book and Robert Scholes published this book about it way back in 1999. In fact, the Modern Language Association gathers data about the state of the job market and enrollments in English and language departments, and it put together this solid, data-driven analysis in the 2004 issue of its journal Profession.
So, what, in a nutshell is all the hullabaloo about? Why all the lamentations about the declining enrollment in humanities courses?
The blame game goes in all sorts of directions. Some blame our more materialistic culture (since the 1980s) that encourages students to choose more professionally oriented majors such as business-management and encourages university administrations to run their schools the way one runs a business. Others blame the professors themselves for not teaching the right things (or the politically right things.) According to that belief, it is precisely because English departments and other humanities departments got all “postmodern” and “deconstructivist” and French-ified that they began to decline. In other words, so the argument goes, because corrupt, leftist English professors began critically demystifying and deconstructing the great authorial genius of Shakespeare and Emerson and began to attend to the voices of women, African-Americans, Native Americans, etc., they signed their own death warrant. (At the very least, we “theory teachers” inspired radical conservatives to wage media campaigns against academics for not being patriotic or traditional enough.) Others focus attention in another direction and blame the military-industrial complex and the changing nature of the “research university” since the 1960s because most administrators understandably seek large grants and pools of money to fund research at their universities, and let’s face it, there’s a lot more money coming into schools to promote technology and business than there is coming in to promote arts and critical thinking.
Interestingly, the surprising discovery of the MLA study (linked above) is that what’s really causing declining enrollments in humanities might be none of these things. Rather it is the rise of new, interdisciplinary programs such as Communications, Peace Studies, Global Studies, Gender Studies, and Environmental Studies. The attraction of such interdisciplinary programs is perhaps somewhat obvious, even though the skills learned in them are pretty much the same skills one learns in any humanities department — how to think, do research, and write. Most “career services” centers tell their undergraduates that it really doesn’t matter what one majors in. What really matters (career counselors at my school have told me and my students on multiple occasions) is whether you are excited about what you are learning. Considering the findings of that study, maybe declining enrollments in humanities and other traditional departments such as political science are not such a crisis after all.
Nevertheless, I do think it is true that university administrators these days favor business, hard sciences, and what I would call the NGO-majors of environmental sciences, global studies, gender studies, and peace studies. (And interestingly, the NGO-majors came into existence in the early 1990s at exactly the same time when the world witnessed a significant increase in the number of global NGOs.) When universities create these programs and talk about them glowingly and excitedly to newspapers and on graduation day (in ways they rarely talk about traditional humanities)… well then yes, there’s a bit of truth to the viewpoint that universities ought to do more than they have been doing to support their humanities departments.
But all this still raises the two related questions of why and how universities ought to support the humanities. In his op-ed earlier this month, Stanley Fish argues that university administrators should admit that the humanities aren’t profitable but should aggressively defend their worth to state legislatures. In response to Fish, The New Arts, Politics, Philosophy, Science blog argues that actually the humanities are not only profitable but are very important to the fiscal life of the university in terms of overall cost-benefit analysis. In other words, not only has Fish got some of his facts wrong, he has also bought into the neoliberal, Wall Street ideology that misinterprets economic data. They cite this AAUP article that demonstrates that the sciences may bring in more money from private corporations, but sciences are also more expensive to run. Dollar for dollar, humanities departments are cheaper to maintain and provide a range of important services to the whole university.
What is that service? Now here we get into the nitty-gritty of what we as humanities professors ought to be doing about this so-called crisis. Fish’s argument suggests that we ought to maintain the traditional academic departments in their traditional formulation. But as Michael Bérubé and Robert Scholes argued way back in the 1990s, English departments ought to shift from focusing only on traditional literature and expand their range to include other roles — for instance, roles valued by the whole university such as writing courses and critical thinking courses tailored to environmental studies, global studies, business, etc. Bérubé and Scholes are arguing explicitly against the conservative tendency of English departments to retreat into traditional notions of literary study. They are also arguing against the notion that English departments are themselves to blame for their declining enrollments. And I agree with Bérubé and also agree that broadening the horizon of English departments to include both interdisciplinary courses and what are called “service courses” (e.g., freshman composition and business writing) is probably one strategically intelligent way forward. In contrast to the departments of Russian and classics cut at SUNY Albany, English departments at most public universities are safe from such cuts precisely because they perform an essential service to the whole university.
But this can’t be the whole story. Fish is absolutely right to argue that this is really a political matter, not a curricular matter, and the traditional approaches to literary study (i.e., explaining to students why Shakespeare is as great as everyone says he is) are still important and valued by students as well as by the general public. (By the way, Fish responded to his critics and the AAUP article I mentioned [here].) At the end of the day, it’s wrong to think of this in either/or terms — either we do more interdisciplinary cultural studies or we do more traditional valuation of literature. What makes departments strong (in my opinion) is their diversity of personalities, subjects, approaches, etc., because departments need to attend to the diverse (and divergent) expectations our students, administrators, and general public all have for us.
So, given all that political context, now we can return to our reading of the cartoon. What makes the cartoon funny is the contradiction between what the student expects and believes and what the professor fears to be true. And this contradiction reflects the larger contradiction in public expectations for college English departments. The general public expects on the one hand exactly what the student desires — a life of the mind. But it also expects original, empirically grounded research by intellectually brilliant, hardworking individuals. And it also expects the curriculum to be practical and relevant to the “real world” (whatever that is). So, because public expectations for English departments (and other humanities departments) are so contradictory, the pressures on us faculty are indeed stressful. Unfortunately, what this video does is displace the anxieties faculty have about the future of their discipline and the security of their jobs onto the idealistic student. In other words, in this video, the naive student is made to symbolically (or metonymically) stand in for the naive public expectations, the contradictory demands put on faculty, and the worsening job security. In this sense, the video is a bit unfair to the student, and considering its multiple audiences (students, professors, and administrators), it does little to move us forward towards a serious reflection upon our collective strategy for addressing the so-called crisis. And by “collective,” I mean the collective of students, professors, and administrators who ought to all be allies in this task, not antagonists. On the other hand, if students and administrators are able to see the humor in the video and sympathize with the overworked, yet-to-be-tenured faculty member who fears their job might at any moment be cut by overzealous politicians, then maybe the cartoon does do some useful cultural work.
p.s. And I suppose what my blog post is really saying is that for me the value of the humanities, and English in particular, is that it gives me the critical tools to perform the kind of complex reading of a simple cartoon. This is why I love being an English professor.
Last night I went to K’Naan’s concert at the First Avenue club in Minneapolis, and it totally rocked. Seriously, it was so much fun. If you’ve been living in a cave for the past couple years and don’t know who K’Naan is, he’s the Somali-Canadian rapper from Toronto whose song “Wavin’ Flag” became one of the official FIFA theme songs for the 2010 World Cup soccer tournament. The rather tame World Cup version of the song has been performed by artists around the world in 22 different languages, but of course fans prefer the more edgy and politically significant original. K’Naan himself has performed concerts in 67 countries, including his homeland Somalia. When I was travelling in Ethiopia this summer, I noticed that some people’s cell phone ringers were the “Wavin’ Flag” song. At the concert last night, I was impressed by the diversity of the crowd, including African immigrants not just from Somalia but from a variety of countries, African-Americans, Euro-Americans, Asian Americans, etc. A couple of white men standing near me in the audience (age mid-20s, I’d guess) seemed to have memorized all the lyrics and were vigorously singing along for the entire show. The overall vibe in the room was intensely positive, and several people had brought their own flags, heralding a variety of countries, so during the “Wavin’ Flag” song, people actually waved flags, dramatizing the internationalism of the event.
An amazing coincidence is that even before I knew he would be performing in Minneapolis, I had actually assigned K’Naan to my first-year seminar for last week’s lesson. I was inspired to create a new unit for my class on the Hmong and Somali in Minnesota after several incidents of racism against Somalis close to where I live last year (and which I blogged about [here].) I decided to assign K’Naan to show a different side of Somali culture as it manifests itself in Diaspora than what they will get from the history book we’re reading together — a side of Somali culture that the students can better relate to. The past couple years, I’ve noticed that in almost all my classes, there is at least one student who is a K’Naan fan, and because of the World Cup almost everyone is familiar with the “Wavin’ Flag” song even if they don’t know who its author is. During a brief discussion in class last week, my students observed that we often think of the American flag as a symbol for freedom, which reflects our bias as Americans, but that K’Naan’s song wisely makes the case that any flag, from anywhere, can be a symbol of freedom.
So, do I have a theoretical point to make here? Not really, but I’ll try to make a few simple gestures. First, recently colleges across the country have become interested in teaching “intercultural competence.” Usually, these programs, such as the “Intercultural Competence Assessment,” have emphasized the recognition of cultural difference. I have some serious disagreements with such programs, as I’ve blogged about before [here], but before I explain them, I want to show why most students will never be convinced by that model either. When I brought up the question of diversity and cultural difference in my “race and ethnicity in U.S. literatures” class last year, several of my students told me about an episode of The Office entitled “Diversity Day.” Most of our students watch this show and seem to know this particular episode, which makes fun of precisely the model of intercultural competence being promoted on college campuses today — a model that I call the “business school model” since that’s where it originated.
And the TV show makes fun of it for good reason. By emphasizing arbitrary differences such as clothing, handshakes, and marriage ceremonies, “diversity day” ignores the political and economic issues that really affect people’s lives and interests. In other words, as my students easily recognized, it gives you cultural difference without the possibility of real difference — without the possibility of having a different opinion that might conflict with policy or with a general business model. (By the way, a great novel recently published on this theme is Mohsin Hamid’s The Reluctant Fundamentalist, in which Hamid surprisingly reveals in the middle of the novel that the fundamentalism he’s referring to is not Islamic fundamentalism at all, but the economic fundamentalism of Wall Street.)
As theorist Slavoj Zizek wrote a couple weeks ago for The Guardian [here], the kind of vapid multiculturalism promoted by the intercultural competency assessors and satirized by The Office gives you difference without real difference, like decaffeinated coffee or sugar-free drinks. Interestingly, the World Cup version of K’Naan’s “Wavin’ Flag” is kind of like decaffeinated coffee — the controversial lyrics removed in order to celebrate a global unity. Fortunately, neither my students nor the fans in the audience are fooled by the Coca-Cola-ized World Cup version of the song; they prefer the original, which rather explicitly debunks the false promises of global unity (see the lyrics [here] for yourself.)
Moreover, what K’Naan’s global popularity suggests is that the arbitrary cultural differences that the administrators and assessors of intercultural competency wrongly believe are so important aren’t quite so important after all. Focusing on such arbitrary differences misses the point entirely. No matter where you are in the world, people like to have fun, want to be free, want security, fall in love, etc., as K’Naan says explicitly in his song “Dreamer” — a tribute to John Lennon’s “Imagine.” What we have in common is our struggle in an imperfect world rife with poverty, economic exploitation, and violence.
So, what’s the upshot here? Reflecting on the K’Naan concert and all the various things my students have told me (such as when they recommended I watch the movie “Good Hair” that I blogged about a few weeks ago [here]), I think our students are already pretty smart about intercultural competence, and we as teachers can learn a lot from them. The problem as I see it is when administrators and assessors don’t recognize the skills and knowledge our students already have and instead impose a rather simplistic (and rather silly) model of “intercultural competence” upon them. In other words, intercultural competence isn’t all that hard and doesn’t require a lot of theoretical sophistication so long as we begin with some rather obvious facts about the world we live in — the fact of the K’Naan concert, for instance. What I mean by “fact” here is that it happened. In contrast, notions of “difference” are not facts; they are notions and conceptualizations. So we can ignore the model imposed upon us by the “assessors” not only because it actually does more to obscure reality than to explain it, but also, more importantly, because it doesn’t recognize what our students already know, and what they already know might actually be smarter and more in touch with reality than what the assessors are advocating.
As I mentioned a few days ago in my blog post [here] about my itinerary in Ethiopia, I visited quite a few museums: the National Museum, Ethnological Museum, Addis Ababa Museum, Red Terror Museum, Jimma Museum, Harar Museum, and Arthur Rimbaud Museum as well as a traditional Harari home and the Asni art gallery. That’s a lot of museums, and unfortunately I didn’t take very good notes. Nevertheless, as I moved from one museum to another (sometimes three in one day), I began to notice some key differences between them, and I began to consider a couple of very old questions: what are museums for and how ought they be organized? And I say they are old questions because quite a few scholars have published books attempting answers… such as this, this, and this… though I must admit that I have not read any of them.
But I did notice a few things. I’ll start with the National Museum and the Ethnological Museum, since they are almost right next to each other, and, significantly, they tell very different stories about Ethiopia. The National Museum has three floors; the first floor focuses on prehistoric times and includes animal skeletons and the famous skeleton of one of our oldest hominid ancestors, nicknamed “Lucy” by Americans and “Dinkenesh” by Ethiopians (which is Amharic for “wonderful”); half of the second floor is devoted to ancient archaeological relics such as old tools and ornaments, and the other half is devoted to symbols of the Ethiopian empire of the 19th century, mainly the paraphernalia of its emperors; the third floor focuses on modern Ethiopia, especially the communist paintings of the 1970s celebrating peasant labor and the postsocialist, postmodern artwork of the 1990s mourning the brutalities of the communist regime and the fragmented national consciousness. Thus, as one moves from the first floor to the third floor, one moves forward in time, and the organization of material suggests a patriotic story of national unity and progress, from the prehistoric Lucy to the symbols of empire to postmodernist painting. And of course, by doing so, it projects Ethiopia’s contemporary political boundaries back in time, making the nation seem older and more continuous than it ever really was. There is nothing unique about Ethiopia’s national museum, as this is the project of most national museums all around the world.
In contrast, the Ethnological Museum is more multicultural, and thus reflects the direction of scholarship at Addis Ababa University since the mid-1990s (as well as the direction of scholarship in the United States and Europe), attending to different regions and cultures. The museum includes artifacts that illustrate the different rituals, musical instruments, tools, and clothing of the various ethnic groups. (Ethiopia has over 70 different ethnic groups, so for the sake of simplicity and space the museum had to combine many of them.) But there are two organizing principles at work in this museum — not only the ethnic one, but also the narrative of a human being from birth to death. As one walks through the museum, one walks through a universal human narrative of development from childhood, to marriage, to basic village economics, to religious ceremony, to war, etc. Thus, the museum shows both difference and sameness — cultural differences are ordered according to a universal sense of what it means to be human…. Except that all of the artifacts are primitive, and none of them modern. The museum seems to suggest that beneath all the cultural differences is universal humanity, but I couldn’t help but think that its essentialist sense of human-ness was actually quite alien to the experience of anyone growing up in the early 21st century.
To analyze these two museums, one might say that the National Museum is ideologically nationalist and modernist and that the Ethnological Museum is multiculturalist. But thinking critically, I began to wonder why there couldn’t be a museum that was both multicultural and modern. There seems to be something rather wrong about both museums, one asserting a nationalism that is blind to cultural difference and the other asserting an essentialist humanism that is blind to history. Instead, can we imagine a museum that would show how cultures are changed by historical forces that aren’t always progressive and might involve disparities in relations of power?
A completely different museum, the Jimma Museum, focused entirely on the stuff that was owned by Jimma’s last king, Abba Jiffar II, who ruled from 1878 to 1932. This would seem to be the most unsophisticated museum of the bunch, as its goal seemed to be nothing more than to celebrate a ruler. Moreover, it seemed to celebrate him without admitting some uncomfortable truths, such as the fact that he acquired much of his wealth from the slave trade. However, in another sense, because it focuses on the stuff owned by Jimma’s most important person, it actually does do exactly what the National and Ethnological museums do not do — it simultaneously indicates the plurality of cultures AND historical change. Jimma was an important peripheral city for a complex economic network that extended from Europe to India. What one experiences as one walks through the museum and its cases full of artifacts from all over the world is a rather uncanny sense of cultural hybridity. The museum is in a sense a testimony to one king’s struggle to survive during the height of European imperialism and the rapid growth of the modern capitalist world system.
Now, a funny thing happened in Harar’s museum, which is basically one large room, whose walls show the history of the city and whose center has tables full of ethnological artifacts. First, the tour guide took me around the walls of the room, teaching me about Harar’s history, and then we looked at the tables in the center of the room. In a sense, this museum combines the ethnological with the historical, but the overall effect is a little strange. Each table in the center is devoted to a different ethnic group (Oromo, Gurage, Somali, and Harari), thus celebrating Harar’s multicultural past and the possibility of peaceful coexistence. But I noticed two ironies. First, most of the artifacts on the different tables were pretty much the same, which suggests to me that a thousand years of intermixture makes the cultural differences between the ethnic groups almost negligible; it was difficult to understand why separate tables were necessary and why materials were not organized in a different way. (I wondered whether anyone would notice if I switched some of the artifacts.) Second, I observed that there was no table for Amhara or Tigray artifacts. I asked why that was, and the guide simply shrugged. (Now, for those of you who don’t know Ethiopian history, it was the Orthodox Christian Amhara and Tigray kings who, in the late 19th century, conquered the territory now known as Ethiopia and created the modern state with European technological support, i.e., guns.) So, the joke that I then made to the museum tour guide — which elicited a nervous laugh — was that the table of Amhara cultural artifacts was not in the center of the room because it was part of the “history” wall. 
And then I pointed to a section of the wall that had a rack of modern rifles and machine guns from one of the early 20th-century battles, and I joked, “oh, there’s the table of Amhara and Tigray cultural artifacts.” My snide comment was meant to draw attention to the strange dichotomy suggested by the museum’s organization that ethnic culture (Oromo, Somali, Harari, Gurage) is somehow not historical in contrast to the politically dominant culture (Amhara and Tigray), which is able to be a historical culture (i.e., a modern culture) only by means of its acquisition of more technologically advanced weapons. (Indeed, the Oromo peasants’ word for their Amhara “landlord” is “gun-holder.”)
How we remember the past has important implications for the future of how we organize our lives together. Since the 1990s, Ethiopia has reorganized its state in a way that now recognizes its many ethnic groups in the wake of two regimes (Haile Selassie’s empire from 1916 to 1974 and Mengistu’s Derg from 1974 to 1991) that actively and often violently sought to repress the cultural achievements of those ethnic groups. Arguably, the current regime under Meles is just as repressive as the previous two, but there is a difference. Unlike the previous two regimes, it seems to me that the current regime allows for cultural differences and cultural achievements to be expressed, but of course it makes this allowance so long as no real political or economic consequences follow from that expression. After all, there are now Oromo cultural centers everywhere in Ethiopia, elementary schools now teach in the ethnic language of each region, some universities now have an Oromo folklore department, and there is an Oromia Bank and the “Finfinne Branch” of the National Bank… but the masses of people in Ethiopia (whatever their ethnicity or culture may be) seem to have little say in matters such as environmental regulations, the organization of land, foreign policy, etc.
In my opinion, the possibility for real alternatives remains marginalized or concealed from view in the museums that I visited. For instance, the Addis Ababa Museum carefully documents the development of the city from a small Oromo town to a military camp to a modern city, but it includes very little about the culture of the local people before the Abyssinian emperor Menelik II moved his military camp there at the end of the 19th century, and it includes hardly a mention of King Iyasu, who ruled from 1913 to 1916. Most of the Ethiopians whom I met characterize Iyasu as an alcoholic womanizer or as a closet Muslim, but in fact he tried to reorganize an Ethiopian state that would be both multicultural and opposed to European imperialism, and he was only 21 years old when he was deposed. More importantly, he was deposed by a politically powerful Orthodox Christian elite who preferred to ally itself with Europe’s imperial desire to exploit labor and resources and oppress the other ethnic groups. The Addis Ababa Museum noticeably puts his portrait in the shadows behind other artifacts as if in embarrassment, and most of the Ethiopians I talked to incorrectly repeat the propaganda, believing that Lij Iyasu’s overthrow was a result of his personal ethics rather than his political decisions. In other words, the average person on the street in Ethiopia seems to have no public memory of Iyasu’s idealistic, egalitarian, anti-colonial agenda and no sense of why European governments would support the coup d’etat that put Ras Tefari in power.
And all of this has important implications for Ethiopia’s newest museum, opened just a few months ago and devoted to remembering the Red Terror. The “Red Terror” is the name for the period when Mengistu’s Derg regime went on a brutal witch hunt against anyone opposed to its rule. Thousands were tortured and murdered. Next to the museum’s front door is a placard with the slogan, “Never Ever Again.” The museum begins by documenting events shortly before the Revolution with magazines and leaflets containing revolutionary arguments against Haile Selassie’s oppressive monarchy. Such documents attest to the revolution’s admirable goals, but the museum shows how the revolution soon devolved into an oppressive, terrorist regime. The museum includes more than 700 photographs of the victims, including women and children; it includes the technologies of torture, a map of the secret detention centers, coffins full of unknown bodies, pits full of bones, and some works of art made by survivors; and it even includes a duplicating machine that was destroyed by Mengistu’s regime because it would have published arguments against him. The Red Terror museum reminded me of the Hiroshima museum in Japan and the Holocaust Memorial Museum in Washington D.C.
I think such museums are very important, but I do have a concern about them that is more than merely theoretical. As a site of public memory, the Red Terror Museum seeks to document an act of evil with the explicit goal that Ethiopia never does it again. However, how different was Mengistu’s regime from the regimes before and after it? After all, Emperor Haile Selassie ordered jet fighter planes to drop bombs on his own people in the 1960s, a fact that I never saw mentioned in any of Ethiopia’s museums. And some human rights organizations claim that the current Prime Minister Meles has arrested, tortured, and murdered political opponents. And was Mengistu simply evil? To say so doesn’t really explain the complexity of the situation, and it certainly doesn’t explain why so many people in Ethiopia would have supported Mengistu at the time. My concern here is that by demonizing one person, such a representation ignores important truths and creates an alibi — an alibi that allows Ethiopians (and also Americans and Europeans) to imagine that they themselves never supported the evil Mengistu and wouldn’t support such a man today… even though they did… and they do… and we do.
At the end of my presentation on the Harlem Renaissance at Addis Ababa University, the students and I discussed the role of art, literature, and representation. I told them that all of the recent novels published in the United States about Ethiopia focus on the Red Terror and that, although I certainly recognize the importance of that subject, it bothered me that American literature couldn’t think of anything else to say about Ethiopia. They told me that, likewise, much of Ethiopia’s own recent literature did the same thing — and that it often did so by simply representing Mengistu as a monster. The students and I agreed that art ought to do something better if it is to help us (the public) work through the paradoxes and problems we face in the globalized capitalist world in which we live. In many ways, the organization of a museum is like a work of art as it seeks to encourage people to reflect on their identity and the atrocities of their history so that the phrase “never ever again” is not merely an empty slogan.
Recently, I was reading some scholarly books and articles that, among other things, respond to Paul Gilroy’s thesis in his book The Black Atlantic: Modernity and Double Consciousness, first published in 1993. In that book, Gilroy basically theorizes that we need to understand the enlightenment tradition and the modern world in which we all live as a hybrid phenomenon that began with the violence of the transatlantic slave trade and emerged out of the commercial and cultural exchanges across the Atlantic Ocean, including the cultural contributions of Africans, Caribbeans, Europeans, Native Americans, etc. It’s a complicated book, and I don’t have time to go into its argument in my blog. Instead, while I was reading, I came across two very contradictory statements about Gilroy’s book that I found very curious. One of them calls Gilroy’s book a history, and the other says that it’s nonhistoricist. How could the same book look like a work of history to one person, and look like the opposite to another person?
So, first I’ll show you the two contradictory quotes that I’m talking about, and then I’ll attempt to explain that contradiction. That contradiction might help us understand what this thing called “theory” is.
In her fascinating book, Hegel, Haiti, and Universal History, published in 2009, Susan Buck-Morss wrote about the work of “historians like Paul Gilroy, whose attempt to grasp the diaspora of Africans across the black Atlantic led him to argue that no identifying concept of race or nation is adequate” (p.111). Buck-Morss is a professor of political philosophy, and her book convincingly demonstrates that the famous philosopher Hegel conceived of the “master-slave dialectic” as he was reading about slave revolts and the Haitian revolution in newspapers and magazines. Notice that when she refers to Gilroy’s book, she calls him a historian.
But a few years earlier, in 2001, Ronald Judy, a professor of English, published a review of Gilroy’s work in issue 28:3 of the journal Boundary 2 where he says that Gilroy “strove to present a nonhistoricist account of modernity that recognizes protocols for living in the world today in the supposedly marginal expressive forms (most particularly music) of what has come to be understood as African Diaspora culture…” (p. 210). Notice that when he refers to the same book, he calls it a nonhistoricist account.
So, is Gilroy’s Black Atlantic historicist or nonhistoricist? Is this purely a difference of academic disciplines? Perhaps what looks like history to a philosopher such as Buck-Morss may look like something else to somebody who specializes in history or literary history such as Judy. And what (academically speaking) is “real” history supposed to be anyway? Certainly, real historians are meticulous about thoroughly checking archival data in order to figure out what really happened, and Gilroy’s book (which just focuses on a few famous texts) is not that. But I don’t know if I’d call his book “non-historicist” either, since he does consider the movement of history and the relationship between books and historical forces. After all, later in his article, Judy himself calls Gilroy’s approach a “conceptual history of modernity” (p. 211).
So… apparently… it’s a nonhistoricist conceptual history…. Huh?
What’s also obvious, and perhaps important to point out, is that Gilroy’s many books are getting read and discussed by people in a lot of different academic disciplines: literature, history, philosophy, sociology, political science, etc. But apparently they all have different senses of what Gilroy’s work is and what his work means for them…. Or do they? Maybe the difference between them is less important than the similarity. All of them have changed their approach to their own academic discipline in the same way — taking Gilroy’s point about the “black Atlantic” as the originating locus for modernity and the enlightenment tradition instead of Europe. And likewise, they all now see the black Atlantic as the starting point for thinking about freedom and democracy instead of the United States.
Still, even though scholars from a range of disciplines are all taking up the same basic and groundbreaking point, at the end of the day, what do we make of the two contradictory statements about Gilroy’s book? Let’s take a brief detour into the work of another theorist — we might imagine similar contradictory statements being said about a couple of well-known books by Michel Foucault that you may have encountered in your introduction to theory class: Discipline and Punish and The History of Sexuality. Philosophers and literature professors often treat Foucault as a historian, but historians treat him as a philosopher. This ambiguity and the interdisciplinary nature of Foucault’s and Gilroy’s work give us some insight into what “theory” is. Theory is not philosophy, though often courses in literary theory will read philosophy. And theory is not history, though it often thinks very hard about history and talks about historical contexts. Theory is always in-between disciplines because its project is to change the way people think within those disciplines. Foucault was very clear about this in his essay “Nietzsche, Genealogy, History,” where he explains his project as one of genealogical critique. What does this mean? Roughly, following Friedrich Nietzsche’s famous example, Foucault means that his goal is to open up new possibilities and new ways of thinking about and acting in the world. His method for doing this is to critique how certain ways of thinking and acting have been repeated, enacted, institutionalized, and developed over time. Once one can expose these habits of thought (or habits of philosophy) as contingent rather than necessary, one can liberate oneself from the shackles of mental habits and academic disciplines… and think beyond them.
Then how does literature relate to theory? In some ways, the projects of literature and theory are similar. They both seek to open our minds to alternative ways of thinking about the world, and in that sense, they are allies. But in other ways, literature merely repeats mental habits and often simply repeats the conclusions of academic disciplines such as history and philosophy that the author may have read in school, so theory is a tool that can expose literature’s complicity with hegemonic power. In other words, literature is also part of a genealogy of ideas and institutions, and therefore theory can critically expose its place within that genealogy.
Coming back to Gilroy, he suggests that we read literature from the 17th century to the present as part of a “black Atlantic” context instead of as part of an “English” or “American” or even “African-American” context. How might Gilroy’s conceptual reframing of history, literature, and philosophy change how we read literature by such famous figures as John Milton, Walt Whitman, or Jane Austen as well as how we read literature by arguably more important figures such as Frederick Douglass, James Baldwin, or Toni Morrison? If we look at things from a black Atlantic perspective instead of from — let’s say — a British perspective, who appears to us to be the shining luminary figure that epitomizes the “best that has been said and thought” in our world? Perhaps English departments should all be requiring students to take a course on Frantz Fanon and Bob Marley instead of requiring a course on William Shakespeare?