As the season approaches – and by season, I naturally mean, as perhaps only literature academics can fully appreciate, the MLA interview season – I once again find myself pondering the vagaries of the academic job market. Every working academic has at least one horror story about being on the job market, and as I go up for tenure this year, six years after my initial interview with WT, I find I’m still not psychologically finished reliving all of mine. Though it’s tempting to relate them (I am prone to confessionals), a forum such as Arcade is perhaps not the place to review the host of errors I have made and seen made by others over the years; but it may be a good place to consider a question that has been nagging at me lately.
I have regularly heard colleagues ask interviewees who the “cutting edge” theorists are in a candidate’s given discipline. Such a question is doubtless designed to assess whether a candidate has read broadly in her field and whether she is current in her scholarship – both reasonable expectations of a candidate, particularly one aspiring to a research position. Even so, I find the question of who the most cutting edge critics are to be nearly irrelevant for newcomers to the field (which most candidates are), since whatever they are studying is likely to be, by some definition, “avant-garde” in their field. It is harder to answer for those of us who are steady researchers long past the dissertation stage, who must carefully pick and choose what we read because precious research time has to be fit in alongside a heavy teaching and service schedule (not to mention the needs of two small children). I simply cannot read widely and generally in my fields (though doing so is a great pleasure when I have the opportunity), because in order to write and publish, I must be invested first and foremost in the specifics of my own immediate areas of research.
But even beyond what is probably a chip on my shoulder for being old, cranky, and out of touch, I find it a frustrating question because I see it as representing one of the problems facing academia today: we are often more concerned with the “hip” and the “now” than with the enduring. Such a complaint applies to everything from teaching styles to research agendas to publication. At my own institution, for example, we have quite modern equipment for a small regional school in Texas: every classroom is a smart classroom, and we have several fully equipped computer labs that we are encouraged – even harangued – to make use of because they are there. For my part, I find using a computer in the classroom useful, but I know colleagues who have never used one, and who still deliver some of the best lectures I have ever witnessed. These colleagues face undue scrutiny from an administration that sometimes seems more interested in being able to tout a percentage in the latest edition of U.S. News & World Report about how many courses make use of these modern facilities than in whether the content is well delivered.
The “most recent” is also privileged in our ongoing academic conversations. When someone has a good or useful idea, thirty proliferations of the same idea soon follow, slightly skewed, restated, or applied differently. There is nothing wrong with seeing where these ideas take us, for, after all, such works constitute the necessary back-and-forth of a community that helps us to collectively interpret works, understand culture, and develop our own ideas. But there should also be nothing shameful in admitting that while one’s research must be current, the theorists and writers one owes the most to are older because they are, in some cases, better. In my own research, for example, there are still today few works on Djuna Barnes’s Nightwood that improve on Jane Marcus’s 1991 reading of the novel, “Laughing at Leviticus,” which is now nearly 20 years old.
I have had similar experiences in publication. Having spent a youth (and adulthood) hopelessly out of style, I would not be surprised to learn that my work is outdated. But it has been a revelation to discover that, in writing for people who define themselves as literary critics, the monograph itself is outdated. I have been told by numerous publishers (who, in my growing experience with these astute folks, never tell me anything just to spare my feelings) that people don’t buy or read books that are all about one other book – perhaps another loss to the diminishing attention span, or perhaps I have only to hold onto my precious pearls of wisdom until that rusted old pendulum swings back in my favor.
In truth, of course, we all hope to be “cutting edge,” either personally or by proximity, and we all hope to teach at “cutting edge” universities that appeal to the young and the hip who are our primary clientele (or at least that’s the way administrators at my school describe them). And so a question to prospective colleagues about the trendiest writers in our fields is, on the one hand, natural; on the other, it demands that our interviewees, too, buy into the cult of youth and freshness of which we like to imagine ourselves a part.
While such a group, I admit, has its appeal, I believe that very few of us can actually be “cutting edge” if the term is to have value. Not all institutions can be on the cusp of literary breakthroughs; not all colleagues are doing groundbreaking work that will change the way we see our world. Indeed, however much our predilection for square glasses, stylish shoes, and the latest MLA business suit belies it, there is very little about academia that is “cutting edge.” Of course we want to make our work seem relevant to students by being “here and now,” by speaking to politics or current theoretical and social trends, and we worry that we are “out of touch” when we fail to do so. And yet, how can we hope to be trendy in a discipline that requires years to write a (good) book and at least a year more to actually publish one? With the possible exception of those specializing in political science, popular culture, or digital media, probably none of us is doing work that is au courant. We can, of course, speed up production processes. We can choose to cut corners in reviewing; we can publish online, editing mistakes as we find them; and we can in other ways prioritize speed. But none of these methods can reduce the time it takes to do good research and writing. And to shorten that time is to level the playing field in a way that makes anyone capable of writing but no one worthy of reading; it is to esteem modernity over knowledge.
I do not mean to critique us for our inability to be immediately relevant. I am arguing that relevance to current events or the latest trend is a poor way to sell what academia has to offer. Whether we like to admit it or not, we are part of a hierarchical tradition that reads and evaluates great works. Our relevance lies not in our ability to speak to the fleeting, but in our ability to recognize, interpret, and assess the enduring. We can occasionally be timely in our research, but usually only by accident. And so the real key to convincing (hip, young) students of the relevance of an antiquated pedagogical system lies in embracing something that we, in our postmodern, dissociated, individualist world, no longer believe: that there are “universal truths” in most well-crafted literature, just as there is meaning in thoughtfully considered lives. Being relevant to students may not be a matter of privileging them by confirming that their lives are unique and original, but of helping them recognize that their experiences and thoughts are part of an ongoing, shared dialectic that has its ebbs and flows, and that is connective precisely because it is never wholly new. It is instead the product of a history that academia, perhaps more than any other form of hierarchy, values. For good reason. Time is one of the greatest tests of relevance.
And so, for my part, a better question for prospective candidates – or a better way of phrasing the question, perhaps – is not which current scholars are “cutting edge” in the discipline, but which scholars are likely to be influencing the discipline over the next 50 years, and why.