By Matt Schneider, PhD Candidate
Over the past week, this blog has been abuzz with insightful and well-considered responses to the now-infamous “So You Want to Get a PhD in the Humanities” video. The conversation has unearthed a number of important concerns and identified some worrisome trends in humanities scholarship, especially the teleological assumption that PhD students are working towards employment in the academy. These concerns highlight the fact that the humanities in general operate on a number of unexamined assumptions, and that these assumptions damage both PhD students at every stage of their programme and the public’s perception of the humanities in general.
One thing I’d like to add to the conversation is the way departments tend to avoid dealing with students who want to work in what essentially amount to “unproven” fields. In my programme, I’ve noticed many students having to fight with their departments, to varying degrees, in order to do the work they want to do, simply because some aspect of that work—be it comics/graphic novels, digital texts, children’s literature, romance novels, speculative fiction, etc.—has not yet become a Proper Field, and as such the university is unsure whether the work is Serious and Important. To make matters worse, these projects are almost always SSHRC-funded, normally with a CGS, and every one of them was accepted by the department when the students applied.
It is frustrating that the government is willing to fund these projects (and fund them well), and that the university is willing to accept them (at least initially), only for the university (or key figures in it) to express doubt about the feasibility, hireability, or validity of this same work once the student actually gets down to it. This contradictory behaviour is especially frustrating because many of the students working in these unproven fields are especially well suited to them, having great personal interest in an area that is either misunderstood or ignored by most scholars. Essentially, some of the brilliant students who could one day be the stars of these new and emerging fields are being told that their work is interesting and holds great potential (so much so that the government was willing to pick up the tab), but that they’re not allowed to do it until the field is better established.
This attitude is detrimental to the emerging fields themselves. Several of my fellow students have had to change their projects drastically, oftentimes cutting out the very interests and elements that made their work so valuable and unique. A student studying affect in the non-fiction comics of Joe Sacco, Art Spiegelman, and Marjane Satrapi, for example, is quickly made to shift focus, with supervisors and faculty asking her to include more novels or biographies that aren’t comics until, by the time she hands in her dissertation proposal, the comics have moved from primary texts to secondary at best, or are absent entirely at worst. This is not to say that dissertations in new fields wouldn’t benefit from a grounding in canonical (that word!) texts and methodologies—I, for example, just recently discovered a connection between the work of Jonathan Swift and Unicode (for those interested, search for the words “bigendian” and “smallendian”)—but rather that canonical works should enrich our studies of new fields, not authorise or rationalise them. If we insist that students spend the majority of their time studying the tried and true, we effectively stunt developing fields by forcing students to wait until they’ve become established in a “traditional” field before shifting their scholarly focus back to their passions.
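For the technically inclined, that Swift–Unicode connection can be made concrete: computing borrowed “big-endian” and “little-endian” from Gulliver’s Travels (by way of Danny Cohen’s 1980 note “On Holy Wars and a Plea for Peace”), and the terms live on in how Unicode text is stored. Here is a quick illustrative sketch in Python—my choice of language, purely for demonstration, and it assumes Python 3.8 or later for bytes.hex with a separator:

    # UTF-16 can store each character with its big end (most significant
    # byte) or its little end (least significant byte) first.
    text = "Lilliput"
    print(text.encode("utf-16-be").hex(" "))  # 00 4c 00 69 ... big end first
    print(text.encode("utf-16-le").hex(" "))  # 4c 00 69 00 ... little end first

    # A byte order mark (U+FEFF) tells a reader which convention a file uses;
    # encoding an empty string with plain "utf-16" emits just that mark.
    print("".encode("utf-16").hex(" "))       # ff fe on little-endian machines

Swift’s Lilliputians went to war over which end of a boiled egg to crack; our machines merely disagree over which end of a character to write first.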
Perhaps most frustrating is the fact that many of these fields could well raise the standing of the humanities in the eyes of the public. Sure, the public may at first find it amusing that Intellectuals are Studying something like romance novels, comics, and videogames, but ultimately these are works the public can connect with. There’s a reason books like The Philosophy of Buffy the Vampire Slayer sell better than your average scholarly anthology. The latest collection of post-Lacanian psychoanalytic explorations of the works of Djuna Barnes may well be stunningly insightful and invaluable to scholars studying that amazing writer (I meant Barnes, but you can pick your favourite of the two), but much of the public is simply not in a position to connect with it. By contrast, a dissertation examining the intersections between religion, gender, and politics in the Twilight series has the potential to reach a much broader, non-academic audience—the series has sold over one hundred million copies according to Publishers Weekly. If a scholar were to connect with even a fraction of a percent of that audience—one-tenth of one percent of one hundred million is still one hundred thousand readers—she would, by academic standards, be a best-seller dozens of times over. When we discourage scholars from studying these popular works, we are wilfully distancing ourselves from the public at large.
Perhaps if we in the humanities want to be taken more seriously, we should encourage bright up-and-comers to prove themselves in these new or obscure fields. Not only would this attitude prevent students who were accepted for proposals in these areas from feeling like they fell for the old bait-and-switch, but it would also open up new avenues for the scholarly community to engage with the public. If we want others to take the humanities seriously, perhaps we should first ensure that we take the humanities seriously ourselves.
I should add, of course, that not everyone sees relating to the public as a good thing.
Hear! Hear! I agree with most (all?) of what you write, Matt. I think what you write about also raises the vexing question of how best to define/negotiate/articulate the relationship between the PhD Candidate and his/her supervisor and/or supervisory committee. While PhD Candidates ultimately need to take seriously the advice of their supervisors and committees, they also need to know when to politely “agree to disagree” with some of the advice they may be given about THEIR projects. It's a very difficult dance to learn, I think. How much pushing back is necessary? How much pushing back is pig-headed stubbornness? I have no answers to those questions, but I think they are all part of your discussion of PhD Candidates who change their projects radically to fit someone else's notion of disciplinary priorities.
That's my two cents' worth.
Hm. I should start by saying that I did a PhD in an 'unusual', emerging, not-yet-canonized field (roughly cyberculture studies, degree starting in 1998; my two foreign languages are French and Java). So I am obviously sympathetic to the idea of the academy and its disciplines moving in new directions.
However.
If I may be perfectly frank, a lot of crappy work is done under the name of discipline-busting, new-frontier-opening, no-one-understands-the-genius-of-my-work innovation. I know: my field was full of it, particularly in the period from 1989(?) to 2001(?). A lot of this work is blind to proven methods and hard-won bodies of critical knowledge, trades on its 'newness' rather than its smarts, and makes wild and general claims about how the world is forever different.
The disciplines are conservative by nature: by this, I mean that they are oriented towards the careful and slow building up of knowledge over the long haul. This can be a powerful boon to understanding what looks like something new under the sun: for example, when I do a panel on “Is Twitter Killing the English Language?” I can point to more than a thousand years of similar vast upheavals and public worry over language growth and change. There's context. A firm ground to stand on.
Of course, the disciplines can be conservative to the point of being hidebound, too. In my own interdisciplinary work, I sometimes have real trouble passing peer review: the sociologists want me to use more statistics, the literary critics don't understand why I need them, and everyone calls me a dilettante. The bar is higher for those who want to do this kind of work, and rightly so. But that doesn't mean the bar should be insurmountable, and I'm sorry for all those who have had the spirit and newness crushed out of them in that way.
Part of what graduate students do is to push against the frontiers of knowledge, and against the comfort level of their committees. That's necessary, and appropriate, and it enlarges our scope and brightens everything up. Part of what professors have to do is to make sure those new projects take full advantage of what disciplinary accomplishments to date can offer in terms of context or theory or method. Like Lindy says, it's a dance where we risk stepping on one another's feet, to mutual injury.
Awesome post, Matt!
I think the thing that saddens me most about the standard academic view of “emerging fields” is that many new PhD students are doing fabulously interesting work within these fields, but it's very, very interdisciplinary. As Aimee noted above (in terms of grad students pushing the frontiers of knowledge), that's fine and good for grad school (and, I suspect, for SSHRC), but once one goes on the job market one has to re-box oneself into the traditional fields, and it doesn't work. And I know: the sensible, well-advised students start boxing themselves in early (hence many schools accepting non-canonical proposals, but then slowly molding students toward more recognized work), but… well, frankly, I find it all a little heartbreaking. And it often feels connected to some antiquated notion of the “high” versus “low” culture debate.
I'll admit that I'm biased: this is all coming from someone who managed to get a job in which she is currently teaching Satrapi's Persepolis, Buffy, and yes, even Twilight. I think one additional angle on this debate concerns the politics of creating non-traditional, “money-maker,” enrollment-heavy courses, and how they may prop up the less-enrolled, more traditional classes (this type of funding is obviously more prevalent in the US). I know that the eight over-enrolled Harry Potter courses at my current institution basically fund a good chunk of the English department. I don't necessarily think that such a set-up is a good thing, but I do think it further complicates the interactions between traditional and emerging fields.
Great comment, Amanda, and I think the problem of “Blockbuster” (to borrow a term from Art History) classes is an important one to consider. I'm torn on this one: on the one hand, they open up a popular topic to scholarly inquiry, and allow students to engage critically with a work of great personal relevance; on the other hand, it's easy to treat them as short-lived cash cows, which decreases their sustainability.
When we in English are asked what students gain from a BA in English, we often say that we prepare them to read and interpret their world, and thus make them critical thinkers and better citizens. If this is the case, however, then things like the Harry Potter classes are a perfect example of us keeping true to our word: a Harry Potter class is the perfect environment for students to see the benefits of English in action. It also allows us to bring up theorists and works that students would normally find inaccessible, in a way they can relate to. Unfortunately, such classes can also prove highly unproductive if the students are unprepared to engage critically with the works being studied. Sure, a blockbuster exhibit on Picasso might bring a crowd into an art gallery, but if the gallery isn't careful about the way it engages those visitors, they'll come and go without ever having changed their understanding of Picasso's work. It's a fine balance, but one I think we should strive to strike, as these sorts of classes really do offer a unique opportunity to educate students.
I have to say, I'm hesitant about validating popular culture studies through the argument that they are more apparently relevant to a contemporary audience. That isn't because I have a problem with popular culture studies (I think some enormously interesting and valuable work is being done in the field) or with relevance (because I am completely behind humanities scholars pushing the borders in terms of public engagement).
What gives me pause is accepting the terms of relevance on which the public already functions. It trends dangerously close to the myopic. Sure, students will show up in droves for Harry Potter and Twilight courses, and this may be an opportunity to teach them to engage critically with the world around them, but it fails to make an argument for the critical or cultural relevance of what lies beyond the popular sphere or the contemporary moment. It is this broader attention to culture in its synchronic and diachronic forms that I think gives humanities work much of its value, and I would hate to see it lost through a disproportionate privileging of the contemporary.
And while it seems that, within your current institution, you're fighting against an unusually conservative disciplinary structure, I'm not sure that's true across the board. In fact, with increasing pressure to prove relevance, I wonder if what's at risk isn't 17th-century Spanish life writing more than digital texts. If you know what I mean.
I honestly think that these university-level conflicts come down to ego. Some professors are threatened by their students and want to sabotage them, or they feel insecure and unqualified to supervise them.