When I interviewed for my current job we got onto the topic of alumni data tracking. My program had an exit survey on their website, one that suggested they were collecting contact information and checking in with PhDs in the years after they’d left our institution to see how and what they were doing. (It turns out that no one knew the form was there, and it hadn’t been used in many years.) We then got to talking about program evaluation, one of my favourite subjects, and about how we could start assessing if the professional and career development work we were doing–if they hired me–was having any effect on the post-PhD lives of our graduate students and postdocs.
“What we don’t need to do,” I argued in the interview, “is worry about the percentage of our alumni who get jobs after they leave us. In Canada, PhDs have the highest employment rates of any educational level. We know that PhDs get jobs after they graduate. What we don’t know is how hard it is to find those jobs, how long it takes, if those jobs are fulfilling and pay well and use the skills we’ve worked to help them acquire. We don’t know if the work we do teaching people how to develop their careers and transition into new fields works. The success of our programs can only be measured by our success in helping with all of those other things, because we can’t take credit for PhD employment rates. They’re great without us.”
That employment data wasn’t what we needed, that we had it already and it told us something promising but frustratingly incomplete, was a bit of a revelation to the people who would become my new team, just as it is to the conference panels and PhDs and graduate chairs with whom I often share this bit of information. It shouldn’t be a surprise–this is Statistics Canada data, after all, there for anyone to find and analyze (and, with the reinstatement of the long form census, being collected once again). But my organization is clearly not the only one to still think that employment data is the place we need to start in understanding the lives of our PhD alumni and the value of our programs, academic and otherwise. As Gary McDowell writes in Science this week, higher ed is still furiously mining for what he calls the fool’s gold of PhD employment data. And what it finds is fool’s gold not only because it doesn’t have much value, but also because it looks shiny but is tarnished at heart. In the US, the Survey of Earned Doctorates and the Survey of Doctorate Recipients stand in for the StatsCan data that we tend to use up here, and like the StatsCan data, they tell us only so much. They tell us that PhDs are employed, and roughly where. But they don’t tell us anything about the quality or nature of employment that PhDs are finding, and that is ultimately what we really need to know.
As an alternative to census data, the other popular approach at the moment is the old “let’s Google it,” and that’s the approach taken by HEQCO,* the Chronicle Vitae Academic Job Tracker project, and the American Historical Association. It’s not a bad approach when done carefully and well, as it at least allows us to see what specific jobs people in different disciplines are ending up in and, if we have things like CV data, the path they took to get there. The better studies, like the AHA and Chronicle Vitae projects (both, not coincidentally, run by Lilli Research Group), limit their Googling to sources that are arguably accurate and verifiable.** But people lie on the internet all the time, or job titles are misleading (is that Assistant Professorship a visiting or a tenure-track one? No way to know from your vague university bio, and no one has bothered to ask you), or people just can’t be found (this is especially true for people who move into non-academic employment).
And these data-collection exercises for the most part still don’t tell us what we really need to know (or at least what I really want to know): What kind–in qualitative terms–of employment are PhDs finding? What was it like finding a job? How long did it take? How much did you make in that first job? Did it use the skills you gained in your PhD? How long did it take you to get your first raise? To get promoted? Did you do any career development workshops in your PhD? Did they make you feel more confident in embarking on your post-degree job search? What is your employer’s perspective on hiring PhDs? And for those of us who work in graduate careers, professional development, support, graduate program reform: is our work doing anything? Are we helping people minimize the transition time between PhD and enjoyable, valuable employment that makes use of their skills? Are we reducing the emotional whiplash of being thrust out of the academy and into the non-academic working world? Do people feel confident in their ability to identify and deploy the skills they’ve learned in the classroom and the lab, in our seminars and in their own work to broaden and deepen their skill-sets? Are we doing anything at all? The TRaCE project running out of McGill University is taking steps in this direction, but major issues have already been raised with the validity of its approach and the data that comes out of it.***
The problem with seeking answers to these questions is the difficulty of reaching those who can answer them, and then making sense of those answers. Googling someone is easy. Reaching them by email or phone to ask those questions we want answered is far harder. It takes person-power and time and more money than any of us as individual organizations have. It also takes the buy-in of our PhDs, sometimes long after they’ve left our organizations, and that’s the place where these exercises often fail. Figuring out a baseline against which to measure our efforts is perhaps just as difficult–how hard was it to find a good post-PhD job before we started offering graduate career development programs? Did our PhDs find good jobs faster after we launched that internship program? How do we qualify or quantify what “easier” or “better” or “better aligned to my skills” looks like? How do we adjust for the fact that PhDs and postdocs, who are underpaid and undervalued during their training, might think a first job a godsend that years later seems like ill-fitting, underpaid grunt-work?
We don’t need more employment data. Quantitative data is not what we need. Perhaps my humanist side is showing, despite the fact that I now work almost exclusively with STEM researchers, but this is a qualitative research problem. What we do need is contact information and to talk to our PhD holders–actually talk to them, systematically and en masse so that our data is comprehensive and valid and comparable against that useful but incomplete quantitative data–and ask them those questions I noted above. I wish someone had called me up and asked me these questions a couple of years after I took my first post-PhD job. I could have told them a lot. Instead, I use my experience–and that of the PhDs I talk to, every day, at work and online–to try to do more, and do better. Still, that’s anecdote, not data. We’re never going to be able to do our best in helping PhDs to find well-paying, engaging places to put their knowledge and skills to work in the world if we don’t start asking a whole lot of people the right questions. And start figuring out how to do that in a way that’s sustainable.
I’m in the midst of scoping out just this kind of project to be undertaken by the centre in which I work, and we’re hopeful that, if we’re smart and careful, we can come up with a model for PhD data collection that goes beyond the quantitative, and that uses qualitative data and its analysis not just to inform the work we do locally, but also to inform real change in how we go about the business of graduate and postdoctoral training more broadly. It’s early days yet, but stay tuned.
* For a useful take on the major issues with the HEQCO report, see Melonie Fullick’s Speculative Diction post. Her post on the Conference Board of Canada report, which contains the most comprehensive analysis of PhD employment data collected via the Canadian census, is also interesting and illuminating.
** Researchers at the University of Ottawa are also doing some interesting work with alumni records and tax data that looks promising in terms of answering the money part of these questions, but that again only gives us part of the picture.
*** For a thorough critique of the TRaCE project, I direct you once again to Melonie Fullick.