Many years ago I started a professional development workshop for our Ph.D. students at WashU. One recurring topic is the academic job market.
This year I assigned four undergraduate students the task of collecting data on job market candidates from the US News top 25 political science departments. Specifically, I wanted my RAs to find the CV or website of students on the market. This data collection included the following attributes:
- Does the candidate have a solo authored publication? A co-authored publication?
- Did the candidate complete an exam in methods or formal theory?
- Years to degree
Obviously this doesn’t answer many of the important questions used by search committees. Did the applicant do field work? What type of training did the student get in grad school? What do the letters of rec say? We could add a long list of different attributes and experiences, but it turns out my very minimal data collection was a lot harder than expected. Why?
Candidates: Some candidates don’t have a website or CV. Others didn’t list their field, their committee, or the minimum info search committees would want on a CV.
Departments: Some departments don’t prominently list their Ph.D. students on the market! Others make it very difficult by providing very little information or have a website that includes many students that have been on the market for years.
Students can make their own choices on what they do with their careers. But as the Director of Graduate Studies, I think there are minimum professional responsibilities that departments have toward their students.
Of the 372 candidates from 23 departments, only 288 had CVs that included minimal data.
Setting aside selection and data quality issues (this is a blog), we can look at some patterns in this data. My guesstimate is that this sample looks similar to previous samples published in PS on the job market.
I had my research assistants record the titles of any journal publications for candidates. After cutting an unpublished undergrad thesis or two and a few other suspicious publications, I found that 45% of candidates with CVs posted have some sort of publication (23% with a solo and 27% with a co-authored pub).
I didn’t make any hard judgment calls about what counts as a “real” publication and what doesn’t. My quick take is that the majority of these pubs are at peer-reviewed journals and are a positive signal to search committees. There is some variance across subfields, with political theorists the least likely to publish (32%) and comparativists the most likely (52%). Even 32% is a much higher number than I expected.
I shot off a few emails to search committees to get their take on the market. I asked search chairs for info and advice on how candidates moved from the general pool to the short list. Before seeing this data I would have guessed that a pub would set you apart. Maybe the way my undergrads coded the data overstates the amount of competition out there and the number of high quality candidates?
Not according to the small number of search chairs and committee members who gave me info on their searches. Most schools had long short lists of about a dozen candidates, most with pubs. Many other candidates with pubs were left off the long short lists. I stress that this was the story from a broad range of institutions (R1s, liberal arts colleges, etc.).
The advice I got on how to make it to the short list varied. Some committee members stressed fit with the department, others stressed the quality of the publication (top journal, solo-authored if possible), others stressed grad program reputations, letters of rec, etc.
What is my takeaway point? I really don’t know. I had the feeling that the magic formula for getting an interview was a decent publication. I think that was probably the case when I was on the market as an ABD (2001). Not so much anymore.
If there is some sort of change in the market for ABDs, is this a good or bad thing? I originally wrote up some thoughts that both praised the high quality (and quantity) of research from ABDs and expressed some concern that the ubiquity of publications could lead search committees to fall back onto other criteria that are less objective and more prone to bias.
But to be honest, I don’t know what to make of this. I just wanted to highlight one descriptive statistic from this data collection.
Tom Pepinsky has a great blog post on this data, and Chris Blattman has a post that offers insights from search committees.