I mentioned last week that graduates of my research group have been successful in obtaining PhD-relevant employment, but of course there is something missing from those data: the students who left without getting a degree. I can see how someone might want to know the ratio of completed to never-completed degrees for a particular advisor, research group, or department.
But what would such data indicate? Would these data indicate anything useful for those seeking to make an informed choice about graduate programs or a particular advisor?
These data might indicate something about the level and duration of financial support available. A high attrition rate could be a signal of a low level of financial support for students, but such data could probably be obtained more directly by looking at student funding levels and duration.
So let's assume that a department/advisor is fortunate enough to have sufficient resources to support students for the duration of a typical graduate research program. Would comparison of graduation rates (among advisors, departments, or universities) give a sense of some other essential aspect of the graduate programs, such as quality of advising?
Maybe, but the data would really only be useful if we had a good baseline estimate of the "background" attrition rate for graduate students. Students may leave a particular graduate school for all sorts of reasons that have nothing to do with the quality of the program. For example, some students realize they are interested in something else and move to another department/institution, some move when their significant other has to move elsewhere, and some decide to take a job outside academia before finishing their degree (for a wide variety of reasons).
Presumably, if graduation data were known for a large number of advisors, programs, or departments, a pattern would emerge so that outliers (very high or very low rates) could be detected. Such data are unlikely to be available anytime soon, however, and not necessarily because an institution or individual advisor is ashamed of such data; in fact, some might be proud of having high attrition rates.
I have no idea what this rate is for my department as a whole, and even if I knew how many students left without a degree, I wouldn't know the reasons for most of the departures. And even if I had such information, I wouldn't have any other data for comparison.
Perhaps we can make a small dent in that last issue. Some questions:
- Does anyone know what the average graduation rate is in their department, research group, or other relevant unit? (let's keep it positive and use graduation rate instead of attrition rate)
- Is anyone willing to share their personal graduation rate of advisees? (my research group's is ~90%*)
- Are there graduation rate trends for particular advisors: e.g. a high rate in the early-career years and a lower rate later on?
* Note that 'attrition' includes students who left for personal reasons (e.g., a significant other's career move), then got a PhD elsewhere, and then obtained a tenure-track faculty position, so not all 'drop-outs' actually drop out of Science or academia. Also, it is important to note that those who leave Science/academia are not failures**. Many go on to have interesting careers in industry, business, government, or K-12 education.
** OK, a few of them are, but only a few.
26 comments:
In general, if you show the good, you should also show the bad. Every institution tries to make its numbers look great, so it's harder to evaluate the complete picture without both sides.
A university/dept might have 90% of its graduates get the job they want, but only 40% graduate. That's a lot worse than 90% graduating but only 80% getting the job they want. Plus, percentages don't always tell the whole story. Saying you've graduated 9 of 10 students is good, but 45 out of 50 is even better in my opinion.
I'm also interested in drawing some correlations from the data rather than just looking at the overall numbers. For instance, if someone dropped out, how many advisors did they go through? Also, how many students left after their advisor either moved to another institution or didn't get tenure? I could imagine this being a big issue at institutions known to hire 4-5 tenure-track faculty for every tenured position.
In my field, math, I think attrition rates could help answer the question, "Am I being admitted because they think I have a really good chance of success, or do they just need a ton of people to teach College Algebra?"
Here in the UK graduation rates are very important to funders. If the dept doesn't have >80% of PhD students graduate within 4 years of the student starting, then PhD funding gets cut off. So there is strong pressure to get students to graduate and to graduate on time. Attrition rates for each dept are published internally and to funders - not sure if this info is easily available to candidates.
http://www.phdcompletion.org/
The PhD Completion Project assembled such data at 27 large universities, including cumulative completion and attrition rates over 10 years, by field, and by demographic variables.
I did my grad work in math at a public R1, and our department was going through an evaluation my last year there. One of the external evaluators told a group of the students at tea that about two-thirds of Ph.D. admits would graduate with a Ph.D. from the department. We all thought that sounded about right. Apparently none of the faculty had the slightest idea it was that low.
Two-thirds sounded good to us since we knew why our classmates had left. Some had transferred to another school or another program, some had come in with the intention of leaving with an M.A., some had fallen in love with industry during a summer internship, and so on, and we knew they were at least as happy as we were!
In my field - psychology - all programs are required to post the graduation/attrition rates of their students per year (a.k.a. disclosure data or attrition data), although it is not broken down by advisor. Two examples below:
http://psych.la.psu.edu/graduate/programareas/ClinicalGradStats2009-2.pdf
http://gse.berkeley.edu/program/sp/html/admissions.html
It really depends on how you define an attrition rate.
Do you only count students who join the lab at the beginning of their studies and then leave?
How about the "refugee" students that you take a chance on after they have left another laboratory?
Would publishing this rate and using it as a metric for profs cause labs to turn down the students who do not start out ready for graduate school, but whom the mentors hope might be brought along with close mentoring (ie, the risky ones)?
None of these caveats changes the fact that attrition/graduation rates should be made public. And, in fact, all of them apply to any student success data. The same caveats apply to students who graduate but don't obtain academic positions, or any jobs at all (for example, they too may have left for personal reasons).
The government should insist on the availability of student outcome data for any students supported by government funds, and should insist on the data being made clearly and easily accessible. Certainly a particular professor's unwillingness to release that data, or a department's laissez-faire attitude about the information, should be no excuse for its lack of availability.
(I have to allow an exception for private universities providing private funding for students, but my guess is that those students are a rarity.)
I'm currently considering a PhD, not in science but in another field seen as unfriendly to women, philosophy. I would like to see that attrition rate by gender. As it is, I look at the department websites and compare the male/female ratio of first year students against that of fifth year students. If I were to be accepted to a place with a change, I would at least ask some questions.
My personal graduation rate:
Grad with PhD: 50%
Grad with MS, successful: 25%
Left: 25%
In the second category, I'm counting people who left for happy reasons (to be with a significant other, to pursue a career that didn't require a PhD). These are all people I am still in touch with.
In the third category are people who had planned to get their PhD and who left for unhappy reasons.
I will note that I am moderately early in my career so these are based on small numbers, and I have rounded somewhat. I'll also note that all of the people in the second two categories ("MS" or "left") were from my early years, although I don't think I have an explanation; I'm not sure I agree with the obvious explanations that would have to do with my inexperience or youth.
My department's graduation rate is probably along the lines of 75%. It varies from year to year. Could be 90% one year and 50% the other.
Rates definitely vary by advisor. I don't know if it has anything to do with career stage. Some senior people have very low rates. One senior prof has had about 6 students in a row leave after 2 years. While my old advisor is still at 100%.
In my field - psychology - all programs are required to post the graduation/attrition rates
Uhh. I don't think this is true.
My institution (University of Minnesota) does provide this information by program, general science area, and cross-University. And splits out the results by sex, minority status, and international status. And by PhD vs. Masters. (Or, it used to report these statistics, until budget cuts in 2009.)
For example, my program is here: http://www.grad.umn.edu/data/stats/pr/1124400.html
I think the additional info on time-to-completion is very useful, as well.
In reporting a graduation rate, you'd also want to report an 'n'. For an advisor with a 50% graduation rate, I'd be much more willing to consider the advisor if n=2 as opposed to n=20. It would also be nice to have a 't', time since enrollment of first student. I'd consider an advisor with a graduation rate of 0% if t<=5, maybe not so much if t=10.
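To put rough numbers on that intuition, here is a quick sketch (my own illustration in Python, not anything from the post; the Wilson score interval is just one standard way to attach uncertainty to a proportion) of how much uncertainty hides behind the same 50% rate at different values of n:

from math import sqrt

def wilson_interval(graduated, n, z=1.96):
    # Approximate 95% confidence interval for a graduation rate,
    # using the Wilson score interval for a binomial proportion.
    if n == 0:
        return (0.0, 1.0)
    p = graduated / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return (max(0.0, center - half), min(1.0, center + half))

print(wilson_interval(1, 2))    # 50% with n=2:  roughly (0.09, 0.91)
print(wilson_interval(10, 20))  # 50% with n=20: roughly (0.30, 0.70)

So a 50% rate with n=2 is consistent with almost anything, while the same rate with n=20 is much harder to explain away.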
Nice set of data, anon from UMN. If I were in charge of the world, I'd require every publicly funded university or publicly funded program to post similar statistics (and add job data).
Departments should be keeping this data for individual professors, too.
True that no one should rely on percents to make their decisions, but having the information isn't a bad thing. Then, a student can try to dig further when they care about the number.
Graduation rates are important to know, at the level of a specific advisor if possible, along with any info on who left and why. I'm thinking of an advisor at the place where I did my PhD who had never graduated a single female student. He had a number of male students who were very successful, and he accepted women students all the time, but he was so horrible to them that they either switched advisors or left (~50%). This should be known and publicized.
Do you only count students who join the lab at the beginning of their studies and then leave?
This is important. Plenty of people start in a research group, spend a quarter or semester there, and then decide on a different group. This sort of thing is formalized in the biomedical sciences with their rotation systems, so it is seen as normal and probably wouldn't even be counted as attrition. Physical sciences tend not to have such things (though there are of course exceptions), so it might be counted as attrition.
Maybe a better metric is attrition among those who stay a year or more. There's nothing wrong with attrition if people who shouldn't stay in the group don't stay in the group, and it's done early, quickly, and humanely. OTOH, attrition after 4-5 years (yes, I've seen it) is a much bigger issue. It means that something went very wrong, and the student either should have been weeded out early or mentored better throughout.
Those ecology doctoral program data for Minnesota are interesting and the year-to-year variation in numbers is an excellent illustration of why the details of the numbers need to be shown; e.g. the 1999-00 class of 8 students had a 75% completion rate (after 10 years), the 00-01 class of 8 has a 38% completion rate (after 9 years), the 01-02 class of 13 has a 77% completion rate (after 8 years), and the 02-03 class of 1 had a 100% completion rate in the 6th year. These are small numbers of students for any one year.
More interesting are the data for male and female doctoral candidates broken out separately. The graduation rate for male doctoral students is much higher than for female doctoral students. The data for all biological sciences and for all graduate programs at the university are not so skewed.
Does that indicate a problem with this program, or are there reasonable explanations for the differences (i.e., reasons that don't obviously relate to things that this particular graduate program could change)? That's something the data don't tell us.
Our dean of engineering recently reported this data for all departments in our school (R1, most departments ranked in the top 5 in US). Data was broken out by male/female and US/international students. Averaged over 15 years, all of our departments seemed to have about a 70% PhD completion rate.
Graduation rates are important to know, at the level of a specific advisor if possible, along with any info on who left and why.
This seems very important to me! I learned about all the "bad" advisors by word of mouth after I'd already joined a lab -- I lucked out, but nearly every year someone ends up in a certain prof's lab and almost inevitably switches labs or fails out of the program completely. He's had an attrition rate well above 50% for years, but incoming students have no way of knowing that. He confessed to a classmate of mine, after two years of her struggling, "I'm not a very good advisor" ... the day after he told her she was out of the lab and out of the program.
Knowing about those particular people might be more helpful than knowing the program's overall attrition rate, even. He bumps the average up, but the students who aren't in his lab do fine, so it's a bit of a bimodal distribution of "good graduation rate" and "oh crud avoid that lab!"
So, I don't know. Graduation rate doesn't seem like a good indicator of anything, really.
Let's contrast two situations:
1) a grad program might admit a broad group of students, but then be very strict in the academic rigor expected from these students. So they might decide to kick ~1/3 of the students out after the comps phase. The remaining students were really the cream of the crop and most would graduate and be among the best of their peers.
2) another program might take in the same pool of students, baby them through their comps (or not even have comps to speak of), and allow them to graduate with a PhD after 9 years and 1 paper and then never get a job.
As a grad student, which program would you rather join? I'd go with #1.
Anon from UMN here again. Yeah, I noticed the male-female discrepancy, too. I really wish they would keep up these statistics because I'd like to see what happens with my cohort and surrounding ones. I am female (with a kid!), I'm in the 2007-2008 cohort, and I have felt nothing but amazing support from this program (ecology), including financially. I wonder if the department of five to ten years ago was somehow different, if my experience is abnormal, or if the female graduation rate is still lower, despite it *feeling* really good while here.
In case you're interested in your peer Science program at Minnesota, here's a link to the more general data: http://www.grad.umn.edu/data/stats/
Scroll down to your program of choice. TwinCities is the main campus and will have a larger number of students in the stats. Then click "Graduate Student Progress." (Or click on another link for a different set of stats.)
I think those numbers should be public. The same way it is important to know the graduation rate of undergrads, it is important to follow up on graduation rates at graduate school. It should not only be part of the pride of a graduate program to have high graduation rates, but part of the responsibilities of the mentors. In my experience, it is usually not science that gets people looking for something else, but the scientists (who make up the scientific community). Of course, this is a generalization and should always be taken as such.
Having those numbers, and asking everyone who leaves a graduate program without a degree to check a box giving the reason (personal, moving, going to K-12 or industry as a first choice, leaving because the environment in academia many times sucks...), would help figure out where the problem is.
I've worked in three countries and five different labs, and I have met very few people who didn't just love science when they started their graduate program. Even those who didn't were doing their PhD to get a degree that would improve their future prospects, and that was motivation enough to graduate with a good supervisor. We scientists are doing something to people who start out loving this and end up just hating being around anything that has anything to do with research.
I recently shared a few words at a Diversity Conference here at UCSD that seemed to be very well received. The sentence that hit their minds was something like "we select the people that succeed, and then we call it attrition". I wasn't expecting such an impact, but several encounters after that day have made me realize the importance of that moment. In my opinion, plenty of the non-graduating cases are not authentic "alternative choices" but more like "alternative exits". Keeping track of those numbers would not only help find the problem (if it exists, of course; I might be wrong) but would also convey to the students the idea that they actually matter; that they are not just one more failure-to-be, that we do care whether they graduate or not. I believe most of the people do care, but I think that, at least in my field (biological sciences), we don't even know how to show it.
In response to Anon @ 11:41:
Graduation rate is definitely a good indicator of something: how long a student can expect to stay at that institution, something equivalent to job security. In the scenario you presented, you might choose #1, but to a student who is not as mobile for whatever reason, attending #1 might be too risky.
Any serious applicants have already done enough legwork to have an idea about the reputation that graduates from the different programs have, so I don't think that #1 would need to be concerned that they might "look bad" if they made their high attrition rate public. Based on their reputation, they will still attract excellent students who will be among the best of their peers. However, to the students who are basing their decision on more than just the quality of the institution, these completion/attrition statistics are very important.
Graduate students are not teenagers; they are adults who have full lives and multiple priorities. When you started grad school, you may have been at a point in your life where moving again because it didn't work out was an option, but I don't think it's fair to assume that school is the #1 and/or only priority for all new grad students. Many are trying to resolve a two body problem. Some have children. Some have the responsibility of caring for older relatives. If somebody needs to uproot their family to go to grad school, they may need to also know that if they made the cut to be admitted to the school, they also stand a good chance of completing their program. Students with other priorities in their lives are not necessarily bad students, but they will need more information to make the decision that is best for them.
When my SO and I were looking at graduate schools, we actually specifically excluded one very prestigious institution because I discovered over lunch during the admitted student visit weekend that in the last 5 years, my program had only been passing about 50% of the students in their qualifying exams. A 50% chance of having me kicked out after a year without many related industry jobs nearby or being close to another program to transfer to while my SO was a year deep in his program was just too much of a risk for us to make that gamble. Instead, we went to another institution, also very prestigious in both of our areas, where the pass rates for quals and the overall completion rates were much higher. What happened as a result? We ended up in top notch programs with excellent peers and mentors, and we were comfortably secure that we could set down roots there (and I did not witness "babying", just attentive mentoring, which I don't consider to be a bad thing). Also, last I checked, that first institution didn't suffer at all when we opted not to go there. That story might not have ended so well if I hadn't happened to talk to the right group of people at that lunch; we could have unknowingly gotten into a situation with a 50% chance of ending up putting extreme strain on our relationship. I don't even want to think about how messy that situation could get for a couple with child.
I think attrition/completion rates should be made public because students may need that information to evaluate whether that school is a good fit for them or not. My SO and I absolutely do not regret turning down offers from an extremely prestigious program for nonacademic reasons. Why? Because we had another very important priority in our (adult) lives that we had to consider, and we're far from unique. Who benefits from keeping those statistics secret? Either it will be an important factor for a prospective student or it won't; either way, I don't see how keeping it a secret helps them make the decision that's best for them.
I think the information should be made public. And I find it rather odd for scientists to be questioning making data public because they don't like how it will be interpreted. Of course there are caveats to any data sets, and those can be stated. But the point of gathering data is to highlight trends and move discussions out of the anecdote phase.
Of course any individual student may leave for any number of personal reasons, but if this is significantly more likely to happen to students from one program as compared to another, then that may indicate something about the program (and that something is not necessarily bad…perhaps the program provides more opportunities for students to interact with industry early on and many learn that they would prefer such a career).
But now working in the Higher Ed admin side of things, I see so much fear surrounding making information about graduate outcomes public…and the only real losers are the students. The information is only sporadically collected, and there are few reliable sources that students can use to make informed choices about their education and their futures. The more information the better. And if one piece of data taken out of context is sufficient to sway a student from obtaining a PhD with a particular group, perhaps everyone benefits from that happening early. Trust students to use data responsibly…you are training them to be scientists after all.
The data should be made public. It's not the only thing that a prospective student should be looking at, but it's a major thing, and a starting point, along with job placement rates.
I don't ever want to assume that I will be the special snowflake exception to a program's rule, unless I have a really good reason to do so. If a program has, say, a 50% attrition rate, or even higher, that's not a good sign to me - why stack the deck against myself? Of course people leave PhD programs for innocuous reasons, and that's why these numbers are a starting point and not the be-all end-all, but people also leave programs because there's something wrong with the program or the advisor.
Any data on grad programs is subject to being misinterpreted or overinterpreted - attrition rate isn't unique in this. But various other types of data (e.g. median time to completion among those who do finish, job placement rates) for different programs are collected and can be found online, and attrition should be too.
From my experience in grad school (now behind me) there was a lot of variation in attrition rate based on advisor. Based on personal observation and conversations with my fellow graduate students, I felt that the advisor's skill, commitment and behavior were quite correlated with how likely his or her students were to finish. Some advisors would help their students through rough patches, go the extra mile to find funding, provide the necessary resources and feedback, etc while others were neglectful, emotionally abusive, or sabotaging. Personally, I would look for a lab (such as the one I graduated from) where at least the majority, and preferably 75% or more, of students who started, finished.