Monday, June 25, 2007

Reviewing Reviewers

This weekend I have been poring over statistics provided by a journal for which I do some editorial work. In addition to data related to how the journal is doing (impact factor, ranking among journals in related fields, etc.), there are also lists of reviewers: who did reviews, how many each has done, and how long the reviews took.

It's amazing to contemplate these lists, first of all because they are a testament to the huge amount of work reviewers do in the name of 'professional service'. I have done my share of complaining about reviews of my own manuscripts, so it's good to be reminded from time to time that, despite some unethical and rude reviewers, the system of peer review is an impressive thing in terms of its scope and the time involved.

I did a quick, statistically invalid analysis of the reviewer data for the past year to see whether the time it took a reviewer to complete a review was random or correlated with seniority. My working hypothesis was that younger scientists do quicker reviews. The dataset is sufficiently large to make an analysis like this reasonable, but I wasn't rigorous about tracking down reviewer time-from-Ph.D. data. I put each reviewer into one of several bins (postdoc, assistant professor, mid-career, late-career, or retired), and I put research scientists into these same bins based on where they would be in terms of time since Ph.D. if they were tenure-track. It's not a perfect system, but I just wanted to get a sense of any trends.
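
For the curious, the 'analysis' amounts to nothing fancier than grouping review times by seniority bin and comparing summary statistics. A minimal sketch of the idea in Python, assuming a hypothetical spreadsheet exported as reviewer_times.csv with 'bin' and 'days_to_review' columns (the file name and columns are made up for illustration):

    import csv
    from collections import defaultdict
    from statistics import median

    # One row per completed review: the reviewer's seniority bin and
    # how many days the review took (columns are hypothetical).
    times_by_bin = defaultdict(list)
    with open("reviewer_times.csv", newline="") as f:
        for row in csv.DictReader(f):
            times_by_bin[row["bin"]].append(float(row["days_to_review"]))

    for bin_name in ("postdoc", "assistant professor", "mid-career",
                     "late-career", "retired"):
        times = times_by_bin.get(bin_name, [])
        if times:
            print(f"{bin_name}: n={len(times)}, median={median(times):.1f} days")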

The quickest reviewing groups are the early-career and retired scientists. There are a fair number of outliers -- assistant professors who are very slow, mid-career and senior people who are very fast -- but in general the time-to-review increases with seniority, then drops for emeritus professors. If I did a rigorous job of tracking down reviewer data, it would be interesting to see if there's a detectable change in review time immediately following tenure. Would it be an increase in time because the pressure to impress everyone eases, or a decrease because other pressures have eased (and many faculty get a sabbatical soon after the tenure decision)?

Within bins, reviewers who reviewed multiple manuscripts tend to be consistent in their time-to-review. Some people are quick reviewers and some are not. It's rare to see someone who did one review in a short time and another review in a significantly longer time. I thought there would be more variation, because the time frame might be affected by how busy someone is, as well as by factors related to manuscript length and quality: some manuscripts are easy to review and some require a huge amount of time. But no: time-to-review seems to be a personality trait more than anything else.
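
The consistency check is just as simple: for each reviewer with more than one review, look at the spread between their fastest and slowest times. A sketch along the same lines, using the same hypothetical file and a 'reviewer' column:

    import csv
    from collections import defaultdict

    # Same hypothetical file as above; here we key on the reviewer and
    # look at how much each repeat reviewer's times vary.
    times_by_reviewer = defaultdict(list)
    with open("reviewer_times.csv", newline="") as f:
        for row in csv.DictReader(f):
            times_by_reviewer[row["reviewer"]].append(float(row["days_to_review"]))

    for reviewer, times in sorted(times_by_reviewer.items()):
        if len(times) > 1:
            spread = max(times) - min(times)
            print(f"{reviewer}: {len(times)} reviews, spread = {spread:.1f} days")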

11 comments:

  1. I'm not an academic, but I find that if I don't deal with the paperwork when it arrives, I never get to it. So I'd be the quick-to-review type, perforce.

  2. I'm curious what the times actually look like. I'm usually what I would consider late (a week or two after the deadline), but someone told me that it isn't late unless my review is the last one. They were at least half-joking, I'm sure, but I started wondering what the standard really is. If you don't mind, can you share what percentage of reviews arrive at or before the deadline?

    (This is a dangerous question. I really only want to know the answer if a sizeable portion of people make the deadline, so I can shame myself into not procrastinating!)

  3. On average, I bet that time-to-review is a characteristic of the reviewer; but, given your experience with getting all of your reviews back at once, I wonder if there is an overall "seasonal effect." So, on average, people who take a long time take a long time, but everyone, whether they tend to be slow or fast, sends back manuscripts right after the semester ends?

  4. The % of people who turn in reviews at or before the deadline is very small, but most people submit their reviews within 1-2 weeks of the deadline.

  5. 'The % of people who turn in reviews at or before the deadline is very small...'

    Hey, that means that with my usual three days late I'm actually rather fast! Who'd have thought?

  6. I am fascinated to know how you acquired your data. How did you know the seniority level of the reviewers, and match that with how long they took to turn in their reviews? As peer review is anonymous, I am stumped as to how you managed to do this. Journals often publish lists of their reviewers, but you can't match these to individual papers/times.
    What was your dataset?

  7. As an editor, I have a chart with reviewer names, number of reviews, and length of review time; I have data for the past 5 years. As for matching these data with seniority: I knew the vast majority of reviewers, and I looked up the others.

  8. This is really interesting. Like anonymous, I suddenly feel better about turning in my reviews a few days after the deadline. I hope this information doesn't make me feel like I can push the envelope further next time.

  9. Oh OK, I get it. You know the identities of the reviewers because you are the editor, sorry. I thought you had a clever way of knowing who had reviewed "any old" paper.
    At Nature we have a huge reviewer database, tens and tens of thousands, but we don't collect any data on them such as seniority, gender (;-) ) etc. I think there would be data protection legal issues. Also, once we receive the first review (of two) or the second (of three), we begin chasing up the outstanding reviewer, so I guess ours would not be a fair spreadsheet. If left to their own devices, those last reviewers might never deliver, despite having agreed to do so in a certain time frame before being sent the ms.

  10. This information is useful and makes sense to me. I will definitely keep it in mind for my next submission! And Maxine makes a very valid point -- I know far too many of these "if left to their own devices" types!

    But I'm much more interested in the other statistics.
    Who rejected the most papers, accepted the most papers, etc.

    That list could really tell you a lot about who's influencing the state of science in your field and what kind of influences they exert (conservative vs. liberal approach to new data?).

    I'd give a small body part (maybe a toe?) to know who reviewed and rejected the most papers in my field the last 5 years. I can guess, but having the actual data would be a lot more fun.

  11. I have not refereed much so far, but I have always sent in my review on the last day or before :). You should not be posting this information, because those who have been doing it on time might think that it is OK to let it slip a little bit.
