


March 31, 2011



Erik Gerding

For one hybrid, see the Peer Review Scholarship Marketplace:

Faculty submit pieces to a consortium of student-edited law journals and agree not to submit to any other journals for six weeks. The journals send the piece to two experts in the field for a double-blind peer review. Both the journals and the author get a copy of the anonymized peer evaluation. Then any consortium journal can decide to make an offer, which the author can accept or reject.

I don't know how well the system is working, but it would certainly work better if more journals signed on and more authors submitted.

Note that law professors can also support this process by agreeing to be peer reviewers.

Steven Lubet

Interesting idea, Jacqui, but I see at least a couple of problems: (1) authors wouldn't voluntarily use a system that consistently gave bad reviews, thus creating pressure to maintain business by praising everything; and (2) authors would just refrain from submitting bad reviews, in much the way that drug companies are accused of failing to report negative trials.

Of course, the biggest problem is personnel. Who would want to be a reviewer? I have enough trouble reading everything I actually want to read, as well as manuscript reviews for academic presses and the occasional tenure report.

Erik's proposal avoids the problem of suppression (because it would be imposed by the law reviews), but there would still be a problem finding experts.


Who are these professional reviewers? Just smart lawyers who can spot inconsistencies? That can't be enough. Maybe people with a high degree of knowledge of the relevant literature? How do we test them for that? If they're so great, why aren't they publishing articles instead of serving as professional peer reviewers? Or are we going to somehow create incentives for faculty to do this? That would be great -- if you could create the incentives. But then faculty would have their own biases, which could shut out dissenting views. Anyway, I don't know how peer review works generally, but these are questions that occur to me. Generally, I like the idea of finding a way to do it. I'm just skeptical of every idea I hear. Thanks.


There's another option that would help the quality of articles: checking citations before publication decisions are made.

I've worked on a journal, and helped a professor with the final stages of publication as an RA, and in both instances I was surprised to find that unsubstantiated claims (either lacking a citation to back up key facts, or citing a source that doesn't actually support the article) are often overlooked because it's simply too late in the process to fix the problem.

Large sections of an article might have to be rewritten, or the article entirely scrapped. But since so few people will actually go back and check the sources, the articles get published anyway. Odds are no one will notice the problem, and if they do, it's not like they have a forum where they can point it out.

Fact-checking should not be left up to second-year law students who lack a background in the substantive law and are only doing cite checks to get a fancy line on their resumes. They speed through it so they can move on to things they actually care about.

Jacqui Lipton

Erik - thanks for posting that. I didn't know about it and it looks like a very worthwhile project.
I do think one of the bigger problems with my original thought (as noted by Steve and Quick) is in selection of quality reviewers and what kind of economic incentives (if any) would be given to them.
I'm not sure that I necessarily agree with Steve that it would be a big problem if authors refrained from submitting bad reviews. Articles editors may decide to prioritize articles with good reviews over articles with no reviews, so it wouldn't matter if bad reviews were suppressed. However, this raises another potential problem. If there is a cost imposed on authors to get the peer reviews done (i.e., if it is not a free service), then there would be potential equity concerns. Those with more money or a larger scholarship budget could afford reviews, whereas authors of strong articles who didn't have the money to secure reviews may not get their work prioritized by articles selection boards.


This "PRSM" concept looks very promising. I just volunteered to be a reviewer.


I don't think you'd need direct economic incentives per se, but rather would just need to create an expectation in law schools where part of the academic service component of the job is serving as a peer reviewer for academic journals. That is the norm in many other disciplines, to the point where a tenure candidate would stick out if s/he hadn't served as a reviewer by the time s/he goes up for tenure.


As anecdotal evidence of peer review's success in the legal field: The Journal of College and University Law at Notre Dame Law School sends every article under consideration for publication to at least one (and often two or more) reviewers, usually from outside the Notre Dame campus. Reviewers are uncompensated, and are solicited for their expertise in a particular subject area. Each reviewer fills out a two-page feedback form, prompting the reviewer to comment on the relative strengths and weaknesses of the article, including a section for the reviewer to "recommend," "recommend with changes," or decline to recommend an article for publication. Reviewers may give as much or as little feedback as they wish, though the ultimate decision to publish the article rests with the editorial staff. Authors may not select who reviews their work, and reviewers may stay as anonymous as they wish.

The end result of a system like this? Nearly every article is sent back to the author for some level of revision. Because of this back-and-forth editing process (which occurs before the article ever reaches the student editing staff), the publication process is often more drawn-out. Some authors balk at this additional work, and it takes a seriously dedicated member of the journal team to coordinate the communication between reviewers and authors, and to assure that each potential piece gets reviewed. Still, the articles which ultimately are published are better vetted and better written than they might otherwise be.

I served as Editor in Chief for this publication, so my affection for this system may be a bit biased. Even so, I was pleasantly surprised at the ease and relative painlessness of this system. Very few authors declined to make changes that a reviewer requested (even major substantive edits), and many authors expressed their gratitude at how much this review process strengthened the caliber of their end product. Moreover, it left my editing staff with more time to focus on areas that they were better suited to improving, and really helped to ensure that the material we published was both relevant and cutting-edge.

As to finding reviewers: I was pleasantly surprised to see how many academics and professionals were willing to serve as peer reviewers, even without compensation. We solicit these reviewers (who work on an article-by-article basis, and who always have the option to decline to review) based on their academic credentials. Often, they are authors who have published with the journal in the past, or have a strong background in the particular subject matter. Frequently, they are professors, or otherwise tied in to the legal academic community. Spread between several able and willing volunteers over the course of several months, the task is not as onerous as one might expect.

Miriam A. Cherry

Has anyone actually used PRSM? How did it go, for those who used it?

Scott Boone

I wonder if the best first step to changing the placement process is, in a big picture sense, the reverse of what you suggest. In other words, the best first step may not be adding an additional (though admittedly far superior) proxy, but rather removing as many of the proxies that currently exist as we can.

Instead of creating a modified peer review system, create a submission system that anonymizes all submissions. This would not remove all proxies used by student editors, but it would remove the school letterhead and the prior publication record.

While this would not solve the problems related to the volume of material student editors receive, it would, by removing most of the crutches currently used, create very high incentives for journals to find another (presumably better) way. Some sort of modified peer review, limitations on simultaneous submissions, etc.

You'd have to get buy-in from across the board, including those who benefit from the current use of proxies by student editors, or the use of the anonymizing service would become a proxy in and of itself.

Of course, we could also go to the real source of the problems - the professors. And no, I don't mean the problems of large scale simultaneous submissions or the expedited request game. The real source of all of this is that faculty care very deeply about placement. We don't normally refer to it this way, but the source of all of these problems is that faculty themselves use placement as a proxy for quality.

Seeking some measure or recognition of quality is not wrong. But we talk about all the problems with placement, such as the limited ability of 2L students to assess quality and the use of proxies not linked to article quality forced on student editors by the current system, and then we turn around and place an extremely high value on placement. It's our choice to place a high value on placement that more or less forces faculty to engage in the practices often cited as creating the problems for student editors.

Matthew Reid Krell

I've used PRSM. It led to no offers, but one of the reviews was very helpful (the other was bafflingly off-point), and when I finally did place the article, the PRSM reviews gave me significant help in improving the piece.

I'm actually pondering an experiment that will test reviewer bias in the PRSM system. If you're interested in participating, contact me.

The comments to this entry are closed.

