


June 30, 2014



Jeff Harrison

As a "not recovering" rankings junkie, I find this all pretty interesting. Thanks for sending it along.

I am wondering why you use citations as a measure of reputation as opposed to, as I recall, just asking the question of reputation directly. Why call it reputation at all? Perhaps "scholarly impact" or something like that would capture it better.

I also think it would be interesting to rank by judicial citations separately, alongside all citations. You might find that some lower-ranked schools that publish more practice-oriented pieces move up, and it is possible (without meaning to unleash a horde of nasty anonymous comments) that this is a better measure of impact.

Alfred Brophy

Hi Jeff,

The use of citations drew some criticism a couple weeks back when I posted on this. And I responded to some of it in the paper and in the comments to my post:

I use citations as a proxy for scholarly reputation. I'm trying to avoid using US News' proprietary measures and also I'm looking for something that may be more responsive to changes in school quality than those notoriously static measures of peer and lawyer/judge assessment. Moreover, citations are highly correlated with US News' peer assessment score, so I think that's a good proxy. But I also think it makes sense as a measure of the school's reputation because the better the reputation the more likely the main law review is to have its pick of articles. Finally, the articles that a journal publishes reflect on the editors' orientation.

I've written a little bit about citations as a measure of quality, and I hope to have more to say about this later this summer:

I've also written a short piece on citations over a longer period of time to articles in a number of leading law reviews, which reveals that some of the most-cited pieces in good but not the most elite law reviews are cited more than many of the articles in the most elite law reviews:

Someone suggested using citations in judicial opinions; I've looked at those as a measure of scholarly quality, too:

Court citations are lumpy, and while they tell us something about law review quality, I don't think they're as revealing as citations in other journals.

Jeff Harrison

Yes, I agree. Total cites probably do get at reputation better than judicial cites alone. I was thinking of something more substantive as a measure of what a law school actually does, as opposed to what people think it does.

Hither and anon

Looks like maybe a slight error in Table 2 on the SSRN draft. Washburn (21) should be listed ahead of LSU (20), shouldn't it?

Alfred Brophy

Hither and anon, you're right. Washburn should be added to Table 2 with an improvement in rank of 21. I made that change and posted the revision on SSRN.


AB, thanks for all the number crunching and the thought that went into putting your paper together. Very interesting overall.

I noted with a certain amount of vindictive amusement that your Table 4 now properly drops Chicago all the way down to #10.

Perhaps some philosopher or other who feels a bit proprietary regarding his own version of LS rankings will take you to task for that.




Using the LSAT as an input to such a formula is utter rubbish. This is the epitome of a post hoc ergo propter hoc fallacy. Who cares what the LSAT - designed as one of several inputs to correlate with 1L performance - says?

If the average entrant to the Cardozo c/o 2014 used a 162 to get a 54% chance of a FTLT job, but an Albany entrant used a 153 for a 60% chance, how is Cardozo a “better school”?

Do you think people go to law school for some purpose other than employment?

Employment and salary data should be well over 50% of any ranking. I'm ashamed that the legal academy is so fixated on a single data point rather than outcomes for graduates. Considering the degree of "analysis" that the academy claims to perform, the gullibility required to believe that inputs should be weighted more heavily than outputs in such rankings is simply embarrassing.


Again - I'll make the point that was not allowed through the first time. I think the problem with most law school ranking methodologies is that they are addressing two groups whose interests in the rankings diverge - academia and prospective law students.

Prospective law students have one key interest - will this law school provide me with the credentials to have a good career in law? That is, will I be able to get a 'legal' job, and will it pay me enough to justify my attendance at this school (or indeed any graduate school)? The measures of that are post-graduation employment (for which the 9-month cutoff is probably fine, though it rewards weaker schools against the stronger - a Harvard or Yale grad who takes 9 months off after graduating is still probably OK), incomes, and longevity in the legal career (i.e., is the person still practicing after 10-20 years?).

Academics have different interests - is this school prestigious academically, which translates into - will it pay well? is it a good credential for lateraling? is it a good credential for perhaps some government role? will I be respected by my peers?

The problem with most rankings is that they seek to straddle the two. LSAT scores and GPAs are important in their own way, but in outcome measurement they are a proxy for the main question - will this person be a successful lawyer? - and they are at best a weakly coupled proxy at the high end; at the low end, where LSAT scores are catastrophically bad, they are probably strongly coupled. Similarly, journals are of limited relevance to the future careers of law students, except to the extent that being on a journal is a factor with large law firms (a tradition that seems to have a lot to do with the hiring lawyers having also been on journal).

Would academically oriented and prospective-student-oriented rankings diverge a lot? Probably not as much as the critics would think. But still, the exercise would be valuable.

On journals - I think there may be a problem in selecting just the main journal. While it is the case that from an academic perspective impact factor is measured by citations in other journals, from a practitioner perspective 'impact' is whether the journal might influence a judge, tribunal or agency - as in "if I cite it, will they follow it?" I do cite journals in briefs and submissions, particularly to agencies and arbitral tribunals - but they are rarely the 'main journal,' rather more specialised journals in, say, international arbitration law, intellectual property law and economics, antitrust/competition law, etc. These articles are frequently survey articles on valuation issues, economic impacts or procedural rules - the sort of article that main journals routinely decline and that rarely seems to get cited in more 'academic' journal articles. Indeed, one of the basic weaknesses of main journal articles is their tentative style and unwillingness to state hard conclusions in the clear language of the citable-to-a-court/tribunal/agency variety. So the nature of journals reflects the academic/practitioner split, with Kluwer, Sweet & Maxwell, BNA and Aspen getting paid for their journals while law schools have to give theirs away.

Former Editor

@ anon 11:41

One reason to consider using median LSAT in a ranking system is that LSAT correlates very, very strongly with bar pass rate in the states with harder bar exams (NY, CA, etc.). You may have noticed that the difference in LSAT between Cardozo and Albany is within a point of the difference in their July 2013 bar pass rates. Now, it might make more sense to skip right to the bar pass rate itself, but because the difficulty of the bar exam varies from state to state, doing so is problematic for a national ranking system.

Regarding the fixation on outputs (read: employment and income), in my view the problem with constructing a reasonable ranking system that way (as opposed to something arbitrary and insane like the ATL rankings) is the lingering set of issues around transparency and reporting.

As for transparency, law schools are still not required to release granular data to the public, what they do release isn't subject to audit (the new ABA proposal won't do much in that vein either), and many schools have a disappointing history of fudging their numbers. The schools are not entirely to blame for the lack of information, though, because they cannot disclose what they do not have and disgruntled/unemployed graduates are unlikely to answer surveys.



The issue of bar passage is interesting, but the LSAT is still being used as a proxy, in this case for a purpose for which it wasn’t intended.

Here’s the common-sense issue that hardly anyone mentions: In order to get a FTLT legal position, that graduate NECESSARILY passed the bar. Whether it’s 60% or 54%, the fact is that the employment data is the employers’ valuation of the degree.

For a frivolous hypo: if a school has a 50% pass rate and 50% FTLT, that means that employers wanted every last bar-passing grad of that school (which clearly has a terrible passage rate). Ceteris paribus, if a school has a 90% pass rate and a 60% FTLT statistic, that means that a third of its bar-passing graduates' degrees weren't used to their potential.
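The arithmetic behind the hypo can be checked with a quick calculation (a sketch only; the 90%/60% figures are the made-up numbers from the example above, not real school data):

```python
# Hypothetical school: 90% of graduates pass the bar, 60% get FTLT legal jobs.
bar_pass_rate = 0.90
ftlt_rate = 0.60

# Assuming every FTLT hire passed the bar, the share of bar-passing
# graduates whose degrees went unused is (pass - ftlt) / pass.
unused_fraction = (bar_pass_rate - ftlt_rate) / bar_pass_rate
print(f"{unused_fraction:.1%} of bar-passing grads lack FTLT jobs")  # 33.3%
```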

Certainly the example is frivolous, but how “good” a school is (and therefore what should be reflected in rankings) is best evidenced by employers’ willingness to hire the product of that education!

As much as transparency and reporting might be issues, the employment data are still - even if susceptible to selective participation - more useful than an entrance exam.


This may sound sarcastic - but it's not intended to be. In USNWR, ATL and other rankings there is a simple purpose - sales of the publication. A fashion editor explained this once vis-à-vis Vogue, Cosmopolitan etc. - that the best way to get magazines off newsstands was to put a ranking on a personal issue on the front page - top ten ways to, top twenty...

As she explained, the lists are usually entirely made up - it's having the list that sells - and even when there are criteria, they are usually bogus. The AmLaw rankings regularly contain data that IMHO is bogus. The problem is that if the general public is going to look to these lists and take them seriously, it is important that they be objective - so critiques of what Alfred is doing are good; objecting to the exercise itself, unless the objector has the time to do better, is bad.

Former Editor

anon 12:16,

A few things. First, a report that a student is FTLT legally employed does NOT necessarily mean that the student passed the bar. Even assuming accurate reporting, when the information is gathered really matters. A student can report themselves as legally employed in a FT/LT position prior to taking or passing the bar and then go on to fail the bar and lose the job.

For example, if the information is collected at graduation (one of the two USN employment categories), the students obviously haven't passed the bar yet because they haven't taken it yet. Even with a 9-month measure, if the student responded between June and October (at least in NY), then they necessarily hadn't passed the bar exam yet (although I'll admit it's likely they did). In other words, it's entirely possible for a student to get a FTLT legal job, report themselves to the school as employed early in the reporting period, fail the bar exam, and get fired because of it, with the school never updating its file.

Second, even if it were true that FTLT legal employment statistics necessarily meant that the student passed the bar, the metric still would not be useless. Given that most lower ranked law schools have pretty abysmal FT/LT legal employment rates at this point, a proxy for bar passage might be useful in separating the mediocre/bad options from the absolutely awful options. In other words, graduating without getting a FTLT legal job in 9 months is bad, but at least the student has become an admitted attorney who can hang out a shingle and keep scuffling around looking for work as a lawyer. Graduating without passing the bar OR getting a FTLT legal job in 9 months makes the entire law school endeavor a very expensive waste of time.

Third, and I hadn't thought of this until just now, the correlation between bar passage and LSAT may also tell a prospective student something about how the school views its matriculated students. A school that is consistently enrolling half or more of its class with LSAT scores that strongly indicate bar failure probably has an institutional culture that views students more as a transitory customer pool than as a raison d'être.

For the record, I don't mean to say that employment statistics aren't a really important source of information regarding how good a school is. Employment statistics are probably the MOST important source, and should be a large slice of any ranking system (as they are here). All I'm saying is that where the employment numbers we do have are subject to some skepticism as to their accuracy, a ranking system isn't necessarily making a mistake by factoring in a proxy measure that doesn't seem to have the same reporting problems (at least, not anymore).

Regarding the weighting of the two, it might make sense to adjust a bit. 45% employment v. 21% LSAT might be a little better than 33% each, but I honestly doubt it would result in much of a shakeup.
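A weighted composite of the kind proposed here could be sketched as follows (hypothetical weights and normalized inputs for illustration; this is not Brophy's actual formula):

```python
# Hypothetical composite: 45% employment, 21% median LSAT, 34% other
# factors. Inputs are assumed already normalized to the 0-1 range.
def composite_score(employment, lsat, other,
                    weights=(0.45, 0.21, 0.34)):
    w_emp, w_lsat, w_other = weights
    return w_emp * employment + w_lsat * lsat + w_other * other

# Example: a school strong on outcomes but middling on inputs.
print(composite_score(employment=0.80, lsat=0.50, other=0.60))  # 0.669
```

Shifting weight from LSAT toward employment mostly reorders schools whose inputs and outcomes diverge sharply, which is why the commenter expects little overall shakeup.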

Alfred Brophy

Former Editor,

Thanks for all of this. The LSAT is also important because it tells us about the quality of the students, and that has a big effect on the educational experience. The quality of students affects what goes on in class and, probably more importantly, what happens outside of class as students study together and work on extracurricular activities. If all one cares about is the percentage of the class employed in long-term, permanent, JD-required jobs 9 months out, schools' rank on that measure is available in my paper. But I think there are other factors that are important in distinguishing schools.

John Rooney

When I went to law school back in the 1950s, I intended to work for a company and use my JD as a resume improver like an MBA. Law practice was ill paid back then. Eisenhower's recession forced me into law practice.

What about MBAs? I don't think any jobs absolutely required them. The degrees possibly constituted an advantage. Has anybody calculated how many MBA advantage jobs there are?

Inquiring minds want to know.

John Thompson

@John Rooney/8:36 a.m.:

Did law school require you to take student loans? If so, how long did it take you to repay your student loans, and how much student debt were you carrying relative to your salary for that first year after graduation?

John Rooney

To: John Thompson

Student loans in the 1950s -- what? Are you kidding?

John Thompson

@John Rooney/4:33 a.m.:

I just wanted to confirm that law school was something a mere BA could readily pay for, out of pocket or by working a part-time job, during the 1950s. This makes your idea of getting a JD as a "resume improver" sound much less insane to people from the present.

The comments to this entry are closed.

