


November 13, 2014




Who really cares about Brian Leiter's rankings? Or anyone's rankings for that matter?

Academia is a strange, prestige-obsessed universe.


Let's be clear about what is being ranked. This is really perceived scholarly distinction (it's actually even narrower: perceived scholarly distinction based on a particular style of approaching an issue). The fact that "faculty quality" is used to describe perceived scholarly distinction is unfortunate. Prospective law students should be more interested in teaching ability, a school's teaching philosophy, output measurements, and curricular options. Jojo asks why anyone cares about rankings. Well, because prospective law students select schools based on the rankings. An absurdly high percentage of the U.S. News ranking boils down to scholarship: most voters use it to assess their peers' "reputation" even though they know very little about their peer institutions. If that were to change, faculties would feel pressure to adjust their time allocations.


Prof. Brophy, why did you ignore UC Irvine in your discussion? It is in the top 20 in Leiter's poll but unranked in U.S. News. It seems like it should be mentioned, as it does affect your thesis of high correlation.

Steven Freedman

I'm perplexed at Princeton Law's absence from this list; they usually finish top ten :)

Seriously though, I think we all wonder how someone could possibly have a sense of the difference in scholarship quality among such a diverse group of schools. I'd be thrilled to hear someone articulate exactly which factors went into ranking San Diego over Tulane. Or Emory over Wisconsin. That doesn't mean the survey's not a fun exercise, just that its value is dubious.



You're right. Many of the persons voted for their alma mater, or for other frivolous reasons.

The persons who "voted" don't know much of anything about the "scholarship" at all these schools. At most, they may have some real knowledge about a few persons writing in "their field," and beyond that, some scant, cursory knowledge of the work of one or two professors at a few of the schools. Nothing upon which anyone would claim an informed judgment. This is like grading papers based on the names of students whom you have never met, or worse (and this is par for the course in legal academia) students about whom one "heard" something.

People produce these beauty-contest rankings because they garner attention. The post preceding the one considered here (on that other blog) dealt with an attack on a piece that Paul Campos wrote six years ago(!), with no apparent reason to have pulled it out of the dust heap other than to brandish the headline "Paul Campos admits he is a fraud."

Now, there is some news you can use.


Steven makes a good point that this is a ranking by people who read Leiter's blog, and therefore, for example, have been exposed to that blog's cheerleading for the recently-established UC Irvine law school.


"Academia is a strange, prestige-obsessed universe."

Yes, it is.

But let's not hide the fact that academia is far from alone with respect to rankings. Prospective, current, and former students are just as obsessed with law school rankings, if not more so. To a lesser but still significant extent, employers are as well. It's a huge problem.

Alfred Brophy

Hi Amir,

I didn't mention Irvine because it's a relatively new school, which has neither a U.S. News rank nor sufficient data for its law review.


Students follow; they do not lead.

Rankings aren't used by "employers." Like so much academic commentary, these generalizations about the legal profession based on a small sliver of it are so tiresome.

Some "employers" look for the "best and brightest" and use the proxy of admission to a "top" law school. But this has to do with the selectivity of the school's admission process; this has nothing to do with "rank." Nobody cares if it is HYS or SYH or YSH or whatever.

Moreover, the law profs who pontificate about "opportunity" law schools contradict themselves regularly by claiming that employers follow "rankings" ... as these schools are largely unranked, i.e., not even recognized sufficiently to rank.

Again, "rankings" are irrelevant. Employers look to the selection process. If a school is admitting persons who can't pass the relevant tests, employers will take note. Students want to attend "top" law schools, to be sure; but this has more to do with endowments and longevity. As many studies have pointed out, the pecking order was established long before USNWR.

"Rankings" are an obsession based on caprice. As stated above, the ranking mentioned in this post was based on basically NOTHING reliable. As such it is a useless activity engaged in by folks who have nothing better to do.

Orin Kerr

Steven Freedman writes: "I'd be thrilled to hear someone articulate exactly which factors went into ranking San Diego over Tulane"

San Diego has a marvelous faculty with a large number of active and excellent scholars. I've presented papers there before, and I received some of the best comments I have ever received. Tulane is a good school with some fine scholars, to be sure, but I don't think it has nearly the quality of the faculty at San Diego.



You received "some of the best comments" about a "paper" that "you ever received"...? And, that is supposed to be accepted by any serious person as a basis for ranking a faculty? Really?

You couldn't make this stuff up. The total nonsense that actually passes as knowledge in legal academia is so consistently risible. That any thinking person would accept the statement quoted above as a basis for ANYTHING, leaving aside "ranking" a law faculty - and I have no doubt that just such nonsense played a big role in the "ranking" under discussion - is a sad testament to the juvenile nature of legal academia.

Anon too

Anon, surely you know that legal scholarship is based on anecdotes. The "See e.g." cite is the legal scholar's version of a correlation coefficient, a high R-squared, or a t-test. By the way, I have eaten at the McDonald's in San Diego and it was quite good. I have not eaten at the one in New Orleans and I am sure it has some fine burgers but there is no way it could measure up to the quality of the ones in San Diego. Anon, that is what we call "law science."

The One and Only Anon

The negative comments here about San Diego show the continuing problem with this site and the debate overall - it is filled with people who are know-nothings. Serious faculty have long respected USD as a significant and creative place, as Orin makes clear. The school has the good fortune to be located in one of the most beautiful places imaginable for a school, but in a market that is quite small (although the biotech boom there has not hurt). They overcame the structural disadvantages through years of hard work led by a very effective dean.


"Serious faculty have long respected USD as a significant and creative place as Orin makes clear. The school has the good fortune to be located in one of the most beautiful places imaginable for a school but in a market that is quite small ..."

Yes, indeed. If the readers are not yet howling with laughter, then they aren't seeing the supreme hilarity of these comments. Unbelievable. Just unbelievable.

Let the rankings begin.

"Let's see: San Diego, so sunny, but sort of small, but there is a "biotech boom" there ... (same person who last week was saying TJLS's building isn't worth anything) Hmmm. I'd rank that faculty at 61.2, with a margin of error of .2, but only because I'm not sure about the whole burgers thing. Burgers can be quite tasty, as I'm sure that Orin would agree. Orin received some great comments there. I heard him say so. Make that 61.4."


I'm not sure if this will be filtered out, since I seem again to be PNG around here - but a few months ago the THE™ World University Rankings 2014-2015 and the QS World University Rankings® came out and in a number of countries were front-page news, with either considerable vaporings about how universities had slid or self-congratulation about how they had soared in the rankings. Since these rankings are beginning to have real-world consequences - for example, impacting the allocation of ESF funding and perhaps NSF - I became curious about the methodology. Oddly enough, intellectual property law and antitrust/competition law use surveys, so you learn how to spot bad methodologies.

To say that the reputation surveys are as close to Horseshit Statistics™ as can be imagined would be kind. Let me explain: both surveys operate in similar ways, and a key aspect is the peer survey. They send thousands of academics a list of universities and ask the academics to rank their top 10 or 15. They then rank universities by the number of times each appears in the top 10-15 selected. Not surprisingly, survey participants tend to rank the schools that they have heard of - and since every academic has heard of Harvard, Oxford, and Cambridge, well, those get on about 100% of the replies; then everyone ranks the university next door and maybe the last place they went to for a conference.
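A minimal sketch of the counting scheme described above, using made-up ballots (all school names and counts here are hypothetical, purely for illustration):

```python
from collections import Counter

# Each "ballot" is one academic's top-N list (invented data).
ballots = [
    ["Harvard", "Oxford", "Cambridge", "Nearby State U"],
    ["Harvard", "Cambridge", "Oxford", "Conference Host U"],
    ["Oxford", "Harvard", "Cambridge", "Nearby State U"],
]

# The surveys simply count appearances, ignoring order within a ballot.
counts = Counter(school for ballot in ballots for school in ballot)
ranking = counts.most_common()
print(ranking)
# The famous names appear on every ballot; everyone else's count
# rests on a handful of chance mentions, so small swings reorder them.
```

Note the fragility: in this toy data, one extra mention would move "Conference Host U" past "Nearby State U", which is exactly the single-digit-counts problem described below.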

Universities are aware that these surveys matter - a lot - in terms of grant revenue and the ability to attract international students (who, in Europe and other countries where tuition is heavily state-funded, are very valuable because their tuition is essentially unregulated: they can be charged what the market will bear, close to full tuition at a US college, rather than what the department of education allows). The result is that, at least within the top 100-200, gaming of the rankings is becoming visible. In the case of the QS ranking, it seems that getting just 2-3 more people to list a university in their 15 will rocket it up the rankings (since the actual counts for schools below, say, Harvard, Cambridge, and Oxford are in the single digits). This year Australian universities seem to have soared in the rankings - for no clear reason other than, one suspects, good marketing.

I tried an experiment on a few physicists I know, asking them to rank the top 15 physics departments. After MIT, Cambridge, and CalTech, almost all their picks were places the physicist had worked with professionally - they were just the departments they could remember. How many readers of this forum had even heard of Florida Coastal or Ave Maria (unless they applied for a job there)? How many people read most law review articles ... and how many who are not professors writing on the same subject (or the author's mother)?

The hard reality is that these surveys would be pointless if they did not attract readers the way top-ten lists do in Vogue and Cosmopolitan (not to mention the Weekly World News and other checkout fodder). Pity that people base important decisions on them.

James Grimmelmann

No one is an expert on dozens of law schools. But even if professors have a lot of information only on some schools, they have some information on a lot of schools. They know the faculty in their field at most schools; they read the occasional paper from scholars outside their field; they give workshops and get comments, as Orin did. The point of a survey is that it aggregates the information held by many people to give an overall picture. When dozens of respondents in different fields at different schools all rate the University of Mars over Trantor Tech, it is fair to say that U of M has a better academic reputation than TT, even when none of the respondents has a complete basis for the head-to-head comparison. It is not necessary that every respondent know everything; only that most respondents know something.

That's not an endorsement of the U.S. News or Leiter surveys, both of which have issues that have been discussed to death. It's just to say that professors can reasonably express an opinion based on their individual experiences. Enough anecdotes are data. All knowledge starts somewhere.
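The aggregation point can be illustrated with a toy model, reusing the University of Mars and Trantor Tech examples from above (all scores invented): each respondent rates only the schools they happen to know, yet pooling the partial ratings still yields an overall ordering.

```python
# Hypothetical partial ballots: each respondent scores (1-10)
# only the schools they know something about.
responses = [
    {"U of Mars": 9, "Trantor Tech": 6},
    {"U of Mars": 8, "Arrakis State": 7},
    {"Trantor Tech": 5, "Arrakis State": 6},
]

def average_scores(responses):
    """Average each school's scores across whoever rated it."""
    totals, counts = {}, {}
    for ballot in responses:
        for school, score in ballot.items():
            totals[school] = totals.get(school, 0) + score
            counts[school] = counts.get(school, 0) + 1
    return {s: totals[s] / counts[s] for s in totals}

avg = average_scores(responses)
# No single respondent rated every school, but the pooled
# averages still produce a head-to-head ordering of all three.
print(sorted(avg.items(), key=lambda kv: -kv[1]))
```

Whether that pooled ordering means anything, of course, is exactly what the rest of this thread disputes.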

Camilla Highwater

Three respondents to Leiter's survey rated USD's faculty as superior to Yale's. So it must be really good.


And, all of those votes came from profs at USD, no doubt, who chose their employer over their alma mater, unlike most.


Yes, all knowledge about the "strength of law faculties" starts with anecdotes and rumors, but, more importantly, the true bases of supposedly valid "rankings" are, at the top, a long-standing pecking order (which preceded USNWR and perpetuates itself by means of faculty hiring, endowments, longevity, etc.) and, at the bottom, absolutely nothing at all.

The unbelievable groupthink in which the legal academy engages is something to behold, it truly is. Never has there been a group of otherwise mainly intelligent people who so often appear to be willfully ignorant and intellectually self-indulgent.

"Mirror mirror on the wall ... Who is the fairest of them all?"

"Why, it is me! ME! ME!"

(Well, "me" when compared to Tulane; and this, I know for sure. Orin Kerr said he received great comments from my colleagues, and that means everything if we add up a bunch of Orins!)

Paul Horwitz

Anon, is your complaint about rankings, or is it about stating any opinion about the scholarly quality of a law school's faculty based on a scholarly interaction with the faculty of that law school? I certainly understand the former but I guess I find the latter a little odd--and, given that Orin was just politely answering someone's question in the comments section of a web site, and not doing something of lasting or major significance, the quantity and heat of your multiple responses also seems pretty remarkable.


