I thought I'd look a little at the differences between Brian Leiter's recent online survey of faculty quality and the U.S. News rankings. First, an observation about the changes in the top fifty schools. There are five schools that appear in Leiter's 2014 top 50 faculties that are not in the current U.S. News top 50 schools. They are (the two numbers following each school are its U.S. News rank and average rank):
University of California, Hastings 55.5 54
Cardozo Law School/Yeshiva Univ 65.5 64
University of San Diego 79.5 79
American University 75 72
Brooklyn Law School 84.5 83
There are also six schools in the current U.S. News top 50 schools that are not in Leiter's 2014 top 50 faculties. (The number following each school is its U.S. News rank):
Wake Forest 33
BYU 37
SMU 42
Maryland 46
Tulane 48
Utah 49
I also looked at the correlations between Leiter's ranks, U.S. News data on peer assessment and overall rank, and Doyle's Washington and Lee Law Review rankings. The following table reports the correlations between those variables.
|                  | Leiter rank | Peer 2015 | USN rank avg | Journals 2006-13 |
| Leiter rank      | --          | -.94      | .84          | -.82             |
| Peer 2015        | -.94        | --        | -.86         | .82              |
| USN rank avg     | .84         | -.86      | --           | -.67             |
| Journals 2006-13 | -.82        | .82       | -.67         | --               |
For all correlations, N = 49 and p < .0001
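To make the table concrete, here is a minimal sketch of how such a correlation matrix can be computed. The school data below is invented for illustration (the real analysis used 49 schools and the Leiter, U.S. News, and Washington & Lee figures); it also shows why the signs flip: a rank (lower is better) paired with a score (higher is better) yields a negative coefficient, while two ranks yield a positive one.

```python
# Sketch of producing correlations like those in the table above.
# All school data here is invented for illustration.
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Toy data: ranks run 1 (best) upward; peer scores run high (best) to low.
leiter_rank = [1, 2, 3, 4, 5, 6]
peer_score  = [4.8, 4.9, 4.5, 4.0, 4.2, 3.9]
usn_rank    = [2, 1, 3, 5, 4, 6]

r_leiter_peer = pearson(leiter_rank, peer_score)  # negative: rank vs. score
r_leiter_usn  = pearson(leiter_rank, usn_rank)    # positive: rank vs. rank
```

In the real analysis one would, of course, feed in the actual 49-school columns; the sign pattern, not the toy numbers, is the point.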
I guess one of the things that really stands out for me is the high correlation between Leiter's ranking and the U.S. News peer assessment scores (-.94; the sign is negative only because a better Leiter rank is a lower number while a better peer score is a higher one). As long as I'm talking about rankings, I want to say that last summer I posted a paper on ranking law schools using entering student data on LSATs, student employment data, and law review citations (which I use as a proxy for academic reputation). It's available on SSRN.
Update: I've been surprised at the amount of commentary on this post -- I guess it shows that people care about rankings, or something. Anyway, I think people might be interested in the differences between the Leiter rank and the U.S. News peer assessment rank for individual schools. Here's a table that lists the schools in order of their Leiter rank and compares them to their U.S. News peer assessment rank. A negative difference indicates that the school does better on Leiter than U.S. News; a positive one indicates that the school's U.S. News peer assessment rank is better than its Leiter rank.

I noticed that my own school (UNC) does substantially less well on the Leiter rank than on the U.S. News peer assessment rank. In fact, I think UNC has the fourth-largest difference between the Leiter rank and the U.S. News peer assessment rank (after Washington and Lee, which does much better on U.S. News than Leiter; the University of Illinois, which does much better on Leiter than U.S. News; and the University of Florida, which does much worse on Leiter than on U.S. News). I guess I should have voted. What's the slogan? "If you don't vote, you don't count," or some such.
At the risk of sounding too self-promotional, this paper uses citations as a replacement for U.S. News peer assessment, which -- as some people point out in the comments -- relies on raters having some knowledge about a bunch of schools.
Who really cares about Brian Leiter's rankings? Or anyone's rankings for that matter?
Academia is a strange, prestige-obsessed universe.
Posted by: Jojo | November 13, 2014 at 11:03 AM
Let's be clear about what is being ranked. This is really perceived scholarly distinction (it's actually even narrower - perceived scholarly distinction based on a particular style used to approach an issue). The fact that "faculty quality" is used to describe perceived scholarly distinction is unfortunate. Prospective law students should be more interested in teaching ability, a school's teaching philosophy, output measurements, and curricular options. Jojo asks why anyone cares about rankings. Well, because prospective law students select schools based on the rankings. An absurdly high percentage of the U.S. News rankings boils down to scholarship - most use it to assess their peers' "reputation" even though they know very little about their peer institutions. If that were to change, faculties would feel pressure to adjust their time allocations.
Posted by: Anon | November 13, 2014 at 11:36 AM
Prof Brophy, why did you ignore UC Irvine in your discussion? It is in the top 20 in Leiter's poll but unranked in US News. Seems like it should be mentioned, as it does affect your thesis of high correlation.
Posted by: Amir | November 13, 2014 at 12:24 PM
I'm perplexed at Princeton Law's absence from this list; they usually finish top ten :)
Seriously though, I think we all wonder how someone could possibly have a sense of the difference in scholarship quality among such a diverse group of schools. I'd be thrilled to hear someone articulate exactly which factors went into ranking San Diego over Tulane. Or Emory over Wisconsin. That doesn't mean the survey's not a fun exercise, just that its value is dubious.
Posted by: Steven Freedman | November 13, 2014 at 12:38 PM
Steven
You're right. Many of the persons voted for their alma mater, or for other frivolous reasons.
The persons who "voted" don't know much of anything about the "scholarship" at all these schools. At most, they may have some real knowledge about a few persons writing in "their field," and beyond that, some scant, cursory knowledge of the work of one or two professors at a few of the schools. Nothing upon which anyone would claim an informed judgment. This is like grading papers based on the names of students whom you have never met, or worse (and this is par for the course in legal academia) students about whom one "heard" something.
People produce these beauty contest-like rankings because it garners attention. The post preceding the one considered here (on that other blog) dealt with an attack on a piece by Paul Campos that Campos wrote six years ago(!), with no apparent reason to have pulled it out of the dust heap other than to brandish the headline "Paul Campos admits he is fraud."
Now, there is some news you can use.
Posted by: anon | November 13, 2014 at 01:40 PM
Steven makes a good point that this is a ranking by people who read Leiter's blog, and therefore, for example, have been exposed to that blog's cheerleading for the recently-established UC Irvine law school.
Posted by: JPQ | November 13, 2014 at 01:46 PM
"Academia is a strange, prestige-obsessed universe."
Yes, it is.
But let's not hide the fact that academia is far from alone with respect to rankings. Prospective, current and former students are just as obsessed if not more obsessed with law school rankings. To a lesser but still significant extent, employers are as well. It's a huge problem.
Posted by: ATLprof | November 13, 2014 at 06:02 PM
Hi Amir,
The reason I didn't mention Irvine is because it's a relatively new school, which has neither a USN rank nor sufficient data for its law review.
Posted by: Alfred Brophy | November 14, 2014 at 07:58 AM
Students follow; they do not lead.
Rankings aren't used by "employers"; as with so much academic commentary, generalizations about the legal profession based on a small sliver of it are so tiresome.
Some "employers" look for the "best and brightest" and use the proxy of admission to a "top" law school. But this has to do with the selectivity of the school's admission process; this has nothing to do with "rank." Nobody cares if it is HYS or SYH or YSH or whatever.
Moreover, the law profs who pontificate about "opportunity" law schools contradict themselves regularly by claiming that employers follow "ratings" ... as these schools are largely unrated, i.e., not even recognized sufficiently to rank.
Again, "rankings" are irrelevant. Employers look to the selection process. If a school is admitting persons who can't pass the relevant tests, employers will take note. Students want to attend "top" law schools, to be sure, but this has more to do with endowments and longevity. As many studies have pointed out, the pecking order was established long before USNWR.
"Rankings" are an obsession based on caprice. As stated above, the ranking mentioned in this post was based on basically NOTHING reliable. As such it is a useless activity engaged in by folks who have nothing better to do.
Posted by: anon | November 14, 2014 at 04:49 PM
Steven Freedman writes: "I'd be thrilled to hear someone articulate exactly which factors went into ranking San Diego over Tulane"
San Diego has a marvelous faculty with a large number of active and excellent scholars. I've presented papers there before, and I received some of the best comments I have ever received. Tulane is a good school with some fine scholars, to be sure, but I don't think it has nearly the quality of the faculty at San Diego.
Posted by: Orin Kerr | November 14, 2014 at 11:07 PM
Orin
You received "some of the best comments" about a "paper" that "you ever received"...? And, that is supposed to be accepted by any serious person as a basis for ranking a faculty? Really?
You couldn't make this stuff up. The total nonsense that actually passes as knowledge in legal academia is so consistently risible. That any thinking person would accept the statement quoted above as a basis for ANYTHING, leaving aside "ranking" a law faculty -- and I have no doubt that just such nonsense played a big role in the "ranking" under discussion -- is a sad testament to the juvenile nature of legal academia.
Posted by: anon | November 14, 2014 at 11:26 PM
Anon, surely you know that legal scholarship is based on anecdotes. The "See e.g." cite is the legal scholar's version of a correlation coefficient, a high R-squared, or a t-test. By the way, I have eaten at the McDonald's in San Diego and it was quite good. I have not eaten at the one in New Orleans and I am sure it has some fine burgers but there is no way it could measure up to the quality of the ones in San Diego. Anon, that is what we call "law science."
Posted by: Anon too | November 15, 2014 at 01:17 AM
The negative comments here about San Diego show the continuing problem with this site and the debate overall - it is filled with people who are know-nothings. Serious faculty have long respected USD as a significant and creative place, as Orin makes clear. The school has the good fortune to be located in one of the most beautiful places imaginable for a school, but in a market that is quite small (although the biotech boom there has not hurt). They overcame the structural disadvantages through years of hard work led by a very effective dean.
Posted by: The One and Only Anon | November 15, 2014 at 01:48 AM
"Serious faculty have long respected USD as a significant and creative place as Orin makes clear. The school has the good fortune to be located in one of the most beautiful places imaginable for a school but in a market that is quite small ..."
Yes, indeed. If the readers are not yet howling with laughter, then they aren't seeing the supreme hilarity of these comments. Unbelievable. Just unbelievable.
Let the rankings begin.
"Let's see: San Diego, so sunny, but sort of small, but there is a "biotech boom" there ... (same person who last week was saying TJLS's building isn't worth anything) Hmmm. I'd rank that faculty at 61.2, with a margin of error of .2, but only because I'm not sure about the whole burgers thing. Burgers can be quite tasty, as I'm sure that Orin would agree. Orin received some great comments there. I heard him say so. Make that 61.4."
Posted by: anon | November 15, 2014 at 03:02 AM
I'm not sure if this will be filtered out since I seem to again be PNG around here - but a few months ago the THE™ World University Rankings 2014-2015 and the QS World University Rankings® came out and in a number of countries were front-page news - with either considerable vaporings about how universities had slid or self-congratulation about how universities had soared in the rankings. Since these rankings are beginning to have real-world consequences - for example, impacting the allocation of ESF funding and perhaps NSF - I became curious as to the methodology. Oddly enough, intellectual property law and antitrust/competition law use surveys, so you learn how to spot bad methodologies.
To say that the reputation surveys are as close to Horseshit Statistics™ as can be imagined would be kind. Let me explain - both surveys operate in similar ways, and a key aspect is the peer survey. In this they send thousands of academics a list of universities and ask that the academics rank their top 10 or 15. They then rank universities by the number of times each appears in the top 10-15 selected. Not surprisingly, it seems that the survey participants tend to rank the schools that they have heard of - and since every academic has heard of Harvard, Oxford and Cambridge, well, they get on about 100% of the replies - then everyone ranks the university next door and maybe the last place they went to for a conference.
Universities are aware that these surveys matter - a lot - in terms of grant revenue and the ability to attract international students (who, in Europe and other countries where tuition is heavily state funded, are very valuable because their tuition is essentially unregulated - they can be charged what the market will bear, close to full tuition at a US college, rather than what the department of education allows). The result is that, at least within the top 100-200, gaming of the rankings is becoming visible. In the case of the QS ranking it seems that getting just 2-3 more people to list a university in their 15 will rocket it up the rankings (since the actual counts for the schools below, say, Harvard, Cambridge and Oxford are in the single digits). This year Australian universities seem to have soared in the rankings - for no clear reason other than, one suspects, good marketing.
I tried an experiment on a few physicists I know, asking them to rank the top 15 physics departments. After MIT, Cambridge and CalTech, almost all were places the physicists had worked with professionally - they were just the departments that they could remember. How many readers on this forum had even heard of Florida Coastal or Ave Maria (unless they applied for a job there)? How many people read most law review articles ... and how many of those who do are not professors writing on the same subject (or the author's mother)?
The hard reality is that these surveys would be pointless - if they did not attract readers the way top ten lists do in Vogue and Cosmopolitan (not to mention the Weekly World News and other checkout fodder.) Pity that people base important decisions on them.
Posted by: MacK | November 15, 2014 at 10:15 AM
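MacK's point about single-digit mention counts can be sketched in a few lines. The survey mechanics below follow his description (count how often each school appears on respondents' top-10/15 lists); the school names and counts are invented. The takeaway is that when schools outside the famous handful are separated by only one or two mentions, a tiny number of extra votes moves a school several places.

```python
# Toy model of a "count the top-list mentions" survey, per MacK's
# description above. Counts are invented; the point is how sensitive
# the ranks are in the single-digit tail.

def rank_by_mentions(counts):
    """Rank schools by mention count, highest first; returns {school: rank}."""
    ordered = sorted(counts, key=counts.get, reverse=True)
    return {school: i + 1 for i, school in enumerate(ordered)}

counts = {"Harvard": 98, "Oxford": 95, "Cambridge": 94,
          "Uni A": 9, "Uni B": 8, "Uni C": 7, "Uni D": 6,
          "Uni E": 5, "Uni F": 4}

before = rank_by_mentions(counts)["Uni F"]  # last place with 4 mentions
counts["Uni F"] += 3                        # three more respondents list it
after = rank_by_mentions(counts)["Uni F"]   # climbs several places
```

Three extra mentions out of hundreds of responses barely register at the top of the table, but they move "Uni F" past two schools; that asymmetry is exactly the gaming opportunity described above.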
No one is an expert on dozens of law schools. But even if professors have a lot of information only on some schools, they have some information on a lot of schools. They know the faculty in their field at most schools; they read the occasional paper from scholars outside their field; they give workshops and get comments, as Orin did. The point of a survey is that it aggregates the information held by many people to give an overall picture. When dozens of respondents in different fields at different schools all rate the University of Mars over Trantor Tech, it is fair to say that U of M has a better academic reputation than TT, even when none of the respondents has a complete basis for the head-to-head comparison. It is not necessary that every respondent know everything; only that most respondents know something.
That's not an endorsement of the U.S. News or Leiter surveys, both of which have issues that have been discussed to death. It's just to say that professors can reasonably express an opinion based on their individual experiences. Enough anecdotes are data. All knowledge starts somewhere.
Posted by: James Grimmelmann | November 15, 2014 at 12:04 PM
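Grimmelmann's aggregation argument can be checked with a toy simulation (all numbers invented): give each of 20 hypothetical schools a true quality score, let each rater see only a small random subset and rate it with substantial noise, and the averaged ratings still recover the overall ordering even though no single rater could rank the full field.

```python
# Toy simulation of the aggregation argument above: no rater sees every
# school and every rating is noisy, yet averaging recovers the ordering.
import random

random.seed(42)

true_quality = {f"School {i}": i for i in range(1, 21)}  # higher = better
sums = {s: 0.0 for s in true_quality}
counts = {s: 0 for s in true_quality}

for _ in range(500):                                # 500 raters
    seen = random.sample(list(true_quality), 5)     # each knows only 5 schools
    for s in seen:
        sums[s] += true_quality[s] + random.gauss(0, 3)  # noisy rating
        counts[s] += 1

avg = {s: sums[s] / counts[s] for s in true_quality}
recovered = sorted(avg, key=avg.get, reverse=True)  # best first
```

With rating noise three times the gap between adjacent schools, individual ratings are nearly worthless for head-to-head comparisons, but the averages sort the field almost perfectly: each respondent only has to know something, not everything.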
Three respondents to Leiter's survey rated USD's faculty as superior to Yale's. So it must be really good.
Posted by: Camilla Highwater | November 15, 2014 at 01:12 PM
And, all of those votes came from profs at USD, no doubt, who chose their employer over their alma mater, unlike most.
Posted by: anon | November 15, 2014 at 01:46 PM
Yes, all knowledge about the "strength of law faculties" starts with anecdotes and rumors, but, more importantly, finds the true bases of supposedly valid "rankings," at the top, on a long standing pecking order (that preceded USNWR and perpetuates itself by means of faculty hiring, endowments, longevity, etc.) and at the bottom on absolutely nothing at all.
The unbelievable group think in which law academy engages is something to behold, it truly is. Never has there been a group of otherwise mainly intelligent people who so often appear to be willfully ignorant and intellectually self-indulgent.
"Mirror mirror on the wall ... Who is the fairest of them all?"
"Why, it is me! ME! ME!"
(Well, "me" when compared to Tulane; and this, I know for sure. Orin Kerr said he received great comments from my colleagues, and that means everything if we add up a bunch of Orins!)
Posted by: anon | November 15, 2014 at 02:51 PM
Anon, is your complaint about rankings, or is it about stating any opinion about the scholarly quality of a law school's faculty based on a scholarly interaction with the faculty of that law school? I certainly understand the former but I guess I find the latter a little odd--and, given that Orin was just politely answering someone's question in the comments section of a web site, and not doing something of lasting or major significance, the quantity and heat of your multiple responses also seems pretty remarkable.
Posted by: Paul Horwitz | November 15, 2014 at 03:54 PM