I have a new paper on ranking law schools with LSAT scores, employment outcomes, and law review citations. It builds on the paper that I workshopped here at the Faculty Lounge last summer and that appears in the Indiana Law Journal Supplement. I've made some changes to the methodology in light of those comments and have also updated the data.
The new paper is "Ranking Law Schools, 2015: Student Aptitude, Employment Outcomes, and Law Review Citations." Here is the abstract:
This essay builds on a paper released last year that ranked law schools on three variables: the median LSAT score of each school's most recent entering class, the most recently available employment outcomes for each school's graduates, and citations to each school's main law review over the past eight years. This paper updates that study with median LSAT data for the class entering in fall 2014, employment data for the class of 2014 measured nine months after graduation, and the most recent law review citation data, covering 2007 through 2014. It studies 195 ABA-approved law schools.
In addition to using more recent data, this study changes the method of combining those data. Where the last paper used simple ranks for each variable and averaged them, this study takes a more granular approach. It converts each school's median LSAT score and the percentage of its graduates employed in full-time, permanent, JD-required jobs nine months after graduation (excluding school-funded positions and solo practitioners) to standard scores. In addition, given the dramatic differences in the number of law review citations among schools, it applies a common log transformation to law review citations and then converts the transformed values to standard scores. The paper combines the first two scores to provide a two-variable ranking, and then combines all three variables to provide a three-variable ranking. It reports average scores for the three-variable ranking, thus permitting examination of how close schools are to one another. It ranks the 195 ABA-approved law schools in the United States (excluding the three schools in Puerto Rico) that U.S. News included in the rankings it released in March 2015. It then compares the new two- and three-variable rankings to the U.S. News ranks from March 2015, and identifies the schools that improve and decline the most under the new rankings.
There are a few changes from last summer's study -- this one uses the most recent LSAT data, employment outcomes, and law review citations. But it also treats the data somewhat differently -- where last year I used simple ranks, this year I converted the raw data to standard scores (and in the case of law review citations I transformed them with a common log function before converting them to standard scores -- otherwise the leading law reviews would have had too great an influence on the overall rankings). I then combined those standard scores. This helps preserve some sense of the differences between schools. For some schools, the differences in standard scores are dramatic: Harvard and many of the other leading schools are many points higher than many of the middling schools. I'm going to talk a bit more about this later in the week. One key upshot is that the differences between many of the schools in the middle aren't all that great. This, I guess, is no surprise, but sometimes people act as though the difference between a U.S. News rank of 55 and a rank of 75 is really significant. Looking at the standard scores in my study suggests such distinctions may not be so great.
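For readers curious about the mechanics, here is a minimal sketch in Python of the scoring approach described above. The school names and numbers are invented purely for illustration (they are not the paper's data), and details like weighting, ties, and missing values may be handled differently in the paper itself.

import math
import statistics

# Hypothetical inputs (made-up figures, not the paper's data):
# median LSAT, employment rate, and law review citation count.
schools = {
    "School A": {"lsat": 173, "employed": 0.95, "cites": 6500},
    "School B": {"lsat": 165, "employed": 0.80, "cites": 1200},
    "School C": {"lsat": 155, "employed": 0.55, "cites": 300},
}

def standard_scores(values):
    # Standard (z) score: (x - mean) / standard deviation.
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(v - mean) / sd for v in values]

names = list(schools)
z_lsat = standard_scores([schools[s]["lsat"] for s in names])
z_emp = standard_scores([schools[s]["employed"] for s in names])
# Citations are log-transformed first so the most heavily cited
# reviews don't swamp the combined score, then standardized.
z_cite = standard_scores([math.log10(schools[s]["cites"]) for s in names])

for s, a, b, c in zip(names, z_lsat, z_emp, z_cite):
    print(f"{s}: two-variable {(a + b) / 2:+.2f}, "
          f"three-variable {(a + b + c) / 3:+.2f}")

Sorting schools by the averaged scores then produces the two- and three-variable rankings, and the spacing between scores (rather than just the ordinal ranks) is what lets you see how close the middle of the pack really is.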
As with last year, I have tables that list the schools that perform substantially better (and worse) than their U.S. News overall rank. I think this is worth some commentary later in the week, too, because focusing on the three variables I use here provides a good, basic sense of a law school's quality in comparison to other schools. It seems as though some of the other variables that U.S. News uses introduce some confusion -- or, in the case of the notoriously static peer assessment and lawyer/judge assessment numbers, may not reflect a school's current quality.
As I say, I'll be talking more about this later this week -- just as soon as I deal with some pressing matters on Confederate monuments, such as whether Georgia Code § 50-3-1(b)(2), which prohibits the removal of Confederate monuments by a local community without the consent of the legislature, should be modified or repealed.
The image is Drexel Law School, which is one of the most under-rated schools by U.S. News, according to one of the ranking methods in "Ranking Law Schools, 2015: Student Aptitude, Employment Outcomes, and Law Review Citations."
An earlier study, which used a somewhat different methodology (ranks rather than scaled scores) and data from last year, appears in the Indiana Law Journal Supplement.
"The third and final variable used in this paper is citations to a law school’s main law review over the period 2007-2014. This is designed to tell something about the intellectual orientation and culture of the school and to reveal something about the school’s standing in the legal education community"
Intellectual orientation and culture of the school? What exactly does that mean?
Harvard has 6500 cites, but Columbia only has 4900. Does that mean Harvard has 1600 more culture? That its orientation points closer to... what, exactly? NYU is only at 3700 cites, so Columbia has a third more orientation?
Posted by: Derek Tokaz | June 29, 2015 at 09:44 AM
As a consumer of apple sauce, I'm concerned about the quality of the sauce; of quite secondary importance is the quality of the apples that went into making it. As a prospective maker of apple sauce, I'm concerned about the quality of the sauce machine (i.e., the value it adds by making good or great sauce from mediocre or great apples).
I can't see why either of us would put great emphasis on the quality of the inputs. Likewise for the LSAT: it seems only to rate the quality of the inputs to the law school process. Better the school that turns indifferent entering students into good lawyers than the one that turns excellent entering students into great lawyers.
What I would like to see is a measure of the value of the schools' outputs, or of the value added by attendance (per dollar spent, perhaps). For that, it seems necessary to subject all graduates to a second LSAT (a bar-exam score might suffice) and compare the output to the input for each student.
Posted by: Jimbino | June 29, 2015 at 11:53 AM
Jimbino,
Al explains the reason for caring about the LSAT in his paper. The idea is that cohort quality greatly affects a student's learning experience in law school. Your professors aren't the only ones you're learning from; often they're not even the primary people you're learning from.
You're going to learn a lot more with a study group of smarties than with class time taken up by the questions of dummies.
Posted by: Derek Tokaz | June 29, 2015 at 12:10 PM
It seems to me that for law schools in particular, the impact of faculty research doesn't really speak to educational quality. The JD is not a research degree, and JD students are rarely involved in faculty research.
That's not the case with a research degree, where high-impact scholarship directly benefits the graduate students studying under those faculty members (or serves as a direct proxy for educational opportunity).
Posted by: twbb | June 29, 2015 at 03:44 PM
Jimbino,
The explanation Al gives in the paper is that cohort quality affects an individual's education. I think there's some merit to that. Consider how much more you learn when you get a study group of very bright students together. Consider how much is wasted when idiotic questions are posed in class (I apologize).
However, cohort quality may be more about matching than just being with the best cohort. If everyone else is far above you, you'll get left behind. Just think about being the slowest kid in a math class.
Posted by: Derek Tokaz | June 29, 2015 at 04:36 PM
twbb: that's a good point. Most law students attend law school to become lawyers, not researchers.
Posted by: anon | June 29, 2015 at 11:04 PM
twbb,
But this ranking doesn't look at the research done by professors at the school. It looks at the research done by professors at other schools who were published in the school's law review. So, it doesn't really matter that a professor's high-impact scholarship isn't directly benefiting students at his school, because it's benefiting... and here's where I lose the logic train.
Posted by: Derek Tokaz | June 30, 2015 at 09:33 AM
twbb,
It doesn't matter that high-impact scholarship doesn't directly benefit the students, because this isn't looking at the quality of the research done by your professors. It's looking at the quality of the research done by the professors published in your law review. You just want to publish high-impact scholarship; it doesn't matter if your professors are doing any of it.
Posted by: Derek Tokaz | June 30, 2015 at 10:25 AM
Hi Al. I just saw notice of this paper on today's ABA Journal online front page, and have just started looking at it. Thanks for doing it; in particular, I hope would-be students will find the 2-variable data useful in making their decisions.
Out of curiosity (nosiness, really): did you get any peeps (read: outraged email complaints) from one or more Defenders of the Sacred Honor of SCU, due to your 2-var ALgorithm having placed SCU as the single most over-rated school in the USNWR rankings?
Thanks again.
Posted by: Concerned_Citizen | July 02, 2015 at 11:42 AM