The state of the entry-level legal employment market matters to current law students, and to potential applicants for admission to law school. As Al Brophy and others have discussed, the number of applicants for the Fall 2014 entering class continues its free fall. Before long, the ABA (and NALP) will release the employment report for the Class of 2013. That report may well influence decisions about whether to apply to law school at all, and which law schools applicants might apply to, or enroll in.
Given that, I was not entirely surprised by an inquiry from one of the editors of a Wikipedia entry on law schools, or on recent-graduate employment. Last year, when the Class of 2012 ABA employment data was released, I focused on one crucial slice of the overall data: the percentage of graduates employed in positions that (i) require admission to the Bar and are both (ii) full-time and (iii) long-term. In my initial post, Bar-Admission Required, Full-Time, Long-Term: First Look, I discussed the general distribution of Class of 2012 law-graduate employment rates. I also listed the 20 schools reporting employment rates in such positions of at least 75%.
Given our national obsession with rankings, and requests from commenters, I followed up with Full Rankings: Bar Admission Required, Full-Time, Long Term, which
- posted the Class of 2012 ABA employment rates in that category for all 197 law schools (i) located in the continental United States and (ii) reporting employment statistics, and
- ranked the law schools from highest to lowest reported rates.
Thus, the inevitable question: is it better to focus on reported employment rates (percentages), or on rankings?
A relevant factor in judging a law school's reported employment rate is where its graduates find jobs. The reputations of less prestigious law schools tend to be regional, if not local. For example, graduates of South Texas College of Law (where I teach) have the greatest chance of getting jobs in the Houston metropolitan area, or elsewhere in Texas.
But there are more subtle problems with rankings. By nature, rankings are ordinal, rather than scale, variables: what counts is the order of finish, not the size of the gaps between ranks. Consider the recent Winter Olympics. In some events, there were large differences in performance among the athletes who won gold, silver, and bronze--first, second, and third. In other events, the margins separating the three medalists, and even separating the medalists from the rest of the field, were less than a second.
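A toy sketch in Python (the finishing times are made up) of how an ordinal ranking throws away the size of the gaps:

```python
# Hypothetical finishing times, in seconds, for two Olympic events.
# The order of finish (the ranking) is identical, but the gaps differ wildly.
event_a = [("gold", 95.20), ("silver", 99.80), ("bronze", 104.10)]  # wide margins
event_b = [("gold", 95.20), ("silver", 95.31), ("bronze", 95.44)]   # fractions of a second

for name, results in (("Event A", event_a), ("Event B", event_b)):
    times = [t for _, t in results]
    gaps = [round(b - a, 2) for a, b in zip(times, times[1:])]
    print(f"{name}: finish order 1-2-3, gaps between ranks: {gaps}")
```

Both events produce the same 1-2-3 ranking, but the gaps tell very different stories.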
The same is true of rankings of employment rates: the differences between schools are not necessarily meaningful. As the following chart shows, the relationship between employment rates and rank is not linear.
Employment rates drop quickly over the first 14 schools: from 94.9% (Chicago) to 81.0% (George Washington). As rank increases, the differences in employment rates between successive ranks get smaller (the slope of the curve gets flatter), until the rate falls to about 50%. Reaching the last school with an employment rate of at least 70% (South Carolina, No. 31, with 70.4%) takes another 17 schools. Reaching the last school with a rate of at least 60% (U. Mississippi, No. 80, with 60.0%) takes another 49 schools. Reaching the last school with a rate of at least 50% (Faulkner, No. 133, with 50.0%) takes another 53 schools.
Below an employment rate of 50%, the differences between ranks begin to increase (the slope of the curve gets steeper). Reaching the last school with a rate of at least 40% (St. Thomas U, No. 173, with 40.3%) takes another 40 schools. Reaching the last school above 30% (Western State, No. 189, with 32.5%) takes another 16 schools. Reaching the lowest-ranked school (Golden Gate, No. 197, with 21.5%) takes another 8 schools.
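For the curious, here is a minimal Python sketch of the arithmetic behind those gaps. The rank/rate anchor points are the figures quoted above; the per-rank slope it prints is just the average drop within each band:

```python
# (rank, employment rate %) anchor points taken from the figures above.
anchors = [
    (1, 94.9),    # Chicago
    (14, 81.0),   # George Washington
    (31, 70.4),   # South Carolina
    (80, 60.0),   # U. Mississippi
    (133, 50.0),  # Faulkner
    (173, 40.3),  # St. Thomas U
    (189, 32.5),  # Western State
    (197, 21.5),  # Golden Gate
]

# For each band between anchors: how many ranks it spans, how many
# percentage points the rate falls, and the average drop per rank.
for (r1, e1), (r2, e2) in zip(anchors, anchors[1:]):
    ranks = r2 - r1
    drop = e1 - e2
    print(f"ranks {r1:>3}-{r2:>3}: {ranks:>2} schools, "
          f"{drop:4.1f} pts, {drop / ranks:.2f} pts per rank")
```

The per-rank drop starts above a full point, flattens to about a fifth of a point through the middle of the pack, and then steepens again at the bottom: the nonlinearity described above.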
Information about employment rates is clearly important. General information about the distribution of the various employment rates is of great use to persons considering law school. For those choosing among particular law schools, the employment rates of each school under consideration are also useful.
But rankings are not particularly useful, and can even be misleading. It's better to list the schools alphabetically.
Gary Rosin
Update: As I was writing up Netting Out Law-School Funded Jobs, I realized that Dan Filler had followed up on my Bar Passage Required, Full-time, Long-term posts of last year (for links, see above) with New Law School Rankings: Employment Data Cleaned Of School Funded Jobs. Dan's post not only ranks schools after netting out law-school-funded jobs; it also discusses various problems with relying on rankings.
Maybe I'm missing something, but it seems that you could get at the information you want by plotting a bell (or near-bell) curve. Then it would be pretty easy to see how many standard deviations each school is from the average school, as to employment rate. That would be a ranking, of sorts, but it would solve the problem of misleading ordinal rankings.
Posted by: Scott Bauries | February 28, 2014 at 07:57 AM
By definition, Z scores are standardized deviations from the average. But is a 0.1 difference in Z scores necessarily material? Some statisticians suggest that even a 1.0 difference in Z scores is not necessarily material, and would look instead for a difference of 2.0.
In terms of the data discussed above, when the average is a 56.9% employment rate, a Z score of 0.0 tells many potential students that they ought to look elsewhere, especially when entry-level compensation is falling.
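To make the computation concrete, a minimal sketch in Python (the 56.9% mean is from the data above; the standard deviation is hypothetical, chosen only for illustration):

```python
# Z score: a school's distance from the mean, in standard deviations.
mean_rate = 56.9   # average employment rate noted above
std_dev = 15.0     # hypothetical standard deviation, for illustration only

def z_score(rate: float) -> float:
    """Standardized deviation of an employment rate from the mean."""
    return (rate - mean_rate) / std_dev

# Rates quoted in the post, from Chicago down to Golden Gate.
for rate in (94.9, 70.4, 56.9, 40.3, 21.5):
    print(f"rate {rate:5.1f}% -> z = {z_score(rate):+.2f}")
```

Under those assumptions, a school at the mean scores z = 0.00, and even schools a full standard deviation apart may not differ materially, which is the point about 1.0 versus 2.0 above.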
Posted by: Gary Rosin | February 28, 2014 at 09:31 AM