
March 07, 2015

Comments


Matt Bodie

I had the sense that USNWR is even more important because it helps to set the national discount rate. But I could be wrong on that.

Jim Milles

It might be that applicants have started to understand that below the top 14, rankings don't matter, and that the top 14 never changes, so who needs the US News?

Rob Vischer

I think there's broader recognition that the vast majority of law school markets are local, so metrics re local competitors are more important than rankings on a national scale. I also think that most law schools recognize how difficult/rare it is to move peer assessment, so (thankfully) there's more emphasis on outcomes than on marketing to other law school faculty. The rankings still matter, though.

Rob Vischer

Addendum: for top 14 (20?) schools, rankings still matter on a national scale. For other schools, less so. That might also inform Matt's comment on discount rates, which I also think are driven primarily by the local market outside the top ranked schools.

Jojo

'Twas always thus, and thus it will always be. US News was a beard and scapegoat for the scam. If students had hopes for fame and treasure, they assumed merit and reason and rank equated to dollars and ability. If the land is twice as rich, my crops will flourish and I will prosper. If school x is rated 30 places higher than school y, then it must be better and result in better lawyers with better lives.

Kyle and others have pulled back the curtain. Law school now is largely open enrollment. Everyone knows that only 14 schools matter. Who cares if another is ranked 25 or 50?

I do agree that the rankings now perversely work against the schools. A student will not pay more to go to a worse-ranked school than a better-ranked one.

anon

IMHO, what is most telling is the compression of the tiers. USNWR is moving toward a model for almost one half the schools that deems them not even worth ranking. The implication is that all of these schools are equally poor, or, alternatively, all are "special" in their own "special" ways (the preferred view of those who work there).

Michael Risch

anon at 5:30 - I'm not sure I understand your point about compression. The ranking of schools used to stop at 100 (and before that, it used to stop at 90), and numerical ranks extending into the third tier are a new thing from just the past couple of years. Thus, there are arguably fewer, not more, schools that remain unranked. After all, the group outside the top 100 will grow as the number of law schools grows while the number of top-100 spots remains fixed.

Perhaps you mean that the third tier is not growing while the fourth tier is, but that seems not unusual either. According to an article I read (but can't link to because this will go to spam), the number of schools in the third tier has been 45 or so since 1998, whether or not such schools were given a numerical ranking.

anon

Actually, Michael,

"beginning with the 1992 rankings, the magazine ... published the rank of every law school accredited by the American Bar Association (ABA). ... Beginning with the
2004 rankings, ... USN reported the top 100 law schools by rank, and divided the remainder of the schools into
the third and fourth tier ... listing these schools alphabetically."

See, Fear of Falling: The Effects of U.S. News & World Report
Rankings on U.S. Law Schools, Michael Sauder and Wendy Espeland (2007)

Now, for the "alphabeticals," what earthly difference does the USNWR "ranking" make? To repeat: the implication is that all of these schools are equally poor, or, alternatively, all are "special" in their own "special" ways (the preferred view of those who work there).

In a sense, a significant proportion of the ABA accredited schools should be thanking USNWR for ignoring them. Because they aren't worth ranking, in its view, they need not worry about being deemed to be at the bottom of the bottom tier.

The other side of the coin, however, is that prospective students may turn to other sources, and this may reveal even more negative information. Thus, the downturn in applications, because of the truly abysmal performance of so many of these schools.

Michael Risch

Anon - it's convenient how you edited out this statement from the same paragraph: "During most of the period since 1990, law schools were divided into four tiers: The top tier listed the 50 highest-rated programs in order of rank, and then schools were separated into the second, third, and fourth tiers and listed alphabetically within these tiers." So it turns out that we used to get only 1-50, and the rest were in alphabetized groups (until 1998, when that changed a bit, and again in 2004, and again in 2011, as I note from the paper A LONGITUDINAL ANALYSIS OF THE U.S. NEWS LAW SCHOOL ACADEMIC REPUTATION SCORES BETWEEN 1998 AND 2013 (see note 69) and from the paper you cite).

So, I'll rephrase what I said. I do understand what you mean by compression, as you make clear in your followup comment - that it's too bad that the fourth tier isn't ranked because there might be a big difference between number 146 and number 200+. Fair enough.

I guess I'm just scratching my head about the statement that "USNWR is moving toward a model" of this, because it's a model it's had since the beginning for the fourth tier, and off and on for the third tier, and even for a while at the second tier. If anything, it has moved away from that model in the second and then third tiers, and may yet do so for the fourth.

anon

Michael

You stated: "The ranking of schools used to stop at 100 (and before that, it used to stop at 90), and numerical ranks extending into the third tier are a new thing from just the past couple of years."

Now, you say I've omitted something! You don't seem to believe that USNWR, beginning with the 1992 rankings, published the rank of every law school accredited by the American Bar Association (ABA). ("I guess I'm just scratching my head about the statement that "USNWR is moving toward a model" of this, because it's a model it's had since the beginning.")

That's ok, I suppose, but odd because you cite the same paper that refutes your main point. I guess the early 1990s never happened.

Although I don't think your statement was complete, or even accurate, it really doesn't matter much. I'll take your point that the number of schools ranked has vacillated, that a "trend" to rank more schools is not completely without merit, and that stating "moving toward" was perhaps ill-advised without context!

I readily concede the point because you've ignored the point of my comments in any event. It isn't worth going back and forth about your quibble about "compression" because it is sort of an ancillary and misplaced debate.

If you don't think that unranked schools enjoy the advantages, or disadvantages, identified, I can't tell. If you believe the rankings should be expanded, or compressed, I can't tell. If you believe driving prospective students to acquire more information about the "alphabeticals" from other sources is a good thing or a bad thing, I can't tell. And, if you really think there would be no difference between rank 146 and 200, so be it!

Good point though (sort of) about the words "moving toward" a model of compression!

Just saying...

The longer they are around, the less important US News rankings become. Few schools have moved up or down in any significant sense, so most are locked in as a "top tier," "mid tier" or "low tier" school.

The cost of legal education and the decline in applications have changed the game to one in which those who do apply now know to play one school against another for more $$$. This is generally done with schools in the same tier. So in NYC, prospects play Brooklyn against St. John's and Hofstra and Cardozo, for example. While the rankings of these schools differ somewhat, they do not differ significantly enough to pay more to go to one than another.

I even know students who turned down top tier schools (NYU, Columbia) for free rides at some of the lower tiered NY schools I mentioned. Is that a smart move in the long term? Who can really say.

The game now is not the rankings, it is "show me the money."

jess

US News will become more relevant when they stop counting school-funded jobs in their rankings. That is probably a year away. Until then, schools like George Washington, Georgetown, Emory, UVa and William and Mary are severely overranked.

Michael Risch

anon -

I don't know what my views are on those things, in part because I think the rankings are corrupt and corrupting, in part because I think they can be wildly inaccurate in terms of school quality, and in part because you might be right that if you ARE going to rank, then you ought to rank.

I just didn't understand your position that it was a new problem, but I do now. I don't think it's an important enough point to debate the history.

Derek Tokaz

"I continue to be impressed with how highly the U.S. News rankings correlate with other rankings, such as employment outcome (hence the scatterplots that I use to illustrate this post)."

Surely a word is missing there, and Alfred means how much they do NOT correlate with other rankings, especially employment outcomes.

Al Brophy

Hi Derek, there's a .7 correlation between the U.S. News overall score in 2015 and the employment outcome. That's certainly lower than the overall score's correlation with some other key variables, such as LSAT score (.91), but still seems high to me. I discuss this in "Ranking Law Schools with LSATs, Employment Outcomes, and Law Review Citations," which is available here:
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2456032

The fourth column over in the first row of the illustration has a scatterplot for the overall U.S. News rank against the rank of % of graduates employed at full time, JD required jobs nine months out.
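(An aside for readers who want to see what goes into a figure like that .7: below is a minimal Python sketch of a Pearson correlation. The overall scores and employment percentages are invented purely for illustration; they are not Al's data and not U.S. News data.)

# Minimal sketch: Pearson correlation between hypothetical overall
# scores and employment outcomes. The numbers below are invented for
# illustration and are NOT the data from the paper or from U.S. News.
import statistics

overall_score = [100, 95, 88, 80, 72, 65, 58, 50, 44, 39]  # hypothetical
pct_employed = [97, 93, 90, 84, 70, 73, 62, 60, 55, 48]    # hypothetical

def pearson(xs, ys):
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

print(round(pearson(overall_score, pct_employed), 2))

Run on the actual per-school series of overall scores and employment percentages, the same calculation is what yields a summary figure like the .7 Al reports.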

Rob Kar

Al: I think you are sensing something real, but the shift is complex. I think students are looking more than before toward employment outcomes, which often correlate with USNews but not perfectly. And desired long-term geographical locations are also playing a larger role--on the assumption (rightly or wrongly) that getting one's foot in the door with connections during law school can be important. I thought your readers might find this blog post from Spivey Consulting helpful (it essentially shows how important the students they work with consider various factors to be when choosing law schools): http://spiveyconsulting.com/blog/how-do-students-pick-which-law-school-to-attend/

Derek Tokaz

I agree that the r value seems high, but there's not a standard way of interpreting r when it comes to these types of rankings. I think understanding the correlation makes more sense if you look at it from the point of view of a prospective student (who is ostensibly the target audience of the USN rankings).

Quick caveat: What I'm about to say is based on data from a few years ago when Kyle and I did similar analysis of US News ranking correlations to employment scores and other variables (see Take This Job And Count It, Journal of Legal Metrics, Vol. 1, 2012). US News has since then modified how it counts employment data.

Let's take the schools ranked 22-41, 42-61, and 62-81, and call them Groups 2, 3, and 4 respectively. Now let's look at the 25/50/75 percentile Employment Scores for those groups (the LST ES is more or less the same employment categorization you've used).

Group 2: 54.4% / 58.0% / 63.2%
Group 3: 47.5% / 55.7% / 61.4%
Group 4: 47.3% / 52.6% / 61.4%

If a prospective student has been accepted to school #30 and school #70, he's probably going to think #30 surely offers him greatly enhanced job prospects. In reality, it's a lot closer to a coin toss than to a lock.

This illustrates one of the key problems with USN and other ordinal rankings. Even if there was perfect correlation between USN rankings and Employment Score rankings, USN rank would still be a poor proxy for employment outcomes. This is because the USN ranks are equidistant. The difference between 1 and 2 is the same as the distance between 51 and 52. But of course there are times when there are larger differences, and when there's compression with many schools having nearly identical scores. For instance, there is a 2.1% ES difference between Penn and Cornell. That's a relatively high gap for 1 spot, and wouldn't be reflected if they were just called #5 and #6. That same 2.1% difference spans 10 schools if you go down to the 72.9-75% range. #5 and #6 may appear to a prospective student to be a virtual tie, while #26 is clearly better than #35, but the size of the difference is actually the same.
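(Another aside, to make the equidistance point concrete: the short Python sketch below uses invented employment scores, not LST or U.S. News figures. An ordinal list spaces the schools evenly even though the gaps between neighboring scores are anything but equal.)

# Sketch of the compression problem: ordinal ranks space schools
# evenly even when the underlying scores do not. All numbers are
# invented for illustration; they are not LST or U.S. News data.
scores = {"A": 84.0, "B": 81.9, "C": 81.8, "D": 81.7, "E": 62.0}

ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
for ordinal, (school, score) in enumerate(ranked, start=1):
    gap = ranked[ordinal - 2][1] - score if ordinal > 1 else 0.0
    print(f"#{ordinal}  {school}  score={score}  gap_from_previous={gap:.1f}")
# The printed ranks run 1 through 5 at even spacing, but the actual
# score gaps between neighbors are 2.1, 0.1, 0.1, and 19.7 points.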

anon

Derek

Great points!

The incessant fever among legal academicians is to rank themselves. They count everything imaginable, and then put together lists. Most Tweets, Most This, Most That.

It is so risible when viewed objectively, because it all amounts to nothing meaningful. Most of these "best of" lists are so poorly constructed as to reflect nothing more than attempts to garner attention, validate one's own work and status, and pretend to be famous. (To be sure, other professions do the same thing, but the risible aspect of legal academia is the patina of "scholarship" put on such frivolous self-regard.)

When it comes to law schools, these efforts are sort of ridiculous for another reason. As many have pointed out, the rankings are what they are, and are based on history. If one actually studies the history of law school rankings (which of course, in the FL, seems to be a rarity among the "knowledge generators") one will see that before USNWR the rankings were virtually the same. To be sure, there have been some changes, but nothing major.

That is the reason that the vast majority of law schools need to lift their collective heads and realize that they are too focused on the wrong issues. What is a "wrong issue"? Believing that you are better (or worse) than others because of a number on a list concocted by a self interested evaluator (or a list that simply reflects a historical reality that schools with hundreds of years of history have had and still have an advantage in this country).

Should there be a means to determine if employment outcomes at school X are better than at school Y? Yes. Transparency is a good thing. Anything wrong with making a list of these outcomes? Of course not. But, IMHO, legal academia should focus on legal education, and leave the obsession with ranking themselves behind.

Al Brophy

Derek,

Looking at your article (which is available here https://polisci.as.uky.edu/sites/default/files/Take%20This%20Job%20and%20Count%20It.pdf ) I'm wondering whether the employment data that US News uses changed significantly between when you and Kyle wrote that and the data I used. If you compare your plot of US News rank against employment (graph 2 on 332) it looks like there's a lot more variance than in my scatterplot of US News rank against employment rank for JD required jobs. Perhaps that's part of the explanation of our differences. I don't see that you reported the correlation for graph 2, so it's a little hard for me to judge how much of our difference might be explained by this.

I would add that your table 7 at 337, which reports the 25th, 50th, and 75th percentile for schools at each of your groupings (which you base on the US News overall rank) reflects -- at virtually every place -- a decline in the percent employed as you move down the groupings. That is, your US News-based groupings reflect declining outcomes as one moves down in rank. Some schools in lower tiers perform better than some schools in higher tiers -- that is true for pretty much all variables US News uses, I would imagine. That is, there is not a perfect correlation between US News rank and the variables that go into the rank. There is a separate question, of course, of how significant those differences are as you move down the rankings. More granular reporting of data will, almost necessarily, reflect more shades of grey than a single number (such as a correlation of .7).

I think that additional data on graduate employment is tending to replace US News.

Derek Tokaz

Another pretty serious issue with ordinal rankings is that the gap between two schools has little to do with the difference in quality between them, and more to do with completely irrelevant facts. The 7 point gap between Vanderbilt and Alabama has less to do with the difference in quality between the two schools, and more to do with the number of quality schools that exist in California. For someone planning to work in the Southeast, the number of good West Coast schools is irrelevant.

That is perhaps just a different way of explaining the uniform distance problem described above, but it plays out in a second way. Ordinal ranking doesn't work over time. We could see Alabama rise to 19 while Vanderbilt drops to 18. That would seem to indicate that two schools which once had some reasonable (if small) difference in quality are now in a virtual tie. But, the change could simply be due to WUSTL and Emory rising in quality, and GWU, Minnesota, and USC-Gould dropping in quality.

This isn't just a problem with USN rankings though. It's a problem with ordinal rankings in general, be they based on employment scores, LSAT/GPA stats, or whatever other wacky thing someone comes up with. Ordinal ranks really only make sense when there's a specific need. For instance, if we're going to have an 8-team playoff, we need to know who the top 8 are, and we don't care that #3 is a distant third behind #1 and #2, or that #8 is only marginally better than #9. (But of course if we need to decide how many to include in the playoffs, the difference between 8 and 9 matters quite a bit. So you can see how quickly ordinal rankings lose their relevance.)
