Law School Transparency’s recent report, the 2015 State of Legal Education, led to numerous articles in the national media about the trend of law schools admitting large numbers of poorly qualified applicants. This prompted the Law School Admission Council (LSAC) -- the organization responsible for the LSAT -- to issue a press release denouncing LST’s report and purporting to rebut certain factual assertions allegedly made by LST. The press release mischaracterizes the conclusions of the LST report.
The press release begins: “A report recently released by Law School Transparency (LST) has gained headlines by claiming that some ABA-approved law schools have been intentionally admitting ‘high risk’ students who, based on their LSAT scores, do not have a reasonable chance of passing the bar.”
LST stands by the assertion that dozens of ABA-approved law schools know that they have admitted large numbers of students who, based on their low LSAT scores, coupled with commensurately low undergraduate GPAs, are at high risk of academic failure or failing the bar. Although LST’s risk band categories are delineated by LSAT scores, LST was very careful to explain that the risks of a low LSAT score could be offset by strong academic performance in college. LST analyzed the available data and determined that law schools were not offsetting lower LSAT scores with higher GPA requirements. Although many of these same law schools have been allocating more resources to internal and external academic success and bar prep programs, these efforts have not yet offset the overall decrease in student capability, resulting in a strong trend of decreasing bar passage rates at the high-risk category schools.
Law schools can analyze internal data about attrition and bar passage rates by LSAT score and UGPA to make reasonably accurate predictions of the likely success of applicants with similar credentials. Law School Transparency has urged (and continues to urge) law schools to make internal data publicly accessible. LST acknowledges that some law schools may be better at educating students with marginal predictors and helping to prepare them for the bar. LST stands by its assertion that law schools have a duty to the legal profession to share empirically validated findings.
LSAC specifically states that LST has made three false claims. The first alleged false claim is that “LSAT scores can be used to assign bar passage risk.” LSAC objects to LST’s labeling of students with certain LSAT scores as high risk, very high risk, and extremely high risk. Drawing from LST’s report, LSAC notes that students labeled “high risk” at one school had a first-time pass rate of 57 percent, while comparable students from another school had a pass rate of 23 percent. Based on this disparity, LSAC concludes: “Clearly, many factors significantly affect bar passage rates above and beyond LSAT scores.” LST agrees wholeheartedly with this statement and has never claimed otherwise. The LSAC statement concludes: “The assertion that LSAT scores alone measure comparability is patently wrong.” Mr. Bernstine is arguing against an assertion that the LST report never made. Here is what the LST report actually says about assessing risk by LSAT score:
The [LSAT risk band] framework represents only a starting point for assessing the risk of bar failure. A student with a low LSAT score but very high undergraduate GPA, for example, has less risk of failing the bar than a student with the same LSAT score and a very low UGPA. Some law schools have also been more successful than others in helping students with low LSAT scores succeed on the bar exam. Where the student takes the bar exam matters as well.
While LSAC may object to “labeling” law schools and law students, LSAC doesn’t dispute the underlying premise that there is a strong correlation between LSAT score and success in law school and on the bar exam. Mr. Bernstine does not dispute it, because he cannot. LST believes that “high risk” is a fair characterization of both a 43% and a 77% risk of failing the bar the first time, as reflected in the data from these two schools.
The second “False Claim” identified by LSAC is: “LSAT scores can be used to delineate risk categories.”
This is not really a different claim, but a variation on the first. Nevertheless, LST did use LSAT scores to delineate risk categories (minimal risk, low risk, modest risk, high risk, very high risk, extreme risk), so clearly it can be done. Mr. Bernstine’s concern seems to be that the risk categories are either arbitrary or misleading because we are making “fine distinctions.” He notes that a difference of one point on the LSAT, such as from 150 to 149, may be statistically insignificant. But each of LST’s risk bands is more than one point wide and represents a range of performance. For example, the high-risk category, 147–149, spans the 33rd to the 40th percentile on the LSAT. The point of the risk categories is simply to illustrate that the risk of failure goes up as the LSAT score goes down, an indisputable fact that Mr. Bernstine does not challenge. While reasonable people could differ on exactly where to draw the various lines, the available data confirm the validity of the risk categories as presented in the report. Again, LST invites any law school with data tending to refute the validity of these categories to come forward and make that information publicly available. We want to be wrong.
In recent weeks, several deans at high-risk schools have made comments in the media suggesting that there is no correlation between LSAT scores and bar passage at their law schools, but none have provided data to support their assertions. The most clearly false claim came from Dean Penelope Bryan at Whittier Law School. She told the Los Angeles Times that “[t]he LSAT score has no predictive value for the success of Whittier Law School students on the bar exam.” Incidentally, Whittier’s first-time bar pass rate in California dropped from 64.7% in July 2013 to 42.7% in July 2014 to 30% in February 2015. (Note: Whittier had only 10 first-time test-takers for that administration; California's July 2015 results by school are not yet available, but the statewide results were lower.)
At the same time, Whittier’s entering class profile progressively weakened. In 2014, over half of the entering class at Whittier was at 146 or below on the LSAT. To Whittier’s credit, the school dramatically shrank its first-year entering class in 2015, substantially raising the LSAT profile of the bottom half of the class, with the 50th percentile increasing from 146 to 148 and the 25th percentile increasing from 143 to 146. Clearly, Whittier has recognized that LSAT scores do matter, even if Dean Bryan won’t publicly admit it.
Mr. Bernstine’s third objection to the LST report is our alleged claim that “A study based on 25-year-old data can be used to assess current bar passage risk.” Mr. Bernstine makes much of the fact that LST cited a landmark LSAC study, the LSAC National Longitudinal Bar Passage Study, in support of the uncontroversial assertion that there is a strong correlation between LSAT scores and bar passage. He notes that the LSAT has changed since that study was published, but, importantly, he does not assert that the current version of the LSAT has any less of a correlation than the old version used in the study. Neither, to our knowledge, has the scaling and equating process changed in a way that damages the link between current tests and previous tests for comparability. And, of course, the LSAC study was just one of several data sources in support of the report’s conclusions.
The LSAC release concludes by noting the limitations of the LSAT and scolding LST for allegedly failing to identify these limitations in our report. Ironically, the LST report specifically, correctly, and adequately addresses each of these issues. Despite these caveats, the fact is that LSAC has developed a highly accurate test for assessing law school aptitude. Because the bar exam is essentially a cumulative law school exam, it should not be surprising that the LSAT turns out to have a strong correlation with bar passage as well. The irony is that while LSAC spends millions refining the LSAT each year to ensure that it is the best predictive tool available for law school success, LSAC feels compelled to downplay the validity of the exam.
LST understands that LSAC is speaking on behalf of many of its customers, which include the scores of law schools identified by LST as high risk, very high risk and extreme risk schools. LST understands that many of these law schools are very unhappy being called out for their reprehensible admissions practices. Undoubtedly, being labeled an “extreme risk” school is not good for business, and does not make the school’s alumni very happy.
LST should not be criticized for pointing out uncomfortable truths. Dozens of ABA-accredited law schools are making reckless choices that have a huge impact on thousands of real people. If effective labeling motivates this minority of law schools to act responsibly, then the hurt feelings of a few people associated with these schools are a small price to pay. Until LSAC, or the law schools identified by LST as having problematic admissions practices, come forward with actual data to refute the conclusions in the LST report, general attacks on the validity of the report should be taken with a large grain of salt.
Kyle McEntee, Executive Director, LST
David Frakt, Chair, LST National Advisory Council
Appalling that FL would give this specious organization a platform.
Posted by: Anon | December 02, 2015 at 12:30 PM
Anon
Do you believe it to be less "specious" to use other websites to provide a platform to those who differ with LST? Or is the FL different in some way? If so, how? Please explain.
Moreover, if you truly believe that David Frakt is in the business of making "specious" claims, then it is fair to say that most objective readers would deem you to be no credible judge of this topic; thus, you have demeaned the FL on the basis of, well, a specious accusation.
Posted by: anon | December 02, 2015 at 12:40 PM
As Georgetown and George Washington (and some others) keep admitting larger and larger classes (and then employing their own grads because they cannot place them) and taking in large transfer classes, schools farther down the pecking order start scraping the bottom of the barrel.
LST blames the schools farther down the pecking order, which is fair as far as it goes -- but you need to look at those higher up in the pecking order for the irresponsible admission practices as well.
Posted by: Tracy Flick | December 02, 2015 at 01:30 PM
Tracy, can you explain more clearly why those practices are a problem, and for whom? I've never heard a satisfying argument although I am sure one could be made.
Posted by: Kyle McEntee | December 02, 2015 at 01:36 PM
Anon - You will no doubt be horrified to learn that the Faculty Lounge is not the only one offering a platform to Law School Transparency. In addition to the many reputable news organizations from across the political spectrum that have reported on LST's State of Legal Education report (e.g. The New York Times, Wall Street Journal, Bloomberg News, NPR, LA Times, ABA Journal, etc.), the National Conference of Bar Examiners has invited Kyle McEntee and me to make a presentation at the 2016 Annual Bar Admissions Conference, and the ABA Section of Legal Education has invited Kyle McEntee to its next Council meeting.
I invite those who have serious critiques of the LST Report or the organization to share them with FL readers. I will be happy to post any serious response to the LST Report as a separate guest post on the Faculty Lounge. If you would like to take advantage of this opportunity, you may e-mail me directly at [email protected]
LST would also welcome the opportunity to explain our views and debate our critics publicly. All we need is an invitation.
Posted by: David Frakt | December 02, 2015 at 01:37 PM
LST's "study" is nothing more than pseudo science and LSAC is to be commended for calling you out.
Posted by: Anon | December 02, 2015 at 01:43 PM
And let's add that neither of the authors has any serious training or research background in higher education, labor markets, or social science generally.
The report fails to disclose that one of the co-authors of the above post is a failed dean candidate at one of the very schools targeted by the study - a shocking admission of the unfamiliarity of the authors with the bare minimum of standards in social science. Was that co-author so unfamiliar with how that school functioned that he went through an entire search process before realizing his mistake?
The "study" claims to have made a grand discovery: weaker students have a lower chance of success in law school and in a legal career. Wow. In fact, the entire diverse structure of higher education is built around the understanding that students have varying capabilities. That is its strength, not its weakness.
Incredibly, some of the very people (including the NY Times) who championed for profit schools in order to break up the alleged monopoly of the ABA schools now attack them for achieving precisely that goal.
LST has a history of raising the alarm about issues that the world has already figured out and then claiming victory. They are clearly at it again. It is unfortunate that so many here fall for this gambit.
Posted by: Anon | December 02, 2015 at 02:37 PM
Anon,
The sad truth is that the ABA and AALS fail to demonstrate even a modicum of introspection. I'd love it if the bar organizations, and the law schools themselves, were to critically examine the effects of the last 15 years on students, on the practicing bar, on wages, on tuition, and on student debt. In my opinion, and I'm not alone here, the "insider" organizations have failed to take leadership on any of these issues.
LST has no agenda other than transparency. They are operating with a dearth of information and doing the best they can with the data available. I have no doubt that schools -- especially the lousy ones -- have better metrics and data for predicting success and failure. For obvious reasons, they aren't publicizing or sharing their internal analytics.
I find the criticism of LST by the academy disappointing, but not surprising. After all, it's your ox that they are goring. Try to keep some objectivity here. You might find the message troubling, but the messenger's motives are good and are providing 0Ls with some of the only critical information on the law school crisis.
Posted by: Jojo | December 02, 2015 at 02:54 PM
Kyle -- Assume there is a zero-sum pool of applicants and every school wants to meet its budget. Assume also that there is a zero-sum pool of jobs for graduates of all schools, and that pool has shrunk in recent years. Sending everyone to Georgetown and GW (in my example) does not create more jobs overall, but it does create more lawyers out there unless other schools shrink their classes. When Georgetown and GW take a lot of transfers, the schools they take them from will, the next year, need to take more students initially. If other schools need to meet their budgets as well, they will look farther and farther down the applicant pool to get there. And because Georgetown and GW hire their own graduates, they mask the fact that they cannot place them in real jobs when new applicants apply.
I am just saying the higher up schools contribute heavily to the saturation problem, even if the lower ranked schools end up looking like the problem. If tomorrow we sent every student to Harvard, Yale and Stanford instead of where they go now, there would not be more jobs for everyone.
Posted by: Tracy Flick | December 02, 2015 at 02:58 PM
"LST understands that LSAC is speaking on behalf of many of its customers, which include the scores of law schools identified by LST as high risk, very high risk and extreme risk schools. LST understands that many of these law schools are very unhappy being called out for their reprehensible admissions practices. Undoubtedly, being labeled an “extreme risk” school is not good for business, and does not make the school’s alumni very happy."
I have to say that the vitriolic tone of LST and its allies, as in the portion quoted above, often gives me significant pause about their claims to be simply idealistic defenders of transparency and accountability with no emotional axe to grind. If your facts are accurate (a contestable premise), they will speak for themselves and stand up to scrutiny. The constant table-pounding and ad hominem attacks do little for your long-term credibility.
I realize you may not care and may even enjoy your self-appointed role as prophets in the wilderness whose job is to afflict the comfortable and the evil deans and law faculty. That's fine, and I get it: I too used to believe that the world was divided into the righteous and the wicked and took great satisfaction in calling out "enemies." I am now either too old or too jaded to believe the world is that simple, and I therefore tune out LST like I do Above the Law. A shame, really, because some of LST's work has the potential to make a meaningful contribution.
(Fully prepared for the slew of comments that the world is simple, law professors are buying yachts and working 3 hours a week on the back of the debt slave students and taxpayers. Flame on if you must, but believe it or not, I'm actually trying to provide some constructive feedback.)
And by the way: I do not teach at any of the schools LST is so worked up about. At least I don't think so: as mentioned above, I tune LST out because I find their tone gets in the way of the content, so I haven't bothered to parse their report to name-check every school they're labeling according to their parameters.
Posted by: Anon | December 02, 2015 at 03:12 PM
Jojo, if it makes you feel better, there has been an outpouring of support from law school faculty and deans around the country. Overwhelming support.
There are people in leadership positions within the ABA and AALS that are thirsting to act, and our report was designed to encourage that. It's a process, but it's one that is progressing really well.
It's clear that many faculty and deans are acting on what they've known: law schools are not all in this together. Standing up to the minority of schools that do the profession and academy harm, in addition to students, is a challenge that many are seizing.
Posted by: Kyle McEntee | December 02, 2015 at 03:12 PM
Tracy those are great points. I'm going to think on this with some folks. My instinct is that the numbers aren't enough to justify attention, especially when the students at the lower ranked schools are probably (but not always) acting rationally. But your conclusion definitely seems to follow.
Posted by: Kyle McEntee | December 02, 2015 at 03:15 PM
Tracy is raising the "fewer schools or fewer students per school" debate, which I posted about here:
http://prawfsblawg.blogs.com/prawfsblawg/2015/02/fewer-law-schools-or-fewer-students-per-school.html
Derek has a brief cost-benefit analysis in the comments. I think we have been on a "fewer students per school" track. But that may change if some of the better-ranked schools decide to squeeze out their regional competitors and shoulder the short-term drop in US News rankings. And schools will eventually put themselves out of business if they accept students who cannot pass the bar and cannot earn a living as attorneys.
It's bizarre that LSAC is running away from the predictive power of the LSAT. If it has no predictive power, why use it?
Posted by: Matt Bodie | December 02, 2015 at 03:48 PM
I agree with Anon 3:12. I wish LST could separate data collection and reporting from arguments and insinuations that tend to show a bias. (Biases are not inherently bad; they become a problem only when actors cannot perceive their own biases and how those biases might affect their conclusions.) LST's lack of methodological training, and its failure to understand how narrow, outdated, and limited the study's data set is, contributed to these unfortunate deficiencies in the results the study seemingly depicts. It's really unfortunate, because the premise of the organization is a good one.
Posted by: Another Anon | December 02, 2015 at 04:32 PM
Tracy, by that reasoning, isn't the existence of schools that students prefer more always to blame for the practices of schools that students prefer less? Near the top of the pecking order you have what I think is the single biggest law school, Harvard, with 560 incoming 1Ls. If it closed, or if it dramatically cut back its class size, that would really help the schools down the pecking order. Is the idea that Harvard is partially to blame for what schools ranked below Harvard have been doing? Or maybe I misunderstand the argument you're making.
Posted by: Orin Kerr | December 02, 2015 at 05:04 PM
Orin,
Speaking for myself (not Tracy): I think the question is why LST and the like direct so little ire toward the schools that inflate their LSAT and GPA medians (through enormous numbers of transfers) and their job placement stats (by employing their own grads). In a world of "transparency," one would think that LST would shout these practices from the rooftops as it does the practices of other, less prestigious/wealthy law schools.
Yes, I know that LST discounts school-funded jobs, but there's none of the vitriol directed at the schools at the bottom of the pecking order.
Posted by: anon | December 02, 2015 at 05:19 PM
Orin -- I am not sure of Harvard's history of class size, but yes, its large numbers contribute to the glut of lawyers, though I do not know whether its numbers are higher than they have always been. If all schools cut class sizes equally, there would be less pressure on schools at the bottom to lower their standards further. For the Fall of 2014 your school, GW, admitted 539 students, the largest class it has taken since at least 2007 - and yes, its quality has declined, and it is not finding more jobs for its students. It has taken in 44 transfers from American alone this year. American in turn takes transfers from UDC and Baltimore. Those schools, in turn, see the quality of their applicants decline even more.
Which schools are "the problem" in this equation? I would say the biggest problem is the schools that increased their class sizes during the recession.
Posted by: Tracy Flick | December 02, 2015 at 05:23 PM
Tracy, my understanding is that GW has tried to maintain its traditional class size, especially during years when the school had an interim Dean. The class size today is the same as it was when I began teaching there in 2001. As best I understand things, 2014 was an outlier only because the yield was greater than the admissions office expected. The larger size for that one class was an accident, not a plan. There has been no plan or effort to increase class size, with the exception of one year, under a prior Dean, when the Dean cut back the class size 20% in response to the nationwide drop in applications.
As for Harvard, like most of the top 10 schools, it hasn't changed class size. It's been in the mid-500s for decades, I think. (Although if others have studied this more closel
Posted by: Orin Kerr | December 02, 2015 at 05:37 PM
(Oops, sorry for posting before I had finished. I should add that all views are my own and based on my knowledge; I don't work in the Dean's office at GW, and I often enough disagree with decisions made by the Dean's office.)
Posted by: Orin Kerr | December 02, 2015 at 05:52 PM
If US News evaluated the GPA and LSAT scores of transfer students, transfers would go down considerably. There's a lot of gamesmanship with transfers.
The truth is, GW doesn't really have to worry about its incoming class size. Based on the yield, it can decide whether it needs to gut American or bleed the school slowly for another year. #KeepTheStatsHigh
Posted by: US News | December 02, 2015 at 05:58 PM