Law School Transparency’s recent report, the 2015 State of Legal Education, led to numerous articles in the national media about the trend of law schools admitting large numbers of poorly qualified applicants. This prompted the Law School Admission Council (LSAC) -- the organization responsible for the LSAT -- to issue a press release denouncing LST’s report and purporting to rebut certain factual assertions allegedly made by LST. The press release mischaracterizes the conclusions of the LST report.
The press release begins: “A report recently released by Law School Transparency (LST) has gained headlines by claiming that some ABA-approved law schools have been intentionally admitting ‘high risk’ students who, based on their LSAT scores, do not have a reasonable chance of passing the bar.”
LST stands by the assertion that dozens of ABA-approved law schools know that they have admitted large numbers of students who, based on their low LSAT scores coupled with commensurately low undergraduate GPAs, are at high risk of academic failure or of failing the bar exam. Although LST’s risk bands are delineated by LSAT scores, LST was very careful to explain that the risks of a low LSAT score could be offset by strong academic performance in college. LST analyzed the available data and determined that law schools were not offsetting lower LSAT scores with higher GPA requirements. Although many of these same law schools have been allocating more resources to internal and external academic success and bar preparation programs, these efforts have not yet offset the overall decrease in student capability, resulting in a strong trend of declining bar passage rates at schools in the high-risk categories.
Law schools can analyze internal data about attrition and bar passage rates by LSAT score and UGPA to make reasonably accurate predictions of the likely success of applicants with similar credentials. Law School Transparency has urged (and continues to urge) law schools to make internal data publicly accessible. LST acknowledges that some law schools may be better at educating students with marginal predictors and helping to prepare them for the bar. LST stands by its assertion that law schools have a duty to the legal profession to share empirically validated findings.
LSAC specifically states that LST has made three false claims. The first alleged false claim is that “LSAT scores can be used to assign bar passage risk.” LSAC objects to LST’s labeling of students with certain LSAT scores as high risk, very high risk, and extremely high risk. Drawing from LST’s report, LSAC notes that students labeled “high risk” at one school had a first-time pass rate of 57 percent, while comparable students from another school had a pass rate of 23 percent. Based on this disparity, LSAC concludes: “Clearly, many factors significantly affect bar passage rates above and beyond LSAT scores.” LST agrees wholeheartedly with this statement and has never claimed otherwise. The LSAC statement also concludes: “The assertion that LSAT scores alone measure comparability is patently wrong.” But LSAC President Daniel Bernstine is arguing against an assertion that the LST report never made. Here is what the report actually says about assessing risk by LSAT score:
The [LSAT risk band] framework represents only a starting point for assessing the risk of bar failure. A student with a low LSAT score but very high undergraduate GPA, for example, has less risk of failing the bar than a student with the same LSAT score and a very low UGPA. Some law schools have also been more successful than others in helping students with low LSAT scores succeed on the bar exam. Where the student takes the bar exam matters as well.
While LSAC may object to “labeling” law schools and law students, LSAC does not dispute the underlying premise that there is a strong correlation between LSAT score and success in law school and on the bar exam. Mr. Bernstine does not dispute it because he cannot. LST believes that “high risk” is a fair characterization of both a 43% and a 77% risk of failing the bar the first time, as reflected in the data from these two schools.
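The 43% and 77% figures are simply the complements of the 57% and 23% first-time pass rates cited earlier. A minimal sketch of that arithmetic, where “School A” and “School B” are hypothetical placeholder labels for the two schools discussed above:

```python
# First-time bar pass rates for "high risk" students at the two schools
# cited above; "School A" and "School B" are placeholder labels.
pass_rates = {"School A": 0.57, "School B": 0.23}

# The risk of failing the bar on the first attempt is the complement
# of the first-time pass rate.
for school, pass_rate in pass_rates.items():
    fail_risk = 1 - pass_rate
    print(f"{school}: {fail_risk:.0%} first-time failure risk")
# → School A: 43% first-time failure risk
# → School B: 77% first-time failure risk
```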
The second “False Claim” identified by LSAC is: “LSAT scores can be used to delineate risk categories.”
This is not really a different claim, but a variation on the first. Nevertheless, LST did use LSAT scores to delineate risk categories (minimal risk, low risk, modest risk, high risk, very high risk, extreme risk), so clearly it can be done. Mr. Bernstine’s concern seems to be that the risk categories are either arbitrary or misleading because we are making “fine distinctions.” He notes that a difference of one point on the LSAT, such as from 150 to 149, may be statistically insignificant. But each of LST’s risk bands is more than one point wide and represents a range of performance. For example, the high-risk category of 147–149 spans the 33rd to the 40th percentile on the LSAT. The point of the risk categories is simply to illustrate that the risk of failure goes up as the LSAT score goes down, an indisputable fact that Mr. Bernstine does not challenge. While reasonable people could differ on exactly where to draw the various lines, the available data confirm the validity of the risk categories as presented in the report. Again, LST invites any law schools with data tending to refute the validity of these categories to come forward and make that information publicly available. We want to be wrong.
In recent weeks, several deans at high-risk schools have made comments in the media suggesting that there is no correlation between LSAT scores and bar passage at their law schools, but none have provided data to support their assertions. The most clearly false claim came from Dean Penelope Bryan at Whittier Law School. She told the Los Angeles Times that “[t]he LSAT score has no predictive value for the success of Whittier Law School students on the bar exam.” Incidentally, Whittier’s first-time bar pass rate in California dropped from 64.7% in July 2013 to 42.7% in July 2014 to 30% in February 2015. (Note: Whittier had only 10 first-time test-takers for that administration; California’s July 2015 results by school are not yet available, but the statewide results were lower.)
At the same time, Whittier’s entering class profile progressively weakened. In 2014, over half of the entering class at Whittier scored 146 or below on the LSAT. To Whittier’s credit, the school dramatically shrank its first-year entering class in 2015, substantially raising the LSAT profile of the bottom half of the class: the 50th percentile increased from 146 to 148, and the 25th percentile increased from 143 to 146. Clearly, Whittier has recognized that LSAT scores do matter, even if Dean Bryan won’t publicly admit it.
Mr. Bernstine’s third objection to the LST report is our alleged claim that “A study based on 25-year-old data can be used to assess current bar passage risk.” Mr. Bernstine makes much of the fact that LST cited a landmark LSAC study, the LSAC National Longitudinal Bar Passage Study, in support of the uncontroversial assertion that there is a strong correlation between LSAT scores and bar passage. He notes that the LSAT has changed since that study was published, but, importantly, he does not assert that the current version of the LSAT has any less of a correlation than the old version used in the study. Neither, to our knowledge, has the scaling and equating process changed in a way that damages the link between current tests and previous tests for comparability. And, of course, the LSAC study was just one of several data sources in support of the report’s conclusions.
The LSAC release concludes by noting the limitations of the LSAT and scolding LST for allegedly failing to identify these limitations in our report. In fact, the LST report specifically, correctly, and adequately addresses each of these issues. Despite these caveats, the fact is that LSAC has developed a highly accurate test for assessing law school aptitude. Because the bar exam is essentially a cumulative law school exam, it should not be surprising that the LSAT turns out to have a strong correlation with bar passage as well. The irony is that while LSAC spends millions refining the LSAT each year to ensure that it is the best predictive tool available for law school success, it now feels compelled to downplay the validity of its own exam.
LST understands that LSAC is speaking on behalf of many of its customers, which include the dozens of law schools identified by LST as high-risk, very-high-risk, and extreme-risk schools. LST understands that many of these law schools are unhappy about being called out for their reprehensible admissions practices. Undoubtedly, being labeled an “extreme risk” school is not good for business, and it does not make the school’s alumni very happy.
LST should not be criticized for pointing out uncomfortable truths. Dozens of ABA-accredited law schools are making reckless choices that have a huge impact on thousands of real people. If effective labeling motivates this minority of law schools to act responsibly, then the hurt feelings of a few people associated with these schools are a small price to pay. Until LSAC, or the law schools identified by LST as having problematic admissions practices, come forward with actual data to refute the conclusions in the LST report, general attacks on the report’s validity should be taken with a large grain of salt.
Kyle McEntee, Executive Director, LST
David Frakt, Chair, LST National Advisory Council