In my last post, I discussed the need to consider both sides of the ratio when examining bar passage rates: looking only at a particular law school's passage rate without examining the overall bar passage numbers can lead to incorrect conclusions. In this post, I am going to challenge the belief that law schools can easily predict which students will do well in law school and on the bar. This is based partly on my years of experience on the Admissions Committee for my school (not a current assignment, fortunately, as it is a really hard job) and partly on the statistics that have been developed about applicants.
When an individual applies for admission to a law school, a collection of objective and subjective information is provided to the admissions committee, from which a decision can be made. The key objective information is the undergraduate grade point average (UGPA) and the LSAT score. The subjective information includes letters of recommendation, essays, a c.v., and even the student's transcript. The admissions committee and staff stir this information around and offer some students a seat. Because much of the discussion about admissions standards revolves around the objective indicators, this post will focus on those.
Without a doubt, the objective information drives the process at most schools. The subjective information is used too, but UGPA/LSAT control most of the decision-making. Typically, an applicant must meet defined objective standards before any subjective information is considered. Indeed, much of the current debate about appropriate admissions standards rests on this assumption and focuses almost exclusively on law schools' alleged failure to accord these objective measurements their driving force. The essence of the argument is that some students with low objective indicators should never (or at least hardly ever) be admitted to law school. See David Frakt's LSAT Score Risk Bands. The major problem with this argument is that it ignores what the statistics tell us about the LSAT and UGPA.
According to the LSAC (the publisher of the LSAT), each of these objective data points is correlated with first-year law school performance. See LSAT Scores as Predictors of Law School Performance. For the LSAT, calculated on a school-by-school basis, the correlation ranges from .19 to .56; for UGPA, from .06 to .43.
Looking first at UGPA, at the bottom end, a correlation of .06 is statistically meaningless unless there are at least 1,000 students in the data set. See Table of critical values for Pearson correlation. According to the ABA, the largest admitted class last year (2016 data has not yet been released) was 576 at Georgetown. See ABA 2015 1L Data. The only other school above 500 was Harvard. See id. For these two schools, the correlation needs to be greater than .07 to have any validity; for the rest of us, a much higher correlation is needed. For my school, U.Mass., with an entering class in 2015 of 71, the correlation must be above about .19 to be valid. Even the highest correlation found, .43, is considered to be a statistically weak relationship. See How to Interpret a Correlation Coefficient r.
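These thresholds are easy to check with a few lines of code. The sketch below approximates the smallest correlation distinguishable from zero for a class of size n; it uses the normal approximation to the t distribution and a one-tailed test, so published tables (and a two-tailed test) will shift the figures slightly:

```python
# Sketch only: approximate critical value of Pearson's r for sample size n.
# Uses the normal approximation to the t distribution, which is close enough
# for the class sizes discussed here; exact tables differ in the third decimal.
from math import sqrt
from statistics import NormalDist

def critical_r(n, alpha=0.05, tails=1):
    """Smallest |r| statistically distinguishable from zero for n paired observations."""
    z = NormalDist().inv_cdf(1 - alpha / tails)
    return z / sqrt(z * z + (n - 2))

for n in (71, 576, 1000):
    print(f"n = {n:4d}: critical r ≈ {critical_r(n):.3f}")
```

For a class of 71 this gives roughly .19, and for Georgetown-sized classes roughly .07, in line with the tabled values cited above.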
The correlation between the LSAT and first-year performance is stronger than the one for UGPA; indeed, the highest correlation found, .56, is considered moderately strong. Further, with the lowest correlation at .19, schools with more than about 70 matriculants (all but seven ABA schools) would have a statistically valid result, although a correlation of that size is considered weak.
There is a problem with these relationships, however. If a law school is supposed to recognize which students will ultimately pass the bar examination based on pre-admission data (and thus avoid being a “bottom feeder”), it must have data that predicts bar passage. Neither of these objective measures does this.
To begin with, the relationship tested by LSAC is between UGPA/LSAT and first-year law school performance, not between the indicators and bar passage. The LSAC is not measuring the relationship with law school graduation or with passing the bar examination. Measuring either of these is challenging: if, for example, an individual drops out of law school or transfers to a different school, does that mean that he or she flunked the bar?
More importantly, the predictive power of the objective measures is, even at its best, weak. Statistically, to estimate the proportion of influence a correlated value has, you calculate the coefficient of determination. See Coefficient of Determination. This approximates the percentage effect that the correlate has, and the calculation is simple: square the correlation. Thus, for UGPA, the coefficient runs from 0.4% to 18.5%, with a median of 6.8%. Likewise, for the LSAT, the coefficient ranges from 3.6% to 31.4%, with a median of 14.4%. In other words, for the median school, the LSAT captures about 15% and UGPA captures about 7% of the student’s ultimate probability of succeeding in their first year of law school, leaving about 80% to other factors. My favorite example here is a student from years ago who had a UGPA of 2.00 but graduated near the top of our class. When I asked him about it, he indicated that he had specialized in a different kind of bar in college and was now on the wagon.
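For readers who want to check the arithmetic, here is a minimal sketch. The low and high correlations are the LSAC ranges quoted above; the median correlations (.26 for UGPA, .38 for LSAT) are back-calculated from the 6.8% and 14.4% figures, not taken directly from the LSAC report:

```python
def pct_explained(r):
    """Coefficient of determination, as a percentage: simply square the correlation."""
    return r * r * 100

# Low/median/high school-by-school correlations; the medians are my inference
# from the 6.8% and 14.4% figures in the text, not LSAC-published numbers.
for label, low, median, high in [("UGPA", 0.06, 0.26, 0.43),
                                 ("LSAT", 0.19, 0.38, 0.56)]:
    print(f"{label}: {pct_explained(low):.1f}% to {pct_explained(high):.1f}%, "
          f"median {pct_explained(median):.1f}%")
# → UGPA: 0.4% to 18.5%, median 6.8%
# → LSAT: 3.6% to 31.4%, median 14.4%
```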
The conclusion that has to be reached is that there is no simple way to identify a potential student as one who should absolutely not be admitted to law school. The LSAT and UGPA cannot be ignored in the process, but their use is of limited value. A school with a low LSAT or UGPA spread in its entering class might be admitting anyone with a pulse, or it might be successfully using the other 80% to find a valid class. To find a “bottom feeder” — particularly one who is in violation of ABA Standard 501(b) — a full analysis is needed that examines how the admission decision is being made and by whom, what factors are being considered in the decision-making, and how well the school is educating its class including its students’ attrition rate and success with the bar. To do less than the full analysis is unsupported by the statistics.
One is reminded of an old, old book.
Does anyone remember it?
"How to >>> With Statistics"
All this work just to avoid having any standards at all.
Perhaps, medical schools and every other professional school should also give up entrance examinations. They are not "statistically meaningful" enough. Let's go with interviews. That would be better, for sure.
David Frakt, all your work is bogus! It is known!
And, when a disproportionate number of law school grads fail the bar examination, and it can be shown that this failure correlates with admitting more and more unqualified students (measured by the traditional means), then there should be NO CONSEQUENCES to the law school that kept lowering its standards and increasing its admit rate to keep the federal loan dollars flowing.
In the event that we have law schools that demonstrably, year after year, fleece the lowest rung of performers for federal loan dollars, only then to release them into the wild to either fail the bar in disproportionate numbers or fail to obtain employment in the profession for which they were promised training, we should have a muddied, subjective and ultimately toothless inquiry into "how the admission decision[s] [were] made and by whom, what factors [were] considered in the decision-making, and how well the school is educating its class including its students’ attrition rate and success with the bar."
This, of course, will ensure that law schools will be held to no standards, and suffer no consequences, ever!
Let's squeeze that last dollar out of any sucker we can cajole into law school with bogus promises about the "best year ever to enroll" and similar hucksterisms. Let's spew a bunch of statistical jargon and claims to "prove" that, yes, if you scored in the bottom 10% on the LSAT and had a C- UGPA, those factors account for only 20%: the other 80% of what we don't know will be a FAR BETTER predictor of your success in law school.
How much of this claptrap is the legal academy willing to swallow before it gets behind some action to shutter the shucksters?
Posted by: anon | October 01, 2016 at 04:10 PM
The legal jobs and client work market today is like trying to feed twenty hungry people with one Arby's Roast Beef sandwich and then charging each one $49.00 for their portion.
Posted by: Captain Hruska Carswell, Continuance King | October 01, 2016 at 06:52 PM
Anon @ 4:10:
My post does not argue for no standards, it argues for rational standards.
Posted by: Ralph Clifford | October 01, 2016 at 07:12 PM
No one is contending that schools be prohibited from taking a chance on a handful of diamonds in the rough -- students with bad LSATs and GPAs who nonetheless impress the committee as people who may outperform the raw data. (By the same token, there may be applicants who nailed the LSAT but seem to have little chance of, or interest in, practicing law or passing the bar.)
There is a huge difference, however, between spotting a diamond in the rough to polish, and admitting rough coal and praying for diamonds. DeBeers knows where to dig and where not to dig. Exceptions to the statistical rule prove the rule: they don't invalidate it.
If I gave you raw data on prospective students ranging from LSAT to height and weight, I have little doubt that you could find the salient independent variables (hint: LSAT is one). To suggest that we don't know is not intellectually honest. We know where success lies, just like the miners generally know where to dig.
Posted by: Jojo | October 01, 2016 at 07:55 PM
My post does not argue for no standards, it argues for rational standards.
This is PRECISELY the refuge that Deborah Merritt took after claiming that the bar is not "valid." When asked what to replace it with, she recommended a bar exam with slight and truly minor tweaks (less detailed MBE, open-book MBE, a revised balance between practice tests and essays). Really thin gruel.
So, here we go again. The LSAT and GPA aren't valid. Statistically, these two factors capture about 20% "of the student’s ultimate probability of succeeding in their first year of law school, leaving about 80% to other factors."
Oh please. You must take your audience here for fools. How can you peddle this?
Do you seriously believe that for the cohort of candidates who scored in the bottom 10% on the LSAT and had a C- UGPA, those factors would account for only 20% of the probability of those students succeeding in their first year of law school, with other factors accounting for 80% of that assessment?
And, because you are peddling such really absurd ideas (in order to justify minimizing reliance on the LSAT and UGPA and dissolve any meaningful consequences for bottom feeders), the burden is on you to tell us:
When all of the objective indicia of the "success" of a law school point in the wrong direction - unacceptably low bar pass and FT JD required employment rates - and it is shown that that law school was dropping entrance credentials (the credentials used by every law school) and relaxing admit criteria to maintain cash flow, what should be done?
Will you implausibly claim that that law school's subjective admission criteria were superior to the traditionally accepted admission criteria?
BTW, leaving aside the 20% predictive value of the LSAT and UGPA, what factors do you consider overwhelmingly determinative?
A story about one student spending time in a bar in college?
Folks, reading these blogs is so revelatory of the reasons law schools have suffered such a precipitous decline in enrollment. The thought leaders are not able to make coherent, common-sense arguments.
This one doesn't even pass the first stage of plausibility.
Posted by: anon | October 01, 2016 at 08:55 PM
I am scratching my head over this, because statistical correlation is a tricky thing.
Let me try to explain. If you are looking at LSAT score as a predictor of law school performance, or for that matter UGPA, it helps to have an idea of what sort of "correlation" you are talking about and in what range the correlation is present or absent.
To put it in simple terms, you could be asking: how well does UGPA correlate with Law School GPA? Does a 4.0 UGPA typically get a 4.0 in law school, a 3.5 UGPA a 3.5 LSGPA, all the way down to a 1.0 UGPA a 1.0 LSGPA? If that is the question, the answer is probably that there is a very low correlation. But if the question is whether a really low UGPA correlates with a high likelihood of a low Law School GPA, the answer may be very different.
Similarly, consider law schools. At most law schools there is a relatively narrow spectrum of LSAT scores because, all things being equal, applicants tend to go to the highest-ranking school their LSAT will get them into, while law schools usually have bottom cutoffs for LSAT scores. So for a start, in any given law school you are talking about a small spectrum of LSAT scores, which makes the data lumpy by law school. The 75/25 percentile LSATs for Harvard were 175/170; for Charlotte and Texas Southern the 75/25 was 143/141 and 143/140. So for Harvard the range of LSATs is 5 points; for the 'bottom feeders,' 2-3 points.
In the case of Harvard it is hardly surprising that there is little correlation in law school grades between someone with, say, a 174 LSAT and a 171. But the other issue is also simple: within the range between 139 and 175 there is probably a low correlation for those scoring over 160 or so. That is to say, some 160s will do very well, some 175s will do relatively badly, but nearly none will flunk. So if you take the LSAT as a whole, you might be able to argue that there is very little correlation - because when you are discussing higher LSAT scores, well, yes. However, that does not mean that at very low LSAT scores there is not a strong correlation with law school and bar outcomes, or that at very low UGPAs there is not also such a correlation.
In my career I have had to deal with a lot of statistics being used by one party or another in cases, and you tend to learn how to spot a problem (or a trick played) with the data. One of the principles a partner taught me decades ago is that "if the data sounds too good to be true, work out why - chances are we can use that why against them." LSAT scores in the 139-140 range are barely better than guessing on the exam. While it might be that a 150+ LSAT score does not correlate with failure, very low scores can correlate very well. Look for the wrong correlation and you won't find one - it's a form of "cherry picking" the data.
Posted by: [M][@][c][K] | October 02, 2016 at 05:50 AM
When I wrote "all the way to does a 1.0 UGPA = 1.0 LSGPA" I meant a 2.0 GPA, but still, these days, who knows, maybe Charlotte would accept it.
Posted by: [M][@][c][K] | October 02, 2016 at 05:52 AM
Folks, we need to get a grip on ourselves. Hello???? "Fat, Drunk and Stupid is no way to go through life, son." It seems like the legal academy has climbed down a Rabbit Hole with Alice along with the rest of the country. We are a nation of Trump, Bundys, Kim Davis and guns. In the same vein, law schools are okay with 80% bar passage rates, low LSATs, GPAs, shady data, and will admit any warm body that pays tuition.
Posted by: Captain Hruska Carswell, Continuance King | October 02, 2016 at 01:06 PM
The bottom line is this:
Law schools, in attempting to excuse their failures and avoid any accountability, are devising poorer and poorer excuses.
What we hear from some of the most stubborn denizens of the law academy is this:
The Bar is invalid. We couldn't know our grads would fail the invalid bar in proportions that others tell us are unacceptable (we don't care, obviously) because the LSAT and UGPA are invalid. We work "hard" (e.g., on an admission committee). We are conducting "experiential" law courses, employing our "innovative" programs, to produce "practice ready" graduates, while all the while acting as "knowledge generators" who never pause, when we sit on those admission committees, to think about (or allow our admins to think about) keeping a minimum enrollment to generate enough money. We are only interested in providing "opportunity" to the unprivileged who score badly on standardized exams.
What we know is this:
Our bottom rung produces a majority of fleeced and disappointed young people - people fleeced just to be used as conduits to get at those federal loan dollars. Our bottom rung demands to be held immune from consequences for their failures, and expects us to believe the claptrap above.
Some of us believe otherwise.
Posted by: anon | October 02, 2016 at 02:46 PM
There is a hard rain coming for the law schools and administrators that continue to offer mealy mouthed protests for why the schools charge so much for so little.
Sad thing is, the author, Dean De Luc at Cooley, and Dean Allard at Brooklyn Law School, to name a few, will face no real repercussions from the corrupt business model they have created and ardently defend.
Why would anyone stop this? The students sign the master loan agreement. The 'financial aid office' (loan sharks with orthopedic chairs) accepts the loaned money from the government and pays the student's tuition to the school. The school pays the tuition out to the dean and his cronies. The student gets crippling debt and a 50% chance at a real legal job. The dean and cronies get mid-sized luxury cars and houses with three bathrooms.
What dean would want to stop this gravy train with biscuit wheels?
Posted by: terry malloy | October 02, 2016 at 09:00 PM
Mr. Clifford,
Please do not misunderstand or misuse my LSAT risk bands. The point of the risk bands is to advise law schools to exercise caution when admitting students from these bands (especially if they care about their bar passage rates), and to ensure that a low LSAT score is counterbalanced with a high GPA or some other factors suggesting a likelihood of success (such as succeeding in a rigorous admission by performance program). I have not advocated for a strict LSAT cutoff, but you are correct that I would recommend hardly ever admitting anyone at 144 or below. Hardly ever is not the same as never.
Posted by: David Frakt | October 03, 2016 at 09:43 AM
Per his comment above, Professor Clifford advocates for the use of "rational" standards to measure performance.
Using a "rational" standard for evaluating UMass School of Law's performance how would you evaluate these facts:
UMass School of Law's ABA Employment Summary for 2015 Graduates says 3 of the students got full time long term jobs with firms larger than 10 attorneys. Only 3.
More than 1 in 3 students who start there are gone before graduation (per 2015 ABA 509 report):
JD Attrition %
1st year 29.7
2nd year 6.7
3rd year 4.1
4th year 3.7
Difference between State pass rate (MA) and UMass School of Law's pass rate for 1st time takers:
2014 -18.24
2013 -17.61
2012 -31.35
I am all for the use of "rational" performance standards with respect to legal education. I suspect my definition of "rational" is a great deal different than that used by many law school deans and professors.
Posted by: confused by your post | October 03, 2016 at 10:49 AM
David: I appreciate what you are saying with your bands, but also see how some are using them. I do not disagree with you that the LSAT is, and always will be, an important pre-admission indicator. The goal of my post is to point out the severe limitations it has. I will be discussing the other "objective" indicators in future posts. My goal is to encourage discussion about how we can do a better job of selecting future students.
Posted by: Ralph Clifford | October 03, 2016 at 11:40 AM
"My goal is to encourage discussion about how we can do a better job of selecting future students."
Incorrect. Your goal is to disseminate pro-law school propaganda and provide cover for continued cheating of students. But nice attempt at spinning.
Posted by: Anon | October 03, 2016 at 12:03 PM
"My goal is to encourage discussion about how we can do a better job of selecting future students." PROVIDED it does not require sacrifice from the administrators of law schools taking money from these students.
Posted by: terry malloy | October 03, 2016 at 12:15 PM
Judging from UMass School of Law's student attrition rates I would think that its admissions folks should be VERY interested in such a discussion.
I look forward to Professor Clifford explaining what factors his school uses to select future students for admission and the weights those factors are given.
Posted by: confused by your post | October 03, 2016 at 02:29 PM
Confused
Introducing the harsh reality of what is being defended in the main post is so telling!
Is it any wonder one would resort to absurd nonsense in an attempt to defend that record?
WHERE ARE THE REGULATORS?
Posted by: anon | October 03, 2016 at 10:50 PM
Statistics matter.
Indeed.
Want to get really upset?
As I read the report:
Entering class:
71 students (20 part time), 33 total faculty.
That's one faculty member for every two students.
Then, take out all that attrition.
(I hope I'm reading this wrong; really, how disgusting could this situation really be?)
No wonder the main post refers to working "hard."
Posted by: anon | October 03, 2016 at 10:57 PM
Statistical analysis isn't something you want to armchair; a few points:
1. Your "Statistics for Dummies" link notwithstanding, r values are generally context-dependent. The values listed sound fairly strong, actually, and LSAT/UGPA should not be discarded unless something better is established.
2. Sample size limitations are not as restrictive here as you seem to think because the sheer number of schools allows for meta-analysis.
3. While LSAC does not describe the analysis they are doing, a straight linear regression equation might not be the best indicium of correlation.
4. Without a detailed explanation of the variables tested in constructing the regression analysis it's hard to tell how the independent variables interact with each other in impacting the dependent variable.
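On point 2, for instance, per-school correlations can in principle be pooled with a Fisher z transform; a sketch, with purely hypothetical (r, class size) pairs:

```python
# Sketch of fixed-effect meta-analytic pooling of correlations (my illustration,
# not anything LSAC publishes): transform each r to Fisher z, weight by n - 3
# (the inverse variance of z), average, and transform back.
import math

def pooled_r(studies):
    """studies: iterable of (r, n) pairs; returns the pooled correlation."""
    num = den = 0.0
    for r, n in studies:
        num += (n - 3) * math.atanh(r)  # Fisher z transform
        den += (n - 3)
    return math.tanh(num / den)

# Hypothetical per-school (correlation, class size) pairs spanning the LSAC range:
print(round(pooled_r([(0.19, 500), (0.38, 200), (0.56, 80)]), 2))
```

Aggregating across schools this way recovers statistical power that no single class of 71 can supply.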
Posted by: Twbb | October 04, 2016 at 07:28 AM
Twbb, agreed. I'm surprised that linear r values get as high as they do for LSAT/GPA/performance correlations, for something that is not a first-order experiment in the chemistry laboratory. Where one sees "no correlation," I see significant correlation given the kind of data we are dealing with - perhaps even more with a different model. This is proof of psychohistory, as far as I'm concerned...!
Posted by: dupednontraditional | October 05, 2016 at 03:12 PM