The ABA Standard 509 Information Report was a major step forward for law school transparency. (Note: the reports on all ABA-accredited law schools can be found here.) The reports provide a wealth of useful information about the entrance credentials of admitted students, attrition, conditional scholarships reduced or eliminated, and bar passage, allowing prospective law students to make an informed comparison of law schools. Although these reports are useful, in my view they don't go far enough, and they even have the potential to mislead students about their prospects for success. The problem with the reports is twofold: first, there is a lag in reporting; and second, the data doesn't differentiate among the success rates of students based on their entrance credentials.
Let me give you an example. Let's take Florida Coastal School of Law's (surprise) 509 Report. The 2014 report, the one available to students thinking about applying to law school for Fall 2015, provides the bar passage rates for first-time takers in 2011, 2012, and 2013. Looking at the numbers from those years, the school's bar pass numbers look pretty decent: 76% in 2011, 75% in 2012, and 68% in 2013. Looking at the 509 report, a prospective student would likely conclude "if I am admitted to this school, I will have a roughly 70-75% chance of passing the bar the first time." That prospective student might consider this to be a reasonable risk, and plunk down a deposit. (Of course, prospective students tend to disregard the information about attrition, even though it is presented on the report. No first-semester law student I ever met seemed to seriously contemplate the possibility that they might flunk out.)

But, as with mutual funds, past results are not a guarantee of future performance. As I have explained in other posts, where a law school has substantially lowered the entrance credentials required for admission, it can reasonably be expected that the school's bar passage rate will drop significantly three to four years later, when those full- and part-time students graduate and take the bar. Thus, in the case of FCSL, the lowering of admissions standards in 2010 resulted in a lower bar pass rate in 2013, and the further weakening of admissions standards in 2011 resulted in an even lower rate this year (58% for first-time takers in Florida on the July 2014 bar). The further weakening of admissions standards in 2012-14 is likely to result in further deterioration of the first-time bar passage rate at FCSL.

But the college senior or recent graduate contemplating law school doesn't know all that. I can tell you with great certainty that entering students with a 2.6 UGPA and a 140 LSAT (these are the 25th percentile figures for the most recently reported entering class at FCSL) don't have anywhere close to a 70% chance of graduating and passing the bar on their first attempt. But what exactly are their chances? Presumably FCSL knows, but they sure aren't telling. Wouldn't it be nice if they were required to disclose this? The current Standard 509 doesn't require anything like this level of granularity. I think that it should.
A Modest Proposal
I propose that each law school be required to have a calculator on its website which would provide customized, tailored predictors of success to all prospective students. Each law school would be required to maintain a master database that tracked every law student who matriculated. The database would include the student's undergraduate GPA (UGPA) and LSAT score. The database would track whether the student was academically attrited, voluntarily left school, transferred to another law school, or graduated. The database would also track each student who reported taking the bar and whether they passed on their first or a subsequent attempt. Of course, law schools are already collecting most, if not all, of this data. What I propose is that this data be made available to prospective students through the personal success calculator. Here's how it would work: the prospective student would plug their UGPA and LSAT score into the calculator, and the school's website would then provide a customized personal report describing the experience of similarly qualified students, which I would define as those within +/- 1 point on the LSAT and +/- .10 UGPA. So, if a student entered a UGPA of 3.0 and an LSAT of 150, the website would provide the following information:
"Over the last 7 years, we have matriculated x# of students with similar entrance credentials to your own, defined as those with a UGPA of 2.9 to 3.1 (+/- .10 from your self-reported UGPA) and an LSAT of 149-151 (+/- 1 point of your self-reported LSAT score).
Of these x# of students, # were academically attrited (failed), # voluntarily dropped out, # transferred to another ABA-accredited law school, # are still enrolled (as of the beginning of the most recent semester) and # have graduated, earning their Juris Doctor degree. Of the # that graduated, # reported taking the bar at least once. Of these #, # passed the bar on their first attempt, for a first-time bar passage rate of x%. An additional # who failed on their first attempt passed on a subsequent attempt."
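To make the mechanics concrete, here is a minimal sketch of how such a calculator might work, assuming the school's master database can be exported as a list of student records. The field names and outcome labels are hypothetical placeholders, not anything the ABA currently prescribes:

```python
# Sketch of the proposed personal success calculator. Field names and
# outcome labels are hypothetical, not an ABA standard.
from dataclasses import dataclass

@dataclass
class Matriculant:
    lsat: int
    ugpa: float
    outcome: str      # "attrited", "withdrew", "transferred", "enrolled", or "graduated"
    bar_result: str   # "first_try", "later_try", "never_passed", or "not_reported"

def success_report(students, lsat, ugpa, lsat_band=1, ugpa_band=0.10):
    """Summarize outcomes for matriculants within +/- lsat_band points
    and +/- ugpa_band grade points of the applicant's credentials."""
    similar = [s for s in students
               if abs(s.lsat - lsat) <= lsat_band
               and abs(s.ugpa - ugpa) <= ugpa_band + 1e-9]  # tolerate float rounding
    tally = {k: sum(1 for s in similar if s.outcome == k)
             for k in ("attrited", "withdrew", "transferred", "enrolled", "graduated")}
    took_bar = [s for s in similar
                if s.outcome == "graduated" and s.bar_result != "not_reported"]
    first = sum(1 for s in took_bar if s.bar_result == "first_try")
    later = sum(1 for s in took_bar if s.bar_result == "later_try")
    rate = f"{round(100 * first / len(took_bar))}%" if took_bar else "n/a"
    return (f"Similar matriculants (LSAT {lsat - lsat_band}-{lsat + lsat_band}, "
            f"UGPA {ugpa - ugpa_band:.1f}-{ugpa + ugpa_band:.1f}): {len(similar)}\n"
            f"Academically attrited: {tally['attrited']}; withdrew: {tally['withdrew']}; "
            f"transferred: {tally['transferred']}; still enrolled: {tally['enrolled']}; "
            f"graduated: {tally['graduated']}\n"
            f"Reported taking the bar: {len(took_bar)}; passed on first attempt: "
            f"{first} ({rate}); passed on a subsequent attempt: {later}")
```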
In addition to making this data available on their websites, schools should be required to include this individualized data in all offers of admission sent to any applicant. Thus, a student admitted to multiple schools could actually compare how students with similar entrance credentials have fared at each of the schools to which they were admitted, providing the prospective student with some meaningful basis for choosing among competing offers of admission. The raw numbers would also provide some potentially useful information. For example, schools that were admitting students with lower entrance credentials than they had accepted in the past might have little, if any, data on the success rate of students with similar entrance credentials. Knowing that a school had little, if any, experience in helping students at their talent level succeed in law school would be very useful to an applicant.
Let me give a hypothetical example to illustrate. Suppose a student with a 140 LSAT and a 2.6 UGPA, recently admitted to a law school, went to the school's website and plugged in their numbers. The report generated might look something like this:
"Over the last 7 years, we have matriculated 100 students with similar entrance credentials to your own, defined as those with a UGPA of 2.5 to 2.7 (+/- .10 from your self-reported UGPA) and an LSAT of 139-141 (+/- 1 point of your self-reported LSAT score). Of these 100 students, 30 were academically attrited (failed), 6 voluntarily dropped out, and 1 transferred to another ABA-accredited law school, 33 are still enrolled (as of the beginning of the most recent semester) and 30 have graduated, earning their Juris Doctor degree. Of the 30 that graduated, 28 reported taking the bar at least once. Of these 28, 12 passed the bar on their first attempt, for a first-time bar passage rate of 43%. An additional 2 who failed on their first attempt passed on a subsequent attempt."
This information would tell the prospective student that the school has only recently started taking significant numbers of students with similar entrance credentials, and that students like them tend to do quite poorly, with a high academic attrition rate and a low first-time bar passage rate. These numbers are likely far worse than the school's overall bar pass and attrition rates as reported on the 509 Report, which might be perfectly respectable. Hopefully, this stark data would cause the prospective student to think twice about enrolling in that law school. But even if the student chose to defy the odds and attend, at least it could fairly be said they were making an informed decision.
Requiring law schools to track this data and have it available for accreditors would also enable the ABA to determine whether schools were meeting Standard 501(b), which states:
(b) A law school shall not admit an applicant who does not appear capable of satisfactorily completing its program of legal education and being admitted to the bar.
Sounds reasonable.
How would you propose disclosing the fact that a law school "admitting students with lower entrance credentials than they had accepted in the past might have little, if any," information in a particular respect?
Also, I missed the "of the total, X reported" qualification. Perhaps there is a way to make that more clear ...?
Posted by: anon | December 10, 2014 at 01:20 AM
I pretty much view law schools as carnival barkers with access to federal lending that would make a subprime loan originator envious, but I don't see the point of this one.
I'd rather the ABA get rid of the JD Advantage category, which is misleading and ripe for abuse. The ABA could and should collect and publish first year salary data.
Posted by: Jojo | December 10, 2014 at 08:30 AM
Anon -
If a school had not previously matriculated students with such low numbers, then the numbers would be 0. The school would still have to report the data, but it would inform the prospective student that no student with similar entrance credentials had been admitted. I also updated my post to add one more piece of data - the number of students currently enrolled with similar entrance credentials. So, a school that had recently lowered its admission criteria might report 20 or 30 students currently enrolled with similar credentials, but might have 0 in the number of graduates or number who have passed the bar column.
Posted by: David Frakt | December 10, 2014 at 08:50 AM
There are two areas that I'd like to see improved to have greater transparency and provide more useful information:
(1) Incubator programs.
The relationship of "incubator programs" to the employment categories in the Employment Summary Reports is not all that clear or uniformly reported. Ideally, I'd like those types of term positions have their own category. I strongly suspect that schools are reporting the participants in the program as FT/LT solo or 2-10, which is pretty misleading in my view.
What I'm told is that schools are supposed to list participants in those programs as employed by the school. But when I compared the Employment Summaries with the ABA list of programs, that does not seem to be happening in more than a few instances (i.e., schools are reporting classes of incubator participants that are larger than the number of those reported as employed by the school). There are also a few programs that, while not technically run by the school, are run by not-for-profits obviously associated with the school (perhaps to avoid having to report these students as employed by the school?).
(2) Adding teeth to the Audit Procedure.
I've already complained on this site and elsewhere about how the ABA's new procedure for checking a school's employment reporting is inadequate to detect systemic fraud by a law school. I'm not going to repeat all that here but if you want to find my arguments on the audit requirement just google "'former editor' + 'systemic fraud'" and it will pop up right away. That procedure needs to be substantially beefed up and, until it is, some schools' data (particularly those with a rap sheet, so to speak) should be viewed skeptically.
Posted by: Former Editor | December 10, 2014 at 09:16 AM
Interesting points; I also think we have to place the LSAT scores in context. The reported scores are not based on an absolute scale; they are curved. That means that LSAT scores as a whole do not seem to change too much even if on average less qualified students are taking it. The bar exams, of course, are not curved. I would suspect the next few bar exams will show more passage rate drops for 150+ LSATers.
Posted by: twbb | December 10, 2014 at 09:16 AM
I agree that the reporting of employment data could also be improved, but I don't consider that to be as important as having accurate information available on attrition, graduation, and bar passage rates. After all, a law school is not a guarantor of employment, and employment figures are largely driven by economic factors that are outside the control of the law school. But the quality of the educational program is within the control of the law school, and the data to assess the quality of the program should be available. One law school might have a 5% attrition rate and an 80% first-time bar pass rate for students with a 150 and 3.0, and another law school might have a 15% attrition rate and a 60% first-time bar pass rate for students with the same entrance credentials, but right now, there is no way for a prospective student to know this. My proposal would at least remedy that problem.
Posted by: David Frakt | December 10, 2014 at 09:39 AM
Interesting proposal. Not sure if you're soliciting more, but one item that I'd like to see concerns the ABA certification of LSAT/GPA entering class stats. In response to calls for oversight, two years ago the ABA began certifying law school stats for entering classes. Oddly though, they made the certification process optional. From what we've been told, almost every law school participates, but not all of them. So some schools have strangely refused to allow the ABA to certify their results. Which schools? The ABA won't say.
I'd suggest requiring law schools to indicate whether their numbers have been certified by the ABA. They could put this on the 509 report, their website, or ideally, both.
Posted by: Certified? | December 10, 2014 at 10:12 AM
David, great question, although I think there is a simpler answer. In a perfect world, a prospective law student would have the following information available:
1. LSAT/GPA scores of all first-year students. To date, you can only find the middle 50%. It is important to know the top 25% if you need to be toward the top of your class to land the job you want (e.g., Fordham, GW). If you came off the waitlist, it is good to know the bottom 25% of the class to gauge your odds of beating at least some of the competition.
2. Scholarship info for all incoming students. This will provide maximum bargaining power to the prospective students. This info should be cross-referenced with LSAT/GPA so each prospective can know what they are worth.
3. Employment outcomes for every single graduate (names redacted). This means the actual name of the employer, duration of employment, salary info, etc. The summaries are too deceptive. A school can have 3 biglaw grads, and claim a $160,000 median private sector salary because only 5 people got private sector jobs.
Posted by: JM | December 10, 2014 at 10:24 AM
David,
This is an interesting proposal. With a couple quibbles, it makes sense to me.
First, I think you might overstate the likelihood that schools already have such granular information at their fingertips. Some schools undoubtedly know how their 140/2.6 students are faring, but as you know, schools aren't at this time required to analyze data in such detail for reporting purposes, which probably consume 95% of schools' data analytics efforts. That said, the info is there, waiting to be mined. It would be instructive.
Part of me wonders whether your proposed calculator would tell prospectives more about the school's qualities, or their own. It's unlikely that the bar passage rates of entrants with LSAT scores in the 170s or the 140s vary much from school to school. However, I could be wrong about the lower end; some schools might have very effective and involved academic support programs that would distinguish them from the pack of schools with similar 25th percentile stats. Still, I think the greatest value of your suggestion would be at the aggregate level. Perhaps this concern could be met by including an "aggregate calculator," so that the school-specific data could be easily compared with selected other law schools, or all law schools.
FE: I did find an earlier post of yours on systemic fraud. My feelings are mixed. Law schools have not traditionally been expert at survey research. It consumes an astonishing amount of staff time to do even "back of the envelope" data gathering and reporting. This might actually be an argument in favor of some more rigorous auditing, but I do not think that is a realistic demand for most individual law schools. If I had a magic wand, I'd put consortiums of schools together for the purpose of hiring outside firms to conduct employment surveys. Short of that, I'm wondering what your suggestion is for requiring auditing prior to the ABA triggers (2% error rate or credible allegation of fraud). Thanks.
Posted by: Adam | December 10, 2014 at 10:31 AM
Other easy and useful points of consumer data that schools could disclose if they actually cared about students:
1. Mean and median debt at graduation. (Note that this will include accrued interest and not just originated debt).
2. Mean and median tuition & fees actually paid per annum. (Since 2007, we've converted to the MSRP-with-secret-discounts model. How about a little transparency on pricing?)
3. Median salary info for graduates.
Posted by: Jojo | December 10, 2014 at 10:43 AM
Certified -
Great idea.
JM -
1. Schools already report the 25th, 50th, and 75th percentiles for both LSAT and UGPA.
2. I'm not sure why prospective students should have "maximum bargaining power." What they need is to have adequate information to reasonably evaluate the value of the education they are being offered.
3. While greater employment data and salary information would be useful, your proposal would raise serious privacy concerns in publishing the salaries of individual graduates. Even though the names would be redacted, it would be very easy to figure out which students had gone to which employers and how much they were making. The ABA does require a separate employment report http://employmentsummary.abaquestionnaire.org but that is not the subject of this post.
Adam -
It may be that the data shows that outcomes are very similar for students with similar credentials. That would also be very useful for a prospective student to know. Many law schools claim to have some secret sauce that makes their school a better choice. The actual data might confirm this, or it might prove otherwise. If the chances of graduating and passing the bar were basically the same, then the student could decide among law schools based on other factors such as location, price and employment prospects.
Posted by: David Frakt | December 10, 2014 at 10:52 AM
David,
1. I said scores for "all students" so that students know scores within the 1-25% and 75-100% bands. In a lot of schools, only the top 10% or 20% get desirable jobs, so prospectives need to know if they are realistically competitive for that placement. Same concept for those seeking to avoid being at the very bottom of a class in a school they were fortunate to be admitted to.
2. Schools jack up tuition to astronomical levels with the purpose of price discriminating for each applicant. More transparency would balance the power between schools and students and promote fairness.
3. So don't include the name of the employer, but rather the size/location of the firm. The point is to avoid any opportunity the school has to summarize. Summaries always lead to deception.
Posted by: JM | December 10, 2014 at 12:55 PM
"Short of that, I'm wondering what your suggestion is for requiring auditing prior to the ABA triggers (2% error rate or credible allegation of fraud)."
The most important thing I would change would be to essentially combine levels one and two of the random school review and begin the outside verification of the information in the files at the first step of the audit. My primary concern is that, under the current setup, there's no way to detect whether a school is misrepresenting (or otherwise messing up the reporting of) employment outcomes unless the school also leaves its file in a lousy state of completeness. In other words, so long as the information in the file fills out all the boxes in terms of the kinds of things that must be gathered, the ABA presumes the information "to be complete, accurate and not misleading in the absence of credible evidence to the contrary." As you noted, law schools are not experts at gathering survey information, and the whole reason for the protocol is the history of law schools disseminating misleading employment information. Any protocol worth the trouble, then, needs to "trust but verify" rather than just "trust," as the current protocol does on step one of the random school review.
I don't really see how starting the outside verification of information at level one is all that burdensome on either the law school or the auditors (if the school proves to be accurately reporting, that is). For a school with a graduating cohort of 200, we are talking 40 phone calls or emails to confirm the information in the file with a level 2 review. For much larger schools this could be more burdensome, but (1) that's the school's choice for admitting large classes, and (2) downward adjustments could be made in the percentage of the class polled when schools reach a particular size (e.g., 15% of the class checked for a school of 400).
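To put the arithmetic in one place, here is an illustrative sketch of that tapered polling rate (the 20% and 15% rates and the 400-graduate threshold are just the numbers from this comment, not anything in the ABA protocol):

```python
def audit_sample_size(cohort_size, base_rate=0.20, threshold=400, reduced_rate=0.15):
    """Illustrative tapered polling: contact 20% of the graduating class,
    dropping to 15% once the class reaches the threshold size."""
    rate = base_rate if cohort_size < threshold else reduced_rate
    return round(cohort_size * rate)

print(audit_sample_size(200))  # 40 calls or emails
print(audit_sample_size(400))  # 60, i.e., 15% of the larger class
```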
I would also remove the word "credible" from all of the places requiring "credible evidence" to get to a level two review under the random school review, assuming the above suggestion isn't adopted.
Last, I would designate some specific ABA officer to be responsible for determining and issuing a written conclusion as to the credibility of an accusation made that, if believed, would trigger a red flag review.
Posted by: Former Editor | December 10, 2014 at 02:43 PM
FE,
I understand where you're coming from, but compliance work often looks just like this: you have to document the steps you're following (which ideally add up to a menu of "best practices"), rather than prove every item was in fact filled out correctly. LSAC auditing of every single LSAT and UGPA data point works because it's extremely easy and cheap - they already have this info, and merely need to sort students into the schools they matriculated at. That's not the case with what you have in mind.
I think you're pretty optimistic about what those 40 calls or emails will reveal. Half of them will not be answered, which - depending on your perspective - will either raise a red flag (are these numbers faked?) or prompt a shrug (people hate surveys). I know on an earlier thread, you objected to law schools blaming their graduates for failing to answer these surveys. Respectfully, that is a very significant problem here, and while I think law schools should disclose the number of respondents (so that a "90%" employment figure has context), they are not ultimately responsible, in my view, for their graduates' willingness to assist future law students (and the school itself). The response of most people to being asked the same boring questions a second time is to hang up or ignore the call. That's why I'm more sympathetic to a consortium approach: I think this task is more expensive and time-consuming than you suggest, so I prefer to distribute the cost of surveying over a larger group. Ultimately, that might get us closer to agreement here, though we probably disagree on whether every law school should have an obligation to hire an auditor in the absence of a consortium like that.
Again, the answer here turns on what one thinks the problem is. I'm not aware that faked - as opposed to dramatically incomplete - numbers have generally been regarded as the problem on the employment side of law school disclosures.
A small point: are allegations that do not provide credible evidence appropriately regarded as actionable "evidence"? Are you saying that a letter to the ABA that reads, "I think School X mis-counted BigLaw graduates because I know everyone who got a BigLaw job from that school," should trigger a red flag review?
Posted by: Adam | December 10, 2014 at 04:35 PM
Adam,
I guess I'm a little confused by your consortium idea (or at least why you think something like it isn't already happening through the ABA's role in the process). My reading of the protocols is that no outside auditor is required until you get to a level three review. Prior to that, the checking and evaluating (which I've somewhat sloppily been calling "auditing") is being done by ABA employees (although schools are required to turn over the information they have to those employees). And if a school's files are in poor enough shape to earn a level three review, I think it should have to spend the money on auditors. The cost alone should provide some incentive for schools to at least pretend to maintain accurate files.
Regarding response rate, the protocols specifically say that "The reviews will take account of the facts that: it will not be possible to contact some graduates; a graduate might intentionally report false information; and a graduate’s employment status might have changed since the time data were collected but the school did not know or have reason to know of the change." The upshot of this is that for a school to trigger the next level of review the ABA would probably need to have several graduates telling their reviewers that what was reported was materially inaccurate, rather than the reviewers being unable to reach some portion of the class.
As to your small point, a red flag review can be triggered by a "credible allegation"; a level 2 review can be triggered by "credible evidence" presented during a random school review. They are different reviews, the latter being (potentially but not necessarily) far more serious.
As you know, I think level two review should happen automatically during a random school review anyway. It should happen all the more if there is any evidence calling a school's files into question during a random school review.
Regarding a potentially far more serious red flag review, I do think (and said above) that a credibility determination should be made. Should an unsubstantiated email trigger a full red flag review? Probably not, but there should be some specific person responsible for making the credibility call in a transparent way. Your hypothetical letter probably wouldn't cut it and a decision not to proceed on that basis would be pretty easily explained and defended.
Posted by: Former Editor | December 10, 2014 at 06:48 PM
I meant to say the former above. To be clear, a red flag review is potentially, but not necessarily, more serious than a level one or two review.
Posted by: Former Editor | December 10, 2014 at 07:02 PM
Rather than have each school publish a calculator on its own website, it would be better to have each school report the underlying data to the ABA in a standardized format and have a single calculator on the ABA's website. Each school would have to have a conspicuous and accessible link on its website to the ABA calculator (with that school already selected from the appropriate drop-down menu), just as schools must currently have conspicuous and accessible links to their Standard 509 disclosures. There is no reason to program one of these things more than once, or to have to integrate one with each school's own server software. A single centralized calculator would also make it easier for students to compare their prospects at multiple schools.
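For illustration, a minimal sketch of what a standardized per-student submission might look like, assuming a flat CSV export; the column names are hypothetical, not an actual ABA specification:

```python
import csv
import io

# Hypothetical standardized record format each school would submit to the
# ABA; column names are illustrative, not an ABA specification.
SAMPLE = """school,entry_year,lsat,ugpa,outcome,bar_result
Example Law,2011,150,3.00,graduated,first_try
Example Law,2012,148,2.90,graduated,later_try
Example Law,2013,150,3.10,enrolled,not_reported
"""

# A single ABA-hosted calculator would filter rows like these by school
# and credential band, exactly as in the per-school sketch above.
rows = [r for r in csv.DictReader(io.StringIO(SAMPLE)) if r["school"] == "Example Law"]
print(len(rows))  # 3
```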
Posted by: James Grimmelmann | December 10, 2014 at 07:37 PM
Providing this data would mean in practice (though only in part) explaining to most affirmative action admittees that they have a very low chance of ever becoming lawyers. The ABA would have a conniption if a law school did this, because the ABA doesn't care how many minority students graduate and pass the bar, only how many matriculate, and therefore they don't want consumers to have this information.
Posted by: David B | December 10, 2014 at 08:39 PM
David B -
It may well be that some "affirmative action admittees" would find out that students with similar entrance credentials have not fared particularly well at the school to which they have been admitted (or perhaps at any law school), but I'm not sure that this would cause the ABA to "have a conniption." The data that I am recommending be disclosed is race-neutral and indisputably useful consumer information. If some students with poor aptitude for the study of law were dissuaded from attending law school because they were provided with data that enabled them to make a more realistic assessment of their chance of success, I think that would be a positive development, even if some of the students who happened to be dissuaded were minorities. I am sure that there would also be majority students who would choose not to apply or not to accept an offer of admission once they found out what a risky proposition law school is for students with poor predictors. Being against providing consumer information would be a tough position for the ABA to defend.
Posted by: David Frakt | December 10, 2014 at 09:19 PM
Thanks, Former Editor. I think I understand your position better now.
I don't think I object to making the ABA help law schools comply with their reporting requirements, though I'm a little unsure how much it is willing to do. If the ABA wants schools to turn over a bunch of call and email logs and follow up, fine with me. There is some nontrivial room for this game of "telephone" to go somewhat awry (no one can ask the same question twice), but in principle this should be doable.
Posted by: Adam | December 10, 2014 at 10:05 PM