
December 10, 2014




Sounds reasonable.

How would you propose disclosing the fact that a law school "admitting students with lower entrance credentials than they had accepted in the past might have little, if any," information in a particular respect?

Also, I missed the "of the total, X reported" qualification. Perhaps there is a way to make that more clear ...?


I pretty much view law schools as carnival barkers with access to federal lending that would make a subprime loan originator envious, but I don't see the point of this one.

I'd rather the ABA get rid of the JD Advantage category, which is misleading and ripe for abuse. The ABA could and should collect and publish first year salary data.

David Frakt

Anon -

If a school had not previously matriculated students with such low numbers, then the numbers would be 0. The school would still have to report the data, but it would inform the prospective student that no student with similar entrance credentials had been admitted. I also updated my post to add one more piece of data - the number of students currently enrolled with similar entrance credentials. So, a school that had recently lowered its admission criteria might report 20 or 30 students currently enrolled with similar credentials, but might have 0 in the number of graduates or number who have passed the bar column.

Former Editor

There are two areas that I'd like to see improved to have greater transparency and provide more useful information:

(1) Incubator programs.

The relationship of "incubator programs" to the employment categories in the Employment Summary Reports is not all that clear or uniformly reported. Ideally, I'd like those types of term positions to have their own category. I strongly suspect that schools are reporting the participants in these programs as FT/LT solo or 2-10, which is pretty misleading in my view.

What I'm told is that schools are supposed to list participants in those programs as employed by the school. But when I compared the Employment Summaries with the ABA list of programs, that does not seem to be happening in more than a few instances (i.e., schools are reporting classes of incubator participants that are larger than the number of those reported as employed by the school). There are also a few programs that, while not technically run by the school, are run by not-for-profits obviously associated with the school (perhaps to avoid having to report these students as employed by the school?).

(2) Adding teeth to the Audit Procedure.

I've already complained on this site and elsewhere about how the ABA's new procedure for checking a school's employment reporting is inadequate to detect systemic fraud by a law school. I'm not going to repeat all that here but if you want to find my arguments on the audit requirement just google "'former editor' + 'systemic fraud'" and it will pop up right away. That procedure needs to be substantially beefed up and, until it is, some schools' data (particularly those with a rap sheet, so to speak) should be viewed skeptically.


Interesting points; I also think we have to place the LSAT scores in context. The reported scores are not based on an absolute scale; they are curved. That means that LSAT scores as a whole do not change much even if, on average, less qualified students are taking the test. The bar exams, of course, are not curved. I would suspect the next few bar exams will show more passage rate drops for 150+ LSATers.

David Frakt

I agree that the reporting of employment data could also be improved, but I don't consider that to be as important as having accurate information available on attrition, graduation, and bar passage rates. After all, a law school is not a guarantor of employment, and employment figures are largely driven by economic factors that are outside the control of the law school. But the quality of the educational program is within the control of the law school and the data to assess the quality of the program should be available. One law school might have a 5% attrition rate and an 80% first time bar pass rate for students with a 150 and 3.0, and another law school might have a 15% attrition rate and a 60% first time bar pass rate for students with the same entrance credentials, but right now, there is no way for a prospective student to know this. My proposal would at least remedy that problem.


Interesting proposal. Not sure if you're soliciting more, but one item that I'd like to see concerns the ABA certification of LSAT/GPA entering class stats. In response to calls for oversight, two years ago the ABA began certifying law school stats for entering classes. Oddly though, they made the certification process optional. From what we've been told, almost every law school participates, but not all of them. So some schools have strangely refused to allow the ABA to certify their results. Which schools? The ABA won't say.

I'd suggest requiring law schools to indicate whether their numbers have been certified by the ABA. They could put this on the 509 report, their website, or ideally, both.


David, great question, although I think there is a simpler answer. In a perfect world, a prospective law student would have the following information available:

1. LSAT/GPA scores of all first year students. At present, you can only find the middle 50%. It is important to know the top 25% if you need to be toward the top of your class to land the job you want (e.g., Fordham, GW). If you came off the waitlist, it is good to know the bottom 25% of the class to gauge your odds of beating at least some of the competition.

2. Scholarship info for all incoming students. This will provide maximum bargaining power to the prospective students. This info should be cross-referenced with LSAT/GPA so each prospective can know what they are worth.

3. Employment outcomes for every single graduate (names redacted). This means actual name of employer, duration of employment, salary info, etc. The summaries are too deceiving. A school can have 3 biglaw grads, and claim a $160,000 median private sector salary because only 5 people got private sector jobs.
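JM's point about summaries can be illustrated with a toy calculation (all numbers invented for the example): if only five of a large graduating class report a private-sector salary, three BigLaw hires are enough to set the published median.

```python
from statistics import median

# Hypothetical reported salaries: 3 BigLaw grads at $160,000 plus 2 others,
# out of a much larger class whose salaries went unreported.
reported_salaries = [160_000, 160_000, 160_000, 55_000, 50_000]

print(median(reported_salaries))  # prints 160000
```

The school can then truthfully advertise a $160,000 "median private sector salary" even though the figure describes almost no one, which is why JM wants the underlying records rather than the summary.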



This is an interesting proposal. With a couple quibbles, it makes sense to me.

First, I think you might overstate the likelihood that schools already have such granular information at their fingertips. Some schools undoubtedly know how their 140/2.6 students are faring, but as you know schools aren't at this time required to analyze data in such detail for reporting purposes, which probably consume 95% of schools' data analytics efforts. That said, the info is there, waiting to be mined. It would be instructive.

Part of me wonders whether your proposed calculator would tell prospectives more about the school's qualities, or their own. It's unlikely that the bar passage rates of entrants with LSAT scores in the 170s or the 140s vary much from school to school. However, I could be wrong about the lower end; some schools might have very effective and involved academic support programs that would distinguish them from the pack of schools with similar 25th percentile stats. Still, I think the greatest value of your suggestion would be at the aggregate level. Perhaps this concern could be met by including an "aggregate calculator," so that the school-specific data could be easily compared with selected other law schools, or all law schools.

FE: I did find an earlier post of yours on systemic fraud. My feelings are mixed. Law schools have not traditionally been expert at survey research. It consumes an astonishing amount of staff time to do even "back of the envelope" data gathering and reporting. This might actually be an argument in favor of some more rigorous auditing, but I do not think that is a realistic demand for most individual law schools. If I had a magic wand, I'd put consortiums of schools together for the purpose of hiring outside firms to conduct employment surveys. Short of that, I'm wondering what your suggestion is for requiring auditing prior to the ABA triggers (2% error rate or credible allegation of fraud). Thanks.


Other easy and useful points of consumer data that schools could disclose if they actually cared about students:

1. Mean and median debt at graduation. (Note that this will include accrued interest and not just originated debt).

2. Mean and median tuition & fees actually paid per annum. (Since 2007, we've converted to an MSRP-with-secret-discounts model. How about a little transparency on pricing?)

3. Median salary info for graduates.

David Frakt

Certified -
Great idea.

JM -

1. Schools already report the 25th, 50th, and 75th percentiles for both LSAT and UGPA.
2. I'm not sure why prospective students should have "maximum bargaining power." What they need is to have adequate information to reasonably evaluate the value of the education they are being offered.
3. While greater employment data and salary information would be useful, your proposal would raise serious privacy concerns in publishing the salaries of individual graduates. Even though the names would be redacted it would be very easy to figure out which students had gone to which employers and how much they were making. The ABA does require a separate employment report but that is not the subject of this post.

Adam -

It may be that the data shows that outcomes are very similar for students with similar credentials. That would also be very useful for a prospective student to know. Many law schools claim to have some secret sauce that makes their school a better choice. The actual data might confirm this, or it might prove otherwise. If the chances of graduating and passing the bar were basically the same, then the student could decide among law schools based on other factors such as location, price and employment prospects.



1. I said scores for "all students" so that students know scores within the 1-25% and 75-100% bands. In a lot of schools only the top 10% or 20% get desirable jobs, so prospectives need to know if they are realistically competitive for that placement. Same concept for those seeking to avoid being at the very bottom of a class in a school they were fortunate to be admitted to.

2. Schools jack up tuition to astronomical levels with the purpose of price discriminating for each applicant. More transparency would balance the power between schools and students and promote fairness.

3. So don't include the names of the employer but rather the size/location of the firm. The point is to avoid any opportunity the school has to summarize. Summaries always lead to deception.

Former Editor

"Short of that, I'm wondering what your suggestion is for requiring auditing prior to the ABA triggers (2% error rate or credible allegation of fraud)."

The most important thing I would change would be to essentially combine levels one and two of the random school review and begin the outside verification of the information in the files at the first step of the audit. My primary concern is that, under the current set up, there's no way to detect whether a school is misrepresenting (or otherwise messing up the reporting of) employment outcomes unless the school also leaves its file in a lousy state of completeness. In other words, so long as the information in the file fills out all the boxes in terms of kinds of things that must be gathered, the ABA presumes the information "to be complete, accurate and not misleading in the absence of credible evidence to the contrary." As you noted, law schools are not experts at gathering survey information and the whole reason for the protocol is the history of law schools disseminating misleading employment information. Any protocol worth the trouble, then, needs to "trust but verify" rather than just "trust," as the current protocol does on step one of the random school review.

I don't really see how starting the outside verification of information at level one is all that burdensome on either the law school or the auditors (if the school proves to be accurately reporting, that is). For a school with a graduating cohort of 200, we are talking 40 phone calls or emails to confirm the information in the file with a level 2 review. For much larger schools this could be more burdensome, but (1) that's the school's choice for admitting large classes, and (2) downward adjustments could be made in the percentage of the class polled when schools reach a particular size (e.g., 15% of the class checked for a school of 400).
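The sampling arithmetic above can be sketched as a simple rule; the 20% baseline, 15% reduced rate, and 300-graduate cutoff are taken from (or assumed to fit) the figures in the comment, not from any actual ABA protocol.

```python
def audit_sample_size(cohort, base_rate=0.20, large_rate=0.15, cutoff=300):
    """Number of graduates to contact in a verification pass.

    Assumed rule: 20% of the class by default, scaled down to 15%
    once a class exceeds the (hypothetical) cutoff size.
    """
    rate = base_rate if cohort <= cutoff else large_rate
    return round(cohort * rate)

print(audit_sample_size(200))  # prints 40  (the "40 phone calls or emails")
print(audit_sample_size(400))  # prints 60  (15% of a 400-graduate class)
```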

I would also remove the word "credible" from all of the places requiring "credible evidence" to get to a level two review under the random school review, assuming the above suggestion isn't adopted.

Last, I would designate some specific ABA officer to be responsible for determining and issuing a written conclusion as to the credibility of an accusation made that, if believed, would trigger a red flag review.



I understand where you're coming from, but compliance work often looks just like this: You have to document the steps you're following (which ideally add up to a menu of "best practices"), rather than prove every item was in fact filled out correctly. LSAC auditing of every single LSAT and UGPA data point works because it's extremely easy and cheap - they already have this info, and merely need to sort students into the schools they matriculated at. That's not the case with what you have in mind.

I think you're pretty optimistic about what those 40 calls or emails will reveal. Half of them will not be answered, which - depending on your perspective - will either raise a red flag (are these numbers faked?) or a shrug (people hate surveys). I know on an earlier thread, you objected to law schools blaming their graduates for failing to answer these surveys. Respectfully, that is a very significant problem here, and while I think law schools should disclose the number of respondents (so that a "90%" employment figure has context), they are not ultimately responsible, in my view, for their graduates' willingness to assist future law students (and the school itself). The response of most people to being asked the same boring questions a second time is to hang up or ignore the call. That's why I'm more sympathetic to a consortium approach: I think this task is more expensive and time-consuming than you suggest, so I prefer to distribute the cost of surveying over a larger group. Ultimately, that might get us closer to agreement here, though we probably disagree on whether every law school should have an obligation to hire an auditor in the absence of a consortium like that.

Again, the answer here turns on what one thinks the problem is. I'm not aware that faked - as opposed to dramatically incomplete - numbers have generally been regarded as the problem on the employment side of law school disclosures.

A small point: are allegations that do not provide credible evidence appropriately regarded as actionable "evidence"? Are you saying that a letter to the ABA that reads, "I think School X mis-counted BigLaw graduates because I know everyone who got a BigLaw job from that school," should trigger a red flag review?

Former Editor


I guess I'm a little confused by your consortium idea (or at least why you think something like it isn't already effectively happening through the ABA's role in the process). My reading of the protocols is that no outside auditor is required until you get to a level three review. Prior to that, the checking and evaluating (which I've somewhat sloppily been calling "auditing") is being done by ABA employees (although schools are required to turn over the information they have to those employees). And if a school's files are in poor enough shape to manage to earn a level three review, I think they should have to spend the money on auditors. The cost alone should provide some incentive for schools to at least pretend to maintain accurate files.

Regarding response rate, the protocols specifically say that "The reviews will take account of the facts that: it will not be possible to contact some graduates; a graduate might intentionally report false information; and a graduate’s employment status might have changed since the time data were collected but the school did not know or have reason to know of the change." The upshot of this is that for a school to trigger the next level of review the ABA would probably need to have several graduates telling their reviewers that what was reported was materially inaccurate, rather than the reviewers being unable to reach some portion of the class.

As to your small point, red flag review can be triggered by a "credible allegation", level 2 review can be triggered by "credible evidence" presented during a random school review. They are different reviews, the latter being (potentially but not necessarily) far more serious.

As you know, I think level two review should happen automatically during a random school review anyway. It should happen even more so if there is any evidence calling into question a school's files during a random school review.

Regarding a potentially far more serious red flag review, I do think (and said above) that a credibility determination should be made. Should an unsubstantiated email trigger a full red flag review? Probably not, but there should be some specific person responsible for making the credibility call in a transparent way. Your hypothetical letter probably wouldn't cut it and a decision not to proceed on that basis would be pretty easily explained and defended.

Former Editor

I meant to say the former above. To be clear, a red flag review is potentially, but not necessarily, more serious than a level one or two review.

James Grimmelmann

Rather than have each school publish a calculator on its own website, it would be better to have each school report the underlying data to the ABA in a standardized format and have a single calculator on the ABA's website. Each school would have to have a conspicuous and accessible link on its website to the ABA calculator (with that school already selected from the appropriate drop-down menu), just as schools must currently have conspicuous and accessible links to their Standard 509 disclosures. There is no reason to program one of these things more than once, or to have to integrate one with each school's own server software. A single centralized calculator would also make it easier for students to compare their prospects at multiple schools.
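A minimal sketch of the centralized-calculator idea might look like the following; the record schema, band labels, and all outcome figures are hypothetical, not an actual ABA reporting format.

```python
# Hypothetical standardized records each school would file with the ABA:
# (school, LSAT band, UGPA band) -> outcome counts. Figures are invented.
RECORDS = {
    ("School A", "145-149", "2.5-2.9"): {"matriculated": 30, "graduated": 18, "passed_bar": 10},
    ("School B", "145-149", "2.5-2.9"): {"matriculated": 25, "graduated": 22, "passed_bar": 18},
}

def bar_pass_rate(school, lsat_band, ugpa_band):
    """Bar passage as a share of matriculants with similar entrance credentials."""
    r = RECORDS[(school, lsat_band, ugpa_band)]
    return r["passed_bar"] / r["matriculated"]

def compare(lsat_band, ugpa_band):
    """One centralized query covers every reporting school at once."""
    return {school: bar_pass_rate(school, lsat, ugpa)
            for (school, lsat, ugpa) in RECORDS
            if lsat == lsat_band and ugpa == ugpa_band}

print(compare("145-149", "2.5-2.9"))
```

Because every school files the same fields, a prospective with, say, a 147/2.7 makes one query and sees each school's rate side by side, which is exactly the comparison a per-school calculator makes tedious.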

David B

Providing this data would mean in practice (though only in part) explaining to most affirmative action admittees that they have a very low chance of ever becoming lawyers. The ABA would have a conniption if a law school did this, because the ABA doesn't care how many minority students graduate and pass the bar, only how many matriculate, and therefore they don't want consumers to have this information.

David Frakt

David B -

It may well be that some "affirmative action admittees" would find out that students with similar entrance credentials have not fared particularly well at the school to which they have been admitted (or perhaps at any law school), but I'm not sure that this would cause the ABA to "have a conniption." The data that I am recommending be disclosed is race neutral, and indisputably useful consumer information. If some students with poor aptitude for the study of law were dissuaded from attending law school because they were provided with data that enabled them to make a more realistic assessment of their chance for success, I think that would be a positive development, even if some of the students who happened to be dissuaded were minorities. I am sure that there would also be majority students who would choose not to apply or not to accept an offer of admission once they found out what a risky proposition law school is for students with poor predictors. Being against providing consumer information would be a tough position for the ABA to defend.


Thanks, Former Editor. I think I understand your position better now.

I don't think I object to making the ABA help law schools comply with their reporting requirements, though I'm a little unsure how much it is willing to do. If the ABA wants schools to turn over a bunch of call and email logs and follow up, fine with me. There is some nontrivial room for this game of "telephone" to go somewhat awry (no one can ask the same question twice), but in principle this should be doable.
