Well, this is like catnip to faculty lounge readers. I see that Joel Kupfersmid has just posted an article on SSRN, "What Works to Increase Law Schools' Prestige and Their Graduates' Passing the Bar: Better Students or Better Faculty?" Cribbing now from Kupfersmid's abstract:
This study asked two questions about the relative influence of student capability (as measured by LSAT scores) and faculty expertise (as measured by citations in law journals for faculty publications) for increasing a law school’s prestige (as measured by ranking in U.S. News) and passage rates on the bar examination for their graduates. Likewise, several shortcomings in the previous literature were addressed: (1) researchers have either investigated the relationship of student understanding of the law to prestige or examined faculty expertise to this outcome, but none explored the effects of one of these predictors with the effects of the other removed (partial correlation), (2) researchers have correlated various student measures to bar passing rates for law schools across the country but this presents interpretative difficulties because the types of tests given for each bar examination, and the scores needed to pass, have considerable variation across jurisdictions, and (3) several studies have assessed the influence of faculty scholarship to prestige, but no study has assessed the influence of scholarship to bar passage rates. The results of this study indicate that prestige is likely a function of the reciprocal relationship between student capability and faculty expertise. To determine which came first, better students attracting more well-known professors or well-known professors attracting better students, is a chicken and egg problem. With respect to passing the bar, the analysis indicates faculty expertise is more influential than student capability in promoting higher passing percentages, at least in California and New York. Based on these findings, increasing the number of faculty with recognized expertise in an area of law will raise a school’s prestige at least as much as encouraging students with high LSAT scores to enroll, and will have the added benefit of increasing the percentage of graduates passing the bar. This recommendation does not apply to law schools where bar passage rates are very high or where a high percentage of professors eminent in law are already in the department.
I hope to say some more about this once I've had a chance to digest the paper.
A paper of this sort is at very best a distraction. It asks whether incoming students' LSAT scores or the citation count for faculty scholarship is a better predictor of a school's bar passage rate. I suppose I can understand why someone would look at those two factors: both can easily be counted and both are publicly available.
But neither of them looks at what you'd hope would be the most important determinant of student success: what the students do in law school for three years, and how the faculty teach them.
Suppose this were a study asking which NHL hockey team was likeliest to win the Stanley Cup. To answer the question, the researcher looked at two factors: the plus/minus score the players racked up on their high school squads, and the number of times the coach was mentioned on ESPN. The correlations might be interesting in some esoteric way, but I doubt a bookie would put much stock in the study when placing bets.
Isn't that what this study is doing?
Posted by: Eric Muller | March 20, 2013 at 09:50 AM
There are a few seemingly obvious questions that the paper doesn't address:
First, what is the correlation between uGPA and bar passage rates? The paper looks at the correlation of LSATs to bar passage, and faculty citations to bar passage, but it seems like uGPA is a very big piece of the equation.
Second, does the correlation between faculty citations and bar passage change when you look at the areas of study the professors write in? The argument put forth in the article is that being heavily cited is an indication of being a good professor, and good professors tend to produce better learning outcomes for their students, which in turn translates into better bar passage rates. If we buy this argument (and there is good reason to doubt it), we should see that hiring professors who have lots of citations to their Law and Nietzsche papers does not have as big an impact as hiring professors who write on criminal procedure.
Finally, it looks like faculty citation ranking is just based on the total number of citations collected by the school, which gives an obvious advantage to large schools. More profs = more papers = more citations. What should matter, if citations indicate better teaching, is the citation:student ratio.
Posted by: Derek Tokaz | March 20, 2013 at 10:05 AM
How could the number of citations have anything but the most remote bearing on quality of teaching?
Posted by: Eric Muller | March 20, 2013 at 10:34 AM
I do not think bar passage rates themselves are a very good measure of the school, except at the bottom of the class. I am putting aside California, where a lot of people flunk, and a few schools with really low admissions standards like Cooley. In most states, at most schools, most students pass, and we do not know by how much. Look at the results in New York -- even the worst performing schools this year -- NYLS at 70% and Touro at 74% -- saw a strong majority of their graduates pass the exam. The surprising thing is that anyone who can graduate from a T-14 school cannot pass an exam that more than 70% of the grads of these schools can pass; yet every year some -- perhaps 5% to 10% -- do.
Posted by: Ganger | March 20, 2013 at 10:49 AM
Careful Derek, you want to watch what you say about Law and Nietzsche papers - you might be annoying he who must not be pissed off (next thing there'll be all sorts of nasty postings about you and your deficiencies as a scholar, how you should be subject to post-tenure review, etc.). Personally I think Law & Nietzsche is of enormous relevance to teaching law and every student ought to be grateful to sit at the foot of someone who has mastered this tricky conundrum. How could a thorough knowledge of criminal procedure possibly contribute more to legal pedagogy than Nietzsche scholarship?
Posted by: Bob | March 20, 2013 at 11:53 AM
Eric,
Do you think the problem is that the quality of a professor's scholarship is too far removed from the quality of that professor's classroom instruction, or that citations are too poor a measure of scholarship quality?
Ganger,
The essay did discuss your point, though it's not the bottom of the class that's the appropriate place to look, but rather the 25-50th percentile where taking bar-related classes helps. The top half students don't need it (not that there's no value added, but there's no difference between passing and passing with style), and the bottom quartile are beyond help.
To your question of how 5-10% of T-14 students can't pass the bar exam, I think you've phrased the issue poorly. It's not that they can't pass, but that they didn't. That can come down to a lot of factors. There could be a lack of effort, not enough money to pay for a top-of-the-line prep class, perhaps the student has to work to pay the rent and doesn't have enough time to study, the student may be in a job where passing on the first attempt or passing at all isn't relevant (federal clerkship, business consulting), and some number of students will just have a bad day. I suspect many students from the T14 who fail the bar exam could pass it; they just didn't.
Posted by: Derek Tokaz | March 20, 2013 at 12:02 PM
Both, Derek, but more the former than the latter.
Posted by: Eric Muller | March 20, 2013 at 12:22 PM
"How could the number of citations have anything but the most remote bearing on quality of teaching?"
Good question. As I read the paper, the assumption is that more citations means more expertise, and more expertise means better teaching. But I don't know if there is any correlation between citations and expertise in the subjects a person teaches, and I suspect that there is only a modest correlation between expertise and an ability to teach a subject effectively.
Posted by: Orin Kerr | March 20, 2013 at 04:36 PM
More citations might mean more expertise, but the disconnect is that it's not necessarily expertise in the subject matter being taught in class. Law journal articles typically exist at the fringe of a subject area, while classes usually teach the core. It's a bit like judging a player's overall game by how well he performs in a slam dunk competition.
And of course, the number of citations is going to be influenced by the popularity of the subject matter you're writing on (more people writing on it means more opportunity to be cited), the prestige of the journal you're published in, and an echo-chamber effect around who the experts on the subject are.
To the question of what increases law school prestige though, faculty citations are probably spot on. 25% of a USN rank comes from peer reputation, and if you're getting cited a lot by your peers, you probably have a good reputation.
The real question, for people who care about improving the quality of legal education, is what measurable factors tend to lead to better teaching? Citations are likely an extremely weak predictor, though possibly stronger than the other go-to factors, like being on Law Review, or scoring a federal clerkship.
Posted by: Derek Tokaz | March 21, 2013 at 08:41 AM
Soon we all will follow the lead of the Aussies and create high-paid positions with the duty of propping up our reputations.
http://www.insidehighered.com/news/2013/03/20/australian-universities-dedicate-positions-working-rankings-groups
It is a wonder to behold what Mort Zuckerman has wrought with a low-quality and now defunct magazine.
Posted by: Bill Turnier | March 21, 2013 at 08:51 AM
None of this speculation addresses the study's finding that, controlling for median LSAT, on a school-by-school basis, number of citations *does* have a significant (.35) correlation with bar pass rates -- if that's true, we need to explain it. That said, now that I've read the study, the finding seems pretty dubious. The problem (as the authors recognize) is that median LSAT and professor citations are so highly correlated that it's hard to pull their effects apart. The authors find that, when you hold citations constant, there's no significant relationship between median LSAT and bar pass rates -- and that's a sufficiently dubious conclusion as to throw doubt on the whole enterprise.
Posted by: Jon Weinberg | March 21, 2013 at 11:45 AM
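[To make the partial-correlation point in the comment above concrete, here is a minimal sketch in Python. It is not the study's code or data; the variables med_lsat, citations, and bar_pass are invented placeholders. It simply illustrates the mechanics: the partial correlation between citations and bar passage, controlling for median LSAT, is the correlation of the residuals left over after regressing each of them on median LSAT.]

```python
# A minimal sketch (not the study's code or data) of the partial-correlation
# idea discussed above: correlate faculty citations with bar pass rates after
# removing the part of each that is explained by median LSAT.
import numpy as np

rng = np.random.default_rng(0)
n = 100  # hypothetical number of schools

# Invented school-level data, purely for illustration.
med_lsat = rng.normal(158, 5, n)                    # median LSAT per school
citations = 50 * med_lsat + rng.normal(0, 100, n)   # citations track LSAT closely
bar_pass = 0.4 * med_lsat + 0.002 * citations + rng.normal(0, 3, n)  # pass rate (%)

def residuals(y, x):
    """Return what is left of y after a simple least-squares regression on x."""
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

# Ordinary (zero-order) correlation between citations and bar passage.
r_zero_order = np.corrcoef(citations, bar_pass)[0, 1]

# Partial correlation, controlling for median LSAT: correlate the residuals
# of citations and bar passage once median LSAT has been regressed out of each.
r_partial = np.corrcoef(residuals(citations, med_lsat),
                        residuals(bar_pass, med_lsat))[0, 1]

print(f"zero-order r = {r_zero_order:.2f}, partial r = {r_partial:.2f}")
```

[Because med_lsat and citations are constructed to be highly correlated in this toy data, regressing out median LSAT leaves relatively little independent variation in either variable, which is essentially the difficulty the comment above points to in pulling the two effects apart.]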
"To the question of what increases law school prestige though, faculty citations are probably spot on. 25% of a USN rank comes from peer reputation, and if you're getting cited a lot by your peers, you probably have a good reputation."
I thought the USN evaluation of peer reputation was just a professor's view of the school's quality, though, which is generally going to be based on the school's USN rank. If that's right, the correlation is likely very modest. Most professors have no idea who is on a school's faculty, making it hard to assess the scholarly reputations of individuals in the field. And they don't know who is cited unless they happen to have memorized the results of studies specifically on that.
Posted by: Orin Kerr | March 21, 2013 at 02:23 PM
The most significant predictor of a law school's prestige is the prestige of the university it's associated with.
Posted by: David Bernstein | March 21, 2013 at 08:16 PM
Orin,
I'm sure there's quite a bit of echo-chamber effect going on between USN and the peer reputation, but I doubt that accounts entirely for the reputation score.
While professors might not know the vast majority of a school's faculty, there's a good chance they'll know where a lot of the brand-name professors are. Do you know where Arthur Miller teaches? Cass Sunstein? Mark Lemley, Harold Koh, or Deborah Rhode? How many professors, other than the ones you know personally, would you hear about if they moved to another school?
The number is going to be very small, and they're not at all going to be representative of the overall faculty, but they are going to bring in the bulk of a school's citations. A big name could have several hundred citations, in some rare cases several thousand, while I bet most professors struggle to crack 10. Tracking citations is going to be a lot like tracking prominent scholars, and it wouldn't be at all surprising for those scholars to form a large part of the basis for the peer reputation ranking.
It would be interesting for someone to study how much USN peer scores correlate with previous peer scores, with previous USN rank, with total citations for professors, and then with placement on an SSRN recent top 10 list, to see if professors producing currently popular articles raise a school's reputation.
Posted by: Derek Tokaz | March 21, 2013 at 11:43 PM
My strong suspicion is that US News rankings (and maybe LSATs) affect peer reputation more than anything else. Before US News rankings I just looked at LSAT scores myself.
At least that would be true if someone asked me to do the ranking; I suspect that at plenty of schools I could not recall the names of any professors; at some schools I might recall one or two whom I know personally (excepting of course schools I have taught at, and maybe some of the top schools).
Posted by: ML | March 24, 2013 at 01:01 AM