School has started, and I have woken from my summer slumber. Unlike many colleagues, I get a lot less work done in the summer (when the kids are out of school) than I do during the school year. They haven't thrown me out of the lounge yet, so I thought I would post about a few things in the coming weeks.
First up: clinics and employment. I discussed Jason Yackee's paper on the topic in February. Now, Robert Kuehn (Wash. U. - St. Louis) is publishing an essay that refutes the paper (in the Wisconsin L. Rev., where Yackee teaches, which I found interesting).
Here is the abstract:
This Article examines evidence of a possible link between learning opportunities in law school and J.D. employment outcomes. It responds to a paper by Jason Yackee that finds, using 2013 data from top 100 ranked schools, “not much evidence” that law clinic opportunities are likely to improve a school’s graduates’ employment outcomes and suggests that those opportunities may even harm employment prospects.
The Article reexamines Yackee’s methodological approach and then looks beyond both law clinics and his statistical models. The expanded empirical analysis finds it is not possible to draw any reliable conclusion from his models about the likely effects of law clinic courses, or other activities like law journal and interschool skills competitions, on employment outcomes, and surely not any negative suggestion about clinic opportunities or participation. The most realistic conclusion from available data is that nationwide models provide inconclusive results, as they do not achieve statistical significance and yield both positive and inverse relationships depending on the year of graduation, control variables, and outliers. In fact, other evidence shows that law clinic experiences are important to potential employers and do aid some students in securing employment.
To put it mildly, this is about as thorough a critique of an empirical work as I've seen, going after the data, the methodology, and the results on a point-by-point basis.
Of particular interest is that if you replace "clinics" with "law review" or "moot court," you also find that schools with more participation do not show better employment outcomes. While I think this is a helpful test of Yackee's methodology, I don't think it undercuts Yackee's findings as much as Kuehn does. Even if law review improves job chances intra-school (which it surely must), there's no obvious reason why it should make a difference inter-school. That is, more law review participants at the school ranked 100 need not mean more employment than fewer law review participants at the school ranked 10 if rankings really are driving employment outcomes. Remember that we are measuring employment for the class as a whole, not just for those on law review. Thus, even if employers are looking for it, law review participation might just wind up being a non-significant factor when compared to ranking in the aggregate.
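To make that aggregation point concrete, here is a toy simulation with my own illustrative numbers (not Yackee's or Kuehn's data or models). It assumes a world where employment is driven almost entirely by a school's rank while law review membership gives each participant a real individual-level boost; a school-level regression of employment rate on law review participation still turns up little, because the per-school signal is tiny relative to rank and sampling noise.

```python
import numpy as np

rng = np.random.default_rng(0)

n_schools = 100
grads_per_school = 200
ranks = np.arange(1, n_schools + 1)

# Assumed data-generating process (illustrative only): baseline employment
# probability falls smoothly with rank, law review adds a fixed individual
# bump, and the share of each class on law review varies across schools
# independently of rank.
base_prob = 0.95 - 0.45 * (ranks - 1) / (n_schools - 1)  # 0.95 at #1, 0.50 at #100
lr_boost = 0.05                                          # real individual-level benefit
lr_share = rng.uniform(0.05, 0.25, size=n_schools)       # fraction of class on law review

emp_rate = np.empty(n_schools)
for i in range(n_schools):
    on_lr = rng.random(grads_per_school) < lr_share[i]
    p = np.clip(base_prob[i] + lr_boost * on_lr, 0.0, 1.0)
    emp_rate[i] = (rng.random(grads_per_school) < p).mean()

# School-level regression of employment rate on law review share and rank,
# mimicking a cross-school design rather than an individual-level one.
X = np.column_stack([np.ones(n_schools), lr_share, ranks])
coef, *_ = np.linalg.lstsq(X, emp_rate, rcond=None)
print("intercept, law review share, rank:", coef)
# Rank accounts for nearly all of the cross-school variation; the law review
# coefficient is small and noisy, even though every participant genuinely
# received a 5-point boost in this simulated world.
```

In that toy world, the law review coefficient bounces around zero across seeds, which is the kind of aggregate non-result I'd expect even where participation genuinely helps individual students.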
Kuehn might say that this is not evidence against law reviews. Yackee would say that money might be better spent trying to move from 100 to 10. If you believed that you could move your ranking like that (which I don't), then both would be right. (UPDATE: I should note that Kuehn's application of the model to law review is consistent with my posited world -- that ranking may make a difference if there is a big jump, and that the model does not tell us much about the value of law review, even though it should.)
The better way to answer the question about the employment value of clinics -- which I suspect both would agree with -- is to look at data at the individual level. Unfortunately, collecting that kind of data is pretty difficult.
In all events, I suggest this paper as a refresher on dealing with thorny but relatively easy-to-digest empirical issues and methods.