I was going to comment on Profs. Harrison and Mashburn's draft Citations, Justifications, and the Troubled State of Legal Scholarship: An Empirical Study, but Al Brophy beat me to it. They note that law professors will instinctively push back against their results. Count me among them, at least as to some of the normative conclusions.
Interestingly, I agree with most of their empirical findings as well as many of their suggestions for improving the system. For example, it's not shocking to me that citations are higher to articles in higher-ranked journals. To me, this is just an instantiation of the Matthew Effect. I've heard many argue that the better law reviews get cited because the articles are better, but the findings here, that most of the citations are for "facts" and not "engagement," belie that argument. (As Harrison & Mashburn acknowledge, one can quibble with some of their classifications.) The fact that judicial citations are more evenly distributed across journal ranks also implies that the best articles (or at least the most useful) are not necessarily in the higher-ranked journals.
But where I diverge from the authors is on what this all means. They take it to mean that citations are a bad way to assess scholarly value. I'm good with that. They also take it to mean that the scholarly enterprise is probably not worth the investment. (We'll leave aside their calculation of the investment; I think they overestimate the marginal cost.) I'm not sure that their results lead to that conclusion, for a couple of reasons.
First, I believe they understate the value of articles read but not cited. There are many uses of articles by professors and lawyers that never wind up being cited anywhere, though this surely varies by subject matter. As the draft notes, SSRN downloads (and abstract views) far outstrip citations. These, too, are based on prestige, but that doesn't mean the articles being read have no value. Some ways influence might occur:
1. People might be influenced by ideas and just not cite them. Given the citation patterns found in the study, this is not out of the question.
2. Professors are teaching students, our future lawyers. Academic discourse and scholarship will affect what and how we teach subjects.
3. Scholarship might be used in legislative reform debates but never get cited.
4. Practitioners may read it and make arguments.
Of course, none of these will apply to every article, but they represent value not measurable by citations. Indeed, it is odd that the article first argues that citations are not a good way to measure the value of scholarship and then uses lack of citations to argue that scholarship is valueless.
Second, and really the crux of my normative differences, I don't know that the authors have exhausted all the reasons for legal scholarship that they reject as unpersuasive. The article starts from a prior belief that scholarship has to be useful for something in order to be worthwhile. Indeed, the notion that scholarship might be of interest only to other scholars is viewed with derision. Maybe this just bucks against the current "law school is a trade school whose only value is to get jobs for students" view, but I see such a limited view of legal scholarship as exceptionalism. The reality is that citation rates vary widely by discipline. Engineering sciences are right in the middle; maybe we should tell the engineers to stop seeking all those government grants because no one ever cites their work. I've read hundreds (thousands?) of computer science articles. Many were fascinating, and many contained novel applications. Most did not disclose something that became commercial software. Perhaps they should just stop trying. And don't even ask about the humanities, with the worst citation rates of all (well below law's, I think). It seems that either you buy into the view that scholarship is worth doing of its own accord, or you don't. I'm sure there are plenty of people who would happily jettison humanities scholarship.
In the end, this article is convincing proof of the failure of its own normative position. It is a well-developed and well-executed empirical study that tells us something about the world that we didn't know. It might even be cited for those facts by someone talking about legal scholarship. But its normative angle will likely never be cited in a legal opinion. And yet it was worth the effort anyway.
Michael,
I'm not sure I'm with you all the way through, but your last paragraph is really fantastic. Great point.
Anon
Posted by: anon | March 03, 2015 at 02:24 PM
"Indeed, the notion that scholarship might be of interest to only other scholars is viewed with derision."
I think what the article views with derision is the idea that scholarship that is of interest to only other scholars should receive the level of subsidization that it does.
"The reality is that citation rates vary widely by discipline. Engineering sciences are right in the middle; maybe we should tell the engineers they should stop seeking all those government grants because no one ever cites their work."
"I've read hundreds (thousands?) of computer science articles. Many fascinating, many containing novel applications. Most did not disclose something that became commercial software. Perhaps they should just stop trying."
I know this was tongue in cheek, but I don't really see what you're getting at. People outside academia find scientific inquiry worth paying for. You can't say that about legal scholarship. When faculty in the sciences/engineering start expecting their students to subsidize their research you might have a point.
Posted by: John Jacob | March 03, 2015 at 04:05 PM
(http://phys.org/news/2014-12-scientific-citation-nobel-laureates-papers.html)
Something you might find interesting.
Posted by: John Jacob | March 03, 2015 at 04:06 PM
"People outside academia find scientific inquiry worth paying for. You can't say that about legal scholarship."
This statement might surprise the legal scholars who have received the MacArthur genius grant (among many, many others who have received funding from a variety of sources.)
Posted by: Um... | March 03, 2015 at 04:18 PM
Seems like you wanted to say something but really could not think of anything. As I read it, they concede that articles may tell us something about the world but question whether everything law professors tell us about the world is worth it. Did you miss that? Plus, as another commentator has pointed out, they have underestimated the cost. More likely, your post makes their point. Good day, sir!
Posted by: Blake | March 03, 2015 at 04:59 PM
"This statement might surprise the legal scholars who have received the MacArthur genius grant (among many, many others who have received funding from a variety of sources.)"
Fine. That isn't absolutely true. But the point is that external funding of legal scholarship is, at best, a drop in the bucket (compared to the $250k/yr engineering professors bring in).
Do external sources of funding account for more than a couple percent of the support for legal scholarship?
Posted by: John Jacob | March 03, 2015 at 05:16 PM
Your point on marginal costs needs an explanation. They are all marginal in the long run, so you must mean an article-by-article analysis in which the marginal costs and benefits of an article are compared. If so, the article's conclusion that there is a surplus is correct. Actually, quite obvious. What is also obvious is that law profs will not self-regulate.
Wrt John Jacob's comment: I am sure you are right. The logic of using an exception to disprove an inconvenient truth is typical law prof reasoning. Think of the global warming doubters who note that it was really cold today.
Posted by: Anon | March 03, 2015 at 05:58 PM
"Indeed, it is odd that the article first argues that citations are not a good way to measure the value of scholarship and then uses lack of citations to argue that scholarship is valueless."
One can easily say that citations in academic journals are a poor proxy for value while also believing that citations in judicial opinions are an excellent proxy for value.
"First, I believe they understate the value of articles read but not cited. There are many uses of articles by professors and lawyers that never wind up being cited anywhere, though this is surely subject matter based. As the draft notes, SSRN downloads (and abstract views) far outstrip citations."
Not sure this is an argument the legal academy will want to advance. For starters, SSRN downloads are themselves not a great proxy. There are other ways to read articles; not everyone distributes their SSRN link as the main way of getting an article out; not every article that is downloaded ends up getting read; and not every article that is read is thought well of by the reader (you could have the 50 Shades of legal papers: something widely read, and just as widely condemned as junk).
But the bigger reason the academy might not want to say "you have to look at how many people read these things!" is that not that many people read these things. With not quite 1,500 total downloads over 3 papers (far fewer than Risch's 12,800!), I'm at roughly the 93rd percentile for all authors on SSRN. A lot of you lugheads need to catch up. I'm not going to name names, but two TFL contributors who have been at it for a decade haven't even cracked the 1,000 download mark.
And of course, if we started using SSRN downloads more seriously in evaluating the value of papers, we'd just see a lot more gamesmanship. If you teach a 100 person 1L course each semester, you assign a couple papers written by you and your friends, and suddenly you're at the top of the heap. Don't even need to actually discuss the papers in class.
But despite all the problems with the various proxies, they seem to keep coming up with the same conclusion: while some scholarship is very valuable, the vast majority of it is of marginal or no value. Do we really want to sink half a billion dollars (maybe a bit more) into that every year? Or, to put it another way, would we be better off spending our money on something else, such as slashing tuition by a quarter or dramatically reducing the justice gap?
Posted by: Derek Tokaz | March 04, 2015 at 09:30 AM
All - these are thoughtful comments. So much so that I'll probably address them in a followup post in the next couple of days!
In brief answer to Derek Tokaz, though, I don't mean to say we should judge value by SSRN downloads and abstract views. I'm just saying that influence likely extends beyond the citations. Even your 1500 downloads is likely more than you've been cited, no? That's my only point, not that SSRN should substitute.
Now, whether THAT'S enough to "sink half a billion dollars" into is another story, which I'll get into later.
Posted by: Michael Risch | March 04, 2015 at 09:35 AM
Michael,
Of course I have more downloads than citations. I'd hope that anyone citing an article would have at least downloaded it! I've got something like 12 or 13 citations. No clue if that's a particularly good number or not. (And I'll note SSRN is poor at tracking citations. I counted mine manually.)
As for not judging articles by SSRN downloads, sure. But, judging them by how widely read they are? That seems more reasonable, and if it is, then SSRN downloads may be a good proxy. And if we're not going to use SSRN, what evidence is there that anyone is reading these things?
It seems like the current thinking of law schools is "We know that some of these things are truly of great value, and we can't prove for certain that the rest are not, so let's throw as much money at it as we can." Personally, I think the burden of proof ought to run the other way. Want to only teach part-time because you're doing scholarship? Demonstrate its value.
Posted by: Derek Tokaz | March 04, 2015 at 09:48 AM
Yes, we are all such philistines for suggesting that legal scholarship should have some demonstrable use, especially for those who put themselves in massive debt to attend law school.
Law is unique among graduate programs that aren't medical or dental school for its cost of attendance. For example, your school, the Villanova University School of Law, had a total cost of attendance of $237,923 for its graduating class of 2013. Maybe it could charge successive classes less by subtracting from tuition the amount that would ostensibly pay for professors' non-instructional research time, replacing it with individual professors' Kickstarters or tip jars scattered throughout the law library. And then, once tuition subsides to the point where it's only 150% of what law school cost to attend in 1980, perhaps we philistines will be far more restrained in our criticism of your impractical but fiercely beautiful adventures of the mind and human spirit.
Posted by: John Thompson | March 04, 2015 at 09:49 AM
You, like the other commenters, are showing your priors. I'll discuss this later.
But that said, you can believe that SSRN reads are ALSO not enough to show influence. I don't have evidence of that. My point is only that citation counts, as an absolute measure of influence, fall short. I don't see that as contestable.
Posted by: Michael Risch | March 04, 2015 at 09:52 AM
Some interesting excerpts from “Citation Statistics: A report from the International Mathematical Union (IMU) in cooperation with the International Council of Industrial and Applied Mathematics (ICIAM) and the Institute of Mathematical Statistics (IMS)," published Jun. 12, 2008:
“The validity of statistics such as the impact factor . . . is neither well understood nor well studied. The connection of these statistics with research quality is sometimes established on the basis of ‘experience.’ The justification for relying on them is that they are ‘readily available.’ . . . [C]itation data provide only a limited and incomplete view of research quality, and the statistics derived from citation data are sometimes poorly understood and misused. Research is too important to measure its value with only a single coarse tool . . . .
This much isn't new: People have been assessing research for many years. What is new, however, is the notion that good assessment must be ‘simple and objective,’ and that this can be achieved by relying primarily on metrics (statistics) derived from citation data rather than a variety of methods . . . . [T]his faith in the accuracy, independence, and efficacy of metrics is misplaced.
. . .
Many writers have pointed out that one should not judge the academic worth of a journal using citation data alone, and the present authors very much agree. In addition to this general observation, the impact factor has been criticized for other reasons as well. (See [Seglen 1997], [Amin‐Mabe 2000], [Monastersky 2005], [Ewing 2006], [Adler 2007], and [Hall 2007].)
. . .
Once one realizes that it makes no sense to substitute the impact factor for individual article citation counts, it follows that it makes no sense to use the impact factor to evaluate the authors of those articles, the programs in which they work, and (most certainly) the disciplines they represent. The impact factor and averages in general are too crude to make sensible comparisons of this sort without more information.
. . .
Life sciences (6.2)
Neuroscience (4.5)
Clinical Medicine (3.3)
Pharmacology (3.1)
Physics (3)
Chemistry (2.9)
Earth Sciences (2.3)
Environmental sciences (2.2)
Biological sciences (2.1)
Materials science (1.2)
Social science (1.1)
Mathematics/Computer (.9)
(“[T]he journal impact factor is computed by calculating the average number of citations to articles in the journal during the preceding two years from all articles published in that given year (in the particular collection of journals indexed by Thomson Scientific). If the impact factor of a journal is 1.5 in 2007, it means that on average articles published during 2005 and 2006 were cited 1.5 times by articles in the collection of all indexed journals published in 2007.”)
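To make the quoted definition concrete, here is a minimal sketch of the two-year impact factor arithmetic. The journal and all of the citation numbers are made up for illustration; only the formula itself comes from the report excerpt above.

```python
def impact_factor(citations_in_year, articles_published):
    """Two-year journal impact factor: citations received in year Y
    to articles from years Y-1 and Y-2, divided by the number of
    articles the journal published in those two years."""
    return citations_in_year / articles_published

# Hypothetical journal: 60 articles in 2005 and 40 in 2006, which
# together drew 150 citations from indexed journals during 2007.
jif_2007 = impact_factor(citations_in_year=150, articles_published=60 + 40)
print(jif_2007)  # 1.5, matching the report's example value
```

Note that this is an average over the whole journal, which is exactly the report's complaint: a 1.5 average says nothing about how citations are distributed across individual articles.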
Posted by: Um... | March 04, 2015 at 10:15 AM
SSRN downloads are a bad measure of scholarly impact for another reason: the process of counting them inhibits reading. SSRN's lack of direct PDF links and its download-integrity system drive down readership.
A deeper point here is that to the extent scholarly impact depends on reaching a broad audience, academics have an obligation to make their works readily accessible -- an obligation many legal academics discharge indifferently at best. Journals are helping now by posting their articles online, although their archiving is often quite poor. But this process further reduces SSRN download counts; readers of recently published work find it elsewhere online. That's a further reason to take SSRN download counts with the idiomatic grain of salt.
Posted by: James Grimmelmann | March 04, 2015 at 10:16 AM
James,
I'd argue that SSRN's lack of direct PDF links increases the value of its numbers as a proxy.
Direct linking is easily abused to boost statistics. The two-step process helps to ensure that the article is only downloaded by people who know what they're downloading.
That said, I agree that one serious flaw is that SSRN is not the only source for the articles, and we do not even know if it is the most popular source. But again, this is where the burden of proof comes into play. It shouldn't be enough to say "Well, maybe people are downloading it elsewhere." It should be on the professors (or the academy as a whole) to demonstrate that the articles are actually being read through other means.
Posted by: Derek Tokaz | March 04, 2015 at 10:58 AM
The authors cite the Chen article for the fact that SSRN downloads are slanted toward the highest-ranking schools. Does anyone actually decide whether to download based on the prestige of the author?
Posted by: Corndog | March 04, 2015 at 07:27 PM
Some data: I have roughly ten times as many downloads through BePress Selected Works as I do through SSRN. The numbers aren't consistently proportional; some articles are far more than ten times as popular on Selected Works and some are far less. I don't know how representative my experience is, but it strikes me as a strong case for open access.
That was my original point: open access is better for the world than good download counting. I don't have anything against drawing appropriate inferences from download counts where we have them. But the pursuit of the status that comes with high SSRN download counts has ironically led law professors to put barriers in front of their work, reducing their impact in the world. Institutional repositories and law-review websites are both good developments.
Posted by: James Grimmelmann | March 04, 2015 at 08:47 PM
James,
"But the pursuit of the status that comes with high SSRN download counts has ironically led law professors to put barriers in front of their work, reducing their impact in the world."
Can you explain what this barrier is? As far as I can tell, SSRN is free for anyone to access.
Posted by: Derek Tokaz | March 04, 2015 at 10:28 PM
There are a few issues, but the worst is that the suspicious-download system can get caught in infinite loops. You bounce between "download this paper" and "download anonymously" (or logging in) forever.
Posted by: James Grimmelmann | March 04, 2015 at 11:37 PM
John Thompson -
I just fished your comment out of spam. Philistines is your word, not mine. I'm merely saying that your position is colored, obviously, by a prior normative view about what law school professors should be doing. I think you're wrong about that, but I don't expect you to agree with me.
But while you are bashing law school tuition (and Villanova specifically), let me just note that factually you are off. First, law schools are not singularly expensive. Undergrad tuition is a fortune, and business school, engineering school, and other tuitions are not cheap. You also have no idea how much of the tuition in each of those places is covered by financial aid, a common failing of reports of raw tuition numbers.
Second, as to Villanova specifically, our students' average indebtedness ranked about 70th. That's not the best, but it's 69 spots better than the worst.
Posted by: Michael Risch | March 05, 2015 at 09:36 AM