Like most other law profs this week, my attention was drawn to Gregory Sisk et al.'s scholarly impact rankings, a topic about which Al has already blogged.
One thing that struck me about the study, particularly the list of "top cited scholars" at my school, was the wide variety of faculty included. Of my 7-8 "top cited" colleagues, some are probably cited a lot because they blog frequently and appear often in the media, some are cited because they take unusual or extreme views on a particular issue, and some are more "traditional" scholars, i.e., they mostly produce traditional law review articles and would generally have been cited for that work.
So I'm wondering what the new study tells us about just what is being ranked here. I gather that "scholarly impact" means something like "people who get cited a lot." But the term "scholarly impact" suggests to me (or at least implies at some level) that these folks should be cited for their scholarship, rather than for other things they do. Of course, "scholarly impact" could just as easily refer solely to the impact of these people's work (whatever form it takes) on the scholarship of others - and perhaps that makes more sense, given how the study actually works. But then couldn't it be called a "citation ranking" rather than a "scholarly impact" ranking? I guess there's something about the word "scholarly" juxtaposed with "impact" that suggests to me the study is doing something more qualitative than it actually is. (And I do understand that it was not Prof. Sisk who coined the term.)
But regardless of what "scholarly impact" means, I'm wondering what the impetus is behind these kinds of studies. In other words, what do they really tell us about the faculties being studied? A list of how often you are cited in law reviews doesn't really tell you anything about what a faculty is doing, what its scholarly goals and ambitions are, what its scholarly culture is like, etc. And, dare I say it, these kinds of studies may eat away at those kinds of discussions, and faculties may be tempted to focus on doing whatever they need to do to get ranked highly. For example, if you're not a Top 50 law school but you're able to get your scholarly impact ranked fairly highly by doing a lot of media or making a lot of extreme comments, that may skew the faculty development side of things.
I'm probably over-reacting here. I suppose I take as much notice of these rankings as the next person (and Sisk's study is the first one in which I am personally mentioned as one of the "top cited" people at my school, probably more for blogging than scholarship). And despite the fact that I blog, I still like to write books and traditional law review articles so I guess I haven't been corrupted by the rankings! And I'm sure lots of people have made these observations before, including the observation that we don't even know what "traditional legal scholarship" entails anyway. I was just musing because everyone is talking about the new study this week ...
I've never seen any evidence that impact tracks media appearances or even blogging (I blog as much as anyone, and less than 10% of my citations are to my blogs). Where did that idea come from?
Posted by: Brian | September 15, 2010 at 07:39 PM
Brian: Thanks so much for commenting. I must admit that I've not reproduced or investigated the part of Sisk's study that relates to my school (or any other school), but I was going on my knowledge of the scholarship of the people at my school noted as among the most cited faculty. No doubt we have all written law review articles, but several of the people on the list have written few traditional law review articles, or none recently - not that the age of an article should necessarily be relevant to how often it has been cited. I simply noticed that a number of our most cited faculty have done significantly more in the way of media and blogging than in the way of traditional law review articles, particularly over the last 5-10 years. I also noticed that some of our highest profile "traditional scholars" who don't really do much (or any) blogging or media did not show up as being highly cited. In other words, this is my own anecdotal impression based on the way our school was ranked in Prof. Sisk's rankings. I may well be incorrect in my assumptions - indeed I kind of hope I am. But it would be interesting to know more. Maybe I'll have to take a closer look at the actual citations and see if I'm right or not. (And I'm very interested to hear that less than 10% of your citations are from blogging - that assuages some of my concerns about impact rankings.)
Posted by: Jacqueline Lipton | September 15, 2010 at 08:04 PM
Yup, much as it chagrins me to say it, I'm with Brian on this :-)
Posted by: Dan Markel | September 15, 2010 at 11:42 PM
Jacqui --
I'm with Brian as well. Instead of speculating, why don't you replicate a few of the study's searches? I think you'll be surprised.
JHA
Posted by: Jonathan H. Adler | September 16, 2010 at 08:44 AM
I'm so sorry - I didn't mean to offend anyone personally or collectively. I thought that it was OK to speculate on a blog, and I'm happy that people have responded with their views on their own citation counts. Some others obviously have much more time than I do to be thinking about this. As I said, I was just musing and would be very happy to be incorrect. However, I do have an uneasy feeling about law professors always having to count everything and put so much weight on these kinds of ranking exercises. Many of us criticize the U.S. News rankings, but is it really the solution to do more or different rankings? Maybe it is, for all I know. And of course I personally benefit from Sisk's study because I count as having a lot of citations, so if you are all convinced of the usefulness and meaningfulness of these rankings, then it's great for my scholarly reputation too.
Posted by: Jacqui Lipton | September 16, 2010 at 10:01 AM
There is another point to make about these sorts of rankings. But first I wanted to say: as much as I agree with Brian, Dan, and Jonathan that Jacqui's questions are best answered by the data, I don't think Jacqui needs to apologize for raising questions. And frankly, we're not all number crunchers. (Without making assumptions about Jacqui's skills in this area) a non-quantitative person shouldn't feel that they can't engage in a discussion about quantitative data just because they're not going to be good at running the numbers. If Jacqui's questions are answered by data -- great to know. Please share the results.
But the point I want to raise is different. This week one of my smartest and most productive colleagues had a paper accepted for publication in a collection of essays in the field of social psychology. The scholarly impact of this kind of paper will never be accurately registered in citation count studies confined to legal databases.
Let me say that I agree with the sentiment behind these studies, as Sisk et al. put it: "As legal scholars, we write for an audience. It is right and appropriate, then, to ask whether anyone is listening. And, if possible, we should answer that question by something more reliable than anecdotes, past accolades, or casual assurances by those in our close circle that they’ve read this or that article." The question is what that audience is: legal scholars alone, or the academy as a whole? Legal scholars are part of universities, and of a scholarly discourse that extends beyond our schools, and ideally beyond our national borders. These studies are limited to a legal citation database that does not capture this sort of impact.
Leiter and Sisk et al. acknowledge limitations, including the effect on interdisciplinary work, but suggest that “an imperfect measure may still be an adequate measure, and that might appear to be true of citation rates as a proxy for impact as a proxy for reputation or quality.”
But the numbers have a tendency to take on a life of their own, and as they are circulated the caveats tend to drop away, and we are left with the numbers themselves, as if they were a representation of something true, rather than one imperfect measure that perhaps necessarily has systematic flaws.
Posted by: Mary Dudziak | September 16, 2010 at 11:18 AM
Jacqui --
No offense taken. There's certainly ample reason to speculate about the value of rankings, including this one, and the desire to count or rank everything. In this case I do think this sort of study provides a meaningful, though hardly perfect, measure of the recent influence of a scholar's past work within the legal academy.
Without question, a citation study like this one will reflect the subject-matter bias of law reviews, so highly productive scholars in some areas (e.g., tax) are less likely to be recognized for their contributions, and Mary is absolutely correct that some important interdisciplinary work is likely to get slighted by this methodology.
Citation count studies will tend to reward past work more than present work. So if there is someone who used to be a productive scholar, but gave it up for blogging in 2008, we shouldn't expect to see the effect of that in a citation count study that looks at, say, the 2004-2010 period. Similarly, someone who wrote a seminal article on a particular question many years ago may still receive lots of citations today regardless of what they are doing today.
As for blog citations, my experience is like Brian's. Fewer than 10 percent of my citations are to blog posts or media appearances, and most of those are to posts that relate directly to my scholarship.
JHA
Posted by: Jonathan H. Adler | September 16, 2010 at 11:49 AM
In light of these discussions, I'm wondering if it's worth having a separate discussion on the relationship between blogging and scholarship. I happened to be chatting about that question this morning with some colleagues - and Jon raised a related point in his last comment, noting that most of his blog posts counted in the citation studies relate to his actual scholarly articles rather than to other issues. Another colleague suggested to me that people who blog a lot may actually get more notice as scholars because their audience is simply more familiar with who they are and what they have to say. I thought that was also an interesting point. I also have colleagues who think that blogging detracts from scholarship because it takes away time you could spend writing law review articles. I don't personally agree with this view; I often think blogging informs scholarship and vice versa. (And the "time management" argument makes certain assumptions about how much time a person actually has for any activities outside the classroom and the family.) I recently constructed a law review article largely based on discussions I had about a particular issue on another blog, and I too have cited others' blog comments in scholarship - so I think it's an interesting symbiotic relationship and one worth discussing more. I'm sure it has been discussed on other blogs as well, and I'd be happy to receive links to relevant discussions.
Posted by: Jacqui Lipton | September 16, 2010 at 12:34 PM
Just as successful blogging gets an author SSRN downloads (is there really a debate about this?), it doesn't take too much extrapolation to see how blogging can translate into citations - not citations to blog posts, but citations to the articles that get more attention because of the author's status as a blogger and the traffic directed to articles that would otherwise not be read or seen. Yes, yes: test away if you have nothing better to do with your time. But this is reasonable speculation, I'd imagine.
Posted by: Anon | September 16, 2010 at 01:09 PM
Jacqui, I didn't take offense and didn't mean to give any (by seeming to chastise) either. My apologies if I seemed piqued. Just genuinely puzzled at the initial hypothesis. For what it's worth, my blog posts have generated, I think, no more than 2% of my cites. The previous anon comment, on the other hand, has a ring of plausibility.
The relation between SSRN and blogging is also interesting. A number of us perma-bloggers at Prawfs, Co-Op, or here at the Faculty Lounge do not have unusually high SSRN downloads despite what I think of as very impressive scholarly track records. And for some, that's the case notwithstanding efforts to furnish links to the work. In any event, let a thousand flowers and theses bloom!
Posted by: Dan Markel | September 17, 2010 at 12:24 AM
Not at all, Dan. And interesting re the idea that many bloggers don't have unusually high SSRN downloads. Someone's next law review article, perhaps?
Posted by: Jacqueline Lipton | September 17, 2010 at 08:53 AM