Many people have suggested that the ongoing crisis in legal education ought to encourage re-evaluation of the traditional law school curriculum. One element of that curriculum that I have always found particularly unappealing is the traditional issue-spotter exam, the default approach in most doctrinal classes.
Of course, issue-spotter exams have certain advantages. They are relatively easy to grade and to curve. They are also relatively easy to write, especially with a little practice. And they are familiar. Every (well, almost every) law professor was once a law student, and law students get plenty of experience with issue-spotter exams.
But issue-spotter exams also have many disadvantages. They encourage students to memorize as much material as possible, which may not be the ideal way to promote deep learning. They encourage students to regurgitate as much of that information as possible, rather than reflecting on ideas and providing well-reasoned answers to questions. They may effectively favor some students over others. And probably worst, they require students to hone a skillset that is largely irrelevant to the actual practice of law.
In my opinion, the disadvantages significantly outweigh the advantages. Nevertheless, for an assortment of institutional and practical reasons, I continue to use a more-or-less traditional exam format in my 1L classes. I don't necessarily like it, but I do feel stuck with it.
However, I have developed a different approach to exams in my upper-division classes, which I have found to work quite well. Rather than a traditional exam, I ask my students to write research memoranda. I have used this exam alternative for four years now, and have tweaked it each year. I'll describe my current approach.
At the end of the semester, often about a week before the last class session, I give my students an exam that consists of three prompts. Each question is based on something that is actually happening. Often, I will use a recently filed complaint or controversy currently in the news. Each prompt explains that the student is a junior associate in a law firm or a law clerk for a judge, and that their employer has asked them to prepare a research memorandum answering a question about the controversy, in 1000 words or less (exclusive of footnotes). For example, a law firm partner might ask whether a client or opponent has a viable claim, or a judge might ask how to rule on a motion to dismiss.
The memos are entirely open-universe. The students are permitted - encouraged! - to research the facts of the case and the relevant law using any and all resources available to them. And they are permitted to do anything permitted to a real junior associate or law clerk. They can talk to each other about the questions. They can talk to lawyers and law professors about the questions. The only requirement is that they do the actual work of writing the memos themselves. Some of my students have even contacted the lawyers and parties who filed actions or are the subject of a controversy to ask questions. The memos are due on the last day of the finals period, so my students have about 3 or 4 weeks to research and write them. All of the exams are submitted using exam numbers and are anonymous.
I have found that this approach to exams has many advantages. Unfortunately, I find that many students have few opportunities to practice their legal writing skills in their classes. This exam method obligates them to practice those skills on real-world issues, in the same way as a real junior lawyer. If they work hard and write good memos, it also provides them with a useful writing sample they can offer prospective employers. It also avoids the "information-dump" problem endemic to issue-spotter exams. In order to write good memos, my students have to identify which issues matter and which don't. They have to revisit all of the legal writing principles they learned as 1Ls, but often let get rusty. They have to think about what kind of information and answers would be most useful to their employer and client, and how to present it most effectively. They have to research and thoughtfully apply the principles we discuss in class, rather than just memorize them.
From my perspective, this method is also a huge improvement. For one thing, (most of) the memos are a pleasure to read! The students have the time to do real work and polish it. I get a much better sense of their understanding of the subject matter of the class, and their ability to actually apply it in practical ways. And they are typically very easy to grade. Some students inevitably phone it in. Despite receiving examples of excellent memos from previous years and an explanation of what is expected, they submit the equivalent of an answer to an issue-spotter. Students in the middle range get the formal elements of a memo correct, but not the substance. They provide extensive answers to easy or obvious questions and ignore the difficult ones. They fail to properly research key questions (e.g., "It is unclear whether this word-mark is registered."). And they recite the doctrine, rather than applying it to the facts.
But the best memos are just a delight to read. They reflect the diligence and thoughtfulness of the students who engage with the class most deeply. It is obvious that those students dig deep and learn a lot while writing their memos. I have had many students who have struggled in other classes tell me how much they enjoyed taking my exam. Enjoyed.
I have used this approach to exams in my intellectual property, copyright, and nonprofit corporations classes. It has worked remarkably well in all three. And (most of) the students seem to really like it as well. Interestingly, I have noticed that students who do well on traditional exams often write mediocre or quite poor memos. I find that rather troubling. It suggests to me that we are not necessarily testing for the skills that make for good lawyers. And it reinforces my belief that law schools should devote more time and resources to legal research and writing for upper-division students.
Of course, I am very interested in your comments and suggestions. And I would be happy to share examples with anyone interested.
I would be delighted to review your samples, as I've been debating how to evaluate students when I teach at the law school in the spring.
Posted by: Matthew Reid Krell | August 12, 2017 at 03:05 AM
It's always amusing when new profs "discover" some "new" "alternative" method, that actually just boils down to a very old, tried and true practice. It's like some new "invention" that turns out to have been patented in the 1800s.
Here, we have an "open book" take home exam portrayed as some marvelous new idea. And, the exams sort themselves out! Who knew?
The only difference seems to be that this prof may be inviting communications that are borderline unethical, i.e., novice, unlicensed newbies contacting represented parties concerning an ongoing matter (perhaps both sides?) and, by way of questions, conveying information?
Posted by: anon | August 12, 2017 at 11:45 AM
Anon,
I do not claim to have "discovered" or "invented" anything. Indeed, I have heard from many senior professors who use a similar approach, and am learning from their experience. That said, many people have expressed an interest in the approach I describe and how it has worked. Yes, it is a form of "take home" exam, but it is different from any of the "take home" exams I had in law school, which typically were just longer "issue-spotter" type exams. But perhaps the approach I describe is common at some law schools? If so, I was unaware of it.
It would be helpful if you would explain why it is "borderline unethical" for students to contact parties to a case or their lawyers? In other contexts, we call that "reporting," an activity practiced by most good journalists, few of whom have any legal training at all.
BLF
Posted by: Brian Frye | August 12, 2017 at 12:29 PM
Matthew,
I'd be happy to share examples with you or anyone else who is interested. Drop me a line at [email protected].
BLF
Posted by: Brian Frye | August 12, 2017 at 12:30 PM
I think this is a vast improvement on the traditional final examination. I, too, bow to practical reality and administer traditional final examinations in some of the bar-tested courses that I teach, on the view that the bar exam forces us to prepare our students for these high-stakes, closed-book examinations. Success on this type of assessment mechanism, however, does not readily translate into success in the practice of law. I often tell my students that in the real world, trying to provide a legal opinion by memory, with an arbitrary time limit, is called malpractice. Yet in law school, this is the primary mode of assessment. I have used similar exercises to the one you describe in the past, although I am wary of real cases because of the possibility that students will make excessive use of the filings in those cases. When I use real cases, I make sure that no briefs useful to the students will be written before the paper is due, or else I change the facts sufficiently to render whatever briefs are written in the real case of limited utility.
I do worry, however, that this post, at least implicitly, assumes the primacy of the summative assessment. That, in my view, is problematic.
The task of legal education, IMHO, should be to impart the knowledge, skills, and abilities that entry-level lawyers need. Understanding and being able to apply legal doctrine to reach the "right" answer -- what is generally tested in traditional assessment exercises -- is of some utility, but entry-level lawyers need much more. Entry-level lawyers need to be able to use legal doctrine to solve the problems of their clients. That is why I have concluded that the traditional case method offers rapidly diminishing returns. I instead teach primarily with the problem method, and ask students to use doctrine to build legal positions on behalf of clients throughout the semester. I also provide lots of feedback and formative assessments. A teacher concerned about the quality of daily class preparation and participation needs to find ways to assess preparation and participation if they are to improve. If students do not have an incentive to prepare carefully for and participate productively in class because the quality of their daily preparation and participation does not, at least in any manner transparently visible to them, drive their grade, then we should not be surprised when students frequently fail to be deeply engaged prior to the summative assessment. And, if students are not receiving consistent feedback on the quality of their daily preparation and participation, we should not be surprised when it does not improve. The final assessment exercise should test the skills that the students have been building throughout the semester, instead of throwing something new at them. IMHO.
Larry Rosenthal
Chapman
Posted by: Larry Rosenthal | August 12, 2017 at 12:49 PM
" I have developed a different approach to exams ... this method is also a huge improvement."
Sorry, the take home, "open book" essay is old school, at least these days. You haven't modified it in any significant respect, save one, IMHO.
That one aspect is, as stated, a potential problem, which I think we need to know more about. You state: "Some of my students have even contacted the . . . parties who filed actions or are the subject of a controversy to ask questions."
Yes, news reporters may do this, and may speak with both sides (if their attorneys permit it) and thereby convey possibly critical information.
In your experience, however, does "This exam method obligate ... them to practice those skills on real-world issues, in the same way as a real junior lawyer"?
Is this how the judge's clerk, or the attorneys for the respective parties, develop legal arguments? Conduct discovery? Or would this more often lead to disqualification, and perhaps sanctions, and, in extreme cases, disbarment?
If you want to "get real," let's hear the argument that this method comports with a rule that "they are permitted to do anything permitted to a real junior associate or law clerk." Perhaps you need to be clearer here.
One additional curious statement: "But issue-spotter exams also have many disadvantages. . . . They may effectively favor some students over others." So?
You admit that your "novel" take home essay exam does the same thing. "Interestingly, I have noticed that students who do well on traditional exams often write mediocre or quite poor memos. I find that rather troubling." Has it occurred to you that not everyone has the same skill set?
I don't know if you've ever had a client relate a rambling, disjointed story, or whether you, personally, have been expected to quickly spot the legal issues and apply rules of general application to make relatively quick decisions.
Believing that lawyers don't need this skill is a bit "novel."
Posted by: anon | August 12, 2017 at 01:28 PM
Larry,
Thanks for your comment. I agree that the traditional method of assessment doesn't effectively test or teach skills that are relevant or useful to actual lawyers. I typically assign recently filed complaints, so there usually aren't any additional substantive filings. On occasion, I assign cases that are further along, but in that case, I typically frame the memo as one from the law clerk to the judge, so the briefing is relevant to the answer.
I also agree that limiting the questions to summative assessment & "right" & "wrong" answers would be a problem. I try to choose cases & ask questions that admit to many different potential approaches and answers, so the students not only have to know the doctrine, but use their judgment to determine what is important & what is not & how to best & most effectively & efficiently frame their answer.
There is a potential concern that this "testing" strategy affects incentives around class participation. However, on the whole, I have found that it actually has a positive effect. Because the students know that they won't have to spit out answers on a timed test, but will have a chance to do research, they participate more in class, rather than feverishly taking notes. I try to structure my classes to consider about one big idea per class, on the belief that it is most useful for students to understand the "big picture" of the area of law we are studying & know where to look to provide answers to more specific questions.
It's a good idea to have additional assessment projects throughout the semester, and I have incorporated such exercises from time to time in my different classes. It is, however, one area in which I would like to do more.
BLF
Posted by: Brian Frye | August 12, 2017 at 01:57 PM
Anon,
You are the first person I am aware of who has not seen any difference between the "exam" I described here and a typical "take home" exam. But ok, whatever you say.
In my experience it is not at all uncommon for junior lawyers to talk to their clients in the course of preparing memoranda. Moreover, my students obviously do not have any actual relationship to the dispute or the parties. So there is no actual risk of anything. I encourage them to gather factual information from whatever source they think would be useful in order to provide the best possible answer. This is intended to encourage them to think hard about what facts are important and how they can best determine those facts and any ambiguities. Does that mean that they are always doing exactly the same things that a junior associate or law clerk would do? No. But I think it is close enough to be a productive learning experience.
I am cheered by your concern for the interests of the students who shine on traditional "issue spotter" exams. I assure you that they continue to have many opportunities to use their special skills. My exam approach is explained in my syllabus and on the first day of class. If they don't like it, they can take a different class, no hard feelings.
I have found that rambling, disjointed stories from clients - from most everyone! - are quite common. I disagree that "issue spotter" type skills are particularly useful in addressing them. But even if I am wrong, we seem to teach that skill quite extensively, almost to the exclusion of any others. I think there is room for teaching other useful skills as well.
BLF
Posted by: Brian Frye | August 12, 2017 at 02:13 PM
Brian
Of course a take home essay exam is useful! It just isn't novel, and you didn't "develop" it. Folks have been assigning take home, open book (or "open universe" as you describe it) bench memos and opposition to motions to dismiss, etc. for decades! Nothing new here.
As stated, the only thing that struck one as odd was the encouragement to send law students out to act as "journalists" to interview represented parties involved in law suits to determine legal issues.
You've now limited your reference to contact with parties to "their clients in the course of preparing memoranda." That's a little better, one supposes. For example, you state "a law firm partner might ask whether a ... opponent has a viable claim, or a judge might ask how to rule on a motion to dismiss." One wouldn't want to think you are training law students to think that calling represented parties on both sides, on the "other" side, or even on any side of a dispute (if this is a bench memo) is ever permissible. Lawyers are not journalists.
Lastly, you seem to be arguing that an "issue spotter" doesn't shed any light on a student's ability to quickly assess whether "a client has a claim." Oh well.
You are the first person I am aware of who has seen such a major difference between "spotting the issues" and then applying the law to facts as stated to reach conclusions in a logical and lawyerly fashion, and "determining whether [P] has a viable claim." (The traditional prompt on nearly all "issue spotters" in one form or another.) If the "polish" (and, to be sure, depth, but then again 1000 words?) impresses you so much, perhaps we can discern the reason you are so disappointed with some of the memos written by students who do well on issue spotters.
Litigating with junior associates in BigLaw always entertains: well written dreck, that misses the issues entirely! That's not so great either.
Posted by: anon | August 12, 2017 at 03:32 PM
Anon, I do something similar in one of my classes - they complete a series of research and writing projects that are typical assignments a junior associate in the field might get from a supervising partner or in some cases from a client. Typically only one is a research memo. They are typically novel problems rather than relating to existing controversies, but not always. Open everything, and no page limit or word limit. I don't claim it is novel, but especially because I engage on both the substance and the writing, as if I were reviewing junior associate work, I do think it is helpful for them to learn the material this way. I use it as a teaching method as well as an assessment method. I don't see a need for an assessment exercise at the end, though I may reflect on Larry's comment a bit. One of my students once suggested an exam (in addition to the papers) would be a good way to keep everyone engaged on every aspect of the course.
Posted by: anon-a | August 12, 2017 at 04:19 PM
anon-a
Another tried and true method. In a seminar, if one has the opportunity, one can create a case file, divide the class, and take them through discovery (including objections, adjudicated motions to compel, etc.), dispositive motions, settlement conferences, mediation, and then, if possible, trial.
Or, one can assign a problem each week: tax lends itself to this sort of assessment activity.
Or, one can give a take home essay exam and let the students have time to work on it.
As noted, these types of efforts don't work well in first year classes, generally, though some novices do a form of one or the other and then boast to colleagues about their "innovative classroom techniques." "I gave a quiz! What a revelation this was!" Folks who speak this way think they are boasting and may get compliments on their insightful new approaches to learning: never realizing the implications of their reports!
What is so shocking is the never ending cycle of "discovery" by folks who claim to have "developed" some novel, notable form of classroom or exam experience, over and over and over. Same old same old.
The hot buzzwords just get applied to the old methods. Just ask folks who studied the law in Abe Lincoln's day, or early in the 20th Century for that matter.
Posted by: anon | August 12, 2017 at 05:02 PM
Sure, I didn't claim my approach was novel. I can claim that it seems to work better (in terms of skills and information imparted) than an exam did, for the class in question. Do you have thoughts on whether this is (generally, or in particular types of cases) effective? As far as novelty is concerned, no ordinary upper level courses were taught like this at my law school when I was attending. But that's anecdote. I think it is probably true that embrace of this sort of method for both teaching and assessing has become more widespread in the last few decades, but that is an empirical question, and I don't actually know.
Posted by: anon-a | August 12, 2017 at 05:38 PM
anon a
These days, every law school is competing to show everyone how "experiential" they can be. The ABA clamors to get on the bandwagon (while ignoring failing law schools).
None of the methods mentioned in this thread so far would qualify as "experiential" (at least as described), and so the groundbreaking approach "developed" by the author above really isn't trendy, or new, or something worth writing a long piece about. Lots of folks have been giving take home essay exams on current topics for a long, long time.
As for the "experiential" craze, the "new" emphasis is likewise mostly bs. Law professors are an isolated, sheltered, pampered and spoiled lot, and they think a lot of themselves as a result. They boast and preen and pretend that their feathers are ever so unique and beautiful.
I don't have any objection to sharing experiences. I don't have any problem with discussing assessment techniques. Or actually trying something new (nothing mentioned above even comes close to being a unique or innovative assessment technique). I do, however, find puffing and preening an irresistible subject of inquiry.
When the topic is approached as "I developed" an "alternative method" are we all supposed to nod and then add our own identical use of take home essay exams, and congratulate ourselves about how good we are to have discovered such an amazing alternative to a closed book, timed, race horse issue spotter?
Please. If there is any benefit to a forum like this, perhaps it is not to pretend that we are all so wonderful and talk about basic teaching and assessment skills as if we invented them, but instead prod each other to actually innovate and improve.
As for the question "Do you have thoughts on whether this is (generally, or in particular types of cases) effective?" I'm not sure what the "this" is, but I would simply note that there actually are no "new" ways of teaching or assessment (leaving technology issues aside). We can simply use the tools at hand in different ways, and, IMHO, the key is to determine which tool works best in specific settings, not to claim that one tried and true means is inherently better or more reliable than another, per se.
Posted by: anon | August 12, 2017 at 09:26 PM
"Many people have suggested that the ongoing crisis in legal education ought to encourage re-evaluation of the traditional law school curriculum"
The crisis in legal education does not seem like one that can be addressed through curriculum changes; graduates can't find jobs because there aren't enough jobs, not because they've been handicapped by issue-spotting in-class exams.
Posted by: twbb | August 13, 2017 at 01:13 PM
twbb,
I'll disagree a bit. The lack of jobs is an issue of supply and demand -- too many law grads, too few jobs that want fresh law grads. Most people have focused on the supply side. However, the problem could (at least theoretically) also be addressed by looking to the demand side.
If a firm is looking to grow, it might consider hiring 10% more junior associates, but it will also consider the possibility of simply increasing the workload of current junior associates by 10%. The quality of grads' education will factor into which route the firm takes. Likewise, looking at JD Advantage and Other Professional jobs, the quality of legal education will factor into the demand for JDs to fill those positions.
Unfortunately, most of the demand-side discussions have focused on how to convince prospective students that there is demand rather than on how to increase actual demand.
Posted by: Derek Tokaz | August 14, 2017 at 02:44 PM
Derek
Good points. The law school enterprise has been notoriously oblivious to what the market wants. In fact, it has sought, in recent years, to distance itself as much from practice as possible, hiring only PhDs working at "law and" intersections, with basically no idea about how to practice law, or where new opportunities might be created.
In order to remedy this obvious error, of late, law schools have begun to give lip service to "experiential" programs, and, sometimes, have increased clinical and externship opportunities in an effort to claim that they are responsive to the needs of their students.
Unfortunately, these programs suffer from the hidebound approach taken by the ever-so-self-conscious and pampered elites in the law school complex. In order to preserve their status, they relegate these programs to be managed by persons given inferior pay and status, and design these programs in a way that affords little or no real practical benefit to the majority of students, other than as resume padding. More often, these programs reflect the high-minded value signaling of the law school elites, and thus have little or no relationship to the job market. (This is not to say the job market is rational, or that a need doesn't exist for more attorneys to serve underserved populations.)
Certainly, the mindset that allows one to convince oneself that writing a 1000-word research memo, with no restrictions on sources (including borrowing other students' ideas and perhaps their work product), is somehow a vastly superior way to prepare students for practice (over writing a three-hour essay in class, for example) seems a bit of a stretch. Neither means of assessment will be superior or inferior in all cases, and neither will always, or even usually, better prepare students for practice.
To prepare students for practice, one needs to be trained by persons with deep knowledge of how to be a practitioner. Not a glorified law clerk for a few years in BigLaw, or worse, a judge's chambers, but a person who knows the ropes and can add to that knowledge insight into legal theory and structure.
Is there a place in the Citadel for persons ignorant of the real world? Of course. But, like Sam, one cannot learn much from those who don't really know what they are talking about when it comes to legal practice.
Posted by: anon | August 14, 2017 at 03:17 PM