THE EFFECT OF CLINICAL EDUCATION ON LAW STUDENT REASONING: AN EMPIRICAL STUDY

Stefan H. Krieger

 

     I.   Prior Study of Legal Reasoning in Law Students

    II.   Present Study of the Effects of Clinical Legal Education on Law Student Reasoning

          A.   Hypotheses of Study

          B.   Methodology of Study

                1.   Subjects

                2.   Stimulus Material

                3.   Interview Methods

                4.   Coding the Data

          C.   Analysis of Data

                1.   Recall of Facts

                2.   Identification of Rules

                3.   Drawing of Inferences

                4.   Identification of Client Interests

                5.   Identification of Next Steps

          D.   Preliminary Conclusions on the Effect of Clinical Education on Student Problem Solving

   III.   Conclusion

 

During the past thirty years, clinical legal education has become an important component of most law school curricula. In clinics, students, typically in their second or third year of law school, represent clients in actual cases in a legal aid office at the law school, pursuant to a court approved “student practice order.”[1] Under the supervision of faculty members, students interview and counsel clients, investigate facts, research legal rules, negotiate with opposing parties, draft documents, and try and argue cases in court. Proponents of clinical education have urged the development and expansion of clinical programs to train students in the skills necessary to apply legal doctrine in practice.[2] While few have argued that the traditional law school curriculum be replaced with an entirely clinical curriculum, many have suggested the introduction of courses using clinical methods during the first year and increased clinical offerings in the last two years.[3] Just recently, the Carnegie Foundation for the Advancement of Teaching published a major study of legal education lauding clinical education as one of “the law school’s primary means of teaching students how to connect the abstract thinking formed by legal categories and procedures with fuller human contexts.”[4] And in the recent report, Best Practices for Legal Education, which professes to present a vision and road map for legal education, the authors argue that contextualized learning, such as clinical training, is the most effective and efficient way for students to develop professional competence.[5]

While the proponents of clinical education identify a number of virtues for this pedagogy,[6] much of the literature on the subject focuses on one major benefit: teaching modes of planning and analysis for problem solving in unstructured situations.[7] Advocates of clinical education argue that traditional legal education has focused too narrowly on legal rules and doctrinal analysis. In his seminal article on the purposes of clinical legal education, Anthony Amsterdam complained that traditional legal education taught students only three kinds of analytic thinking: case reading and interpretation; doctrinal analysis and application; and logical conceptualization and criticism, while ignoring other modes of analysis that are essential for the practice of law.[8] These neglected modes of analysis include: (1) ends-means thinking; (2) hypothesis formulation and testing in information acquisition; and (3) decision making in situations where options involve differing and often uncertain degrees of risk and promises of different sorts.[9]

Proponents of this pedagogy assert that clinical courses address this deficiency by teaching students how to solve problems in practice. As a committee report of the Clinical Legal Education Section of the Association of American Law Schools put it:

The clinic is an ideal vehicle for imparting [the skills of ends-means thinking and applying doctrine to situations where the facts are unclear or developing]. First, the problems presented to students have all the difficulty, texture, and chance that occur in the world of practice. Students must consider this range of issues while problem solving. Second, the in-house clinic possesses the resources to develop these skills. Many clinical teachers who convey these skills begin with lecture and simulation discussing problem-solving models. Most then follow with intensive discussion during student supervision. While time-consuming, that individual supervision is a powerful means to focus student attention on these skills.[10]

In a similar vein, one commentator asserts that, “[w]ithout undermining the importance of other forms of learning, clinical education stands squarely for the proposition that students learn most effectively by participating in their own education by actually representing people.”[11] Another prominent figure in the clinical movement observes that “[s]tudents need to experience the demands, constraints, and methods of analyzing and dealing with unstructured situations in which the issues have not been identified in advance. Otherwise, their problem-solving skills cannot mature.”[12] The Best Practices for Legal Education report notes that “one of the strengths of experiential education is that it gives students opportunities to practice solving problems and to receive feedback on the quality of their efforts.”[13] Indeed, the Carnegie Report finds that “[s]tudents cannot become effective problem-solvers unless they have opportunities to engage in problem-solving activities in hypothetical or real legal contexts.”[14]

Despite the extensive literature identifying these lofty goals for clinical legal education, very little empirical work has been conducted on the actual effects of such training on students’ abilities to solve actual problems and learn from experience.[15] Most of this scholarship is based solely on the anecdotal experiences of the authors or informal surveys of students in clinical courses.[16] For example, the recent Carnegie Report on legal education makes broad claims about the benefits of clinical and other experiential education.[17] The report states that clinical training is the “law school’s primary means of teaching students how to connect the abstract thinking formed by legal categories and procedures with fuller human context;”[18] that “responsibility of clients and accountability for one’s actions are at the center of clinical experience;”[19] and that “context-based education is the most effective setting in which to develop professional knowledge and skills.”[20] The report primarily bases these assertions, however, on informal surveys of clinical programs at different law schools.[21] While field research is an accepted form of qualitative empirical research, adequate methodological controls are necessary to assure valid and reliable findings.[22] Without such controls, studies such as the Carnegie Report and Best Practices report provide only limited assistance for an assessment of whether or not clinical education achieves its goals.

Precisely because of this deficiency in the literature on clinical education, I have attempted to begin some empirical research on the subject. For the past five years I have been working with Vimla Patel and her colleagues on developing research projects on legal education and the profession. Patel, for the past two-and-a-half decades, has been conducting empirical research on medical education focusing particularly on the issue of teaching problem-solving methods to medical students. One of her areas of study has been the comparison of traditional medical curricula with problem-based learning (PBL), which places a strong emphasis on learning in the context of solving and understanding authentic patient problems.[23]

While Patel has recognized some benefits of PBL curricula, she has cautioned against the bandwagon acceptance of PBL in medical education because of the lack of rigorous empirical assessment of the approach.[24] As she has written, “[a]s is often the case in education, with problem-based learning, innovations in practice outstripped theoretical justifications or empirical research demonstrating the validity of the new approach.”[25] For this same reason, given the paucity of rigorous empirical research on clinical legal education, more rigorous study of this pedagogy is necessary to supplement the anecdotes and tributes from teachers in the field.

To that end, I have collaborated with Patel and her colleagues in replicating some of their studies of medical education in the domain of legal education. I initially conducted a study on the development of legal reasoning in law students from first year through graduation.[26] As a follow-up to this study, in consultation with Patel and one of her colleagues, I have conducted a study of the effect of clinical education on legal reasoning. This article presents the methodology and findings of this study. It will first describe the previous study of the development of legal reasoning in law students throughout law school. It will then identify the hypotheses of the present study of the effect of clinical legal education on student reasoning and discuss the methodology of this research. Next, it will present the findings of the study and relate them to the conclusions of the initial research. Finally, it will suggest areas for future research on this subject.

As with the first study, a disclaimer is required at the outset. Findings of an individual study have only limited significance, especially given the small sample size of this one and the fact that the subjects were from a single law school and clinical program. The purpose of this study, however, is not to provide definitive conclusions on the effects of clinical education, but to lay the groundwork for future research in the area. Hopefully, others will review the data from this study, replicate it, fine-tune it, and build on its findings. By using methodological controls which have not been used in most previous scholarship on clinical education, this research offers an alternative to informal surveys and anecdotes for assessing clinical education.

I.    Prior Study of Legal Reasoning in Law Students

The purpose of our initial study was to explore the development of legal-reasoning skills in law students throughout their law school careers. Cognitive scientists theorize that experts develop schemas to solve problems—“ordered patterns of mental representations that encapsulate all our knowledge regarding specific objects, concepts or events.”[27] Developed from repeated encounters with similar experiences, “[a] schema can be viewed as a coded expectation about any aspect of an individual’s life, which dictates which characteristics of a given event are attended to, which are stored for the future and which are rejected as irrelevant.”[28] In other words, schemas are mental blueprints we carry around in our heads for quick assessments of what we think should be happening in a particular situation. In regard to the development of expertise, researchers theorize that as a result of greater experience in a particular domain, experts use their well-developed schemas to reflexively filter out irrelevant data and focus on relevant information to come to a solution. Novices have not yet developed such schema systems and instead solve problems by testing multiple hypotheses before developing a solution.[29] In our research, modeled after a Patel study of medical students,[30] we sought to examine whether law students, by the end of their education, had begun to develop some of the reasoning strategies of experts by filtering out irrelevant information and focusing on relevant facts and rules in solving problems.

The prior study of law-student reasoning was conducted with subjects from Hofstra University’s School of Law and focused on three groups of students, each with ten subjects: (1) incoming law students; (2) law students completing the first semester of their second year; and (3) law students nearing graduation.[31] The study was conducted using “semi-structured” interviews of each of the subjects.[32] In each interview, the subjects were given a consumer fraud problem concerning the sale of a used car and were asked to recite the facts of the case and to determine whether the simulated client had a viable legal claim and the basis for this answer. The interviews were transcribed, and the transcripts were coded for different factors: (1) recall of facts; (2) identification of rules and inferences; and (3) recital of procedural rules and approaches to take in the case.[33] These data were then analyzed to compare the reasoning skills of the three groups.[34]

The findings of that study raise some intriguing issues for the present research on the effect of clinical legal education on student reasoning. In most respects, third-year subjects in the initial study showed only a slight change in reasoning strategy compared to second-year subjects.[35] In regard to fact recital, for example, third-year subjects typically showed no improvement over their second-year counterparts in recalling the relevant propositions in the problem.[36] And in terms of the relevancy of the propositions actually identified, third-year subjects, on average, recalled a mean percentage of relevant facts comparable to second-year subjects.[37] These findings, the study suggested, “appear to raise some questions about the effectiveness of the final year-and-a-half of law school on students’ ability to focus on the relevant facts in a legal problem.”[38]

In regard to rule identification, third-year subjects, on average, identified more rules in comparison with second-year subjects, but performed only slightly better than second-year students in terms of mean percentage of relevant rules identified.[39] These findings suggest that by the end of their third year, students seem prone to indiscriminately generate a large number of rules, many of which are irrelevant. As the study concluded,

This finding seems to reflect what Patel and her associates term “the intermediate effect.” They have found that intermediates on their expertise scale, when confronted with a problem in their specialty, often engage in a wide scope of information gathering without screening out irrelevant information. They simply process too much garbage. Here, third-year students appear to be processing too much garbage. For example, even though the problem explicitly stated, “Your state has no statute that applies to this type of situation,” some of the Group 3 subjects appeared unable to keep themselves from identifying possible statutory rules applicable to the problem.[40]

In the initial study, the most dramatic difference between second- and third-year subjects is reflected in the data on drawing of inferences. On average, third-year students drew fewer inferences from common experience than first- and second-year students, who identified approximately the same number of inferences as each other.[41] This decline in inference drawing may suggest that third-year law students pay closer attention to detail than their counterparts.[42] It may also indicate, however, that as students progress through law school, they become less inclined to use common experience in solving legal problems.[43] Legal education, then, might be thwarting the brainstorming process about facts.[44]

These findings as to third-year subjects contrast significantly with those in the Patel study of final-year medical students. Patel found that, unlike third-year law students, graduating medical students performed better than their second-year counterparts in several ways.[45] For example, graduating medical students were more proficient than second-year students in identifying patterns of relationships in problems and developing coherent explanations for problems by focusing on relevant facts and doctrine.[46] Our initial study speculated that the difference between Patel’s and our findings could be explained by the fact that all medical students have the opportunity to apply the medical knowledge learned in earlier years to the actual treatment of patients in clinics.[47] Many law students do not have these opportunities, and even if they do, the experiences are usually much more limited.[48]

II.   Present Study of the Effects of Clinical Legal Education on Law Student Reasoning

A.    Hypotheses of Study

Building upon the initial research on development of law-student reasoning, this study explores the effect, if any, of clinical legal education on student reasoning strategy. While the first study indicated that third-year law subjects performed only slightly differently from second-year law students in regard to fact recall and identification of legal rules, this project was designed to explore whether or not clinical legal education had any effect on these and other problem-solving processes.

Proponents of clinical education argue that this pedagogy teaches students how to problem solve and deal with unstructured situations in which the issues have not been identified in advance.[49] Students who participate in a clinical program as part of their course work for the third year of law school obviously do not have enough repeated experiences in practice to develop well-organized schemas and become expert problem solvers. But if the proponents of clinical education are correct that the pedagogy teaches students to problem solve, third-year law students who participate in clinics should develop rudimentary schemas for approaching problems in practice. They should be more successful, for example, in recalling relevant facts in a case and filtering out irrelevant facts than those who have not enrolled in a clinic. They should also be more proficient in identifying relevant rules and rejecting irrelevant rules than their nonclinical counterparts. Additionally, in regard to inferences, based on their practical experience, students who participate in clinics should be able to draw more inferences from facts presented than those who do not enroll in a clinic. Especially in light of the findings in the initial study that third-year students, on average, do not outperform their second-year counterparts on a number of these factors,[50] this study attempts to examine whether clinical experience has any significant effect on the performance of final-year law students.

In this study, we also examined two issues which were not considered in the first study: (1) the ability of the subjects to identify client interests; and (2) their facility in developing a future course of action in a case.[51] Two of the key aspects of the problem-solving process in legal practice are the identification of client goals and the development of strategies to achieve them.[52] If the advocates for clinical education are correct, students who have participated in a clinic should have begun to develop schemas to consider goals and strategy development in a case. Accordingly, they should be more effective in identifying client interests and future courses of action than those who have not had a clinical experience.

B.    Methodology of Study

The methodology used in this study was similar to the one employed in the previous study of Hofstra law students at different stages in law school.[53] Like the previous study, this one sought to examine the development of reasoning skills as students progress in law school. In this study, at a different law school, we compared the skills of law students at the conclusion of their second year with graduating students. The current research also studied the effect of clinical legal education on these skills, comparing the reasoning of second- and third-year law students who have not participated in a live-client clinic with those who have enrolled in such a program.

1.    Subjects

The study was conducted with University of Chicago Law School students in April 2006. The research focused on three groups: (1) third-year law students with clinical experience at the law school’s Mandel Legal Aid Clinic; (2) third-year law students without clinical experience; and (3) second-year law students without clinical experience. The number of subjects in each group ranged from ten to thirteen.[54]

We solicited volunteers for the study by electronically sending all second- and third-year law students at the University of Chicago Law School a letter requesting volunteers for a study of “second-year law students who have not taken a clinic course; third-year law students who have taken a clinic course for the entire third year (Fall, Winter, and Spring Quarters); and third-year students who have not taken a clinic.”[55] We assured volunteers that the study was completely anonymous and offered $25 in compensation to each subject.[56]

2.    Stimulus Material

The fact pattern used in this study was based on an Illinois appellate case, Dobosz v. State Farm Fire & Casualty Co.[57] The problem states that the subject has just interviewed a new client, Ralph Kuzinsky, and is reviewing her notes in her office.[58] Mr. Kuzinsky purchased an “All-Risk” homeowner’s insurance policy from State Farm Insurance Co. and seeks legal advice because the company denied a claim for water damages to his home. When Kuzinsky first called State Farm to inquire about a policy, its agent promised to send a brochure showing exactly what the policy covered and recommended the “All-Risk” policy because it was the “Cadillac of the line” and would cover everything and insure against all risks.[59]

Mr. Kuzinsky brought a copy of the brochure to the interview.[60] The brochure presents information on three different homeowner policies: Basic, Broad, and All-Risk Special.[61] For each policy, the brochure contains captioned pictures describing examples of covered losses.[62] For the All-Risk policy, the brochure states that the coverage includes the protections of the other two policies “plus many others not specifically excluded.”[63] One of those additional coverages is “Water damage,” with a picture of an open window and rain accumulating on the floor.[64] In small print at the bottom of one page, the brochure provides, “This brochure contains only a general description of coverage and is not a statement of contract. All coverages are subject to the exclusions and conditions in the policy itself.”[65]

After examining the brochure, Kuzinsky called the agent and, believing that the All-Risk policy provided the coverage he needed, ordered that policy.[66] Although the agent claims the company mailed a copy of the policy to Kuzinsky, the client denies ever receiving it. He never requested a copy of the policy and renewed it the next year.[67]

Kuzinsky said that earlier that year water had leaked through the walls of his home’s basement, causing the sump pump to stop working and resulting in accumulated water and damages in the amount of $10,820.[68] Kuzinsky filed a claim with State Farm for the damage.[69] The company denied the claim, asserting that his policy explicitly excluded this type of water damage. The policy, which Kuzinsky brings to the interview, excludes the following types of water damage:

a. flood, surface water, waves, tidal water, overflow of a body of water, or spray from any of these, whether or not driven by wind;

b. water which backs up through sewers or drains, or

c. natural water below the surface of the ground, including water which exerts pressure on, or seeps or leaks through a building, sidewalk, driveway, foundation, swimming pool or other structure.[70]

At the interview, Kuzinsky says that he believes State Farm owes him the full amount of the claim, and he wants to know the range of his options.[71] The fact pattern concludes with a note that preliminary research shows no statutory or regulatory provision in state or federal law addressing the issues in the case.[72]

This case was selected as a problem for several reasons. First, the relatively simple facts of the case provided a fact pattern that could be digested by law students in a short period of time, making it feasible to study the participants’ understanding and recall of the facts. Second, the basic legal concepts in the case concerned issues that most subjects had likely encountered in their law school careers. The Dobosz court framed the issue as a common law contract question: whether an advertising brochure constituted part of the insurance contract and controlled over inconsistent language in the policy itself.[73] In considering this issue, the court examined whether the insured relied on the brochure and whether the brochure created an ambiguity in the contract terms.[74] Relying on basic contract construction rules, the court held that any ambiguity in an insurance policy, particularly with provisions limiting the insurer’s liability, should be construed in favor of the insured.[75] As an alternative legal theory, the court framed the issue as one of estoppel: an insurer may be estopped from relying on an exclusionary clause in the policy when brochures or solicitations misrepresent coverage.[76] Both of these theories concern issues that have been addressed in the first year of law school and are usually revisited in upper-level courses, such as Sales, Real Estate Transactions, and Remedies. And because the problem excluded any statutory or regulatory issues, no special doctrinal expertise was necessary for analysis of the fact pattern.

Third, while the facts of the case were fairly simple, the attached brochure created an opportunity for exploring the subjects’ skills in close reading of a document provided by a client. Not only did the description in the brochure of the “All-Risk” policy raise the issue of inconsistent contract terms, but the small print in the brochure warning that all coverages “are subject to the exclusions and conditions in the policy itself” suggests a possible State Farm defense to Kuzinsky’s reliance on the brochure. Unlike most law school exam questions, this fact pattern required the subjects to examine a document provided by a client and allowed for a more realistic simulation of the skills required in practice.

Finally, the fact pattern raised other issues that an attorney representing an actual client faces. For example, gaps and inconsistencies such as the mailing of the policy to Kuzinsky, his renewal of the policy even though he had not received a copy, or the precise basis for the damages, provided rich opportunities for identification of areas for fact investigation. Kuzinsky’s description of his background and beliefs also raised questions to explore about the client’s interests: that he is a young electrical engineer married to a high school teacher; that he purchased the policy because he believed it covered everything he needed; and that he feels that State Farm owes him the full amount of his claim. Kuzinsky’s query at the end of the interview “to know the range of his options” also provided the subjects an invitation to brainstorm possible courses of action in the case beyond legal research and analysis.[77]

3.    Interview Methods

This study was conducted using the same “semi-structured” interview methodology used in the previous study.[78] In each interview, the subjects were asked to verbalize their thoughts as they had them.[79] To encourage detailed descriptions of the reasoning process, the subjects were asked open-ended, probing questions throughout the interview.[80]

The interviews were conducted and taped by second- and third-year law student research assistants who were trained by a member of Patel’s team on the use of the semi-structured interview process. In conducting the interviews, the assistants followed a script.[81] The script informed the participants that the project was a short anonymous research study on the development of legal reasoning skills in law students.[82] The participants were assured that the problem was not a test of their abilities, but rather an attempt to determine how people think about legal problems. Further, the interviewers told the subjects that their responses would not be graded or revealed to their professors or anyone else but would be tape recorded to ensure accuracy. Finally, as in the previous study, the research assistants told each subject:

When answering the question please verbalize your thoughts as naturally as possible. Please do not explain or rationalize your thoughts but rather communicate them in a free flowing manner. The easiest way to do this is to go through your normal thought process but say everything aloud as if no one else were in the room.[83]

Before the participants were handed the actual fact pattern, they were given a sample LSAT problem for a test run. In answering this problem, the subjects were encouraged to verbalize their thoughts and discouraged from providing explanations for their reasoning. Then the research assistants gave the fact pattern to the subjects and told them:

You just interviewed a new client, Ralph Kuzinsky. This memo contains your notes from that interview. You are now reviewing your notes alone in your office. Please review this, either aloud or to yourself, and if you are thinking of anything as you review these material, please verbalize any thoughts you have as you read. This should mirror your normal thought process. It should be as natural as possible. Remember, just as in the previous exercise you only need to report what you are thinking without explaining why you think it. The information on the page is all the information available regarding this material.[84]

After the subjects indicated that they had completely reviewed the fact pattern, the research assistants took the problem from them and asked them to state the facts of the case. Then, the interviewers asked them for their assessment of the case and the basis for their assessments. Next, the subjects were asked what they should tell the client. And finally, the research assistants asked the subjects to identify the next steps to take in the case. After each of these questions, the interviewers probed the subjects’ responses, asking them if they had any other responses regarding the matter.

4.    Coding the Data

After the interviews, the tapes were transcribed.[85] These transcripts were then analyzed using the same “propositional analysis” method used in the previous study.[86] This technique involved segmenting the responses in each think-aloud interview by propositions (either clauses or sentence fragments reflecting a single thought). Two research assistants and I then coded each proposition for a number of categories: (1) recital of facts, both relevant and irrelevant, set forth in the fact pattern; (2) identification of rules, both relevant and irrelevant, used in the assessment of the case; (3) drawing of inferences from the facts set forth in the problem; (4) identification of client interests; and (5) the next steps the subjects identified should be taken in the case.
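For readers who want a concrete picture of this coding step, the following sketch shows one minimal way the propositional analysis could be represented in code. It is only an illustration: the Proposition structure, the category names, and the sample data are hypothetical stand-ins, not the actual coding instrument used in this study.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Category(Enum):
    # Coding categories described in the text
    FACT_RECALL = auto()      # recital of facts set forth in the fact pattern
    RULE = auto()             # legal rules used in assessing the case
    INFERENCE = auto()        # inferences drawn from the facts
    CLIENT_INTEREST = auto()  # client interests identified
    NEXT_STEP = auto()        # next steps to be taken in the case

@dataclass
class Proposition:
    subject_id: str      # anonymized subject identifier
    text: str            # clause or sentence fragment reflecting a single thought
    category: Category   # category assigned by a coder

# Hypothetical example: a few propositions segmented from one response and coded
propositions = [
    Proposition("S01", "the brochure pictured water damage as covered", Category.FACT_RECALL),
    Proposition("S01", "ambiguities in a policy are construed against the insurer", Category.RULE),
    Proposition("S01", "he probably never read the policy closely", Category.INFERENCE),
]
```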

Prior to coding the transcripts for each category, we developed protocols to attempt, as much as possible, to standardize the process.[87] Then, two research assistants and I reviewed each transcript, segmented into propositions, and independently coded them. Any identifying information indicating the subjects’ year in law school or clinic participation was kept separate from the transcripts or tapes so the coders were unable to determine the group to which they belonged. After all the codings were completed, all three of us met and reconciled all discrepancies.
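As a rough illustration of the reconciliation step, the sketch below compares two coders’ independent category assignments proposition by proposition and flags disagreements for joint review. It assumes the hypothetical Category values from the previous sketch; the study’s actual reconciliation was done by discussion among the coders, not by software.

```python
def flag_discrepancies(coder_a, coder_b):
    """Return indices of propositions where two coders assigned different categories.

    coder_a, coder_b: lists of Category values, one per proposition,
    in the same order for both coders.
    """
    assert len(coder_a) == len(coder_b), "both coders must code every proposition"
    return [i for i, (a, b) in enumerate(zip(coder_a, coder_b)) if a != b]

# Hypothetical usage: flagged propositions are then discussed and reconciled jointly.
# disagreements = flag_discrepancies(codings_by_coder1, codings_by_coder2)
# for i in disagreements:
#     print(f"Proposition {i}: coder 1 chose {codings_by_coder1[i]}, coder 2 chose {codings_by_coder2[i]}")
```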

C.    Analysis of Data

1.    Recall of Facts

As in our study of development of legal skills throughout students’ law school careers, we coded the transcriptions for recital of facts at the point in the interview when the research assistants took the problem from the subjects and asked them to state the facts of the case.[88] The purpose of this coding was to examine our first hypothesis that third-year law students who have participated in a clinic should be more proficient in recalling relevant facts in a case and filtering out irrelevant facts than those who have not enrolled in a clinic.

To code for factual relevancy, we scored each proposition set forth in the problem for its relevancy using the following scale: (1) most relevant to legal issues in case or client’s interests; (2) relevant to legal issues in case or client’s interests; (3) limited relevancy to legal issues in case or client’s interests; and (4) little, if any, relevancy to legal issues in case or client’s interests.[89] We based our scoring on the opinion in the Dobosz case.[90] We scored the essential facts relevant to the two legal theories (contract construction and estoppel) and the client’s interests in the case as “1.” Cumulative evidence as to these theories and interests, as well as facts relating to possible defenses, was scored as “2.” We scored all background facts and statements establishing an evidentiary foundation for relevant facts as “3.” All other statements were scored as “4.” Of the seventy-five propositions in the problem, fourteen were scored as having most relevance; six were scored as relevant; thirty-six were scored as background or foundation facts; and nineteen were scored as having little or no relevance.[91]
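To make the four-point relevancy scale concrete, here is a minimal sketch of how the scored propositions could be tallied. The counts match the breakdown just described, but the dictionary layout and labels are hypothetical illustrations rather than the study’s actual records.

```python
from collections import Counter

# Relevancy scale from the study: 1 = most relevant, 2 = relevant,
# 3 = limited relevancy (background/foundation), 4 = little or no relevancy.
RELEVANCY_LABELS = {1: "most relevant", 2: "relevant", 3: "limited relevancy", 4: "little/no relevancy"}

# Hypothetical list of scores, one per proposition in the fact pattern,
# constructed to match the reported breakdown (14 + 6 + 36 + 19 = 75).
scores = [1] * 14 + [2] * 6 + [3] * 36 + [4] * 19

tally = Counter(scores)
assert sum(tally.values()) == 75
for level in sorted(tally):
    print(f"{RELEVANCY_LABELS[level]}: {tally[level]} propositions")
```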

The summary of the data on the subjects’ recital of facts is set forth in Tables 1 to 3.

Table 1: Mean Number of Facts Recalled

Group                            Average Number of Facts Recalled
Group 1 (3L’s with Clinic)       13.5 (6.4)
Group 2 (3L’s without Clinic)    18.3 (6.2)
Group 3 (2L’s)                   15.1 (7.5)

*Standard deviation (SD) in parentheses

Table 2: Mean Number of Relevant Facts Recalled

Group                            Most Relevant   Relevant   Limited Relevance   Little/No Relevance
Group 1 (3L’s with Clinic)       3.2             1.9        7                   1.4
Group 2 (3L’s without Clinic)    2.9             2          11.1                2.3
Group 3 (2L’s)                   3.1             1.8        7.8                 2.4

Table 3: Of All Propositions Recalled, Mean Percentage of Relevant Propositions Identified

Group                            Most Relevant/Relevant Propositions
Group 1 (3L’s with Clinic)       38.07%
Group 2 (3L’s without Clinic)    26.78%
Group 3 (2L’s)                   32.53%

In regard to recall of facts, Table 1 shows that third-year subjects who had not participated in a clinic (Group 2) identified the highest average number of total facts (18.3), followed by second-year subjects with no clinic experience (Group 3) (15.1), followed by third-year subjects with Clinic (Group 1) (13.5). In terms of relevancy of total facts actually recited, however, Table 3 shows that third-year subjects who had taken a clinic recited the highest mean percentage of relevant facts (38.07 percent), followed by second-year subjects with no clinical experience (32.53 percent), followed by third-year subjects who had not participated in a clinic (26.78 percent).
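The group statistics in Tables 1 and 3 can be read as per-subject measures averaged within each group: a mean (and standard deviation) of the number of facts each subject recalled, and a mean of each subject’s percentage of recalled propositions scored most relevant or relevant. The sketch below shows one plausible way such statistics could be computed; the function name, the group of four subjects, and the numbers are hypothetical, not the study’s raw data.

```python
from statistics import mean, stdev

def group_stats(recalled_counts, relevant_counts):
    """Per-group summary: mean (SD) of facts recalled and mean percent relevant.

    recalled_counts: total propositions recalled by each subject in the group.
    relevant_counts: of those, how many were scored most relevant or relevant.
    """
    pct_relevant = [100.0 * r / t for r, t in zip(relevant_counts, recalled_counts) if t > 0]
    return {
        "mean_recalled": mean(recalled_counts),
        "sd_recalled": stdev(recalled_counts),
        "mean_pct_relevant": mean(pct_relevant),
    }

# Hypothetical example for one group of four subjects.
print(group_stats(recalled_counts=[12, 18, 10, 14], relevant_counts=[5, 4, 4, 6]))
```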

These data suggest that while third-year students who had no clinical experience recalled more facts on average than their clinical counterparts, more of the facts identified by those subjects without clinic experience were not relevant to the problem. Consistent with the first hypothesis, students with clinical experience seem to be more proficient than those without such training at filtering out irrelevant facts and focusing on the facts that were relevant. This is an especially significant finding because the problem in this study did not directly concern any of the subject matters handled by the Mandel Clinic.[92] Indeed, some of the clinical programs addressed issues in very different areas from contract law, such as police accountability and mental health. While it seems reasonable that students with a clinical experience in a particular area of practice would develop rudimentary schemas for handling problems in that subject, these findings suggest that they can use these schemas in different areas in which they have not practiced, but in which they have domain knowledge such as contracts law. Accordingly, perhaps clinical training encourages students to focus on facts relevant to doctrine of which they have knowledge.

The data do indicate, however, that subjects in all three groups did not pay close attention to the State Farm brochure attached to the problem. Of the four Most Relevant and Relevant facts reflected in the brochure, subjects in all three groups only identified one of those propositions: that one of the ten pictures describing the coverage of the All-Risk Policy was captioned “Water Damage.”[93] And that fact was only identified by three subjects in Group 1, three in Group 2, and two in Group 3. With only two exceptions, no subject in any of the three groups identified the other three Most Relevant and Relevant propositions in the brochure: (1) that the brochure provided that the “All-Risk Special Policy adds coverages from the other two policies plus many others not specifically excluded”; (2) that one of the pictures showing the coverage of the All-Risk Policy depicted an open window through which rain is falling and below which a puddle is formed; and (3) that the brochure contained a statement, “This brochure contains only a general description of coverage and is not a statement of contract. All coverages are subject to exclusions and conditions in the policy itself.”[94] This third proposition is especially important in terms of Kuzinsky’s purported reliance on the brochure and State Farm’s possible defense, which in fact was raised in the actual case.[95] One of the reasons for this lack of attention to the brochure’s details could be the subjects’ perceptions that they did not have time to review the document thoroughly. It could also mean that none of the subjects, even those on the verge of graduation, had been trained well in the skill of close reading of actual case documents. But it is especially surprising that the third-year subjects who had participated in a clinic, and who undoubtedly had the experience of reviewing documents for preparation of their cases, apparently did not take the time to closely read the evidence provided with the problem.

The tendency of third-year students without clinic to recite on average more facts of a problem, a higher percentage of which are irrelevant, appears to reflect what Patel calls the “intermediate effect.”[96] Patel has found that as novices gain more expertise, they often engage in a wide scope of information gathering without screening out irrelevant facts. “They simply process too much garbage.”[97] Only as they develop further expertise can they better distinguish between relevant and irrelevant facts. The findings in this study suggest that without the clinical experience of schema formation, third-year students are less likely on average to screen out irrelevancies.[98]

2.    Identification of Rules

We also coded the transcripts for identification of substantive rules. A rule was defined as any legal standard that the subject took into account in assessing the problem. Reviewing each proposition recited by the subjects during any portion of their discussion of the case, my research assistants and I identified every rule considered by each of the subjects.[99] After we reconciled these identifications, we scored the rules for relevancy. A rule was scored as relevant only if it concerned one of the theories identified by the Dobosz court—breach of contract or estoppel—or one of the elements for those theories.[100] The purpose of this coding was to test our second hypothesis: subjects who had clinical experience should be more proficient in identifying relevant rules and rejecting irrelevant rules than their nonclinical counterparts. As with fact recital, students who have participated in a clinic should have started to develop schemas for handling problems in practice which help them to focus on relevant rules and filter out irrelevant ones.
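As a concrete illustration of this scoring rule, the sketch below marks an identified rule as relevant only when it is tagged with one of the two Dobosz theories (breach of contract or estoppel) or an element of those theories. The tag names and example rules are hypothetical; in the study itself, relevance was a judgment made by the coders, not a keyword match.

```python
# Theories (and their elements) treated as relevant, per the Dobosz opinion.
RELEVANT_THEORIES = {"contract", "estoppel"}

def is_relevant_rule(theory_tags):
    """A rule counts as relevant if any coder-assigned theory tag is a relevant theory."""
    return any(tag in RELEVANT_THEORIES for tag in theory_tags)

# Hypothetical examples of coder-assigned tags for identified rules.
print(is_relevant_rule({"contract"}))         # ambiguity construed against the insurer -> True
print(is_relevant_rule({"estoppel"}))         # misleading brochure estops reliance on exclusion -> True
print(is_relevant_rule({"statutory fraud"}))  # consumer-fraud statute -> False (no statute applies)
```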


Tables 4 to 6 summarize the data on identification of rules.

Table 4: Mean Number of Rules Identified

Group                            Mean Number of Rules Identified
Group 1 (3L’s with Clinic)       6.77 (2.4)
Group 2 (3L’s without Clinic)    8.80 (2.4)
Group 3 (2L’s)                   6.09 (2.8)

*SD in parentheses

Table 5: Mean Number of Relevant Rules Identified

Group                            Mean Number of Relevant Rules Identified
Group 1 (3L’s with Clinic)       5.00 (2.9)
Group 2 (3L’s without Clinic)    5.90 (2.1)
Group 3 (2L’s)                   3.73 (2.4)

*SD in parentheses

Table 6: Percent Relevant Rules Identified

Group                            Relevant Rules Identified
Group 1 (3L’s with Clinic)       73.86%
Group 2 (3L’s without Clinic)    67.05%
Group 3 (2L’s)                   61.19%

Table 4 shows that third-year subjects without clinical experience identified, on average, the highest number of rules (8.80), followed by third-year subjects who had participated in a clinic (6.77), followed by second-year subjects (6.09). In regard to the relevancy of the rules identified, Table 5 demonstrates that third-year subjects who had not participated in a clinic again led the pack, identifying, on average, the largest number of relevant rules (5.90), followed by third-year subjects with clinical experience (5.00), followed by second-year subjects with no clinical experience (3.73). Finally, in regard to the mean percentage of identified rules that were relevant, Table 6 shows that 73.86 percent of the rules identified by third-year subjects with Clinic were relevant; 67.05 percent of the rules identified by third-year subjects without Clinic were relevant; and 61.19 percent of the rules identified by second-year subjects were relevant.

 

These data indicate that once law students reach third year, regardless of whether or not they enroll in a Clinic, they are better able to identify rules than second-year students, and the rules they identify are, on average, more relevant. This finding is similar to the one in our prior study of law student reasoning in which we found that as students progress in law school, they are able to identify more rules in relation to a legal problem.[101] The data in this study suggest, however, that third-year students without clinical experience are more likely to explicitly recite rules—both relevant and irrelevant—than their clinical counterparts. But, of all the rules identified, the percentage of relevant rules for third-year students who have not participated in a clinic is less than that for students with clinical experience. Perhaps this is another example of the “intermediate effect” for third-year students who have not enrolled in a clinic: they are more proficient in spotting issues in a case than their clinical counterparts but less successful in assessing their relevancy.[102]

These data call into question the second hypothesis—that students with clinical experience should be more proficient in identifying relevant rules and rejecting irrelevant rules than their nonclinical counterparts. While third-year students with clinical experience identified a higher percentage of relevant rules in terms of all rules identified than their nonclinical counterparts, third-year students without clinic, on average, identified a larger number of relevant rules than those who had participated in a clinic.

There are several possible reasons for these differences. First, clinic students, through their casework, may spend less time explicitly considering legal rules than those students who have not enrolled in a clinic. They may use the rules in practice but may not articulate them. Second, clinic students may be less concerned with legal theories than their non-clinic counterparts. Proponents of clinical education argue that it helps students learn to apply legal doctrine in practice.[103] Yet, the data raise questions as to whether clinics are in fact facilitating the application of legal knowledge in problem solving, or instead encouraging students to focus on other aspects of practice, such as addressing the client’s non-legal interests. Third, clinic students at client interviews may focus more on facts than rules and defer explicit legal analysis until after the interview. Finally, unlike third-year students with clinical experience, third-year students without clinic experience may have approached the problem as a law school exam question, in which issue spotting was the aim, rather than as a client problem to be solved. Students with a clinic experience may have treated the problem as an actual case for which rule identification was only one aspect. Further research would be helpful to explore this hypothesis and to assess the effect of clinical education on rule application.

3.    Drawing of Inferences

Our third hypothesis is that, because of their practice experience, students who have participated in a clinic should be able to draw more inferences from the facts presented than those who have not enrolled in a clinic. In our previous study on law-student reasoning, we found a substantial decline in inference drawing in third-year subjects as compared with those in their second year.[104] We conjectured that this decrease may have reflected greater attention to detail by the third year of law school, or an inhibition, fostered by law school classes, against assuming any facts not provided in a problem.[105] For the present study, we sought to examine whether or not practice experience in a clinic affected the inference-drawing process.

In coding for inferences, my research assistants and I defined an “inference” as the drawing of a conclusion from known facts based on premises known or assumed to be true.[106] If the premise was a legal rule, we did not code the proposition as an inference but as a rule recital. We were only looking for inferences drawn from the subjects’ experiences. For the coding process, my research assistants and I independently examined every proposition in the transcripts throughout the interview. We coded every instance in which a subject did not merely recite a fact in the problem but made an assumption about it. We then reconciled these codings.[107]


Table 7 summarizes the data for inference drawing.

Table 7: Mean Number of Inferences Drawn

Group                            Number of Inferences
Group 1 (3L’s with Clinic)       2.85 (2.2)
Group 2 (3L’s without Clinic)    3.70 (2.5)
Group 3 (2L’s)                   2.82 (1.5)

*SD in parentheses

 

As this table shows, third-year students without clinical experience drew, on average, the most inferences (3.70), followed by third-year clinical subjects (2.85), followed very closely by second-year subjects (2.82).

Apparently, like the third-year students in the prior study, students with clinical experience are more likely to stick to the facts.[108] Contrary to the third hypothesis, students who have participated in a clinic are less likely than their nonclinical counterparts to draw inferences.[109] This phenomenon could be a sign of cautiousness by clinical students in handling actual cases. Perhaps students with clinical experience wait until further fact investigation to begin the inference-drawing process. But these data could also indicate that the clinical experience may stifle the fact-brainstorming process. Indeed, in a related finding in regard to our examination of identification of next steps to take in the case, third-year students without clinical experience were more interested in investigating the facts of the case than their clinical counterparts.[110] Given the assumption of some proponents of clinical education that clinics facilitate the development of fact investigation skills,[111] these two findings are surprising.

4.    Identification of Client Interests

Our fourth hypothesis is that students who have enrolled in a clinic should be more proficient in identifying client interests than those who have not had a clinical experience. If clinical education teaches problem-solving skills, then students in clinical programs should have begun to develop schemas to consider client interests when considering a new case. To probe the subjects’ reasoning in this area, my research assistants asked the subjects what they should tell the client after the interview.[112]

To determine the range of codings for subjects’ recitations of clients’ interests, we reviewed the Dobosz decision and identified every client interest described by the court.[113] We then added the one interest set forth in the problem that was not included in the Court’s decision (“[RK] wants to know the range of his options”). Finally, from my own experience in practice, I brainstormed possible interests a client may have in this kind of case.[114] Possible interests included: damages, the full amount of the claim, information about the range of options, litigation costs, apology, revenge, change in State Farm advertising materials, relief for similarly-situated State Farm customers, peace of mind and quick resolution of claim.


Table 8 summarizes the data for identification of client interests.

Table 8: Identification of Client Interests[115]

Group                            Mean Number of Interests Identified
Group 1 (3L’s with Clinic)       1.77 (1.7)
Group 2 (3L’s without Clinic)    1.10 (1.1)
Group 3 (2L’s)                   1.64 (1.6)

*SD in parentheses

 

These data show that third-year subjects with clinical experience identified, on average, the highest number of different interests (1.77), followed by second-year subjects (1.64), followed by third-year subjects without Clinic (1.10).

The differences between the three groups are reflected in the following representative excerpts from the transcripts.[116] A third-year student with clinical experience, for example, stated:

Subject: Ok the total in damages is ten thousand dollars um $10,820 which he wants to have repaired. Um I would wonder I mean I haven’t had a lot of practical experience so I don’t know how quickly legal fees add up but it would seem to me that depending on how annoyed he was about the situation he would need to do a cost-benefit analysis to see what his priorities were. I think I would advise the client that um I mean depending on what my analysis of the sewers or drains um and whether or not I felt like after some research about a sump pump it could in fact be excluded from this policy and that we had some colorable claim that it could be I would suggest that we write a letter you know draft a letter and pursue options outside of trial and see if we could get them to agree to pay for some of it but then I would see how far he would want to go and how much legal fees he would want to incur based on that um just based on the cost of litigation. I have a natural bent towards mediation but I don’t know that this seems like it might be a good I don’t know I would want to talk to the client and see what his feelings were on this matter um if he was mostly besides wanting his money if he was very annoyed about the process and therefore really wanted to punish the other side or he just didn’t really mind and just wanted some money to pay for it.

 

Interviewer: Anything else you want to tell the client?

 

Subject: To read contracts and make sure you get a copy before he um especially before he renews them . . . um what else would I want to tell my client. It doesn’t seem like it’s a clear cut case so that’s why I would want to talk to him about what his priorities were cause I think that I mean without understanding what the law is in the jurisdiction that um he might it might be a bit of an uphill battle so yea that’s about it.

 

       In contrast, a third-year subject without clinical experience merely identified one interest: “Ralph just wanted to know any claims he might have or if he can pursue this any further.” And a second-year subject stated,

I’ll try to get his permission to negotiate a settlement of some sort um and try to get a range from him as to what would be appropriate in this case if it is the full amount like if he would take $9,995 or something right. And then I would discuss with him that if that doesn’t work I would discuss with him the potential for filing a claim and tell him what my fees are whatever they might be. That is where I would start though tell him to fix it up and um if I think can call the insurance company and see what they have to say about it so.

 

In regard to the most frequently identified interests, little difference existed between the groups. Subjects from all groups noted the damages State Farm owed Kuzinsky and litigation costs.[117] But in terms of the overall range of interests identified, subjects with clinical experience, as a group, identified more than subjects without clinical experience: nine interests, compared with six for second-year subjects and four for third-year subjects who had not participated in a clinic.[118] Third-year subjects with clinical experience were the only ones who identified the interests of relief for similarly-situated customers and a quick resolution of the claim.[119] In contrast, subjects without clinic experience were the only subjects who did not identify revenge, pride, a need for changes in State Farm’s practices and policies, and an assessment of what the case is worth.[120] Finally, no subject in any group identified an apology or peace of mind as an interest.

These data appear to support the fourth hypothesis. Subjects who participated in a clinic were more proficient, on average, than those without clinical experience in identifying client interests.[121] In fact, in contrast to second-year subjects, third-year subjects with no clinical experience were less proficient in identifying client interests.[122] Comparing these findings with those on rule identification, it appears that third-year students without clinic experience focused more on issue spotting than problem solving.[123] Perhaps they are at a point in law school where they just wanted to give the doctrinal answers and not consider the specific needs of the client.

These data also suggest some surprising findings for third-year students with clinical experience. Although clients of the Mandel Clinic do not pay fees, these students, on average, identified litigation costs as an interest more often than their nonclinical counterparts.[124] Perhaps, even without an actual experience with charging fees, students who have participated in a clinic are more sensitive to issues arising in actual practice, especially the needs of a real client. It is puzzling, however, that no student with clinical experience identified an apology or peace of mind as an interest.

5.    Identification of Next Steps

In their interviews, subjects were asked what steps they would take next in the case. Our hypothesis was that subjects who had participated in a clinic would identify more of these steps because of their experience in practice. We conjectured that through representation of actual clients these subjects should not only be focusing on the facts and legal rules in a case, but should be considering alternative courses of action to address their client’s interests. We coded identification of next steps by reviewing the entire transcript of each subject’s interview and noting all references to possible different courses of action: (1) fact investigation; (2) legal research; (3) alternative dispute resolution; and (4) client counseling.[125] We also divided each of these general areas into subcategories.
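For illustration, the next-steps coding frame can be pictured as a small taxonomy like the one sketched below. The four top-level categories come from the text; the subcategories and the counting function are hypothetical examples, not the study’s actual codebook.

```python
# Top-level categories are from the study; subcategories are illustrative only.
NEXT_STEP_CATEGORIES = {
    "fact investigation": ["inspect and document the damage", "interview the State Farm agent"],
    "legal research": ["research analogous cases", "general research on insurance contract law"],
    "alternative dispute resolution": ["negotiate with the insurer", "propose mediation"],
    "client counseling": ["explain the options", "discuss costs and priorities"],
}

def count_next_steps(coded_steps):
    """Count how many distinct coded next steps a subject identified."""
    return len(set(coded_steps))

# Hypothetical usage for one subject's transcript.
print(count_next_steps([
    ("legal research", "research analogous cases"),
    ("fact investigation", "inspect and document the damage"),
    ("client counseling", "discuss costs and priorities"),
]))
```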

Table 9 summarizes the data for identification of next steps.

Table 9: Mean Number of Next Steps Identified

Group                            Mean Number of Next Steps Identified
Group 1 (3L’s with Clinic)       7.15 (3.6)
Group 2 (3L’s without Clinic)    6.10 (2.5)
Group 3 (2L’s)                   6.64 (2.6)

*SD in parentheses

 

       These data show that on average, third-year students with clinical experience identified the highest number of next steps (7.15), followed by second-year students (6.64), followed by third-year students who had not participated in a clinic (6.10).

The differences between the responses of the subjects in the three groups can be quite stark.[126] A subject with clinical experience, for example, gave this response to the question about the next steps he/she would take:

My next step like I said would be to do some research, some case law research. Since it doesn’t seem like there’s any statutes or regs that are relevant. To see if there [are] any similar insurance cases in our jurisdiction or general contract cases in which, in which one of the parties signed the contract and then later claims to not have known the terms of the contract. And, specifically if it’s possible to look into whether State Farm has any sort of history of issues with this or whatever this agent does. Maybe talking to, asking Williams if I can talk to. Sorry asking Ralph if I can talk to you know the friend that recommended the agent to him. So basically researching the case law and possibly doing some factual investigation. I guess for the sake of preserving the evidence, I would want a more formal inspection of the damage and assessment of the damage. Maybe I’ll take pictures of the damage in the home before it gets fixed in case we do litigate. Basically I think I need to learn more about the, this area of the law and do that through research or through talking to colleagues.

 

In sharp contrast, a third-year subject who had not participated in a clinic answered the question:

 

Um personally I would probably do a little more research on the substantive area of the law. Uh it doesn’t seem like it is that complicated of a case as far as what the what the law would be and what the right answer would be. Um and so it is probably something that could easily be um the merits could easily be determined through legal research so I would do that and then advise the client as to what my findings are.

 

Interviewer: Anything you want to add at all?

 

Subject: No.

 

       And a second-year subject gave this response about next steps:

 

The next step that I would first want a copy of the policy and look at that um... I might try to do some research to find out how other cases had been resolved um depending on the situation. It might be a matter of calling the insurance company and sort of I mean just in my own experience um from purchasing insurance that companies routinely deny things because they don’t want to pay the money but if you fight with them a little bit sometimes they will capitulate and actually honor their insurance policy. So it might be a simple matter of calling the insurance company and saying I am the attorney for so and so why haven’t you paid this claim and sort of getting them to honor their obligation that way… After that I mean potentially pursue a case in court if it was worthwhile but I mean its really it’s a claim of $10,000 and it’s probably a significant amount to Ralph but it is not like worth litigating for months and month and months and a settlement is much more likely option and a resolution if it got to that point.

The most frequently identified next steps for third-year subjects with clinical experience were general legal research and research of analogous cases; for third-year subjects without clinical experience, fact investigation and research of analogous cases; and for second-year subjects, negotiation, counseling, and obtaining additional documents.[127] Third-year subjects with clinical experience were the only subjects to identify several next steps: mediation, sympathizing with the client, protecting the attorney-client privilege, and lobbying for changes in insurance regulations.[128] Their counterparts with no clinic training were the only subjects who identified contacting attorneys who had handled similar cases, speaking with the State Farm agent’s assistant, investigating the type of neighborhood in which Kuzinsky lived, and contacting the Better Business Bureau.

These data appear to support the hypothesis that students with clinical experience are more proficient in considering the next steps in a case than their nonclinical counterparts. Clinical education apparently trains students to consider more extensively the actual process of developing a strategy to resolve conflict. But surprisingly, the most frequently identified next steps of these subjects concerned the need for legal research. Even though these subjects identified fewer rules than their nonclinical counterparts,[129] they focused on legal research as the next step to take in the case. One possible explanation for this finding is that these students’ clinical experience has trained them not to rely on their own knowledge of legal doctrine but to treat every case as one that needs research. This is not, however, the type of strategic reasoning used by experts in problem solving.[130]

The data in regard to third-year subjects without clinical experience are also surprising. Consistent with our hypothesis, these subjects identified fewer next steps than their clinical counterparts.[131] Perhaps, as with our findings on rule identification,[132] these subjects viewed the problem as a law-school exam question and did not consider the ramifications of their answers for a client encountered in actual practice. But these were the only subjects who identified several areas for fact investigation, including contacting the agent’s assistant and the Better Business Bureau. As with inference drawing, these subjects appear to be more focused on fact analysis than subjects with clinical experience. Accordingly, these data leave unclear the extent to which clinical experience actually trains students to focus on fact investigation and analysis.

D.    Preliminary Conclusions on the Effect of Clinical Education on Student Problem Solving

The data collected in this study indicate that, at least in some respects, the advocates of clinical education are correct in their assertion that this pedagogy helps to train students for the problem-solving process. Our research suggests that subjects with clinical experience outpaced their nonclinical counterparts in identifying client interests and the next steps to take in the case.[133] It appears that a one-year clinical experience educates students to consider the needs of the client and to engage in strategic decision making. Third-year law students without clinical experience, by contrast, seemed to focus on the traditional law school task of issue spotting, to treat the problem as an abstract exercise, and to overlook the actual needs of the client.

In regard to other aspects of the problem-solving process, the findings are more mixed. As discussed previously, expert problem solving involves the ability to filter out irrelevant information and focus on relevant facts and rules.[134] With respect to fact recall, subjects with clinical experience exhibited some of this ability. While third-year subjects with clinical experience recalled fewer facts on average than their nonclinical counterparts, a higher proportion of the facts they recalled were relevant to the problem.[135] Subjects with clinical training, however, paid as little attention to the brochure attached to the problem as those without clinical experience. At least from this study, it appears that clinical experience has little effect on a student’s ability to focus on relevant facts presented in a document provided by a client.

In one other respect, the findings as to reasoning about facts are surprising. Third-year subjects who had not participated in a clinic appeared to focus more on the facts of the case—whether relevant or irrelevant. They also drew more inferences from the facts than their clinic counterparts, and, in identifying next steps to take, some of these subjects focused on areas of fact investigation not considered by subjects with clinical experience.[136] These findings could indicate that clinics train students to stick to the facts and not to jump to conclusions. But they also could suggest that participation in a clinic might not necessarily lead to proficiency in brainstorming about facts as part of the problem-solving process.

In terms of rule identification, while third-year students with clinical experience identified a higher percentage of relevant rules among all the rules they identified, nonclinical third-year subjects, on average, identified a larger number of relevant rules than those who had participated in a clinic.[137] It is unclear, therefore, whether clinics help students become more proficient at filtering out irrelevant rules and focusing on relevant ones. Given the finding that subjects with clinical experience, on average, identified legal research as a next step in the case more frequently than their nonclinical counterparts, it is possible that these subjects were using a novice approach to problem solving, treating every problem as one requiring legal research.

Some might argue that this measure for assessing problem-solving proficiency is wrong because the purpose of clinical education is not to train expert legal problem solvers but instead to educate new lawyers in problem-solving techniques which may eventually lead to expert problem solving.[138] Under this formulation, it might in fact be beneficial to train students in a clinic to research every legal issue which arises in a case. This argument, however, ignores the very benefit touted for clinical education. The value of this pedagogy, its proponents claim, is not merely that it teaches helpful techniques for handling problems but that it prepares students for problem solving in practice.[139] As I have argued elsewhere, explicit teaching of problem-solving techniques to clinic students does not necessarily prepare students for practice and might in the long run actually inhibit the development of the type of schemas needed for expert problem solving.[140] Obviously, in a one-year clinic, students are not going to become expert problem solvers. But if one of the purposes of clinical education is to train novice lawyers for problem solving in practice, it only makes sense to assess the effect of this pedagogy on students’ development of actual expertise, not simply their ability to use particular techniques in handling a client’s problem.

III.  Conclusion

Obviously, this study of clinical education is only an initial attempt to explore the effect of this pedagogy on student problem solving in practice. But the findings of this study, however preliminary, do suggest both benefits and limitations of clinical education in training students for such problem solving. These findings indicate that in some respects, students with clinical experience are more proficient in problem solving in actual practice. Unlike their nonclinical counterparts, they appear to be more adept at exploring client interests and identifying next steps to take in a case. They also seem to be better able to filter out irrelevant facts and focus on relevant propositions. Nonclinical students, however, appear to be more proficient at drawing inferences and identifying areas for fact investigation. Moreover, nonclinical students seem to have a better ability to identify legal rules applicable to a problem. Clinical students, rather than identifying the rules themselves, prefer to research the issues.

The methodology used in this research obviously has limitations. The sample size was small, the study reflects findings at only one clinical program, and no attempt was made to control for the academic ability of the subjects or the particular pedagogies of the faculty members in the different clinical programs. The research, however, was conducted with a methodology that has been used extensively in assessing medical education. The subjects attended one of the foremost law schools in the country, with a nationally recognized clinic committed for decades to the development of clinical pedagogy. Accordingly, the differences among the subjects’ abilities and the approaches of the particular clinical faculty members likely were minimal. Moreover, the selection of only students who had participated in a clinic for a full year gave us the opportunity to assess the effect on student problem solving of an intensive clinical program, rather than simply a one-semester course or externship.

In this context, the findings of this research, while far from conclusive, invite additional research to replicate and expand upon this study. Future research on the relationship between experiential education and problem-solving ability should consider the effects of: (1) clinical education at another institution; (2) one-semester clinical programs; (3) simulation courses; (4) first-year lawyering courses; (5) externship programs; (6) summer and part-time employment; (7) different clinical pedagogies;[141] and (8) student interest in skills courses.[142] With hard empirical data, a serious exploration can be made of the issues raised by studies such as the Carnegie Report and the Best Practices Report. Such research is more helpful than simple reliance on survey data from enthusiasts or anecdotes from teachers in the field.

The Carnegie Report notes in its conclusion that most medical schools have an office of medical education and encourages law schools to follow suit.[143] I join in this call and invite others to start engaging in empirical research of legal education, especially clinical education. Only in this way can we begin to determine how best to train students to become effective problem solvers.

APPENDIX “A”

 

[You just interviewed a new client, Ralph Kuzinsky. This memo contains your notes from that interview. You are now reviewing your notes alone in your office.]

 

CLIENT INTAKE MEMORANDUM

 

CLIENT: Ralph Kuzinsky

ADDRESS: 1251 Ridgewood

Andover

DATE OF INTERVIEW: April 27, 2006

LEGAL PROBLEM: State Farm Insurance Co.’s Refusal to Pay Claim on Homeowner Policy

 

Ralph Kuzinsky (“RK”) is a 35-year old electrical engineer who has been employed at Midwest Utilities since 2002. He lives with his wife Sylvia Vondrasek (a high school English teacher) at the above address. They moved to the area for his job at Midwest Utilities. They have no children.

RK and his wife purchased their home at the above address in July, 2004. Before closing on the house, he contacted State Farm Insurance Co. to purchase homeowners’ insurance. He called a State Farm agent, Glen Williams (recommended by a friend) to inquire about obtaining homeowners’ insurance. Williams told RK that it would take a long time to explain the policy itself, but indicated that he would send RK a brochure which would show exactly what the policy covered. Williams recommended the “All-Risk” policy and said it was the “Cadillac of the line” and that it would cover everything and insure against all risks.

RK showed me a copy of the attached brochure which he said he received from Williams. RK examined the brochure and, believing that the “All-Risk Policy” provided the coverage he needed, he called Williams and asked him to issue an “All-Risk Policy” for his house.

RK remembers receiving a transmittal sheet from Williams for the insurance in the mail a few weeks later, but denies ever receiving a copy of the policy. For that reason, he never read the policy. RK says he never called Williams to request a copy of the policy because he did not think insurance companies typically sent policies to purchasers. He renewed the coverage in 2005.

In early March of this year, water leaked through the walls of the basement and sump pump pit, causing the sump pump to stop and allowing water to accumulate in the basement. The total of the damages was $10,820. Immediately after the incident, RK called Williams. Williams told him that he had in fact sent RK a copy of the policy along with the transmittal letter. RK submitted a claim for the damages caused by the leak, but State Farm sent him a letter denying coverage asserting that the policy excluded that type of water damage. At that point, RK called Williams for a copy of the policy, and Williams sent a copy to him.

RK showed me the policy sent to him by Williams. It excludes loss from “Water Damage” which is defined as

a. flood, surface water, waves, tidal water, overflow of a body of water, or spray from any of these, whether or not driven by wind;

b. water which backs up through sewers or drains, or

c. natural water below the surface of the ground, including water which exerts pressure on, or seeps or leaks through a building, sidewalk, driveway, foundation, swimming pool or other structure.

RK believes State Farm owes him the full amount of the claim. He wants to know the range of his options.

[Preliminary research immediately after the interview shows that there is no statutory provision in state or federal law or Department of Insurance regulations addressing the issues in this case.]

 

 



        Professor of Law and Director Emeritus of Clinical Programs at Hofstra Law School. This study was conducted in consultation with educational psychologist Vimla L. Patel and one of her associates David Kaufman. Papers describing this study were presented at the University of Chicago Mandel Legal Aid Clinic’s 50th Anniversary Symposium, “Out of the Shadow: Clinical Legal Education” (Feb. 23, 2008) and the Sixth International Journal of Clinical Legal Education Conference, “Lighting the Fire: the Many Roles of Clinical Legal Education,” University College, Cork, Ireland (July 14, 2008), and I wish to thank the participants at those conferences, Professor Lauris Wren, and Shoshana Krieger for their comments and suggestions on this research. I especially wish to thank Mark Heyrman and the other clinical faculty at the University of Chicago Mandel Legal Aid Clinic for their help in conducting this study. I also wish to give my gratitude to my research assistants Michelle McGreal, Yonatan Zamir, Bernadette McGann, Ernest Baello III, and Richard Louis for their help preparing this article. Finally, I would like to thank Hofstra University for providing me with the research support that made this article possible.

         [1].     See Peter A. Joy & Robert R. Kuehn, Conflict of Interest and Competency Issues in Law Clinic Practice, 9 Clinical L. Rev. 493, 497 (2002).

         [2].     See, e.g., William M. Sullivan et al., Educating Lawyers: Preparation for the Profession of Law 115 (2007).

         [3].     See, e.g., Roy Stuckey et al., Best Practices for Legal Education 151 (2007); Margaret Martin Barry, Jon C. Dubin & Peter A. Joy, Clinical Education for this Millennium: The Third Wave, 7 Clinical L. Rev. 1, 41–44 (2000); Kate O'Neill, Adding an Alternative Dispute Resolution (ADR) Perspective to a Traditional Legal Writing Course, 50 Fla. L. Rev. 709 (1998); Franklin M. Schultz, Teaching "Lawyering" to First-Year Law Students: An Experiment in Constructing Legal Competence, 52 Wash. & Lee L. Rev. 1643 (1995).

         [4].     Sullivan et al., supra note 2, at 58.

         [5].     Stuckey et al., supra note 3, at 144.

         [6].     See generally Report of the Committee on the Future of the In-House Clinic, 42 J. Legal Educ. 511, 512–517 (1992) [hereinafter Report on the Future of the In-House Clinic] (identifying nine purposes of clinical legal education: (1) developing modes of planning and analysis for dealing with unstructured situations; (2) providing professional skills instruction; (3) teaching means of learning from experience; (4) instructing students in professional responsibility; (5) exposing students to the demands and methods of acting in role; (6) providing opportunities for collaborative learning; (7) imparting the obligation for service to indigent clients; (8) providing the opportunity for examining the impact of doctrine in real life and providing a laboratory in which students and faculty study particular areas of the law; and (9) critiquing the capacities and limitations of lawyers and the legal system).

         [7].     See, e.g., Sullivan et al., supra note 2, at 95–100; Anthony G. Amsterdam, Clinical Legal Education—A 21st Century Perspective, 34 J. Legal Educ. 612, 614–15 (1984); Report on the Future of the In-House Clinic, supra note 6, at 512.

         [8].     Amsterdam, supra note 7, at 613.

         [9].     Id. at 614–15.

      [10].     Report on the Future of the In-House Clinic, supra note 6, at 512.

      [11].     William P. Quigley, Introduction to Clinical Teaching for the New Clinical Law Professor: A View from the First Floor, 28 Akron L. Rev. 463, 475 (1995).

      [12].     Roy T. Stuckey, Preparing Students to Practice Law: A Global Problem in Need of Global Solutions, 43 S. Tex. L. Rev. 649, 670 (2002).

      [13].     Stuckey et al., supra note 3, at 170.

      [14].     Sullivan et al., supra note 2, at 95 (while purportedly quoting Roy Stuckey et al., Best Practices for Legal Education: A Vision and A Road Map 109 (2007), such a quotation is not contained in the published copy of that report).

      [15].     See James R.P. Ogloff et al., More Than “Learning to Think Like a Lawyer:” The Empirical Research of Legal Education, 34 Creighton L. Rev. 73, 184–86 (2000).

      [16].     See, e.g., Mark Neal Aaronson, We Ask You to Consider: Learning About Practical Judgment in Lawyering, 4 Clinical L. Rev. 247 (1998); Brook K. Baker, Learning to Fish, Fishing to Learn: Guided Participation in the Interpersonal Ecology of Practice, 6 Clinical L. Rev. 1 (1999); Katherine R. Kruse, Biting Off What They Can Chew: Strategies for Involving Students in Problem-Solving Beyond Individual Client Representation, 8 Clinical L. Rev. 405 (2002); Andrea M. Seieslstad, Community Building as a Means of Teaching Creative, Cooperative, and Complex Problem Solving in Clinical Legal Education, 8 Clinical L. Rev. 445 (2002). While one empirical study compared skills in client interviewing between students with and without clinical experience, it did not focus on the issue of the impact of clinical education on its principal pedagogical goal: enhancing students’ abilities to solve actual problems and learn from experience. See Donald L. Alderman et al., The Validity of Written Simulation Exercises for Assessing Clinical Skills in Legal Education, 41 Educ. and Psychol. Measurement 1115 (1981).

      [17].     Sullivan et al., supra note 2, at 58, 121, 125.

      [18].     Id. at 58.

      [19].     Id. at 121.

      [20].     Id. at 125 (affirming a conclusion made in the Best Practices for Legal Education Report).

      [21].     In its discussion of development of professional expertise, besides relying on informal surveys, the report also bases its recommendations on theories developed by philosopher Hubert Dreyfus and engineer Stuart Dreyfus. Id. at 116–18. Those theories, however, are hotly contested in the cognitive science community, see Vimla L. Patel, Steering Through the Murky Waters of Scientific Conflict: Situated and Symbolic Models of Clinical Cognition, 7 Artificial Intelligence in Med. 413 (1995), a point that is not acknowledged by the authors of the Report.

      [22].     See generally Richard K. Neumann, Jr. & Stefan H. Krieger, Empirical Inquiry Twenty-Five Years After The Lawyering Process, 10 Clinical L. Rev. 349, 375 (2003) (observing that for rigorous survey research, the hypothesis must be identified precisely, and rival hypotheses must be identified and tested as well).

      [23].     “The problems stimulate the students to acquire all pertinent knowledge, including the basic scientific information necessary to understand the underlying mechanisms of health and disease.” Vimla L. Patel & David R. Kaufman, Medical Education Isn’t Just About Solving Problems, Chron. Higher Educ., Feb. 2, 2001, at B12. For a fuller description of the differences between traditional and PBL medical curricula, see generally Stefan H. Krieger, Domain Knowledge and the Teaching of Creative Problem Solving, 11 Clinical L. Rev. 149, 178–79 (2004).

      [24].     Patel & Kaufman, supra note 23, at B12.

      [25].     Id.

      [26].     Stefan H. Krieger, The Development of Legal Reasoning Skills in Law Students: An Empirical Study, 56 J. Legal Educ. 332 (2006).

      [27].     See Mark P. Higgins & Mary P. Tully, Hospital Doctors and Their Schemas about Appropriate Prescribing, 39 Med. Educ. 184, 185 (2005).

      [28].     Id.

      [29].     Id. at 186.

      [30].     Patel & Kaufman, supra note 23.

      [31].     Krieger, supra note 26, at 336.

      [32].     Id. at 339; see infra text accompanying notes 79–80.

      [33].     Krieger, supra note 26, at 341–42.

      [34].     Id.

      [35].     Id. at 352.

      [36].     Id.

      [37].     Id. at 342–45.

      [38].     Id. at 345.

      [39].     Id. at 347.

      [40].     Id. at 349.

      [41].     Id. at 346.

      [42].     Id.

      [43].     Id.

      [44].     See id. (noting the questioning regimen in law school classes and exam experiences have ingrained in law students a notion that they should stick to the facts in a given problem and not rely on their own experiences).

      [45].     Vimla Patel et al., Reasoning Strategies and Use of Biomedical Knowledge by Students, 24 Med. Educ. 129, 132–35 (1990).

      [46].     Id. at 132–33; Vimla Patel et al., Biomedical Knowledge in Explanations of Clinical Problems by Medical Students, 22 Med. Educ. 398, 405 (1988).

      [47].     Krieger, supra note 26, at 353.

      [48].     Id.

      [49].     See supra note 12 and accompanying text.

      [50].     See generally Krieger, supra note 26.

      [51].     See generally id.

      [52].     Stefan H. Krieger & Richard K. Neumann, Jr., Essential Lawyering Skills 36–37 (3d ed. 2007).

      [53].     See Krieger, supra note 26, at 332.

      [54].     Group 1 contained thirteen subjects; Group 2 contained ten subjects; Group 3 contained eleven subjects. This variation in number of subjects was caused by the different number of volunteers from each group for the study. Given the nature of this study, these slight differences do not affect the validity or reliability of the findings.

      [55].     A copy of the solicitation memo is available at http://www.studentlegal reasoning.info/ucsolitication.html (last visited Dec. 13, 2008). The University of Chicago’s Institutional Review Board found that this study was exempt from its rules governing humans as research subjects because it concerned educational testing.

      [56].     Because we did not have a full complement of subjects for all three groups by the dates of the interviews, my research assistants and I also personally solicited a few volunteers in the student lounge and the library at the law school on the days of the interviews.

      [57].     458 N.E.2d 611 (Ill. App. Ct. 1983). The fact pattern presented to the students is contained in Appendix A. Some of the facts, having no bearing on the legal theories, were modified to provide an opportunity for students to explore the clients’ interests in the case and strategic options. Fictional information was given, for example, about the client’s marital and economic status and the amount of damages.

      [58].     See Appendix “A.”

      [59].     Id.

      [60].     The brochure included with the fact pattern was an exhibit included in the record in the Dobosz case, slightly revised to update information for a 2006 insurance policy. See Student Legal Reasoning, http://www.studentlegal reasoning.info/ucbrochure.html (last visited Nov. 12, 2008).

      [61].     Id.

      [62].     Id.

      [63].     Id.

      [64].     Id.

      [65].     Id.

      [66].     See Appendix “A.”

      [67].     Id.

      [68].     Id.

      [69].     Id.

      [70].     Id.

      [71].     Id.

      [72].     Id.

      [73].     Dobosz v. State Farm Fire & Cas. Co., 458 N.E.2d 611, 613 (Ill. App. Ct. 1983). Obviously, the legal issues examined by a court or parties in a case do not reflect the universe of possible legal theories which can be identified in regard to a particular fact pattern. In Dobosz, for example, further fact investigation might have produced evidence which would support a common law fraud claim. But an appellate opinion issued after briefing by the parties provides a good starting point for identifying the basic legal issues in the sample case used in the study.

      [74].     Id. at 614.

      [75].     Id.

      [76].     Id. at 615.

      [77].     In the prior study of development of legal reasoning skills in law students, the fact pattern concluded with the questions, “Based solely on these facts, does your client have any viable legal claim for damages? What is the basis for your answer?” Krieger, supra note 26, at 355. Reviewing the transcripts of the subjects’ responses, we found that a number of subjects, perhaps prompted by the questions at the end of the fact pattern, treated the fact pattern as a law school examination. For that reason, we modified the question at the end of the fact pattern to more realistically replicate the kind of question asked by a client at the end of an initial interview.

      [78].     Id. at 339–40.

      [79].     Id.

      [80].     For a complete description of the methodology and the research used in the study, see id. at 339–40.

      [81].     A copy of the script is included in the material posted on this study’s website. Student Legal Reasoning, http://www.studentlegalreasoning.info/ ucscript.html (last visited Nov. 12, 2008).

      [82].     Id.

      [83].     Id.

      [84].     Id.

      [85].     Copies of the transcripts with segmented propositions are available at http://www.studentlegalreasoning.info/uctranscripts.html (last visited Nov. 12, 2008).

      [86].     For a description of the benefits of using the propositional analysis method for examining the reasoning process, see Krieger, supra note 26, at 341.

      [87].     The coding protocols used in this study are available at http://www.studentlegalreasoning.info/ucprotocols.html (last visited Nov. 12, 2008).

      [88].     For related discussion in the previous study, see Krieger, supra note 26, at 341–42.

      [89].     The protocol used for fact recitation coding is posted on the study’s website available at http://www.studentlegalreasoning.info/ucprotocols.html (last visited Nov. 12, 2008). The scoring system we used was similar to that used in our previous research. Krieger, supra note 26, at 342–43.

      [90].     Dobosz, 458 N.E.2d 611 (Ill. App. Ct. 1983).

      [91].     A chart reflecting those codings is available at http://www.studentlegal reasoning.info/ucfactcoding.html (last visited Nov. 12, 2008).

      [92].     At the time of this study the Mandel Clinic had projects in the following areas: Appellate Advocacy, Civil Rights and Police Accountability, Criminal and Juvenile Justice, Employment Discrimination, Entrepreneurship, Children's Advocacy, Housing Development, and Mental Health. Five of the Group 1 subjects participated in the Civil Rights and Police Accountability Clinic, two in the Criminal and Juvenile Justice Clinic, four in the Mental Health Clinic, one in the Housing Development Clinic, and one in the Entrepreneurship Clinic.

      [93].     Student Legal Reasoning, http://www.studentlegalreasoning.info/ ucfactcoding.html (last visited Nov. 12, 2008).

      [94].     Id. One subject in Group 2 identified the picture of the window with the puddle beneath it, and a different Group 2 subject pointed to the language in the brochure stating that it was not a statement of contract. Id.

      [95].     Dobosz, 458 N.E.2d at 615–16.

      [96].     Krieger, supra note 26, at 349.

      [97].     Id.

      [98].     Surprisingly, as Table 3 reflects, second-year subjects identified a higher mean percentage of relevant propositions than third-year subjects with no clinical experience. As Table 1 demonstrates, they processed less garbage, recalling fewer facts than the Group 2 subjects, but, of the facts recalled, a higher mean percentage of them were relevant.

      [99].     A chart reflecting the rules identified by each subject is available at http://www.studentlegalreasoning.info/ucrulecoding.html (last visited Nov. 12, 2008). We initially distinguished between three categories: (1) “theories”—explicit identification of particular causes of action such as breach of contract and estoppel; (2) “rules”—explicit identification of an element supporting a cause of action; and (3) “evidence marshaling”—subject appears to apply a rule by using a term such as “argue” or “claim” in discussing evidence. After coding the data, we found it impossible to make any significant findings because of the overlap between the three categories, so we combined codings for all of them as a single “rules” category.

    [100].     This definition is very narrow, and it can be argued that experienced lawyers might be able to identify other possible causes of action for Kuzinsky from the facts in the problem. We used the court’s decision for assessing relevancy, however, because we needed a standard for examining the proficiency of the subjects in identifying relevant rules. It is reasonable to assume that theories and rules identified by the parties in the case and the court deciding the case are arguably the most relevant to the particular case. Our goal here was to examine the subjects’ ability to identify rules concerning the most relevant causes of action, not possible rules applicable to some less relevant hypothetical claim. Accordingly, by scoring a rule as irrelevant, we intended only to note that it was not applicable to breach of contract and estoppel theories.

    [101].     Krieger, supra note 26, at 347–49.

    [102].     Id. at 349.

    [103].     See Sullivan et al., supra note 2, at 82 (suggesting that experiential learning helps law students learn how to apply their legal knowledge in representation of actual clients).

    [104].     Krieger, supra note 26, at 345–46.

    [105].     Id. at 346.

    [106].     See David A. Binder & Paul Bergman, Fact Investigation: From Hypothesis to Proof 82 (1984).

    [107].     A chart reflecting these codings is available at http://www.studentlegal reasoning.info/ucinferencecoding.html (last visited Nov. 12, 2008). We also coded for reasonableness of the inferences based on independent assessments made by my research assistants and me. After unreasonable inferences were removed, the numbers were too small to draw any significant conclusions.

    [108].     Unlike the prior study, however, there was no substantial decline in inferences drawn between the second-year and third-year subjects. While the second-year subjects in the present study drew inferences, on average, at the same rate as third-year subjects with clinical experience, the combined third-year data shows an increase in inference drawing in the third year.

    [109].     See supra Table 7.

    [110].     See supra Part II.C.5.

    [111].     See, e.g., Amsterdam, supra note 7, at 614.

    [112].     Student Legal Reasoning, http://www.studentlegalreasoning.info/ ucscript.html (last visited Nov. 12, 2008).

    [113].     Dobosz, 458 N.E.2d 611 (Ill. App. Ct. 1983).

    [114].     The protocol for Client Interests codings is available at http://www.studentlegalreasoning.info/ucprotocols.html (last visited Nov. 12, 2008).

    [115].     A chart reflecting these codings is available at http://www.studentlegal reasoning.info/ucinterestscoding.html (last visited Nov. 12, 2008).

    [116].     See Student Legal Reasoning, http://www.studentlegalreasoning.info/ uctranscripts.html (last visited Nov. 12, 2008).

    [117].     Student Legal Reasoning, http://www.studentlegalreasoning.info/ ucinterestscoding.html (last visited Nov. 12, 2008).

    [118].     Id.

    [119].     Id.

    [120].     Id.

    [121].     See supra Table 8.

    [122].     See id.

    [123].     See supra Part II.C.2.

    [124].     Student Legal Reasoning, http://www.studentlegalreasoning.info/ ucp.html (last visited Nov. 12, 2008).

    [125].     The protocol for coding identification of next steps is available at http://www.studentlegalreasoning.info/ucprotocols.html (last visited Nov. 12, 2008). Under this protocol, we provided that the same proposition in an interview could be scored for both client counseling and client interests. If the subject merely recited a client interest, we only scored the proposition for that category. But, if the subject recited a possible client interest in regard to his/her further discussions with the client, we scored for both client interests and counseling.

    [126].     The transcripts for these subjects can be found on the study’s website. Student Legal Reasoning, http://www.studentlegalreasoning.info/uctranscripts. html (last visited Nov. 12, 2008).

    [127].     Id.

    [128].     Id.

    [129].     See supra Tables 4–6.

    [130].     See supra text accompanying notes 27–29.

    [131].     See supra Table 9.

    [132].     See supra Tables 4–6.

    [133].     See supra text accompanying notes 115–23 and 126–31.

    [134].     See supra text accompanying notes 27–29.

    [135].     See supra Tables 1 & 3.

    [136].     See supra Table 7 and text accompanying note 127.

    [137].     See supra Tables 5 & 6.

    [138].     See Mark N. Aaronson & Stefan H. Krieger, Teaching Problem-Solving Lawyering: An Exchange of Ideas, 11 Clinical L. Rev. 485, 491 (2005).

    [139].     Sullivan et al., supra note 2, at 95.

    [140].     Aaronson & Krieger, supra note 138, at 499.

    [141].     Proponents of clinical education as a method for teaching students how to problem solve in practice acknowledge that students do not learn this skill simply by experience in practice. See, e.g., Stuckey, et al., supra note 3, at 128 (citation omitted) (“[L]earning does not result only from experience: ‘Only experience that is reflected upon seriously will yield its measure of learning . . . . Our duty as educators is both to provide the experiential opportunity and . . . a framework for regularly analyzing the experience and forming new concepts.’”). While the clinical literature is replete on suggested methods for assisting students to learn from experience, little empirical research has been conducted to assess the most effective means for achieving that goal.

    [142].     While the academic ability of the subjects in all three groups in this study likely was approximately the same, their interest in skills courses was not. While all but one of the subjects in Group 1 (clinical students) had also participated in at least one other skills course, only one subject in Group 2 (nonclinical student) had taken such a course. These courses were Trial Practice: Strategy and Advocacy; Negotiation and Mediation; Pretrial Advocacy; and Entrepreneurship and the Law. Eight of the Group 1 subjects enrolled in more than one skills course. This contrast could suggest that the differences between the approaches of both groups to problem solving may have reflected to some degree the different learning styles or curricular interests of the subjects in each group as much as the effect of their clinical experience. Further research comparing clinical and nonclinical subjects with similar course preferences would be helpful.

    [143].     Sullivan et al., supra note 2, at 171.

