Lower Bar Pass Rates in Some States Should Cause Us to Examine This Year’s Test, and the Bar Exam in General

Over the next few days, the California Bar will announce the outcomes from the July 2014 exam. Results in most other states have already been released, and well over a dozen states saw passage rates drop by a significant margin (five percentage points or more) compared to the July 2013 exam. One significant reason for the drop is a national decrease in scores that test takers received on the so-called Multistate Bar Exam (MBE), a 200-question multiple-choice exam that consumes one whole day of what is in most states a two-day bar exam ordeal. The Multistate exam thus counts for about half of the overall test for many test-takers (and some states use Multistate results to calibrate the grading of the one-day essay portion of the Bar—the other commonly used major component of the exam), so performance on the Multistate is undeniably important. (California is somewhat unusual in that it has a third full day in its bar exam—a so-called Performance component that simulates a real-world situation in which a lawyer might find herself and asks her to draft a relevant document, e.g., a client letter, a court brief, or a memo to a firm partner.) So when Erica Moeser, president of the organization that creates the Multistate exam (the National Conference of Bar Examiners), wrote a letter to law school deans last month highlighting and trying to explain the national drop in MBE scores, she triggered some pointed replies, especially from Nicholas Allard, the Dean of Brooklyn Law School (whose overall bar pass rate in July 2014 apparently was almost ten percentage points lower than in July 2013). In the space below, I identify and examine some of the narrower and broader questions raised by this year’s drop in Multistate performance.
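
To make those scoring mechanics concrete, here is a minimal, purely illustrative sketch in Python. The scores, the 50/50 weighting, and the simple mean-and-standard-deviation scaling rule are all hypothetical stand-ins (actual state formulas and weights vary); the point is only to show the two roles the MBE can play: half of a combined score, and the yardstick to which written scores are scaled.

```python
# Illustrative only: invented scores and a simplified scaling rule;
# actual jurisdictions use their own formulas and weights.
from statistics import mean, pstdev

# Hypothetical results for a handful of test-takers.
mbe_scaled = [152.0, 138.5, 145.0, 160.5, 131.0]   # MBE scaled scores
essay_raw  = [62.0, 55.0, 58.0, 70.0, 50.0]        # raw written/essay points

# One common approach (assumed here): linearly rescale the written scores so
# their mean and standard deviation match the MBE scaled scores for the same
# administration, then combine the two components 50/50.
m_mu, m_sd = mean(mbe_scaled), pstdev(mbe_scaled)
e_mu, e_sd = mean(essay_raw), pstdev(essay_raw)
essay_scaled = [(x - e_mu) / e_sd * m_sd + m_mu for x in essay_raw]

combined = [0.5 * m + 0.5 * e for m, e in zip(mbe_scaled, essay_scaled)]

for m, e, c in zip(mbe_scaled, essay_scaled, combined):
    print(f"MBE {m:6.1f}   written (scaled) {e:6.1f}   combined {c:6.1f}")
```

Under a rule like this, the MBE matters twice: it makes up half of the combined score, and the cohort’s MBE distribution fixes the scale onto which everyone’s written scores are mapped.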

Ms. Moeser’s Letter to Law Deans and Dean Allard’s Response

Various reports indicate that the national mean scaled score on this year’s Multistate exam was 141.5, a drop of 2.8 points from 144.3 in 2013. No one denies that this is a big drop; it appears to be the largest single-year decline since the Multistate exam has been given (going back a number of decades). And a mean of 141.5 seems to be the lowest national MBE average in a decade. The question, of course, is: What accounts for this drop? Here is what Ms. Moeser wrote to law school deans:

[T]he drop in test scores that we saw this past July has been a matter of concern to us. . . . While we always take quality control of MBE scoring very seriously, we redoubled our efforts to satisfy ourselves that no error occurred in scoring the examination or in equating the test with its predecessors. The results are correct.

Beyond checking and rechecking our equating, we have looked at other indicators to challenge the results. All point to the fact that the group that sat in July 2014 was less able than the group that sat in July 2013. In July 2013 we marked the highest number of MBE test-takers. This year [2014] the number of MBE test-takers fell by five percent [which would be expected since] first-year law school enrollment fell 7% between Fall 2010 (the 2013 graduating class) and Fall 2011 (the 2014 class). . . .

In closing, I can assure you that had we discovered an error in MBE scoring, we would have acknowledged it and corrected it.

Brooklyn Law Dean Allard found this letter both “defensive” and “offensive.” He criticized Ms. Moeser’s organization for not providing enough details about its internal test-validation processes and doubted whether the rest of the world can “have confidence in th[e] self-serving unaudited assertion” that the test was not flawed. For this reason, he said, her “statements ring hollow,” and she owed “a sincere apology to all the law graduates [she] disparaged and described as ‘less able’ without meaningful convincing evidence.” He added that, at least as far as Brooklyn Law is concerned, the “credentials [of the class graduating in 2014] were every bit as good as our 2013 graduates, if not better.” Other commentators have pointed out that Brooklyn’s median LSAT score was the same for both graduating classes.

Understanding This Year’s Results

There is no doubt that the term Ms. Moeser used—“less able”—is loaded, and perhaps explains some of the passion of Dean Allard’s response. What I assume Ms. Moeser meant to say is simply that the 2014 group did not perform as well on this test as the 2013 group. Whether they are “less able” in some broader sense is a question entirely separate from how they scored on this particular test, and one Ms. Moeser would not seem to be in a good position to answer.

For that reason, Ms. Moeser probably erred (putting aside her choice of language) in trying to offer any explanation for the lower performance; her diagnosis of a “less able” group of takers seems to be, at most, a (limited) diagnosis by (partial) exclusion. In other words, what she knows—or should be able to know—is confined to the fact that the MBE test that was given in 2014 was reliable as compared to prior-year tests. Even if this year’s test was no different in substance or administration, Ms. Moeser really has no way of accounting for the lower performance. Certainly her vague implication—that a decrease in the volume of law school applications and graduating students explained the lower scores—is open to question. Indeed, a seven-percent reduction in the number of starting law students in the fall of 2011 might suggest that law schools shrank in size rather than lowered their admissions standards. And the comparison of seven-percent fewer incoming students and five-percent fewer MBE takers wouldn’t, without more data, say much. So Ms. Moeser should have said no more than that the test has been examined and validated, and that we need to look elsewhere for an explanation.
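
To see why those two percentages, standing alone, are uninformative, consider a toy simulation (all numbers are hypothetical and invented for illustration): a shrinking applicant pool is consistent both with schools cutting class size while holding admissions standards steady and with schools holding class size while dipping deeper into the pool, and the two responses imply different things about the average strength of the eventual bar-taking cohort.

```python
# Illustrative only: hypothetical applicant pools and class sizes, not real
# admissions or MBE data. Compares two stylized responses to a smaller pool.
import random

random.seed(0)

# Hypothetical applicant "ability" scores (mean 100, sd 15) for two cycles;
# the 2011 pool is smaller than the 2010 pool.
pool_2010 = [random.gauss(100, 15) for _ in range(10_000)]
pool_2011 = [random.gauss(100, 15) for _ in range(8_500)]

def admit_top(pool, class_size):
    """Admit the strongest `class_size` applicants from the pool."""
    return sorted(pool, reverse=True)[:class_size]

baseline = admit_top(pool_2010, 4_000)              # fall 2010 class
shrink   = admit_top(pool_2011, int(4_000 * 0.93))  # cut class size ~7%
hold     = admit_top(pool_2011, 4_000)              # keep class size, dip deeper

for label, cls in [("2010 baseline", baseline),
                   ("2011, shrink the class", shrink),
                   ("2011, hold class size", hold)]:
    print(f"{label:24s} n={len(cls):5d} mean ability={sum(cls)/len(cls):6.1f}")
```

In the first 2011 scenario the admitted class is smaller but roughly as strong as before; in the second it is the same size but weaker. Knowing only that enrollment fell seven percent and MBE takers fell five percent does not tell us which of these (or what mix of them) actually happened.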

In this regard, Dean Allard is correct that the rest of us deserve to know more details about MBE’s “quality control” processes, to use Ms. Moeser’s term. It’s hard to see why more transparency about the internal test-validating data and techniques that the MBE-makers use would not be a good thing.

And what about the real explanation for the drop in performance this year? Dean Allard said his 2014 takers were just as strong academically as the previous class (and he added that the school devoted at least as many, if not more, resources to supporting bar-takers this year). Brooklyn is, of course, only one law school, and significant yearly fluctuations by a single school in its bar pass rates are not uncommon. Moreover, median LSAT scores (of the kind people have noted about Brooklyn) don’t tell us much; much more relevant might be the spread of LSAT performance across a law school’s entire entering class, especially the tail end of LSAT performers at the school. A number of other analysts over the past week have tried to analyze aggregate national LSAT data (broken down into small LSAT performance segments) of the Class of 2014 to see whether there is any overall difference between the LSAT scores of that class and those of the Class of 2013 that would explain a significant drop in MBE scores. And these commentators seem to have found none.
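
For readers curious what such an analysis looks like, here is a minimal sketch; the band counts below are invented placeholders (real figures would have to come from LSAC/ABA reports), and the point is simply that the comparison runs band by band over the whole distribution rather than stopping at the median.

```python
# Illustrative only: the counts below are invented, not actual LSAC/ABA data.
# Compares the share of each entering class falling in each LSAT band.

bands = ["<145", "145-149", "150-154", "155-159", "160-164", "165+"]
class_of_2013 = [3000, 6500, 10500, 12000, 9000, 6000]   # entered fall 2010 (hypothetical)
class_of_2014 = [3100, 6300, 10000, 11200, 8400, 5600]   # entered fall 2011 (hypothetical)

def shares(counts):
    """Convert raw counts to each band's share of the class."""
    total = sum(counts)
    return [c / total for c in counts]

for band, s13, s14 in zip(bands, shares(class_of_2013), shares(class_of_2014)):
    print(f"{band:>8}: class of 2013 {s13:6.1%}   class of 2014 {s14:6.1%}   diff {s14 - s13:+.1%}")
```

If the two share columns line up closely, band by band, then LSAT profiles alone cannot explain a sizable MBE drop, which is essentially what these commentators report finding.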

Even assuming these commentators have all the aggregate LSAT data they need and that their math is right, the LSAT score is, of course, just one measure of the academic strength of a group of law school students. Almost every law school looks (at the admissions stage) not just at LSAT scores, but also at college and graduate school grades, as well as a host of non-numerical indicia, such as letters of recommendation, personal statements, and the like. Perhaps law schools tried to keep their LSATs from dropping in the fall of 2011 (in part because median LSAT scores factor most heavily in the admissions category of the U.S. News rankings) and as a result enrolled a national class with lower GPAs? Or perhaps GPAs were unchanged, but the non-numerical indicia of the class entering in 2011 were less strong, as some of the best students steered away from law school and pursued other options? Or perhaps college GPAs have stayed constant only because of undergraduate grade inflation? Or perhaps law school professors are doing a poorer job of imparting the knowledge that the bar exam tests? (The last two explanations wouldn’t seem to account for a large single-year drop in bar exam performance insofar as they, if they are valid explanations at all, reflect gradual trends rather than sudden changes; the national admissions pool, by contrast, has varied more dramatically in each of the last several years.)

My point here is that we can’t possibly know what accounted for the drop until we have a lot more data. Some analysts have speculated that exam-taking software glitches in the essay portion of some state bar exams that occurred the day before the MBE was given might have stressed out test-takers in those states in a way that caused them to underperform on the MBE itself. Perhaps, but again, we need data from all the states in order to compare MBE performance where software glitches occurred to performance where there were no glitches. (The preliminary data seems mixed.)
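
When that data does arrive, the comparison itself is straightforward. Here is a minimal sketch with hypothetical state names and score changes (none of these figures are real), grouping year-over-year MBE changes by whether a state’s essay-day software glitched.

```python
# Illustrative only: hypothetical states and score changes, not real results.
from statistics import mean

# Hypothetical year-over-year change in each state's mean MBE score.
mbe_change = {"State A": -3.1, "State B": -2.4, "State C": -2.9,
              "State D": -2.6, "State E": -2.2, "State F": -3.0}

# States whose essay-day exam software glitched (hypothetical grouping).
glitch_states = {"State A", "State C", "State F"}

glitch_drops    = [d for s, d in mbe_change.items() if s in glitch_states]
no_glitch_drops = [d for s, d in mbe_change.items() if s not in glitch_states]

print(f"mean change, glitch states:    {mean(glitch_drops):+.2f}")
print(f"mean change, no-glitch states: {mean(no_glitch_drops):+.2f}")
```

A markedly larger average drop in the glitch group would lend some support to the stress hypothesis; roughly equal drops would not, and the preliminary reports described above appear mixed on this score.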

Using This Episode to Explore Bigger Questions

Apart from the need for greater transparency about the internal reliability processes of the National Conference of Bar Examiners, what is the big takeaway? For that, let me quote another passionate segment of Dean Allard’s letter:

There is a serious disconnect between education and the bar exam if qualified students who have invested enormous sums of money and dedicated their time and energy to successfully completing their 85-odd credits in compliance with ABA and state requirements, results in more than a small number not passing the exam. It is strange that after completing, at great expense, such intense studies, the first thing law graduates must do is pay a lot of money, once again, [for a commercial Bar-exam-prep course] to prepare for a test that will enable them to practice law. This defies common sense.

There is much to Dean Allard’s observations here. There was a time when law school graduates incurred relatively low debt and could get decent jobs even if they failed the bar. And many employers that need licensed lawyers used to be more generous in giving folks a second or third chance to pass. But given the high debt loads and the mixed (if improving) legal employment picture we see today, failing the bar is a much bigger setback than it used to be. We therefore need to ask some basic questions, such as whether a bar exam is a good or necessary way to regulate entry into the profession. (At least one state, Wisconsin, does not require passage of a bar exam for those people who have graduated from an ABA-approved law school in the state.) Even if some kind of licensing exam is necessary or proper, is the bar exam that we give nationally each year the right kind of exam?

I must confess that most of us who teach in and help administer the nation’s law schools don’t spend as much time and energy thinking about the bar exam and ways to reform it as perhaps we should. (I remember being asked to grade essay portions of the California Bar exam a few decades ago—which I wanted to do to see exactly how the grading criteria were formulated and implemented—but then being told that because I had recently started a full-time law-teaching position at the University of California, I was not eligible to be a grader—a rule that did not seem sensible to me.) But Dean Allard’s implication is undeniably powerful: just as many of us in legal education have been moved over the last decade to think a great deal more deeply about how best to educate lawyers (e.g., interdisciplinary courses, more skill-based and practical offerings), we in the legal academy need to focus more on how best to license them. So beyond asking whether current MBE tests are comparable to past MBE tests, we in the academy, on the bench, and at the bar should not defer to professional test-makers as much as we have in the past, but should be looking into precisely what the MBE and other bar exam components ask students to do, how exams are graded, and whether these screening tests are a good basis for deciding who should be permitted to practice law.

Posted in: Education, Law Practice

Tags: Legal

5 responses to “Lower Bar Pass Rates in Some States Should Cause Us to Examine This Year’s Test, and the Bar Exam in General”

  1. Kristina L. Diaz Cooper says:

    I am surprised by this news. I am a lawyer who passed the bar exam in Puerto Rico, and this February I will be taking the bar exam in NJ to get licensed to practice here. There was also a drop in the results from bar exam takers in Puerto Rico, but the MBE is not administered there. So the speculation about the glitch that happened before the MBE test does not apply to PR test takers. I agree completely with Professor Amar: “we in the academy…should be looking into precisely what the MBE and other bar exam components ask students to do, how exams are graded, and whether these screening tests are a good basis for deciding who should be permitted to practice law.”

  2. n900mixalot says:

    Perhaps she needs to consider a resignation. People have stepped down from much larger organizations, for far less.

    Is her logic button turned all of the way off? Is she really making that bold of an assumption? Since the first-year class was smaller in 2011, somehow that made them all “less able” to pass the bar?

    Wait … is she saying that far fewer people passed the July 2014 exam because fewer people TOOK it?! That right there is a 5th-grade math mistake if I’ve ever seen one.

    Oh goodness. Maybe I need to just read her full letter.

  3. AKLady says:

    America is graduating high school students who are unable to read and/or write above an elementary school level. What has been witnessed this year is a portent for the future — not just in the legal field.

    • n900mixalot says:

      Actually, no. It isn’t that simple. If you’d sat for the February and then the July 2014 exams, you’d be able to identify a striking difference in difficulty (and QUALITY) between the two MBEs. They were night and day. The July MBE had ACTUAL typos in it, but since we don’t see the exam questions with their correct answers, none of that can be proven … save for among the few who dare discuss the exam question particulars on message boards. Go ahead and check them out … a lot of very smart examinees were perplexed by a significant amount of what we were faced with in July.

      The NCBEX likes to play hide the ball. If they are going to be an entity that is so instrumental in licensing individuals, they need to start being more transparent, and stop giving favor to BarBri, and work with law schools so that the schools are able to accurately prepare students for an exam that has as little to do with being an actual lawyer as possible.

      • Michael2255 says:

        Teach to the test? Sigh.
        The bar exam is an endurance test to see how much BS you can take before you run out of the room screaming. If you manage to stay in the room to the end you should pass.