California’s July 2016 Bar Results, and the Bar’s Apparent (and Wrong-Headed) Decision to Stop Providing School-by-School Data


California—typically the last state to finish grading its summer bar exams—recently announced results for the July 2016 test administration, and the news was not good. (One blogger used the word “carnage” in his headline describing the bar’s press announcement.) While we are still awaiting detailed statistics (more on that below), we do know that the pass rate in the Golden State for first-time takers who graduated from ABA-approved schools located in California (which is probably the single best metric to use to compare year-to-year results) dropped from 68.2 percent in July 2015 to around 62 percent in 2016. (I say “around” because the 62-percent figure was rounded off to the nearest whole number.) This year’s aggregate California results are depressing; the first-time California ABA-school pass rate dropped around 6 percentage points (or almost 10 percent in relative terms) from an already low 68.2 percent mark. For some perspective, the comparable figure was 76.9 percent in 2012 and 83.2 percent in 2008. That’s a drop of roughly 21 points in just eight years.
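For readers who want to check the “points versus percent” arithmetic, here is a quick back-of-the-envelope sketch using the published rates quoted above (the 2016 figure is rounded, so the outputs are approximate):

```python
# Back-of-the-envelope arithmetic on the California first-time, in-state
# ABA-school pass rates quoted above (the 2016 figure is rounded to ~62).
rates = {2008: 83.2, 2012: 76.9, 2015: 68.2, 2016: 62.0}

one_year_points = rates[2015] - rates[2016]        # percentage-point drop, 2015 -> 2016
one_year_relative = one_year_points / rates[2015]  # relative decline ("almost 10 percent")
eight_year_points = rates[2008] - rates[2016]      # percentage-point drop, 2008 -> 2016

print(f"2015 -> 2016: {one_year_points:.1f} points ({one_year_relative:.1%} relative)")
print(f"2008 -> 2016: {eight_year_points:.1f} points")
```

Run as written, this prints a one-year drop of about 6.2 points (roughly 9 percent in relative terms) and an eight-year drop of about 21.2 points, consistent with the figures in the paragraph above.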

California’s pass rate for ABA-school-graduate first-time takers is, year in and year out, the lowest of the major states, and usually by a wide margin. Many other large states (such as Illinois, Florida, Pennsylvania, and Georgia) also saw pass rates this year that were lower than 2015’s; in Illinois, for instance, the pass rate among graduates of Illinois-based ABA-approved schools fell from 80 percent in 2015 to 77 percent this year. By contrast, New York and Texas—two other very large jurisdictions—saw small increases in their bar pass rates this year. For example, New York’s pass rate among graduates of ABA-approved schools in New York ticked up to 82 percent this year, from 79 percent. (New York is often considered a “difficult” bar exam because its overall first-time pass rate—which includes takers who didn’t graduate from ABA-approved schools and who, in part for that reason, are unlikely to pass—is often low compared to other states, but the numbers I offered above, which focus on ABA-school graduates, illustrate that the California exam is much, much more difficult to pass.)

Overall, it seems—although I haven’t seen a good source aggregating national data—that this year more states (or at least more states of significant size) saw a pass-rate decline than enjoyed an uptick, and also that the declines in prominent states were larger than the increases in other notable jurisdictions. Over the summer, the National Conference of Bar Examiners—which designs large parts of the exams used in the states, including the so-called Multistate Bar Examination (MBE) used in nearly every state—reported that the average (or mean) MBE score inched up in July 2016 after a number of years of significant decline. But that doesn’t mean that the nation’s overall first-time bar pass rate (if one were to be calculated by combining all the numerators and denominators in all the states) necessarily increased this year (although it could have, since New York and Texas have big denominators), for at least two possible reasons. First, even if the average MBE score went up, the percentage of mediocre or low scores could also have gone up (resulting in lower passage rates) because the average might simply have been pulled up by better performance at the top; in other words, the top scorers may have passed by a wider margin, lifting the average, even if a smaller percentage of takers passed.
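To make that first point concrete, here is a small numerical sketch. The scores and the 135 cut line are invented purely for illustration—they are not actual MBE data—but they show how a mean can rise from one year to the next even as the share of takers clearing a fixed passing threshold falls:

```python
# Toy illustration (made-up scores, not real MBE data): the mean rises from
# "year 1" to "year 2" even though fewer takers clear the hypothetical cut.
cut = 135  # hypothetical passing threshold on the MBE scale

year_1 = [120, 136, 137, 138, 140]  # 4 of 5 at or above the cut
year_2 = [120, 125, 134, 150, 160]  # only 2 of 5 at or above the cut

for label, scores in (("year 1", year_1), ("year 2", year_2)):
    mean = sum(scores) / len(scores)
    share_passing = sum(s >= cut for s in scores) / len(scores)
    print(f"{label}: mean = {mean:.1f}, share at/above cut = {share_passing:.0%}")
```

In this toy example the mean climbs from 134.2 to 137.8 while the share at or above the cut falls from 80 percent to 40 percent—exactly the pattern described above, driven by stronger performance at the top of the distribution.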

Second, the MBE (which is a multiple-choice test) isn’t the only component of each state’s bar exam, and performance on the other components (which involve writing essays and other documents) might have slipped such that the overall percentage of bar passers could have gone down. (Although the non-MBE components seem to be “scaled”—or calibrated—to MBE performance in most if not all states, I haven’t seen enough information on the scaling process to know whether poor writing skills could significantly bring down overall performance by the test-taking pool even if the multiple-choice performance on the MBE crept up.)

When we move from national and state-by-state results to school-by-school outcomes, things get harder to discern. Anecdotally, it seems that more schools saw their pass rate (at least in the jurisdiction where a plurality of graduates took the bar) drop than saw their pass rate increase. To give one example, in Illinois, only two of the nine ABA-approved schools in the state saw an increase this year, although to be fair one other school, the University of Chicago, had such an exceptionally good pass rate in 2015 that there was nowhere to go but down. Unfortunately, not all states disclose detailed school-by-school data—and certainly not for out-of-state schools—so that a comprehensive look at school-by-school trends will have to wait until schools post their 2016 bar result data on their websites, which is not required under ABA accreditation standards until December 2017, one year from now.

California traditionally has been one of the leaders in the country in disclosing the most detailed school-by-school statistics about a month after initial results are reported (which means sometime in late December or early January for the July bar administration). But the California Bar no longer seems to be providing as much detailed information. For the February 2016 bar exam (whose results came out this summer), for the first time that I can recall (going back to the early 1990s at least), the California bar did not publish any school-by-school statistics, even though its first press release announcing February results promised to do so. The May 13, 2016, press release, presumably using stock language from prior years, represented that “more detailed statistics, including passing rates by individual law schools with 11 or more takers, will be made available in approximately four to six weeks and published on the State Bar’s website” (emphasis added). But when the more detailed statistics came out, school-by-school data was not included. Notably and disappointingly, the press release two weeks ago announcing the July 2016 results also refers to more detailed statistics slated to come out in a month or so, but fails even to mention school-by-school data.

Why would the California bar be publishing (or perhaps even compiling) less school-by-school data these days? It’s hard to say. Last year, California lawmakers did pass a statute, California Business and Professions Code § 6060.25, which sought to protect the privacy of people who sit for the bar exam and apply for bar membership. The statutory provision—which itself seems to have been a legislative reaction to information requests from UCLA law professor Richard Sander and others to obtain data about bar exam takers that would help answer the question of how race, LSAT scores, college GPAs, and other factors predict bar exam performance—provides that “[n]otwithstanding any other law, any identifying information submitted by an applicant . . . including, but not limited to, bar examination scores, law school grade point average (GPA), undergraduate GPA, Law School Admission Test scores, race or ethnicity, and any [other] information contained within the State Bar Admissions database . . . that may identify an individual applicant, shall be confidential and shall not be disclosed . . . ” (emphasis added). Putting aside whether release of race, LSAT, undergraduate GPA, and similar data could enable individual applicants and their bar performance to be identified (say, if the size of any particular cell—African-American men at Stanford taking the bar in 2012—were so small that snoopy folks might hazard plausible, if not always perfectly accurate, guesses about identity), it is hard for me to see how aggregate school-by-school data (without any racial or other narrowing information) could be used to identify individual applicants, any more than the (even more) aggregate data already released—i.e., the 62 percent pass rate among all first-time California ABA-school graduates—would. If this 62 percent statistic wouldn’t meaningfully identify any individuals, I don’t see how the bar publishing the fact that (and I’m making up numbers here, since we don’t have and may not get any actual data for a while) 230 of UCLA’s 270 first-time takers in July 2016 passed would lead to any significant prospect of individual identification either.

Perhaps the bar is acting out of an abundance of caution given § 6060.25, but it’s hard to see how any concerns about this provision are even reasonable given the statute’s clear and narrow focus on information “that may identify an individual applicant.” And in an era when prospective students want (and need) more current and finely grained data from each law school—and at a moment when the ABA is considering imposing more stringent bar pass requirements as conditions for accreditation—the California bar’s apparent decision to provide less, not more, information than it has been publishing for decades does not seem wise or even easy to defend. Let’s hope that the absence of school-specific statistics for the February 2016 exam was an aberration, and that when the more detailed statistics for July 2016 come out later this month or early next, school-by-school data is provided (at least for schools with a large enough number of takers to make any concerns about identifying individual takers baseless).
