Additional Thoughts (and Concerns) About the Low Bar Pass Rates in California and Elsewhere in 2014

About a month ago I wrote an essay for this website commenting on the drop in bar passage rates in many states in the fall of 2014. I focused on the large national decrease in scores that test takers received on the so-called Multistate Bar Exam (MBE), a 190-question multiple-choice exam that accounts for much of the entire bar exam in most states, and on remarks made by Erica Moeser, who heads the organization that makes and scores the MBE (the National Conference of Bar Examiners, or NCBE), to the effect that this year’s takers were “less able.” Much has happened since I wrote that essay: on November 25, about 80 law deans (I should note my dean at UC Davis was not among them) joined in a letter to Ms. Moeser requesting that “a thorough investigation of the administration and scoring of the July bar exam” be conducted, and that “the methodology and results of the investigation . . . be made fully transparent to all law school deans and state bar examiners” so that there might be “independent expert review” of the exam’s “integrity and fairness”; on December 18, Ms. Moeser responded with a letter and an attached essay from NCBE’s quarterly magazine that provided additional analysis and data; and other states, including the largest state, California, have in recent weeks released details on bar passage within their jurisdictions. In the space below, I analyze some of these recent developments, with specific reference to what likely accounts for the large drop in MBE performance (and thus bar pass rates in many states) this year.

Ms. Moeser’s Letter Defends the NCBE Against Implicit Criticism by Law Deans

I begin with Ms. Moeser’s formal responses to the law deans. The tone of her letter suggests she feels a bit attacked by the deans (and her perception in this regard is probably understandable). She apologizes, sort of, for using the term “less able” in a way that might suggest anything other than the simple fact that the 2014 test takers did not do as well as did test takers the previous year. But even as she makes clear she did not intend to offend or distract by using that term, she seems to bristle at a term used by the deans in their letter: “integrity.” Ms. Moeser appears to understand the deans’ request for an examination of the “integrity and fairness of the July 2014” exam as questioning the honesty or professional qualifications of NCBE personnel. In reality, I suspect the deans used the word “integrity” in reference not to the personal or professional character of the test makers, but to the soundness of the July 2014 test itself. As we all try to get to the bottom of why test takers scored less well this year, it would be nice not to be overly burdened by linguistic sensitivities.

On the question of whether the 2014 exam was more difficult than usual, Ms. Moeser’s letter reassures deans that NCBE has “reviewed and re-reviewed” every “aspect of [its] methodology and execution[,]” and that the July 2014 test has been examined multiple times and by different, independent psychometricians to guarantee that it was no more difficult than the 2013 test or previous tests. Ms. Moeser makes clear, however, that “the results of our studies will not be revealed publicly [because] [o]ur systems are proprietary, and security is essential.” Her steadfast refusal to turn over specifics about NCBE’s “equating” process (used to ensure that difficulty remains constant across test administrations) may not sit well with some of the deans who want outside experts to be able to verify NCBE is comparing tests properly. I can certainly understand that the NCBE does not want to make public the actual text of the questions it has used (and might continue to use) in order to equate the difficulty of one test administration with another, but perhaps NCBE could share more details about the way equating questions are selected, and on the precise statistical inferences that it draws based on taker performance on these equating items. Maybe there is no middle ground, but I would not be surprised if some deans persist in seeking more technical detail.
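To make the idea of equating a bit more concrete for readers unfamiliar with it, here is a deliberately simplified sketch, in Python and with invented numbers, of how performance on shared “anchor” items can be used to separate a change in cohort ability from a change in test difficulty. NCBE’s actual (proprietary) procedure is far more sophisticated than this; the item counts and scores below are mine, chosen purely for illustration.

    # A deliberately simplified sketch of the logic behind common-item equating.
    # This is NOT NCBE's proprietary procedure; all numbers are hypothetical.
    from statistics import mean

    N_ANCHOR, N_TOTAL = 40, 190   # assumed item counts, for illustration only

    # Hypothetical raw scores on the shared anchor items and on each full form.
    anchor_2013 = [29, 31, 26, 30, 28]
    anchor_2014 = [27, 29, 24, 27, 26]
    total_2013 = [141, 150, 127, 146, 138]
    total_2014 = [130, 142, 116, 133, 126]

    # 1. A gap on the *identical* anchor items reflects cohort ability, not test
    #    difficulty, because those items did not change between administrations.
    cohort_gap_anchor = mean(anchor_2013) - mean(anchor_2014)

    # 2. Project that ability gap onto the full-length form.
    cohort_gap_full = cohort_gap_anchor * (N_TOTAL / N_ANCHOR)

    # 3. Whatever part of the full-form raw-score gap is left over is attributed
    #    to the new form being harder (or easier) and is removed in scaling.
    raw_gap = mean(total_2013) - mean(total_2014)
    difficulty_gap = raw_gap - cohort_gap_full

    print(f"Cohort-ability gap (full-form units): {cohort_gap_full:.1f}")
    print(f"Residual attributable to form difficulty: {difficulty_gap:.1f}")

In this toy example, nearly all of the drop in raw scores is traceable to the cohort’s weaker performance on the unchanged anchor items, which is essentially the claim Ms. Moeser is making about the July 2014 administration; the dispute between the deans and NCBE is over whether outside experts get to verify that kind of calculation.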

Ms. Moeser’s Essay Contains Some Unhelpful Explanations

The magazine essay Ms. Moeser attaches to her letter provides additional context, and also includes data about LSAT scores for students at the 25th percentile of LSAT performance for each ABA-approved law school for the classes that entered in the fall of 2010, 2011, 2012 and 2013. (Data for fall 2014 became available just a few weeks ago and were not included in Ms. Moeser’s essay.) On the one hand, a number of points Ms. Moeser makes in her essay do not seem particularly relevant to understanding the dramatic aggregate drop in MBE scores in 2014 from 2013. For example, she points out that the Law School Admission Council (LSAC) now asks schools to report the highest LSAT performance for each law student, rather than the average LSAT performance for each. This change made by LSAC might make it harder for a law school to compare and analyze its own bar exam performance over a long period of time, but since LSAC made this change before the class graduating in 2013 entered law school (in the fall of 2010), the reporting policy would not seem relevant to comparing the bar performance of the class graduating in 2013 and the one graduating in 2014 (the latter of which saw the dramatic drop in bar performance).

Ms. Moeser also observes that some law schools accept more transfer applicants these days (perhaps in part because this is a way to keep a school’s headcount and tuition dollars high without diluting the admissions credentials of the entering first-year class, since the characteristics of persons who transfer in as second-year students aren’t included in those credentials). This phenomenon undoubtedly exists at some schools, and it may complicate a particular school’s efforts to compare its current bar passage rates with those from an earlier era (when it didn’t accept as many transfers), but this modern increase in transfers can’t easily explain a national drop in MBE performance this year, since every student who transferred presumably would have taken the MBE whether s/he transferred or not. I suppose the transfer phenomenon might affect aggregate MBE performance if there were some “mismatch” effect (of the kind that Rick Sander has asserted, and that his critics reject, with respect to affirmative action) taking place when people transfer to schools for which they are not academically suited. But Ms. Moeser does not suggest this (or any other) theory for why an increase in transfers might affect aggregate bar performance, and I am unaware of any evidence of a transfer mismatch effect. Moreover, the number of transfers who graduated in 2014, while larger than in past years, wouldn’t seem big enough to move the aggregate bar performance numbers very much this year even if there were such an effect.

Ms. Moeser’s essay also posits that curricular changes in law school, ranging from an increase in ungraded externships and other experiential learning offerings, to fewer (or shorter) required black-letter courses, may be causing test takers to be less well-prepared for the bar exam. But any such curricular changes have been taking place gradually across the country, and unless there were some tipping point that was reached with respect to the class that graduated in 2014 compared to the class that graduated a year earlier, these changes would not likely contribute greatly to an abrupt and significant change in bar performance from one year to the next.

Ms. Moeser’s Essay Also Contains Some Probably Fruitful Explanations

On the other hand, Ms. Moeser does adduce facts that tend to support her contention that the July 2014 MBE was no more difficult than earlier tests. First, she says that 2014 test takers performed worse on the very “equating items drawn from previous July test administrations” than did students from past years. Assuming the equating items are reasonably well chosen, weaker performance on those identical items would be indicative of a group that would perform more poorly on the test generally.

Second, she points out that the July 2014 test takers also performed more poorly relative to prior law graduates on the Multistate Professional Responsibility Examination (MPRE)—which most graduating 2014 students took earlier in 2014 or in 2013. Ms. Moeser’s suggestion that recent score declines on the MPRE (which tests legal ethics in a multiple-choice format similar to the MBE’s) can be seen as precursors to the 2014 MBE decline is interesting, and may bolster her conclusion that the MBE was properly equated and scored—provided that the MPRE has itself been properly equated and scored and that the MPRE and the MBE exams test similar skills.

Third, and probably most powerfully, she describes how many law schools, even as they have reduced entering class size, have enrolled lower LSAT performers, perhaps most importantly at the 25th percentile of a law school’s entering class. In addition, she points out that we know nothing about matriculants “below the 25th percentile . . . ; the tail of the curve leaves a lot of mystery, as the credentials of candidates so situated . . . and the degree of change [from previous years] are unknown.” To be sure, this may be a group that at many law schools often struggles with bar passage, and a decline in 25th percentile LSAT performance (and within the bottom quartile of a school’s LSAT distribution) could explain lower bar pass rates at many schools.

If we look at the 25th percentile LSAT scores at all the nation’s ABA-approved law schools for the classes that entered in 2010 (and took the bar in 2013) and the classes that entered in 2011 (and took the bar in 2014), we see that, on average, 25th percentile LSAT scores slipped by about half an LSAT point. Perhaps more telling (because decreases in LSAT scores at the higher end of the range may matter less to bar passage), the number of law schools whose 25th percentile LSAT performance was in the bottom half of LSAT scores nationwide (an LSAT score of 151 or below) grew from 62 schools for the class entering in 2010 to 71 schools for the class entering in 2011. And, as Ms. Moeser points out, the (unobserved) drop-off within the bottom LSAT quartile at many schools may be more ominous still.
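For what it is worth, the two summary figures just mentioned are easy to recompute from the ABA’s publicly available admissions data. The sketch below (in Python, with a handful of invented school records rather than the real ABA figures) simply shows the form of the calculation: the average change at the 25th percentile between the 2010 and 2011 entering classes, and the count of schools at or below an LSAT of 151 in each year.

    # Invented records standing in for the ABA admissions disclosures; the real
    # analysis would cover all ABA-approved schools.
    lsat_25th = {
        # school: (25th percentile LSAT for class entering 2010, for class entering 2011)
        "School A": (158, 157),
        "School B": (151, 150),
        "School C": (162, 162),
        "School D": (149, 148),
        "School E": (152, 151),
    }

    changes = [y2011 - y2010 for (y2010, y2011) in lsat_25th.values()]
    avg_change = sum(changes) / len(changes)

    low_2010 = sum(1 for (y2010, _) in lsat_25th.values() if y2010 <= 151)
    low_2011 = sum(1 for (_, y2011) in lsat_25th.values() if y2011 <= 151)

    print(f"Average change at the 25th percentile: {avg_change:+.1f} LSAT points")
    print(f"Schools at or below 151: {low_2010} (entering 2010) -> {low_2011} (entering 2011)")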

Of course, as I said a month ago, weaker LSAT performance might be accompanied by higher college GPAs and other indicia of academic strength. And some schools suffering LSAT score drops might be shrinking in size quite dramatically, such that their effect on national bar pass rates might be lessened. So much more analysis is needed before the full picture is understood. But it appears that beginning with the class that entered law school in 2011, there has generally been some decrease in LSAT performance, and that such decrease may account for a good chunk (though likely not all) of this year’s lower bar performance.

What Preliminary Analysis of California’s Recently Released Data Suggests

The results released this week in California seem to be consistent with this account. Overall, it was a tough year for bar passage in the Golden State. One out of every three first-time takers from ABA-approved schools throughout the country failed the California bar exam. Among the particularly depressing facts is that first-time African American takers from ABA-approved law schools had a pass rate of only 42%. When we look at first-time takers from ABA-approved schools located in California (who often do better than takers from ABA-approved schools in other states), Latina/o takers suffered a big decline this year; whereas White and Asian first-time California ABA-school takers saw their pass rates drop by about five percentage points as compared to 2013, Latina/o takers saw their pass rate drop by more than ten percentage points, to just 59.5%. At least four well-established California schools—UC Hastings, University of San Francisco, Santa Clara and Southwestern—experienced first-time pass rates (of 68%, 61%, 60% and 54%, respectively) that were the lowest in 18 or more years. (The data I had went back only to 1997, so this year’s performance might well be the worst in more than 20 years for these schools.)

And there does seem to be a correlation between declines in the 25th percentile LSAT score and lower bar pass rates among the California schools this year. Eleven schools saw their 25th percentile LSAT score drop between the class that entered in 2010 and the class that entered in 2011, and nine of those schools saw their bar pass rates also drop. (One of the schools whose 25th percentile LSAT score went down but whose bar pass rate did not decline was USC, and its 25th percentile LSAT remained quite high—above 160—for the class entering in 2011.) The California school that saw the sharpest drop at the 25th percentile LSAT score in the fall of 2011, UC Hastings, suffered, as I noted above, its worst bar pass rate in nearly two decades. And among the three schools in California whose 25th percentile LSAT scores increased in fall 2011 compared to the year before, two (UC Davis and UC Berkeley—both of whose 25th percentile LSATs were above 160 in 2011) saw their bar pass rates increase a bit (UC Davis from 85% in 2013 to 86% in 2014, and UC Berkeley from 85% in 2013 to 88% this year). Only four schools statewide saw bar pass rates increase at all, and Berkeley’s increase of three percentage points was the largest.
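The school-by-school comparison in the preceding paragraph is essentially a simple cross-tabulation: did a school’s 25th percentile LSAT fall between the 2010 and 2011 entering classes, and did its first-time bar pass rate fall between 2013 and 2014? Here is a minimal sketch of that tally, again with invented numbers rather than the actual California figures.

    # Invented per-school changes (not the actual California data):
    # (change in 25th percentile LSAT from the 2010 to the 2011 entering class,
    #  change in first-time bar pass rate from 2013 to 2014, in percentage points)
    schools = {
        "School A": (-2, -8),
        "School B": (-1, -5),
        "School C": (-1, +1),
        "School D": (+1, +3),
        "School E": (0, -2),
    }

    lsat_dropped = {s for s, (d_lsat, _) in schools.items() if d_lsat < 0}
    pass_rate_dropped = {s for s, (_, d_pass) in schools.items() if d_pass < 0}
    both_dropped = lsat_dropped & pass_rate_dropped

    print(f"{len(lsat_dropped)} schools saw their 25th percentile LSAT drop; "
          f"{len(both_dropped)} of those also saw their bar pass rate drop.")

A tally like this shows association rather than causation, and with so few schools a formal correlation coefficient would not add much; the point is simply that the direction of the LSAT change and the direction of the bar-result change tend to line up.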

Obviously, as mentioned earlier, much more than a school’s 25th percentile or median LSAT score goes into its bar pass rate, and year-to-year variations in bar passage are unavoidable at each school, even if student academic quality remains constant. There is likely no single factor that explains all of this year’s bar performance decline. But Ms. Moeser’s suggestion that we delve deeply into the admissions and academic support functions of law schools if we want to raise pass rates (as long as we have to live with a questionable device like the bar exam) is well worth heeding. And incoming admissions numbers do not bode well for bar pass rates over the next few years. In California, for example, the four schools I mentioned whose bar pass rates are at twenty-first-century lows (UC Hastings, University of San Francisco, Santa Clara, and Southwestern) have all seen significant slippage at the 25th percentile in the three years since the fall of 2011. And nationally, the number of schools whose 25th percentile LSAT score is at or below the national median score of 151 grew again in the fall of 2012 (from 71 to 80), and yet again in the fall of 2013 (from 80 to 90), and likely grew again in 2014. Unless bar examiners across the country lower the threshold for passage (which in most states they insist they never do), or unless law schools find some new, highly effective academic success tools to help students do better on the bar—and find them very quickly—I fear that the difficult news about bar pass rates we experienced this fall will recur each year for the foreseeable future.
