Bar Exam Scores as a Law School Ranking Metric

Law deans, faculty, and of course students obsess a great deal over the rankings put out annually by US News and World Report. Some like the rankings, and some hate them. Some find them important, while others dismiss them. Some propose improvements, while others suggest alternatives. Some join anti-US News letter-writing campaigns or even try to organize anti-US News boycotts (notwithstanding that a concerted boycott of US News would seem to be an antitrust violation, given that horizontal group boycotts are per se violations of section 1 of the Sherman Act under the Supreme Court’s decisions in NYNEX and Klor’s).

But whatever one might think about US News’s rankings, there can be no doubt that they evoke strong feelings, as attested to most recently by the many reactions in the legal blogosphere to this story on the rankings in last week’s Wall Street Journal. Because of the high level of interest in them, the rankings are a favorite (and possibly the most frequently written-about) theme of law faculty blogging. Indeed, it almost seems as though a blogger who has yet to opine on the subject cannot be taken seriously. So, lest I be thought an unserious blogger, here is a suggestion for how the US News law school rankings might be improved or replaced that has largely, though not entirely, been overlooked. (After drafting this blog entry I did a Google “preemption check” and noticed that a recent comment on the Moneylaw blog makes a suggestion that is similar to mine, and a somewhat more extended treatment is offered by Andrew Morriss and Bill Henderson in a recent paper.)

The basic idea is this: why not use bar exam scores as a way to rank law schools? Schools could easily be ranked, as US News and others do with LSAT scores, based on the average, median, and/or quartile scores that their graduates obtain on the multistate portion of the bar exam (which has the advantage of being standardized across jurisdictions). US News already uses bar exam passage rates as one data point that factors into its rankings formula. But bar passage rates are a rather crude metric. Knowing how well a particular law school’s graduates did on the exam would be much more useful than knowing simply what percentage of them passed. The scoring data exist already but are kept under tight wraps by bar examiners, who share specific score data (as opposed to a simple pass/fail outcome) only with those test takers who fail the exam (and even then only in some jurisdictions). Why not open up these data more broadly and release them at least in the aggregate so that all can see how schools perform in relation to each other? And for that matter, why not release specific scores to any test takers who wish to know how they did and who might wish to share their scores with prospective employers?
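
To make the mechanics concrete, here is a minimal sketch (in Python) of how such a ranking might be computed from aggregate score data. The school names and scaled scores below are invented purely for illustration; nothing here reflects real data or any jurisdiction’s actual reporting format.

```python
# Hypothetical illustration: ranking law schools by their graduates' MBE scaled scores.
# All school names and scores are invented for the sake of the example.
from statistics import mean, median, quantiles

# Each school's list holds the MBE scaled scores of its bar-taking graduates.
scores_by_school = {
    "School A": [152, 138, 160, 145, 149, 171, 133],
    "School B": [141, 136, 155, 129, 147, 139, 150],
    "School C": [163, 148, 157, 170, 144, 151, 166],
}

# Summarize each school: mean, median, and the 25th/75th percentile cut points.
summaries = []
for school, scores in scores_by_school.items():
    q1, _, q3 = quantiles(scores, n=4)  # quartile boundaries
    summaries.append((school, mean(scores), median(scores), q1, q3))

# Rank schools by median scaled score, highest first.
summaries.sort(key=lambda row: row[2], reverse=True)

print(f"{'School':<10} {'Mean':>6} {'Median':>7} {'Q1':>6} {'Q3':>6}")
for school, avg, med, q1, q3 in summaries:
    print(f"{school:<10} {avg:>6.1f} {med:>7.1f} {q1:>6.1f} {q3:>6.1f}")
```

Ranking by the median rather than the mean would keep a handful of outlier scores from swinging a school’s position, though any of these summary statistics could serve as the headline number.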

Bar exam scores would be useful data for prospective students and anyone else trying to assess student quality at particular law schools. Such score data would also show much better than bar passage rates how well prepared a school’s graduates are to sit for the bar (assuming that the use of bar prep courses is roughly equivalent across schools) and would help schools better understand how changes to their curricular offerings affect bar exam performance. And just as law admissions officers find LSAT scores useful in assessing applicant quality, no doubt legal employers (and therefore those seeking legal employment) would find bar exam scores similarly, if not even more, useful. Bar exam scores contain valuable information, and it does not seem very sensible to leave valuable information on the table instead of using it.

To be sure, there would likely be some non-rankings consequences associated with these proposed changes in the way that bar exam data are reported. Most obviously, many law schools would probably try harder to prepare their graduates for the bar exam (perhaps at the cost of other curricular objectives), and many law graduates would probably try harder not only to pass but also to do well (again, perhaps at the expense of other curricular or extracurricular objectives). But it’s hard to see why these incentive effects would be anything other than salutary. And if one were worried in particular about the second, “study harder” effect, then that could be avoided simply by retaining the practice of keeping individual scoring data under wraps even while releasing school scoring data in the aggregate.

So if law deans are serious about finding a way to improve or replace US News’s rankings, then rather than write letters criticizing US News or attempt to organize boycotts of its data collection efforts, they ought to lobby bar examiners to make available to others the useful data that they hold.

This Post Has 9 Comments

  1. Mike McChrystal

    MBE scores of graduates do measure a slice of a law school’s output and, as such, would likely be relevant information to some students as they are choosing a law school to attend. By the same token, the score that an individual achieves on the MBE would seem to be just about as telling about that individual as the aggregate scores of graduates would be for a law school. In other words, it seems roughly equivalent in usefulness to measure an individual’s quality as a lawyer by his or her MBE score as it would be to measure the quality of a law school’s graduates by their aggregate MBE scores.

    So should MBE scores become a standard credential in the legal services marketplace? Should prospective employers demand to see such data? Should prospective clients?

    As a one-time bar examiner (in Wisconsin, 1979-1986), I am not persuaded that the MBE measures much of what good lawyering entails, but it surely measures some of it. And so we are left with the problem of whether imperfect information should be withheld lest it be given excessive weight by persons who are not well informed.

    Perhaps there is a flaw in my analogizing an individual’s MBE score to the aggregate scores of a law school’s graduates. It strikes me, though, that in each case the information is a relevant but imperfect measure of performance, and we would presumptively favor the same policy of disclosure or non-disclosure in each case.

  2. Michael M. O'Hear

    I share the basic instinct in favor of openness with information, but I also have concerns about the likelihood (certainty, really) that law schools will increasingly “teach to the test” in a world in which bar exam scores are an important ranking metric. There are many aspects of the law school curriculum that are not tested on the MBE but that are nonetheless quite helpful in the formation of future lawyers and professionals, including (to name just a few) trial advocacy and other lawyering skills courses, upper-level doctrinal courses that prepare students to enter specialized practice areas, courses that focus on state-specific law, “law and” courses, and clinical instruction. I would think it unfortunate if law schools were to shift teaching and other resources away from such courses in order to require students to take more credits in the areas tested on the MBE.

  3. Keith Sharfman

    Thanks to both Michaels for these excellent comments.

    I agree with Mike McChrystal that bar exam scores may be an imperfect measure of school and student quality (just as the LSAT and law school transcripts are arguably imperfect too). But at the same time they are still good measures and, in my view, better than nothing. And of course consumers of the data will be free to disregard it if they wish. While I favor disclosure of the data both in the aggregate and to individuals, I recognize that some might be less inclined to permit disclosure in the case of individuals, and so to them I suggest an incompletely theorized agreement to disclose at least the aggregate data.

    Michael O’Hear’s point about “teaching to the test” is a fair one and is discussed in the literature by Morriss and Henderson, among others. I think the best response to this argument is that the disclosure of bar passage rates (and indeed the existence of the bar exam itself) already gives many schools an incentive to teach to the test. The only way to avoid the “teach to the test” phenomenon completely would be to abolish the bar exam entirely. A further point is that to the extent that certain untested curricular material is valuable for practice, employers will continue to seek students who receive this training, and students and law schools will therefore continue to have an incentive to make curricular choices in light of these market realities. A final point is that the bar exam ought to be modified to include other valuable subjects that currently are not being tested. There isn’t any problem with teaching to the test if the test is optimally designed!

  4. Eric Goldman

    I can’t resist pointing out the irony of this proposal coming from a school where the majority of students don’t even take the MBE! Eric.

  5. Gordon Hylton

    I suspect that MBE scores would pretty closely track LSAT scores. I have always felt that law school passes along lots of law-related information and certain habits of mind, but I don’t think that it does much to change the academic abilities of its students. At least not in the aggregate.

    Marquette would likely do well on this metric, since the MULS students who end up taking a bar exam are disproportionately drawn from the top of their classes.

  6. Eric Earley

    I just took the Bar. In my preparations, I took the BarBri course and the simulated MBE, and I was surprised to find out that I ranked in the top 1% in the nation among all simulated test takers.

    This is true even though I only scored a 151 on the LSAT (took it cold — did not know there were prep courses). Furthermore, I was only ranked in the top quarter of my class at my “4th-tier” law school (Cooley).

    We put student rankings and GPA on our resumes, but how can I compete with a Harvard grad, even though I’m smarter than he is?

    Releasing MBE rankings would level the playing field. As for “schools teaching to the exam,” so what? The exam is already adapted to test for what the examiners think is important. It’s not like law review and trial advocacy classes will disappear — I would still have done them because they are important in their own right (and as such they go onto my resume anyway).

    I think the real reason they don’t want to release the scores has a lot more to do with preserving the current power structure — and those who make the decisions probably come from high-ranking schools and stand to lose the most.

  7. John Doe

    There are pros and cons to releasing these bar results; however, I believe the cons outweigh the pros. The value of this data point (the bar score) is very minimal and not comparable to the other tests we have taken in our lives. The LSAT was created in order to compare students and to aid schools in deciding whom to admit. The LSAT tests a student’s skills, not knowledge. The bar exam, on the other hand, tests not only legal skills but also knowledge of the law. The purpose of the bar exam (in my mind) is to determine whether a person is competent to practice law in their state. It was not meant to be used as a metric for employers; it was not meant to rank one law student against another.

    I agree that bar results may be helpful to employers, but before scores may be released, the test must be rewritten to serve the purpose of ranking law students instead of testing competency. Scoring the test must also be made much more accurate (more than 3 graders per PT or essay), and students must be informed that the results will be released.

    Although the conspiracy theory of preserving the current power structure may be true, I think the decision not to release the bar scores comes from the state bar’s understanding that the test is not perfect, the results do not accurately reflect one’s aptitude, and the potential consequences outweigh the benefit of releasing this low-value data point to the world.

  8. Marie Doe

    First, it is important to realize that we are not talking about the general “bar results.” Instead, we are talking about the MBE (Multistate Bar Examination). There is a big difference between the general bar exam in each state and the MBE: the first differs from state to state and includes subjective essay grading, which simply can’t be compared between states; the second is like the LSAT or the SAT — it is a standardized test given throughout the United States on the same day.

    Because the MBE is a standardized test given virtually everywhere at the same time, it creates an even playing field on which all test takers can be (and are) ranked. Furthermore, the test graders go to great lengths to “scale the score” so that it is comparable from year to year.

    Second, this test also tests more than just minimal competency: there are many long-term lawyers who retake the test to practice in a neighboring state. None of them scores a perfect score no matter how proficient they are (even law school professors). In fact, those I have spoken with typically receive a rude awakening — they do well in their area of practice but fail the other sections. Therefore, scoring the typical 135 is difficult for a practicing attorney unless she takes a review course (like BarBri).

    Last, since the test includes easy, medium, and hard questions in the six basic subjects tested, you can fail a couple of sections and still pass the test if you are exceptional in the others. But if you receive a truly outstanding score (a 165 or better, based on the curve), then you are significantly more well-rounded than the other test takers who skirt by with a 135. It means you answered more of the hard questions correctly in multiple areas. And being in the top 5% or 10% of test takers on such a comprehensive standardized test should mean at least as much to employers as your grades and law school rank.

    And if this test is good for ranking test takers, then why not for ranking the law schools that helped them become so well-rounded in the first place?

  9. Eric Earley

    It’s been ten years since my comments above. I googled myself, saw this, and reread what I wrote ten years ago, and I just thought I would reiterate that I feel the same way. As it turns out, my MBE scaled score was 168.5, not bad.

    The only thing I would add to what I said is this: it’s not the school, it’s the person — unless there is a trend; then it’s the school. Rank the schools with the test results! They should teach to the test; that’s why we go to school — to pass the bar. Yes, we want to be well-rounded attorneys, but teaching to the test will accomplish it, because those are the skills being tested!
