Law deans, faculty, and of course students obsess a great deal over the rankings put out annually by the US News and World Report. Some like the rankings, and some hate them. Some find them important, while others dismiss them. Some propose improvements, while others suggest alternatives. Some join anti-US News letter-writing campaigns or even try to organize anti-US News boycotts (notwithstanding that a concerted boycott of US News would seem to be an antitrust violation, given that horizontal group boycotts are per se violations of section 1 of the Sherman Act under the Supreme Court’s decisions in NYNEX and Klor’s).
But whatever one might think about the US News’s rankings, there can be no doubt that they evoke strong feelings, as attested to most recently by the many reactions in the legal blogosphere to this story on the rankings in last week’s Wall Street Journal. Because of the high level of interest in them, the rankings are a favorite (and possibly the single most frequently written-about) theme of law faculty blogging. Indeed, it almost seems as though a blogger who has yet to opine on the subject of rankings cannot be taken seriously. So, lest I be thought an unserious blogger, here is a suggestion for how the US News’s law school rankings might be improved or replaced that has largely, though not entirely, been overlooked. (After drafting this blog entry I did a Google “preemption check” and noticed that a recent comment on the Moneylaw blog makes a suggestion that is similar to mine, and a somewhat more extended treatment is offered by Andrew Morriss and Bill Henderson in a recent paper.)
The basic idea is this: why not use bar exam scores as a way to rank law schools? Schools could easily be ranked, as US News and others do with LSAT scores, based on the average, median, and/or quartile scores that their graduates obtain on the multistate portion of the bar exam (which has the advantage of being standardized across jurisdictions). US News already uses bar exam passage rates as one data point that factors into its rankings formula. But bar passage rates are a rather crude metric. Knowing how well a particular law school’s graduates did on the exam would be much more useful than knowing simply what percentage of them passed. The scoring data exist already but are kept under tight wraps by bar examiners, who share specific score data (as opposed to a simple pass/fail outcome) only with those test takers who fail the exam (and even then only in some jurisdictions). Why not open up these data more broadly and release them at least in the aggregate so that all can see how schools perform in relation to each other? And for that matter why not release specific scores to any test takers who wish to know how they did and who might wish to share their scores with prospective employers?
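To make the proposal concrete, the ranking mechanism is simple enough to sketch in a few lines of code. The scores below are invented for illustration (the MBE is scaled 0–200), and the choice to rank by median is just one of the options mentioned above:

```python
from statistics import mean, median, quantiles

# Hypothetical MBE scores (0-200 scale) for graduates of three invented schools.
scores = {
    "School A": [152, 161, 148, 170, 155, 166],
    "School B": [140, 158, 149, 151, 144, 163],
    "School C": [160, 172, 154, 168, 159, 175],
}

# Summarize each school by mean, median, and quartile cut points.
summary = {
    school: {
        "mean": round(mean(s), 1),
        "median": median(s),
        "quartiles": quantiles(s, n=4),  # 25th, 50th, 75th percentile cuts
    }
    for school, s in scores.items()
}

# Rank schools by median graduate score, highest first.
ranking = sorted(summary, key=lambda school: summary[school]["median"], reverse=True)
for rank, school in enumerate(ranking, start=1):
    print(rank, school, summary[school])
```

The same aggregate statistics could be published by bar examiners without releasing any individual test taker’s score, which is all the ranking itself requires.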
Bar exam scores would be useful data for prospective students and anyone else trying to assess student quality at particular law schools. Such score data would also show, much better than bar passage rates, how well prepared a school’s graduates are to sit for the bar (assuming that the use of bar prep courses is roughly equivalent across schools) and would help schools better understand how changes to their curricular offerings affect bar exam performance. And just as law admissions officers find LSAT scores useful in assessing applicant quality, no doubt legal employers (and therefore those seeking legal employment) would find bar exam scores similarly, if not even more, useful. Bar exam scores contain valuable information, and it does not seem very sensible to leave valuable information on the table instead of using it.
To be sure, there would likely be some non-rankings consequences associated with these proposed changes in the way that bar exam data are reported. Most obviously, many law schools would probably try harder to prepare their graduates for the bar exam (perhaps at the cost of other curricular objectives), and many law graduates would probably try harder not only to pass but also to do well (again, perhaps at the expense of other curricular or extracurricular objectives). But it’s hard to see why these incentive effects would be anything other than salutary. And if one were worried in particular about the second, “study harder” effect, then it could be avoided simply by retaining the practice of keeping individual scoring data under wraps even while releasing school scoring data in the aggregate.
So if law deans are serious about finding a way to improve or replace the US News’s rankings, then rather than write letters criticizing US News or attempt to organize boycotts of its data collection efforts, they ought to lobby bar examiners to make available to others the useful data that they hold.