How Prospective Law Students Can Make Better Use of the U.S. News Law School Rankings That Are About to Be Released

Over the next month or two, tens of thousands of admitted applicants will make decisions about which law schools to attend. One tool that many will no doubt use to guide their decisions is the annual U.S. News & World Report rankings, which will be released in a little over a week. Many analysts criticize the methodology (or various aspects of it) that U.S. News employs to rate law schools, and some doubt whether all the nation’s law schools could ever be meaningfully graded according to any single set of criteria. But, for the time being at least, U.S. News remains the most looked-at, and seemingly most influential, ranking system out there. For that reason, in the space below we offer—based on our collective experience in both evaluating other law schools and having our own law school evaluated—five pieces of advice for making the most sophisticated use of the rankings that U.S. News is poised to unveil.

#1. The Importance of Trends: Remember That Each Year’s Rankings Capture a Snapshot in Time

The rankings that are set for release on March 10 present a great deal of raw and processed information, but the data they contain—and the bottom-line rankings they assign—represent only a snapshot in time. Any sensible consumer of the rankings should look not just at one year’s result but at a longer track record, perhaps attaching more weight to a five-year average than to any single year’s numbers.
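To make the point concrete, here is a minimal sketch, in Python, of the “longer track record” idea; the rank sequence is invented purely for illustration.

```python
# A toy illustration of averaging ranks over several years rather than
# reacting to any single year. The rank sequence below is invented.
from statistics import mean

yearly_ranks = [23, 27, 21, 30, 24]  # hypothetical overall ranks, oldest first

print(f"Latest rank: {yearly_ranks[-1]}")
print(f"Five-year average: {mean(yearly_ranks):.1f}")
# The one-year move from 30 to 24 looks dramatic, but the five-year
# average (25.0) suggests the school's standing has barely changed.
```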

To be sure, sometimes there is, as to a particular law school or type of law school, a clear trend line—in particular components within the ranking or in the bottom-line performance—and it may be important to try to discern what accounts for any such consistent ascent or decline. More commonly, a school may bounce around somewhat because of short-term factors, such as a bad year in passing the bar and/or placing graduates in jobs, or an anomalous drop in application volume or quality due to some administrative gaffe or regional downturn. Such volatility is itself a basis on which the U.S. News rankings are often criticized—how much could a school’s overall quality really change within the space of a year?—but taking a somewhat longer view may partially address that criticism and make the bottom-line ratings more meaningful.

In looking at changes over time, it is important to realize that certain parts of the U.S. News evaluations very rarely move much from year to year. These include a school’s reputation rank among other law professors who are surveyed (which accounts for 25% of a school’s overall ranking) and its reputation rank among lawyers and judges who are polled (which accounts for 15% of the overall result). The relative quality (compared to other schools) of a school’s student body—as judged by median LSAT scores, college GPAs, and the school’s acceptance rate—has also tended, historically, not to change much in a single year, evolving instead much more gradually, though this factor has become a bit more volatile in recent years as the national decline in application volume has hit some schools harder than others. Other factors, such as the percentage of graduates who are placed in law-related jobs nine or ten months after graduation, bar pass rates, and dollars-per-student spent by a school (more on that later), have tended to fluctuate much more, and thus may account more for the year-to-year changes in bottom-line rankings.
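For readers who want to see mechanically how stable, heavily weighted components can anchor an overall score, here is a minimal Python sketch. Only the two reputation weights (25% and 15%) come from the discussion above; the remaining component names and weights are hypothetical placeholders, not U.S. News’s actual formula.

```python
# Sketch of a weighted composite score. Only the 25% (peer reputation)
# and 15% (lawyer/judge reputation) weights come from the article; the
# other components and their weights are hypothetical placeholders.
HYPOTHETICAL_WEIGHTS = {
    "peer_reputation": 0.25,          # stated above
    "lawyer_judge_reputation": 0.15,  # stated above
    "student_selectivity": 0.25,      # hypothetical
    "placement_and_bar": 0.20,        # hypothetical
    "faculty_resources": 0.15,        # hypothetical
}

def composite_score(components):
    """Weighted sum of component scores (each normalized to 0-1), on a 0-100 scale."""
    return 100 * sum(HYPOTHETICAL_WEIGHTS[k] * v for k, v in components.items())

steady_year = {"peer_reputation": 0.80, "lawyer_judge_reputation": 0.85,
               "student_selectivity": 0.75, "placement_and_bar": 0.90,
               "faculty_resources": 0.60}
bad_placement_year = dict(steady_year, placement_and_bar=0.70)

print(composite_score(steady_year))         # 78.5
print(composite_score(bad_placement_year))  # 74.5
# A one-year placement dip moves the total, while the 40% of weight tied
# up in the slow-moving reputation surveys barely budges.
```

The only point of the sketch is that volatile, lightly weighted inputs drive year-to-year swings, while the stable reputation components damp them.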

One might argue that the parts of the U.S. News survey that are more stable are more reliable and thus should be taken more seriously than the overall rankings. There is something to that, but even these stable components have been open to significant criticism. The response rate among lawyers and judges who are polled has often been quite low, and the integer-based scale (ranging from 1 to 5) on which law professors, lawyers, and judges are asked to place schools is not finely grained enough for respondents to draw the kind of nuanced distinctions that the U.S. News rankings purport to depict overall. Moreover, larger law schools, with more graduates, may have an easier time making a positive impression on judges and lawyers, simply because members of the bench and bar are more likely to encounter recent alums of schools that produce more graduates. (There are other, smaller aspects of the U.S. News methodology—such as student-faculty ratios—that may inadequately reward economies of scale and thus favor small schools.)

#2. The U.S. News Data Is Necessarily Limited in Scope

In addition to being limited in time, the data that U.S. News employs and presents every year is limited in scope. Among the data that it ignores is how diverse a law school’s faculty or student body is. We have argued (in an earlier series of online columns) that this information concerning racial/ethnic (and perhaps other kinds of) diversity ought to be incorporated into the rankings. Most law school faculty and administrators around the country believe that diversity within a school is a plus in a world where graduates are going to encounter and serve clients of various backgrounds. Yet U.S. News has declined to include a diversity component in its overall scoring (although it separately presents raw data as to racial diversity). As we have explained before, the main reason U.S. News has offered for not including diversity—that some schools are located in places where diversity is harder to accomplish—simply doesn’t wash. Some schools are located in places where there are fewer high-LSAT performers in the community, yet we still include median LSAT as a rankings input because we think a law school student body’s LSAT performance is a relevant characteristic. If, as the Supreme Court has held and as most people in academia believe, a diverse school is pedagogically better than a less diverse school, all other things being equal, then we should develop a way to have diversity count for at least something when we evaluate and rate schools. In the meantime, prospective students can find helpful diversity data on each school at the ABA “Standard 509” website.

#3. Distributions Within Each Law School Student Body

While we are talking about the makeup of each law school’s student body (which many prospective students would find an important factor, since law students often learn from, and are judged by the outside world by, the company they keep), we should point out that the data U.S. News weighs most heavily—median LSATs and GPAs—while relevant to an assessment of student-body academic strength, can mask important differences within each student body. Two law schools may have similar medians yet have very different LSAT scores and GPAs at the 75th and 25th percentiles. Compare, for example, using 2014 data, Northwestern and Cornell, both excellent law schools with undeniably strong student bodies. Northwestern’s LSAT median was 168, and its median college GPA was 3.75. Cornell’s were a bit lower on both—167 and 3.68. But Cornell’s 25th percentile LSAT and GPA were somewhat higher than Northwestern’s (166/3.55 compared to 162/3.53). How could the school with higher medians have lower numbers at the 25th percentile? There are several possible explanations. Northwestern may prefer applicants who have either a very high LSAT or a very high GPA (sometimes known as “splitters”), whereas Cornell may prefer people who score reasonably (but not quite as) high on both metrics. Perhaps Northwestern’s 25th percentile LSAT is lower because it has enrolled more students who have been out of college longer, for whom LSAT scores may matter less than real-world accomplishment. Or the explanation may be something else altogether.
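The median-versus-percentile point can be seen in a small numerical sketch. The two cohorts below are invented (loosely echoing the Northwestern/Cornell pattern above): their medians sit one point apart, but their 25th percentiles differ by nearly four points.

```python
# Toy data (not real admissions figures) showing how similar medians can
# coexist with very different 25th percentiles.
import statistics

cohort_a = [158, 160, 162, 163, 167, 168, 168, 169, 170, 171, 172, 173]  # long low tail
cohort_b = [165, 166, 166, 166, 167, 167, 167, 168, 168, 169, 170, 171]  # stronger floor

for name, scores in [("A", cohort_a), ("B", cohort_b)]:
    q1, median, q3 = statistics.quantiles(scores, n=4)  # 25th, 50th, 75th percentiles
    print(f"Cohort {name}: 25th={q1:.2f}  median={median:.1f}  75th={q3:.2f}")
# Cohort A: 25th=162.25  median=168.0  75th=170.75
# Cohort B: 25th=166.00  median=167.0  75th=168.75
```

Cohort A “wins” on the median that rankings reward, while Cohort B has the sturdier floor—exactly the distinction a medians-only comparison hides.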

We are not suggesting here that having medians that diverge from a school’s 25th percentile numbers is inherently problematic (although it might be problematic for law schools that, unlike Cornell and Northwestern, have many low LSAT performers and that may have low bar pass rates); instead, we are simply saying that when an applicant is looking at the student bodies of schools in which s/he is interested, it may make sense to look at more than medians. We note, in this regard, that the ABA “Standard 509 Report” website has a good tool that enables users to search and compare law schools along these axes.

We should add that if critics believe that U.S. News creates a perverse incentive for schools to admit “splitters” (perverse in the sense that pedagogical considerations would otherwise incline these schools to admit folks who present reasonably strong LSAT scores and GPAs instead), there might be ways to tweak the U.S. News student-quality formula, but any such changes could create other incentives or disincentives about which other observers might complain.

#4. Looking Behind the Employment Numbers

Two final observations warrant mention. First, one of the most volatile factors in U.S. News—and thus one of the most influential as to many schools’ rankings in a given year—is the percentage of graduates who have a full-time, long-term, law-related job ten months after graduation. Certainly a school’s ability to help place its graduates is an important factor in any decision about where to attend law school. But note, importantly, that the percentage employed in full-time, long-term, law-related jobs does not by itself convey any information about the particular types of jobs a school’s graduates are getting. U.S. News does not present or make use of salary data (although it did decades ago); it does not break jobs down by geography; and as of last year it did not even tell consumers how many jobs are funded by the graduate’s law school or home university. We should add that some law school- or university-funded jobs are quite meaningful and reasonably paid, whereas others are less so. In any event, here too the ABA provides much more finely grained data on job type and salary; applicants should consult the web page on which the ABA collects and presents the employment surveys for all ABA-approved schools.

#5. Follow (or at Least Examine) the Money

Finally, speaking of money, we should point out that there is one factor in U.S. News as to which the underlying data, and the use to which U.S. News puts it, are harder to see and thus harder to analyze: the so-called “faculty resources” component that looks at “average fiscal expenditures per student for instruction, library and supporting services.” This inscrutable factor accounts for about 10% of a school’s overall score and often determines where a school lands within a bunched-up grouping. For example, if one looked at all the other major U.S. News components—peer academic assessment, lawyer/judge assessment, median LSATs/GPAs, acceptance rates, placement rates, bar pass rates, student-faculty ratio, etc.—Yale should be tied with, or even slightly behind, Harvard. And yet Yale consistently beats Harvard for the top spot in the rankings by a non-trivial margin; last year it was four overall score points out of a possible 100. This difference seems to be accounted for largely by the fact that Yale spends more per student—although precisely how much more is hard to know—than any other school, by a significant margin.

Now, four points out of 100 in the U.S. News overall score may not seem like a lot, but given how bunched up schools are, four points can be a big deal. (Yale’s four-point lead over Harvard last year was four times larger than Harvard’s lead over #3 Stanford, and four points farther down the scale was all that separated #29 from #42.) And Yale’s perch atop U.S. News every single year for over two decades likely accounts for its yield among admitted applicants being qualitatively better than that of any other law school, essentially giving Yale first choice among the applicant pool. (Yale these days admits only around 250 people to enroll a class of 200, generating a yield of about 80%, compared to Harvard’s yield of about 60%, which is itself much higher than the yields of almost all other top law schools.)
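The yield arithmetic in the parenthetical above is simple enough to check directly. In the sketch below, the 250-admit/200-matriculant figures and the ~60% Harvard yield come from the text; the 200-seat class in the comparison is borrowed from Yale’s numbers purely for illustration.

```python
import math

def admit_yield(admitted, matriculated):
    """Yield = fraction of admitted applicants who enroll."""
    return matriculated / admitted

def admits_needed(class_size, expected_yield):
    """Admits required to fill a class at a given expected yield."""
    return math.ceil(class_size / expected_yield)

print(f"Yield on 250 admits / 200 matriculants: {admit_yield(250, 200):.0%}")  # 80%
# At a ~60% yield, filling the same 200-seat class takes many more admits:
print(f"Admits needed at 60% yield: {admits_needed(200, 0.60)}")  # 334
```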

To say that expenditures-per-student can have these important consequences for a school’s ranking and its yield is not to imply that this spending criterion is illegitimate. But one cannot help wondering: if the additional spending per student isn’t elevating placement rates, or lowering student-faculty ratios, or allowing a school to obtain a faculty that is seen by other law professors as superior to that of other schools (and all of these are already measured directly and counted by the ranking system), precisely why should the money matter so much? The U.S. News methodology may result in double-counting of many considerations, but dollars spent may be a particularly problematic example. Yet there may be responses: perhaps Yale’s resources don’t increase its placement rate but affect the kinds of jobs its graduates are able to obtain. For example, maybe its resources allow more students to undertake their own original research, which leads to more jobs in the academy. And so forth. Again, as with most other features of the U.S. News rankings that we’ve discussed above, our goal here is not so much to provide definitive answers as to cause students to think a bit more critically as they consume the bottom-line ordinal rankings for which U.S. News is best known.
