The November 2012 issue of the Journal of Legal Education carries an article by Theodore Seto (Loyola Law School) entitled “Where Do Partners Come From?” The article opens with the scenario of a law school applicant whose “long-term ambition is to become a partner in a national law firm in a certain city,” and suggests that the article will be helpful for the applicant to decide “which schools may increase [the applicant's] chances of realizing that ambition” (p. 242).
Professor Seto gathered a prodigious amount of data on 26,973 partners at the nation's largest firms, which is an impressive and valuable effort. Using this data, he conducted an analysis that poses provocative challenges to the received understanding about hiring prospects from various law schools. The study concludes that for big firm jobs on a national basis, Loyola Law School ranks higher than Notre Dame, USC, Iowa, Washington & Lee, William & Mary, Washington University, Emory, Minnesota, Wisconsin, North Carolina, and over 100 others. As Professor Seto himself remarks, “[r]eaders may find some of these numbers surprising.” Indeed they would! But before our hypothetical prospective applicant rushes to choose Loyola Law School over Vanderbilt as Professor Seto suggests in an anecdote (pp. 242-243), there are a few things he or she might want to know.
The new Theodore P. Seto ranking of law schools is in large measure a reincarnation of the notorious Thomas M. Cooley Law School ranking of law schools. You might remember this ranking, in which the Thomas M. Cooley Law School suspiciously comes in second (just behind Harvard). In Seto's ranking, Loyola Law School achieves the lofty rank of 25, just slightly above its rank of 26 in Thomas M. Cooley's data. Is this the result of the extreme precision of two independent rankings? No, unfortunately it is a largely predetermined similarity based on a common underlying flaw. In fact, a simple linear regression of Seto's ranking on the US News and Thomas M. Cooley rankings revealed that Cooley's rankings were a better predictor of Seto's rankings than were those of US News. Below is a plot of Seto's differences from US News versus Cooley's differences from US News for the schools all three rankings put in the top 125. This is about as strong an empirical linear relationship as one sees in data, and remember that this is after controlling for similarities based on US News.
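For readers who want to see the mechanics, here is a minimal sketch in Python of the kind of regression described above: regressing one set of school ranks on two others and comparing the coefficients. The rank numbers below are made up for illustration; they are not Professor Seto's actual data.

```python
import numpy as np

# Hypothetical rank data for five schools (NOT the actual rankings):
# one row per school, in the same order for all three rankings.
us_news = np.array([10, 25, 40, 60, 90], dtype=float)
cooley = np.array([30, 12, 35, 20, 50], dtype=float)
seto = np.array([28, 15, 36, 25, 55], dtype=float)

# Ordinary least squares: seto ~ intercept + us_news + cooley.
X = np.column_stack([np.ones_like(us_news), us_news, cooley])
coefs, *_ = np.linalg.lstsq(X, seto, rcond=None)
intercept, b_usnews, b_cooley = coefs

# Comparing the two slope coefficients (on comparably scaled rank data)
# shows which ranking carries more weight in predicting the third.
print(f"US News coefficient: {b_usnews:.3f}")
print(f"Cooley coefficient:  {b_cooley:.3f}")
```

With real data one would also inspect standard errors and fit diagnostics before declaring one predictor "better," but the basic comparison is no more complicated than this.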
Have Theodore P. Seto and Thomas M. Cooley discovered secret dimensions of law school success that lead their graduates to make partner in America's largest firms? Sadly, the secret ingredient in having more partners is apparently enrolling more students. Seto's study simply counts the raw number of partners in NLJ 100 law firms, without any control variables such as law school size. Cooley did much the same thing but, in its characteristic overachieving fashion, used 40 factors rather than just one, many of which were various raw counts like the one used by Seto—number of students enrolled, number of foreign students enrolled, number of minority students enrolled, number of faculty, number of applications, number of seats in the library, and so forth. Thus, the secret ingredient in Seto's rankings, like Cooley's, is bloated enrollment.
Why does Professor Seto not control for enrollment size? He explains that “per capita outcome measures are merely proxies for student quality,” such as LSAT score. Perhaps that is true, but the remedy is to control for LSAT score, not to fail to control for the size of the law school. If I want to become a large law firm partner, what I care about is (1) the percentage of graduates of my law school who will be employed in such firms, and (2) whether I will fall within that percentage. If the top 10% of students in my law school land big firm jobs and I’m in that top 10%, it doesn’t matter whether I’m one of 20 who got the job or one of 40 who got the job. Likewise, if the top 10% of my class land big firm jobs and I’m not in the top 10%, it doesn’t matter whether I’m one of 180 who didn’t get the job or one of 360 who didn’t get the job. Yet Professor Seto’s ranking would put the larger school much higher, even if the two schools were otherwise identical.
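The arithmetic above can be sketched in a few lines of Python. The enrollment figures are hypothetical, chosen only to match the 10% example in the text; they are not drawn from the article.

```python
# Two hypothetical schools with identical per-capita outcomes.
schools = {
    "Smaller School": {"class_size": 200, "big_firm_hires": 20},
    "Larger School": {"class_size": 400, "big_firm_hires": 40},
}

for name, s in schools.items():
    rate = s["big_firm_hires"] / s["class_size"]
    print(f"{name}: {s['big_firm_hires']} hires, {rate:.0%} placement rate")

# A raw-count ranking puts Larger School first (40 > 20),
# even though an applicant's odds are 10% at either school.
raw_winner = max(schools, key=lambda n: schools[n]["big_firm_hires"])
print(f"Raw-count ranking favors: {raw_winner}")
```

The per-capita rates are identical, so from the applicant's point of view the schools are interchangeable; only the raw count, which rewards sheer size, distinguishes them.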
The flaw of this approach is illustrated by the fact that Loyola would rank above USC not merely if we counted the number of law graduates who are law firm partners, but if we counted the number of law graduates in just about any other category—such as the number of left-handed law graduates, the number of deceased law graduates or, more to the point, the number of unemployed law graduates. Larger schools will tend to have more of everything, both positive and negative, making a ranking based on raw numbers misleading. (Note: I am not necessarily suggesting that being left-handed or deceased is negative.)
The study also suffers from serious chronological impairments, as it includes the outcomes of graduates as far back as 1986. This data is relevant if one plans to accelerate his or her DeLorean up to 88 miles per hour to activate the flux capacitor and return to the year 1983, enter law school, graduate in 1986, and go to work at a major law firm. But for those of us who have been constrained in our exploration of all dimensions of the space-time continuum, outcomes that would have resulted from attending law school in the 1980s have limited utility for present decisions. Many schools have changed dramatically since the early 2000s, let alone the 1990s, let alone the 1980s, and outcomes dating from the last appearance of Halley’s Comet are of scant relevance for today’s decisions. Unfortunately, Professor Seto provides no explanation for why he includes ancient history in his ranking.
So if you believe that Thomas M. Cooley Law School has unlocked the factors predicting partnership in major law firms, Theodore P. Seto has the law school ranking system for you. If, however, you believe that law schools should not be ranked based on enrollment size but on a per-capita basis, perhaps because you only have one capita, then you might want a different measure. How would the results differ if Seto had controlled for the size of law schools or omitted data from the Reagan administration? Unfortunately we will not know because Professor Seto, citing the Loyola "Deans Office" that "funded the research," declined to allow me to examine the data.
I am not the first person to notice the problem with these rankings. Indeed, when they first circulated over a year ago I, together with virtually every academic who commented on the study, urged Professor Seto to control for law school size. He chose to ignore those comments, and the result is a misleading article. This type of academic frolic would be harmless fun in a good year for law firm hiring. In the current environment, where too many law school graduates are chasing too few jobs, the idea of publishing a ranking that rewards law schools for bloated enrollments seems out of touch.
On one hand, probably not very many prospective law students read the Journal of Legal Education, so perhaps the potential for harm is limited. On the other hand, some do read media coverage that has uncritically repeated these results. And a number of prominent law schools, such as Georgetown, Boston University, Villanova, and NYU, have recently started using Professor Seto’s study in advertising. These schools would receive flattering rankings even if the study were corrected for size, so there is little harm in their reliance on it. But what about the many fine flagship state schools that produce good outcomes for their graduates but are trashed by Seto's study because they maintain responsible enrollments? I will try to do justice to them in a subsequent post with corrected data.