Is the “69th best law school” significantly better than the “77th best law school in America”? Eight places — sounds like something. How about compared to the “82nd best law school” in America? That’s a 13-place difference; surely it should mean something, shouldn’t it? But it’s not as good as the “60th best law school in America,” is it?
But what if they are all the same school in different years? And what if the whole ranking system is, save for fairly large differences, pretty much a sham?
Some years we get a lower score than the year before, and then I think I shouldn’t carp about the whole thing for fear of it looking like sour grapes. Some years we get a higher score than the year before, and then I carp.
The idea of ranking law schools is not ridiculous. The way US News does it is very ridiculous. The survey data relies on the opinions of people who in most cases may be very informed about a few law schools but as a class are not likely to be particularly well informed about many law schools — even though they may be judges, hiring partners, law Deans and professors. And increasingly the survey data is self-referential: people have heard school X has a high/low ranking, so it must be good/bad, right?
At its grossest level, there is no doubt US News captures something real: the top N schools (10? 14? 15? 20? 20+?) really are better than the middle N or lowest N. But are the middle N significantly better than the bottom N? Sometimes, yes, but only sometimes. Here the picture gets very cloudy — not least because “better” ought to be “better for whom”; once you get away from the most elite, best resourced (i.e. high endowment), most prestigious law schools, what is best depends on factors that are personal: urban/rural, North/South, East/Middle/West, large/small, best in town/best town and so on.
The US News system is designed to churn. Changed numbers sell magazines. Numbers that stay the same don’t. Yet it’s hard to believe many schools change very much from year to year. Yes, once in a while a school suffers a crisis or an epiphany, but those are pretty rare events.
There are inbuilt biases in the US News scoring system that favor small schools and schools in cities with high starting salaries. Not to mention that the South Florida market has more medium-sized firms than other cities our size, and those firms rarely make offers until a candidate has passed the bar, notably depressing the “employment at graduation” rate.
I sympathize with aspiring students who need a guide to the perplexed when sorting through their options. It’s such a shame that the information market’s first-mover advantage has allowed such a crummy measure to dominate.
Anyway, we went up eight places this year, continuing our record of high volatility, which has seen our number range from 60 to 82 over just a few years. I suppose the Dean and the alumni will be happy, and that’s always nice. Personally, I’d put UM somewhere in the 45-60 range, but I suppose I’m biased.
Update: Or maybe I’m not. TaxProf Blog notes that Miami’s peer rank (rank by how professors at other schools see it) is 51.