Measuring How Much Schools Challenge Students

The Washington Post and Newsweek have a crude but interesting methodology that aims to capture whether high schools challenge their students.

The metric has a lot of limitations, but it also has its attractions. They start by excluding all magnet schools, more or less on the grounds that magnets would win if they were included. They then rank the remaining schools by how many advanced tests their students take:

We take the total number of Advanced Placement, International Baccalaureate or Cambridge tests given at a school in May, and divide by the number of seniors graduating in May or June. All public schools that Newsweek researchers Dan Brillman, Halley Bondy and Becca Kaufman found achieved a ratio of at least 1.000, meaning they had as many tests in 2006 as they had graduates, are put on the list on the Newsweek website, and the 100 schools with the highest ratios are named in the magazine.

… I think 1.000 is a modest standard. A school can reach that level if only half of its students take one AP, IB or Cambridge test in their junior year and one in their senior year. But this year only about five percent of all U.S. public high schools managed to reach that standard ….
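To make the arithmetic concrete, here is a minimal Python sketch of the ratio (the school names and numbers are hypothetical; only the formula and the 1.000 cutoff come from the quote above):

```python
def challenge_index(tests_given: int, graduating_seniors: int) -> float:
    """Newsweek's ratio: advanced (AP/IB/Cambridge) tests given at a
    school in May, divided by the number of seniors graduating that year."""
    return tests_given / graduating_seniors

# Mathews's "modest standard" example: if half of each class takes one
# test a year, a school with 200 graduating seniors administers 100 tests
# to juniors and 100 to seniors in May -- 200 tests per 200 graduates,
# exactly the 1.000 cutoff.
assert challenge_index(tests_given=200, graduating_seniors=200) >= 1.000

# Hypothetical data: (tests given in May, graduating seniors).
schools = {"Alpha High": (450, 300), "Beta High": (120, 240)}

# Keep schools at or above 1.000 and rank them, highest ratio first.
ranked = sorted(
    ((name, challenge_index(t, g)) for name, (t, g) in schools.items()
     if challenge_index(t, g) >= 1.000),
    key=lambda pair: pair[1],
    reverse=True,
)
print(ranked)  # [('Alpha High', 1.5)]
```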

This is indeed a crude measure. It doesn't capture how good the teachers or the students are (the results of the tests don't enter into the calculation). There's no control for the demographics of the school's catchment area either, although in practice the index seems only loosely correlated with wealth, since rich schools sometimes reserve their AP courses for the 'best' students, which keeps their test counts down.

And it's not exactly a measure of value-added either.

No, at best it measures what it says it measures: whether high schools are challenging their students by exposing them to advanced courses. That may be very basic, but it's still worth knowing.

How would we make a comparable metric for law schools?
