
School may still be out for the summer, but all eyes are on college this week: the 2010 U.S. News & World Report college rankings hit stands today, with Harvard and Princeton tying for first place among national universities and Williams ranking first among liberal-arts colleges. TIME spoke to Robert Morse, director of data research at U.S. News and a two-decade veteran of the controversial rankings, about how the list is put together and how it could be better, plus a look at this year's rising stars.
TIME: For readers who might not be familiar with it, what's the methodology behind the rankings?
Morse: [They're] based on 15 indicators, [including] a
reputation survey, admissions data, faculty data, financial-resources data, alumni giving, and graduation and retention rates. We're not comparing all 1,400 schools. We're dividing them up into 10 categories, like national universities and liberal-arts colleges. We assign a weight to each of the variables. The peer survey, or academic reputation, is the highest-weighted variable, at 25%.
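To make the weighting scheme concrete, here is a minimal sketch of that kind of weighted composite score. Only the 25% peer-assessment weight comes from the interview; the other indicator names and weights below are hypothetical placeholders, not U.S. News's actual formula.

```python
# Minimal sketch of a weighted-composite ranking score.
# Only the 25% peer-assessment weight is taken from the interview;
# every other indicator name and weight here is a hypothetical placeholder.

HYPOTHETICAL_WEIGHTS = {
    "peer_assessment": 0.25,         # stated in the interview
    "graduation_retention": 0.20,    # placeholder
    "faculty_resources": 0.20,       # placeholder
    "admissions_selectivity": 0.15,  # placeholder
    "financial_resources": 0.10,     # placeholder
    "alumni_giving": 0.10,           # placeholder
}

def composite_score(indicators: dict[str, float]) -> float:
    """Combine indicator scores (each already normalized to 0-100)
    into a single weighted score."""
    return sum(HYPOTHETICAL_WEIGHTS[name] * indicators[name]
               for name in HYPOTHETICAL_WEIGHTS)

# Example: one school's normalized indicator scores (made-up numbers).
example = {
    "peer_assessment": 90.0,
    "graduation_retention": 95.0,
    "faculty_resources": 80.0,
    "admissions_selectivity": 85.0,
    "financial_resources": 75.0,
    "alumni_giving": 60.0,
}
print(round(composite_score(example), 1))
```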
What tells you how heavily to weight each factor?
Our accumulated judgment. Our rankings aren't social science in the sense
that we're not doing peer-reviewed rankings; we're not submitting our
conclusions and our weighting system to a group of academics and letting
them decide if they are right or wrong. We do meet regularly with academic
experts about the relative importance of the factors that we use. And we
have been doing this for 25 years.
You mentioned the reputation survey. How do you respond to the
criticism that some people charged with filling it out may not have direct
experience with the schools they're rating, so they may just be going on
rumors?
I think there is a small group of schools, mainly in the liberal-arts
category, that have strong feelings about the reputation survey. Generally
speaking, our response rate did tick up a little bit this year, to 48% from 46%, so there's some indication that this boycott [among schools that are refusing to fill out the reputation survey] is losing some of its potency. But U.S. News is not expecting people to have knowledge of, or be able to rate, every school in its category. It's based on
the premise that since we have a big enough respondent base, enough people
have some knowledge of enough schools that we get a statistically
significant number of respondents for each school. There are subjective
parts of education, parts that can't be measured by just quantitative data.
The peer survey tries to capture that part of it.
The rankings don't really seem to change much from year to year. Do
you ever have the desire to change the methodology and shake things
up?
I personally don't. I wish that we were able to measure things like learning outcomes, that there was comparative data on what people say is missing in the rankings and missing in our education in general. What are
students learning?
What keeps you guys from measuring that?
There just isn't any data available. The schools themselves aren't measuring
learning. What's the difference between your knowledge when you start as a
first-year student and when you graduate? What do you feel about the
teachers? What's the rigor of the academic program? How engaged are you on
campus? Information like that is just not available from all schools.
Have you considered something along the lines of the reputation
survey, but to assess some of the things you're talking about?
You need the schools to cooperate. We would have to get access to enough
students to have a statistically significant sample. I personally don't
think schools are going to work with U.S. News on something like
that.
Another criticism you often hear is that the rankings encourage
schools to add unnecessary perks, such as climbing walls. Is that
fair?
Some schools or college presidents or boards have used wanting to improve in
the rankings as an administrative goal. Some schools are targeting their
academic policies toward improving in the rankings. But I don't think
that's really hurting students. The factors that you cited aren't really
part of the rankings. Many people at the schools don't understand the ranking methodology and say things as an excuse rather than as the truth. Generally,
targeting the rankings doesn't hurt students. If schools are targeting
ranking factors like improving graduation rates and improving freshman retention
and paying faculty more money and having more small classes and fewer large
classes and having faculty with better credentials, those are all U.S.
News ranking factors, and the students are going to benefit from
that.
A couple of months ago, it was reported that Clemson University was trying to
manipulate the rankings through strategic voting, giving competitors a below-average rating. The university denied the claim. Is that something you try to police?
We definitely do. We throw out the two highest ratings and the two lowest
ratings for each school, so we have some statistical safeguards to prevent
any strategic voting from impacting any school's score. And the fact that
the schools' scores have been stable (there's been academic research to show that) is proof in and of itself that schools aren't able to game
the scores.
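As an illustration of the trimming safeguard Morse describes, here is a minimal sketch in which a school's two highest and two lowest peer ratings are discarded before averaging. The function name and the sample ratings are hypothetical, not U.S. News's actual code.

```python
# Sketch of the safeguard Morse describes: drop the two highest and two
# lowest peer-survey ratings for a school before averaging, so a handful
# of strategically extreme votes can't move the score much.

def trimmed_peer_average(ratings: list[float]) -> float:
    """Average the ratings after discarding the two highest and two lowest."""
    if len(ratings) <= 4:
        raise ValueError("need more than four ratings to trim")
    trimmed = sorted(ratings)[2:-2]
    return sum(trimmed) / len(trimmed)

# One deliberately low-ball vote (1.0) among otherwise similar ratings
# is simply discarded along with the other extremes.
print(trimmed_peer_average([1.0, 3.8, 4.0, 4.1, 4.2, 4.3, 4.5, 5.0]))
```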
Let's talk about specific schools. Any surprises this year, such as a school whose ranking has jumped from last year?
There wasn't a lot of change from this year to last. Harvard was No. 1 last
year, and now Harvard and Princeton are [both] No. 1. People are
going to write about that. But they were very close before, and now they're
tied. That's not really a big change. Schools are pretty stable, and the top
schools have the resources to continue to draw the best students and
graduate them at a high rate year after year. It's hard to move big
institutions to a great degree at the top.
Is there a school that's been moving slowly and steadily up the list that
may strike new readers as being surprisingly high up?
Schools like Washington University and the University of Southern California
have had slow and steady climbs up the national-university list by making
across-the-board improvements. One school that's going to surprise people is
the University of Maryland-Baltimore County. It's both a top Up and Coming
school and No. 4 in the Commitment to Undergraduate Teaching [list]. That's a school that isn't one of the name brands but is up there
near the top of these two lists, just like George Mason, Northeastern and
Drexel are near the top of Up and Coming.
Have you considered adding alumni earnings to the rankings formula,
along the lines of the PayScale rankings?
Yes. We have earnings in two of our rankings. In our MBA rankings, we have
earnings data. In our law rankings, we have [job] placement data.
There were issues with that, namely that [PayScale] was crediting somebody's salary to their undergraduate vs. their graduate degree. That's a
problem with using it. But obviously people care. It's important how much
money people make and the value of degrees. Students are graduating with
debts, and they have to pay them back.
There are so many college rankings these days. What sets you guys
apart?
I think our rankings make the most sense to the public because our rankings
are the most transparent. Our rankings have the methodology that makes the
most sense. When the public sees that the schools want to do better in our rankings, they say, well, if the schools want to improve in these rankings, they must be worth looking at. So, in essence, the colleges themselves have been a key factor in giving us credibility.