This month, the National Survey of Student Engagement (NSSE) released its ninth annual report on how involved students are in both academic and extracurricular activities, which are key indicators of educational quality at four-year schools across the country. The survey, which is funded by the nonprofit Carnegie Foundation for the Advancement of Teaching, provides hundreds of colleges and universities with some of the most comprehensive data available on how well they're engaging students compared to their peer institutions. The survey's main goal is to help schools figure out how to do a better job of educating college kids. Now, however, more and more school presidents fed up with what they feel are arbitrary school rankings are making the benchmarks public as an alternative resource for college applicants.
The schools' biggest grouse against rankings, such as the lists produced each spring by U.S. News & World Report, is that they take a complex institution and crunch it down into a single score. Critics castigate U.S. News not only for rewarding schools for such things as outspending their rivals, but also for basing a whopping 25% of a college's ranking solely on how its reputation is rated by administrators at rival institutions.
Enter the NSSE (pronounced "Nessie"), which tries to provide a detailed picture of how well a school is judged by its customers, i.e., the students who attend it. At each participating campus, the survey asks freshmen and seniors to rate their school, using a seven-point scale, on wide-ranging topics that hit upon almost every element of a student's experience, from how often he interacts with faculty outside of class to how challenging he thinks his coursework is to how much non-academic support is available. The numeric scores can then be compared with those of other schools, that is, if the schools choose to make the data public. "If you mix a whole lot of random data in a blender, like the rankings do, what you get is a single composite score that doesn't tell you anything," says Doug Bennett, president of Earlham College in Richmond, Ind. "Looking through all the NSSE data paints a college's personality much better."
This year, NSSE surveyed more than 380,000 randomly selected students at 722 four-year colleges and universities. And the results were surprising. Rather than showing vast differences between schools, the survey highlighted huge disparities in how well each campus was engaging all of its students. For instance, how engineering students scored the quality of on-campus tutoring programs at School A vs. School B may not have varied much. But how School A's engineering students judged those programs may have been radically different from how School A's business students did. That finding underscores why using a single number to gauge an entire school is an ineffective tool at best for predicting what an undergraduate experience will be like. "Even high-performing institutions as measured by average benchmark scores have work to do to improve the experience of all students," says Alex McCormick, NSSE's director and a professor of education at Indiana University.
Participation in NSSE is voluntary, but the number of participating schools has tripled in the past five years, and some 1,300 campuses have now taken part in the survey at least once. Those that do participate, however, tend to be small independent colleges, which are likely to score better on NSSE if only because the size of their student body allows for more one-on-one attention than larger universities can offer. Earlham is one of many NSSE participants that publish their results in pamphlets or on their websites.
But many schools that participate in the survey don't make their data public; some fear that the results, which include comparisons with the scores of similar institutions, might ultimately be used as part of what they view as the college rankings rat race. Indeed, U.S. News already publishes NSSE data for all the schools that are willing to fork it over. You won't find Ivy League schools in that category, but there are lots of big schools like Penn State and Texas A&M that make their NSSE data public.
USA Today last year managed to convince a lot of colleges to participate in its interactive database, which uses several sets of bar graphs to show how a particular school stacks up against a group of peer institutions. But the database doesn't provide numerical rankings, which means consumers have to gather a lot of data and do their own calculations to compare one school to another — precisely the kind of thoughtful research colleges want prospective students to be doing.
Of course, ultimately if enough schools publish their NSSE results, there's nothing to stop someone from factoring that data into some sort of rankings. In the meantime, colleges will continue to use it to improve their educational quality.
This year's survey highlighted one big group of students who appear to be underserved by many schools: transfer students. NSSE found some 40% of seniors had started at a different institution. Yet according to the survey respondents, transfers tend to receive some of the least support on campus. They speak less frequently with faculty members about future plans, work less often with classmates on assignments and participate in co-curricular activities at about half the rate of non-transfer students. "Schools simply must work harder to pull transfer students in because they're not getting a full experience right now," McCormick says. "And that effort should start at day one with new-student orientations." Only time and future NSSE results will tell if colleges start to make improvements in this and other critical areas of student engagement.