Why You Should Never Rely on College Rankings
Princeton is the best place to get an undergraduate degree in America. Or maybe it’s the University of California, San Diego. Or perhaps it’s Williams College in Massachusetts. Each of those schools tops a different “best colleges” list (U.S. News & World Report, Washington Monthly, and Forbes, respectively).
College rankings are tricky. Everyone who puts together a ranking weighs different factors and uses a unique methodology to arrive at a list of schools that are supposedly better than all the rest. And while experts routinely remind parents and high school students to ignore those lists, people eat them up anyway. Now there’s another reason you should put down that copy of the Best Colleges in America. Those college rankings are actually pretty bad at highlighting the schools that do a good job of educating their students.
Research presented at a recent meeting of the American Educational Research Association found that there was little connection between a college’s ranking on three popular lists (U.S. News & World Report, Forbes, and Washington Monthly) and how engaged students were. That’s worrisome, because high levels of student engagement both in the classroom and on campus are linked to better academic outcomes.
The study’s authors, John Zilvinskis and Louis Rocconi, both of Indiana University, compared college rankings with how engaged students were, as measured by the National Survey of Student Engagement. Not only did they find that students at top-ranked institutions weren’t any more engaged than their peers at less prestigious schools, they also discovered that students at higher-ranking schools actually spent less time interacting with professors than those at institutions that were ranked lower.
“Students attending inferior ranked schools reported more frequent interactions with their faculty than their counterparts at more highly ranked institutions,” the researchers wrote. Interacting with faculty is one measure of student engagement, along with the amount of time spent studying, collaborating with other students, and other factors.
Essentially, college rankings are good at ranking schools on factors like prestige or selectivity, but they aren’t necessarily that great at identifying whether students are actually engaged and (hopefully) learning while they’re on campus. That’s partly because measuring how much people learn in college is difficult. While there are standardized tests that attempt to evaluate student learning, as Libby Nelson noted in an article for Vox, the results aren’t as informative as they could be, in part because students don’t have any incentive to perform well on the tests.
The disconnect between rankings and student engagement matters not just because focusing too much on lists could be causing parents and prospective students to overlook lower-ranked schools that offer a good educational experience. According to Zilvinskis and Rocconi, pressure to increase a college’s performance on these lists might be encouraging schools to seek out short-term fixes that lead to a bump in rankings but do little to improve a student’s educational experience, “such as submitting inflated admissions information about students and providing self-applauding scores on peer assessment surveys.” Those actions could come at the expense of making more meaningful efforts to improve educational quality, especially those that aren’t measured by rankings, like student learning and engagement.
Despite longstanding concerns that college rankings focus on all the wrong aspects of the college experience, they’re not likely to disappear anytime soon. Instead, new ratings systems sprout up on a regular basis. In April, the Brookings Institution unveiled its own list, which takes a value-added approach to ranking schools – in other words, looking at the difference between the actual outcomes for students of a particular school and the predicted outcomes for similar institutions and students. Not only does the Brookings list include a larger number of schools than most other rankings, but it isolates “the effect colleges themselves have on those outcomes, above and beyond what students’ backgrounds would predict,” according to the accompanying report.
Schools at the top of the Brookings list were surprisingly diverse, including science-focused institutions like Caltech and MIT, liberal arts schools like Carleton College, and even a maritime academy, SUNY Maritime College. Two-year schools, which are usually overlooked in rankings, were also evaluated. NHTI-Concord’s Community College in New Hampshire and Lee College in Texas were the top-ranked schools in that group.
The Brookings Institution rankings don’t look at student engagement, instead focusing on economic outcomes like loan repayment and mid-career earnings. Lists like these that focus on the relationship between earnings and education are especially popular now, given concern about rising college tuition. The Obama administration is working on its own ranking system, which will take into account factors like net price and loan repayment. That may well help students make more informed decisions about what school is best for them. But it seems doubtful that we’ll ever reach agreement on which colleges truly are the best.