College Rankings

Reed and the Rankings Game

By Chris Lydgate ’90 | September 12, 2018

Reed offers one of the most profound educational experiences in the nation, with outstanding professors, small classes, and a depth of intellectual engagement that is second to none. Given our strong endowment, generous financial aid, and reputation as an academic powerhouse, how come we don’t rank near the top of the guidebooks assembled by U.S. News & World Report and its swarms of imitators?

There are three main reasons:

  • As a matter of principle, Reed does not respond to requests from for-profit guidebooks to fill out forms or surveys so they can rank colleges; as a result, we are penalized for refusing to play the game.
  • Most guidebooks are based on assumptions that don’t apply to our academic program. This means that their formulas don’t capture what is valuable and distinctive about Reed.
  • We believe that the value of an education is directly related to the depth of intellectual engagement in the classroom—something that most guidebooks do not and cannot measure.

Before we examine these points more closely, it’s worth emphasizing that Reed is committed to transparency. We provide solid, reliable information to help families choose the right college. We post scores of key statistics about Reed on our Institutional Research page and participate in leading data initiatives, including the National Center for Education Statistics College Navigator, the National Association of Independent Colleges and Universities Accountability Network, and the Common Data Set.

Now (cue trumpets) the backstory!

The Problem

Some things in life are easy to measure, and they produce clear, satisfying ranked lists, such as the biggest battleships or the brightest stars. Other things are more elusive—the best restaurants, the greatest movies, the most influential books. The second category is, by definition, somewhat subjective, but at least you know where to start: you'd eat at the restaurants, watch the movies, and read the books.

Unfortunately, the guidebooks can’t do that. College is not like a movie you can watch in two hours. Guidebooks can’t possibly afford to send researchers to take classes at every college; even if they did, the results would be slipshod unless the same researchers took classes at all of the colleges in a given category.

So the guidebooks are in the business of ranking restaurants they will never visit, movies they will never watch, and books they will never read. Instead, they focus on the metrics they can measure from their desk chairs: the length of the menu, the size of the stock pots, the number of tines of the forks. Useful information, sure, but hardly a sound way of ranking things—it’s about as valid as claiming that Beethoven’s Ninth Symphony is better than Miranda’s Hamilton because it has more notes.

As the guidebooks have become more influential, more colleges have begun to adjust their menu to fit the dominant template—to cater to the crowd. Worse, the rankings have become powerful drivers of reputation, creating an intellectual echo chamber wherein reputation depends on rank, but rank depends on reputation. Some colleges have even submitted misleading data to the guidebooks in order to climb a few rungs on the self-referential treadmill of prestige.

The Revolt

Back in 1995, Reed led a revolt against the 800-pound gorilla of the guidebooks, U.S. News, because of our conviction that its methodology was fundamentally flawed. A front-page article in the Wall Street Journal revealed that many colleges were manipulating the system—some by “massaging” their numbers, others by outright fabrication.

In the wake of these reports, Steven Koblik, then-president of Reed, informed the editors of U.S. News that he didn’t find their project credible and that the college would not be returning any of their surveys—the unaudited questionnaires that form the basis of the guide’s ranking system.

Reed’s decision won praise from professors and administrators far and wide, many of whom had witnessed the pernicious effects of the rankings. The next year, however, U.S. News assigned Reed the lowest possible score in several areas and relegated the college to the lowest tier—not exactly a sterling example of impartiality.

Since then, U.S. News has devised several diabolical measures to discourage other colleges from pulling out of its system. In a presentation to the Annual Forum of the Association for Institutional Research in May 2014, Robert Morse, the director of data research for U.S. News, revealed that if a college doesn't fill out the survey, the guidebook arbitrarily assigns it values for certain key statistics set at one standard deviation below the mean. In other words, the guide automatically ranks non-responders as well below average. This statistical penalty has become a strong incentive for colleges to keep playing the game—or slip in the rankings.
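To make the penalty concrete, here is a minimal sketch in Python (with invented numbers, since U.S. News does not publish its raw survey data) of how imputing a missing statistic at one standard deviation below the mean pushes a non-responder toward the bottom of a ranked list:

    import statistics

    # Hypothetical values for one reported metric (a percentage, say)
    # across nine colleges; "NonResponder" declined to return the survey.
    reported = {"A": 28, "B": 35, "C": 22, "D": 31, "E": 40,
                "F": 25, "G": 33, "H": 27, "I": 30}

    mean = statistics.mean(reported.values())
    sd = statistics.stdev(reported.values())

    # The penalty described above: the missing value is imputed at
    # mean - 1 SD, which sits below roughly 84% of schools if the
    # metric is normally distributed, whatever the true value was.
    scores = dict(reported)
    scores["NonResponder"] = mean - sd

    # Rank from best to worst; the non-responder lands near the bottom.
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    for rank, (college, value) in enumerate(ranked, start=1):
        print(f"{rank:2d}. {college}: {value:.1f}")

Apply the same imputation to each of several weighted metrics, as the guide reportedly does, and the deficit compounds across the composite score.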

Gaming the System

Over the years, U.S. News has made many adjustments to its system in an effort to prevent manipulation. Unfortunately, misreporting remains widespread. In 2012, several prestigious colleges admitted to submitting inflated data. Other institutions have employed creative statistical techniques, such as encouraging more students to apply—and then rejecting them—to drive down their acceptance rates and climb a few rungs in the rankings.

The sad truth is that colleges and universities have a powerful incentive to game the system. The U.S. News rankings remain enormously popular—and surprisingly influential—despite widespread skepticism within the educational community. Just 2% of college admission directors think ranking systems are very effective at helping prospective students find a good fit, and 91% believe that institutions are cheating the system. In a similar vein, 89% of high school counselors and college admission officers reckon that the U.S. News rankings offer misleading conclusions about institutional quality; 83% think the rankings create confusion for prospective students and their families; and only 3% believe that the title “America’s Best Colleges” is accurate.

Misleading Assumptions

The next big problem with ranking systems is that many of their assumptions don’t make sense in the context of Reed. Take the deceptively simple issue of class size—widely interpreted as a key indicator of how much face-time students have with professors. Reed’s average class size is a strong 16.8. But one of the central pillars of a Reed education is the senior thesis, during which a student spends many hours conferring with a professor one-on-one on a research project that lasts an entire academic year. For most students, this is a profound experience. Many alumni describe it as the defining moment in their intellectual development. The thing is, most guidebooks do not count these projects as “classes,” so the senior thesis—the crowning achievement of every Reed graduate—simply disappears from the statistics.

Another misleading statistic is the so-called return on investment, or ROI, which is often reported as the average salary of graduates 10 years after their freshman year. Reed's figure is $37,900, just above the national median. But a high proportion of Reed graduates—approximately 60%—head off to grad school, where they subsist on meager stipends while they earn their credentials. Look at their earnings once they join the workforce and the picture gets a lot brighter: according to PayScale, the median salary for Reed grads early in their careers (0–5 years) is $53,400, and the median for grads in mid-career (10+ years) is $112,700.
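A rough back-of-the-envelope calculation shows how grad school drags down the 10-year number (the stipend figure here is purely illustrative): if roughly 60% of a cohort is still in training on, say, a $27,000 stipend while the other 40% earns the early-career median of $53,400, the blended average is about 0.6 × $27,000 + 0.4 × $53,400 ≈ $37,600, strikingly close to the reported $37,900.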

Better Alternatives

Reed is committed to sharing accurate, reliable information with prospective students and the general public. We also recognize the usefulness of independent guides in helping prospective students identify potential colleges of interest. For that reason, Reed does provide information to several guides—including Barron’s, the Fiske Guide to Colleges, Peterson’s, Colleges That Change Lives, and the Princeton Review—because we believe they do a better job of describing the experience, student culture, and academic environment that Reed provides. And, yes, we occasionally repost news items ranking us as #7 on the list of nerdiest colleges or #17 on the list of outdoorsy colleges—after all, we enjoy wacky lists as much as anyone.

Fundamentally, however, Reed continues to stand apart from ephemeral trends, resisting pressures to abandon its core principles and its unrelenting focus on intellectual exploration. We believe in the intrinsic value of the pursuit of knowledge—not just because it expands the frontier of human understanding, but also because knowledge transforms the mind that seeks it.

As former president Colin Diver once wrote: “Reed is a paradigmatic example of a college committed—and committed solely—to the cultivation of a thirst for knowledge. Reed illustrates a relatively small, but robust, segment of higher education whose virtues may not always be celebrated by the popular press, but can still be found by those who truly seek them.”

Read More

“What college rankings really measure — it’s not quality,” by Jonathan Wai. Salon, September 16, 2018.

“Eight more colleges identified as submitting incorrect data for ‘U.S. News’ rankings,” by Scott Jaschik. Inside Higher Ed, August 27, 2018.

“Temple Rankings Scandal: From Bad to Worse,” by Scott Jaschik. Inside Higher Ed, July 20, 2018.

“U.S. News college rankings: Amid predictability, some major shifts,” by Nick Anderson. Washington Post, September 9, 2014.

“Your Annual Reminder to Ignore the U.S. News & World Report College Rankings,” by John Tierney. The Atlantic, September 10, 2013.

“Forbes Boots 4 Colleges From Its Rankings,” Inside Higher Ed, July 25, 2013.

“Can College Rankings Giant Keep Schools from Cheating?” by Lynn O’Shaughnessy. The College Solution, February 6, 2013.

“Yet Another Rankings Fabrication,” by Scott Jaschik. Inside Higher Ed, January 2, 2013.

“Debt, Jobs, Diversity and Who Gets In: A Survey of Admissions Directors,” Inside Higher Ed, October 2012.

“The College Rankings Racket,” by Joe Nocera. New York Times, September 28, 2012.

“Gaming the College Rankings,” by Richard Pérez-Peña and Daniel E. Slotnik. New York Times, January 31, 2012.

“The Order of Things—What College Rankings Really Tell Us,” by Malcolm Gladwell. The New Yorker, February 14, 2011.

“‘Manipulating,’ Er, Influencing ‘U.S. News,’” by Doug Lederman. Inside Higher Ed, June 3, 2009.

“Is There Life After Rankings?” by Colin Diver. The Atlantic, November 1, 2005.

“U.S. News’s Corrupt College Rankings,” by Robert L. Woodbury. College Advisor of New England, 2004.

“How To Make Your College No. 1 In U.S. News & World Report . . . And Lose Your Integrity In The Process,” by Robert Woodbury. Connection, New England Board of Higher Education, Spring 2003.

“Cheat Sheets: Colleges Inflate SATs and Graduation Rates in Popular Guidebooks,” by Steve Stecklow. Wall Street Journal, April 5, 1995.