Reed and the Rankings Game
Why doesn’t Reed participate in the U.S. News & World Report (USN) college rankings? We hear this question every year, often from prospective students who are trying to square our reputation as an intellectual powerhouse with our weirdly low ranking from U.S. News.
The short answer is that we pulled out of the USN survey in 1995 because of our conviction that the magazine’s methodology is hopelessly flawed—a belief widely shared in the educational community. Although we would prefer that USN simply leave us out of their survey, the magazine persists in ranking us against other colleges based on data that is questionable at best. We believe that the value of an education is directly related to the degree of intellectual engagement in the classroom—something that USN does not and cannot measure. However, we recognize the usefulness of independent guides and we work hard to provide families with solid, reliable information to help them choose the right college.
For those who are curious about the details, we offer the following explanation.
Some things in life are reasonably easy to rank—the tallest buildings, the biggest states, the heaviest elements. When it comes to defining the best college, however, the concept of rank begins to lose its bearings, especially in the case of the USN survey, which takes a one-size-fits-all approach to institutions with radically different missions and character—a bit like asking whether the Beatles are better than Beethoven.
Nonetheless, Reed participated in the survey until 1995, when a front-page article in the Wall Street Journal revealed that many colleges were manipulating the system—some by “massaging” their numbers, others by outright fabrication.[1] In the wake of these reports, Steven Koblik, then-president of Reed, informed the editors of USN that he didn’t find their project credible and that the college would not be returning any of their surveys—the unaudited questionnaires that form the basis of USN’s ranking system.
Reed’s decision won praise from professors and administrators far and wide, many of whom had witnessed the pernicious effects of the rankings.[2] The next year, however, instead of simply omitting Reed from its list, the magazine assigned Reed the lowest possible score in several areas and relegated the college to the lowest tier—the most precipitous decline in the history of the USN ratings. Since then, Reed’s rank has bounced around like a Ping-Pong ball in a game where the rules keep changing.
Are the Rankings Valid?
Over the years, USN has made many adjustments to its system in an effort to prevent manipulation. Unfortunately, misreporting remains widespread. In 2012, several prestigious colleges admitted submitting inflated data.[3][4] Other institutions have employed creative statistical techniques,[5] such as encouraging more students to apply and then rejecting them, which drives down the acceptance rate and lifts the college a few rungs in the rankings.
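To see why soliciting extra applications moves the needle, consider the arithmetic of the acceptance rate. A minimal sketch follows; the college and all numbers are invented for illustration, not drawn from any reported case:

```python
# Hypothetical illustration of the acceptance-rate maneuver described above.
# All numbers are invented for the example.

def acceptance_rate(admits: int, applicants: int) -> float:
    """Selectivity as ranking surveys typically compute it: admits / applicants."""
    return admits / applicants

# A college admits the same 2,000 students either way.
before = acceptance_rate(2_000, 5_000)  # 40% acceptance rate
# A marketing push attracts 3,000 extra applicants, all of whom are rejected.
after = acceptance_rate(2_000, 8_000)   # 25% acceptance rate

print(f"before: {before:.0%}  after: {after:.0%}")
# The entering class is identical, yet the college now appears far more selective.
```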
The sad truth is that colleges and universities have a powerful incentive to game the system. USN’s rankings remain enormously popular—and surprisingly influential—despite widespread skepticism within the educational community. Just 2% of college admission directors think ranking systems are very effective at helping prospective students find a good fit, and 91% believe that institutions are cheating the system.[6] In a similar vein, 89% of high school counselors and college admission officers reckon the USN rankings offer misleading conclusions about institutional quality; 83% think the rankings create confusion for prospective students and their families; and only 3% believe that the title “America’s Best Colleges” is accurate.[7]
There is little doubt that Reed’s stand comes with a price tag. Every year, thousands of promising high school seniors (and their parents) may cross Reed off their list if it doesn’t land in the Top 50 or Top 25. Nonetheless, Reed continues to believe that USN’s system is beyond repair and refuses to participate in what is essentially a statistical charade.
A Case in Point
Let’s take a closer look at some of Reed’s stats over the last 30 years.
| Reed Then and Now | 1983 | 2013 |
| --- | --- | --- |
| Mean freshman SAT verbal | 680* | 714 |
| Mean freshman SAT math | 630* | 670 |
| Average class size | 16.9 | 14.2 |
| Six-year graduation rate | 53% | 74% |
| Professors holding a PhD | 74% | 90% |
| Average faculty salary | $68,953* | $89,311 |
| Number of alumni donors | 3,394 | 4,299 |
| Rank in U.S. News | #9 | #75† |

*Adjusted for comparability with 2013 figures; see the source note below.
†On September 10, 2013, U.S. News & World Report announced its 2014 rankings; Reed was assigned to #74.

[Source for Reed figures is the Reed College Office of Institutional Research unless otherwise noted. SAT scores for 1983 were adjusted using tables provided by the College Board to compare historical scores with current ones. Faculty salaries for 1983 were obtained from Academe, July/August 1984, and adjusted for inflation using the Bureau of Labor Statistics inflation calculator. The figure for alumni donations in 1983 was obtained from Reed College records. The figure for the market value of the endowment per student in 1983 was obtained from the Reed College Office of the Treasurer and adjusted for inflation using the same calculator.]
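For readers curious how the inflation adjustment in the note works, it is a simple ratio of price levels. A minimal sketch, assuming approximate annual-average CPI-U values of 99.6 for 1983 and 233.0 for 2013 (quoted from memory; check current BLS tables before relying on them):

```python
# CPI-based inflation adjustment, as used for the 1983 salary figure above.
# Annual-average CPI-U values are approximate and should be verified against BLS data.
CPI_1983 = 99.6
CPI_2013 = 233.0

def to_2013_dollars(amount_1983: float) -> float:
    """Scale a 1983 dollar amount into 2013 dollars by the ratio of CPI levels."""
    return amount_1983 * CPI_2013 / CPI_1983

# A nominal 1983 salary of about $29,500 works out to roughly $69,000
# in 2013 dollars, consistent with the adjusted figure in the table.
print(f"${to_2013_dollars(29_500):,.0f}")
```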
Since 1983, mean SAT scores for the entering class have climbed more than 5%. The acceptance rate has dropped from 92% to 36%. The graduation rate has jumped from 53% to 74%. The student-to-faculty ratio has improved, classes are smaller, the proportion of professors with PhDs is up, faculty salaries have climbed, the number of alumni who gave money to Reed is higher, and the market value of the endowment per student has quadrupled. These are all important factors in USN’s equation, yet our rank has fallen from number 9 to number 75.
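For anyone who wants to check our arithmetic, the percentage changes can be recomputed directly from the table above:

```python
# Recomputing the changes cited above from the "Reed Then and Now" table.
sat_1983 = 680 + 630   # combined mean freshman SAT, 1983 (adjusted)
sat_2013 = 714 + 670   # combined mean freshman SAT, 2013
print(f"SAT: {(sat_2013 - sat_1983) / sat_1983:+.1%}")          # +5.6%

grad_1983, grad_2013 = 0.53, 0.74
print(f"Graduation rate: +{(grad_2013 - grad_1983) * 100:.0f} points")  # +21 points

salary_1983, salary_2013 = 68_953, 89_311   # both expressed in 2013 dollars
print(f"Faculty salary: {(salary_2013 - salary_1983) / salary_1983:+.1%}")  # +29.5%
```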
Take another example—the deceptively simple issue of class size. Reed’s average class size (14.2) compares well with those of our peers. One of the central pillars of a Reed education, however, is the senior thesis, during which a student spends many hours conferring one-on-one with a professor on a research project. USN does not count these conferences as “classes,” so the senior thesis—the crowning achievement of every Reed graduate—is swallowed by an asterisk.
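A toy calculation shows how much the exclusion matters. The section enrollments below are invented for illustration, and the simple mean here is not USN’s actual class-size formula:

```python
# Toy calculation of a reported average class size. Enrollments are hypothetical;
# this illustrates the exclusion described above, not USN's actual methodology.
regular_classes = [22, 18, 15, 12, 9]   # conventional course sections
thesis_conferences = [1] * 20           # one-on-one senior thesis meetings

def average(sizes):
    return sum(sizes) / len(sizes)

print(f"counting thesis work:  {average(regular_classes + thesis_conferences):.1f}")  # 3.8
print(f"excluding thesis work: {average(regular_classes):.1f}")                       # 15.2
# Excluding one-on-one instruction makes the reported average larger, so the most
# individualized teaching a college offers simply vanishes from the metric.
```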
USN’s rankings fail to measure many crucial aspects of the undergraduate experience, such as depth of scholarship, quality of the faculty, academic standards, financial aid, and campus diversity. In fact, many educators believe that the rankings actively discourage colleges from seeking a more diverse student body, because several key metrics favor students from wealthier, more mainstream backgrounds.
Reed is committed to sharing accurate, reliable information with prospective students and the general public. We also recognize the usefulness of independent guides in helping prospective students identify potential colleges. For that reason, Reed does provide information to several college guides—including Barron’s, the Fiske Guide to Colleges, Peterson’s, and Colleges That Change Lives—because we believe they do a better job of describing the experience, student culture, and academic environment Reed provides. And yes, we occasionally repost news items ranking us as #7 on the list of nerdiest colleges or #17 on the list of outdoorsy colleges—after all, we enjoy wacky lists as much as anyone.
Fundamentally, however, Reed continues to stand apart from ephemeral trends, resisting pressures to abandon its core principles and its unrelenting focus on intellectual exploration. We believe in the intrinsic value of the pursuit of knowledge—not just because it expands the frontier of human understanding, but also because knowledge transforms the mind that seeks it. As former president Colin Diver once wrote: “Reed is a paradigmatic example of a college committed—and committed solely—to the cultivation of a thirst for knowledge. Reed illustrates a relatively small, but robust, segment of higher education whose virtues may not always be celebrated by the popular press, but can still be found by those who truly seek them.”
Author: Chris Lydgate, editor of Reed magazine, September 10, 2013.
For more data about Reed, visit our institutional research office.
Further Reading

“Your Annual Reminder to Ignore the U.S. News & World Report College Rankings,” by John Tierney. The Atlantic, 10 Sep 2013.

“Forbes Boots 4 Colleges From Its Rankings,” Inside Higher Ed, 25 Jul 2013.

“Can College Rankings Giant Keep Schools From Cheating?,” by Lynn O’Shaughnessy. The College Solution, 6 Feb 2013.

“The College Rankings Racket,” by Joe Nocera. New York Times, 28 Sep 2012.

“Gaming the College Rankings,” by Richard Pérez-Peña and Daniel E. Slotnik. New York Times, 31 Jan 2012.

“The Order of Things,” by Malcolm Gladwell. The New Yorker, 14 Feb 2011.

“‘Manipulating,’ Er, Influencing ‘U.S. News,’” by Doug Lederman. Inside Higher Ed, 3 Jun 2009.

“Is There Life After Rankings?,” by Colin Diver. The Atlantic, 1 Nov 2005.

“U.S. News’ Corrupt College Rankings,” by Robert L. Woodbury. College Advisor of New England, 2004.