One of the biggest challenges in the transition to college is time management. Students take many different approaches to this problem, but one tip we recommend is to avoid activities that waste your time — those that truly have no purpose and bring no joy. Many examples might sound familiar to Tufts students: arguing with strangers on Sidechat, refreshing Instagram for the millionth time in a day or waiting in long lines for the bathrooms on the main floor of Tisch (it’s almost always faster to use the ones downstairs). A reasonable addition to this list of activities you are better off without would be worrying — or even thinking — about Tufts’ spot in college rankings.
There is something almost irrational about trying to distill every aspect of a university into a single, rankable measure of its quality. When U.S. News & World Report issues its “Best Colleges” ranking, it is not immediately apparent what the top colleges have more of relative to their lower-ranked peers — ‘bestness?’ The real answer, it turns out, is an arbitrarily weighted average of statistics such as test scores, alumni donation rates and “peer assessments” from surveys in which university administrators rate their rivals. Many of these statistics are, of course, useful to the public — a prospective student deserves to know a college’s graduation rate and typical class size. The aggregated score, however, is an opaque value that reveals little about life at a given college. The whole, in this case, is less useful than the sum of its parts.
Unfortunately for applicants researching colleges, even the disaggregated statistics gathered by ranking services can be misleading. A college may narrowly tweak its own policies to create the appearance of a big change: Northeastern University, for instance, capped the size of smaller classes at precisely 19 students because U.S. News has historically assessed class size by the percentage of classes with fewer than 20 students. In more egregious cases, universities simply lie about their data — Columbia, for one, dropped from No. 2 in last year’s U.S. News rankings to No. 18 this year after a math professor at the school accused it of submitting “inaccurate, dubious, or highly misleading” data.
Rankings also fly in the face of many institutions’ societal goals. A college aiming to do the most good for the world will seek out the students who stand to benefit most from higher education. Mainstream ranking systems, however, incentivize colleges to chase often-arbitrary markers of ‘eliteness’ rather than tangible measures of their positive impact on students and society. In calculating its scores, for instance, U.S. News weights social mobility metrics at just 5% of a school’s final score, while graduation rate and selectivity statistics count for 8% and 7%, respectively.
Selectivity is calculated from the standardized test scores of a university’s incoming freshman class, and students from higher-income households with better access to resources tend to score higher on those tests. A college interested in moving up the rankings would therefore heavily prioritize test scores in the admissions process. Graduation rates are likewise positively correlated with family income, so by favoring schools with higher graduation rates, the ranking system also favors schools with wealthier attendees. In both cases, the metrics reward colleges for enrolling students who were already well off rather than for measurably improving the lives of the students they serve.
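To see how decisive those weight choices are, consider a minimal sketch in Python. The school profiles and the alternative weighting scheme below are entirely hypothetical (only the 5%, 8% and 7% figures come from the reporting above), but they show how the same underlying statistics can produce opposite orderings depending on which weights the ranker happens to pick:

```python
# Illustrative sketch: how arbitrary weights reorder a "ranking."
# The school profiles are hypothetical normalized scores (0-100) on three
# of the metrics discussed above; they are not real institutional data.
schools = {
    "School A": {"social_mobility": 90, "graduation_rate": 70, "selectivity": 60},
    "School B": {"social_mobility": 50, "graduation_rate": 85, "selectivity": 90},
}

def composite(profile, weights):
    """Weighted average of metric scores, the general shape of a composite ranking score."""
    return sum(profile[m] * w for m, w in weights.items()) / sum(weights.values())

# Weights echoing the relative emphases cited above (5%, 8%, 7%)...
status_quo = {"social_mobility": 0.05, "graduation_rate": 0.08, "selectivity": 0.07}
# ...versus a hypothetical scheme that puts social mobility first.
mobility_first = {"social_mobility": 0.12, "graduation_rate": 0.05, "selectivity": 0.03}

for label, weights in (("status quo", status_quo), ("mobility-first", mobility_first)):
    order = sorted(schools, key=lambda s: composite(schools[s], weights), reverse=True)
    print(f"{label} weights -> {' > '.join(order)}")
```

Under the status-quo weights, the more selective school comes out on top; shift a few percentage points toward social mobility and the order flips, even though neither school changed at all. The ranking reflects the ranker’s priorities, not the schools’.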
Data published in Forbes show that low-income students make up over 65% of college dropouts, while students with a family income of $100,000 or more are 50% more likely to graduate than their peers from low-income households. The gap is even starker for first-generation students: roughly 90% of first-generation, low-income (FGLI) students do not graduate within six years. In the context of social mobility and societal wellbeing, FGLI students arguably stand to gain the most from a college education, which can increase lifetime earnings by hundreds of thousands of dollars, yet current ranking systems may discourage colleges from admitting students from these backgrounds.
Though college rankings are ostensibly intended to help students choose the college that offers the best opportunities for them, these lists’ promotion of selectivity and eliteness distracts from the true promises and benefits of higher education. Last month, Education Secretary Miguel Cardona publicly slammed college ranking systems that prioritize the exclusive tendencies of many highly selective institutions over a university’s ability to meet the needs of its students, calling them “a joke.”
In the wake of these criticisms, a few alternative lists that assess colleges on more relevant metrics have appeared. In 2017, The New York Times published a study evaluating colleges on the socioeconomic stratification of their student bodies and the economic mobility of their graduates. Tufts, like many elite universities, ranked among the top 10 schools with more students from the top 1% of the income distribution than from the bottom 60%, and elite colleges were conspicuously absent from the top of the companion list measuring social mobility for low-income graduates. Washington Monthly magazine also ranks colleges on metrics like social mobility, research, public service and return on investment; one of its lists, which estimates college affordability for lower-income students, places Tufts at a staggering 209th among colleges in the Northeast alone.
As students, we must push our universities to enact policies that benefit every student, regardless of the impact on rankings. Ending legacy preferences in admissions, for example, has been shown to increase economic diversity among student bodies, with a notable rise in the percentage of Pell Grant recipients. In turn, universities are responsible for prioritizing accountability to students rather than arbitrary measures of prestige and exclusivity among peer institutions. On a societal level, it’s time to reevaluate the utility of ranked lists such as U.S. News’. Access to an array of publicly available metrics can be useful to prospective and current students, alumni and employers alike, but we must choose to value universities on our own terms and refuse to yield to the often arbitrary rankings handed to us each year.