Students need to research beyond the U.S. News and World Report rankings

November 27, 2017 — by Victor Liu

The system that the most popular ranking sites use to rank colleges rests on seven differently weighted variables: graduation and retention rates, undergraduate academic reputation, faculty resources, student selectivity, financial resources, graduation rate performance, and alumni giving rate.

 

When seniors finalize their lists of prospective colleges each September, it’s hard for them to determine which schools are “better” than others.

College ranking systems like the U.S. News and World Report try to answer this question. The U.S. News and World Report ranking is arguably the most prominent of these systems, recording 2.6 million unique users and 18.9 million page views on a single day in 2014.

The system the U.S. News and World Report uses to rank colleges rests on seven differently weighted variables: graduation and retention rates, weighted at 22.5 percent; undergraduate academic reputation, at 22.5 percent; faculty resources, at 20 percent; student selectivity, at 12.5 percent; financial resources, at 10 percent; graduation rate performance, at 7.5 percent; and alumni giving rate, at 5 percent.
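To make the arithmetic concrete, a composite score like this is essentially a weighted sum of per-category scores. The short sketch below uses the weights listed above; the 0-to-100 category scores for a hypothetical “Example University,” and the assumption that every category is already normalized to that scale, are made up for illustration and are not the magazine’s actual methodology.

```python
# Sketch of a weighted composite score. Weights come from the article;
# the per-category scores for "Example University" are hypothetical.

WEIGHTS = {
    "graduation_and_retention": 0.225,
    "academic_reputation": 0.225,
    "faculty_resources": 0.20,
    "student_selectivity": 0.125,
    "financial_resources": 0.10,
    "graduation_rate_performance": 0.075,
    "alumni_giving": 0.05,
}  # weights sum to 1.0 (100 percent)

# Hypothetical category scores, each assumed normalized to a 0-100 scale.
example_university = {
    "graduation_and_retention": 92,
    "academic_reputation": 78,
    "faculty_resources": 85,
    "student_selectivity": 70,
    "financial_resources": 65,
    "graduation_rate_performance": 88,
    "alumni_giving": 40,
}

def composite_score(scores: dict[str, float]) -> float:
    """Return the weighted sum of a school's category scores."""
    return sum(WEIGHTS[category] * scores[category] for category in WEIGHTS)

if __name__ == "__main__":
    print(f"Composite score: {composite_score(example_university):.1f}")
```

Because graduation and retention rates and academic reputation together account for 45 percent of the total, large differences in the smaller categories move the final score relatively little.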

But there are two major problems with these kinds of rankings. First, universities get caught up in a rat race trying to improve or maintain their positions. Second, a fixation on a school’s ranking may lead students to spend their next four years at an institution that isn’t the best fit for them.

For colleges, falling down the rankings means fewer applicants and less prestige.

Among the magazine’s criteria, academic reputation stands out as one of the biggest determinants of a college’s ranking, comprising almost one-fourth of the college’s entire score.

However, the methodology the U.S. News and World Report uses to determine reputation is faulty at best. To gather data about a college’s reputation, the magazine surveys deans and presidents at other schools and asks them to rate that college’s renown.

But how can each dean possibly know about every other college’s reputation? This uncertainty feeds into a never-ending cycle. If schools want to improve in the standings, they need to win recognition from other colleges, but they can’t win that recognition if they don’t rank high enough in the first place.

To solve this problem, some schools have resorted to artificially manipulating numbers to elevate their rankings. In 2012, for example, Claremont McKenna admitted that it had inflated applicants’ SAT scores for seven years, sending incorrect numbers to the U.S. News and World Report to improve its student selectivity score. Similarly, in 2008, Baylor University offered already-admitted students financial incentives to retake the SAT in hopes of boosting the school’s average scores.

For students, buying into the U.S. News and World Report and other rankings is also problematic. By looking only at the surface-level scores the rankings provide, rather than digging into the specifics of a college, students can be misled into attending a school that doesn’t suit their needs for the next four years.

The list also doesn’t cater to the specific needs of a student. With rigid criteria that weigh every school’s qualities under the same rubric, an aspect of a college that matters enormously to one student can get overshadowed.

For example, affordability, arguably one of the most crucial aspects of a college in a time of constantly rising tuition, isn’t even factored into the rankings. While the U.S. News and World Report does include financial resources as a category, that category doesn’t account for financial aid. A college that perfectly fits a student’s financial needs can get buried under a pile of schools with excellent alumni donations.

Put simply, college ranking systems such as the U.S. News and World Report’s don’t work because they try to grade all colleges and universities under a one-size-fits-all rubric. The U.S. News and World Report shouldn’t be a one-stop shop for learning about colleges; instead, students need to learn about a college by doing their own research.

For starters, students need to look at what a school can specifically offer them, not at what the list assumes they will value. Rankings can try to predict what a student wants in a college, but ultimately only the students themselves know what they want.

Instead of haphazardly Googling “best colleges” to decide where they want to spend arguably the most formative years of their lives, students should search for something more nuanced, such as “most connections with ecological consulting firms” or “highest average funding per undergraduate researcher.” Only then can students get a solid grasp of the colleges they want to attend, rather than a vague sense of which colleges have best played by the arbitrary rules of the rankings.

 
