The Rankings Go Global

The Times Higher Education Supplement has now come out with its sixth annual listing of the world’s top universities. Harvard continues to top the list, followed by the university in that other Cambridge across the Pond, which has now edged out Yale. The big news this year: the number of North American universities in the top 100 dropped from last year’s 42 to 36, while Asian universities are coming on strong.
I typically react to such news items in three stages. First, OMG, American higher education is tanking. Then I begin to fear that U.S. News & World Report copycats are taking over the world. Then the left side of my brain checks in and I ask myself whether such international comparisons are worth the bother.
Let’s take these reactions one at a time, not necessarily in chronological order.


The easiest question to answer is that, yes, U.S. News-style rankings are indeed taking over the world. In about 40 countries someone, usually a commercial publisher with non-academic priorities, issues rankings of the nation’s universities, and at least two cross-national surveys are now widely followed. The first to appear, in 2003, was the “Academic Ranking of World Universities,” which is published by Shanghai Jiao Tong University in China.
It covers 500 of the world’s best universities and ranks them almost entirely on the quantity and quality of their faculty research. The Times (London, that is) began publishing its “World University Rankings” the following year, and its list generally attracts more attention from the public. That’s because, instead of focusing only on faculty research, it seeks to present a more “rounded assessment” that would appeal to the readers of a general periodical. Half of its rankings are based on reputational surveys among academics and employers.
Both sets of rankings are already widely followed by policy makers in countries around the world. Countries seeking to enhance the prestige of their higher education systems use them as a guide to which institutions warrant major infusions of funds. The rankings also inform decisions such as which foreign universities are worthy destinations for students receiving state scholarships.
Such international rankings are likely to proliferate in coming years for many of the same reasons that U.S. News hit upon the idea in the mid-1980s. That was the time when the last of the baby boomers were finishing up their undergraduate degrees, and colleges facing the prospect of lower enrollments began marketing themselves more aggressively. That created an opening for publishers to come in on the side of consumers with products such as my own Fiske Guide to Colleges and, of course, the U.S. News rankings. It was the colleges that first sent out the message that higher education was a consumer product, and students and their parents took it to heart. If there is anything that Americans know how to do well, it is to become savvy consumers.
This consumer mindset in higher education is now a global phenomenon, and we can expect to see more attempts to rank universities. The European Union is already working on a new set of “multi-dimensional global university rankings” that focuses mainly on European institutions. The Organization for Economic Cooperation and Development (OECD) is working on its own International Assessment of Higher Education Learning Outcomes, which takes account of “outcome” measures such as how much students learn. For a thoughtful description of these efforts, see Ben Wildavsky’s recent article in Washington Monthly.
Should Americans lament the fact that six U.S. universities fell out of the top 100 on this year’s Times list? Leaving aside methodological problems (more on those in a minute), it’s probably fair to say that the democratizing of quality higher education is inevitable and probably a good thing. It also mimics what happened in the U.S. During the second half of the 20th century the U.S. underwent a healthy dispersion of quality higher education. Whereas academic elites were traditionally concentrated on the two coasts, the vast expansion of American higher education following World War II reached the hinterlands as well. Columbia University, for example, which once dominated the social sciences, could no longer do so. The University of Texas at Austin became a center of high-energy physics. And so on. The same spreading of academic wealth is now taking place on a global scale. Faculty members at the frontiers of research in most fields now routinely collaborate with peers in multiple other countries, and the same dispersion is likely to apply to entire institutions around the world. China, which has huge quality problems once you dip below a few elite universities, is making massive investments in new universities. India is doing likewise.
One can debate whether current rankings favor or discriminate against American universities. Based as they are almost entirely on measures of research strength, the Shanghai listings probably favor strong U.S. research universities. One factor may be that European countries tend to carry out research not in universities but in separate research institutions. On the other hand, Phil Baty, deputy editor at the Times, told Scott Jaschik of Inside Higher Ed that their formula works against U.S. universities because the citations index favors institutions where most faculty are in medicine or the hard sciences, which downgrades the rankings of U.S. universities that emphasize the humanities and social sciences as well. U.S. universities also score relatively low on two other factors: the proportion of international faculty and of international students.
While the proliferation of global rankings of universities seems inevitable, so do concerns about some of the methodological weaknesses of the U.S. News rankings that seem to have inspired them. The big issue revolves around the use of “reputational” surveys that ask presidents and other high-level officials to offer opinions on the quality of other colleges and universities. Beyond the fact that this criterion invites gaming the system – a whistleblower at Clemson recently accused her superiors of doing just that – one has to wonder whether college presidents might not be the last people you would ever want to seek out for reliable information on their competitors. Reputational surveys account for 25 percent of the U.S. News rankings, but the Times Higher Education Supplement doubles the ante, giving them a weight of 50 percent. How many leaders of the top 100 world universities know anything at all about more than a handful of other universities outside their own borders? But if you take away reputational surveys, you don’t have much other data by which to make cross-country comparisons. That’s one reason the Shanghai listings stick to research outputs – a narrow perspective, to be sure, but one that at least has some semblance of statistical reliability.
The methodological weaknesses of comparative rankings are a huge topic for another day, but let me mention just two that do not always receive much attention here in the U.S.
The first has to do with the unitary scale – schools listed 1, 2, 3 and so forth – which tends to make institutions appear either more or less similar than they really are. One study of the 2009 U.S. News rankings concluded that the difference between the #1 and #3 institutions was actually greater than the difference between #11 and #33.
The second has to do with issues of equity. It is well known that U.S. colleges and universities have found all kinds of ways to game the system. Some of these involve outright deceit, such as not reporting the SAT scores of all freshmen or excluding adjunct faculty members from faculty counts. But others constitute honest – at least not deceitful – attempts to alter the numbers. These include placing greater emphasis on SAT scores in admissions decisions, awarding financial aid on the basis of merit rather than need, and taking more students through early decision. Such policies clearly work in favor of privileged students from quality high schools, both public and private, and punish schools that take seriously their obligation to enhance access to higher education for disadvantaged students.
U.S. universities are already doing a poor job of enhancing access to higher education. Let’s not export this problem as well with more rankings that link academic prestige to the quality of incoming students.

Edward Fiske

Edward B. Fiske, who writes the Fiske Guide to Colleges (Sourcebooks), is a former education editor of the New York Times.
