THE WUR 2024: New Methodology, New Questions
Article
6 October 2023
Authors:
Liudmila Solntseva, Head of Research & Analytics
Roman Trikin, Senior Researcher, PhD
Overview
On 27 September 2023, Times Higher Education (THE) published the 20th annual World University Rankings (WUR), this time using a radically new methodology. This year's list includes 1,904 universities from 108 countries (compared with 1,799 last year). Entries from four countries or territories — Armenia, Bosnia and Herzegovina, Kosovo, and North Macedonia — made it onto the list for the first time.
In total, 179 universities are either new entries or have returned after being absent from last year's rankings; 60 of them were promoted from the previous edition's reporter list (Graph 1). At the same time, five universities were demoted to reporters (769 in total).
Graph 1. Dynamics of universities in THE WUR 2024 vs THE WUR 2023
This year saw 69 universities disappear from the rankings, most likely because they did not provide data to THE. E-Quadrat's observation is that most of these universities had previously shown negative dynamics in THE WUR. We can speculate that there are currently two major types of universities: those that feel "frustrated" with the rankings and those that are very eager to be ranked, preferably in the Top-100; the latter trend is much more prominent.
Many changes in the Top-500 — but not in the Top-100
The composition of the Top-100 usually remains very stable. The US still has the most entries (36) in this tier, although the number has been decreasing year after year; in 2010, for example, the US had 53 universities in the Top-100.
Table 1. Number of universities in the Top-500 (overall) by country and year
When we look at a bigger range — the Top-500 — there are many more changes. On average, universities that were in the Top-500 in THE WUR 2023 (published in 2022) have moved down in THE WUR 2024 (published this year) by more than 25 positions. In contrast, universities that had been in the Top-500 in THE WUR 2022 (published in 2021) lost about 10 positions in THE WUR 2023.
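As a rough illustration of how such an average shift can be computed, the sketch below merges two editions of the rankings and averages the change in position for last year's Top-500. The data, column names, and university labels are made up for the example; this is not THE's published data format.

```python
import pandas as pd

# Hypothetical extracts of two editions: each row is an institution
# with its overall position in that edition (illustrative figures only).
wur_2023 = pd.DataFrame({
    "university": ["University A", "University B", "University C"],
    "rank_2023": [120, 340, 480],
})
wur_2024 = pd.DataFrame({
    "university": ["University A", "University B", "University C"],
    "rank_2024": [150, 360, 515],
})

# Keep only institutions that were in last year's Top-500,
# then measure how far each of them moved in the new edition.
top500_2023 = wur_2023[wur_2023["rank_2023"] <= 500]
merged = top500_2023.merge(wur_2024, on="university", how="inner")
merged["shift"] = merged["rank_2024"] - merged["rank_2023"]  # positive = moved down

print(merged["shift"].mean())  # average number of positions lost by last year's Top-500
```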
In terms of country representation, the countries that lost the most entries in the Top-500 are Iran (–6), the UK (–5), and India (–4) (Table 1). Saudi Arabia lost two entries in this tier, while three countries — Greece, Nigeria, and Vietnam — also lost two entries each and dropped out of the Top-500 completely. China and Japan, on the contrary, were able to boost their presence in the Top-500 with four more entries each; Canada, Malaysia, South Korea, and Taiwan (China) have three new entries each; France, Switzerland, and the US have two more each.
The new methodology
We have looked at THE WUR 2024 through the lens of five top-level indicators, which have changed considerably this year. How has the new methodology affected the positions of countries and individual universities? And are there any peculiar results in this year's list?
A commonly recognised weak point of THE WUR (before the methodology changed) was its small expert base. Just two years ago, only around 10,000 authors with publications in Scopus participated in the annual survey. For comparison, QS has long surveyed over 100,000 experts, although some of these respondents may be HE managers rather than researchers. That is an issue for QS; THE, however, has also drawn criticism, for example for relying on only 100 experts to judge the reputation of every law school in the world.
There is now hope that this skew will be corrected: almost 40,000 experts participated in the latest THE survey (four times more than before). The current rankings take into account survey results from the previous two years, so the expert base has grown to more than 68,000 people.
It is also worth noting that THE's interaction with the expert community has changed significantly, and this has had an effect on this year's rankings. THE conducted the last two rounds of the Academic Survey itself, whereas previously the survey had been administered by Elsevier.
The survey results feed into two indicators, Teaching and Research Environment, and account for more than 50% of each. Although some components of these indicators have had their weights adjusted slightly, we believe that the sharp growth in the number of surveyed experts is behind the most striking changes in the rankings.
Let’s now take a closer look at the lists which appear when THE WUR 2024 are filtered by individual indicators — Teaching, Research Environment, Research Quality, Industry, and International Outlook.
Teaching
On average, universities that were in the Top-500 by Teaching in THE WUR 2023 (published in 2022) have moved down in THE WUR 2024 (published this year) by about 20 positions. In contrast, universities that had been in the Top-500 by Teaching in THE WUR 2022 (published in 2021) lost about 10 positions in THE WUR 2023.
As for countries, the US has lost the most universities (–7) in the Top-500 by Teaching, while India has gained the most in this tier (+8) (Table 2). Within the Top-100 by Teaching, Germany has lost the most entries (from seven to four) and Japan has gained the most (from four to seven).
Table 2. Number of universities in the Top-500 by Teaching (by country and year)
Research Environment
Although the rankings by Research Environment (formerly Research) have gained some new entries and lost some participants compared with the previous year, we do not observe striking changes which could have been caused by the new methodology.
Research Quality
One of THE's goals was to change the methodology so as to balance the effect of the indicator reflecting the quality of research. In 2022, this indicator was called Citations Impact and was equal to Elsevier's Field-Weighted Citation Impact (FWCI). In the new methodology, the weight of FWCI has been halved and three new parameters have been introduced, forming the aggregated Research Quality indicator. In this light, although the rankings by Research Quality / Citations Impact (we will use the new name throughout the rest of the text) are somewhat hard to compare year-over-year, the indicator still aims to reflect the overall quality of science, which is why the comparison seems valid.
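One way to picture the new aggregated indicator is as a weighted sum in which FWCI carries roughly half of the pillar and the remainder is split across the new metrics. The component names and weights below are an illustrative sketch of that structure, not THE's official calibration.

```python
# Illustrative weights for an aggregated Research Quality score (assumed, not official):
# FWCI keeps about half of the pillar, the rest is split evenly across the new metrics.
WEIGHTS = {
    "citation_impact": 0.5,       # FWCI, weight roughly halved vs the old indicator
    "new_metric_1": 1 / 6,
    "new_metric_2": 1 / 6,
    "new_metric_3": 1 / 6,
}

def research_quality(scores: dict) -> float:
    """Combine component scores (each already normalised to a 0-100 scale)."""
    return sum(WEIGHTS[name] * scores[name] for name in WEIGHTS)

# Example: a university with a very high FWCI but weak scores on the new metrics
# now ends up noticeably lower than under the old, FWCI-only indicator.
print(research_quality({
    "citation_impact": 95,
    "new_metric_1": 40,
    "new_metric_2": 35,
    "new_metric_3": 30,
}))  # -> 65.0
```

Under this kind of aggregation, the pattern described above (last year's FWCI-driven leaders sliding down the table) is exactly what one would expect.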
Last year’s list of the Top-6 universities by Research Quality looked quite odd as it did not feature any of the “usual suspects” but included entries from Vietnam, Türkiye, Iran, and Ethiopia (Table 3). This year, these universities have moved far down the table. At the same time, the changes in the methodology have brought some of the world’s (probably) best-known universities to the top positions by Research Quality (Table 4). We look forward to seeing next year’s rankings to check whether the methodology continues to deliver similar results.
Table 3. Top-6 universities by Research Quality in 2022 and their positions in 2023
Table 4. Top-5 universities by Research Quality in 2023 and their previous positions
Just over half (58) of the universities that were in the Top-100 by Research Quality in 2022 have remained in this tier in 2023. Geographical representation, too, has seen major changes: of the 35 countries in the Top-100 by Research Quality in 2022, 21 have dropped from the tier, while eight new ones have entered. Iran has lost the most universities (–8) but still keeps one entry (Table 5). At the same time, Egypt and India have lost three universities each, and Saudi Arabia and Vietnam two each, which has made these four countries drop out of the Top-100 altogether.
Table 5. Number of universities in the Top-100 by Research Quality (by country and year)
On the other hand, we can highlight some success stories. The Netherlands has strengthened its presence in the Top-100 by Research Quality from one to six universities; Hong Kong is now represented by four entries instead of one; and Switzerland has three universities in the Top-100 compared with none in 2022.
The changes in the Research Quality indicator were meant to reflect the quality of science at various universities more comprehensively, which is why the indicator now contains several metrics. We speculate that the changes in the methodology have resulted in an even higher number of entries from the US (35) and the UK (22) in this year's Top-100 by Research Quality.

Industry
Industry — formerly called Industry Income — is now a composite indicator that consists not only of the industry income metric itself but also of a patents metric (the number of patents citing the university's research).
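A minimal sketch of how such a composite could be put together is shown below. The equal split between the two components and the min-max scaling are our assumptions for illustration; THE's actual normalisation procedure is not reproduced here.

```python
# Illustrative combination of the two Industry components (assumed 50/50 split).

def minmax(value: float, lo: float, hi: float) -> float:
    """Scale a raw value to 0-100 within the observed range [lo, hi]."""
    return 100 * (value - lo) / (hi - lo) if hi > lo else 0.0

def industry_score(income_per_staff, patent_citations, income_range, patent_range):
    income_component = minmax(income_per_staff, *income_range)
    # patent_citations: number of patents citing the university's research
    patent_component = minmax(patent_citations, *patent_range)
    return 0.5 * income_component + 0.5 * patent_component

# Example: strong industry income but few patent citations now yields only a
# middling score, which is one way the reshuffle described below can arise.
print(industry_score(
    income_per_staff=180_000, patent_citations=40,
    income_range=(10_000, 200_000), patent_range=(0, 1_000),
))
```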
There has been a major reshuffle in the geographical representation: of the 27 countries in the Top-100 by Industry in 2022, 11 have dropped from the tier, while two new ones have entered. As a result, diversity in the Top-100 has decreased, and the entire tier is now represented by just 18 countries.
China has lost the most universities (–10), followed by Türkiye (–8) and Russia (–5) (Table 6). Iran, Saudi Arabia, and Taiwan (China) have lost two entries each. Only China and Taiwan (China) have managed to remain in the Top-100 by Industry, while the rest of the countries mentioned have dropped out completely.
Table 6. Number of universities in the Top-100 by Industry (by country and year)
On the positive side, Australia — after some absence — has made a comeback to the Top-100 by Industry, with seven universities this year compared with none in 2022. Germany and the US were also successful in the Top-100, having added six and five entries, respectively.
It is also noteworthy that the UK has entered the Top-100 by Industry for the first time in many years, even though the country is among the best represented in the Top-100 of the overall rankings and of the rankings by other indicators.
Just over half (52) of the universities that were in the Top-100 by Industry in 2022 have remained in this tier in 2023. At the same time, those universities that have dropped from the Top-100 by Industry have lost a lot of ground: 25 entries have moved down by more than 100 positions each, and 11 by more than 500. The Top-5 "outsiders" from last year's Top-100 have lost more than 650 positions each (Table 7). On the other hand, among the universities that have gained the most, the Top-5 consists of entries that have jumped from positions of 1,500 or lower by more than 1,000 places each (Table 8).
Table 7. Top-5 universities which lost the most positions (dropping from the Top-100 by Industry in 2022)
Table 8. Top-5 universities which gained the most positions (considerable growth by Industry in 2023)
In the case of the Industry indicator, the changes in the methodology are meant to improve the evaluation of how effectively universities interact with business and engage in technology transfer. This year’s results, calculated using the new approach, have shown radical shifts in the rankings — both at the university level and at the country level.
International Outlook
This top-level indicator has received a new component, Studying Abroad; however, the addition has not affected this year's rankings, because Studying Abroad has been given zero weight for now. Although the new metric will only play a role next year, the International Outlook indicator has still changed in the current edition: it is now normalised by population.
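The exact form of the population adjustment is not described here, so the snippet below is only a schematic illustration of what "normalised by population" could mean in practice; the logarithmic form and the reference population are our assumptions, not THE's procedure.

```python
import math

# Schematic country-level adjustment applied to a raw International Outlook score.
# REFERENCE_POPULATION and the logarithmic scaling are illustrative assumptions.
REFERENCE_POPULATION = 60_000_000

def normalised_outlook(raw_score: float, country_population: int) -> float:
    """Scale a raw 0-100 International Outlook score by country size."""
    adjustment = math.log10(country_population) / math.log10(REFERENCE_POPULATION)
    return min(100.0, raw_score * adjustment)

# Under this toy adjustment, universities in very large countries are no longer
# mechanically penalised for having a smaller *share* of international students and staff.
print(round(normalised_outlook(80.0, 5_500_000), 1))    # small country
print(round(normalised_outlook(80.0, 330_000_000), 1))  # large country
```

A country-size adjustment of this general kind would be consistent with the US returning to the Top-100 by International Outlook this year.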
Among this year’s success stories are the UK which now occupies almost half (48) of the Top-100 positions by International Outlook (up from 38 positions last year), and the US which has returned to the Top-100 with two entries in 2023 after absence in 2022 and 2021.
Final thoughts
This year has seen big changes in the methodologies used by the two leading agencies (former partners and now competitors), Quacquarelli Symonds (QS) and Times Higher Education (THE). This step was certainly necessary, as the landscape of higher education has changed significantly over the past 20 years. Today's university is multifunctional: it does not just serve the dual purpose of teaching and research but is also involved in technology transfer, innovation, and solving social problems. Consequently, it was necessary to introduce new metrics and reconfigure the entire evaluation system.
So what has happened, actually?
Perhaps some of the issues have been solved; however, the QS rankings have become significantly less transparent, while the THE rankings are now not transparent at all.
What are the usual arguments against the rankings? We can group the issues into three areas:
  • quality of the metrics and their weights;
  • lack of transparency;
  • stimulation of fierce competition without supporting the diversity of universities.
Recently, criticism of the rankings has often gone hand in hand with démarches by some universities, which have gone as far as to stop providing their data to the ranking agencies and, as a consequence, have dropped out altogether. Examples include South Korea's leading universities, which jointly announced their withdrawal from the QS WUR, and Utrecht University in the Netherlands, which no longer wants to put effort into working with ranking agencies. For now, this trend has not involved many universities, but if it intensifies, the role of the rankings — as well as the business of the ranking agencies — will be at risk.
We believe it is better not to attract criticism or provoke universities into boycotting the rankings. Greater transparency in the methodology would be beneficial; it would also help to publish last year's results recalculated under the new methodology. That would make it easier to see which universities' positions have changed sharply because of the new methodology rather than as a result of internal transformation.