Tuesday, February 27, 2018

Are the rankings biased?

Louise Richardson, vice-chancellor of the University of Oxford, has published an article in the Financial Times proclaiming that British universities are a national asset and that their researchers deserve the same adulation as athletes and actors.

"Listening to the public discourse one could be forgiven for thinking that the British higher education system is a failure. It is not. It is the envy of the world."

That is an unfortunate phrase. It used to be asserted that the National Health Service was the envy of the world.

She cites as evidence for university excellence the Times Higher Education World University Rankings, which have three British universities in the world's top ten and twelve in the top one hundred. These rankings also, although she does not mention it here, put Oxford in first place.

There are now, according to IREG, 21 global university rankings. One wonders why a world-class scholar and head of a world-class university would choose rankings that regularly produce absurdities such as Anglia Ruskin University ahead of Oxford for research impact and Babol Noshirvani University of Technology its equal.

But perhaps it is not really surprising, since of those rankings THE is the only one to put Oxford in first place. In the others its rank ranges from third in the URAP rankings, published in Ankara, to seventh in the Shanghai Rankings (ARWU), Webometrics (WEB) and the Russian Round University Ranking (RUR).

That leads to the question of how far the rankings are biased in favor of universities in their own countries.

Below is a quick and simple comparison of how top universities perform in rankings published in the countries where they are located and in other rankings.

I have looked at the rank of the top-scoring home-country university in each of eleven global rankings and then at how well that university does in the other rankings. The table below gives the overall rank of each "national flagship" in the most recent editions of eleven global university rankings. The rank in the home-country ranking is marked with an asterisk.

We can see that Oxford does better in the Times Higher Education (THE) world rankings, where it is first, than in the others, where its rank ranges from 3rd to 7th. Similarly, Cambridge is the best-performing UK university in the QS rankings, where it is 5th. It is 4th in the Center for World University Rankings (CWUR), now published in the UAE, and 3rd in ARWU. In the other rankings it does less well.

ARWU, the US News Best Global Universities (BGU), Scimago (SCI), Webometrics (WEB), URAP, the National Taiwan University Rankings (NTU), and RUR do not seem to be biased in favour of their countries' flagship universities. For example, URAP ranks Middle East Technical University (METU) 532nd, which is worse than its rank in six other rankings and better than in three.

CWUR used to be published from Jeddah in Saudi Arabia but has now moved to the Emirates, so I count the whole Arabian Peninsula as its home. The top home university is therefore King Saud University (KSU), which is ranked 560th, worse than in any other ranking except THE.

The GreenMetric Rankings, produced by Universitas Indonesia (UI), have that university in 23rd place, very much better than in any other ranking.

It looks as though THE, GreenMetric and, to a lesser extent, QS are biased towards their top home-country institutions.

This only refers to the best universities, and we might get a different result by looking at all the ranked universities.

There is a paper by Chris Claassen that does this, although it covers fewer rankings.



|               | THE     | ARWU    | QS      | BGU | SCI  | WEB  | URAP | NTU     | RUR  | CWUR | GM  |
|---------------|---------|---------|---------|-----|------|------|------|---------|------|------|-----|
| Oxford        | 1*      | 7       | 6       | 5   | 6    | 7    | 3    | 5       | 7    | 5    | 6   |
| Tsinghua      | 35      | 48*     | 25      | 64  | 8    | 45   | 25   | 34      | 75   | 65   | NR  |
| Cambridge     | 4       | 3       | 5*      | 7   | 16   | 11   | 9    | 12      | 9    | 4    | NR  |
| Harvard       | 6       | 1       | 3       | 1*  | 1    | 1    | 1    | 1       | 1    | 1    | NR  |
| Barcelona     | 201-250 | 201-300 | 156     | 81  | 151* | 138* | 46   | 64      | 212  | 103  | 180 |
| METU          | 601-800 | 701-800 | 471-480 | 314 | 489  | 521  | 532* | 601-700 | 407  | 498  | NR  |
| NTU           | 195     | 151-200 | 76      | 166 | 342  | 85   | 100  | 114*    | 107  | 52   | 92  |
| Lomonosov MSU | 188     | 93      | 95      | 267 | 342  | 235  | 194  | 236     | 145* | 97   | NR  |
| KSU           | 501-600 | 101-150 | 221     | 377 | NR   | 424  | 192  | 318     | 460  | 560* | NR  |
| UI            | 600-800 | NR      | 277     | NR  | 632  | 888  | 1548 | NR      | NR   | NR   | 23* |

(* = rank in the ranking published in the university's home country; NR = not ranked)

Tuesday, February 20, 2018

Is Erdogan Destroying Turkish Universities?


An article by Andrew Wilks in The National claims that the position of Turkish universities in the Times Higher Education (THE) world rankings, especially that of Middle East Technical University (METU), has been declining as a result of the crackdown by President Erdogan following the unsuccessful coup of July 2016.

He claims that Turkish universities are now sliding down the international rankings and that this is because of the decline of academic freedom, the dismissal or emigration of many academics, and a decline in the country's academic reputation.


'Turkish universities were once seen as a benchmark of the country’s progress, steadily climbing international rankings to compete with the world’s elite.
But since the introduction of emergency powers following a failed coup against President Recep Tayyip Erdogan in July 2016, the government’s grip on academic freedom has tightened.
A slide in the nation's academic reputation is now indisputable. Three years ago, six Turkish institutions [actually five] were in the Times Higher Education’s global top 300. Ankara's Middle East Technical University was ranked 85th. Now, with Oxford and Cambridge leading the standings, no Turkish university sits in the top 300.
Experts say at least part of the reason is that since the coup attempt more than 5,800 academics have been dismissed from their jobs. Mr Erdogan has also increased his leeway in selecting university rectors.
Gulcin Ozkan, formerly of Middle East Technical University but now teaching economics at York University in Britain, said the wave of dismissals and arrests has "forced some of the best brains out of the country".'
I have no great regard for Erdogan but in this case he is entirely innocent.

There has been a massive decline in METU's position in the THE rankings since 2014 but that is entirely the fault of THE's methodology. 

In the world rankings of 2014-15, published in 2014, METU was 85th in the world, with a whopping score of 92.0 for citations, which carries an official weighting of 30%. That score was the result of METU's participation in the Large Hadron Collider (LHC) project, which produces papers with hundreds or thousands of authors and hundreds or thousands of citations. In 2014 THE counted every single contributor as receiving all of the citations. Added to this was a regional modification that boosted the scores of universities located in countries with a low citation impact score.

In 2015, THE revamped its methodology by not counting the citations to these mega-papers and by applying the regional modification to only half of the research impact score.
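To see the scale of that change, here is a minimal sketch of the two counting rules applied to a toy portfolio. All figures are invented for illustration; THE's real calculation also normalises by field and year.

```python
# Hypothetical sketch of the 2014 vs 2015 treatment of mega-papers.
# Paper counts and citation numbers are invented, not METU's actual data.

def citation_score(papers, exclude_mega=False, author_threshold=1000):
    """Average citations per paper; optionally drop 'mega-papers'."""
    if exclude_mega:
        papers = [p for p in papers if p["authors"] < author_threshold]
    if not papers:
        return 0.0
    return sum(p["citations"] for p in papers) / len(papers)

# A small portfolio plus a few LHC-style mega-papers.
portfolio = [{"authors": 5, "citations": 10}] * 200 \
          + [{"authors": 2900, "citations": 4000}] * 5

print(citation_score(portfolio))                     # ~107: mega-papers dominate
print(citation_score(portfolio, exclude_mega=True))  # 10: they are simply ignored
```

A handful of mega-papers can multiply the average tenfold under the old rule and contribute nothing under the new one, which is exactly the shape of METU's rise and fall.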

As a result, in the 2015-16 rankings METU crashed to the 501-600 band, with a score for citations of only 28.8. Other Turkish universities had also been involved in the LHC project and benefited from the citations bonus and they too plummeted. There was now only one Turkish university in the THE top 300.

The exalted position of METU in the THE 2014-15 rankings was the result of THE's odd methodology, and its spectacular tumble was the result of changing that methodology. In other popular rankings METU seems to be slipping a bit, but it never went as high as in THE in 2014 or as low as in 2015.

In the QS world rankings for 2014-15 METU was in the 401-410 band; by 2017-18 it had fallen to the 471-480 band.

The Russian Round University Rankings had it at 375th in 2014 and 407th in 2017. The US News Best Global Universities placed it 314th last year.

Erdogan had nothing to do with it.

Friday, February 16, 2018

It's happened: China overtakes USA in scientific research

Last November I noted that the USA was barely managing to hold onto its lead over China in scientific research as measured by articles in the Scopus database. At the time, there were 346,425 articles with a Chinese affiliation and 352,275 with a US affiliation for 2017.

As of today, there are 395,597 Chinese and 406,200 US articles dated 2017.

For 2018 so far, the numbers are 53,941 Chinese and 49,428 US.

There are other document types listed in Scopus and the situation may change over the course of the year.

Also, the United States still has a much smaller population than China, so it maintains its lead in per capita research production. For the moment.
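As a rough check on that per capita point, here is a small calculation. The article counts are those given above; the population figures (in millions) are approximate 2017 values that I have supplied, not data from Scopus.

```python
# Rough per-capita comparison for 2017 articles in Scopus.
# Article counts are from the post; population figures are approximate
# 2017 totals (my own assumption, not from the original source).

articles = {"China": 395_597, "USA": 406_200}
population_m = {"China": 1386, "USA": 325}  # millions, approximate

for country in articles:
    per_million = articles[country] / population_m[country]
    print(f"{country}: {per_million:.0f} articles per million people")

# USA: ~1250 per million; China: ~285 per million -- the US per-capita
# lead is still large even as the absolute totals converge.
```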

Saturday, February 10, 2018

Influence of Rankings on State Policy: India

In case you are wondering why the Indian media get so excited about the THE and QS rankings and not about those that are just as good or better, such as the Leiden Ranking, RUR or the Nature Index, see this from the University Grants Commission.

Note that it says "any time" and that only the Big Three rankings count for getting Assistant Professor jobs.


"NEW DELHI:  University Grants Commission (UGC) has come up with, UGC Regulations 2018, which exempts PhD candidates from having NET qualification for direct recruitment to Assistant Professor post. This new draft regulation is known as Minimum Qualifications for Appointment of Teachers and Other Academic Staff in Universities and Colleges and Measures for the Maintenance of Standards in Higher Education. Further the Commission has also listed 'Ph.D degree from a university/ institution with a ranking in top 500 in the World University ranking (at any time) by Quacquarelli Symonds (QS), the Times Higher Education (THE) and Academic Ranking of World Universities (ARWU) of the Shanghai Jiao Tong University (Shanghai),' as one of the criteria for Assistant Professor appointment."


Thursday, February 08, 2018

Playing the Rankings Game in Pakistan

This article by Pervez Hoodbhoy from October 2016 is worth reading:

"A recently released report by Thomson-Reuters, a Canada based multinational media firm, says, “In the last decade, Pakistan’s scientific research productivity has increased by more than 4 times, from approximately 2000 articles per year in 2006 to more than 9000 articles in 2015. During this time, the number of Highly Cited Papers (HCPs) featuring Pakistan based authors increased tenfold from 9 articles in 2006 to 98 in 2015.”
This puts Pakistan well ahead of Brazil, Russia, India, and China in terms of HCPs. As the reader surely knows, every citation is an acknowledgement by other researchers of important research or useful new findings. The more citations a researcher earns, the more impact he/she is supposed to have had upon that field. Research evaluations, through multiple pathways, count for 50-70 percent of a university’s ranking (if not more).
If Thomson-Reuters has it right, then Pakistanis should be overjoyed. India has been beaten hollow. Better still, two of the world’s supposedly most advanced countries–Russia and China–are way behind. This steroid propelled growth means Pakistan will overtake America in just a decade or two.
But just a little analysis shows something is amiss. Surely a four-fold increase in scientific productivity must have some obvious manifestations. Does one see science laboratories in Pakistani universities four times busier? Are there four times as many seminars presenting new results? Does one hear animated discussions on scientific topics four times more frequently?
Nothing’s visible. Academic activity on Pakistani campuses might be unchanged or perhaps even less today, but is certainly not higher than ten years ago. So where–and why–are the authors of the HCP’s hiding? Could it be that these hugely prolific researchers are too bashful to present their results in departmental seminars or public lectures? The answer is not too difficult to guess."

Should Pakistan Celebrate the Latest THE Asian Rankings?


This is an update and revision of a post from a few days ago.


There appears to be no end to the craze for university rankings. The media in many parts of the world show almost as much interest in global university rankings as in the Olympics or the World Cup. They are now used to set requirements for immigration, to choose research collaborators, external examiners and international partners, and for marketing, public relations and recruitment.

Pakistan has not escaped the craze although it was perhaps a bit slower than some other places. Recently, we have seen headlines announcing that ten Pakistani universities are included in the latest Times Higher Education (THE) Asian rankings and highlighting the achievement of Quaid-i-Azam University (QAU) in Islamabad reaching the top 100.

Rankings are unavoidable and sometimes they have beneficial results. The first publication of the research-based Shanghai rankings in 2003, for example, was a salutary shock to continental European universities and a clear demonstration of how far China had to go to catch up with the West in the natural sciences. But rankings do need to be treated with caution especially when ranking metrics are badly and obviously flawed.

THE note that there are now ten Pakistani universities in the Asian rankings and one, QAU, in 79th place, which would appear to be evidence of academic progress.

Unfortunately, Pakistani universities, especially QAU, do very much better in the THE rankings than in others. QAU is in the 401-500 band in the THE world rankings, which use the same indicators as the Asian rankings. But in the QS World University Rankings it is in the 650-700 band, and it does not appear at all among the 800 universities ranked in the Shanghai rankings, the 903 in the Leiden Ranking, or the 763 in the Russian Round University Rankings. In the University Ranking by Academic Performance, published in Ankara, it is 605th; in the Center for World University Rankings list it is 870th.

How can we explain QAU's success in the THE world and Asian rankings, success so much greater than in any other ranking? It is in large part the result of a flawed methodology.

Take a look at the scores that QAU got in the THE rankings. In all cases the top scoring university gets 100.

For Teaching, which combines five indicators, it was 25.7, which is not very good. For International Outlook it was 42.1. Since QAU has very few international staff or students, this mediocre score is very probably the result of a high score for international collaboration.

For research income from industry it was 31.8. This is probably an estimate since exactly the same score is given for four other Pakistani universities.

Now we come to something very odd. QAU's research score was 1.3, the lowest of the 350 universities in the Asian rankings and very much lower than the next worst, Ibaraki University in Japan with 6.6. The research score is composed of research reputation, publications per faculty and research income per faculty. This probably means that QAU's score for research reputation was zero or close to zero.

In contrast, QAU’s score of 81.2 for research impact measured by citations is among the best in Asia. Indeed, in this respect it would appear to be truly world class with a better score than Monash University, the Chinese University of Hong Kong, the University of Bologna or the University of Nottingham.

How is it possible that QAU could be 7th in Asia for research impact but 350th for research?

The answer is that THE's research impact indicator is extremely misleading. It does not simply count citations: it normalises them across more than 300 fields, five years of publication and up to six years of citations. This means that a few highly cited papers in a strategic discipline at a strategic time can have a disproportionate effect on the impact score, especially if the total number of papers is low.
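A toy calculation shows why a small output is so vulnerable. Every number here is invented, and THE's actual procedure (300-plus fields, five years of publications, six years of citations) is far more elaborate; this is only the shape of the arithmetic.

```python
# Toy illustration of field-normalised impact with a small output.
# All numbers are invented; none are QAU's actual figures.

def normalised_impact(papers):
    """Mean of citations divided by the world average for each paper's field."""
    return sum(p["cites"] / p["world_avg"] for p in papers) / len(papers)

# 50 ordinary papers cited at roughly the world average...
small_university = [{"cites": 5, "world_avg": 5.0}] * 50
print(normalised_impact(small_university))  # 1.0 -- exactly world average

# ...plus one strategically timed, highly cited paper.
small_university += [{"cites": 2000, "world_avg": 5.0}]
print(normalised_impact(small_university))  # ~8.8 -- one paper lifts everything
```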

Added to this is THE's regional modification, which means that the citation impact score of a university is divided by the square root of the score of the whole country in which the university is located. The score of universities in the top-scoring country remains the same, but that of all the others goes up: the worse the country's score, the bigger the increase. The effect of this is to give a big boost to countries like Pakistan. THE used to apply this bonus to the whole of the citations indicator but now applies it to only 50%.
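Here is a minimal sketch of that adjustment as just described, assuming scores on a 0-1 scale with the top-scoring country at 1.0; the numbers are illustrative only, not THE's actual data.

```python
# Sketch of THE's regional modification as described above: the raw
# citation score is divided by the square root of the country's overall
# score, and the adjustment now applies to only half of the indicator.

from math import sqrt

def regional_modification(university_score, country_score):
    adjusted = university_score / sqrt(country_score)
    return 0.5 * university_score + 0.5 * adjusted

# A university in a low-scoring country gets a large lift:
print(regional_modification(0.40, 0.25))  # 0.60 -- low-impact country, big boost
print(regional_modification(0.40, 1.00))  # 0.40 -- top country, unchanged
```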

Then we have to consider how THE deals with mega-papers, mainly in physics and medicine, those with hundreds or even thousands of authors and hundreds or thousands of citations.

Until the world rankings of 2015-16, THE treated every single author of such papers as though he or she were the only author. Then it stopped counting citations to these papers altogether, and then in 2016-17 it awarded each contributing institution a minimum 5% share of the citations.

The effect of the citations metric has been to make a mockery of the THE Asian and world rankings. A succession of unlikely places has been propelled to the top of the indicator because of contributions to mega-papers, or because of a few prolific authors, or even a single one, combined with a low overall number of papers. We have seen Alexandria University, Anglia Ruskin University, Moscow State Engineering Physics Institute and Tokyo Metropolitan University rise to the top of this indicator. In last year's Asian rankings, Veltech University in India appeared to be first for research impact.

QAU has been involved in the Large Hadron Collider (LHC) project, which produces papers with hundreds or thousands of authors and hundreds or thousands of citations, and has provided authors for several papers. One 2012 paper derived from this project received 4,094 citations, so that QAU would be credited with about 205 citations for this paper alone, presumably the minimum 5% share mentioned above (0.05 × 4,094 ≈ 205).

In addition to this, QAU employs an extremely productive mathematician, Tasawar Hayat, who appears in Clarivate Analytics' list of Highly Cited Researchers, where his primary affiliation is King Abdulaziz University in Saudi Arabia and QAU is his secondary affiliation. Professor Hayat is extremely prolific: in 2017 alone he was author or co-author of 384 scientific documents, articles, reviews, notes and so on.

There is nothing wrong with QAU taking part in the LHC project, and I am unable to comment on the quality of Professor Hayat's research. It should, however, be understood that if Professor Hayat left QAU, or QAU withdrew from the LHC project, or THE changed its methodology, then QAU could suffer a dramatic fall in the rankings similar to those suffered by some Japanese, Turkish and Korean universities in recent years. This is an achievement built on desperately weak foundations.

It would be very unwise to use these rankings as evidence for the excellence of QAU or any other university.

Tuesday, February 06, 2018

Rising Stars of Asian research

Times Higher Education (THE) has just announced the latest edition of its Asian rankings. Since the indicators are the same as in the world rankings, with adjusted weightings, there was absolutely no suspense about who would be top. In case anybody still doesn't know, it was the National University of Singapore.

The really interesting part of the rankings is the citations indicator, field- and year-normalised, based on Scopus, with fractional counting only for papers with more than 1,000 authors.
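For the mega-papers, here is a minimal sketch of fractional counting with a floor, on my reading that each contributing institution receives its proportional share of a paper's citations but never less than 5%; the institutional shares below are invented, while the 4,094 citations match the LHC paper discussed above.

```python
# Sketch of fractional counting with a minimum share for papers with more
# than 1,000 authors -- my reading of THE's published description, not its
# actual code. The institutional shares are hypothetical.

def credited_citations(total_citations, institution_share, floor=0.05):
    """Proportional share of a mega-paper's citations, never below the floor."""
    return total_citations * max(institution_share, floor)

print(credited_citations(4094, 3 / 2900))  # tiny real share -> ~205 via the 5% floor
print(credited_citations(4094, 0.20))      # a large contributor keeps its 20% share
```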

Here are some of the superstars of Asian research. On the left is the citations rank and the score for citations. On the right in brackets is the score for Research, comprising research reputation, publications per faculty and research income. To achieve a score in the seventies, eighties or nineties for citations with minimal research reputation, very few publications and limited funding is remarkable.

1st. 99.1. Babol Noshirvani University of Technology (15.3)
2nd. 92.0. King Abdulaziz University (92.3)
3rd. 93.1. Ulsan National Institute of Science and Technology (37.8)
7th. 81.2. Quaid-i-Azam University (1.3)
13th. 74.5. Fujita Health University (9.4)
16th. 72.5. Central China Normal University (11.3)

Free speech rankings from Spiked

The magazine Spiked is descended from Living Marxism although some think it is now more libertarian than socialist. It has just published the latest edition of its free speech university rankings.

These are not actually rankings but a classification or rating, since they just divide UK universities into three groups. They have been subjected to mockery from sections of the academic blogosphere, including WONKHE, mockery that might be justified on technical grounds. This is, however, such an important topic that any sort of publicity has to be welcomed.

Universities are divided into three categories: 

RED: "A students’ union, university or institution that is hostile to free speech and free expression, mandating explicit restrictions on speech, including, but not limited to, bans on specific ideologies, political affiliations, beliefs, books, speakers or words."

AMBER: "A students’ union, university or institution that chills free speech and free expression through restricting vague and subjective types of speech, such as ‘offensive’ or ‘insulting’ speech, or requiring burdensome vetting procedures for events, speakers, posters or publications. Many policies in this category might not explicitly limit speech, but have the potential to be used to that end, due to purposefully vague or careless wording."

GREEN: "A students’ union, university or institution that, as far as we are aware, places no significant restrictions on free speech and expression – other than where such speech or expression is unlawful."

The roll of honour in the green category includes exactly seven universities, none of them in the Russell Group: Anglia Ruskin, Buckingham, Hertfordshire, Robert Gordon, Trinity St David, West of Scotland, and Winchester.


Interesting data from Webometrics

The Webometrics rankings perform the invaluable function of ranking 27,000-plus universities, or entities claiming to be universities, around the world. Their Excellence indicator also identifies those institutions, 5,776 this year, with any claim to involvement in research.

Consequently, they have often been used as unofficial national rankings in countries, especially in Africa, where very few institutions can make it into the top 500 or 1,000 universities included in the better-known international rankings.

However, there seems to be a universal law that when a ranking becomes significant it will have unintended and perverse consequences. In the UK we have seen massive inflation in the number of first and upper-second-class degrees, partly because this is an element in popular national rankings. Sophisticated campaigns can also produce significant gains in the QS academic opinion survey, which has a 40% weighting, and a few hundred strategic citations can boost the most unlikely universities in the research impact indicator of the THE world and regional rankings.

Webometrics also has an indicator that seems to be susceptible to bad practices. This is "Presence", the number of pages in the main web domain, including subdomains and file types such as rich files, with a 5% weighting. Apparently this can be easily manipulated. Unlike other rankings, Webometrics does not attempt to ignore this but has highlighted it in several recent tweets, which is helpful since it indicates who might be manipulating the variable. It is possible that there has been a misunderstanding of the Webometrics guidelines, an error somewhere, or perhaps some totally valid and innocent explanation. If that is the case I will be happy to publish a statement.

Here is a selection of universities with their world rank in the Webometrics Presence indicator. The overall rank is in brackets.

4. University of Nairobi, Kenya (874)

5. Masaryk University in Brno, Czechia (433)

9. Federal University of Santa Catarina, Brazil (439)

15. Charles University in Prague (203)

17. University of Costa Rica (885)

20. University of the West Indies St Augustine (1792)

32. National University of Honduras (3777)

40. Mahidol University, Thailand (548)

55. Universitas Muhammadiyah Surakarta, Indonesia (6394)