According to De Volkskrant (in Dutch), King Saud University (KSU) has approached several researchers in the Netherlands to link their name to that university for a payment of tens of thousands of euros. That would supposedly give KSU a higher position in the world rankings.
A researcher in Wageningen and a former Utrecht researcher have accepted the offer. Wageningen has already announced an investigation. Education Minister Robbert Dijkgraaf, who himself switched to politics from a top position in science, calls the deals “a very serious matter”.
The news started in Spain, where the University of Córdoba has suspended a much-cited scientist because he had linked himself in publications to universities in Saudi Arabia and Russia at which he had never worked.
‘Without me, the university would drop 300 places’
International rankings of universities are based partly on the publications of the scientists employed there. As a result, the publications of the researchers in question count towards the ranking of universities they have little to do with.
“Without me, the University of Córdoba would drop 300 places”, said the Spanish researcher, who has hundreds of publications to his name, in response to his suspension. “They have shot themselves in the foot.” In the weeks that followed, a Spanish foundation wrote a report about such practices.
Almost all the Dutch universities score highly in the international rankings, such as the Shanghai Ranking and Times Higher Education’s World University Ranking. The latter ranking has seven Dutch universities in the top 100 and another three in the top 200.
That is based partly on the scores for research: citations account for 30 percent of the score at Times Higher Education. Other criteria include reputation in teaching and research, the number of students per lecturer, collaboration with the business community, and internationalisation.
The news about researchers being ‘bought’ illustrates a common criticism of rankings: they are not neutral measurements and can be manipulated, not only by cheating but also through strategic choices.
Publications in leading international journals carry more weight in the rankings. Research into topics like the consequences of the earthquakes in Groningen or the aftermath of the Dutch childcare benefits scandal is less likely to be considered.
So rankings should not be relied on blindly, critics say. That criticism got a boost in the Netherlands thanks to the action group ‘Science in Transition’, founded in 2013. Frontman Frank Miedema gave an example recently: research into the causes of strokes carries more prestige than research into the rehabilitation of the patients, but why?
‘You should not assess researchers on the “score” of their research alone’
To get a better score, some researchers divided their research over a number of articles (salami tactics), so that they got more citations for precisely the same work. They sometimes put their name to research in which they were barely involved. And there are many more tactics besides. Mergers between universities even used to be considered, partly with a view to ending up higher in the rankings.
Recognition and reward
The opposition to this type of practice is one of the reasons for aspiring to ‘recognition and reward’, a term that has been in vogue in the Netherlands since 2019. The idea is that you should not assess researchers on the ‘score’ of their research alone. Teaching, dissemination of knowledge, the application of scientific findings and providing leadership need to be taken into account as well. Both university administrators and the government have embraced the aspiration.
At the same time, notes of warning are being sounded: is the Netherlands about to drop in the international rankings if there is less focus on top-flight research? Among those who fear that possibility is Hans Clevers, former president of the Royal Netherlands Academy of Arts and Sciences. Coalition party VVD also regularly asks for attention to be paid to the matter.
The big question is whether a drop in the rankings stems from a deterioration. It could also be the result of an improvement at other institutions or from opting to work on topics that are less easy to compare internationally. The Rathenau Institute recently wrote about the positive features and the limitations of rankings.
HOP, Bas Belleman