Thursday, July 12, 2012

Should You Trust University Rankings?

I posted this on Google+ but I figure it fits here as well:

The Times Higher Education university rankings have been released, with an interesting surprise. Alexandria University in Egypt — a good university, but not remarkable in any way — ends up fourth in the world for research impact.

The reason: physicist, engineer and near-crackpot Mohamed El Naschie. He was the founder and editor of a high-impact-factor Elsevier journal, but was found to have published over three hundred of his own, unreviewed papers in it, and his rampant self-citation blew up both the impact factor of the journal and, apparently, the research ranking of his university.

So we have a single researcher who stuffed unreviewed papers into a journal he controlled and cited his own work at every turn. That inflation of citations by one researcher in one department is enough to trump the massed outputs of major research universities. A major ranking category is thus easily, even accidentally, gamed by a single person — I doubt El Naschie ever considered this side effect of his paper stuffing.
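To get a feel for why this works, here is a rough sketch, assuming the ranking rewards something like average citations per paper. All the numbers are invented for illustration:

# A large research university: many papers, solid citation rates.
big_papers, big_citations = 20_000, 200_000
print(big_citations / big_papers)      # 10.0 citations per paper

# A smaller university whose modest output includes one researcher's
# ~300 heavily self-cited papers.
small_papers = 2_000
small_citations = 10_000 + 300 * 50    # baseline plus the self-citation bloc
print(small_citations / small_papers)  # 12.5 — the average now beats the giant

Because the score is an average, a small denominator makes it exquisitely sensitive to one prolific self-citer.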

The irony is not lost on me, by the way, that this ranking is built on citation data from Thomson Reuters, who produce the impact factors in the first place. There is a kind of poetic justice in play here.

It shows what a fraud the very idea of impact factors is. The notion that counting citations to papers could measure the quality of the journal they appear in was never a good one. It is crude and misleading; prone to gaming and manipulation by authors, by journals and by Thomson Reuters itself; and it was only ever adopted because there was no better option available at the time. The impact factor is an idea whose time has well and truly passed.
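For the curious, the standard two-year impact factor is just the citations a journal receives this year to the items it published in the previous two years, divided by the number of citable items in those two years. A minimal sketch, with entirely hypothetical numbers, of how editorial self-stuffing moves it:

def impact_factor(recent_citations, citable_items):
    # Two-year impact factor: this year's citations to the journal's
    # items from the previous two years, divided by the number of
    # citable items published in those two years.
    return recent_citations / citable_items

# An ordinary small journal: 200 citable items, 500 outside citations.
print(impact_factor(500, 200))               # 2.5

# The same journal after an editor adds 300 unreviewed papers of his
# own, each citing roughly ten of the journal's recent articles.
print(impact_factor(500 + 3000, 200 + 300))  # 7.0

The extra papers raise the denominator too, but the avalanche of in-journal citations more than makes up for it.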


Now, if a single researcher can manipulate an easy-to-measure ranking like this, imagine how easy it might be to manipulate the ranking in less easily measured areas such as the teaching environment. Consider how much these rankings mean in income for a university — your place on lists such as this one can make a major difference in your enrolment numbers, alumni donations and so on. And imagine how easily the system can be gamed if the university leadership — not just a lone researcher in a single department — decides that the university really deserves a better rank and "adjusts" the data to make it happen.

In other words, don't trust these rankings. Ignore them. There has been a lot of rumbling about universities polishing their data, being selective with information, or blatantly making things up to improve their numbers for various rankings. These figures are being gamed; we just don't know to what extent.

And even if the rankings were accurate and unbiased, they are not informative for any one student or researcher. Even a small university is a sprawling place, with lots of variation in quality from department to department and from lab to lab. What you will get out of an education or a post-doc depends a lot less on the overall rank, and far more on the culture and atmosphere of the specific department and on the specific people you will have as teachers, co-workers and fellow students.


