Posts

Interesting Times for Times Higher?

Changes may be coming for the "universities' Bible", aka Times Higher Education, and its rankings, events, consultancies and so on. It seems that TES Global is selling off its very lucrative cash cow and that, in addition to private equity firms, the RELX Group, which owns Scopus, and Clarivate Analytics are in a bidding war. Scopus currently provides the data for the THE rankings, and Clarivate used to. If one of them wins the war there may be implications for the THE rankings, especially for the citations indicator. If anybody has information about what is happening, please post a comment.

THE uncovers more pockets of research excellence

I don't want to do this. I really would like to start blogging about whether rankings should measure third missions, or about developing metrics for teaching and learning. But I find it difficult to stay away from the THE rankings, especially the citations indicator. I have a couple of questions; if someone can help, please post a comment here. Do the presidents, vice-chancellors, directors, generalissimos, or whatever of universities actually look at, or get somebody to look at, the indicator scores of the THE world rankings and their spin-offs? Does anyone ever wonder how a ranking that produces such imaginative and strange results for research influence, measured by citations, commands the respect and trust of those hard-headed engineers, MBAs and statisticians running the world's elite universities? These questions are especially relevant as THE is releasing subject rankings. Here are the top universities in the world for research impact (citations) in various subjects. For co...

A modest suggestion for THE

A few years ago the Shanghai rankings tried an interesting tweak on their global rankings. They deleted the two indicators that count Nobel and Fields awards and produced an Alternative Ranking. There were some changes: the University of California San Diego and the University of Toronto did better, while Princeton and Vanderbilt did worse. Perhaps it is time for Times Higher Education (THE) to consider doing something similar for its citations indicator. Take a look at their latest subject ranking, Clinical, Pre-clinical and Health. Here are the top ten for citations, supposedly a measure of research impact or influence.

1. Tokyo Metropolitan University
2. Auckland University of Technology
3. Metropolitan Autonomous University, Mexico
4. Jordan University of Science and Technology
5. University of Canberra
6. Anglia Ruskin University
7. University of the Philippines
8. Brighton and...
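The Shanghai-style "alternative ranking" described above can be sketched in a few lines: drop the unwanted indicators, renormalise the remaining weights, and re-sort. This is only an illustration of the idea; the university names, indicator names, weights and scores below are entirely hypothetical, not data from any real ranking.

```python
# Sketch: recompute a composite ranking with selected indicators removed,
# renormalising the remaining weights (the "Alternative Ranking" idea).
# All names, weights and scores are hypothetical.

def alternative_ranking(scores, weights, dropped):
    """Rank institutions on a weighted composite after dropping indicators.

    scores:  {university: {indicator: score on a 0-100 scale}}
    weights: {indicator: weight}  (original weights, summing to 1.0)
    dropped: set of indicator names to exclude
    """
    kept = {k: w for k, w in weights.items() if k not in dropped}
    total = sum(kept.values())  # renormalise so the kept weights sum to 1
    composite = {
        uni: sum(s[k] * w / total for k, w in kept.items())
        for uni, s in scores.items()
    }
    return sorted(composite, key=composite.get, reverse=True)

# Hypothetical data in which dropping "awards" reverses the order.
weights = {"awards": 0.3, "papers": 0.4, "citations": 0.3}
scores = {
    "Univ A": {"awards": 90, "papers": 60, "citations": 60},
    "Univ B": {"awards": 40, "papers": 80, "citations": 80},
}

print(alternative_ranking(scores, weights, dropped=set()))       # full ranking
print(alternative_ranking(scores, weights, dropped={"awards"}))  # alternative
```

With the invented data above, "Univ A" leads on the full composite but "Univ B" leads once the awards indicator is removed, which is exactly the kind of reshuffle the Shanghai exercise produced.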

Ranking Rankings: Measuring Stability

I have noticed that some rankings are prone to a great deal of churn. Universities may rise or fall dozens of places in a year, sometimes as a result of methodological changes, changes in the number or type of universities ranked, errors and corrections of errors (fortunately rare these days), changes in data collection and reporting procedures, or because there is a small number of data points. Some ranking organisations like to throw headlines around about who's up or down, the rise of Asia, the fall of America, and so on. This trivialises any serious attempt at the comparative evaluation of universities, which do not behave like volatile financial markets. Universities are generally fairly stable institutions: most of the leading universities of the early twentieth century are still here, while the Ottoman, Hohenzollern, Hapsburg and Romanov empires are long gone. Reliable rankings should not be expected to show dramatic changes from year to year, unless there h...
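One very simple way to put a number on the churn discussed above is the mean absolute change in rank position between two consecutive editions, computed over the universities that appear in both. This is a minimal sketch of that idea; the universities and positions are invented for illustration, and real comparisons would also have to handle ties and banded ranks.

```python
# Sketch: a crude stability measure for a ranking -- the mean absolute
# change in rank position between two editions. Data are invented.

def mean_rank_shift(year1, year2):
    """Average absolute position change for universities ranked in both years.

    year1, year2: {university: rank position, 1 = top}
    """
    common = set(year1) & set(year2)
    return sum(abs(year1[u] - year2[u]) for u in common) / len(common)

ranks_2018 = {"Univ A": 1, "Univ B": 2, "Univ C": 3, "Univ D": 4}
ranks_2019 = {"Univ A": 1, "Univ B": 4, "Univ C": 2, "Univ D": 3}

print(mean_rank_shift(ranks_2018, ranks_2019))
```

A stable ranking should produce a small value on this measure year after year; a ranking that regularly produces large values is telling you more about its methodology than about the universities.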

Is THE going to reform its methodology?

An article by Duncan Ross in Times Higher Education (THE) suggests that the World University Rankings are due for repair and maintenance. He notes that these rankings were originally aimed at a select group of research-orientated world-class universities, but THE is now looking at a much larger group that is likely to be less internationally orientated, less research based and more concerned with teaching. He says that it is unlikely that there will be major changes in the methodology for the 2019-20 rankings next year, but after that there may be significant adjustment. There is a chance that the industry income indicator (income from industry and commerce divided by the number of faculty) will be changed. This is an indirect attempt to capture innovation and is unreliable since it is based entirely on data submitted by institutions. Alex Usher of Higher Education Strategy Associates has pointed out some problems with this indicator. Ross seems most concerned, h...

How many indicators do university rankings need?

The number of indicators used in international university rankings varies a lot. At one extreme we have the Russian Round University Rankings (RUR), which have 20 indicators. At the other, Nature Index and Reuters Top 100 Innovative Universities have just one. In general, the more information provided by rankings the more helpful they are. If, however, the indicators produce very similar results then their value will be limited. The research and postgraduate teaching surveys in the THE world rankings and the RUR correlate so highly that they are in effect measuring the same thing. There is probably an optimum number of indicators for a ranking, perhaps higher for general than for research-only rankings, above which no further information is provided. A paper by Guleda Dogan of Hacettepe University, Ankara, looks at the indicators in three university rankings: the Shanghai Academic Ranking of World Universities, the National Taiwan University Rankings and University Rank...
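The redundancy point above, that two indicators correlating very highly are "in effect measuring the same thing", can be checked directly by correlating indicator scores across institutions. Here is a minimal sketch using the Pearson coefficient; the indicator names and scores are hypothetical, not taken from THE, RUR or any published ranking.

```python
# Sketch: testing whether two ranking indicators are redundant by computing
# the Pearson correlation of their scores across institutions.
# Indicator names and scores are hypothetical.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Scores for the same five institutions on two survey-based indicators.
research_survey = [95, 80, 70, 60, 40]
teaching_survey = [93, 82, 71, 58, 42]

r = pearson(research_survey, teaching_survey)
print(round(r, 3))  # close to 1.0: the second indicator adds little new information
```

A coefficient near 1.0, as in this invented example, is the signature of a redundant indicator pair; the Dogan paper's comparisons rest on exactly this kind of calculation, typically done for every pair of indicators in a ranking.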

The link between rankings and standardised testing

The big hole in current international university rankings is the absence of anything that effectively measures the quality of graduates. Some rankings use staff-student ratio or income as a proxy for the provision of resources, on the assumption that the more money that is spent, or the more teachers deployed, the better the quality of teaching. QS has an employer survey that asks about the universities from which employers like to recruit, but that has many problems. There is a lot of evidence that university graduates are valued to a large extent because they are seen as intelligent, conscientious and, depending on place and field, open-minded or conformist. A metric that correlates with these attributes would be helpful in assessing and comparing universities. A recent article in The Conversation by Jonathon Wai suggests that the US News America's Best Colleges rankings are highly regarded partly because they measure the academic ability of admitted students, which cor...