
Showing posts from November, 2018

THE uncovers more pockets of research excellence

I don't want to do this. I really would like to start blogging about whether rankings should measure third missions, or about developing metrics for teaching and learning. But I find it difficult to stay away from the THE rankings, especially the citations indicator. I have a couple of questions; if someone can help, please post a comment here. Do the presidents, vice-chancellors, directors, generalissimos, or whatever of universities actually look at, or get somebody to look at, the indicator scores of the THE world rankings and their spin-offs? Does anyone ever wonder how a ranking that produces such imaginative and strange results for research influence, measured by citations, can command the respect and trust of those hard-headed engineers, MBAs and statisticians running the world's elite universities? These questions are especially relevant as THE are releasing subject rankings. Here are the top universities in the world for research impact (citations) in various subjects. For co...

A modest suggestion for THE

A few years ago the Shanghai rankings made an interesting tweak to their global rankings. They deleted the two indicators that counted Nobel and Fields awards and produced an Alternative Ranking. There were some changes: the University of California San Diego and the University of Toronto did better, while Princeton and Vanderbilt did worse. Perhaps it is time for Times Higher Education (THE) to consider doing something similar for their citations indicator. Take a look at their latest subject ranking, Clinical, Pre-clinical and Health. Here are the top ten for citations, supposedly a measure of research impact or influence.

1. Tokyo Metropolitan University
2. Auckland University of Technology
3. Metropolitan Autonomous University, Mexico
4. Jordan University of Science and Technology
5. University of Canberra
6. Anglia Ruskin University
7. University of the Philippines
8. Brighton and...
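(For readers who like to see the idea in code, here is a minimal sketch of what such an "alternative ranking" recomputation involves: drop one or more indicators and renormalise the remaining weights. The indicator names, weights, and scores below are invented for illustration and are not ARWU's or THE's actual methodology.)

```python
# A minimal sketch of the "alternative ranking" idea: drop some indicators
# and renormalise the remaining weights. All names, weights, and scores
# here are hypothetical, not any ranker's real methodology.

def alternative_scores(scores: dict, weights: dict, dropped: set) -> dict:
    """Recompute weighted overall scores after removing the dropped indicators."""
    kept = {k: w for k, w in weights.items() if k not in dropped}
    total = sum(kept.values())
    kept = {k: w / total for k, w in kept.items()}  # renormalise weights to 1.0
    return {
        uni: sum(vals[k] * kept[k] for k in kept)
        for uni, vals in scores.items()
    }

# Hypothetical indicator scores (0-100) for two universities.
scores = {
    "Univ A": {"awards": 90, "papers": 60, "citations": 70},
    "Univ B": {"awards": 20, "papers": 85, "citations": 80},
}
weights = {"awards": 0.3, "papers": 0.4, "citations": 0.3}

print(alternative_scores(scores, weights, dropped=set()))       # A ahead of B
print(alternative_scores(scores, weights, dropped={"awards"}))  # B overtakes A
```

As the toy numbers show, removing a single indicator can reorder universities, which is exactly why the Shanghai exercise was revealing.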

Ranking Rankings: Measuring Stability

I have noticed that some rankings are prone to a great deal of churning. Universities may rise or fall dozens of places over a year, sometimes as a result of methodological changes, changes in the number or type of universities ranked, errors and corrections of errors (fortunately rare these days), changes in data collection and reporting procedures, or a small number of underlying data points. Some ranking organisations like to throw headlines around about who's up or down, the rise of Asia, the fall of America, and so on. This trivialises any serious attempt at the comparative evaluation of universities, which do not behave like volatile financial markets. Universities are generally stable institutions: most of the leading universities of the early twentieth century are still here, while the Ottoman, Hohenzollern, Hapsburg and Romanov empires are long gone. Reliable rankings should not be expected to show dramatic changes from year to year, unless there h...
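(To make "measuring stability" concrete, here is a minimal Python sketch using invented data. It computes two simple churn measures between consecutive editions of a ranking: the mean absolute change in rank, and Spearman's rank correlation using the standard no-ties formula. A stable ranking should show a small mean shift and a correlation close to 1.)

```python
# Two simple stability measures for a ranking, computed on hypothetical data.

def mean_rank_shift(year1: dict, year2: dict) -> float:
    """Average absolute change in rank for universities present in both years."""
    common = year1.keys() & year2.keys()
    return sum(abs(year1[u] - year2[u]) for u in common) / len(common)

def spearman_rho(year1: dict, year2: dict) -> float:
    """Spearman correlation from rank differences (assumes no tied ranks)."""
    common = sorted(year1.keys() & year2.keys())

    # Re-rank within the common set so both years use ranks 1..n.
    def rerank(year):
        ordered = sorted(common, key=lambda u: year[u])
        return {u: i + 1 for i, u in enumerate(ordered)}

    r1, r2 = rerank(year1), rerank(year2)
    n = len(common)
    d2 = sum((r1[u] - r2[u]) ** 2 for u in common)
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical ranks for illustration only.
ranks_2017 = {"Alpha U": 1, "Beta U": 2, "Gamma U": 3, "Delta U": 4}
ranks_2018 = {"Alpha U": 1, "Beta U": 4, "Gamma U": 2, "Delta U": 3}

print(mean_rank_shift(ranks_2017, ranks_2018))  # 1.0 places on average
print(spearman_rho(ranks_2017, ranks_2018))     # 0.4 (lower = more churn)
```

Tracked across several editions, measures like these would let us rank the rankings themselves by how much churn they generate.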