I did a similar analysis a short while ago for another ranking, as part of my work with Dublin Institute of Technology’s Higher Education Policy Research Unit (HEPRU); here it is for the Times Higher Education World University Ranking. First some facts, then a tiny bit of analysis. I’ll make it quick. We read that Trinity College Dublin has slipped in the ranking from 129th position to 138th. From the data that THE makes available, however, TCD’s overall score has actually improved, from 50.3 to 51.2:
This is significant, especially when we consider that the overall score is not simply the aggregate of the five variables listed above. Alex Usher of Higher Education Strategy Associates notes in a recent post: “But this data isn’t entirely transparent. THE […] hides the actual reputational survey results for teaching and research by combining each of them with some other indicators (THE has 13 indicators, but it only shows 5 composite scores).” This matters all the more when we consider what has happened to UCD in the THE ranking. We go from last year’s result, when it was in the top 200:
To its latest rank, this year:
Notice anything? The overall score is withheld. Sure, there are clear differences in the individual indicators, but what do these mean? Did UCD’s researchers really score 15.3 (%? Notches? Magic beans?) less this year on Citations? The difference in Research is 0.9, so the “volume, income, and reputation” seems to be more or less intact. Teaching has actually improved by 4.6. At best, the overall ‘improvement’ in TCD’s score could indicate (on a charitable interpretation of the ranking) that other universities are also improving, but have improved quicker. This echoes the old truth about life in the wild: you don’t need to be the fastest to survive a predator – you just need your neighbour to be slower than you.
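The point that a score can rise while a rank falls is easy to demonstrate. The sketch below uses TCD’s published overall scores (50.3 → 51.2) from the THE data quoted above; the other universities and their scores are entirely hypothetical, made up to illustrate how relative ranking works.

```python
# Illustrative sketch: a rank depends on everyone else's scores,
# so a university's score can improve while its position worsens.
# TCD's scores (50.3 -> 51.2) are from the THE data discussed above;
# "Uni A/B/C" and their scores are hypothetical.

def rank(scores, name):
    """1-based position of `name` when scores are sorted descending."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return ordered.index(name) + 1

last_year = {"TCD": 50.3, "Uni A": 52.0, "Uni B": 49.0, "Uni C": 48.5}
this_year = {"TCD": 51.2, "Uni A": 53.5, "Uni B": 51.9, "Uni C": 51.5}

print(rank(last_year, "TCD"))  # 2 -- second of the four
print(rank(this_year, "TCD"))  # 4 -- last, despite a higher score
```

TCD’s score went up by 0.9, yet it falls from second to last in this toy field, because the (hypothetical) neighbours improved faster.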
An Irish Times article goes on about the above, saying that the “main impact of the cuts has been on the student-staff ratio, which is one of the major factors used by all the ranking agencies.” Which is true. But the OECD, in its recent Education at a Glance report, notes that staff-student ratio is not an indicator of teaching quality, nor of teaching outputs or results. It’s an indicator which has been jumped on because it is an intuition pump, in that it “elicits intuitive but incorrect answers.” There is as much evidence suggesting that large classes can lead to better learning outcomes as there is suggesting the opposite.
One may then be inclined to agree with Prof. Andrew Deeks of UCD when he says “Our own analyses show that in terms of objective measures of teaching and research performance, we are performing well and making good progress.” The call to reverse cuts, in the hope that this will magically lead to an improved performance in rankings, is a political argument. And that’s fine. But beware of rankings bearing ill tidings. Rankings measure what they measure, rather than measuring the objective reality of higher education – and what they claim to measure may be questionable in and of itself.