Some points on Ireland’s university ranking ‘slide’

I did this a short while ago for another ranking, for work in Dublin Institute of Technology’s Higher Education Policy Research Unit (HEPRU), but here it is for the Times Higher Education World University Ranking. First some facts, then a tiny bit of analysis. I’ll make it quick. We read that Trinity College Dublin has slipped in the ranking from 129th position to 138th. From the data that THE makes available, however, TCD’s overall score has improved, from 50.3 to 51.2:

[Screenshots: TCD’s THE scores for 2013–14 and 2014–15]

This is significant, especially when we consider that the overall score is not simply the aggregate of the five variables listed above. Alex Usher of Higher Education Strategy Associates notes in a recent post: “But this data isn’t entirely transparent. THE […] hides the actual reputational survey results for teaching and research by combining each of them with some other indicators (THE has 13 indicators, but it only shows 5 composite scores).” This opacity is especially significant when we consider what has happened to UCD in the THE ranking. We go from last year’s result, when it was in the top 200:

[Screenshot: UCD’s entry in last year’s THE ranking, 2013–14]

To its latest rank, this year:

[Screenshot: UCD’s entry in this year’s THE ranking, 2014–15]

Notice anything? The overall score is withheld. Sure, there are clear differences in the individual indicators, but what do these mean? Did UCD’s researchers really publish 15.3 (per cent? notches? magic beans?) less this year (Citations)? The difference in Research is 0.9, so the “volume, income, and reputation” seems to be more or less intact. Teaching has actually improved by 4.6. At best, on a charitable interpretation of the ranking, TCD’s ‘improvement’ in overall score could indicate that other universities are also improving, only more quickly. It echoes the old truth about predators in the wild: you don’t need to be the fastest to survive – you just need your neighbour to be slower than you.
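As an aside, a withheld overall score can at least be roughly reconstructed, since THE publishes its pillar weightings (Teaching 30%, Research 30%, Citations 30%, International Outlook 7.5%, Industry Income 2.5%, per the published 2014–15 methodology). Here is a minimal Python sketch; the pillar scores in it are placeholders, not UCD’s or TCD’s actual figures, and because THE standardises its 13 underlying indicators before aggregating, a weighted sum of the displayed pillars is only an approximation:

```python
# A minimal sketch: approximate a THE-style overall score from the five
# displayed pillar scores, using THE's published 2014-15 pillar weightings.
# The pillar scores below are placeholders, and the real overall standardises
# 13 hidden sub-indicators first, so this is an approximation only.

THE_WEIGHTS = {
    "teaching": 0.30,               # the learning environment
    "research": 0.30,               # volume, income, and reputation
    "citations": 0.30,              # research influence
    "international_outlook": 0.075,
    "industry_income": 0.025,
}

def approx_overall(pillars):
    """Weighted sum of the five pillar scores (each on a 0-100 scale)."""
    return sum(THE_WEIGHTS[name] * score for name, score in pillars.items())

# Placeholder pillar scores for an unnamed institution:
pillars = {
    "teaching": 45.0,
    "research": 40.0,
    "citations": 70.0,
    "international_outlook": 75.0,
    "industry_income": 35.0,
}

print(f"Approximate overall: {approx_overall(pillars):.1f}")
# -> Approximate overall: 53.0
```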

An Irish Times article goes on about the above, saying that the “main impact of the cuts has been on the student-staff ratio, which is one of the major factors used by all the ranking agencies.” Which is true. But the OECD, in its recent Education at a Glance report, notes that the student-staff ratio is not an indicator of teaching quality, nor of teaching outputs or results. It is an indicator that has been jumped on because it is an intuition pump, in that it “elicits intuitive but incorrect answers.” There is as much evidence suggesting that large classes can lead to better learning outcomes as there is suggesting the opposite.

One may then be inclined to agree with Prof. Andrew Deeks of UCD when he says: “Our own analyses show that in terms of objective measures of teaching and research performance, we are performing well and making good progress.” The call to reverse cuts, in the hope that this will magically lead to improved performance in the rankings, is a political argument. And that’s fine. But beware of rankings bearing ill tidings. Rankings measure what they measure, rather than the objective reality of higher education – and what they claim to measure may be questionable in and of itself.

About those QS rankings and Trinity College Dublin’s “slide”

I’ll be brief. All kinds of teacup storms bubble up every year about rankings, especially with regard to Ireland’s ‘falling performance’, and usually with a focus on our leading institution, Trinity College Dublin. If we look at how things actually are, however, without a sub-editor’s eye for disaster, the situation seems less awful. Here is a link to Trinity’s rankings page. Check out the box on the right, with the ‘historical data’.

[Screenshot: Trinity’s historical QS ranking data]

We can ignore the numbers before 2011 (QS and Times Higher Education [THE] went their separate ranking ways after their joint ranking ended in 2009), and focus on what has happened in the QS ranking. Now, the weightings change somewhat for all of the rankings (though hardly at all for the Shanghai Academic Ranking of World Universities [ARWU]), but even with those fluctuations, Trinity’s score has in fact been improving. The rank has bopped around the place a bit, but there isn’t much to suggest some kind of horrific decline in evidence here. Trinity’s QS score is improving, but its rank is not. So we have to conclude that perhaps Trinity is getting better, but that other universities and institutions may simply be getting better more quickly.
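That ‘score up, rank down’ pattern is easy to demonstrate. A toy Python sketch, with entirely invented scores (not Trinity’s real QS figures):

```python
# Toy illustration: every institution improves, but some improve faster,
# so a rising score can still mean a falling rank. All numbers invented.

scores_2013 = {"Trinity": 51.2, "Uni B": 50.8, "Uni C": 49.9}
scores_2014 = {"Trinity": 52.0, "Uni B": 53.5, "Uni C": 52.4}

def rank_of(name, scores):
    """1-based rank of `name` when institutions are sorted by descending score."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return ordered.index(name) + 1

for year, scores in (("2013", scores_2013), ("2014", scores_2014)):
    print(year, "score:", scores["Trinity"], "rank:", rank_of("Trinity", scores))
# 2013 score: 51.2 rank: 1
# 2014 score: 52.0 rank: 3   <- a better score, but a worse rank
```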

Now, I know there is lots to disagree with here. QS scores what it finds important, so Trinity is only getting better/staying the same according to what QS wants, etc. But there isn’t much sign of decline, whether for Trinity or other Irish HEIs. Trinity even made it into the ARWU top 200 this year (one of the more research-stringent rankings, and begetter of all this “world class university” chatter). So yeah. Not quite the decline and fall that makes for good click-bait, but there you have it.

Sidebar: I note that Imperial College London is joint second this year in the QS. So what with the LSE and MIT similarly highly ranked, you’d think Trinity’s powers that be might consider the renaming exercise…

Quote: Rankings, reputation, and soft power


“Plentiful information leads to scarcity of attention. When people are overwhelmed with the volume of information confronting them, they have difficulty knowing what to focus on. Attention, rather than information, becomes the scarce resource, and those who can distinguish valuable information from background clutter gain power. Cue-givers become more in demand, and this is a source of power for those who can tell us where to focus our attention.
Among editors and cue-givers, credibility is the crucial resource and an important source of soft power. Reputation becomes even more important than in the past, and political struggles occur over the creation and destruction of credibility.”

Joseph Nye, The Future of Power, pp. 103-4

“Profiling (or ranking?) universities” – Comment

Professor Ferdinand von Prondzynski (formerly President of DCU, currently Vice-Chancellor of Robert Gordon University, Aberdeen) discusses new forms of higher education ‘profiling’ over at his A University Blog:

“I don’t really doubt that as recipients of public money universities should present transparent data as to how this is being spent and what value is being generated by it. But comparisons between institutions based on such data always carry some risk. So for example, DCU’s student/staff ratio looks more favourable because the university has a much larger focus on science and engineering than other Irish universities, and laboratory work requires more staff input. NUI Maynooth is ‘cheap’ because the main bulk of its teaching is in the humanities, which are less resource-intensive. This information may not be immediately obvious to the casual observer, who may therefore be driven to questionable conclusions. Ironically some of these risks are not so prominent in the more mature league tables, such as the Times Higher Education global rankings, which will already have allowed for such factors in their weightings. The raw data are more easily misunderstood.

It seems to me that institutional profiling is not necessarily preferable to rankings. And it could be open to mis-use.”

The ‘comparisons = rankings’ conflation seems a bit premature, specifically regarding U-Multirank. It has been specifically designed not to give a one-number ranking, and instead to provide interested parties with the kind of information they are actually looking for, across various variables and indicators. There are always dangers with any numbers that attempt to quantify the qualitative; that is why they require interpretation and clarification. The handwringing here just seems performative, as though one should be seen to have an issue with all attempts at benchmarking/measurement. Rankings have been around for a decade at this stage, so our attitudes towards them should move beyond the reaction that they are simply ideologically suspect and therefore to be rejected.

I would ask: if this new attempt at improving the situation of rankings is “not necessarily preferable”, then what is? Or what could be added to the U-Multirank system? It includes teaching (which most other rankings don’t), as well as regional engagement and technology transfer. It is going to be available for use by any number of stakeholders (rather than just bureaucrats and administrators, as was previously the case), and this is the expectation today – transparency. Finally, it will include subject/discipline-specific indicators for those often-overlooked areas of study (the arts, humanities, and social sciences).
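For illustration, here is a toy Python sketch of that design difference. The institutions, indicators, and scores are invented, and this is not U-Multirank’s actual methodology, just the shape of the idea: users choose the dimensions, and no single composite ‘winner’ is imposed.

```python
# Toy sketch of a multidimensional, user-driven comparison (U-Multirank-style
# in spirit only): no overall composite score, just the indicators a given
# user cares about. All institutions and figures are invented.

institutions = {
    "Uni A": {"teaching": 4, "research": 5, "regional_engagement": 2, "tech_transfer": 3},
    "Uni B": {"teaching": 5, "research": 3, "regional_engagement": 5, "tech_transfer": 2},
}

def compare(institutions, chosen):
    """Return each institution's scores on the user-chosen indicators only."""
    return {name: {ind: scores[ind] for ind in chosen}
            for name, scores in institutions.items()}

# A prospective student who cares about teaching and regional engagement:
print(compare(institutions, ["teaching", "regional_engagement"]))
# {'Uni A': {'teaching': 4, 'regional_engagement': 2},
#  'Uni B': {'teaching': 5, 'regional_engagement': 5}}
# A tech-transfer office would choose different indicators, and get a
# different comparison; neither is "the" ranking.
```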

Websites like Eurostat are also open to “mis-use” and misinterpretation, but is this a good reason to reject them? Open datasets, which can be interrogated and reconfigured in novel ways by the curious, are becoming the norm (data journalism, groups such as Open Data Dublin, etc.). For a change there is an opportunity for real benchmarking, for real transparency, which no doubt will have room for improvement; and yet it seems that some still want to protect their data fiefdom from the uninitiated, the great unwashed.

[Update 17/1/14: Education Datapalooza to Promote Innovation in Improving College Access, Affordability, and Completion: “Today, in response to the President’s call, the White House, the U.S. Departments of Education and Treasury, and the General Services Administration are hosting an Education “Datapalooza”, highlighting innovators from the private, nonprofit, and academic sectors who have used freely available government data to build products, services, and apps that advance postsecondary education, empower students with information, and help colleges innovate in creative and powerful ways.”]
