Some points on Ireland’s university ranking ‘slide’

I did this a short while ago for another ranking, for work in Dublin Institute of Technology’s Higher Education Policy Research Unit (HEPRU), but here it is for the Times Higher Education World University Ranking. First some facts, then a tiny bit of analysis. I’ll make it quick. We read that Trinity College Dublin has slipped in the ranking from 129th position to 138th. From the data that THE itself makes available, however, TCD’s overall score has improved, from 50.3 to 51.2:

[Screenshots: THE indicator scores and overall score for TCD, 2013-14 and 2014-15]

This is significant, especially when we consider that the overall score is not simply the aggregate of the five variables listed above. Alex Usher of Higher Education Strategy Associates notes in a recent post: “But this data isn’t entirely transparent. THE […] hides the actual reputational survey results for teaching and research by combining each of them with some other indicators (THE has 13 indicators, but it only shows 5 composite scores).” This is especially significant when we consider what has happened to UCD in the THE ranking. We go from last year’s result, when it was in the top 200:

[Screenshot: UCD’s entry in the THE ranking, 2013-14]

To its latest rank, this year:

[Screenshot: UCD’s entry in the THE ranking, 2014-15]

Notice anything? The overall score is withheld. Sure, there are clear differences in the individual indicators, but what do these mean? Did UCD’s researchers really publish 15.3 (%? Notches? Magic beans?) less this year (Citations)? The difference in Research is 0.9, so the “volume, income, and reputation” seems to be more or less intact. Teaching has actually improved by 4.6. At best, however, the ‘improvement’ in TCD’s overall score could indicate (on a charitable reading of the ranking) that other universities are also improving, but improving more quickly. This echoes the old truth about predators in the wild: you don’t need to be the fastest to survive, you just need your neighbour to be slower than you.
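To make that relative-improvement point concrete, here is a minimal toy sketch in Python (the only real figures are TCD’s published overall scores, 50.3 and 51.2; every other institution and score is invented for illustration): a score can rise while a rank falls, provided enough of the field improves faster.

```python
# Toy example: an overall score improves while the rank worsens.
# Only TCD's 50.3 -> 51.2 comes from THE; the rest is invented.
last_year = {"TCD": 50.3, "Uni A": 49.8, "Uni B": 50.1, "Uni C": 51.0}
this_year = {"TCD": 51.2, "Uni A": 52.0, "Uni B": 51.9, "Uni C": 51.5}

def rank_of(name, scores):
    # Rank 1 = highest overall score; ties broken arbitrarily here.
    ordered = sorted(scores, key=scores.get, reverse=True)
    return ordered.index(name) + 1

print(rank_of("TCD", last_year))  # 2 in this toy field
print(rank_of("TCD", this_year))  # 4, despite a higher score
```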

An Irish Times article goes on about the above, saying that the “main impact of the cuts has been on the student-staff ratio, which is one of the major factors used by all the ranking agencies.” Which is true. But the OECD, in its recent Education at a Glance report, notes that the staff-student ratio is not an indicator of teaching quality, teaching outputs, or results. It is an indicator that has been jumped on because it is an intuition pump, in that it “elicits intuitive but incorrect answers.” There is as much evidence suggesting that large classes can lead to better learning outcomes as there is suggesting the opposite.

One may then be inclined to agree with Prof. Andrew Deeks of UCD when he says: “Our own analyses show that in terms of objective measures of teaching and research performance, we are performing well and making good progress.” The call to reverse cuts, in the hope that this will magically lead to improved performance in rankings, is a political argument. And that’s fine. But beware of rankings bearing ill tidings. Rankings measure what they measure, rather than measuring the objective reality of higher education, and what they claim to measure may be questionable in and of itself.

About those QS rankings and Trinity College Dublin’s “slide”

I’ll be brief. All kinds of teacup storms bubble up every year about rankings, especially with regard to Ireland’s ‘falling performance’, and usually with a focus on our leading institution, Trinity College Dublin. If we look at how things actually are, however, without a sub-editor’s eye for disaster, the situation seems less awful. Here is a link to Trinity’s rankings page. Check out the box on the right with the ‘historical data’.

[Screenshot: Trinity College Dublin’s historical ranking data from its QS page]

We can ignore the numbers before 2011 (QS and Times Higher Education [THE] parted company after 2009 and have produced separate rankings since), and focus on what has happened in the QS ranking. Now, the weightings change somewhat for all of the rankings (though barely, if at all, for the Shanghai Academic Ranking of World Universities [ARWU]), but even with those fluctuations, Trinity’s score has in fact been improving. The rank has bopped around the place a bit, but there isn’t much to suggest some kind of horrific decline here. Trinity’s QS score is improving, but its rank is not. So we have to conclude that perhaps Trinity is getting better, but that other universities and institutions may simply be getting better more quickly.

Now, I know there is lots to disagree with here. QS scores what it finds important, so Trinity is only getting better, or staying the same, according to what QS wants, etc. But there isn’t much sign of decline, for Trinity or for other Irish HEIs. Trinity even made it into the ARWU top 200 this year (one of the more research-stringent rankings, and begetter of all this “world class university” chatter). So yeah. Not quite the decline and fall that makes for good click-bait, but there you have it.

Sidebar: I note that Imperial College London is joint second this year in the QS. So what with the LSE and MIT similarly highly ranked, you’d think Trinity’s powers that be might consider the renaming exercise….

Quote: Rankings, reputation, and soft power

“Plentiful information leads to scarcity of attention. When people are overwhelmed with the volume of information confronting them, they have difficulty knowing what to focus on. Attention, rather than information, becomes the scarce resource, and those who can distinguish valuable information from background clutter gain power. Cue-givers become more in demand, and this is a source of power for those who can tell us where to focus our attention.
Among editors and cue-givers, credibility is the crucial resource and an important source of soft power. Reputation becomes even more important than in the past, and political struggles occur over the creation and destruction of credibility.”

Joseph Nye, The Future of Power, pp. 103-4

World University Rankings – Information or Noise

Messing around with some of the results available from the Times Higher Education World University Rankings website, it’s interesting to note that near the top of the ranking things stay relatively stable over the years, while further down there’s quite a bit of variation. In an ideal world, all the datasets would be available for download and easily manipulable (transparency!), but this is not yet the case. Anyway, doing some work for work, here’s a selection of a few institutions with their ranks plotted from the last THE-QS ranking in 2009-2010 to the most recent THE(-TR) ranking for 2013-2014.

There’s quite a bit of change from 2009-2010 to 2010-2011, when THE split from QS (or vice versa). That split brought a change in methodology and weightings, and things have not yet settled down: the weightings have continued to change (though they seem to have stayed the same between 2011 and 2012), and, as Andrejs Rauhvargers notes (pdf), “the scores of all indicators, except those for the Academic Reputation Survey […] have been calculated differently.” On top of this, in a recent journal article (“Where Are the Global Rankings Leading Us? An Analysis of Recent Methodological Changes and New Developments”), Rauhvargers notes that THE doesn’t/won’t publish the scores of its 13 indicators. Transparency! Anyway, for what it’s worth, here are some pretty pictures that illustrate the noisiness of the rankings. Just fooling around with the data for now, to see whether I will return to this with the full top 200 over the past five years.

[Charts: ranks of selected institutions, 2009-10 to 2013-14]
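For anyone who wants to knock together the same kind of chart, here is a minimal sketch of the sort of thing I did (the institutions and rank values below are placeholders, not the actual THE figures); the one trick worth noting is inverting the y-axis so that rank 1 sits at the top.

```python
# Minimal sketch: plot selected institutions' world ranks across editions.
# The rank values here are placeholders, not the real THE figures.
import matplotlib.pyplot as plt

years = ["2009-10", "2010-11", "2011-12", "2012-13", "2013-14"]
ranks = {
    "Institution A": [55, 76, 65, 70, 68],
    "Institution B": [120, 178, 150, 161, 190],
}

fig, ax = plt.subplots()
for name, series in ranks.items():
    ax.plot(years, series, marker="o", label=name)

ax.invert_yaxis()  # rank 1 belongs at the top of the chart
ax.set_xlabel("Ranking edition")
ax.set_ylabel("World rank")
ax.legend()
plt.tight_layout()
plt.show()
```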

Rankings and other factors in US student choice, 1995 – 2012

I collected data from UCLA‘s Higher Education Research Institute‘s American Freshman Surveys (found here) and combined them all into one big spreadsheet (for download here; grey cells indicate that data related to those questions were not collected in that year). 1995 is taken as the start year, as this was an exercise to look at the influence of university rankings (such as the US News & World Report, etc.) on how students make decisions about where to study, and 1995 was the first year in which information related to rankings was collected. This was done as part of my research work with Prof. Ellen Hazelkorn in Dublin Institute of Technology‘s Higher Education Policy Research Unit (HEPRU). This is intended to be indicative rather than asserting any hard trends, and I have accordingly allowed myself some flexibility.
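For what it’s worth, the combining step was nothing fancier than stacking the yearly tables on top of one another; a rough sketch of the approach is below (the file names and column layout are assumptions for illustration, not the actual HERI files). Questions that weren’t asked in a given year simply come out as blank cells, which is what the grey shading in the spreadsheet marks.

```python
# Rough sketch: stack yearly survey extracts into one combined table.
# File names and column layout are illustrative, not the real HERI files.
import glob
import pandas as pd

frames = []
for path in sorted(glob.glob("freshman_survey_*.csv")):  # e.g. freshman_survey_1995.csv
    year = int(path.split("_")[-1].split(".")[0])
    df = pd.read_csv(path)
    df["year"] = year
    frames.append(df)

# Questions not asked in a given year become empty cells (the grey cells).
combined = pd.concat(frames, ignore_index=True, sort=False)
combined.to_excel("american_freshman_1995_2012.xlsx", index=False)
```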