Some points on Ireland’s university ranking ‘slide’

I did this a short while ago for another ranking, for work in Dublin Institute of Technology’s Higher Education Policy Research Unit (HEPRU), but here it is for the Times Higher Education World University Ranking. First some facts, then a tiny bit of analysis. I’ll make it quick. We read that Trinity College Dublin has slipped in the ranking from 129th position to 138th. From the data that THE makes available, however, TCD’s overall score has improved, from 50.3 to 51.2:

[Screenshots: TCD’s THE indicator scores for 2013-14 and 2014-15]

This is significant, especially when we consider that the overall score is not simply the aggregate of the five variables listed above. Alex Usher of Higher Education Strategy Associates notes in a recent post: “But this data isn’t entirely transparent. THE […] hides the actual reputational survey results for teaching and research by combining each of them with some other indicators (THE has 13 indicators, but it only shows 5 composite scores).” The opacity is especially significant when we consider what has happened to UCD in the THE ranking. We go from last year’s result, when it was in the top 200:

[Screenshot: UCD’s 2013-14 THE result]

To its latest rank, this year:

[Screenshot: UCD’s 2014-15 THE result]

Notice anything? The overall score is withheld. Sure, there are clear differences in the individual indicators, but what do these mean? Did UCD’s researchers really score 15.3 (percentage points? notches? magic beans?) less this year on Citations? The difference in Research is 0.9, so the “volume, income, and reputation” measure seems more or less intact. Teaching has actually improved by 4.6. At best, the overall ‘improvement’ in TCD’s score could indicate (on a charitable interpretation of the ranking) that other universities are also improving, just more quickly. This echoes the old truth about life in the wild: you don’t need to be the fastest to survive a predator – you just need your neighbour to be slower than you.
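To make the arithmetic concrete, here is a minimal sketch of how a THE-style weighted composite behaves. The pillar weights are the ones THE publishes for this period (Teaching 30%, Research 30%, Citations 30%, International Outlook 7.5%, Industry Income 2.5%); the pillar scores themselves are invented stand-ins, not UCD’s actual figures, and the real calculation involves standardisation steps that THE does not fully disclose:

```python
# Hypothetical illustration of a THE-style weighted composite score.
# The weights are THE's published pillar weights; the scores below are
# invented stand-ins, not any institution's actual indicator values.
WEIGHTS = {
    "teaching": 0.30,
    "research": 0.30,
    "citations": 0.30,
    "international_outlook": 0.075,
    "industry_income": 0.025,
}

def overall_score(pillars: dict) -> float:
    """Weighted sum of the five published pillar scores."""
    return sum(WEIGHTS[name] * score for name, score in pillars.items())

last_year = {"teaching": 55.0, "research": 40.0, "citations": 60.0,
             "international_outlook": 85.0, "industry_income": 40.0}
# Apply the year-on-year movements mentioned above: Teaching +4.6,
# Research -0.9, Citations -15.3 (all other pillars held constant).
this_year = dict(last_year, teaching=59.6, research=39.1, citations=44.7)

print(f"last year: {overall_score(last_year):.1f}")  # ~53.9
print(f"this year: {overall_score(this_year):.1f}")  # ~50.4
```

Even with Teaching up, a 30%-weighted pillar falling by 15 points drags the composite down by several points – which is exactly why withholding the overall score (and the 13 underlying indicators) makes the published movements so hard to interpret.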

An Irish Times article picks up on the ‘slide’, saying that the “main impact of the cuts has been on the student-staff ratio, which is one of the major factors used by all the ranking agencies.” Which is true. But the OECD, in its recent Education at a Glance report, notes that staff-student ratio is not an indicator of teaching quality, teaching outputs, or results. It is an indicator that has been seized upon because it is an intuition pump, in that it “elicits intuitive but incorrect answers.” There is as much evidence suggesting that large classes can lead to better learning outcomes as there is suggesting the opposite.

One may then be inclined to agree with Prof. Andrew Deeks of UCD when he says: “Our own analyses show that in terms of objective measures of teaching and research performance, we are performing well and making good progress.” The call to reverse cuts, in the hope that this will magically lead to improved performance in rankings, is a political argument. And that’s fine. But beware of rankings bearing ill tidings. Rankings measure what they measure, rather than the objective reality of higher education – and what they claim to measure may be questionable in and of itself.


Along with some recent developments in Canada (Ontario at least), as highlighted by Alex Usher of Higher Education Strategy Associates, this is interesting. Time for an Oswald Spengler of higher ed to make some hay of this.

From Alex Usher’s piece, the following is telling (but he doesn’t use the word “peak” anywhere here):

The numbers by field of study are even more stunning.  Overall, there was a loss of 2,050 Ontario secondary students.  The decline in Arts enrolment was 2,600.  Put differently: more than 100% of the decline can be attributed to a fall in Arts enrolment.  Hell, even journalism increased slightly.  This should be a wake-up call to Arts faculties – however good a job you think you’re doing, however intrinsically valuable the Arts may be, kids just aren’t buying it the way they used to.  And if you think that isn’t going to have an effect on your budget lines, think again.  Even at those institutions where responsibility-centred budgeting hasn’t taken hold, cash-strapped universities are going to think twice about filling vacant positions in departments where enrolments are declining.

Bryan Alexander

Did we just experience peak higher education in the United States?

I want to try out this hypothesis as a way of thinking about many current trendlines. Readers and listeners know I have been tracking a large number of grim developments in the American higher education world. Synthesizing them is what I’m currently addressing.

Peak higher ed means we’ve reached the maximum size that colleges and universities can support.  What we see now, or saw in 2012, is as big as it gets.  After two generations of growth, American higher education has reached its upper bound.

Consider recent news and data:

Student population: The number of students enrolled in American higher education dropped by more than 400,000 from 2011 to 2012, according to Census data.   The number of graduate students also dropped over the same period, falling 2.3% after a decade of growth.  Note that the number of American…


About those QS rankings and Trinity College Dublin’s “slide”

I’ll be brief. All kinds of teacup storms bubble up every year about rankings, especially with regard to Ireland’s ‘falling performance’, and usually with a focus on our leading institution, Trinity College Dublin. If we look at how things actually are, however, without a sub-editor’s eye for disaster, the situation seems less awful. Here is a link to Trinity’s rankings page. Check out the box on the right with the ‘historical data’.

[Screenshot: Trinity’s QS ranking history]

We can ignore the numbers before 2011 (QS split from Times Higher Education [THE], and the two went their separate ranking ways after 2009), and focus on what has happened in the QS ranking. Now, the weightings of all the rankings change somewhat (though barely, if at all, for the Shanghai Academic Ranking of World Universities [ARWU]), but even with those fluctuations, Trinity’s score has in fact been improving. The rank has bopped around the place a bit, but there isn’t much to suggest some kind of horrific decline here. Trinity’s QS score is improving, but its rank is not. So we have to conclude that perhaps Trinity is getting better, but that other universities and institutions may simply be getting better more quickly.
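The score-versus-rank point is easy to demonstrate with toy numbers (all invented here): an institution’s score can rise year on year while its rank worsens, simply because the others rise faster.

```python
# Toy illustration: a rising score with a falling rank.
# Rank is relative, so it depends on everyone else's scores too.
# All numbers are invented.
scores_2013 = {"A": 62.0, "B": 60.0, "Trinity-ish": 59.0, "C": 55.0}
scores_2014 = {"A": 66.0, "B": 64.5, "Trinity-ish": 61.0, "C": 63.0}

def rank_of(name: str, scores: dict) -> int:
    """1-based rank by descending score."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return ordered.index(name) + 1

for year, scores in (("2013", scores_2013), ("2014", scores_2014)):
    print(year, scores["Trinity-ish"], "rank", rank_of("Trinity-ish", scores))
# 2013 59.0 rank 3
# 2014 61.0 rank 4  <- score up two points, rank down one place
```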

Now, I know there is lots to disagree with here. QS scores what it finds important; Trinity is thus only getting better/staying the same according to what QS wants, etc. But there isn’t much sign of decline, for Trinity or for other Irish HEIs. Trinity even made it into the ARWU top 200 this year (one of the more research-stringent rankings, and begetter of all this “world class university” chatter). So yeah. Not quite the decline and fall that makes for good click-bait, but there you have it.

Sidebar: I note that Imperial College London is joint second this year in the QS. So what with the LSE and MIT similarly highly ranked, you’d think Trinity’s powers that be might consider the renaming exercise…

Quote: Rankings, reputation, and soft power


“Plentiful information leads to scarcity of attention. When people are overwhelmed with the volume of information confronting them, they have difficulty knowing what to focus on. Attention, rather than information, becomes the scarce resource, and those who can distinguish valuable information from background clutter gain power. Cue-givers become more in demand, and this is a source of power for those who can tell us where to focus our attention.
Among editors and cue-givers, credibility is the crucial resource and an important source of soft power. Reputation becomes even more important than in the past, and political struggles occur over the creation and destruction of credibility.”

Joseph Nye, The Future of Power, pp. 103-4

Sugata Mitra on the role of technology in education

While looking around on the website of the Institute of International and European Affairs (IIEA, a Dublin-based think-tank), I discovered the following lecture by Sugata Mitra. Mitra originally trained as a physicist, then got into programming and technology, which has led to his present work on getting computers into schools and the ‘Hole in the Wall’ experiment. Ken Robinson’s TED talk gets a lot of attention (arguably too much, if you’re me and you’re arguing), and though it undoubtedly introduced many people to the debates surrounding what education is and should be, it never quite hit the spot for me. Mitra’s lecture here – which admittedly isn’t subject to the TED tyranny of 20 minutes – goes from the history of education and technology in education, to the implications of sociological research on teaching and education, to specific policy and technical suggestions. It’s an hour long, but well worth watching. Alternatively, check out his own two TED talks below (Mitra also won the TED Prize in 2013). There is much cause for optimism here about the future of technology in education, mercifully free of the platitudes of tech in pedagogy and ‘there’s an app for that’.

Daniel Bell, post-industrial society, and who should pay for basic research

A few things put Daniel Bell’s The Coming of Post-Industrial Society (1973) on my radar, and so I got an old copy for myself online. The edition I have is from 1976, with a new introduction in which the author attempts to lessen the strain of the excessively heavy lifting some of his ideas were being forced to do by subsequent interpreters. What struck me is that, for a 40-year-old book, much the same conversations are being had today, although in some respects we appear to have leap-frogged the substantive elements in favour of nitty-gritty technical fixes. Bell’s book rewinds us to these bigger-picture problems.

World University Rankings – Information or Noise?

Messing around with some of the results available from the Times Higher Education World University Rankings website, it’s interesting to note that near the top of the ranking things stay relatively stable over the years, while further down there’s quite a bit of variation. In an ideal world, all the datasets would be available for download and easily manipulable (transparency!), but this is not yet the case. Anyway, doing some work for work, here’s a selection of a few institutions with their ranks plotted from the last THE-QS ranking in 2009-2010 to the most recent THE(-TR) ranking for 2013-2014.

There’s quite a bit of change from 2009-2010 to 2010-2011, when THE split from QS (or vice versa). This split brought a change in methodology and weightings, but things have not yet settled down: the weightings have continued to change (though they seem to have stayed the same between 2011 and 2012), and, as Andrejs Rauhvargers notes (pdf), “the scores of all indicators, except those for the Academic Reputation Survey […] have been calculated differently.” As well as this, in a recent journal article (“Where Are the Global Rankings Leading Us? An Analysis of Recent Methodological Changes and New Developments”), Rauhvargers notes that THE doesn’t/won’t publish the scores of its 13 indicators. Transparency! Anyway, for what it’s worth, here are some pretty pictures that illustrate the noisiness of the rankings. Just fooling around with the data to see if I will return to this with the full top 200 over the past 5 years.

[Charts: rank trajectories for selected institutions, 2009-10 to 2013-14]
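For anyone who wants to reproduce this sort of chart, here is a minimal matplotlib sketch. The rank trajectories below are invented placeholders (the real data, as noted, isn’t available as a clean download); the inverted y-axis puts rank 1 at the top, where it belongs.

```python
# Sketch: rank trajectories for a few institutions across five editions.
# The ranks below are invented placeholders, not actual THE results.
import matplotlib.pyplot as plt

editions = ["2009-10", "2010-11", "2011-12", "2012-13", "2013-14"]
ranks = {
    "Institution A": [45, 70, 65, 80, 72],
    "Institution B": [120, 95, 130, 110, 140],
    "Institution C": [15, 18, 14, 16, 17],
}

fig, ax = plt.subplots()
for name, series in ranks.items():
    ax.plot(editions, series, marker="o", label=name)

ax.invert_yaxis()  # rank 1 at the top
ax.set_ylabel("World rank")
ax.set_title("Rank trajectories across ranking editions (illustrative)")
ax.legend()
plt.tight_layout()
plt.show()
```

Note how the near-the-top line barely moves while the mid-table lines swing about – the same stability-at-the-top, noise-further-down pattern described above.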