Malaysia’s universities have been told not to obsess over international varsity rankings.
But every time QS, the popular and highly commercialised rankings operator, releases new listings, we embrace and applaud the result as though the esteem and purpose of our universities depend on it.
GERAK firmly believes that our universities must not be obsessed with rankings. Our institutions of higher learning serve higher purposes than our performance according to QS.
We strongly urge the government and all university administrations to define for ourselves the meaning and measurement of excellence in research, teaching, and contribution to public knowledge and national development.
We acknowledge the dilemmas the government and university administrations face, now that the higher education sector is neck-deep in the rankings system. Moving up in the rankings provides a simple and tangible validation.
No leader wants to see their institution’s international position drop on their watch, and many in the academic community are inclined to celebrate upward movements.
Even Education Minister Dr Maszlee Malik, who issued the directive to not obsess over the rankings, cannot resist.
QS just announced that Malaysian universities have moved up the Asia list – especially Universiti Malaya, which is now ranked 13th best on the continent.
Maszlee had to applaud this news.
Earlier this year, UM’s arrival at No. 70 in the world, and other institutions’ best-ever showings, triggered widespread self-congratulation.
Have we stopped the obsession? Emphatically, no!
But, playing the rankings game comes at a cost.
Under the Pakatan Harapan government, the KPIs of academic work remain substantially dictated by the metrics of university rankings.
It’s a numbers game, because only quantifiable outputs matter in the points system that determines whether one institution is greater than the next.
To rank well is to score high, with “soaring up” – the mantra of Idris Jusoh of the former regime – evidently echoing in the background.
Many “best researcher” awards nowadays robotically defer to computer-generated counts of publications and citations – key criteria of the rankings, of course – instead of recognising work that is original, path-breaking and nationally important.
Research outputs overwhelmingly trump excellence in teaching and contribution to public knowledge and national development, which may not translate into high citations and other points that boost rankings.
The heavy weighting of publication volume and citation counts leads to the pursuit of quantity over quality, popularity over originality, and a proclivity for easy research subjects over more challenging ones.
Worst of all, academic dishonesty has become entrenched, with rampant practices of free-riding co-authorship, plagiarism and cosmetic citation (you cite me, I cite you, we both inflate our citation stats). Participating in international rankings is not the only cause, but obsessing over the rankings is clearly a major contributing factor.
So, how do we stop obsessing?
First, we must show resolve and autonomy, and exercise discipline, in setting policies and reward schemes that are not dictated by the rankings scheme.
Of course, universities must promote research and publication, but emphasise, recognise and reward originality, honesty and significant contribution to public knowledge and nation-building priorities.
Malaysia must critically revisit academic work conditions, addressing widely observed concerns: bureaucratic overload (due to many factors, not just the rankings), KPI pressure that prizes quantity over quality, and demoralisation at seeing those who game the system reap the rewards.
We can set the agenda independently and make steady gains in the rankings – which can remain a reference point, but not the defining pursuit.
Who knows how long this rankings business will survive, anyway? Why hitch our long-term plans to it?
Second, we must not accept the rankings at face value, but critically examine whether the improved rankings are in line with Malaysia’s policy priorities.
Analysing the QS data
The QS data can be useful, when appropriately analysed. A brief unpacking of the components of UM’s scores in recent years is instructive.
UM has shot up in the rankings, from No. 146 in 2016 to an admirable No. 70 for 2020.
However, its overall score, a weighted average of the six components shown in Figures 1 and 2, only slightly increased over that period (black line).
In other words, UM’s rankings improvement was largely due to other universities’ scores declining.
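The arithmetic behind this point can be made concrete with a minimal sketch. The weights and scores below are hypothetical, chosen only to mirror the spirit of QS’s weighted-average method; they are not QS’s actual figures for any university.

```python
# Illustrative only: a QS-style overall score is a weighted average of
# component scores, but a RANK is relative -- it can rise even when a
# university's own score is flat, if peers' scores decline.
# All weights and numbers below are hypothetical.

WEIGHTS = {
    "academic_reputation": 0.40,
    "employer_reputation": 0.10,
    "faculty_student": 0.20,
    "citations": 0.20,
    "intl_faculty": 0.05,
    "intl_students": 0.05,
}

def overall(scores):
    """Weighted average of the six component scores (0-100 scale)."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

def rank(university, field):
    """1-based position of `university` when the field is sorted by score."""
    ordered = sorted(field, key=field.get, reverse=True)
    return ordered.index(university) + 1

# Year 1: University A and two hypothetical peers (overall scores).
field_y1 = {"A": 62.0, "B": 70.0, "C": 65.0}
# Year 2: A's score is unchanged, but the peers' scores fall.
field_y2 = {"A": 62.0, "B": 60.0, "C": 58.0}

print(rank("A", field_y1))  # 3
print(rank("A", field_y2))  # 1 -- rank improves with no score gain
```

The sketch shows why a climb in the table, by itself, tells us little: University A “improves” from 3rd to 1st without gaining a single point.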
When we probe further, we also observe that UM improved most in the “gameable” components of citations and reputation.
Reputation scores derive from QS’ surveys (Figure 1). Academics cannot appraise their own institution, but nothing stops them from singing the praises of their counterparts. In the 2018/19 round, Malaysia-based survey participants comprised 4.6% of QS’s global responses.
In the components based on empirical data – faculty-student ratio, international faculty and international students – UM’s scores have actually been declining (Figure 2).
Next, we look at the QS subject rankings, where UM has celebrated stellar successes of breaching the world’s top 50.
Well, that stopped in 2017.
We have been hearing less about UM’s subject rankings in the past two years – only applause for the overall ranking – because UM has been falling, quite steeply, in the subject rankings of its supposed flagship departments.
Electronic and electrical engineering, the darling in 2017 when it shot up to No. 23 in the world, slipped to No. 47 in 2019 (Figure 3).
Development studies also peaked in 2017, touching No. 26, but has steadily declined since then (Figure 4).
In sum, UM has progressed in areas where its scores can be manipulated and regressed in internationalisation, and has witnessed all its flagship subject areas fall in the rankings since 2017.
Amid all this, UM has kept climbing up the overall rankings. These are troubling signs that should make us pause and reflect.
Third, we must resist the temptation to measure and reward success based on the rankings. It is too easy, and as we have found above, deeply flawed.
Can Malaysia extricate itself from the QS universe?
We appreciate that withdrawing Malaysia’s participation in the rankings is a difficult political undertaking. But, there is still much we can do that is within our control.
The education minister should issue a moratorium on press releases in response to QS reports, and urge universities to follow suit.
We also need to decide, critically and decisively, whether to continue investing so much in the QS system specifically, which, as we have shown above, is highly problematic.
GERAK proposes a sober and balanced consideration of all the available rankings instruments.
Malaysia’s progress in the rankings should be evaluated – in a critical and rigorous manner worthy of academia – every three to five years, instead of the current practice of superficial and annual tracking. An immediate step can be to stop jumping at every QS press release, which the profit-seeking company spaces out through the year.
There are many ways to stop the obsession with university rankings. It’s time to walk the talk.