The Nation

opinion

chalk talk

World university ranking systems: Measuring the unmeasurable

During the past decade various systems have emerged to rank the universities of the world. Two of the most prominent are the Academic Rankings of World Universities (ARWU) compiled by Jiao Tong University in Shanghai and the Times Higher Education World University Rankings developed in London.

The latter rankings for 2012-2013 were recently released, and Thai higher education officials and professors are certainly deeply disappointed by the results. Only one Thai university, KMIT-Thonburi, ranks in the global top 400, and it falls in the lowest group, the 351-400 range.

There were no Thai universities in the top 500 of the Shanghai rankings. In the 2011-2012 QS World University Rankings, Chulalongkorn was number 171. In 2013, Times Higher Education released the top 100 Asian Universities rankings. The University of Tokyo was number 1, KMIT-Thonburi, 55, Mahidol University, 61 and Chulalongkorn University, 82. Mahidol University, which has created an attractive "green campus", ranked 36th internationally for environmental performance on the GreenMetric. Nottingham University in the UK ranked first. Among the top six universities in both the Shanghai and London rankings were Stanford, Harvard, MIT, and Cal Tech, reflecting the "super brand" status of such institutions.

Despite the controversy surrounding these systems and debates about their methodologies and validity, the rankings do matter, and many people pay close attention to them, including the universities themselves and potential student applicants.

There are two basic ways of doing the rankings. The first, used by Jiao Tong, relies only on "objective" indicators. Other ranking systems, such as the Times Higher Education system, also include reputational data based on surveys of prominent academics (the reputational factor accounts for 40 per cent in their system). Their system involves 13 carefully selected performance indicators.

Universities and colleges have three basic missions: teaching students, advancing the frontiers of knowledge through research, and providing service to the larger community through various kinds of engagement.

When I was working at the National University of Laos in a communist system, I often heard the words "serve the people", reflecting the service role of the university. A vivid example of service to the larger community has been the work of Magsaysay Prize-winning Dr Krisana Kraisintu of Rangsit University, who has worked tirelessly to provide low-cost pharmaceuticals to victims of HIV/Aids and malaria in Sub-Saharan Africa. That work is not at all reflected in the ranking of Rangsit.

It is virtually impossible to measure validly the quality of teaching or the service engagement of universities.

Also, modern universities nearly everywhere are failing to assess their learning outcomes, perhaps the single best indicator of excellence in higher education, and are not systematically studying what happens to their graduates.

Various proxies are used to try to assess teaching quality, such as the student-teacher ratio and the quality of students as measured by scores on examinations such as the SAT and GRE. These measures are highly indirect and inadequate for assessing teaching excellence.

Princeton, unlike Stanford and Harvard, does not allow teaching assistants to teach undergraduate classes, but that key factor related to teaching quality is ignored. Princeton also requires its freshmen to have international experience, and that too is not reflected in the rankings.

Results from the various systems, except for the very top 10-12 schools, are highly inconsistent, reflecting serious problems with these ranking systems. My own institution, the University of Minnesota, is 29th in the Shanghai rankings but 47th in the Times Higher Education rankings.

In both 2011 and 2013, the European University Association released reports on these ranking systems. They criticise the systems for bias against smaller schools, which may be strong in only certain fields and may emphasise teaching excellence rather than research. They also criticise the systems for overly emphasising tangible, easily measured research.

The systems are also heavily biased against universities in non-English speaking countries such as Thailand and do not adequately recognise the value of non-English language publications. Generally the rankings ignore the subjective quality of a campus. Thus, the world-class quality of campuses such as Princeton, Stanford, and the new Assumption campus in Bang Na is ignored.

Also most rankings do not give adequate attention to internationalisation, which is of growing importance in an increasingly multicultural global era.

There are also numerous dimensions of the quality of higher education, which are not comparable across nations. College sports are an important part of campus life in the US, but are irrelevant in other countries.

Also missing in most rankings (except those of the magazine US News & World Report for certain US institutions) are estimates of quality relative to cost.

Since 1995, the National Research Council in the US has been providing national ratings of all doctoral programmes. Such rankings by field are subject to a serious "halo effect" which benefits highly rated universities.

Unfortunately, institutions of higher education may be making important decisions influenced by a desire to improve their rankings ("teaching to the test"). A dean at the University of Oregon, for example, made hiring "disciplinary stars" her top priority while neglecting international and interdisciplinary studies.

If Thailand wants to boost its rankings, it must find ways to encourage more faculty members to publish in top English-language journals and/or hire more international faculty with this capability. Institutions in China, Hong Kong, and Singapore are doing exactly that. As Einstein said: "Many of the things you can count, don't count. Many of the things you can't count really count." This reflects the key issue with global university ranking systems.

Gerald W Fry is a Professor of the Department of Organisational Leadership, Policy, and Performance, University of Minnesota. He can be contacted at gwf@umn.edu.

