A new university rankings system based on one of the world’s most comprehensive databases of university performance has demonstrated the inherent flaws in league tables that rank universities in simple numerical order.
The new ranking, released today by Curtin University’s Open Knowledge Initiative (COKI) and presented in a research paper published in the journal eLife, is one of the first to report the reliability of ranking scores and draws on an immense database sourced from across the world.
Lead author Dr Karl Huang, from COKI, said the Index sets the scene for a fresh conversation about how universities collaborate and give back to their communities.
“Each year, the release of a range of university rankings provokes significant debate about the relevance and value of each ranking system, but these rankings have become highly influential over the past two decades, impacting on student recruitment and university strategy,” Dr Huang said.
“Our paper developed an index of the top 100 universities in terms of open access performance, assessing the reliability of the score for each institution. We found that universities can’t be easily ranked from best to worst with a simplistic league table.
“This new ranking highlights deep flaws in our reliance on existing league tables as dependable measures of how good each university is.”
Dr Huang said that no score is perfect, but the precision of each university’s score, and therefore of its position in the ranking, can be estimated.
“Simple rankings have always been misleading and what our statistical work shows is just how much of an issue that is,” Dr Huang said.
“We can measure performance for each university, but statistically speaking many in the top 100 are not different from each other. What this means is that the question of which university is at the top, or whether university A is one position above university B, is meaningless.”
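The statistical point can be illustrated with a small worked example. The Python sketch below uses hypothetical scores and standard errors (they are not figures from the COKI dataset or the eLife paper) and a simple normal approximation: adjacent positions in a league table are only meaningfully ordered when the confidence intervals of the scores do not overlap.

```python
# A minimal sketch of the point made above, using made-up scores and
# standard errors rather than figures from the COKI dataset: when the
# 95% confidence intervals of two universities' scores overlap, their
# rank order is not statistically meaningful.

scores = {
    "University A": (78.4, 1.9),  # (score, standard error), illustrative only
    "University B": (77.6, 2.1),
    "University C": (76.9, 1.8),
    "University D": (71.2, 1.5),
    "University E": (60.0, 2.0),
}

# Approximate 95% confidence interval for each score, assuming the
# estimate is roughly normally distributed.
intervals = {
    name: (mean - 1.96 * se, mean + 1.96 * se)
    for name, (mean, se) in scores.items()
}

ranked = sorted(scores, key=lambda name: scores[name][0], reverse=True)
for higher, lower in zip(ranked, ranked[1:]):
    higher_low = intervals[higher][0]   # lower bound of the higher-ranked score
    lower_high = intervals[lower][1]    # upper bound of the lower-ranked score
    distinct = higher_low > lower_high  # intervals do not overlap
    verdict = "distinct" if distinct else "statistically indistinguishable"
    print(f"{higher} vs {lower}: {verdict}")
```

In this illustration, the universities near the top cannot be separated from one another; only the large gap between the fourth- and fifth-placed institutions yields a statistically clear distinction.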
Co-author Professor Cameron Neylon, the co-lead of COKI, said the findings have serious implications for many other league tables, which have not addressed the issue of variability in the measures they use.
“Whole continents are largely ignored by current league tables, which raises questions about the measures currently relied upon to decide whether a university is doing its job,” Professor Neylon said.
“Examining data across a much wider range of measures, drawing in statistics from around the world, will help redefine the impact and role of universities in the 21st century.”