The Convergence of University Rankings and System Benchmarking: An Apparent Paradox of "Rankology"
Questions
Two approaches: University Rankings / System Benchmarking
Are they: Complementary? Competing? Consistent?
IREG - Warsaw, May 2013
Outline
(1) Background: from ranking to benchmarking
(2) Method of investigation
(3) Results
(4) Interpretation and conclusion
(1) University Rankings
U Rankings: a Polarizing Exercise
Hated/loved, criticized/commended, threatening/stimulating, but proliferating ("here to stay")
Ph. Altbach's advice ["Don't take too much notice of rankings" (UWN, March 23, 2013)]: unlikely to be widely followed
More pitfalls discovered, uncovered, elucidated => more attempts to improve methods
U Rankings: the Disease
Methodological caveats:
- Biases: research, English, STEM
- Composite indicators: weighting => elitism
- Subjective (reputation) / non-transparent
Dangerous use ("misuses", "abuses"):
- Universities: (1) focus on competition with others instead of own improvement / affects strategic planning; (2) focus on biased criteria (research)
- Policy makers: focus on a few WCUs instead of the whole system
- Students: impact on university selection
- Overall: impact on financing
Commercialization: (crowded) market
From Ranking to Benchmarking
"If Ranking Is the Disease, Is Benchmarking the Cure?" (Jamil Salmi, Sunita Kosaraju, Evaluation in Higher Education, Vol. 5, No. 1, June 2011)
"Rankings: Neither a Disease nor a Cure" (Ph. Altbach, UWN, 2013)
(2) System Benchmarking
[Diagram: the TE system (resources, access, equity, governance, quality control, private providers) within its economic, social & technological environment]
Benchmarking: Objective & Criteria
Objective: assess the strength, health and performance of countries' tertiary education systems
Criteria: resources, inputs, governance, outputs and outcomes of the system (access, equity, quality, relevance)
Benchmarking: Main Initiatives
SABER: Systems Approach for Better Education Results (World Bank): still under construction
U21 (Universitas 21 / University of Melbourne): most recent, comprehensive available case (see below)
Benchmarking University Governance (World Bank, MENA): hybrid
AHELO: Assessment of Higher Education Learning Outcomes (OECD): still under experimentation
Hypothesis
Benchmarking developed in reaction to rankings
The objectives, level of observation and criteria of benchmarking and ranking are quite different
=> Shouldn't they yield different results?
Method (1)
1/ Select four of the most popular university rankings: ARWU, THE, QS, Webometrics
2/ Pick the most recent system benchmarking: U21
3/ Compare their results
Method (2)
Issue: how to compare universities and systems?
Solution: translate university rankings into country rankings
Method: move from the number of top universities in a country to the number of top universities relative to the tertiary-aged youths they could potentially serve in that country (i.e. the "supply" of top universities)
NB: no correlation between the two measures
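One plausible reading of this translation can be sketched in a few lines. The per-million scaling and all figures below are illustrative assumptions, not the study's data:

```python
# Sketch of the ranking-to-country translation: relate a country's count of
# top universities to the size of the tertiary-aged cohort they could serve.
# The normalization (per million youths) and the numbers are hypothetical.

def supply_of_top_universities(n_top_universities, tertiary_aged_population):
    """Top universities per million tertiary-aged youths."""
    return n_top_universities / (tertiary_aged_population / 1_000_000)

# A small country with 2 top universities and 300,000 tertiary-aged youths
# vs. a large one with 20 top universities and 90 million youths:
small = supply_of_top_universities(2, 300_000)      # ~6.7 per million
large = supply_of_top_universities(20, 90_000_000)  # ~0.22 per million
print(small > large)  # the raw counts would rank these two the other way
```

This illustrates why the two measures need not correlate: a large raw count can still mean a thin supply relative to the population served.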
[Chart: countries' rank by number of Top 400 universities vs. rank by supply of Top 400 universities (THE)]
Method (3)
Quick look at the four leagues selected. The "sample": Top 400 universities
(THE / ARWU / QS / WEBO)
Nbr of countries with at least one Top 400 university in each league:
Nbr of countries with at least one Top 400 university found in all 4 leagues (overlap): 34
Nbr of Top 400 universities in the countries with at least one Top 400 university found in all 4 leagues (overlap):
The 34: Australia, Austria, Belgium, Brazil, Canada, China, Czech Republic, Denmark, Finland, France, Germany, Greece, Hong Kong, India, Ireland, Israel, Italy, Japan, Mexico, Netherlands, New Zealand, Norway, Poland, Portugal, Russian Federation, Singapore, South Africa, South Korea, Spain, Sweden, Switzerland, Taiwan, United Kingdom, United States
Comparing the results of the 4 Rankings (1)
Correlation between results of the 4 leagues (number of top universities in each country)
Comparing the results of the 4 Rankings (2)
Correlation between results of the 4 leagues: (1) number of Top 400 universities in each country
[Table: pairwise R² of the number of Top 400 universities per country across THE, QS, ARWU and WEBO; e.g. 0.98 in the ARWU row]
Comparing the results of the 4 Rankings (3)
Correlation between results of the 4 leagues: (2) supply of top universities
[Table: pairwise R² of the supply of top universities across THE, QS, ARWU and WEBO; e.g. 0.86 in the ARWU row]
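The pairwise comparisons behind these R² tables amount to the squared Pearson correlation of two leagues' per-country measures. A minimal sketch, with invented counts rather than the THE/QS/ARWU/WEBO data:

```python
# Squared Pearson correlation (R^2) between two leagues' per-country
# measures (counts or supply). Data below is hypothetical.

def r_squared(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov * cov / (vx * vy)

league_a = [10, 7, 5, 3, 1]   # e.g. top-university counts in 5 countries
league_b = [11, 6, 5, 2, 1]
print(round(r_squared(league_a, league_b), 2))  # 0.96
```

An R² this close to 1 is what the tables above report: the four leagues largely agree once their results are aggregated to the country level.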
Supply: Nbr of top universities / tertiary-aged population
(QS / ARWU / THE / WEBO)
The first five countries (from rank 1): Finland, New Zealand, Switzerland, Ireland, Denmark
The last five countries (from rank 30): Poland, Mexico, Brazil, China, India
Benchmarking: the "U21" Method (1)
1/ A priori selection of 48 countries (+2)
2/ Assessment of countries' performance based on one overall indicator and 4 "measures": (1) Resources, (2) Environment, (3) Connectivity, (4) Output
Benchmarking: Method (2)
(1) Resources (25%): 5 indicators on expenditures
(2) Environment (25%): 2 indicators on gender balance, 1 indicator on data quality, 3 indicators on the policy and regulatory environment, 1 homegrown index on internal governance
Benchmarking: Method (3)
(3) Connectivity (10%): 2 indicators on the degree of internationalization (students & research)
(4) Output (40%): 5 indicators on research, 1 indicator on the probability that a person attends a top 500 university (*) based on ARWU…, 1 indicator on enrollment, 1 indicator on the tertiary-educated population, 1 indicator on unemployment among the tertiary-educated population
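The weighting scheme above (Resources 25%, Environment 25%, Connectivity 10%, Output 40%) can be sketched as a simple weighted sum; the per-measure country scores here are hypothetical:

```python
# Sketch of a weighted composite in the spirit of U21's overall indicator.
# Weights come from the slides; the example scores are invented.

WEIGHTS = {"resources": 0.25, "environment": 0.25,
           "connectivity": 0.10, "output": 0.40}

def overall_score(scores):
    """Weighted sum of the four measure scores (each on a 0-100 scale)."""
    assert set(scores) == set(WEIGHTS), "need exactly the four measures"
    return sum(WEIGHTS[m] * scores[m] for m in WEIGHTS)

example = {"resources": 80, "environment": 70, "connectivity": 60, "output": 90}
print(overall_score(example))  # 0.25*80 + 0.25*70 + 0.10*60 + 0.40*90 = 79.5
```

Note how the 40% weight on Output (dominated by research indicators) pulls such a composite toward what the university rankings already measure, which is one methodological explanation for the convergence discussed below.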
Benchmarking: Links between the 5 measures
[Table: pairwise correlations among the overall indicator and the four measures: Resources (25%), Outputs (40%), Environment (25%), Connectivity (10%); e.g. 0.40 in the Environment row]
Comparing Results of Rankings and Benchmarking (1a)
Countries overlap between UR and SB:
U21 & THE: 37 common countries
U21 & QS: 40 common countries
U21 & ARWU: 37 common countries
U21 & WEBO: 41 common countries
=> Essentially the same pool of countries
Comparing Results of Rankings and Benchmarking (1b)
Not in U21: Colombia, Estonia, Iceland, Lebanon, Oman, Philippines, Saudi Arabia, UAE
Not in one (or more) ranking: Argentina, Bulgaria, Chile, Croatia, Hungary, Indonesia, Iran, Malaysia, Romania, Slovakia, Slovenia, Thailand, Turkey, Ukraine
Comparing Results of Rankings and Benchmarking (2)
Correlation between U21 indicators and rankings (supply): R²
[Table: R² of the U21 Overall, Resources, Outputs, Environment and Connectivity scores against the THE, QS, ARWU and WEBO supply measures]
Comparing Results of Rankings and Benchmarking (3)
U21 (Overall) and THE rankings: R² = 0.74
Comparing Results of Rankings and Benchmarking (4)
U21 (Resources) & ARWU (Supply): R² = 0.78

Country          U21 (Resources)  ARWU (Supply)
Canada           100              60
Denmark          97               92
Sweden           94               136
USA              92               46
Norway           92               74
Finland          89               69
Switzerland      87               118
Singapore        82               48
Netherlands      80               90
Austria          75               46
Ireland          72               80
Belgium          69               72
France           67               32
Hong Kong        64               86
Israel           64               80
Germany          64               46
Taiwan           63               16
Australia        63               82
South Korea      60               16
New Zealand      59               48
Portugal         58               13
Spain            58               23
Iran             57               1
UK               56               60
Japan            53               19
Poland           49               5
Italy            47               31
Czech Rep        47               12
Russia           43               1
Brazil           42               2
Mexico           40               1
Hungary          40               12
Argentina        39               2
South Africa     35               3
China            33               1
India            23               0.1
Greece
Conclusions / Interpretation
1/ Hypothesis not confirmed: a/ same set of countries; b/ similar results
2/ Two types of explanations: a/ methodological; b/ structural
Epilogue
System benchmarking ends up ranking countries
The boundaries between UR and SB are blurred
SB shares common symptoms with UR
The convergence of the two streams of "Rankology" is not surprising
Benchmarking needs to expand its pool of countries to become more relevant
Take Away
Thank You