USING WORLD UNIVERSITY RANKING SYSTEMS TO INFORM AND GUIDE STRATEGIC POLICY MAKING. A CASE STUDY OF A CANADIAN RESEARCH UNIVERSITY AC21 International FORUM 2010 Shanghai Jiao Tong University Roland Proulx Université de Montréal - Canada
PURPOSE OF THE PRESENTATION
The intent of this paper is to illustrate, in the format of a case study, how the international rankings have initiated and informed a strategic thinking and planning process at the University of Montreal. The methodology used:
- Set the strategic position of the University from a global perspective: assessing the "overall scores" of the rankings as a general reference and then addressing, beyond the overall scores, the various individual performance indicators;
- Apply, according to the intent of the organization, the various scores and indicators to a formal strategic decision-making process.
Outline of the presentation
- The findings
  - Ranking of the University of Montreal according to the "overall scores"
  - Ranking of the University of Montreal according to selectively chosen research indicators
- Diagnosis and strategies
FINDING # 1: UNIVERSITY'S WORLD POSITION ACCORDING TO THE FINAL SCORES (2008)
The University of Montreal is clearly ranked among the top 100 world universities, though on the borderline in most of the rankings. On the North American scene, the University does not make the top 50, though it does rank among the top 50 European universities. Among the 15 Canadian research universities with medical schools, the rank of the University of Montreal fluctuates between positions 4 and 6.
FINDING # 2: Ranking of the University of Montreal according to selectively chosen research indicators
For the purpose of a more targeted and more limited analysis of the University of Montreal as a world-class, research-intensive university, the University has decided to evaluate and position its performance mostly against the various research indicators: considering the indicators per se, free of the biases introduced by the selection of criteria and the calculation and sorting of scores, and adopting a customized process for planning purposes (the CHE model). Marginson (2008) has rightly argued that "Each indicator has distinct meanings for measures of value for the k-economy, and for policies of innovation and performance improvement."
The research performance indicators used
Université de Montréal: its position according to the ranks of the indicators (World, Canada, North America, Europe; "=" marks a tied rank)

criteria               | source | indicator                            | weight | World | Canada | N. America | Europe
-----------------------|--------|--------------------------------------|--------|-------|--------|------------|-------
productivity           | ARWU   | Score on SCI (2007)                  | 20%    | 63    | 5      | 38         | 14
productivity           | Leiden | Number of publications (2003-2007)   | N/A    | 117   | 5      | 56         | 43
productivity           | Taiwan | Number of articles (1997-2007)       | 10%    | 84    | 6      | 57         | 36
productivity           | Taiwan | Current articles (2007)              | 10%    | 67    | 5      | 45         | 23
excellence & intensity | THES   | Citations / faculty (2002-2007)      | 20%    | 104   | 4      | 48         | 37
excellence & intensity | Leiden | Citations / publication (2003-2007)  | N/A    | 104   | 5      | 69         | 35
excellence & intensity | ARWU   | Score on HiCi                        | 20%    | 194=  | 12=    | 113        | 62
excellence & intensity | Taiwan | HiCi papers (1997-2007)              | 15%    | 92    | 6      | 37         | 29
excellence & intensity | Taiwan | h-index (2006-2007)                  | 20%    | 76    | 4      | 30         | 19
excellence & intensity | Taiwan | High-impact journal articles (2007)  | 15%    | 77    | 5      | 24         | 22
excellence & intensity | ARWU   | Score on N&S (2003-2007)             | 20%    | 170=  | 6=     | 87         | 66
impact                 | Taiwan | 11-year citations (1997-2007)        | 10%    | 93    | 6      | 70         | 34
impact                 | Leiden | Number of citations (2003-2007)      | N/A    | 110   | 5      | 63         | 40
impact                 | Taiwan | Number of citations (2006-2007)      | 10%    | 76    | 4      | 56         | 21
impact                 | Taiwan | Average # of citations (1997-2007)   | 10%    | 79    | 8      | 58         | 64
impact                 | Leiden | CPP/FCSm (2003-2007)                 | N/A    | 125   | 5      | 87         | 38
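To see why reading the columns individually matters, here is a minimal sketch of how a league table collapses several indicators into one composite score. The score values are invented for illustration; the equal 20% weights only echo the style of weighting shown in the table, not any ranking's actual methodology:

```python
# Hypothetical normalized indicator scores (0-100) for one university.
# Both the scores and the weights are made up for demonstration.
scores  = {"SCI": 55.0, "HiCi": 20.0, "N&S": 18.0, "articles": 60.0, "citations": 58.0}
weights = {"SCI": 0.20, "HiCi": 0.20, "N&S": 0.20, "articles": 0.20, "citations": 0.20}

assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 100%

# One weighted sum replaces five separate signals
overall = sum(scores[k] * weights[k] for k in scores)
print(overall)  # 42.2
```

A university can look respectable on the composite while scoring poorly on HiCi or N&S individually; that loss of detail is exactly what the column-by-column reading above recovers.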
Summarizing the positions
- The various indicators related to research productivity and impact tend to give the University of Montreal a good mark (with the exception of the Leiden Ranking 2008), placing it in the top 100.
- The excellence/intensity indicators, however, do not compare as well.
- On most indicators, the University performs well on the European scene, ranking among the top 50, but it is unable to break into the top 50 North American universities or to reach the first positions among the Canadian research universities.
- As a whole, most indicators are borderline.
Diagnosis and Strategies
The reputation of the University of Montreal
The THES "peer review" indicator ranks the University of Montreal 58th among the top 100 universities worldwide, 25th among North American universities and 14th among European universities.
Diagnosis
Doubts may be expressed as to whether the specific indicators actually reflect and support such a reputation and classification, and for how long. The research performance of the University of Montreal remains fragile and precarious, and may not substantiate its reputation:
- the productivity indicators are on the borderline of the top 100 worldwide and of the top 50 North American universities;
- the excellence/intensity and impact indicators show serious problems on the world and North American scenes;
- the Institution has not succeeded in improving its position among the Canadian research universities.
As a whole, we notice a loss of performance when shifting from productivity indicators to intensity and impact indicators. The University's future strategic direction and plan for research are at stake.
Formulating Strategic Goals
The University of Montreal has no choice: it must significantly consolidate and improve its performance in scientific publications and citations if it aims to maintain its enviable international reputation. The University must therefore take action to position its research performance clearly within:
- the top 100 world universities (preferably within the top 75);
- the top 50 North American universities (preferably within the top 40).
The University of Montreal must also aim to rank clearly among the top three Canadian research universities.
Crafting Strategies
To confirm the international stature of the University, two sets of well-targeted strategies must be implemented.
Strategy # 1
The first set would aim at recovering the numerous publications and citations not counted, or left aside, by the major databases (Thomson Reuters, Scopus and Google Scholar), owing to:
- problems of language;
- the absence of a consistent institutional signature;
- missing ISI Highly Cited Researchers.
Strategy # 2
Beyond the potential recovery that a bibliometric evaluation of the databases should yield, a second set of strategies must deal with increasing the number, quality and impact of the publications attributed to the University of Montreal.
Mapping the scenarios
Three Canadian research universities (Toronto, UBC and McGill) are clearly ranked among the top 100 world universities by all ranking systems, and two others are listed among the top 100 according to some systems. Being easily benchmarked, with full access to reliable data (number of publications, number of citations, citations per publication and field-normalized average impact), these five universities will serve here as the main reference for the various mapping scenarios.
Implementing the scenario
To move the research performance of the University of Montreal from its present world positions into the 3rd and 4th quartiles of the top 100 world universities, and to position it among the top three Canadian research universities (in other words, to live up to its reputation (THES) and visibility (WEB)):
- the average number of publications and citations should be massively increased;
- accordingly, the average field-normalized impact (CPP/FCSm) of the papers published by faculty members, currently a ratio of 1.26, should be increased: special care should then be given to publishing in high-impact fields and journals.
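The CPP/FCSm indicator cited above is the Leiden "crown" indicator: the institution's mean citations per publication (CPP) divided by the mean field citation score (FCSm), i.e. the world-average citation rate expected for the fields in which it publishes, so that 1.0 means world average. A minimal sketch, with invented citation counts and field averages:

```python
from statistics import mean

def cpp_fcsm(citations, field_averages):
    """Ratio of mean observed citations per paper (CPP) to the mean
    field-expected citation rate (FCSm). 1.0 = world average."""
    assert len(citations) == len(field_averages)
    return mean(citations) / mean(field_averages)

# Five hypothetical papers: observed citations vs. field-expected rates
observed = [12, 3, 0, 25, 8]
expected = [9.0, 4.0, 2.0, 15.0, 6.0]
print(round(cpp_fcsm(observed, expected), 2))  # 1.33: above world average
```

Because FCSm is field-normalized, publishing in high-impact journals only raises the ratio if the papers' citations outpace the expected rates of those fields, which is why the strategy targets both volume and field choice.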
Concluding remarks
The author of this paper has argued that the world university rankings can reasonably inform an effective decision-making process, provided the process goes beyond the overall scores to take full advantage of the individual indicators, selectively chosen according to the strategic intent of the institution and linked to a formal strategic environmental thinking exercise leading to operational changes.
We have then learned that, by encapsulating the rankings in a single composite index and focusing mostly on the world's "top" universities, the league tables have somehow hidden and concealed the very broad information used to produce them. We must restore the intrinsic value of the individual indicators: "Each indicator has distinct meanings for measures of value for the k-economy, and of policies for innovation and performance improvement" (Marginson 2008: 17). In this sense, the individual indicators provided by the rankings draw the contours of a genuine worldwide public information system, capable of producing customized, "à la carte" rankings of universities (Bourdin Report 2007-2008).