
1 U-Multirank – The implementation of a multidimensional international ranking
Higher Education Conference Rankings and the Visibility of Quality Outcomes in the European Higher Education Area
Dublin, 30-31 January 2013
Don Westerheijden, Frank Ziegele
Team Leaders: Gero Federkeil, Jon File, Frans van Vught, Frank Ziegele

2 Why one more ranking? – The purpose of U-Multirank
Policy purposes:
- Transparency (on diversity) about the European Higher Education Area – triad of projects: EUMIDA, e3m, U-Multirank
- Benchmarking with non-European higher education (Modernisation Agenda)
Stakeholder-related purposes, e.g.:
- Students: informed choices
- Institutions: strategy development through comparison & benchmarking
- Policy makers: diversity/performance of systems
- Employers: partners for cooperation

3 How can those purposes be achieved? – The basic approach
Multidimensional ranking – going beyond the traditional focus on research excellence:
- 5 dimensions: teaching & learning, research, knowledge transfer, international orientation, regional engagement
- No composite indicators, no pre-defined weights on individual indicators
User-driven ranking:
- Personalised ranking allows users to rank by their own preferences and priorities on dimensions and indicators (see the sketch below)
- Flexible web tool
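To make the user-driven idea concrete, here is a minimal, purely hypothetical sketch in Python (the Institution class and rank_by_indicators function are names invented for this illustration, not part of the actual U-Multirank software): each indicator the user selects yields its own ordering, and no composite score or pre-defined weighting is ever computed.

```python
# Hypothetical sketch of user-driven, non-composite ranking.
# Assumption: each institution carries a plain indicator -> value mapping.
from dataclasses import dataclass, field


@dataclass
class Institution:
    name: str
    scores: dict[str, float] = field(default_factory=dict)  # indicator -> value


def rank_by_indicators(institutions: list[Institution],
                       selected: list[str]) -> dict[str, list[str]]:
    """Return one ranking per selected indicator; nothing is aggregated."""
    rankings: dict[str, list[str]] = {}
    for indicator in selected:
        with_data = [i for i in institutions if indicator in i.scores]
        with_data.sort(key=lambda i: i.scores[indicator], reverse=True)
        rankings[indicator] = [i.name for i in with_data]
    return rankings


if __name__ == "__main__":
    unis = [
        Institution("University A", {"graduation_rate": 0.82, "citations": 1.4}),
        Institution("University B", {"graduation_rate": 0.91, "citations": 0.9}),
    ]
    # The user decides which indicators matter; each gets its own list.
    print(rank_by_indicators(unis, ["graduation_rate", "citations"]))
```

Keeping one list per selected indicator, instead of collapsing everything into a single weighted total, reflects the "no composite indicators, no pre-defined weights" principle stated above.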

4 How can those purposes be achieved? – The basic approach
Comparing like with like:
- Link to mapping indicators allowing the identification of institutions with similar institutional profiles
Multi-level ranking:
- Combining an institutional ranking (whole institutions) and field-based rankings
Stakeholder-oriented processes:
- Intensive inclusion of stakeholders in the development and continuous refinement of U-Multirank

5 With this approach U-Multirank will create performance profiles respecting mission diversity (example: institutional level)
- Teaching & learning: expenditure on teaching, graduation rate, interdisciplinary programmes, relative graduate employment, time to degree
- Research: art-related output, expenditure on research, citations, highly cited publications, interdisciplinary research, international awards, number of post-docs, competitive research income
- Knowledge transfer: incentives, university-industry publications, third-party funding, patents, size of TTO, CPD courses offered, co-patents, spin-offs
- International orientation: programmes in a foreign language, academic staff, PhD graduations, joint research publications, joint degree programmes
- Regional engagement: graduates in the region, income from the region, regional research publications, contracts with the region, regional student internships
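As a rough illustration of how such a profile might be organised as data, the mapping below groups a few of the example indicators from this slide under the five dimensions; the constant name and the snake_case identifiers are assumptions made for this sketch, not the project's actual data format.

```python
# Illustrative only: the five dimensions with a selection of the example
# institutional-level indicators listed on this slide.
EXAMPLE_DIMENSIONS: dict[str, list[str]] = {
    "teaching_and_learning": ["expenditure_on_teaching", "graduation_rate",
                              "interdisciplinary_programmes", "time_to_degree"],
    "research": ["expenditure_on_research", "citations",
                 "highly_cited_publications", "competitive_research_income"],
    "knowledge_transfer": ["university_industry_publications",
                           "third_party_funding", "patents", "spin_offs"],
    "international_orientation": ["programmes_in_foreign_language",
                                  "joint_research_publications",
                                  "joint_degree_programmes"],
    "regional_engagement": ["graduates_in_region", "income_from_region",
                            "regional_student_internships"],
}

# A performance profile for one institution could then simply attach a value
# to each indicator, grouped by dimension.
example_profile = {
    dimension: {indicator: None for indicator in indicators}
    for dimension, indicators in EXAMPLE_DIMENSIONS.items()
}
```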

6 The feasibility study proved that U-Multirank works
Feasibility study (2009-2011):
- Pilot study with 150 HEIs: institutional ranking and 3 pilot fields (mechanical/electrical engineering, business)
Some general results:
- The indicators and the processes of data collection and analysis are feasible
- Good response from most countries
- "Pre-filling" options exist, but institutional data collection remains inevitable
- Need for further refinement of the indicators on knowledge transfer, regional engagement and employability/labour market

7 The real implementation has just started
2012-2014: a new project to implement U-Multirank and to develop a sustainable business model
- Kick-off: Presidency Conference, Dublin
- Two years, plus an option for another two years
- First ranking to be published in February 2014
- Institutional ranking and four fields (the pilot fields + physics)
- Minimum coverage of 500 institutions
From 2015: gradual extension in the number of institutions and fields
- Special task: feasibility study on the inclusion of research-performing institutions (non-university research)

8 U-Multirank is done by a consortium of partners combining different functions and expertise
Coordination/lead and rankings:
- CHE Centre for Higher Education
- CHEPS Center for Higher Education Policy Studies
Partners:
- Data collection: CWTS Center for Science and Technology Studies, Leiden University; Incentim, International Centre for Research on Entrepreneurship, Technology and Innovation Management, KU Leuven
- Web tool experts: folge3, Johnny Rich (Push)
- Business model: Elsevier, Bertelsmann Foundation
Associate partners:
- National rankings: OST (France), Perspektywy (Poland), Fundación CYD (Spain)
- Stakeholder organisations: ESU, CESAER, IRUN, UASNet

9 The U-Multirank tool will provide substantial benefits to users
For prospective/mobile students:
- Selection of the field of study
- Guided journey to identify personal preferences/priorities
- Personalised selection of indicators
- Result: a shortlist of institutions matching personal preferences, with ranking outcomes for the personally relevant indicators
For higher education institutions:
- Selection of institutional profiles (mapping indicators) to compare like with like
- Comparison of all/selected indicators or of one dimension (institutional and field-based, for all faculties)
- See and compare full institutional performance profiles
Additional benefits for participating institutions:
- Detailed analysis of their data compared to the averages of the total sample
- Option of creating benchmarking networks
- Services such as widgets for the institution's website

10 The U-Multirank tool will provide substantial benefits to users
For policy makers:
- Identify and compare the performance profiles of national institutions, not only research universities → insight into institutional diversity
- Performance of national HE institutions in an international context → benchmarking of national systems, probably also for specific dimensions/indicators
Crucial: a user-friendly and flexible web tool which provides comfortable "user journeys" and sufficient guidance

11 Illustration of a user journey (student): we can take a first look into the web designers' laboratory…

12 …and start user journeys by defining an institutional profile or by selecting a particular university…

13 …defining an institutional profile to compare like with like… (only examples; other criteria are possible)

14 …leading to a ranking of those institutions…

15 …with the possibility to personalise the ranking by selecting indicators according to personal preferences (example of the choice of indicators on the next slide)

16 Example of the choice of indicators: personalisation depends on individual preferences
Personalise your ranking by choosing the indicators that are relevant from your perspective, for instance:
- Want a quick study → graduation rate within time
- Want small groups → student-staff ratio
- Want satisfied students → overall student satisfaction
- Want practically oriented teachers → professors with outside work experience
- Probably will go for a PhD → doctorate productivity
The result is a new list with the chosen indicators: a personal ranking!
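A minimal sketch of this preference-to-indicator matching, assuming a simple lookup table (both the dictionary and the function below are hypothetical helpers, not part of the U-Multirank web tool):

```python
# Hypothetical mapping from stated student preferences to ranking indicators,
# following the examples on this slide.
PREFERENCE_TO_INDICATOR = {
    "want a quick study": "graduation rate within time",
    "want small groups": "student-staff ratio",
    "want satisfied students": "overall student satisfaction",
    "want practically oriented teachers": "professors with outside work experience",
    "probably will go for a PhD": "doctorate productivity",
}


def choose_indicators(preferences: list[str]) -> list[str]:
    """Turn stated preferences into the indicators used for the personal ranking."""
    return [PREFERENCE_TO_INDICATOR[p] for p in preferences
            if p in PREFERENCE_TO_INDICATOR]


# Example: a student who values small groups and plans a PhD would rank only
# by student-staff ratio and doctorate productivity.
print(choose_indicators(["want small groups", "probably will go for a PhD"]))
```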

17 There are also typical “user journeys” from the perspective of HEIs, for instance for rectors

18 How is my university performing against others that are similar to mine?
- Show the overall performance profile of my university
- Filter the universities with a similar activity profile
- Compare performances: overview of all dimensions
- Compare performances: select indicators/dimensions with high relevance for the institutional strategy
- Use the benchmarking results for internal discussions, internal analysis of reasons and strategic development
(A sketch of these steps follows below.)
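The steps of this rector journey could look roughly like the sketch below, assuming each institution carries both mapping indicators (its activity profile) and performance scores; all class, field and function names here are illustrative assumptions, not the real tool's API.

```python
# Hypothetical sketch: filter institutions with a similar activity profile,
# then compare selected indicators against one's own university.
from dataclasses import dataclass


@dataclass
class ProfiledInstitution:
    name: str
    profile: dict[str, str]    # mapping indicators, e.g. {"size": "large"}
    scores: dict[str, float]   # performance indicators


def similar_institutions(me: ProfiledInstitution,
                         others: list[ProfiledInstitution],
                         profile_keys: list[str]) -> list[ProfiledInstitution]:
    """Keep only institutions whose mapping indicators match mine ('like with like')."""
    return [o for o in others
            if all(o.profile.get(k) == me.profile.get(k) for k in profile_keys)]


def compare(me: ProfiledInstitution,
            peers: list[ProfiledInstitution],
            indicators: list[str]) -> dict:
    """Side-by-side view of my scores and each peer's scores for chosen indicators."""
    return {inst.name: {ind: inst.scores.get(ind) for ind in indicators}
            for inst in [me, *peers]}
```

The resulting side-by-side view corresponds to what the slide describes as the basis for internal discussion and strategic development.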

19 What will be the next steps? – Recruitment of institutions
Phase 1: individual expressions of interest
- 150 pilot institutions
- Partner networks (CESAER, UASnet, IRUN)
- Research-intensive universities (bibliometrics)
Phase 2: country-specific recruitment (getting a certain number and scope of institutions per country on board)
Phase 3: targeted recruitment to ensure an adequately balanced sample for the first ranking

20 What will be the next steps? – Consultations and data collection
Stakeholder consultations:
- On the refined indicators (with partners and stakeholder/field organisations) – starting in February
- On the web tool (information needs, functions, presentation modes) – starting in April (until April: development of a prototype)
Data collection: starting May/June 2013
Continuous communication:
- Transparency about all steps and activities
- Responsiveness channels
- Presentations (contact us if needed!)
- Media partners

21 Information / Contact
Information about U-Multirank: www.u-multirank.eu
Final report of the feasibility study: http://ec.europa.eu/education/higher-education/doc/multirank_en.pdf
Contact / expression of interest in participation: info@u-multirank.eu

22 U-Multirank – The implementation of a multidimensional international ranking
Higher Education Conference Rankings and the Visibility of Quality Outcomes in the European Higher Education Area
Dublin, 30-31 January 2013
Don Westerheijden, Frank Ziegele
Team Leaders: Gero Federkeil, Jon File, Frans van Vught, Frank Ziegele

