World University Rankings


1 World University Rankings
What, why and how? Presented by: Andrew King, Global Head of Business Development, QS Intelligence Unit Date: 22 February 2009

2 Objectives
Establish QS background & credentials
Explain research methodology
New and future developments
Understand limitations of rankings
Collect feedback and input
A serious objective of these presentations has been to establish the credibility of QS as a qualified operator of this kind of product; that is why a number of slides in the presentation focus on seemingly unrelated aspects of the QS business. The "feedback and input" objective is key: it puts your audience at ease and makes them feel that we recognise their concerns and expertise. I now hide this slide and talk about the objectives while running through the next one, but it might be better to leave it in. It is wise never to claim to know more than your audience about their own institution or national situation; if you plan to draw any conclusions about the country or institution you are presenting in, make sure they are well founded and, where possible, positive. Copyright © 2008 QS Intelligence Unit

3 Session Flow
Sections: QS, The Context, Available Global Rankings, The Methodology, The Future, The Results
Sections appear one at a time with a click of the mouse or an alternative control. If you need to skip to a section of the presentation, you can do so by clicking on the appropriate section.

4 Session Flow
Sections: QS, The Context, Available Global Rankings, The Methodology, The Future, The Results
Express that the QS overview will be brief: it is not a marketing pitch, just designed to help the audience understand what QS is all about. They have heard of THE but may not understand why QS is involved.

5 QS
Founded 1989/1990 by Nunzio Quacquarelli, creator of the MBA Career Guide
Approximately 100 staff in 2008
Principal offices in London, Paris & Singapore
Associates in Washington, Johannesburg, Beijing & Sydney
Nunzio founded QS while studying for his MBA at Wharton; the MBA Career Guide was the first product. QS then expanded into MBA events (the QS World MBA Tour) and from there into Masters & PhD fairs in the form of the QS World Grad School Tour. Careers services: QS Global-Workplace, a job site for MBA grads, and QS Leadership Career Forums, careers events focused on diversity. Software: QS TopMBA Search & Scorecard, a personalised ranking system for MBA programmes, and QS Top Apply, a fully featured, customisable application management system. Scholarships since 2004, with the foundation of the QS Education Trust (see later slide).

6 QS Mission… Our Channels
…to enable motivated people around the world to fulfill their potential, by fostering international mobility, educational achievement and career development
Our channels: primary research; leading-edge editorial & publications; developmental events; web solutions
Speaks for itself. You may wish to emphasise that while QS is a commercial operation, at our centre we do have a mandate for helping people.

7 We don't include this slide to show off our nice pretty purple website, but to give an idea of the global public interest in the kind of work that we do.

8 Over 3,219,000 visits in 2008
Over 8,500 visits per day in 2008
Increasingly strong prominence in Google & Yahoo searches
Ranked as high as 23,388 in Alexa
Home of the THE – QS World University Rankings
Institution profiles and detailed study-abroad information
The numbers for this slide are changing all the time. As can be seen on the previous slide, there are a lot of advertising opportunities on the site. 2009 traffic is off to a flying start, with Jan 09 vs Jan 08 showing a strong improvement.

9 Top Universities Guide
New streamlined edition available soon
Contains much information not available on the web
Order online at
The book is 500+ pages, distributed in major bookshops and on Amazon. The new edition, due this month or next, will be leaner, smarter, better.

10 Other QS Initiatives
QS World Grad School Tour
QS Top Grad School Guide
QS World MBA Tour
QS Top MBA Career Guide
QS APPLE – Kuala Lumpur, November 2009
QS Top Apply
QS Education Trust
QS Intelligence Unit
This list is not exhaustive; pick a couple of these you know the most about to highlight. The QS World MBA Tour is the most extensive sequence of MBA events in the world: on average, QS is running an event every other day.

11 QS Offices & Events
This slide says the most about why THE chose us as a partner. All the shaded areas are covered by either QS offices or events. THE's influence and network is principally UK-oriented, whereas QS has a global network of media partners, employers and institutions, which has helped provide the leverage to deliver the data we need for the rankings.

12 Session Flow
Sections: QS, The Context, Available Global Rankings, The Methodology, The Future, The Results
These slides just ground you back in the schedule. At this point I might say something like: "That's the QS bit done. I promised it would be brief. Next I want to discuss the context in which rankings have emerged."

13 Why rank universities?
Interest in ranking things and people: hospitals, schools, local authorities
Rich lists: Britain, world, Asian British, footballers
Universities: The Times / US News etc.
There is a global public fixation with rankings of things; other examples include the top 10 most downloaded celebrities. The US News & World Report ranking has existed since 1983, The Times ranking since the early 90s. It works quite well to personalise this somehow: I have often asked for a show of hands as to how many people have children, and talked a little more about the idea of using league tables of schools as a factor in deciding where to send them... and even where to live.

14 Why World Rankings?
Higher education becoming more global
Knowledge the key driver of international competitiveness
Increasing desire for comparative information
Unique positions of THE and QS as international and independent experts in higher education
Raises awareness of all 500+ universities involved in the project
The last point is interesting. It is my view that exercises of this nature have put performance evaluation on the map for many institutions. Regardless of our own work, inspiring institutions to pursue performance evaluation will lead to the inevitable: performance enhancement. Indirectly, our work makes universities better.

15 International Study Trends
Worldwide, the demand for international education is forecast to increase rapidly
In 2003, an estimated 3.1 million students studied internationally
There are likely to be in excess of 5.8 million international students in 2010
Two-thirds of this global demand will come from Asia
US domination of international student recruitment has begun to falter: tighter visas, increased competition (IIE, NAFSA, CCE)
The Bologna Accord will create an 'explosion in English-language based postgraduate courses in Europe'
It is no coincidence that two independently conceived and developed rankings of world universities appeared within 12 months of each other; counting the Webometrics online ranking and the Taiwan study actually makes four. An environment of global readiness appears to be emerging, with any university that hopes to be successful looking to globalise. These few slides are designed to contextualise the emergence of rankings; the presenter may have their own statistics, anecdotes or examples that reinforce this aspect of the presentation.
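The forecast above spans seven years, so it is worth knowing what annual growth rate it implies. A quick sketch, using only the two figures quoted on this slide (3.1 million in 2003, 5.8 million forecast for 2010):

```python
# Implied compound annual growth rate (CAGR) of international student
# numbers, from the figures on this slide: 3.1M in 2003, 5.8M in 2010.
students_2003 = 3.1e6
students_2010 = 5.8e6
years = 2010 - 2003

cagr = (students_2010 / students_2003) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # roughly 9% per year
```

So "increase rapidly" here means sustained growth of around nine per cent a year, nearly doubling the market in seven years.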

16 Student Mobility on the Rise
These are the top 10 most popular destinations for international students. Source: UNESCO Global Education Digest.
Note the plateau in US results post-9/11: tighter visa demands and security for entry to the US have left many successful applicants to US institutions unable to take up their places.

17 International Study Trends
Government funding of tertiary education is being cut (per capita) in most countries around the world (IFC)
Many governments are targeting international students as a source of tertiary-sector funding
Australia has estimated that international students are more important to its economy than manufacturing & mining
The UK Government estimates that 270,000 students contribute $3 billion in fees and a further $3 billion in other spending
IFC = International Finance Corporation, part of the World Bank Group. In relation to the first point, you could talk about government targets: the UK government aims to send 50% of its 18-year-old cohort through university, resulting in more students to divide the money between.

18 International Study Trends
By 2010: the European Commission's Bologna Accord will make room for over 500,000 first-degree graduates to study in other EU nations for a Masters degree
By 2016: in the USA, recent Abraham Lincoln Study Abroad legislation aims to encourage 1 million students to study overseas
By 2020: over 3 million Asians are expected to study outside their home country (Source: IDP / British Council)
It is my view that Bologna is not only influencing Europe but also opening applicants' minds to the idea of studying overseas. Once that is done: why not the US, why not Australia, why not Indonesia?

19 Tony Blair, Labour Party Conference, September 2006
"In 10 years we will think nothing of students going off to study anywhere in the world."
This is remarkable because the UK is a conventional destination market. Yet even there, the requirement for an international resumé at or before graduation is becoming increasingly fierce.

20 Session Flow
Sections: QS, The Context, Available Global Rankings, The Methodology, The Future, The Results

21 Academic Ranking of World Universities
Operated by Shanghai Jiao Tong University in China
Focuses entirely on research factors: Nobel Prizes & Fields Medals, citations, highly cited researchers (HiCis)
Strongly correlated with educational reputation
Historical dependency
Available on
I pick holes in each of these, and in ours as well. It is important to show humility: none of these systems is the right answer. We are striving for improvement all the time and need their help to achieve it. The world's best research universities are also those most desirable for education, with some exceptions (Oxbridge, Ivy League, NUS, ANU, SNU etc.).

22 Shanghai Jiao Tong Criteria
Quality of Education: alumni of an institution winning Nobel Prizes and Fields Medals (10%)
Quality of Faculty: staff of an institution winning Nobel Prizes and Fields Medals (20%); highly cited researchers in 21 broad subject categories
Research Output: articles published in Nature and Science*; articles in Science Citation Index-expanded and Social Science Citation Index
Size of Institution: academic performance with respect to the size of the institution
Nobel Prizes are not given out in great number, so this measure reaches back to 1954; that yields only around 350 statistical observations (roughly 50 years x 6 pertinent awards per year) by which Nian Cai Liu at SJTU claims to evaluate over 2,000 universities... and then he counts them twice. These particular indicators drop off very sharply after the top 40 or so institutions, which is part of the reason why they don't explicitly rank institutions beyond 100.

23 Performance Ranking of Scientific Papers for World Universities
Launched in 2007
Operated by HEEACT (Higher Education Evaluation & Accreditation Council of Taiwan)
Strong scientific research focus
Attempts to present a more contemporary picture than Shanghai
Results and methodology available at
This is a counterpoint to Shanghai: HEEACT is a government organisation, and they have designed a similar approach while avoiding the Nobel Prize pitfalls.

24 Performance Ranking of Scientific Papers for World Universities
Research productivity (20%): number of articles in the last 11 years ( ); number of articles in the current year (2006)
Research impact (30%): number of citations in the last 11 years ( ); number of citations in the last 2 years ( ); average number of citations per year in the last 11 years ( )
Research excellence (50%): H-index of the last 2 years ( ); number of highly cited papers ( ); number of articles in high-impact journals in the current year (2006); number of subject fields where the university demonstrates excellence ( )
The title is pretty explicit about what this does, but note the contrast between short- and medium-term achievements: this is an attempt to recognise strength in depth while also allowing more innovative but younger beacons of excellence to shine.

25 Webometrics Ranking of World Universities
Focuses on web profile: research productivity using Google Scholar, search engine performance, site popularity, good practices in web management
Fully automated; tracks in the realm of 14,000 institutions
Prone to anomaly: a university domain change can be disastrous
Available on
Not really looking at university quality, but at a university's ability to project that quality online. Being fully automated gives it great depth of coverage, but something as simple as changing a domain name can be disastrous.

26 Webometrics Criteria
Size (20%): number of pages recovered from four engines: Google, Yahoo, Live Search and Exalead.
Rich Files (15%): after evaluating relevance to academic and publication activities, and considering the volume of the different file formats, the following were selected: Adobe Acrobat (.pdf), Adobe PostScript (.ps), Microsoft Word (.doc) and Microsoft PowerPoint (.ppt). These data were extracted using Google, Yahoo Search, Live Search and Exalead.
Scholar: Google Scholar provides the number of papers and citations for each academic domain. These results from the Scholar database represent papers, reports and other academic items.
Visibility (50%): the total number of unique external links received (inlinks) by a site, which can only be confidently obtained from Yahoo Search, Live Search and Exalead.
Draw attention to the Visibility metric. This is what may lead us to suggest Webometrics as a barometer for future success in our own Peer Review: improvements in visibility precede improvements in reputation.

27 World University Rankings
That leaves funny old us!

28 THE - QS World University Rankings
Began in 2004: a collaboration between Times Higher Education (THE) & QS
THE running since 1971: a weekly magazine distributed to academics in the UK and internationally; formerly associated with The Times
Published annually in the fall
THE is not The Times; people seem to forget that.

29 Session Flow
Sections: QS, The Context, Available Global Rankings, The Methodology, The Future, The Results

30 Our Approach: World Class University?
Four pillars: Research Quality, Graduate Employability, International Outlook, Teaching Quality
I often describe this as the most important slide in my presentation; it's the last one that people don't argue with too much. In response to Shanghai we felt that a university is more than just a research institute, and wanted to recognise universities as multi-faceted institutions. In particular, we identified these four pillars as those upon which an institution may be able to develop a world class university. That phrase is a source of much debate: I recommend you Google it and do some reading for added context if you have time. This list of four is not exhaustive; the notion of "third mission" is not covered. Third mission relates to social responsibility, community investment, knowledge transfer, cultural preservation and any number of soft and fluffy things a university may consider important. Very difficult to measure, of course.

31 Ranking Criteria & Weights
Peer Review (40%): composite score drawn from the academic peer review, which is divided into five subject areas. 6,354 responses.
Recruiter Review (10%): score based on responses to the recruiter survey. 2,339 responses.
Citations per Faculty (20%): score based on research performance factored against the size of the research body.
Student Faculty Ratio (20%): score based on the student-to-faculty ratio.
International Faculty (5%): score based on the proportion of international faculty.
International Students (5%): score based on the proportion of international students.
You'll need to run this slide in slide-show mode to make sense of it; more detail on each indicator is provided on further slides. You may want to draw attention to the fact that our system is based 50% on data collected via surveys and 50% on factual data collected from institutions and central statistics bodies like HESA (Higher Education Statistics Agency) in the UK and NCES (National Center for Education Statistics) in the US. Even if we add indicators within each half of the system in the future, we would not look to upset this balance. Any results from a student survey, for example, would come out of the survey half of the pie, probably the peer review.
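The overall score is a weighted sum of the six indicators listed above. A minimal sketch of that arithmetic, using the weights from this slide; the 0-100 indicator scores and the assumption that each indicator is already scaled to 0-100 are illustrative only, since QS's actual normalisation is not described here:

```python
# Weighted composite ranking score from the six indicators on this slide.
# Weights are from the slide; the example indicator scores are invented.
WEIGHTS = {
    "peer_review": 0.40,
    "recruiter_review": 0.10,
    "citations_per_faculty": 0.20,
    "student_faculty_ratio": 0.20,
    "intl_faculty": 0.05,
    "intl_students": 0.05,
}

def composite_score(indicator_scores: dict) -> float:
    """Weighted sum of per-indicator scores (each assumed scaled 0-100)."""
    return sum(WEIGHTS[name] * indicator_scores[name] for name in WEIGHTS)

example = {
    "peer_review": 100.0,
    "recruiter_review": 90.0,
    "citations_per_faculty": 80.0,
    "student_faculty_ratio": 70.0,
    "intl_faculty": 60.0,
    "intl_students": 50.0,
}
print(composite_score(example))  # 84.5
```

Note how the survey indicators (peer review plus recruiter review) and the factual indicators each contribute exactly half of the total weight, matching the 50/50 balance described in the notes.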

32 Ranking Criteria & Weights
Indicators: Peer Review, Citations per Faculty, Recruiter Review, International Faculty, International Students, Student Faculty Ratio
All decisions regarding the allocation of weightings remain the responsibility of Times Higher Education
Importance of criteria is offset by the appropriateness of the indicators
This slide plots the indicators back against the four "pillars" discussed two slides ago, and it shows considerable inequities. From this you could draw the conclusion that THE (and it is important to note that they have responsibility for the allocation of weightings, as this distances us from arguably the most subjective part of the exercise) feel Research Quality is three times more important than Teaching Quality in the pursuit of becoming a "World Class University". [Pause, wait for any reaction.] Which is, of course, ridiculous. Allocation of weightings is more complicated than a subjective assessment of the importance of each pillar; it must also take into account the appropriateness of each indicator for evaluating that pillar. We have developed an "Applicant, Student & Alumni Survey" designed to dovetail with the results of existing surveys in the UK, Australia and the US; it is hoped that in 2010 or 2011 this may yield some indicators on teaching quality and student satisfaction. In general, no one believes that Student Faculty Ratio is anything more than a proxy for Teaching Quality. It says little about the genuine experience a student has in the classroom, although it may say more about access to faculty, which is perhaps one aspect of academic support for students.

33 Selection of Initial List
Began in 2004 with a list of the top 500 world institutions by research impact
Excludes single-faculty institutions
Excludes postgraduate-only institutions
Institutions added on recommendation from certain countries where English publication culture is not strong (e.g. Germany)
Each year the list is re-evaluated
Omitted institutions are welcome to make a case for inclusion, with supporting evidence relative to an included institution
604 universities considered in 2008
This is essentially a main FAQ: institutions tend to want to know how they can get in, or why they are not currently in. The most important observation here is the penultimate one: we don't claim that our list is absolute. Institutions are invited to make a case for inclusion, usually by comparing themselves to a selection of universities already considered in the ranking; the easiest way to do this is to refer to a domestic ranking. If there is no such exercise, then we would have to look more deeply at their potential performance, perhaps starting with their scientific outputs. When a new institution is introduced to the list (and there have been a number over the years, e.g. Essex, Aston, UEA, Royal Holloway, Griffith, Deakin, James Cook, Mahidol), it takes three years for it to reach its appropriate position in the rankings, as it needs three years of survey responses to get there.

34 Sir Richard Sykes, Former Rector, Imperial College
"Peer review is an effective way to evaluate universities. It takes smart people to recognise smart people."
A quote that inspired the focus on Peer Review; no need to dwell on it. In reality, our academic survey does not really qualify as peer review in the academic sense, and occasionally that observation has been put forward. We do the best we can to provide an approximation of peer review which is deliverable in so many countries worldwide.

35 Peer Review of Research Output
Centrepiece of this ranking
6,354 respondents (3,069 new or updated responses in 2007); results weighted for international spread
Subject spread: Arts & Humanities; Engineering & IT; Life Sciences & Biomedicine; Natural Sciences; Social Sciences
Overall score built up from each faculty area
Active academics; a respondent's latest response within 3 years counts
2008 top responding countries: United States 638, United Kingdom 563, Australia 286, Italy 277, Canada 239, India 236, Indonesia 228, Philippines 201, Germany 182, Malaysia 180. China: 116 responses, placing it 16th; we are investigating further survey translations, which may help improve this number.
The Peer Review is conducted in the five subject areas and these are then brought together with EQUAL WEIGHTING. Given the number of Indonesian institutions featured in the WUR, the response level we get to this survey from Indonesia is very good. It may be worth noting that our Peer Review respondents identify an average of 20 institutions: 20 x 6,354 is in excess of 120,000 observations with which we evaluate just 600 institutions. While each individual observation may not be as profound, the statistical appropriateness makes a lot more sense than using the 350 Nobel Prize observations as SJTU does. Potential question: is this response level enough? No, it isn't. It is our aim to raise it to 16,000 or 26,000; at that point we may be able to apply weightings at a more granular level, ensuring better representativeness. We are working on a number of ways to improve this, such as translating the survey into more languages (currently only English & Spanish).

36 Peer Review – Selection of Peers
3-year "latest response" rule; invitations to previous respondents
World Scientific database (180,000 contacts), headquartered in Singapore
Mardev database (12,000+ contacts), part of Reed Elsevier, UK headquarters
Academic sign-up facility
The academic sign-up facility is still not complete. The idea is that we will have an area where the very people you are talking to can volunteer to become peer reviewers. The catch is that respondents are not able to select their own university, so in this context we have to verify the identity of each respondent before allowing them to participate.

37 Who are the peers?
This is a breakdown of just the 2008 response, not the full 3 years. We had 17 vice-chancellors/presidents respond, and over 75% of respondents have a watertight frame of reference for understanding the quality of other institutions in their field. Do we know what makes up the "Other" category? No, we have not done that analysis.

38 Employer Review
Another group who have knowledge of university quality
Key relevance to the "graduate employability" criterion
Introduced for the 2005 rankings
Respondents sourced from QS databases, media partners and university referrals (over 150 lists received for 2008)
2,339 responses used for 2008
2008 top responding countries: United States 346, United Kingdom 269, Australia 178, Mexico 75, Netherlands & Singapore 74, Russia 69, India 64, Argentina 60, Greece 59. China: 25 responses, placing it 26th; we are investigating further survey translations, which may help improve this number.
Stress the importance of supplying employer lists. This works in a similar way to the Peer Review, but in only one "subject area": employers are asked to identify the universities they consider to produce the best graduates, based on their experience. Note that Indonesia does not contribute as effectively here as in the Peer Review. Participating universities are invited to submit lists of employers with whom they work; these lists need to include contact details. If Indonesian universities did this in greater numbers, the employer review would yield a greater response from Indonesia.

39 Who are the employers?
Breakdown of employer responses by sector. You could make some joke about the likelihood of the Financial Services / Banking sector yielding as much response in 2009.

40 Quantitative Measures
Aim to measure universities in terms of:
Student commitment: Student Faculty Ratio (the "classic measure")
Research commitment: Citations per Faculty
International commitment and competitiveness: international students and faculty. Is this somewhere that people want to be?
Quantitative measures support the softer survey side.

41 Citations per Faculty Score
Classic measure of research quality
Citations per staff member (not per paper)
5-year window (changed from 10 in 2006 to increase "topicality")
Source: Scopus
We switched to Scopus, operated by Elsevier, in 2007. We were previously using the Essential Science Indicators (ESI), a truncated subset of the Web of Science (also referred to as ISI, and the conventional market leader), operated by Thomson Reuters. Reasons for the switch: (1) the relationship had always been a challenge with Thomson and is very collaborative with Elsevier; (2) Scopus has much broader journal coverage, about 15,000 journals as opposed to around 8,000, which makes it less good at discerning differences between highly ranked universities but better at digging further down the scale; (3) as a result, we were able to derive data in 2007 for 127 institutions for which ESI was unforthcoming; (4) Scopus has better coverage of non-English-language publications (as long as abstracts are in English).
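A minimal sketch of the indicator as this slide describes it: total citations over a 5-year window divided by faculty headcount (per staff member, not per paper). The function name and all figures below are invented for illustration; they are not QS data.

```python
# Citations-per-faculty indicator: citations accumulated over a fixed
# year window, divided by the number of faculty (not by paper count).
def citations_per_faculty(citations_by_year, faculty_count, window):
    total = sum(citations_by_year.get(year, 0) for year in window)
    return total / faculty_count

# Hypothetical institution: 50,000 citations over 2004-2008, 2,000 faculty.
cites = {2004: 8000, 2005: 9000, 2006: 11000, 2007: 12000, 2008: 10000}
score = citations_per_faculty(cites, faculty_count=2000, window=range(2004, 2009))
print(score)  # 25.0
```

Dividing by headcount rather than paper count is what makes this a measure of research intensity per staff member: a small institution with few but heavily cited staff can outscore a much larger one.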

42 2008 Indicator Correlations (previous year's values in parentheses)
Peer Review / Recruiter Review 0.68 (0.59) [large]
Peer Review / Faculty Student 0.30 (0.32) [medium]
Peer Review / Citations per Faculty 0.56 (0.45)
Peer Review / Int'l Faculty 0.30 (0.28)
Peer Review / Int'l Students 0.31 (0.35)
Recruiter Review / Faculty Student 0.30 (0.27)
Recruiter Review / Citations per Faculty 0.28 (0.12) [small]
Recruiter Review / Int'l Faculty 0.43 (0.33)
Recruiter Review / Int'l Students 0.42 (0.37)
Faculty Student / Citations per Faculty 0.24 (0.21)
Faculty Student / Int'l Faculty 0.14 (0.15)
Faculty Student / Int'l Students 0.20 (0.23)
Citations per Faculty / Int'l Faculty 0.21 (0.15)
Citations per Faculty / Int'l Students 0.18 (0.19)
Int'l Faculty / Int'l Students 0.67 (0.66)
This slide is hidden and won't appear in the presentation (unless you want it to). It is for the geeks, really, and displays the increasing intuitive strength of our measures: correlation between related measures is getting stronger.

43 World University Rankings - Issues?
The ranking has limitations. Lack of data on: teaching quality, student satisfaction, investment in infrastructure
The ranking relies on the best comparable data available
I usually promise, when picking holes in the other ranking systems, that I will do the same on ours, and this is lip service to that. We have covered matters of teaching quality earlier, and those points can be reiterated. Student satisfaction is also not covered, but our survey may, in time, put paid to that. Economic factors are difficult to apply globally: not only are there currency-conversion issues but also social, cultural and legislative factors. Many universities in Germany and Scandinavia are free, even to international students; in the US it is very common to give back to your university in the form of alumni donations, even if you are not that well off. All in all, this makes economic factors very difficult to use. On the second point, we are continually pushing the envelope, asking for more data and seeing where it might get us, and as a trend we are finding more and more universities able to supply it: another objectively positive outcome of the work.

44 Session Flow
Sections: QS, The Context, Available Global Rankings, The Methodology, The Future

45 Online Database
Submit a broad range of quantitative and qualitative data (including those we need for the rankings) about your institution using our online interface
Populate and enrich your profile
Educate visitors on other aspects of your institution
We want to encourage universities themselves to populate their profiles on our website. The pitch is that the website enables institutions to describe areas of strength that are not easily whittled down to a number. We have three levels of profile: any university or HEI (Higher Education Institution) in the world is entitled to a Basic profile for free; Intermediate profiles are free for QS clients of products at the relevant level; and Advanced profiles are available for direct purchase. This needn't be mentioned during the presentation (it's not a pitch), but you have an answer if the question is asked. The important fact is that everyone gets the Basic profile. If you're curious as to why that is free: it strengthens the whole database, making the advanced levels easier to sell; it offers zero-risk engagement with potentially tentative clients; and it gives us a lot of sustainably updated information, which is great for online marketing and helps to drive the advertising potential.

46 Recruiter Review
Supply us with lists of employer contacts for the recruiter review
Exclusive, single use only
Benefit your institution in the recruiter review
Improve the overall strength of the recruiter review

47 Current & Future Developments
Greater transparency: recruiter review breakdown; more detail on the methodology; preview access to the recruiter survey
More relevance: subject focuses; personalised rankings
Better response: academic sign-up facility; more partners
New analyses: developing world; national system strength; Asian University Rankings
New indicators? Global student and graduate survey
Related ideas: star ratings; innovation showcase; engagement database; entry requirements translator; PhD matching engine

48 Session Flow (BREAK)
QS, The Context, Available Global Rankings, The Methodology, The Future, The Results
Suggest a break here
Copyright © 2008 QS Intelligence Unit

49 Howard Davies Director, London School of Economics
I imagine that all university heads broadly share my own view of these [league] tables. They are terrific and unquestioned when you score well and better than last time. They are fatally flawed and fundamentally unfair when you move in the opposite direction. So true. Universities pick and choose the results to present. Howard Davies Director, London School of Economics Copyright © 2008 QS Intelligence Unit

50 Comparing Results
THE–QS: 1 Harvard, 2 Yale, 3 Cambridge, 4 Oxford, 5 Caltech, 6 Imperial, 7 UCL, 8 Chicago, 9 MIT, 10 Columbia
SHANGHAI: 1 Harvard, 2 Stanford, 3 Berkeley, 4 Cambridge, 5 MIT, 6 Caltech, 7 Columbia, 8 Princeton, 9 Chicago, 10 Oxford
HEEACT: 1 Harvard, 2 Johns Hopkins, 3 Washington, 4 Stanford, 5 UCLA, 6 Michigan, 7 Berkeley, 8 UCSD, 9 Columbia, 10 MIT
WEBOMETRICS: 1 MIT, 2 Harvard, 3 Stanford, 4 Berkeley, 5 Penn State, 6 Michigan, 7 Cornell, 8 Minnesota, 9 UW Madison, 10 UT Austin
These are the top 10 in the four systems talked about earlier. When I went to get the new results for SJTU I had to make no changes to their list here. Their exercise is extremely static: the only easy way to influence it is to spend an obscene amount of money hiring Nobel Prize winners or highly cited authors. The top 27 or so in Webometrics are US institutions; the top 15 in HEEACT are US institutions. Ours is the most geographically diverse but still concludes that the majority of the world's best universities are to be found in the US: 58 of our top 200 are US institutions.
Copyright © 2008 QS Intelligence Unit
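How far these four top-10 lists diverge can be quantified with a simple set-overlap check. A minimal sketch follows: the institution lists are taken from the slide above, but the overlap metric itself is our own illustration, not something QS publishes.

```python
from itertools import combinations

# Top-10 lists as shown on the slide (abbreviated names).
the_qs = {"Harvard", "Yale", "Cambridge", "Oxford", "Caltech",
          "Imperial", "UCL", "Chicago", "MIT", "Columbia"}
shanghai = {"Harvard", "Stanford", "Berkeley", "Cambridge", "MIT",
            "Caltech", "Columbia", "Princeton", "Chicago", "Oxford"}
heeact = {"Harvard", "Johns Hopkins", "Washington", "Stanford", "UCLA",
          "Michigan", "Berkeley", "UCSD", "Columbia", "MIT"}
webometrics = {"MIT", "Harvard", "Stanford", "Berkeley", "Penn State",
               "Michigan", "Cornell", "Minnesota", "UW Madison", "UT Austin"}

rankings = {"THE-QS": the_qs, "Shanghai": shanghai,
            "HEEACT": heeact, "Webometrics": webometrics}

def overlap(a: set, b: set) -> int:
    """Number of institutions two top-10 lists share."""
    return len(a & b)

# Report every pairwise overlap between the four systems.
for (name_a, set_a), (name_b, set_b) in combinations(rankings.items(), 2):
    print(f"{name_a} vs {name_b}: {overlap(set_a, set_b)} shared")
```

By this measure THE–QS and Shanghai agree on 7 of 10 institutions, while THE–QS and Webometrics share only 2, which underlines how differently the systems weigh their evidence.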

51 Key Findings
The top 200 in 2008 includes universities from 33 countries, up from 28 in 2007
US, UK, Canada, Australia, Netherlands
Korea, China, Japan, Singapore
Thailand, Malaysia
Continental Europe
Developing world small but improving (1 in 2004, 2 in 2005, only UNAM in 2006, 3 in 2007, 5 in 2008)
In cars or computers, the top 200 organisations would probably come from 5 or 10 countries. This shows that universities really are an international industry in which quality can be measured by simple and robust means. The US, UK, Canada, Australia and the Netherlands lead the pack; Korea, China, Japan and Singapore lead the Asian assault on the top 200, with Thailand and Malaysia less prominent. Continental Europe does worse than might be expected: we think this is largely because much of the research takes place in state-funded research institutes outside the university sector, and while those undertaking the research are often also employed by a university, it is impossible to attribute that research back to the university in question.
Copyright © 2008 QS Intelligence Unit

52 Where are the top universities?
[Table: counts of top-ranked institutions by region (N. America, Europe, Asia/Australia) and country (USA, UK, Australia, Canada) for 2005–2008; column alignment was lost in transcription. Raw values as extracted: N. America 14 11 12 23 25 42 43 36 35; Europe 4 5 13 41 37; Asia/Australia 2 3 15 22 28; USA 20 33 31; UK 8 17 19; Australia 1 6 7; Canada.]
Take some time to figure out what this slide is saying or you may be caught out, e.g.:
14 of the top 20 institutions are in North America (13 in the US and 1 in Canada – McGill)
There were 21 institutions in the top 20 in 2007 because there was a 20=
North American dominance tails off as you move down to the Top 50 and the Top 100
Our ranking has been criticised for being too generous to Australian institutions, yet you can see that this is correcting itself (as surveys gradually gather a more representative response and some of the response bias is counteracted)
Copyright © 2008 QS Intelligence Unit

53 Consistency of Results
[Table: top 10 institutions in each year 2004–2008. Institutions appearing: Harvard, Yale, Cambridge, Oxford, Caltech, Imperial, UCL, Chicago, MIT, Columbia, and in earlier years Berkeley, Stanford, Princeton, Ecole Polytechnique and ETH Zurich; the year-by-year positions did not survive transcription.]
We are pretty happy with the consistency of results in the top 10 over the five years, but there were concerns over the volatility of our ranking further down. In 2007 we introduced a bank of changes designed to stabilise our results, and we can begin to see them working: the average change in position in the top 500 between 2006 and 2007 was 33 places; between 2007 and 2008 it was just 19.
Copyright © 2008 QS Intelligence Unit
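The volatility figure quoted in the notes (average change in position) amounts to the mean absolute difference in rank between two years. A minimal sketch, using made-up example data since the underlying QS rank tables are not reproduced here:

```python
def average_rank_change(ranks_year1: dict, ranks_year2: dict) -> float:
    """Mean absolute change in position for institutions ranked in both years."""
    common = ranks_year1.keys() & ranks_year2.keys()
    if not common:
        return 0.0
    return sum(abs(ranks_year1[u] - ranks_year2[u]) for u in common) / len(common)

# Hypothetical mini-example (not real QS data):
r2007 = {"Uni A": 1, "Uni B": 2, "Uni C": 3, "Uni D": 10}
r2008 = {"Uni A": 1, "Uni B": 4, "Uni C": 2, "Uni D": 30}
print(average_rank_change(r2007, r2008))  # (0 + 2 + 1 + 20) / 4 = 5.75
```

On this measure, the drop QS reports (33 places to 19) simply means institutions in the top 500 moved less, on average, between 2007 and 2008 than between 2006 and 2007.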

54 Global Top 20
Columns on the slide: 2008 rank, 2007 rank, institution, country, and scores for Peer Review, Recruiter Review, Faculty/Student, Citations per Faculty, Int'l Faculty, Int'l Students and Overall. The per-indicator scores did not survive transcription; the 2008 order was:
1 Harvard University (United States); 2 Yale University (United States); 3 University of Cambridge (United Kingdom); 4 University of Oxford (United Kingdom); 5 California Institute of Technology (United States); 6 Imperial College London (United Kingdom); 7 UCL (University College London) (United Kingdom); 8 University of Chicago (United States); 9 Massachusetts Institute of Technology (United States); 10 Columbia University (United States); 11 University of Pennsylvania (United States); 12 Princeton University (United States); 13= Duke University (United States); 13= Johns Hopkins University (United States); 15 Cornell University (United States); 16 Australian National University (Australia); 17 Stanford University (United States); 18 University of Michigan (United States); 19 University of Tokyo (Japan); 20 McGill University (Canada)
Hide this slide if you want. There is a lot of information on just one slide and it is so far out of the Indonesian context as not to matter much.
Copyright © 2008 QS Intelligence Unit

55 Indonesia Results
THE–QS: 1 University of Indonesia (287), 2 Bandung Institute of Technology (315), 3 Universitas Gadjah Mada (316=), 4 Airlangga University (501+), 5 Bogor Agricultural University (501+), 6 University of Brawijaya (501+), 7 Diponegoro University (501+)
Webometrics: 1 Universitas Gadjah Mada (623), 2 Bandung Institute of Technology (676), 3 University of Indonesia (906), 4 Gunadarma University (1604), 5 Institut Teknologi Sepuluh Nopember (1762), 6 Sekolah Tinggi Teknologi Telkom (1960), 7 Petra Christian University (2013), 8 Bogor Agricultural University (2063), 9 University of Brawijaya (2152), 10 Sebelas Maret University (2159), 11 Airlangga University (2672), 12 Universitas Padjadjaran (2730), 13 Electronic Engineering Polytechnic Institute of Surabaya (3016), 14 Bina Nusantara University (3026), 15 Diponegoro University (3138)
Indonesia is absent from the SJTU ranking owing to very low publication output in journals covered by Web of Science. The top three institutions all improved considerably between 2007 and 2008, with the University of Indonesia gaining over 100 places from 395 in 2007. The Webometrics ranking agrees on the top three, although not in the same order. Beyond that, other universities appear that are not in our assessment; this may indicate some institutions we should include, but then their positions would perhaps suggest that many of them oughtn't feature in the top 600 or so anyway. And if we took the Webometrics ranking too seriously, our exercise might look very different.
Copyright © 2008 QS Intelligence Unit

56 Trend Analysis by Indicator – Indonesia
This slide makes more sense in slideshow mode, where it runs through one indicator at a time. The solid lines represent Indonesia's average position for each indicator and the dashed lines represent the average for Asia. Averages only include institutions that have been involved since the outset. Key observations: Indonesia's key weaknesses relative to the Asian averages are in Citations per Faculty and the international criteria; in the other areas it performs close to the average. Overall, Indonesia has hovered around the 400 mark, coming closer to the Asian average in 2008 than in 2007.
Copyright © 2008 QS Intelligence Unit

57 Faculty Level Data
Peer opinion
Citations per paper
Not aggregated
THE publish citations per paper alongside the faculty area rankings, but not as an indicator that influences the ranking positions
No citations in Arts and Humanities
Not citations per person, as no staff numbers are available by faculty area
Copyright © 2008 QS Intelligence Unit

58 2008 Faculty Level
Columns: Arts & Humanities, Engineering & IT, Life Sciences & Biomedicine, Natural Sciences, Social Sciences. [Table: top institutions in each faculty area; the column assignments did not survive transcription. Institutions shown included Harvard, Berkeley, Oxford, Cambridge, Yale, Princeton, Columbia, Stanford, Chicago, UCL, Toronto, ANU, McGill, Cornell, NYU, MIT, Caltech, Carnegie Mellon, Imperial, Georgia Tech, Tokyo, NUS, Tsinghua, ETH Zurich, Johns Hopkins, UC San Diego, UCLA, British Columbia, Kyoto and LSE.]
Looking at the faculty level results at a world level reveals that the Academic Peer Review does respond well to intuitive faculty strengths: MIT tops the Engineering & IT area, LSE appears highly in Social Sciences and Johns Hopkins in the Life Sciences field.
Copyright © 2008 QS Quacquarelli Symonds Limited

59 2008 Faculty Level – Indonesia
Arts & Humanities: Indonesia (173), Gadjah Mada (178=), Bandung ( ), Airlangga ( ), Diponegoro ( ), Bogor ( ), Brawijaya (501+)
Engineering & IT: Bandung (90), Indonesia (206), Gadjah Mada (234), Diponegoro ( ), Bogor ( ), Airlangga ( ), Brawijaya (501+)
Life Sciences & Biomedicine: Gadjah Mada (106), Indonesia (207), Bandung (210), Airlangga ( ), Diponegoro ( ), Brawijaya ( ), Bogor (501+)
Natural Sciences: Bandung (143), Gadjah Mada (220), Indonesia ( ), Diponegoro (500+), Airlangga (501+)
Social Sciences: Indonesia (131), Gadjah Mada (167), Bandung ( )
(Empty parentheses are ranking-band figures that did not survive transcription.)
Faculty level analysis shows that some Indonesian institutions do well in specific areas. Bandung's top 100 position in Engineering & IT, achieved for the first time in 2008, is particularly impressive.
Copyright © 2008 QS Quacquarelli Symonds Limited

60 Trend Analysis by Faculty – Indonesia
This slide works like the indicator trend slide but covers the five faculty areas; again, solid lines are Indonesia and dashed lines are the Asian regional average.
Copyright © 2008 QS Intelligence Unit

61 System Strength
Columns: Rank, Country, System, Access, Flagship, Economic, Overall (individual scores were largely lost in transcription):
1 United States; 2 United Kingdom; 3 Australia; 4 Germany; 5 Canada; 6 Japan; 7 France; 8 Netherlands; 9 South Korea; 10 Sweden; 11 Switzerland; 12 Italy; 13 Belgium; 14 New Zealand; 15 China; 16 Hong Kong; 17 Ireland
32 Indonesia – 35, 38, 54, 84, 53 (System, Access, Flagship, Economic, Overall)
Lisbon Council study (17 mainly European systems): Australia, United Kingdom, Denmark, Finland, USA, Sweden, Ireland, Portugal, Italy, France, Poland, Hungary, Netherlands, Switzerland, Germany, Austria, Spain
Animations will reveal the Lisbon results and then Indonesia. In 2008, for the first time, QS conducted additional analysis looking at the strength of overall national systems. This combines information from the rankings with national data such as population and GDP per capita, sourced from the CIA World Factbook. There is a description of how this system works online. There has been a comparable study of 17, mainly European, national systems compiled by the Lisbon Council.
Copyright © 2008 QS Intelligence Unit
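The notes describe the system-strength score as rankings information combined with national data, but the slide does not give the formula. The Indonesia row happens to be consistent with a plain average of the four pillar scores, so here is a hedged sketch of that reading; the function name and the equal weighting are our own assumptions, not QS's published method.

```python
def system_strength_overall(system: float, access: float,
                            flagship: float, economic: float,
                            weights=(0.25, 0.25, 0.25, 0.25)) -> float:
    """Weighted combination of the four pillar scores into an overall score.

    Equal weights are an assumption for illustration only; QS's actual
    weighting is not stated on the slide.
    """
    pillars = (system, access, flagship, economic)
    return sum(w * p for w, p in zip(weights, pillars))

# Indonesia's pillar scores from the slide: 35, 38, 54, 84 (overall shown: 53).
print(round(system_strength_overall(35, 38, 54, 84)))  # 53
```

That the equal-weight average reproduces Indonesia's published overall (52.75, rounding to 53) is suggestive but not proof: a single row cannot distinguish between many candidate weightings.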

62 The Results
Available at
Top 100 for each of the following subject areas:
Arts & Humanities
Engineering & IT
Life Sciences & Biomedicine
Natural Sciences
Social Sciences
World's oldest universities
Year-on-year results
Copyright © 2008 QS Intelligence Unit

63 Thank You Andrew King
Presentation slides available on
