
1 MEASURING ACADEMIC RESEARCH IN CANADA: ALEX USHER HIGHER EDUCATION STRATEGY ASSOCIATES IREG-7 Warsaw, Poland – May 17, 2013

2 The Problem When making institutional comparisons, biases arise both from institutional size and from the distribution of fields of study. Can we find a way to compare institutional research output that controls for both size and field of study?

3 YES

4 Basic methodology
- Simple two-indicator system: publications (H-index) and research income (granting councils)
- Data gathered at the level of the individual researcher, not the institution
- Every researcher is given a score for his/her performance relative to the average of his/her discipline. Scores are then summed and averaged.
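The scoring logic above can be sketched in a few lines. This is a minimal illustration with made-up institutions, disciplines, and values; it is not HESA's actual code, only the normalize-then-average idea the slide describes:

```python
from collections import defaultdict

# Hypothetical input: one record per researcher
# (institution, discipline, raw indicator value, e.g. H-index or grant income).
researchers = [
    ("Alpha U", "physics", 12.0),
    ("Alpha U", "history", 3.0),
    ("Beta U",  "physics", 8.0),
    ("Beta U",  "history", 1.0),
]

# 1. Compute each discipline's average across all institutions.
by_field = defaultdict(list)
for _, field, value in researchers:
    by_field[field].append(value)
field_mean = {f: sum(v) / len(v) for f, v in by_field.items()}

# 2. Score each researcher relative to his/her discipline's average.
by_inst = defaultdict(list)
for inst, field, value in researchers:
    by_inst[inst].append(value / field_mean[field])

# 3. Institutional score = average of its researchers' normalized scores.
inst_score = {inst: sum(s) / len(s) for inst, s in by_inst.items()}
```

Because every researcher is scored against his/her own discipline's mean, a strong historian and a strong physicist contribute equally, which is what controls for field-of-study mix.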

5 Publication Metric: H-Index
"A scientist has index h if h of his/her Np papers have at least h citations each, and the other (Np − h) papers have no more than h citations each." (i.e., the largest number N such that the scientist has N papers with N or more citations)
Ex. 1: Publication 1: 5 citations; Publication 2: 4 citations; Publication 3: 3 citations; Publication 4: 2 citations. H-index: 3
Ex. 2: Publication 1: 10 citations; Publication 2: 2 citations; Publication 3: 2 citations; Publication 4: 2 citations. H-index: 2
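The definition amounts to a short computation: sort a researcher's papers by citation count and find the last position where the count still matches the position. A minimal sketch (function name and structure are illustrative):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(cites, start=1):  # i-th most-cited paper
        if c >= i:
            h = i
        else:
            break
    return h

print(h_index([5, 4, 3, 2]))   # Ex. 1 -> 3
print(h_index([10, 2, 2, 2]))  # Ex. 2 -> 2
```

Note how Ex. 2 illustrates the "discounting" behaviour discussed on the next slide: one heavily cited paper cannot raise the index past the depth of the rest of the publication list.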

6 H-Index (pros and cons)
Pros:
- Discounts publications with little or no impact
- Discounts sole publications with very high impact
Cons:
- Requires a large, accurate, cross-referenced database (labour-intensive)
- Age bias (less of a concern in aggregates)
- Differences in publication cultures (can be corrected for)
- Not very useful in disciplines with low publication cultures

7 The HiBar Database
- Faculty lists
- Standardized discipline names
- Automated collection & calculation
- Manual correction
- Analysis

8 Example: Dr. Joshua Barker
Barker, Joshua D. Associate Professor, University of Toronto. Social cultural anthropology, violence & power, crime & policing, theories of modernity, anthropology of technology, nationalism, urban studies; Indonesia, South East Asia.
- Simple automated search: 129 matches (1000+ pubs)
- With advanced filtering and Boolean logic: 43 matches (800+ pubs)
- After manual elimination of false positives, excluded publication types, etc.: 2 matches (5 pubs)

9 The Canadian Prestige Hierarchy
Institution | ARWU/THE
Toronto | 1
British Columbia | 2
McGill | 3
Alberta, McMaster, Montreal, Waterloo | 2nd tier
Dalhousie, Laval, Queen's, Simon Fraser, Calgary, Western, Guelph, Manitoba, Ottawa, Saskatchewan, Victoria | 3rd tier
Laval, Carleton, Quebec, UQAM, Concordia | Other major institutions

10 Science-Engineering H-Index (rank order)
1. UBC
2. Toronto – St. G
3. Montreal
4. McGill
5. Simon Fraser
6. Waterloo
7. Ottawa
8. York
9. Queen's
10. Rimouski
11. McMaster
12. Trent
13. Scarborough
14. Manitoba
15. Trois-Rivieres
16. Alberta
17. Western
18. Concordia
19. Laval
20. UQAM

11 Arts H-Index (rank order)
1. UBC
2. Toronto – St. G
3. McGill
4. Queen's
5. Alberta
6. McMaster
7. York
8. Guelph
9. Simon Fraser
10. Waterloo
11. Concordia
12. Trent
13. Mississauga
14. Scarborough
15. Carleton
16. Manitoba
17. Montreal
18. Calgary
19. Saskatchewan
20. Western

12 Medicine
We did not cover medical fields: the manner in which certain institutions choose to list staff at associated teaching hospitals made it impossible to generate equivalent staff lists.

13 Research Income
Collected data on peer-evaluated individual grants (i.e., excluding major institutional allocations for equipment, etc.) made by the two main granting councils (SSHRC and NSERC) over a period of three years. Data were then field-normalized using the same process as for the H-index.

14 Research Income (pros and cons)
Pros:
- Publicly available, third-party data, with personal identifiers
- Based on a peer-review system designed to reward excellence
Cons:
- Issues with respect to cross-institutional awards
- Ignores income from private sources, which may be substantial

15 Science-Engineering Income (rank order)
1. UBC
2. Ottawa
3. Montreal
4. Alberta
5. Toronto – St. G
6. Calgary
7. Rimouski
8. Saskatchewan
9. McGill
10. Laval
11. Guelph
12. McMaster
13. Waterloo
14. Queen's
15. Simon Fraser
16. Scarborough
17. Carleton
18. Western
19. Sherbrooke
20. Chicoutimi

16 Arts Income (rank order)
1. McGill
2. UBC
3. Montreal
4. Guelph
5. Alberta
6. McMaster
7. Toronto – St. G
8. York
9. Concordia
10. Simon Fraser
11. Calgary
12. Dalhousie
13. Laval
14. Queen's
15. Ottawa
16. Waterloo
17. Carleton
18. Rimouski
19. Scarborough
20. Western

17 Science-Engineering Total (rank order)
1. UBC
2. Montreal
3. Toronto – St. G
4. Ottawa
5. McGill
6. SFU
7. Rimouski
8. Waterloo
9. Alberta
10. McMaster
11. Queen's
12. Scarborough
13. Calgary
14. Laval
15. Saskatchewan
16. Guelph
17. Western
18. York
19. Carleton
20. Concordia

18 Arts Total (rank order)
1. UBC
2. McGill
3. Toronto – St. G
4. Alberta
5. Guelph
6. Montreal
7. McMaster
8. York
9. Concordia
10. Simon Fraser
11. Queen's
12. Waterloo
13. Calgary
14. Dalhousie
15. Carleton
16. Scarborough
17. Trent
18. Western
19. Mississauga
20. Ottawa

19 Controversies (1)
- The double-count issue. In an initial draft, we used a record count of staff rather than a head count (the former is higher because of cross-appointments). This led to questions.
- The part-time professor issue. Many objected to our inclusion of part-time staff in the totals. So we re-did the numbers without them…

20 NSERC Scores (revised)
New rank | Institution | Old rank
1 | UBC | 1
2 | Toronto-St. G | 3
3 | Montreal | 2
4 | SFU | 6
5 | McGill | 5
6 | Ottawa | 4
7 | Alberta | 9
8 | Waterloo | 8
9 | Laval | 14
10 | Calgary | 13
11 | Rimouski | 7
12 | McMaster | 10
13 | Queen's | 11
14 | York | 18
15 | Guelph | 16
16 | Saskatchewan | 15
17 | Manitoba | 27
18 | Trent | 21
19 | Western | 17
20 | Concordia | 20

21 SSHRC Scores (revised)
New rank | Institution | Old rank
1 | McGill | 2
2 | UBC | 1
3 | Toronto-St. G | 3
4 | Guelph | 5
5 | Alberta | 4
6 | McMaster | 7
7 | Montreal | 6
8 | Queen's | 11
9 | Simon Fraser | 10
10 | York | 8
11 | Concordia | 9
12 | Calgary | 13
13 | Waterloo | 12
14 | Laval | 21
15 | Ottawa | 20
16 | Dalhousie | 14
17 | UQAM | 43
18 | Trent | 17
19 | Carleton | 15
20 | Western | 18

22 The Philosophical Part

23 Who is a university?
- Whose performance gets included in a ranking says something about who one believes embodies a university. Should it include: full-time faculty only? Part-time faculty? Emeritus faculty? Graduate students?
- At the moment, most ranking systems' decisions are driven by data-collection methodology.

24 Do all subjects matter equally?
- Field-normalization implies that they do. But is this correct? Are some fields more central to the creation of knowledge than others? Should some fields be privileged when making inter-institutional comparisons?

25 Does Size Matter?
- Does aggregation of talent bring benefits of its own, independent of the quality of the people being aggregated?

26 Where Does Greatness Lie?
- On whose work should institutional reputation be based? Its best scholars, or all of its scholars?
- Norming for size implicitly rewards schools with good average professors; failing to norm is more likely to reward a few "top" professors.

