Testing the validity of indicators in the field of education: the experience of CRELL. Rome, October 3-5, 2012, Improving Education through Accountability and Evaluation: Lessons from Around the World.


1 Testing the validity of indicators in the field of education: the experience of CRELL. Rome, October 3-5, 2012, Improving Education through Accountability and Evaluation: Lessons from Around the World. Andrea Saltelli, European Commission, Joint Research Centre.

2 CRELL: Centre for Research on Lifelong Learning based on indicators and benchmarks. DG Education and Culture + Joint Research Centre, since 2005. http://crell.jrc.ec.europa.eu/

3 CRELL. Foci of econometric research at the JRC, Ispra: trajectories to achieving EU 2020 objectives; employability and other benchmarks (mobility, multilingualism); labour market outcomes.

4 Foci of econometric research at the JRC, Ispra: counterfactual analysis and other impact assessment methodologies; regional studies (competitiveness, innovation, well-being); composite indicators and social choice.

5 Indicators

6 Context: knowledge in support of policy; evaluation and impact assessment, but also advocacy. Caveat: validity = plausibility, defensibility … and not ‘proof of truth’.

7 When testing the evidence, some reasonable people (and guidelines) suggest that ‘sensitivity analysis would help’. The JRC fostered sensitivity analysis development and uptake (20 years of papers, schools and books). Today we call it sensitivity auditing and teach it within the syllabus for impact assessment run by the Secretariat-General. Sensitivity analysis
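
To make the idea concrete, here is a minimal sketch of variance-based global sensitivity analysis, the kind of technique the JRC has promoted. The toy model, uniform inputs and sample size are illustrative assumptions, not material from the talk; the estimator is the standard pick-and-freeze first-order Sobol' estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # Hypothetical toy model (an assumption for illustration): y = x1 + 2*x2 + x1*x3
    return x[:, 0] + 2.0 * x[:, 1] + x[:, 0] * x[:, 2]

n, k = 100_000, 3
A = rng.uniform(0.0, 1.0, (n, k))   # two independent input samples
B = rng.uniform(0.0, 1.0, (n, k))
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

for i in range(k):
    ABi = A.copy()
    ABi[:, i] = B[:, i]             # "pick-and-freeze": swap column i only
    # First-order Sobol' index estimator (Saltelli et al., 2010)
    Si = np.mean(yB * (model(ABi) - yA)) / var_y
    print(f"S_{i + 1} ~= {Si:.3f}")
```

Unlike moving one factor at a time around a baseline, this varies all inputs simultaneously, which is precisely the contrast drawn in the next slide.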

8 [Figure: how to shake coupled stairs, vs. how coupled stairs are shaken in most of the available literature.] Sensitivity analysis

9 Testing (composite) indicators: two approaches. Michaela Saisana, Andrea Saltelli, and Stefano Tarantola (2005). Uncertainty and sensitivity analysis techniques as tools for the quality assessment of composite indicators. J. R. Statist. Soc. A 168(2), 307–323. Paolo Paruolo, Michaela Saisana, and Andrea Saltelli (2013). Ratings and rankings: voodoo or science? J. R. Statist. Soc. A 176(2), 1–26. Sensitivity analysis

10 First: the invasive approach. Michaela Saisana, Béatrice d’Hombres, and Andrea Saltelli (2011). Rickety numbers: volatility of university rankings and policy implications. Research Policy 40, 165–177. Sensitivity analysis

11 ROBUSTNESS ANALYSIS OF SJTU AND THES

12 SJTU: SIMULATED RANKS – TOP 20. Harvard, Stanford, Berkeley, Cambridge, MIT: top 5 in more than 75% of our simulations. Univ. California SF: original rank 18th, but could be ranked anywhere between the 6th and 100th position. Impact of assumptions: much stronger for the middle-ranked universities.

13 THES: SIMULATED RANKS – TOP 20. The impact of uncertainties on the university ranks is even more apparent. MIT: ranked 9th, but confirmed only in 13% of simulations (plausible range [4, 35]). Very high volatility also for universities in the 10th–20th positions, e.g., Duke Univ., Johns Hopkins Univ., Cornell Univ.
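
A minimal sketch of the kind of Monte Carlo rank-robustness exercise described in the two slides above. The scores, nominal weights and the ±30% weight perturbation scheme are all invented assumptions; the actual SJTU/THES data and uncertainty design are in the Saisana, d’Hombres and Saltelli paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical normalized indicator scores for 100 universities on 5 criteria
n_uni, n_ind = 100, 5
scores = rng.uniform(0, 100, (n_uni, n_ind))
nominal_w = np.array([0.3, 0.2, 0.2, 0.2, 0.1])  # developer-declared weights (illustrative)

n_sim = 5000
ranks = np.empty((n_sim, n_uni), dtype=int)
for s in range(n_sim):
    # Perturb each weight by up to +/-30% and renormalize (one plausible scheme)
    w = nominal_w * rng.uniform(0.7, 1.3, n_ind)
    w /= w.sum()
    composite = scores @ w
    order = np.argsort(-composite)            # rank 1 = highest composite score
    ranks[s, order] = np.arange(1, n_uni + 1)

# For each university: median rank and plausible rank interval across simulations
lo, hi = ranks.min(axis=0), ranks.max(axis=0)
median = np.median(ranks, axis=0).astype(int)
for u in np.argsort(median)[:5]:
    print(f"university {u}: median rank {median[u]}, range [{lo[u]}, {hi[u]}]")
```

The output interval plays the role of the “plausible range” quoted for MIT or Univ. California SF: the wider the interval, the less robust the published rank.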

14 Second: the non-invasive approach. Comparing the weights as assigned by developers with ‘effective weights’ derived from sensitivity analysis. Sensitivity analysis

15 University rankings: comparing the internal coherence of ARWU versus THES by testing the weights declared by developers against ‘effective’ importance measures.
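
The non-invasive check can be sketched as follows. Paruolo, Saisana and Saltelli use the nonparametric correlation ratio (a first-order sensitivity index) as the ‘effective’ importance measure; the squared Pearson correlation below is a simpler linear proxy, and the data are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical indicator data (correlated columns, as real indicators often are)
n = 1000
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + 0.6 * rng.normal(size=n)  # strongly correlated with x1
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])

declared_w = np.array([0.5, 0.3, 0.2])
y = X @ declared_w                         # linear composite score

# "Effective weight" proxy: squared Pearson correlation of each indicator with y,
# normalized to sum to one (an assumption standing in for the correlation ratio)
Si = np.array([np.corrcoef(X[:, i], y)[0, 1] ** 2 for i in range(3)])
effective = Si / Si.sum()

for i, (d, e) in enumerate(zip(declared_w, effective), 1):
    print(f"indicator {i}: declared {d:.2f}, effective {e:.2f}")
```

Because x1 and x2 are correlated, the effective importances drift away from the declared weights, which is exactly the incoherence the approach is designed to expose.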

16 The JRC fosters the development of good practices for the construction of aggregated statistical measures (indices, composite indicators). Sixty analyses (Michaela Saisana, JRC). Partnerships with OECD, WEF, INSEAD, WIPO, UN-IFAD, FAO, Transparency International, World Justice Project, Harvard, Yale, Columbia …
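
As an illustration of the construction step these good practices address, the sketch below normalizes hypothetical raw indicators and compares arithmetic with geometric aggregation, two choices discussed in the OECD/JRC handbook on composite indicators. All numbers and weights are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical raw indicator values for 8 units, on very different scales
raw = rng.uniform([0, 50, 1000], [10, 100, 9000], size=(8, 3))
w = np.array([0.4, 0.3, 0.3])

# Min-max normalization to [0, 1], a common first step
norm = (raw - raw.min(axis=0)) / (raw.max(axis=0) - raw.min(axis=0))

arithmetic = norm @ w  # fully compensatory: a strong score can offset a weak one
# Geometric aggregation penalizes unbalanced profiles (clip avoids exact zeros)
geometric = np.prod(np.clip(norm, 1e-9, None) ** w, axis=1)

print(np.round(np.column_stack([arithmetic, geometric]), 3))
```

The choice between the two aggregation rules is itself one of the assumptions that the sensitivity analyses above would vary.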

17 Something worth advocating for (1): more use of social choice theory methods, both for building meaningful aggregated indicators (a pity that methods already available between the end of the 13th and the 15th century are neglected by most developers) and for comparing options in the context of impact assessment studies. → Course at JRC Ispra, October 11-12.
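
As a small example of such a social-choice aggregation method, here is a Borda-style rank count (a method often credited, in early form, to Nicholas of Cusa in the 15th century). The rankings are invented; each one could be read as a single indicator’s ordering of the same options.

```python
from collections import defaultdict

# Each "voter" is one indicator's ranking of the same options, best first
rankings = [
    ["A", "B", "C", "D"],
    ["B", "A", "D", "C"],
    ["A", "C", "B", "D"],
]

scores = defaultdict(int)
for ranking in rankings:
    n = len(ranking)
    for position, option in enumerate(ranking):
        scores[option] += n - 1 - position  # top rank earns n-1 points

winner = max(scores, key=scores.get)
print(dict(scores), "-> Borda winner:", winner)
```

Aggregating rankings this way sidesteps the weighting-and-normalization assumptions of a weighted average, which is the attraction the slide points to.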

18 Sensitivity analysis, composite indicators, impact assessment: Econometrics and Applied Statistics Unit. Useful links:
Unit: http://ipsc.jrc.ec.europa.eu/?id=155
Sensitivity analysis: http://sensitivity-analysis.jrc.ec.europa.eu/
Sensitivity auditing: http://sensitivity-analysis.jrc.ec.europa.eu/Presentations/Saltelli-final-February-1-1.pdf
Quality of composite indicators: http://ipsc.jrc.ec.europa.eu/index.php?id=739

