Discussion of "From Republic of Science to Audit Society" (Irwin Feller). S. Charlot, ASIRPA, Paris, France; June 13, 2012.


Discussion of "From Republic of Science to Audit Society" (Irwin Feller). S. Charlot, ASIRPA, Paris, France; June 13, 2012

Outline
New Questions/Issues and What Is at Stake
How They Are Answered
Validity of Performance Metrics and Methodological Choice(s) → Econometrics
Use, Non-Use, and Misuse of Research Assessments

Pre-New Public Management Assessment Paradigm
Republic of Science (M. Polanyi)
Peer (Expert) Review
Social Contract

New Public Management Paradigm
Accountability
Deregulation
Competition (among different uses of public funds)
Performance Measurement (for evaluating research uses)

Promises of Research Performance Assessment
Objectives provide a useful baseline for assessing performance.
Performance measurement focuses attention on the end objectives of public policy, on what has happened or is happening outside rather than inside the black box.
Well-defined objectives and documentation of results facilitate communication with funders, performers, users, and others.

Limitations of Research Performance Measurement
Returns/impacts to research are uncertain, long-term, and circuitous.
Specious precision in the selection of measures.
Impacts are typically dependent on complementary actions by agents outside of Federal agency control.
Limited (public) evidence of contributions to improved decision making.
Benefits from "failure" are underestimated.
Distortion of incentives: opportunistic behavior (young researchers seeking employment, senior researchers chasing future funding).
First comment/issue: add the role of creativity and of very innovative ideas in scientific progress (i.e., "scientific revolutions").

Overview of Evaluation Methodologies

Method: Analytical conceptual modeling of underlying theory
Brief description: Investigating underlying concepts and developing models to advance understanding of some aspect of a program, project, or phenomenon.
Example of use: To describe conceptually the paths through which spillover effects may occur.

Method: Survey
Brief description: Asking multiple parties a uniform set of questions about activities, plans, relationships, accomplishments, value, or other topics, which can be statistically analyzed.
Example of use: To find out how many companies have licensed their newly developed technology to others.

Method: Case study (descriptive)
Brief description: Investigating in depth a program or project, a technology, or a facility, describing and explaining how and why developments of interest have occurred.
Example of use: To recount how a particular joint venture was formed, how its participants shared research tasks, and why the collaboration was successful or unsuccessful.

Method: Case study (economic estimation)
Brief description: Adding to a descriptive case study a quantification of economic effects, such as through benefit-cost analysis.
Example of use: To estimate whether, and by how much, the benefits of a project exceed its costs.

Method: Econometric and statistical analysis
Brief description: Using tools of statistics, mathematical economics, and econometrics to analyze functional relationships between economic and social phenomena and to forecast economic effects.
Example of use: To determine how public funding affects private funding of research.

Method: Sociometric and social network analysis
Brief description: Identifying and studying the structure of relationships by direct observation, survey, and statistical analysis of secondary databases to increase understanding of social organizational behavior and related economic outcomes.
Example of use: To learn how projects can be structured to increase the diffusion of resulting knowledge.

Second comment/question
Complementarities between methodologies:
Econometric modeling needs analytical conceptual modeling of the underlying theory to be pertinent.
Econometric analysis also needs to take into account the policy design, context, etc. to be pertinent → surveys, case studies.
There is no econometric identification of impacts without these components in the evaluation model.

Complementarity, second example
Benefit-cost analysis can be carried out with an econometric model. Steps (RTI 2010):
Conduct technical analysis
Identify next-best alternative
Estimate program costs
Estimate economic benefits
Determine agency attribution
Estimate benefits of economic return
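As a reminder of what the final steps compute, the standard discounted benefit-cost measures such an analysis typically reports are given below; these are the usual textbook definitions, not formulas taken from the slides, with B_t the attributable benefits and C_t the program costs in year t, T the horizon, and r the discount rate:

\[
NPV = \sum_{t=0}^{T} \frac{B_t - C_t}{(1+r)^{t}},
\qquad
\frac{B}{C} = \frac{\sum_{t=0}^{T} B_t/(1+r)^{t}}{\sum_{t=0}^{T} C_t/(1+r)^{t}}
\]

A program passes the benefit-cost test when NPV > 0 (equivalently B/C > 1), and the attribution step determines what share of B_t the agency may legitimately claim.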

Microeconometrics of policy evaluation
Design: a treatment group (S_T) and a comparison/control group (S_C), each observed before (period τ) and after (period τ + 1) the intervention.
Issue: a before/after design shows changes "related" to the policy intervention, but does not adjust for "intervening" factors (threats to internal validity).
Reframe the analysis: did the policy "cause" change(s) in the treatment group different from those observable in the comparison/control group?
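The reframed question points to the standard difference-in-differences comparison; the slide does not spell out the estimator, so the following is a minimal sketch using the group and period labels above:

\[
\hat{\delta}_{DiD} = \left(\bar{Y}_{S_T,\,\tau+1} - \bar{Y}_{S_T,\,\tau}\right) - \left(\bar{Y}_{S_C,\,\tau+1} - \bar{Y}_{S_C,\,\tau}\right)
\]

where \(\bar{Y}_{g,t}\) is the mean outcome of group g in period t. The change observed in the comparison/control group nets out the "intervening" factors common to both groups, which is exactly what the simple before/after design fails to do.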

Third comment/question
Econometric enhancements:
Non-parametric analysis → no a priori constraint on the relationship between the outcome (whatever outcome is chosen) and R&D spending or funding → no knowledge production function imposed a priori.
Taking into account the effect of non-observable or time-varying characteristics on outcomes → context, context, context.
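These two enhancements can be read as pointing to a specification of the following kind; this is my illustration of the idea rather than a formula from the slides, with Y_it the chosen outcome for unit i in period t and RD_it its R&D spending or funding:

\[
Y_{it} = m\!\left(RD_{it}\right) + \alpha_i + \gamma_t + \varepsilon_{it}
\]

Here \(m(\cdot)\) is left unrestricted and estimated non-parametrically instead of imposing a knowledge production function, \(\alpha_i\) absorbs non-observable unit characteristics, and \(\gamma_t\) absorbs common time-varying shocks, i.e., the "context" the comment insists on.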

Fourth comment/question
The "dominant" methodology in the U.S., but also in the European Union, is expert panels.
Problem of network effects: the same issue arises as in peer evaluation and bibliometrics → is this only an issue for "low impacts" (publications, etc.) and not for high impacts?

Is Anyone Listening?
My small experience (one evaluation report): no one is listening.
As a researcher, I agree that "Doing good may not make you happy, but doing wrong will certainly make you unhappy."
But for a novice at evaluating policy, what are the arguments for not abandoning this kind of intellectual exercise (apart from publications or research funding)?
What kind of advice? For ASIRPA?

Thank you