Cooperative State Research, Education, and Extension Service: Quantitative Measures in the Evaluation of Extramural Research, Education & Extension

Cooperative State Research, Education, and Extension Service
Quantitative Measures in the Evaluation of Extramural Research, Education & Extension
AEA 2006 Annual Conference, November 2, 2006
Bruce T. McWilliams, Ph.D., MBA

What we do
Advance knowledge for agriculture, the environment, human health and well-being, and communities by supporting research, education, and extension programs in the Land-Grant University System and other partner organizations ("The Partnership"). CSREES does not perform research, education, and extension itself; rather, it helps fund that work at the state and local level.

How we do it
- Provide program leadership: National Program Leaders (NPLs) oversee both pre- and post-grant management of programs
- Peer-review grants and other types of proposals (similar to NSF and NIH-OER)
- The Agency distributed $1.2B to research, education, and extension (REE) in FY 2006; all R&D funded by USDA is roughly $2B

What the Planning & Accountability Office does
- Meets federal reporting requirements (executive and legislative)
- Enhances the planning & evaluation capacity of the Agency
- Trains others in planning tools & procedures
- Acts as a consultant
- Develops innovative approaches to planning & evaluation

Budget-Performance Cycle (diagram)
- Partners' plans & results: projects, formula proposals, plans of work, progress reports, annual report
- Portfolio evaluation: internal self-assessment (annual); Portfolio Review Expert Panel (PREP) (every 5 years)
- OMB evaluation: Program Assessment Rating Tool (PART) (every 5 years)
- CSREES strategic & budget planning, guided by portfolio evaluations, stakeholder input, the Administration, and Congress
- Performance-based budget request: proposals for increases, impacts, performance measures, PART results

General Approach to Portfolio Evaluation & PART
- Use independent outside expert panels to review portfolios
- Use logic models extensively to organize supporting materials and discussions during panel reviews
- Convert qualitative observations into quantitative values

Generic Logic Model for CSREES Reporting
CSREES Office of Planning & Accountability
(This model is intended as an illustrative guide for reporting on CSREES-funded research, education, and extension activities. It is not a comprehensive inventory of our programs.)

Situation (description of the challenge or opportunity):
- Farmers face increasing challenges from globalization
- Opportunity to improve animal health through genetic engineering
- Insufficient number of trained & diverse professionals entering agricultural fields
- Youth at risk
- Invasive species are an increasing problem
- Bioterrorism
- Obesity crisis
- Impaired water quality

Inputs (what we invest):
- Faculty, staff, and students
- Infrastructure
- Federal, state, and private funds
- Time
- Knowledge
- The collection of stakeholder opinions

Activities (what we do):
- Design and conduct research
- Publish scientific articles
- Develop research methods and procedures
- Teach students
- Conduct non-formal education
- Provide counseling
- Develop products, curricula & resources

Participation (who we reach):
- Other scientists, extension faculty, teaching faculty, and students
- Federal, state & private funders
- Scientific journal, industry & popular magazine editors
- Agencies
- Policy and decision-makers
- Agricultural, environmental, life & human science industries
- The public

Outputs:
- New fundamental or applied knowledge
- Scientific publications and patents
- New methods & technology
- Plant & animal varieties
- Practical knowledge for policy and decision-makers
- Information, skills & technology for individuals, communities, and programs
- Participants reached
- Students graduated in the agricultural sciences

Outcomes, Knowledge (short term): occur when there is a change in knowledge, or the participants actually learn:
- New fundamental or applied knowledge
- Improved skills
- How technology is applied
- About new plant & animal varieties
- Increased knowledge of decision-making, life skills, and positive life choices among youth & adults
- Policy knowledge
- New and improved methods

Outcomes, Actions (medium term): occur when there is a change in behavior, or participants act upon what they have learned:
- Apply improved fundamental or applied knowledge
- Adopt new and improved skills
- Directly apply information from publications
- Adopt and use new methods or improved technology
- Use new plant & animal varieties
- Increased skill by youth & adults in making informed life choices
- Actively apply practical policy and decision-making knowledge

Outcomes, Conditions (long term): occur when a societal condition is improved due to a participant's action taken in the previous column. For example, specific contributions to:
- Increased market opportunities overseas and greater economic competitiveness
- Better and less expensive animal health
- A vibrant & competitive agricultural workforce
- Higher productivity in food provision
- Better quality of life for youth & adults in rural communities
- A safer food supply
- Reduced obesity and improved nutrition & health
- Higher water quality and a cleaner environment

External factors: a brief discussion of the variables that affect the portfolio, program, or project but cannot be changed by its managers. For example, a plant breeding program's success may depend on the variability of the weather.

Assumptions: the premises, based on theory, research, evaluation knowledge, etc., that support the relationships among the elements above and upon which the success of the portfolio, program, or project rests. For example, finding animal gene markers for particular diseases will lead to better animal therapies.
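
To make the columns of the model concrete, here is a minimal sketch of how such a logic model could be captured as a data structure for reporting. The class and field names are illustrative assumptions, not an official CSREES schema; the example entries are drawn from the model above.

```python
# Minimal sketch of the generic logic model as a data structure. The class
# and field names are assumptions, not an official CSREES schema; the example
# entries are taken from the model above.
from dataclasses import dataclass
from typing import List

@dataclass
class Outcomes:
    knowledge: List[str]   # short term: changes in knowledge
    actions: List[str]     # medium term: changes in behavior
    conditions: List[str]  # long term: changes in societal conditions

@dataclass
class LogicModel:
    situation: List[str]         # challenge or opportunity addressed
    inputs: List[str]            # what we invest
    activities: List[str]        # what we do
    participation: List[str]     # who we reach
    outputs: List[str]           # direct products
    outcomes: Outcomes
    external_factors: List[str]  # variables outside managers' control
    assumptions: List[str]       # premises linking the elements

water_quality = LogicModel(
    situation=["Impaired water quality"],
    inputs=["Faculty", "Federal, state, and private funds"],
    activities=["Design and conduct research", "Conduct non-formal education"],
    participation=["Other scientists", "Policy and decision-makers"],
    outputs=["Scientific publications", "New methods & technology"],
    outcomes=Outcomes(
        knowledge=["New applied knowledge of watershed management"],
        actions=["Adoption and use of improved methods"],
        conditions=["Higher water quality and a cleaner environment"],
    ),
    external_factors=["Variability of the weather"],
    assumptions=["Adopted practices measurably improve water quality"],
)
```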

Situation
- Struggled with GPRA (the Government Performance and Results Act)
- Structural impediments to an ideal evaluation:
  - Attenuated control over programs: the university partnership implements them
  - Projects associated with grants may be short-lived (3 to 5 years)
- Outputs & outcomes are often difficult to measure due to:
  - The nature of the effort (research versus education versus extension)
  - The timing of funding: there can be long lags before measurable results (outcomes) appear

Inputs
- The President's Management Agenda (PMA) gave new force and drive to Agency planning activities
- Planning & Accountability Office created in 2002
- Top-level Agency commitment
- Good leadership

Activities
Align Department/Agency REE efforts:
- Strategic Goals
- Strategic Objectives = Portfolios
- Knowledge Areas (KAs)
- Programs/Projects

Activities
Invented the Portfolio Review Expert Panel process, "the PREP":
- In the 5-year review, outside expert panelists sift through evidentiary materials and discuss issues regarding the portfolio
- In the final step of the review, the panel is asked 14 standardized questions about the portfolio's Quality, Relevance & Performance
- The answers are recorded on a score sheet that converts the qualitative assessments into a single quantitative value between 1 and 100; this score becomes the primary performance measure used for the PART (a sketch of such a conversion follows this list)
- The panel process has become a model across the government for the PART process
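
The presentation does not spell out the score sheet's arithmetic, only that 14 standardized answers become a single value between 1 and 100. The sketch below assumes a simple scheme: each question is rated on a 1-to-7 scale, the ratings are weighted equally, and the mean is rescaled linearly onto 1 to 100. The rating scale and equal weighting are assumptions for illustration.

```python
# Illustrative sketch of a PREP-style score sheet. The talk states only that
# the 14 standardized answers are converted to a single value between 1 and
# 100; the 1-7 rating scale and the equal weighting here are assumptions.

RATING_MIN, RATING_MAX = 1, 7  # assumed per-question rating scale

def prep_score(ratings):
    """Convert 14 qualitative panel ratings into one 1-100 score."""
    if len(ratings) != 14:
        raise ValueError("expected ratings for all 14 standardized questions")
    if any(not (RATING_MIN <= r <= RATING_MAX) for r in ratings):
        raise ValueError("rating outside the assumed 1-7 scale")
    mean = sum(ratings) / len(ratings)
    # Linearly rescale the mean from [1, 7] onto [1, 100].
    return 1 + (mean - RATING_MIN) * 99 / (RATING_MAX - RATING_MIN)

# Example: a portfolio rated mostly "good" with a few "excellent" answers.
print(round(prep_score([5, 6, 5, 5, 7, 6, 5, 5, 6, 5, 7, 6, 5, 5])))  # 76
```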

Activities
- 5-year PREP reviews are followed up with annual "internal reviews" that also produce a performance score for the PART
- Create in-depth self-assessment reports for every portfolio of programs; these are compiled and edited by the P&A Office but largely written by NPLs

Activities
- Additional performance measures were required by the PART; we developed roughly five for each Strategic Goal
- These measure progress toward the Strategic Goal, not the Portfolio Objective
- Logic models: NPLs created logic models at the portfolio level and the topic (KA) level

Outputs
- Organized 14 new portfolios of projects
- Convened 12 expert panel reviews
- Compiled 12 self-assessment reports
- Made more than 14 logic models for portfolios & programs
- Developed 22 performance measures
- Received 5 PART scores: 2 Effective (a rating earned by only 15% of federal programs) and 3 Moderately Effective

Outputs
Developed a systematic organizational structure for evaluation (see the roll-up sketch below):
- Strategic Goals (6 USDA / 5 Agency)
- Portfolios of projects (17)
- Knowledge Areas (KAs) (92)
- Programs (hundreds)
- Projects (thousands)
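
One reason to organize evaluation this way is that scores can be rolled up from projects through programs, Knowledge Areas, and portfolios to goals. Below is a minimal sketch of that hierarchy, assuming a simple mean roll-up; the level names follow the slide, while the example entries, scores, and averaging rule are illustrative assumptions.

```python
# Sketch of the five-level evaluation structure with a simple mean roll-up.
# The level names follow the slide; the example entries, scores, and the
# averaging rule are illustrative assumptions, not CSREES data.
from statistics import mean

hierarchy = {
    "Strategic Goal (example)": {
        "Portfolio: Water Quality (example)": {
            "Knowledge Area (example)": {
                "Program A": [72, 80],       # project-level scores
                "Program B": [65, 70, 75],
            },
        },
    },
}

def roll_up(node):
    """Average scores upward from projects to the level of this node."""
    if isinstance(node, list):  # a leaf: project-level scores
        return mean(node)
    return mean(roll_up(child) for child in node.values())

for goal, portfolios in hierarchy.items():
    print(f"{goal}: {roll_up(portfolios):.1f}")  # -> 73.0
```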

Outcomes: Changes in Knowledge
- Laid down 14 foundations for long-term planning
- NPLs gained a broader understanding of the planning process
- Developed a better idea of the status of our portfolios and where improvements could be made
- Improved understanding of the types of resources and leadership required

Outcomes: Changes in Behavior
- Better long-term planning by the Agency
- More long-term planning by NPLs
- Improved program decision-making & direction
- Agency & program leaders made more effective programmatic changes
- Leaders allocated resources more constructively

Outcomes: Changes in Societal Conditions
- Programs appear to be more effective, and so should ultimately change societal conditions more effectively, though it is really too soon to demonstrate this
- The PREP/PART process has had a positive impact on Agency performance, especially through greater partnership involvement and communication through the panels
- Better long-term planning by the Agency, particularly in the allocation of resources

Conclusions: Quantitative Measures
- PREP scores worked well; they gave the best assessment of programs for us and for the PART
- All 22 PART performance measures were well considered; most were good, but some were not sustainable
- Extensive use of logic models was highly effective as an organizing device; it led to very useful program descriptions, evaluation indicators & measures

Ongoing & New Challenges
- Finished a complete cycle of portfolio reviews
- Maintain momentum and institutionalize the process
- Make planning & evaluation meaningful
- Remind stakeholders that it takes less effort the second time
- Maintain vigilance about improving performance measures

Our Answers
Question 1: Should anything be done differently to evaluate: basic knowledge generation; the development of applied knowledge; or the implementation of research knowledge?
Answer: It depends on the level of aggregation.
- No, at higher levels. Our logic model encompassed all forms of R&D, education, and extension; portfolio reviewers could still integrate their observations about programs and convert them into numbers.
- Yes, at lower levels. At the program level, managers wanted programs treated differently. For example, basic research programs could not easily tie their results directly to changes in societal conditions, and so were concerned that this did not reflect their contributions.

Our Answers
Question 2: Does the customer of the R&D outcomes make a difference?
Answer: Yes.
- OMB (the primary customer) wanted comparable numbers, so it wanted outcome measures to be as quantitative as possible.
- Our internal users (the secondary customer) wanted the quantitative outcome metrics to compare well with those of other agencies, but mostly desired the qualitative recommendations for the management of their programs.

Our Answers
Question 3: Do an organization's characteristics (size, structure, etc.) impact a quantitative evaluation?
Answer: Yes.
- A critical mass of evaluation capabilities must be present in an agency.
- This mass should be concentrated in a team of evaluators to be most successful.
- Our distance from the performance of the R&D we sponsored was a problematic structural feature because it limited access to the best program knowledge.

Our Answers
Question 4: Is there, or should there be, a core evaluation policy that spans all the federal agencies?
Answer: Yes. The PART process, if considered a core policy, spurred Agency initiatives to more actively monitor and assess its own programs.

Questions
What is the best way to quantitatively evaluate R&D programs?
1. Should anything be done differently to evaluate: basic knowledge generation; the development of applied knowledge; or the implementation of research knowledge?
2. Does the customer of R&D outcomes make a difference?
3. Do an organization's characteristics (size, structure, etc.) impact a quantitative evaluation?
4. Is there, or should there be, a core evaluation policy that spans all the federal agencies?

Epilogue
Ways existing metrics can be used for decision-making (a sketch of one such screen follows this list):
- Assessing progress-to-plan of projects & portfolios
- "Portfolio balancing"
- Project "stop-start" and "transition out" criteria
- Budget-performance integration
Needed improvements:
- Better outcome metrics, preferably more quantitative
- Better tracking of specific contributions along the innovation stream
- Better "transition out" criteria
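
As an illustration of the "stop-start" use named above, the sketch below screens a portfolio against its recent review scores. The thresholds, the declining-trend rule, and the portfolio name are hypothetical; the slide states only that such criteria are one way existing metrics could inform decisions.

```python
# Hypothetical "stop-start" screen built on the portfolio scores described
# earlier. Thresholds, the trend rule, and the portfolio name are assumptions
# illustrating how existing metrics could inform decisions.

CONTINUE_THRESHOLD = 70  # assumed: at or above this, continue as planned
REVIEW_THRESHOLD = 50    # assumed: below this, trigger stop/transition review

def screen_portfolio(name, yearly_scores):
    """Flag a portfolio as on track, on watch, or due for stop/transition review."""
    latest = yearly_scores[-1]
    declining = len(yearly_scores) >= 2 and yearly_scores[-1] < yearly_scores[-2]
    if latest < REVIEW_THRESHOLD:
        return f"{name}: candidate for stop or transition-out review"
    if latest < CONTINUE_THRESHOLD or declining:
        return f"{name}: on watch; reassess progress-to-plan next cycle"
    return f"{name}: on track; continue as planned"

print(screen_portfolio("Water Quality", [68, 74, 81]))
# -> Water Quality: on track; continue as planned
```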