
Cooperative State Research, Education, and Extension Service Quantitative Measures in the Evaluation of Extramural Research.


1. Cooperative State Research, Education, and Extension Service
http://www.csrees.usda.gov
Quantitative Measures in the Evaluation of Extramural Research, Education & Extension
AEA 2006 Annual Conference, November 2, 2006
Bruce T. McWilliams, Ph.D., MBA

2. What we do
Advance knowledge for agriculture, the environment, human health and well-being, and communities by supporting research, education, and extension programs in the Land-Grant University System and other partner organizations ("The Partnership"). CSREES does not perform research, education, or extension itself; rather, it helps fund these activities at the state and local level.

3. How we do it
- Provide program leadership: National Program Leaders oversee both pre- and post-grant management of programs
- Peer-review grants and other types of proposals (similar to NSF and NIH-OER)
- The Agency distributed $1.2B to research, education, and extension (REE) in FY 2006; all R&D funded by USDA is ~$2B

4. What the Planning & Accountability Office does
- Meets federal reporting requirements (executive and legislative)
- Enhances the planning & evaluation capacity of the Agency
- Trains others in planning tools & procedures
- Acts as a consultant
- Develops innovative approaches to planning & evaluation

5. Budget-Performance Cycle
[Diagram] Partners' plans & results (projects, formula proposals, Plans of Work, progress reports, annual reports) feed into portfolio evaluation:
- Internal self-assessment (annual)
- Portfolio Review Expert Panel (PREP) (every 5 years)
- OMB evaluation via the Program Assessment Rating Tool (PART) (every 5 years)
CSREES strategic & budget planning, guided by the portfolio evaluations and stakeholder input, produces a performance-based budget request to the Administration and Congress (proposals for increases, impacts, performance measures, PART results).

6. General Approach to Portfolio Evaluation & PART
- Use independent outside expert panels to review portfolios
- Use logic models extensively to organize supporting materials and discussions during panel reviews
- Convert qualitative observations into quantitative values

7. Generic Logic Model for CSREES Reporting
CSREES Office of Planning & Accountability, Version 1.2. (This model is intended as an illustrative guide for reporting on CSREES-funded research, education, and extension activities. It is not a comprehensive inventory of our programs.)

Situation: description of the challenge or opportunity. For example: farmers face increasing challenges from globalization; an opportunity to improve animal health through genetic engineering; an insufficient number of trained & diverse professionals entering agricultural fields; youth at risk; invasive species becoming an increasing problem; bioterrorism; the obesity crisis; impaired water quality.

Inputs (what we invest): faculty; staff; students; infrastructure; federal, state, and private funds; time; knowledge; the collection of stakeholder opinions.

Activities (what we do): design and conduct research; publish scientific articles; develop research methods and procedures; teach students; conduct non-formal education; provide counseling; develop products, curricula & resources. Who we reach (participation): other scientists; extension faculty; teaching faculty; students; federal, state & private funders; scientific journal, industry & popular magazine editors; agencies; policy and decision-makers; agricultural, environmental, life & human science industries; the public.

Outputs: new fundamental or applied knowledge; scientific publications; patents; new methods & technology; plant & animal varieties; practical knowledge for policy and decision-makers; information, skills & technology for individuals, communities, and programs; participants reached; students graduated in agricultural sciences.

Outcomes:
- Knowledge: occurs when there is a change in knowledge, i.e., participants actually learn: new fundamental or applied knowledge; improved skills; how technology is applied; about new plant & animal varieties; increased knowledge of decision-making, life skills, and positive life choices among youth & adults; policy knowledge; new and improved methods.
- Actions: occur when there is a change in behavior and participants act upon what they have learned: apply improved fundamental or applied knowledge; adopt new and improved skills; directly apply information from publications; adopt and use new methods or improved technology; use new plant & animal varieties; increased skill by youth & adults in making informed life choices; actively apply practical policy and decision-making knowledge.
- Conditions: occur when a societal condition is improved due to a participant's action taken in the previous column. For example, specific contributions to: increased market opportunities overseas and greater economic competitiveness; better and less expensive animal health; a vibrant & competitive agricultural workforce; higher productivity in food provision; better quality of life for youth & adults in rural communities; a safer food supply; reduced obesity and improved nutrition & health; higher water quality and a cleaner environment.

External factors: a brief discussion of the variables that affect the portfolio, program, or project but cannot be changed by its managers. For example, a plant breeding program's success may depend on the variability of the weather.

Assumptions: the premises, based on theory, research, evaluation knowledge, etc., that support the relationships among the elements shown above and upon which the success of the portfolio, program, or project rests. For example, finding animal gene markers for particular diseases will lead to better animal therapies.
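The columns above form a fixed template, so a portfolio entry can be checked mechanically for completeness before a review. A minimal sketch follows; the field names and the `missing_columns` helper are illustrative assumptions, not part of the CSREES reporting system:

```python
# Hypothetical representation of the generic logic-model template as plain
# data. Column names below are assumptions chosen for illustration.
LOGIC_MODEL_COLUMNS = [
    "situation", "inputs", "activities", "outputs",
    "outcomes_knowledge", "outcomes_actions", "outcomes_conditions",
    "external_factors", "assumptions",
]

def missing_columns(entry: dict) -> list:
    """Return the logic-model columns that are absent or empty in an entry."""
    return [c for c in LOGIC_MODEL_COLUMNS if not entry.get(c)]

example = {
    "situation": "Impaired water quality",
    "inputs": ["faculty", "federal funds"],
    "activities": ["design and conduct research"],
    "outputs": ["scientific publications"],
    "outcomes_knowledge": ["new applied knowledge"],
    "outcomes_actions": ["adopt improved technology"],
    "outcomes_conditions": [],  # long-lag outcomes are often empty early on
    "external_factors": "weather variability",
    "assumptions": "gene markers lead to better animal therapies",
}
print(missing_columns(example))  # ['outcomes_conditions']
```

Such a check mirrors the point made later about timing: condition-level outcomes may lag funding by years, so their column is the one most often empty.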

8. Situation
- Struggled with GPRA
- Structural impediments to an ideal evaluation:
  - Attenuated control over programs: the university partnership implements programs
  - Projects associated with grants may be short-lived (3 to 5 years)
- Outputs & outcomes are often difficult to measure due to:
  - Nature of effort (research versus education versus extension)
  - Timing of funding: there can be long lags before measurable results (outcomes) appear

9. Inputs
- The President's Management Agenda (PMA) gave new force and drive to Agency planning activities
- Planning & Accountability Office created in 2002
- Top-level Agency commitment
- Good leadership

10. Activities
Align Department/Agency REE efforts:
- Strategic Goals
- Strategic Objectives = Portfolios
- Knowledge Areas (KAs)
- Programs/Projects

11. Activities
- Invented the Portfolio Review Expert Panel process, "the PREP"
- In the 5-year review, outside expert panelists sift through evidentiary materials and discuss issues regarding the portfolio
- In the final step of the review, the panel is asked 14 standardized questions about the portfolio's quality, relevance & performance
- The answers are recorded on a score sheet that converts the qualitative assessments into a single quantitative value between 1 and 100; this score becomes the primary performance measure used for the PART
- This panel process has become a model across the government for the PART process
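The slide describes a mapping from 14 qualitative answers to a single value between 1 and 100, but not the score sheet itself. The sketch below is one plausible linear rescaling, assuming a 1-to-5 answer scale and equal weighting per question; the actual CSREES score sheet may weight questions differently:

```python
# Hypothetical PREP score-sheet conversion: 14 panel answers on an assumed
# 1..5 scale are linearly rescaled onto the 1..100 range described in the
# slide. Equal weights per question are an assumption for illustration.
def prep_score(answers, low=1, high=5):
    """Map 14 standardized-question answers to a single 1-100 score."""
    if len(answers) != 14:
        raise ValueError("the PREP uses 14 standardized questions")
    if not all(low <= a <= high for a in answers):
        raise ValueError("answer outside the allowed scale")
    fraction = (sum(answers) - 14 * low) / (14 * (high - low))
    return 1 + 99 * fraction  # linear rescale onto 1..100

print(prep_score([5] * 14))  # 100.0 (best possible answers)
print(prep_score([1] * 14))  # 1.0 (worst possible answers)
```

Whatever the actual weighting, the design point stands: collapsing the panel's qualitative judgments into one number is what makes the result usable as a PART performance measure.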

12. Activities
- 5-year PREP reviews are followed up with annual "internal reviews" that also create a performance score for the PART
- Create in-depth self-assessment reports for every portfolio of programs; these are compiled and edited by the P&A Office but largely written by National Program Leaders (NPLs)

13. Activities
- Additional performance measures were required by the PART; we developed roughly five for each Strategic Goal. These measure progress toward the Strategic Goal, not the portfolio objective
- Logic models: NPLs created logic models at the portfolio level and the topic (KA) level

14. Outputs
- Organized 14 new portfolios of projects
- Convened 12 expert panel reviews
- Compiled 12 self-assessment reports
- Made more than 14 logic models for portfolios & programs
- Developed 22 performance measures
- Received 5 PART scores: 2 "Effective" (a rating earned by only 15% of federal programs) and 3 "Moderately Effective"

15. Outputs
Developed a systematic organizational structure for evaluation:
- Strategic Goals (6 USDA / 5 Agency)
- Portfolios of Projects (17)
- Knowledge Areas (KAs) (92)
- Programs (hundreds)
- Projects (thousands)
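This structure is a strict containment hierarchy (goals contain portfolios, portfolios contain KAs, and so on), which is what lets counts and scores roll up to any level. A minimal sketch, with node names and the `count` helper assumed purely for illustration:

```python
# Hypothetical tree for the Goals > Portfolios > KAs > Programs > Projects
# hierarchy. Names and levels below are illustrative, not CSREES data.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    level: str                          # "goal", "portfolio", "ka", "program", "project"
    children: list = field(default_factory=list)

    def count(self, level):
        """Count this node and its descendants at the given level."""
        if self.level == level:
            return 1
        return sum(child.count(level) for child in self.children)

goal = Node("Goal 1", "goal", [
    Node("Portfolio A", "portfolio", [
        Node("KA 101", "ka", [
            Node("Program X", "program", [
                Node("Project 1", "project"),
                Node("Project 2", "project"),
            ]),
        ]),
    ]),
])
print(goal.count("project"))  # 2
```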

16. Outcomes: Changes in Knowledge
- Laid down 14 foundations for long-term planning
- NPLs gained a broader understanding of the planning process
- Developed a better idea of the status of our portfolios and where improvements could be made
- Improved understanding of the types of resources and leadership required

17. Outcomes: Changes in Behavior
- Better long-term planning by the Agency
- More long-term planning by NPLs
- Improved program decision-making & direction
- Agency & program leaders made more effective programmatic changes
- Leaders allocated resources more constructively

18. Outcomes: Changes in Societal Conditions
- Programs appear to be more effective and so should ultimately change societal conditions more effectively, though it is really too soon to demonstrate this
- The PREP/PART process has had a positive impact on Agency performance, especially through greater partnership involvement and communication through the panels
- Better long-term planning by the Agency, particularly in the allocation of resources

19. Conclusions
- Quantitative measures: PREP scores worked well; they gave the best assessment of programs for us and for the PART
- All 22 PART performance measures were well-considered; most were good, but some were not sustainable
- Extensive use of logic models was highly effective as an organizing device, leading to very useful program descriptions, evaluation indicators & measures

20. Ongoing & New Challenges
- Finished a complete cycle of portfolio reviews
- Maintain momentum; institutionalize the process
- Make planning & evaluation meaningful
- Remind stakeholders that it takes less effort the second time
- Maintain vigilance about improving performance measures

21. Our Answers
Question 1: Should anything be done differently to evaluate: basic knowledge generation; the development of applied knowledge; or the implementation of research knowledge?
Answer: It depends on the level of aggregation.
- No at higher levels: our logic model encompassed all forms of R&D, education, and extension, and portfolio reviewers could still integrate their observations about programs and convert them into numbers.
- Yes at lower levels: at the program level, managers wanted programs treated differently. For example, basic research programs could not easily tie their results directly to changes in societal conditions and so were concerned that this did not reflect their contributions.

22. Our Answers
Question 2: Does the customer of the R&D outcomes make a difference?
Answer: Yes.
- OMB (the primary customer) wanted comparable numbers, so it wanted outcome measures to be as quantitative as possible
- Our internal users (the secondary customer) wanted the quantitative outcome metrics to compare well with other agencies, but mostly desired the qualitative recommendations for the management of their programs

23. Our Answers
Question 3: Do an organization's characteristics (size, structure, ...) impact a quantitative evaluation?
Answer: Yes.
- A critical mass of evaluation capabilities must be present in an agency
- This mass should be concentrated in a team of evaluators to be most successful
- Our distance from the performance of the R&D we sponsored was a problematic structural feature, because it limited access to the best program knowledge

24. Our Answers
Question 4: Is there, or should there be, a core evaluation policy that spans all the federal agencies?
Answer: Yes. The PART process, if considered as a core policy, spurred Agency initiatives to more actively monitor and assess its own programs.

25. Questions
What is the best way to quantitatively evaluate R&D programs?
1. Should anything be done differently to evaluate: basic knowledge generation; the development of applied knowledge; or the implementation of research knowledge?
2. Does the customer of R&D outcomes make a difference?
3. Do an organization's characteristics (size, structure, ...) impact a quantitative evaluation?
4. Is there, or should there be, a core evaluation policy that spans all the federal agencies?

26. Epilogue
Ways existing metrics can be used for decision-making:
- Assessing progress-to-plan of projects & portfolios
- "Portfolio balancing"
- Project "stop-start" and "transition out" criteria
- Budget-performance integration
Needed improvements:
- Better outcome metrics, preferably more quantitative
- Better tracking of specific contributions along the innovation stream
- Better "transition out" criteria

