
1 Improving and Integrating Evaluation into Program Management. Panel Presentation: Henry M. Doan, Ph.D. and Suzanne Le Menestrel, Ph.D., CSREES; Steve Loring, Ph.D., New Mexico State University; Cheryl J. Oros, Ph.D., NIH (Discussant). AEA 2006, Portland, Oregon

2 The Portfolio Review Expert Panel (PREP) Process: Use and Perspective of the Planning and Accountability Office, CSREES. Henry M. Doan, Ph.D., Planning and Accountability, CSREES. AEA 2006, Portland, Oregon

3 CSREES Mission and Functions
Mission: To advance knowledge for agriculture, the environment, human health and well-being, and communities.
Functions:
Program leadership to identify, develop, and manage programs that sponsor university-based and other institutional research, education, and extension.
Fair, effective, and efficient administration of federal assistance in implementing education, research, and extension awards and agreements.

4 Management Cycle
Planning: identification of needs/problems and solutions
Conceptualization: formulation of evaluation questions & designs
Implementation: data collection & analysis
Feedback: sharing findings with program managers; refining programs; budget decisions

5 CSREES Strategic Goals (2004-2009)
Goal 1: Enhance Economic Opportunities for Agricultural Producers
Goal 2: Support Increased Economic Opportunities and Improved Quality of Life in America
Goal 3: Enhance Protection and Safety of the Nation’s Agriculture and Food Supply
Goal 4: Improve the Nation’s Nutrition and Health
Goal 5: Protect and Enhance the Nation’s Natural Resource Base and Environment

6 Cascading Alignment
Agency Mission → Strategic Goal → Strategic Objective → Portfolio → Knowledge Area Code → Projects**
** May cross-cut objectives and portfolios.

7 Alignment Example
Goal 3: Enhance Protection and Safety of the Nation’s Agriculture and Food Supply
Strategic Objective 4.1: Reduce the Incidence of Foodborne Illnesses and Contaminants through Science-Based Knowledge and Education
Food Safety Portfolio Knowledge Areas:
KA 711: Ensure Food Products Free of Harmful Chemicals, Including Residues from Agricultural and Other Sources
KA 712: Protect Food from Contamination by Pathogenic Microorganisms, Parasites, and Naturally Occurring Toxins
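To make the cascade concrete, the alignment example above can be modeled as nested data. The following is a minimal sketch in Python; the class and field names are illustrative assumptions, not an actual CSREES system:

```python
# A minimal sketch of the cascading alignment as nested data.
# Class and field names are hypothetical, not an actual CSREES system.
from dataclasses import dataclass, field

@dataclass
class KnowledgeArea:
    code: int   # e.g., 711
    title: str

@dataclass
class Portfolio:
    name: str
    knowledge_areas: list[KnowledgeArea] = field(default_factory=list)

@dataclass
class StrategicObjective:
    number: str  # e.g., "4.1"
    title: str
    portfolios: list[Portfolio] = field(default_factory=list)

@dataclass
class StrategicGoal:
    number: int
    title: str
    objectives: list[StrategicObjective] = field(default_factory=list)

# The Food Safety alignment example from the slide above:
food_safety = Portfolio("Food Safety", [
    KnowledgeArea(711, "Ensure Food Products Free of Harmful Chemicals"),
    KnowledgeArea(712, "Protect Food from Contamination by Pathogenic "
                       "Microorganisms, Parasites, and Naturally Occurring Toxins"),
])

goal_3 = StrategicGoal(3,
    "Enhance Protection and Safety of the Nation's Agriculture and Food Supply",
    [StrategicObjective("4.1",
        "Reduce the Incidence of Foodborne Illnesses and Contaminants",
        [food_safety])])
```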

8 OMB Program Assessment Rating Tool (PART)
PART assesses four areas: Program Purpose & Design; Strategic Planning; Program Management; Program Results.
CSREES Portfolio Reviews: 2004: Goal 1; 2005: Goals 3 & 5; 2006: Goals 2 & 4.
This completed the first cycle of 14 portfolio reviews, covering 14 strategic objectives and all five CSREES 2004-2009 strategic goals.

9 A Portfolio Approach to Evaluating Research, Education, and Extension Efforts
OMB PART/BPI led to the development of a new portfolio assessment tool and measures.
Portfolio analysis (meta-analysis) is used to assess progress toward goals and to guide grant announcements.
Uses the OMB R&D criteria (relevance, quality, performance).

10 Portfolio Review Expert Panel (PREP) Process
Focus on outcomes rather than processes.
The level of analysis is a portfolio, identified via Knowledge Area codes in databases.
Expert panels score portfolio progress & provide recommendations to CSREES.

11 PORTFOLIO: A New Concept
The portfolio as a unit of analysis is a new concept.
Portfolios do not correspond to funding lines, programs, or the organization of CSREES work units.
The use of Knowledge Area codes to classify all work is new.
The portfolio concept allows complex, interrelated programs and funding lines to be described as they address CSREES strategic goals and objectives (see the sketch below).
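Because a portfolio is assembled from Knowledge Area codes rather than from funding lines or organizational units, pulling one together amounts to a grouping query over tagged project records. A minimal sketch, with hypothetical records and field names (only the KA codes 711/712 come from the slides):

```python
# Hypothetical project records tagged with Knowledge Area (KA) codes.
# Project IDs, funding lines, and field names are illustrative only.
projects = [
    {"id": "P-001", "ka_code": 711, "funding_line": "NRI"},
    {"id": "P-002", "ka_code": 712, "funding_line": "Hatch"},
    {"id": "P-003", "ka_code": 512, "funding_line": "Smith-Lever"},
]

# A portfolio is defined by the set of KA codes it covers, regardless
# of which funding line or program unit a project belongs to.
FOOD_SAFETY_KAS = {711, 712}

food_safety_portfolio = [p for p in projects if p["ka_code"] in FOOD_SAFETY_KAS]
print([p["id"] for p in food_safety_portfolio])  # ['P-001', 'P-002']
```

The point of the sketch is that the two selected projects sit in different funding lines; the KA codes, not the organizational structure, define the portfolio.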

12 PREP Unique Features
Expert panelists are asked to systematically assess distinct dimensions of the 3 OMB R&D criteria (Relevance, Quality, and Performance).
The scoring process is standardized across portfolios, transparent, & scientifically based.
Therefore, PREP can provide a quantitative performance assessment of portfolios of research work.

13 PREP Process
1. Identify/Select Expert Panels
2. Develop Self-Study Report
3. Compile Evidentiary Materials
4. Self Score Prior to Panel Meeting
5. Convene Expert Panels

14 PREP Process: 1. Identify/Select Expert Panels
Selection of high-level panelists with broad experience in the topic area, after careful review for the absence of conflicts of interest. Panel members included:
University Vice-Presidents
Deans and Associate Deans
Industry Experts (Company Vice Presidents, etc.)
Evaluation Experts
Experts from other federal agencies

15 PREP Process: 2. Develop Self-Study Reports
National Program Leaders (NPLs) develop portfolio self-study reports. The reports include the following:
Section I: Agency and PREP Overview
Section II: Portfolio Description, including Logic Models and Graphics (e.g., honeycombs)
Section III: Knowledge Area Descriptions
Section IV: Discussion of How the Portfolio Meets the R&D Criteria and Their Dimensions

16 PREP Process: 3. Compile Evidentiary Materials
Track papers, citations, patents, products, educational efforts, and adoption of products/practices.
Identify and present evaluation studies and special analyses conducted in programs covered by the portfolio.
Present budget tables to show portfolio priorities and emphases.

17 PREP Process: 4. Self Score Portfolio
NPLs score their self-study reports using an instrument developed in-house and based on the OMB R&D criteria of relevance, quality, and performance. These self scores are later compared to the scores the expert panels assign to the portfolios.

18 PREP Process: 5. Convene Expert Panels
Panelists meet for 2½ days in Washington, DC.
Day 1: orientation and short briefings by program managers and NPLs, along with Q&A.
Day 2: further review of documentation, discussion, deliberation, and recommendations.
Day 3: completion of a draft report, containing the score to be submitted for PART & BPI and recommendations for portfolio improvement; debriefing by the panel to CSREES.

19 PREP Process: Panel Scoring Sheet
The expert panel scores each dimension of each of the three R&D criteria, using customized anchors on a 3-point scale:
3 = Exceeds expectations
2 = Meets expectations
1 = Needs improvement

20 Panel Scoring Sheet: OMB R&D Criteria & Dimensions
Relevance:
1. Scope
2. Focus on critical needs
3. Identification of emerging issues
4. Integration of CSREES programs
5. Interdisciplinary integration

21 Panel Scoring Sheet: OMB R&D Criteria & Dimensions
Quality:
1. Significance of findings & outputs
2. Stakeholder assessment
3. Alignment of portfolio with current science
4. Methodological rigor

22 Panel Scoring Sheet: OMB R&D Criteria & Dimensions
Performance:
1. Portfolio productivity
2. Portfolio completeness
3. Portfolio timeliness
4. Agency guidance relevant to portfolio
5. Portfolio accountability

23 Panel Scoring Sheet Example: Relevance
Section 1: Relevance (40% of the total score)
1.1 Scope – coverage of the work of the full portfolio (weight: 40%)
Purpose: define & summarize needed & existing portfolio topics.
3 = Fully demonstrates exceptional depth; 2 = Portfolio coverage is static in depth; 1 = Portfolio is falling behind.
1.2 Portfolio’s ability to remain focused (weight: 20%)
Purpose: clarify & examine whether the portfolio focuses on critical needs.
3 = Fully focused; 2 = Adequately focused; 1 = Needs improvement.
1.3 Identification of emerging issues (weight: 20%)
Purpose: identify important new issues consistent with the portfolio mission.
3 = Contemporary & emerging issues identified; 2 = Missing some emerging issues; 1 = Needs coverage of important issues.
1.4 Integration of agency programs for the portfolio (weight: 10%)
Purpose: demonstrate functional integration.
3 = REE fully integrated; 2 = Partially integrated; 1 = Insufficiently integrated.
1.5 Multi-disciplinary balance (weight: 10%)
Purpose: demonstrate disciplinary and scientific balance…
3 = Extensive balance among relevant disciplines; 2 = Partial balance; 1 = Little balance.
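The weights on the sheet imply a simple weighted average: each dimension’s 1-3 rating is multiplied by its weight and summed. A minimal sketch of that roll-up follows; the function and dimension labels are illustrative, not the actual PREP instrument:

```python
# Dimension weights from the relevance scoring sheet above.
RELEVANCE_WEIGHTS = {
    "1.1 Scope": 0.40,
    "1.2 Focus on critical needs": 0.20,
    "1.3 Emerging issues": 0.20,
    "1.4 Integration of agency programs": 0.10,
    "1.5 Multi-disciplinary balance": 0.10,
}

def relevance_score(ratings: dict) -> float:
    """Weighted average of 1-3 panel ratings (illustrative roll-up)."""
    assert all(1 <= r <= 3 for r in ratings.values()), "ratings are 1-3"
    return sum(RELEVANCE_WEIGHTS[dim] * r for dim, r in ratings.items())

# Example: a portfolio rated 3 on scope and 2 on everything else scores
# 0.40*3 + (0.20 + 0.20 + 0.10 + 0.10)*2 = 2.4 on relevance.
print(relevance_score({
    "1.1 Scope": 3,
    "1.2 Focus on critical needs": 2,
    "1.3 Emerging issues": 2,
    "1.4 Integration of agency programs": 2,
    "1.5 Multi-disciplinary balance": 2,
}))  # 2.4
```

Since the sheet assigns relevance 40% of the total score, the same pattern would presumably repeat for the quality and performance criteria with their own dimension weights.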

24 Interim Annual Internal Review
Update the self-review document.
Consider recommendations from the review panels and describe agency and portfolio responses; for some portfolios, develop strategic plans.
Used as interim preparation for the next external review in the fifth year.

25 P&A Experience Working with Program Managers & NPLs
Developing self-study reports requires systematic collection and analysis of program data.
Requires open communication between P&A and program units.
Requires close collaboration between P&A staff and NPLs.
All program units need to be encouraged to develop strategic plans based on panel recommendations.
Given the lack of readily available data, the process is extremely demanding.

26 For more information, please call Henry M. Doan, Ph.D., at 202-401-0791, or e-mail hdoan@csrees.usda.gov.

