SMART: Developing Effective Goals and Objectives Presented by Barry Nagle Evaluation and Action Research Associates (EARA) Fairfax, VA March 2009.

2 Agenda
Part A
– Goal/Objective Definition
– How to be SMART: review of the component terms
– SMART Tool: a table to facilitate SMART objective development
– SMART Benefits/Costs
Part B
– SMART and the PART: how SMART can facilitate NASA's efforts to effectively respond to OMB's Performance Assessment Rating Tool

3 Goal/Objective Definition

4 Goals/Objectives
The most important element of a successful program is the development of attainable goals and measurable objectives:
– Guides program planning and design
– Communicates to stakeholders
– Enables evaluation
Success is dependent upon realistic goals

5 Goals: Characteristics
– Describe the overall purpose of the program
– Describe broad outcomes and concepts (what we want to accomplish)
– Are expressed in general terms

6 Goals: Development Steps
– Research the topic (define needs)
– Involve stakeholders (gain commitment)
– Brainstorm goals
– Select the goals that have priority (decide on what matters)
– Limit the program to two to five goals (select realistic goals)

7 Goals: Samples
– The program will inspire and motivate students to pursue careers in Science, Technology, Engineering, and Mathematics
– The program will positively impact the gender diversity of the STEM workforce
– The program will increase the capacity of minority institutions in STEM research

8 Objectives
– Specifically state how the goals will be achieved
– Are measurable: define what you want to see
– Encourage a consistent focus on program functions

9 Objectives Are Not… Tasks
Conducting a training session is a task.
– Poor objective: We will conduct a training session.
An effective objective is something the program can fail at. An effective objective defines intent.
– Better objective: Faculty who attend the training session will be able to identify at least three NASA grant programs that align with their research interests.

10 How to be SMART

11 SMART Objectives
– Specific: Be precise about what you are going to achieve
– Measurable: Quantify the objectives
– Appropriate: Align with the needs of the target audience
– Realistic: Do you have the resources to make the objective happen?
– Time-Specific: State when you will achieve the objective

12 SMART: Specific Objectives
Specific: Be precise about what you are going to achieve
– Specify the target
– Specify the intended outcome
– One outcome per objective
– Avoid vague verbs (e.g., know, understand)
– Make sure the objective is linked to the goal
– Sample: By January 2010, at least 3% of the engineering majors at the institution will be female

13 SMART: Measurable Objectives
Measurable: Quantify the objectives
– Use measures as indicators of program success
– If possible, establish a baseline (e.g., in January 2009, 2% of the engineering majors at the institution were female)
– Sample: By January 2010, at least 3% of the engineering majors at the institution will be female

14 SMART: Appropriate Objectives
Appropriate: Align with the needs of the target audience
– Meeting the objective will advance the goal
– Identify a specific target audience
– Be inclusive of diversity within your group
– Sample: By January 2010, at least 3% of the engineering majors at the institution will be female
– Note: The "A" is sometimes called "Attainable" or "Achievable" in the literature.

15 SMART: Realistic Objectives
Realistic: Do you have the resources to make the objective happen?
– Realistic objectives are important to stakeholders
– Are adequately resourced
– Can be achieved
– Sample: By January 2010, at least 3% of the engineering majors at the institution will be female
Take care with what you say you can do! The January 2009 baseline was 2%. Is a one-percentage-point increase in one year realistic?

16 SMART: Time-Specific Objectives
Time-Specific: State when you will achieve the objective
– Provide a timeframe indicating when the objective will be met
– Sample: By January 2010, at least 3% of the engineering majors at the institution will be female

17 Goals and Objectives
Goal → Objective One / Objective Two / Objective Three
Maintain a clear connection between your goals and objectives. By maintaining this connection, you are articulating your theory of goal attainment.

18 SMART Tool

19 SMART Tool
Goal: The engineering department will positively impact the gender diversity of the engineering workforce

Objective: By January 2010, at least 3% of the engineering majors at the institution will be female
Breakdown:
– Verb: be
– Metric: Percentage
– Population: Institution engineering majors
– Object: Females selecting engineering major
– Baseline Measure: 2%
– Goal Measure: 3%
– Timeframe: January 2010

Objective: On an annual basis, at least 5% of the students that apply to the program will be female
Breakdown:
– Verb: apply
– Metric: Percentage
– Population: Institution engineering major applicants
– Object: Female applicants selecting engineering major
– Baseline Measure: --
– Goal Measure: 5%
– Timeframe: Annually
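The breakdown columns of the SMART tool can be sketched as a small data structure with a completeness check. This is a hypothetical Python illustration, not part of the workshop materials; the class name, field names, and the `is_complete` helper are invented for this sketch.

```python
from dataclasses import dataclass, fields

@dataclass
class SmartObjective:
    """One row of the SMART tool breakdown (column names from the slide)."""
    statement: str
    verb: str
    metric: str
    population: str
    object: str
    baseline_measure: str  # "--" when no baseline exists
    goal_measure: str
    timeframe: str

    def is_complete(self) -> bool:
        # A SMART objective should have every breakdown cell filled in.
        return all(getattr(self, f.name).strip() for f in fields(self))

obj = SmartObjective(
    statement="By January 2010, at least 3% of the engineering majors "
              "at the institution will be female",
    verb="be",
    metric="Percentage",
    population="Institution engineering majors",
    object="Females selecting engineering major",
    baseline_measure="2%",
    goal_measure="3%",
    timeframe="January 2010",
)
```

Filling out such a structure forces each element of the objective to be stated explicitly, which is the point of the tool.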

20 SMART Benefits and Costs

21 Benefits
– Facilitates communication with program stakeholders
– Informs on what data should be collected
– Enables effective program management
– Enables government funders to better fulfill PART requirements
– Facilitates the linkage of activities and intended effects/goals
– Enables a focus on evaluation: process level (activities), output level, outcome level
– Facilitates replication

22 Costs and Limitations
– Impression that creativity is limited
– Time-consuming
– GIGO (garbage in, garbage out)
– Encourages too great a focus on discrete measures

23 Comment on Metrics
A well-written objective suggests the metric(s).
Example:
– On an annual basis, at least 5% of the students that apply to the program will be female
Metrics:
– Total applications to the department
– Percentage of applications from females
While this may appear obvious, this is an area where programs often fail.
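The two metrics implied by that objective amount to a count and a ratio over application records. A minimal sketch, assuming invented application data (the record fields and values are hypothetical):

```python
# Hypothetical application records; the field names and data are invented
# to illustrate the two metrics suggested by the objective.
applications = [
    {"id": 1, "gender": "female"},
    {"id": 2, "gender": "male"},
    {"id": 3, "gender": "male"},
    {"id": 4, "gender": "female"},
    {"id": 5, "gender": "male"},
]

# Metric 1: total applications to the department
total_applications = len(applications)

# Metric 2: percentage of applications from females
female_applications = sum(1 for a in applications if a["gender"] == "female")
pct_female = 100.0 * female_applications / total_applications

# Compare the observed percentage against the objective's 5% target
meets_objective = pct_female >= 5.0
```

The program fails at this step when nobody is assigned to collect the underlying records; the computation itself is trivial once the data exist.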

24 SMART and the PART

25 Definition and Information
PART: Performance Assessment Rating Tool
Operationalizes the Government Performance and Results Act (GPRA) of 1993
– GPRA was passed under the Clinton administration
– The PART was developed under the Bush administration

26 Definition and Information
Considers four areas:
– Program purpose and design
– Strategic planning
– Program management
– Program results and accountability
Evidence-based assessment that can be accessed by the public
Consistent approach for evaluating programs

27 Definition and Information
Program Ratings:
– Effective
– Moderately Effective
– Adequate
– Ineffective
– Results Not Demonstrated

28 Effective Program Rating Sample
The full reports contain more detail. These are available on the OMB Internet site.

29 Ineffective Program Rating Sample
The full reports contain more detail. These are available on the OMB Internet site.

30 Key OMB Terms
Outputs: The internal activities of the program (the products and services delivered)
– These are the "whats" of the program
– These are the SMART objectives
Outcomes: The events or conditions external to the program and of direct importance to the public/beneficiary
– These are the "so whats" of the program
– These are the goals

31 Key OMB Terms
Efficiency Measures: Demonstrate the ability of the program to implement activities, achieve results, and make the best use of resources. Usually expressed as a ratio of inputs to outputs/outcomes.
– This is an economic concept
– It does not mean you are doing something as cheaply as possible; rather, the term indicates an effective use of resources
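As a ratio of inputs to outputs, an efficiency measure is a single division. A minimal sketch with invented figures (the budget and student counts are assumptions, not from the workshop):

```python
# Hypothetical efficiency measure: cost per student trained.
# Both figures below are invented for illustration.
budget_dollars = 50_000   # input: annual program budget (assumed)
students_trained = 200    # output: students trained that year (assumed)

# Ratio of inputs to outputs: dollars per student trained.
# A falling value over time suggests more effective use of resources,
# not merely "doing it cheaply".
cost_per_student = budget_dollars / students_trained
```

Tracking the same ratio year over year is what makes it a measure rather than a one-off accounting figure.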

32 SMART-PART Association
Program purpose and design: Assesses whether the program's purpose and design are clear and sound
– In total, SMART objectives that are aligned with the program goals indicate a clarity of purpose and soundness of design

33 SMART-PART Association
Strategic planning: Assesses whether a program has valid long-term and annual measures and targets
– SMART objectives define the metrics that will be collected and the intended timeframes

34 SMART-PART Association
Program management: Assesses program management, including financial management and accountability
– A program that maintains alignment between its budget, activities, and objectives can demonstrate its effective program management

35 SMART-PART Association
Program results and accountability: Assesses program effectiveness and reported progress on measures
– Collecting metrics related to the SMART objectives will enable a program to report on its effectiveness

36 Responsibility
The PART is the responsibility of the agency when reporting to the Office of Management and Budget (OMB)
This is important because the sponsoring federal agency is a critical program stakeholder

37 Questions?
Contact:
Barry Nagle
Director, Center for Assessment, Planning, and Accountability
United Negro College Fund Special Programs Corporation
2750 Prosperity Avenue, Suite 600
Fairfax, VA

Barry Nagle
President, Evaluation and Action Research Associates (EARA)
2813 Lee Oaks Court #204
Falls Church, VA