

PATHFINDER Mission: Results for New Zealanders

Agenda for WG4

1. Introduction (Chair) - includes short website update (Greg)
2. WG / WS Process (Chair)
   - building effective relationships
   - ideas for improvements (human & output)
   - feedback on Q&A sessions
3. New WG Products (Roger / Greg)
   - early steps of the 10-step performance cycle (StW 1-3)
   - Intervention Logic (one-off presentation)
   - Building Block / Learning Point documents
4. Other matters (Chair)

Focus on Outcomes

Pathfinder 'Building Blocks'

1. Define & measure 'mission critical' outcomes (using 'state' or 'situation' outcome indicators)
2. Map the causal logic linking outcomes to outputs
3. Assess the impact of interventions / targeting
4. Assess cost-effectiveness (vs. outcomes achieved)
5. Manage to maximise outcomes (e.g. core/pilots)
6. Benchmark with outcomes (business units / nations)
7. Focus strategic / annual plans on improving outcomes
8. Redesign planning & operations to maximise outcomes (incl. feedback & continuous improvement)
9. Improve outcomes across agencies / sectors

Steps are not sequential - if feasible, do steps 6-9 early.

Observation vs. Action

Outcome type                 Measure                   Applications
Status quo outcome           client / service groups   Situation report; benchmarking
Risk, need & likely change   individuals / cases       Crude targeting; priority setting; demand analysis; risk management
Improvement in outcome       'treated' groups          Continue? Modify? Grow? Kill?

Improving Results - 10 Step Performance Cycle

1. Identify Core Outcomes
2. Define in Measurable Terms
3. Measure as State Indicators
4. Identify Areas for Change in State
5. Define / Refine Intervention Logic
6. Identify Intervention Options
7. Prioritise Intervention Options
8. Design & Deliver Interventions
9. Impact (Outcome) Evaluation Framework
10. Cost-Effectiveness Analysis
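As an illustrative sketch only (not part of the Pathfinder material), the cycle can be encoded as an ordered checklist a team could use to track where each outcome area sits; the step names come from the slide above, while the function and its behaviour are assumptions.

```python
# The 10 steps of the performance cycle, in order (names from the slide).
PERFORMANCE_CYCLE = [
    "Identify Core Outcomes",
    "Define in Measurable Terms",
    "Measure as State Indicators",
    "Identify Areas for Change in State",
    "Define / Refine Intervention Logic",
    "Identify Intervention Options",
    "Prioritise Intervention Options",
    "Design & Deliver Interventions",
    "Impact (Outcome) Evaluation Framework",
    "Cost-Effectiveness Analysis",
]

def next_step(completed_steps):
    """Return the next step given how many are done, wrapping around so
    evaluation results feed back into identifying core outcomes."""
    return PERFORMANCE_CYCLE[completed_steps % len(PERFORMANCE_CYCLE)]
```

The wrap-around reflects that this is a cycle, not a one-off sequence: after cost-effectiveness analysis, the results inform the next round of outcome identification.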

Intervention Logic

1. Aligning activities / outputs to outcomes
2. Defining a robust Intervention Logic (IL)
3. Using IL 'backbones' as a top-down (strategic) tool to scope intervention options
4. Testing that the intervention logic is robust and produces the best approach (ex ante, ex post)
5. Strengthening intervention logic using evidence-based approaches (R&D, evaluation)
6. Continuous evolution and testing of intervention logic / frameworks

Decision: Is there sufficient interest to arrange a half-day session?
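A minimal sketch of what an IL 'backbone' might look like as data: a top-down map from a core outcome through intermediate outcomes to outputs, with a simple check that outputs are aligned to the outcome. The outcomes and outputs shown are hypothetical examples, not taken from the Pathfinder material.

```python
# Hypothetical intervention-logic backbone:
# core outcome -> intermediate outcomes -> outputs.
BACKBONE = {
    "Reduced long-term unemployment": {
        "More job-ready clients": [
            "Training courses delivered",
            "Case-managed job placements",
        ],
        "Better employer engagement": [
            "Employer subsidy schemes administered",
        ],
    },
}

def outputs_for(core_outcome):
    """All outputs the logic links (via intermediate outcomes) to a core outcome."""
    return [output
            for outputs in BACKBONE[core_outcome].values()
            for output in outputs]

def unlinked_outputs(all_outputs, core_outcome):
    """Outputs not yet aligned to the core outcome - a check in the spirit of
    aligning activities / outputs to outcomes (item 1 above)."""
    linked = set(outputs_for(core_outcome))
    return [o for o in all_outputs if o not in linked]
```

An agency could run `unlinked_outputs` over its full output list to surface activities that the current logic does not connect to any core outcome.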

Documenting Learning Points

1. What's best for NZ as generic guidance
2. Want strong support for the documents (-> SG)
3. (Recognise there is no 'one-size-fits-all')
4. Seek consensus on any disputed points
5. Initial comments via
6. 'Living document' for discussion at WG
7. Send to SG (discussion / approval / -> web site)

Decision: Approval of principles / generic process.
First Principle: keep it short(ish), keep it pithy.

Documenting Learning (Decision)

Decision: Approval of the format below as a starting point for writing.
History: WG2 agreed to focus on StW 1-3; no feedback on format.
Proposal: 'Trial by fire' of the following format ...

1. Executive summary
2. Purpose statement
3. Generic framework (10-step process)
4. Attributes of good outcome frameworks
5. Methods / steps / issues
6. Management and ownership
7. Best practice / good practice (integrated)
8. Worked examples (refer to supplementary docs?)

Documenting Learning (Contd.)

Attributes of good outcome frameworks, e.g.:

1. Keep it tight (one or few outcomes, clear priorities for activities)
2. Researched (outcomes derived from institutional knowledge and, where feasible, comparison with 'similar / like' agencies)
3. Clear causal links (outcome-output-input links clear; other means of achieving the outcome(s) visible; clear fit to your services / clients; covers significant activities of your / relevant agencies)
4. Measures (as objective / quantitative as possible; non-measurables identified; built on robust data; confidence intervals known; directly attributable to agency activities; can 'drill down' for underlying trends; 'impact' and 'state' outcomes; decision-making / quality outcomes)
5. Clear fit with prioritisation / management decisions (e.g. measures inform key management decisions, fit resource information)
6. Clear fit with vision / purpose (e.g. strategy / legislation)
7. Known relation to the outcomes of other agencies

Documenting Learning (Contd.)

Generic processes, e.g. Building Block #1:

1. Identify Core Outcome(s) (e.g. from mission / purpose identify key outcomes; compare to the outcomes / missions used by similar agencies overseas; ensure a good fit with the results you expect from the services you deliver)
2. Define in Measurable Terms (translate the verbal statement of the outcome into a precise definition that matches data available at reasonable cost; identify the key demographic / service / etc. groups for which outcomes are required)
3. Measure State / Environment Indicators (document the system; gather / process data; measure outcomes for all groups with confidence intervals)
4. Identify Desired Change in State (analyse state indicators for statistically significant 'highs and lows'; identify improvements sought; identify where results are good / some resources could be shifted)
5. Build Results into Business (e.g. report internally and externally; input into the strategic plan, SoI, broad priority setting, KPIs, benchmarking, etc.)
6. Confirm Fit with the Outcome Management Framework (e.g. how can results be used to benchmark, strategise, prioritise, etc.; how do measures have to change or be analysed to perform different functions)
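Steps 3 and 4 above are quantitative: measure the outcome for each group with a confidence interval, then flag statistically significant highs and lows. A minimal sketch of one way to do this for a proportion-type outcome indicator, using a normal-approximation interval; the group names and counts are hypothetical, not from the Pathfinder material.

```python
# Sketch of Building Block #1, steps 3-4: measure an outcome rate per group
# with a 95% confidence interval, then flag groups whose rate differs
# significantly from the overall rate. Data below is illustrative only.
import math

def proportion_ci(successes, n, z=1.96):
    """95% normal-approximation confidence interval for a proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# hypothetical counts: (group, clients achieving the outcome, group size)
groups = [("Group A", 180, 300), ("Group B", 150, 300), ("Group C", 240, 300)]

total_ok = sum(s for _, s, _ in groups)
total_n = sum(n for _, _, n in groups)
overall = total_ok / total_n

for name, s, n in groups:
    p, lo, hi = proportion_ci(s, n)
    # a group is a significant 'high' or 'low' if the overall rate
    # falls outside that group's confidence interval
    flag = "high" if lo > overall else "low" if hi < overall else "-"
    print(f"{name}: {p:.2f} [{lo:.2f}, {hi:.2f}] {flag}")
```

The 'high' groups are candidates for shifting resources away (step 4: "where results are good"); the 'low' groups are candidates for improvement effort. In practice a Wilson or exact interval would be preferable for small groups.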

