1 A RESULTS AND DELIVERY CULTURE
Brian Pagan, Assistant Secretary, Expenditure Management Sector, Treasury Board of Canada Secretariat
Twentieth Annual PPX Symposium, 2016


2 What are we talking about and why?
 The Government is committed to evidence-based decision making
 The Treasury Board President's Mandate Letter underscores the importance of government performance, experimentation, innovation, and parliamentary reporting
 The Prime Minister has signalled the importance of achieving results by establishing the Cabinet Committee on Agenda, Results and Communications (ARC) and its supporting Results and Delivery Unit (RDU) at PCO
 In the PCO Report on Plans and Priorities, the Prime Minister also highlighted the need to develop new, simplified forms of reporting to Parliamentarians and Canadians

3 The Results Universe
Departments…
 Manage themselves internally
 Create Program Alignment Architectures (PAA)
o The PAA does not align with organizational structure
o Programs contribute to a unique strategic outcome
o Programs must hierarchically contribute to each other
 Report performance to Parliament and Canadians
o Departments report performance measurement and financials to TBS
o Reporting is based on their PAA
o Reports are made in Departmental Performance Reports and Reports on Plans and Priorities
 Propose new spending
o To create a new program, departments submit a Memorandum to Cabinet and then a Treasury Board Submission to obtain policy and spending approval, respectively
 Evaluate their results
o Departments evaluate all direct government spending and make evaluations public
o Evaluations address a fixed set of five core issues

4 Learning from the past
Performance Measurement
 PRAS (1996): focused largely on what departments did (i.e., business lines) and was accordingly criticized for not focusing on results or establishing a link between what was being spent and what departments were seeking to achieve
 MRRS (2005): focused on results but introduced a program alignment architecture that was insufficiently focused on ensuring high-quality indicators, giving rise to an isolated planning regime geared to satisfying reporting requirements and divorced from how departments were actually run
Evaluation
 Coverage: from full (1977), to risk-based (2001), and back to full (2009). Full coverage was criticized for limiting the resources that could be assigned to priority areas; risk-based coverage was criticized for missing some programs that should have been evaluated

5 We Can Do Better
WHERE WE ARE
 We do a lot of reporting that is not widely read
 Weak performance information and unclear contribution to government priorities
 Departments organize their programs in ways that don't always focus on the results that matter to Canadians
 Evaluations stretched thin to achieve comprehensive coverage every five years, limiting resources to cover riskier programming
WHERE WE NEED TO GO
 Telling Canadians a clear and compelling story of the difference departments are making in their lives
 Working collaboratively to help departments improve their results focus
 Adopting simpler and more flexible structures focused on more meaningful outcomes and indicators
 Providing ministers with a clear understanding of what the department is focused on and how it's doing

6 What role does TBS play? COLLABORATION AND COOPERATION
The Treasury Board of Canada Secretariat serves as the government's management board, expenditure manager, employer, and regulatory oversight body. As such, it has a number of levers that affect how departments track and deliver results, including:
o Performance measurement
o Evaluation
o Treasury Board Submissions
o Parliamentary reporting
To bring about change, TBS will need to take a leadership role in collaborating with and supporting departments, serving as a centre of expertise on results and delivery.

7 An Integrated Approach: COORDINATED AND COMPATIBLE
A RESULTS CULTURE TO PERMEATE THE WHOLE OF GOVERNMENT
Agenda, Results and Communications Committee (focused but deep)
 Identify and track top priorities
 High-level support and attention to monitor implementation, clear roadblocks, and ensure successful delivery
Treasury Board (broad but lighter-touch)
 Authorize new departmental expenditures through TB Submissions
 Ensure transparent and clear public reporting and an ethos of delivery across government

8 A Vision: TO IMPROVE THE GOVERNMENT'S RESULTS CULTURE
 REPORT: Replace departmental PAAs to focus reporting on what matters to Canadians, and provide a clear and simple way of communicating what departments do, what they are trying to achieve, and how they will assess success
 MEASURE: Implement clear responsibility and accountability for performance measurement, and encourage centres of expertise to drive high-quality measurement and indicators
 EVALUATE: Improve evaluations to allow for more flexibility in evaluation planning and to strengthen the impact of evaluation on learning, delivery, and results

9 Benefits
 Departments tell Parliament and Canadians a clear results story about what they are trying to achieve, how they will achieve it, and how they will assess success
 Results are achieved across the board
 Performance measurement and evaluation are strengthened and provide the evidence needed to support the delivery of results
 Performance measurement and evaluation can be used to inform decisions as well as program and policy learning and improvement
 Departments are transparent about the difference they are making for Canadians

10 Considerations
 Data will need to be carefully managed to allow for trend analysis
 Changes will need to accommodate the wide diversity of departments in size, focus, and operations
 Timelines for any changes will need to be long enough to allow departments to manage the change
 Driving the change will require capacity for performance measurement and evaluation
 Achieving the benefits will require engagement and close collaboration between TBS, PCO, and departments

11 Managing the Change
Changing cultures requires ongoing involvement over time and a mix of requirements and support that drives the change from all angles.
Changing the incentives departments face
 Modifying the policy and other requirements departments must comply with
 Creating new delivery and reporting routines
 Allowing flexibility, while encouraging departments to find their best option
Systems
 Supporting IT system changes in departments and TBS
 Enabling direct data collection from departments
 Enhancing InfoBase to allow for transparent and accessible information for Parliament and Canadians
Collaboration
 Providing training, both directly and through the Canada School of Public Service, to program managers, evaluators, and strategic planners on how to plan for, measure, and deliver results
 Working with departments in supporting and implementing change
 Providing 'hands-on' expert advice

12 Gaps? Risks? Concerns? QUESTIONS?

13 Annex

14 Structure
Department → Responsibility → Program
 Responsibility (the "what"): Responsibilities stem from a department's mandate and should be relatively consistent over time
 Results & Indicators (the "why"): Departments have results they are seeking to influence in carrying out each responsibility, and indicators to measure the degree to which results are being realized. These reflect departmental mandates and priorities, government priorities, horizontal initiatives, and mandate letter commitments
 Program (the "how"): The programs by which the department delivers are the basis for collecting performance information
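As a purely illustrative sketch (the class names, fields, and example data below are invented for illustration and are not an official TBS schema), the Department → Responsibility → Program hierarchy on this slide, with results and indicators attached to each responsibility, could be modelled like this:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Indicator:
    """Measures the degree to which a result is being realized."""
    name: str
    target: float
    actual: Optional[float] = None  # filled in as performance data is collected

@dataclass
class Result:
    """An outcome a department seeks to influence under a responsibility (the 'why')."""
    description: str
    indicators: List[Indicator] = field(default_factory=list)

@dataclass
class Program:
    """The 'how': programs are the basis for collecting performance information."""
    name: str

@dataclass
class Responsibility:
    """The 'what': stems from the mandate; relatively consistent over time."""
    name: str
    results: List[Result] = field(default_factory=list)
    programs: List[Program] = field(default_factory=list)

@dataclass
class Department:
    name: str
    responsibilities: List[Responsibility] = field(default_factory=list)

# Hypothetical example data, invented for illustration only:
dept = Department(
    name="Example Department",
    responsibilities=[
        Responsibility(
            name="Service Delivery",
            results=[Result(
                description="Canadians receive timely services",
                indicators=[Indicator("% of requests met within standard", target=90.0)],
            )],
            programs=[Program("Front-line Services")],
        )
    ],
)
```

The point of the sketch is the containment relationship: performance data is collected at the program level, while results and indicators hang off each responsibility, mirroring the What/How/Why labels on the slide.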

15 Vision for performance measurement
 Track results across government: ensure that, across government, results and performance are being tracked and data collected whenever possible
 Use the best of what we have, but improve weak spots: encourage departments to continue to use their high-quality performance indicators, but develop new ones to replace those that are weak
 High-quality indicators that are valid and reliable: develop centres of expertise for performance measurement in departments and TBS to drive high-quality performance measurement
 Indicators give Canadians a clear idea of the difference departments are making: strengthen high-level indicators so they speak to the issues Canadians care about; increase flexibility for other indicators to best serve management needs
 Clear accountability for performance measurement: establish clear accountability for performance measurement in departments, centralized in a single point
 Expertise for performance measurement: develop the community of performance measurement experts; consider training and competency requirement opportunities

16 Vision for evaluation
 Risk- and priority-based coverage: promote that all programs be evaluated periodically, but allow flexibility; introduce exemptions to FAA requirements; fill information gaps with centrally led evaluations
 Alternative evaluation reporting: allow for dissemination of evaluation summaries rather than full reports; encourage use of innovative data visualization tools and products
 Expanded evaluation toolbox: encourage use of a wider range of evaluation types (e.g., designed for the program life-cycle; tailored to user needs); promote more focused and rigorous analysis
 Flexibility with core issues and timing: core issues no longer mandatory; timing of evaluation based on needs
 Governance committee: strengthen integrated oversight of the quality, utility, and use of performance measurement and evaluation information
 Expertise for evaluation: enhance the training curriculum; set clear competency requirements; provide opportunities to earn certification and professional designations

17 Vision for reporting
 Focus Parliamentary reporting on what matters to Canadians: enhance Canadians' understanding of what departments seek to achieve, what they do achieve, and the resources used to do so
 More flexibility: provide departments with the flexibility to articulate an architecture that is more reflective of how their programs are managed
 Ownership of key results at a high level within departments: ministers and deputy heads are involved in the process of developing the results their departments will seek to achieve
 Departmental results clearly linked to government priorities: what departments do and are trying to achieve is clearly linked to government priorities, mandate letter commitments, and horizontal initiatives
 Decisions taken based on results: increase the use of results information in MCs, TB Submissions, and other instruments, and in allocating resources
 Central agency requirements and tools aligned: coordinate among central agencies to minimize duplication of requirements and asks