
1 Facilitators: Lorna McCue and Pam Kinzie, HC Link
Guest Presenter: Andrew Taylor, Taylor Newbury Consulting
Results-Based Accountability for community organizations and networks
Session 2: Performance Accountability for Programs, Agencies and Service Systems

2 Web/Teleconference Tips
- Synchronize your voice with the web connection

3 Web/Teleconference Tips
- Mute/un-mute: *1
- Please avoid putting your line ON HOLD

4 Web/Teleconference Tips
- Private and group chat

5 Web/Teleconference Tips
- Size of screen
- If the whole page is not displayed, an image of the page will appear for navigation
- Disconnected? See instructions
- Screen saver or power saver

6 Contact Us
Toll-free:
Fax:

7 Introductions: Who's Online?
Facilitators: Pam Kinzie, HC Link; Lorna McCue, HC Link; Andrew Taylor, Taylor Newbury Consulting
Please indicate your: Name, Organization

8 Session 2 Agenda
1. Welcome and Introductions
2. Learning Objectives & Agenda Review
3. Recap from Session 1
4. Performance Measures – 4 Quadrants
   - Choosing Headline Measures
   - Comparing Performance
   - Turn the Curve Report: Performance
   - Performance accountability questions
5. Performance Measures – Real-life Examples
6. "Homework" Assignment
7. Q&A
8. Wrap-Up

9 Learning Objectives
After participating in this webinar you will be able to:
- Define performance accountability
- Describe performance measures in each of the 4 quadrants
- Identify how performance accountability may be useful to your organization
- Take the next steps to find out more about RBA

10 Results Accountability is made up of two parts:
Performance Accountability: about the well-being of CLIENT POPULATIONS, for programs, agencies and service systems
Population Accountability: about the well-being of WHOLE POPULATIONS, for communities, cities, counties, states and nations

11 “All performance measures that have ever existed for any program in the history of the universe involve answering two sets of interlocking questions.”

12 Performance Measures
Quantity: How much did we do? (#)
Quality: How well did we do it? (%)

13 Performance Measures
Effort: How hard did we try?
Effect: Is anyone better off?

14 Performance Measures
The two pairs combine into a 2 x 2 grid: Effort / Effect crossed with How Much / How Well.

15 Performance Measures: Education
Effort / Quantity (How much did we do?): How much service did we deliver?
Effort / Quality (How well did we do it?): How well did we deliver it?
Effect / Quantity (Is anyone better off? #): How much change / effect did we produce?
Effect / Quality (Is anyone better off? %): What quality of change / effect did we produce?

16 Education
Effort / Quantity (How much did we do?): Number of students
Effort / Quality (How well did we do it?): Student-teacher ratio
Effect / Quantity (Is anyone better off? #): Number of high school graduates
Effect / Quality (Is anyone better off? %): Percent of high school graduates

17 Education
Effort / Quantity (How much did we do?): Number of students
Effort / Quality (How well did we do it?): Student-teacher ratio
Effect / Quantity (Is anyone better off? #): Number of 9th graders who graduate on time and enter college or employment after graduation
Effect / Quality (Is anyone better off? %): Percent of 9th graders who graduate on time and enter college or employment after graduation
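The quadrant logic in the education examples can be sketched in code. This is a hypothetical illustration, not part of the RBA materials: each measure carries two tags (effort/effect and quantity/quality), and the pair determines which quadrant question it answers.

```python
# Hypothetical sketch of the RBA 2x2: every performance measure is an
# effort-or-effect measure expressed as a count (#) or a percent (%).
QUESTIONS = {
    ("effort", "quantity"): "How much did we do?",
    ("effort", "quality"):  "How well did we do it?",
    ("effect", "quantity"): "Is anyone better off? (#)",
    ("effect", "quality"):  "Is anyone better off? (%)",
}

def quadrant(measure_name, axis, kind):
    """Return the RBA quadrant question a measure answers, given its two tags."""
    return f"{measure_name}: {QUESTIONS[(axis, kind)]}"

# The education examples from the slide:
examples = [
    quadrant("Number of students", "effort", "quantity"),
    quadrant("Student-teacher ratio", "effort", "quality"),
    quadrant("Number of on-time graduates", "effect", "quantity"),
    quadrant("Percent of on-time graduates", "effect", "quality"),
]
```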

18 All Data Have Two Incarnations: a Lay Definition and a Technical Definition
Lay definition: HS Graduation Rate
Possible technical definitions:
- % enrolled June 1 who graduate June 15
- % enrolled Sept 30 who graduate June 15
- % enrolled in 9th grade who graduate in 12th grade

19 RBA Categories Account for All Performance Measures (in the history of the universe)
Effort (quantity and quality): input, process, output, product, cost, efficiency, admin overhead, unit cost, staffing ratios, staff turnover, staff morale, access, waiting time, waiting lists, worker safety, TQM, customer satisfaction (quality of service delivery)
Effect (quantity and quality): client results or client outcomes, impact, effectiveness, value added, productivity, benefit value, cost / benefit ratio, return on investment, customer satisfaction (customer benefit)


21 The world's simplest complete customer satisfaction survey:
1. Did we treat you well?
2. Did we help you with your problems?

22 Not All Performance Measures Are Created Equal

23 Not All Performance Measures Are Created Equal
How much did we do? (Effort / Quantity): Least Important
How well did we do it? (Effort / Quality): Also Very Important
Is anyone better off? (Effect): Most Important

24 The Matter of Control
Most control: Effort measures (How much did we do? How well did we do it?)
Least control: Effect measures (Is anyone better off?); doing better here depends on PARTNERSHIPS

25 The Matter of Use
1. The first purpose of performance measurement is to improve performance.
2. Avoid the "performance measurement equals punishment" trap:
   - Create a healthy organizational environment.
   - Start small.
   - Build bottom-up and top-down simultaneously.

26 Comparing Performance
1. To Ourselves: Can we do better than our own history?
2. To Others: when it is a fair apples/apples comparison.
3. To Standards: when we know what good performance is.

27 Comparing Performance
1. To Ourselves First: Can we do better than our own history? Use a baseline: the CHART ON THE WALL.

28 Comparing Performance
2. To Others: only when it is a fair apples/apples comparison. Reward? Punish?

29 Comparing Performance
3. To Standards: when we know what good performance is.

30 The Matter of Standards
1. Quality-of-Effort standards are sometimes WELL ESTABLISHED: child care staffing ratios, application processing time, handicap accessibility, child abuse response time.
2. BUT Quality-of-Effect standards are almost always EXPERIMENTAL: hospital recovery rates, employment placement and retention rates, recidivism rates.
3. AND both require a LEVEL PLAYING FIELD and an ESTABLISHED RECORD of what good performance is.

31 Advanced Baseline Display
Display: your baseline, a comparison baseline, a goal (line), a target or standard.
Create targets only when they are FAIR & USEFUL.
Instead: count anything better than baseline as progress, and avoid publicly declaring targets by year if possible.
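The rule "count anything better than baseline as progress" can be sketched numerically. This is an illustrative assumption-laden sketch, not an RBA-prescribed method: the invented data and the simple straight-line forecast of history stand in for whatever baseline projection a program actually uses.

```python
def forecast_baseline(history, periods):
    """Extend the historical trend with a simple straight line
    (average change per period), as a no-action baseline forecast."""
    slope = (history[-1] - history[0]) / (len(history) - 1)
    return [history[-1] + slope * (i + 1) for i in range(periods)]

def turned_the_curve(history, actuals, higher_is_better=True):
    """Progress = doing better than our own forecast history,
    not hitting an externally declared target."""
    baseline = forecast_baseline(history, len(actuals))
    if higher_is_better:
        return all(a > b for a, b in zip(actuals, baseline))
    return all(a < b for a, b in zip(actuals, baseline))

# Illustrative data: a worsening measure (say, % dropouts) that flattens
# out after the program starts; lower is better here.
history = [10, 12, 14, 16]    # baseline trend heading the wrong way
actuals = [16.5, 16.5, 16.0]  # every value beats the forecast of 18, 20, 22
# turned_the_curve(history, actuals, higher_is_better=False) -> True
```

Note that the actuals are still worse than any past year; they count as progress only because they beat the curve the program was on.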

32 Separating the Wheat from the Chaff: Types of Measures Found in Each Quadrant
How much did we do? (#): clients/customers served; activities (by type of activity)
How well did we do it? (%): common measures, e.g. client-staff ratio, workload ratio, staff turnover rate, staff morale, % staff fully trained, % clients seen in their own language, worker safety, unit cost; activity-specific measures, e.g. % timely, % clients completing activity, % correct and complete, % meeting standard
Is anyone better off? (# and %): skills / knowledge (e.g. parenting skills); attitude / opinion (e.g. toward drugs); behavior (e.g. school attendance); circumstance (e.g. working, in stable housing)
Point in time vs. point-to-point improvement

33 Choosing Headline Measures and the Data Development Agenda
List candidate # and % measures in each quadrant. Select the most important as Headline measures (#1, #2, #3 Headline); important measures without good data yet go on the Data Development Agenda (#1, #2, #3 DDA).
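One way to read this slide: candidates are ranked by importance, and the split between Headline and Data Development Agenda comes down to whether good data exist today. A hypothetical sketch (the numeric importance scores, the candidate names, and the tuple format are assumptions for illustration, not part of RBA):

```python
def choose_headline_and_dda(candidates, n=3):
    """Split candidate measures into Headline (important, data in hand)
    and Data Development Agenda (important, data not yet available).
    Each candidate: (name, importance 1-10, has_good_data)."""
    ranked = sorted(candidates, key=lambda c: -c[1])
    headline = [c[0] for c in ranked if c[2]][:n]
    dda = [c[0] for c in ranked if not c[2]][:n]
    return headline, dda

# Invented candidates for a hypothetical service program:
candidates = [
    ("% clients better off at exit", 10, True),
    ("% clients better off 12 months later", 10, False),
    ("% services delivered on time", 7, True),
    ("Unit cost", 5, True),
    ("% clients in stable housing", 9, False),
]
headline, dda = choose_headline_and_dda(candidates)
# headline: top-ranked measures the program can report now
# dda: important measures the program still needs to develop data for
```

The point of the DDA is that a measure you cannot compute yet is not discarded; it becomes an explicit data-collection commitment.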

34

35 One-Page Turn the Curve Report: Performance
Program: _______________
Performance Measure (baseline)
Performance Measure (baseline)
Story behind the baseline
Partners
Three Best Ideas – What Works (including No-cost / low-cost and Off the Wall)

36 Performance Accountability for programs, agencies and service systems
1. Who are our customers?
2. How can we measure if we are delivering services well?
3. How can we measure if our customers are better off?
4. How are we doing on the most important of these measures?
5. Who are the partners that have a role to play in doing better?
6. What works, what could work, to do better?
7. What do we propose to do?

