
Barbara M. Fraumeni
Muskie School of Public Service, University of Southern Maine & the National Bureau of Economic Research, USA
IARIW, Session 3, Joensuu, Finland, August 22, 2006

A Framework for Quality Adjustment Across UK Public Services
By Jim Ebdon and Ogho Okiti, ONS, UK

Muskie School of Public Service Ph.D. Program in Public Policy

GOALS
 Establish a consistent framework across all UK public services
 Differentiate and cover representative service sub-types
 Quality adjust via degree of success & contribution to outcomes

Differentiation of Services
 As detailed and homogeneous as possible
 Avoids confounding structural change with quality change
 Even when costs are predominantly staff costs, assuming that cost and quality correspond may be problematic
 Switch from marginal cost weights to value weights
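The role of weights in a differentiated output measure can be illustrated with a small sketch. This is not the authors' calculation; the quantities and unit costs below are hypothetical, and the index is a standard base-weighted (Laspeyres-type) volume index in which each sub-type's quantity growth is weighted by its share of base-period cost:

```python
# Sketch: cost-weighted volume index across service sub-types.
# All quantities and unit costs are hypothetical illustrations.

def volume_growth(base_qty, curr_qty, unit_cost):
    """Laspeyres-type volume growth: each sub-type's quantity
    growth rate weighted by its share of base-period cost."""
    base_exp = {s: base_qty[s] * unit_cost[s] for s in base_qty}
    total = sum(base_exp.values())
    return sum((curr_qty[s] / base_qty[s]) * base_exp[s] / total
               for s in base_qty)

# Two hypothetical sub-types: client-weeks delivered and unit costs
base = {"residential_care": 100, "home_care": 400}
curr = {"residential_care": 102, "home_care": 428}
cost = {"residential_care": 500.0, "home_care": 50.0}

index = volume_growth(base, curr, cost)  # growth relative to 1.0
```

Substituting value weights for the cost weights changes only the `unit_cost` inputs, which is why the two weighting schemes can yield different measured output growth for identical quantities.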

Degree of Success
 Atkinson: "a quality improvement is equivalent to getting a larger package"
 Need to measure output quality changes, not just process quality changes
 Avoid double-counting
 Example: timeliness and accuracy in social security benefits processing

Contribution to Outcome
 The Eurostat handbook backs the use of outcome indicators to adjust for quality
 Attribution and time lags: an education example

Case Study: Adult Social Care
Differentiation of Services
 23 categories of services, where data exist, broken down by:
 Type of service
 Client group
 Type of home

Outcome Adjustments
 Care weeks, intensity of care, and quality of care
 Methodology from the Personal Social Services Research Unit (PSSRU) of the University of Kent
 Concentrated on current welfare gains
 9 dimensions of outcome, such as personal cleanliness & comfort and control over daily life (see para. 4.6)

Capacity for Benefit (CfB)
 Assessments by service clients
 The level of benefit if the intervention each week were perfect
 Needs were categorized as high, low, or no needs
 Could not separate CfB into the two types of home care, but could measure CfB by intensity of care
 Average CfB per week for home care was about 2 (of a maximum of 7)
 CfB among admissions to care homes, by type, grew between 13% and 18% from 1995 to 2004

Capacity for Benefit: PSSRU Background Paper
 Understanding Tables 4.3 and 4.4 (after para. 4.15)
 Aggregate CfBs of subcomponents:
 With equal weights for the 9 dimensions of outcomes
 Or with weights from the Older Persons Utility Scale (OPUS), which covers 5 dimensions of outcomes, with equal and low weights for two of the remaining domains
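The two aggregation options on this slide amount to different weighted means over the outcome dimensions. The sketch below illustrates the arithmetic only; the dimension names follow the slide, but the scores and the alternative weights are hypothetical, not the actual PSSRU or OPUS values:

```python
# Sketch: aggregating per-dimension outcome scores into one CfB
# value under equal weights vs an alternative weighting scheme.
# Scores and alternative weights are hypothetical illustrations.

def aggregate(scores, weights=None):
    """Weighted mean of dimension scores; equal weights by default."""
    if weights is None:
        weights = {d: 1.0 for d in scores}
    total = sum(weights.values())
    return sum(scores[d] * weights[d] for d in scores) / total

dimensions = {
    "personal_cleanliness_and_comfort": 0.6,
    "control_over_daily_life": 0.4,
    "meals_and_nutrition": 0.7,
}

equal_weighted = aggregate(dimensions)  # simple mean of the scores
alt_weighted = aggregate(dimensions,
                         {"personal_cleanliness_and_comfort": 0.3,
                          "control_over_daily_life": 0.5,
                          "meals_and_nutrition": 0.2})
```

As the comparison shows, the choice of weights can shift the aggregate CfB even when the underlying dimension scores are unchanged, which is why the equal-weight and OPUS-weight variants are reported separately.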

Output Index = Capacity for Benefit × Quality Adjustment × Weeks of Help
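The slide's formula is a straightforward product, sketched below with hypothetical figures (a CfB of 2 out of a maximum of 7, as cited earlier, an assumed 5% quality uplift, and an assumed 52 weeks of help):

```python
# Output index per the slide:
#   Capacity for Benefit x Quality Adjustment x Weeks of Help
# All input values are hypothetical illustrations.

def output_index(cfb, quality_adjustment, weeks_of_help):
    """Quality-adjusted output for one client-service cell."""
    return cfb * quality_adjustment * weeks_of_help

out = output_index(cfb=2.0, quality_adjustment=1.05, weeks_of_help=52)
```

In practice such cell-level products would be computed per service category and client group, then aggregated with appropriate weights.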

Adjusting for Degree of Success and Client Experience (Quality)
 Satisfaction surveys focusing on dimensions such as the attitudes of care workers
 Separate data on client experience and on the elements of care delivered are not currently available
 Degree of success and client experience are both on the future work agenda

Comments
 The goals of a consistent framework, differentiation of services with representativeness, and quality adjustment are commendable
 Evaluation by service clients can be tricky
 The authors are well aware of the pitfalls of outcome measures
 What about consumer surplus pitfalls?
 Use of the word "welfare" makes me nervous
 Keep up the good work