LSTA Grants to States Conference, Philadelphia, PA, March 2012: Measuring Success and the New SPR

Overview of Presentation
– Achievements to Date
– Results Chain for Assessment
– Building a Better Wheel (SPR)
– Future Evaluations

Developing a New Assessment Framework
– Assessment questions developed by your peers form the basis of the new SPR metrics
– Simplify the current SPR
– Review the materials with evaluation methodology experts
– Review the new SPR with COSLA, Measuring Success participants, and other stakeholders

Operating Principles
– Increase the usefulness and rigor of evaluation tools
  – Focus on accountability to federal and state policy makers
  – Collect information to help SLAAs plan and manage programs and services
– Respect state diversity in crafting services to address local needs
– Do not stifle innovation
– Ensure a high level of SLAA engagement in the process
– Streamline current reporting requirements
– Improve performance measurement practice and reporting

Progress Since the Last LSTA Meeting
– Summer webinars culminated in six broad categories (i.e., "results chains") that SLAAs use in responding to public needs in their states with LSTA support.
  – This work is the basis of our framework and will guide performance assessment moving forward.
– The groups also identified key actors, outcomes, and questions for assessment.
  – IMLS and SLAA partners focused on what needs to be assessed and what does not.
– IMLS is developing a framework for the new SPR based on the results chains, the assessment questions, and the data required for program management and accountability.

What Measuring Success gave us…
– Consistent logic models/theories of change that link the national legislation to the actual work the states are engaged in.
– A clearer articulation of the mechanisms that result in public benefit.
– Performance reporting focused on themes the state partners felt were important and that reflect their actual work.

Focal Areas of LSTA Activities
– Informal Education
  – Lifelong Learning
  – Human Services
  – Employment & Small Business Development
– Information Access
  – Digitization
  – Electronic Databases
  – Civic Engagement
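Taken together, the two focal areas and their sub-areas form a small taxonomy. A minimal sketch of how it could be encoded for reporting purposes (the labels come from this slide; encoding it as a Python mapping is an illustrative assumption, not part of the SPR itself):

```python
# Focal areas and their sub-areas, as listed on this slide.
FOCAL_AREAS = {
    "Informal Education": [
        "Lifelong Learning",
        "Human Services",
        "Employment & Small Business Development",
    ],
    "Information Access": [
        "Digitization",
        "Electronic Databases",
        "Civic Engagement",
    ],
}
```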

Results Chains Detail
See the results chains (11 x 17 handout sheets).

Underlying Structure of Logic Models
SLAA Activity → Partner/Grantee/Point-of-Service Activity → Beneficiary/User

Capturing Action at the SLAA Level
SLAA Activity:
– State planning
– State-level partnerships
– Staff development
– Information resources
– Standards

Action at the Point of Service/Grantee/Partner Level
Partner/Grantee/Point-of-Service Activity:
– Marketing/recruitment
– Local planning/partnerships
– Characterizing different service models

Action at the Beneficiary/User Level
Beneficiary/User Experience:
– Did they get what they needed?
– Did it make a difference?
– Did they apply new skills?
– Did they make more informed decisions?
– Did they share the information?

Model Reporting Hierarchy
Focal Area X
  [SLAA Activity]
  [Project #1 Reporting]
    [Project #1 User/Beneficiary Info]
  [Project #2 Reporting]
    [Project #2 User/Beneficiary Info]
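Read as a data structure, the hierarchy nests user/beneficiary information inside each project report, and project reports inside a focal area alongside the SLAA-level activity. A minimal Python sketch of that shape; all class and field names are hypothetical, and the field details are filled in on the following slides:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SLAAActivityReport:
    """SLAA-level action for one focal area (fields on a later slide)."""

@dataclass
class UserBeneficiaryInfo:
    """User/beneficiary outcomes for one project (fields on a later slide)."""

@dataclass
class ProjectReport:
    """Reporting for one project under a focal area."""
    name: str
    user_info: Optional[UserBeneficiaryInfo] = None

@dataclass
class FocalAreaReport:
    """Top of the hierarchy: one focal area's complete report."""
    focal_area: str                      # e.g. "Information Access"
    slaa_activity: SLAAActivityReport
    projects: List[ProjectReport] = field(default_factory=list)
```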

SLAA Activity Reporting
– State-level planning? Y/N
  – How much?
– State-level partnerships? Y/N
  – Which agencies/NGOs?
– Staff development provided for Focal Area X? Y/N
  – What type? How much?
– Grant making/funding? Y/N
  – Total expended for the focal area?
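Each Y/N question on this slide pairs with a follow-up detail, which maps naturally onto optional fields: answer "No" and the field stays empty, answer "Yes" and the detail is recorded. A sketch with hypothetical names:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class SLAAActivityReport:
    """SLAA-level reporting for one focal area; None means the Y/N answer is No."""
    planning_effort: Optional[str] = None         # state-level planning: how much?
    partner_agencies: Optional[List[str]] = None  # partnerships: which agencies/NGOs?
    staff_development: Optional[str] = None       # what type? how much?
    funding_expended: Optional[float] = None      # grant making: total expended for the focal area
```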

Project Reporting
[Focal Area X Project #1 Info]
– Program modality
  – How is it delivered? (e.g. on-site, off-site; with/without local partners)
– Dosage
  – At what intensity? (e.g. contact hours per program)
– Training provided
  – Training/capacity building given to local partners/grantees
– Project dollars
  – Grant amount? Match?
– Infrastructure/info resources provided
  – Technology or information resources used?
  – New resources procured/developed?
– Total served?
[Focal Area X Project #2 Info]…
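The project-level questions likewise suggest one record per project. A sketch with hypothetical field names; the comments track the bullets above:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ProjectReport:
    """Field-level sketch of one project's report (hypothetical names)."""
    modality: str                            # how delivered: on-/off-site, with/without partners
    dosage: Optional[float] = None           # intensity, e.g. contact hours per program
    training_provided: Optional[str] = None  # capacity building given locally
    grant_amount: float = 0.0                # project dollars: grant
    match_amount: float = 0.0                # project dollars: match
    info_resources: Optional[str] = None     # technology/resources used or newly procured
    total_served: int = 0
```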

User/Beneficiary Reporting
– Participation rate/amount
– Demographics
– Satisfaction with program
– Knowledge gain/other added value
– Applied new skills and/or information
– Used other local services
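The user/beneficiary measures round out the hierarchy. A final sketch with hypothetical field names, one per bullet above:

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class UserBeneficiaryInfo:
    """Field-level sketch of user/beneficiary reporting (hypothetical names)."""
    participants: int = 0                             # participation rate/amount
    demographics: Optional[Dict[str, int]] = None     # e.g. counts by age group
    satisfaction_pct: Optional[float] = None          # satisfaction with the program
    knowledge_gain_pct: Optional[float] = None        # knowledge gain/other added value
    applied_skills_pct: Optional[float] = None        # applied new skills and/or information
    used_other_services_pct: Optional[float] = None   # used other local services
```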

Putting It All Together
What will we get from all this?
– A reporting framework based on the way you do business
– More information about SLAA-level action
– More systematic information about programs and services delivered
– A platform for in-depth evaluation across states

Long-Term Roll-Out
– Finalize reporting requirements for the new SPR, targeted for Fall 2012
– IMLS will assess the transition process and report back to COSLA and OMB
– Systematic evaluation to be conducted after the first three years of the new reporting system and at the end of the five-year planning cycle

General Discussion
– Questions about the proposed content?
– Continuing engagement with COSLA and SLAA staff