Molly Chamberlin, Ph.D. Indiana Youth Institute 9-28-2011.

- Accountability
- Progress and results
- More effective/efficient service
- Communication opportunities
- Focuses staff effort
- Engages employees throughout the organization
- Funding/resources

- Means to see if the strategic plan is working
- Focus attention on what matters most
- Common language within the agency
- Measurement of outcomes, not just outputs
- Systems for verifying results
- Support for funding requests

IMPACT CYCLE: Mission > Goals > Inputs > Activities > Outputs > Outcomes

- Evaluation/assessment is NOT a one-time event
- Evaluation starts at the beginning of the project (not at the end)
- Assign responsibility for the process, particularly data analysis
- Involve all levels of staff within your organization

Indicators of evaluation readiness:
- Is there support from organizational leadership?
- Are there clearly written goals and objectives?
- Are the activities of the program clearly outlined?
- Is there a person or group of people responsible for leading evaluation efforts?
- Have financial and/or human resources been allocated for evaluation efforts?
- Do all staff members agree upon the written goals and objectives for the program?
- Do all staff members consider evaluation activities part of their job requirements?
- Do other stakeholders (board, service recipients, volunteers) value evaluation?
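The checklist above lends itself to a simple yes/no tally. The sketch below is a minimal Python illustration, not an IYI tool; the question summaries and the equal weighting of questions are assumptions made for this example.

```python
# Hypothetical readiness tally for the checklist above.
# Question wording is condensed; equal weighting is an assumption.
readiness_questions = [
    "Support from organizational leadership",
    "Clearly written goals and objectives",
    "Program activities clearly outlined",
    "Person or group responsible for leading evaluation",
    "Financial and/or human resources allocated",
    "Staff agree on written goals and objectives",
    "Staff consider evaluation part of their job",
    "Other stakeholders value evaluation",
]

def readiness_report(answers: dict[str, bool]) -> str:
    """Summarize how many readiness indicators are in place."""
    unmet = [q for q, ok in answers.items() if not ok]
    met = len(answers) - len(unmet)
    lines = [f"{met}/{len(answers)} readiness indicators in place"]
    lines += [f"  still needed: {q}" for q in unmet]
    return "\n".join(lines)

# Example: all questions answered "yes" except one.
answers = {q: True for q in readiness_questions}
answers["Staff consider evaluation part of their job"] = False
print(readiness_report(answers))
```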

- Assign responsibility
- Create a timetable
- Communicate with staff
- Use any and all available resources

- Be clear
- Be concise
- Remember that it hasn't all happened yet: adjustments to the process can (and should) be made as you learn from results

Logic model components: Resources (Inputs) > Activities > Outputs > Outcomes (immediate, short-term, and long-term)

GOAL: A big-picture statement of what the program wants to accomplish.
Example: "Parents and caregivers possess the knowledge and skills to ensure safe and healthy development for infants and children."
(Logic model table columns, filled in on the following slides: Inputs | Activities | Outputs | Outcomes | Success Indicators.)

OUTCOMES: What does your program want to CHANGE?
Definition: Expected, observable, and measurable results; usually a change in behavior, knowledge, attitude, or skill.
Example outcomes:
- Increased parent/caregiver knowledge in specific areas
- Increased parent/caregiver practical skills in specific areas

ACTIVITIES: What will the program DO in order to achieve the desired outcomes?
Definition: Actions or strategies employed to achieve desired outcomes ("if we do this, then the outcome follows").
Example activities:
- Survey parents and local providers to identify specific areas of need and knowledge/skill gaps
- Provide online and local in-person training for parents and child care providers in specific areas of need
- Develop and administer grants for provider training ("train the trainer" models) in specific areas of need

OUTPUTS: What will the activities produce or create? (What will our clients do as a result of our activities?)
Definition: What clients will do as a result of our activities (quantity and/or quality of activities).
Example outputs:
- Parents/caregivers participate in the survey
- Parents/caregivers participate in online trainings
- Parents/caregivers participate in on-site trainings
- Providers apply for training grants
- Providers receive training grants
- Parents/providers are satisfied with training opportunities

INPUTS: What resources will you use to conduct your activities?
Definition: Resources (usually people and funds) used in program development and implementation.
Example inputs:
- 2 FTE training staff
- 0.5 FTE training coordinator
- 1 FTE grant coordinator
- Funding for trainings
- Funding for training staff
- Funding for grants
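With all four columns defined, the logic model can be treated as a simple structured record. The sketch below is illustrative Python only; the class and field names are invented for this example and are not part of any IYI tool. It builds the example program in the same order the slides use: goal, then outcomes, activities, outputs, and inputs.

```python
from dataclasses import dataclass, field

# A minimal, hypothetical logic model container. Field names are invented.
@dataclass
class LogicModel:
    goal: str
    outcomes: list[str] = field(default_factory=list)    # what should CHANGE
    activities: list[str] = field(default_factory=list)  # what the program DOES
    outputs: list[str] = field(default_factory=list)     # what clients do as a result
    inputs: list[str] = field(default_factory=list)      # resources: people and funds

model = LogicModel(
    goal=("Parents and caregivers possess the knowledge and skills to "
          "ensure safe and healthy development for infants and children"),
)
model.outcomes += [
    "Increased parent/caregiver knowledge in specific areas",
    "Increased parent/caregiver practical skills in specific areas",
]
model.activities.append("Provide online and in-person training in areas of need")
model.outputs.append("Parents/caregivers participate in online trainings")
model.inputs += ["2 FTE training staff", "Funding for trainings"]
```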

SUCCESS INDICATORS: How do we know whether we are on track to achieve (or have achieved) our outcomes?
Example indicators, grouped by the column they track (data sources in parentheses):
Inputs:
- By 12/14, obtain at least $50,000 for trainings
- By 12/14, obtain at least $10,000 for grants
- By 10/14, hire or identify training staff and a training coordinator
- By 10/14, hire or identify a grant coordinator
Activities: n/a
Outputs:
- By 3/30, 40% of served parents and providers will participate in the survey (survey records)
- By 5/15, 20 online and 5 on-site trainings will be offered (program records)
- By 5/15, 50% of served parents and local providers will participate in trainings (training logs)
- By 5/15, 20 applications will be submitted (program records)
- By 6/15, 17 applications will be approved (program records)
- By 6/15, 90% of participants will express satisfaction with trainings (participant surveys)
Outcomes:
- By 4/30, 100% of trainings will meet established quality standards (expert content review)
- By 7/31, 85% of participants will pass a knowledge assessment in specified areas (knowledge assessments)
- By 7/31, 90% of participants will report increased knowledge and practical skills in specified areas (participant surveys)
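Because each indicator pairs a target, a deadline, and a data source, progress checks can be mechanical. The following sketch uses hypothetical names and sample values; the percentage-based comparison is an assumption made for illustration, not an IYI method. It flags indicators whose latest measurement falls short of target.

```python
from dataclasses import dataclass

# Hypothetical progress check: compare a measured value against each
# indicator's numeric target. All names and values are illustrative.
@dataclass
class TrackedIndicator:
    description: str
    target: float      # e.g. 0.40 for "40% will participate"
    measured: float    # latest value from the named data source
    data_source: str

def off_track(indicators: list[TrackedIndicator]) -> list[TrackedIndicator]:
    """Return indicators whose measured value is below target."""
    return [i for i in indicators if i.measured < i.target]

report = [
    TrackedIndicator("Parents/providers participate in the survey",
                     target=0.40, measured=0.32, data_source="survey records"),
    TrackedIndicator("Participants express satisfaction with trainings",
                     target=0.90, measured=0.94, data_source="participant surveys"),
]

for i in off_track(report):
    print(f"OFF TRACK: {i.description} "
          f"({i.measured:.0%} vs. {i.target:.0%} target, per {i.data_source})")
```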

- Implement
- Repeat
- Communicate:
  - Staff
  - Board
  - Funders
  - Volunteers
  - Community

- Collecting data before you have set goals and identified outcomes
- Collecting too much data
- Using data collection tools or measurements that don't match the goals and outcomes
- Attempting evaluation without leadership support
- Assigning evaluation to staff members regardless of their time and skills
- Using only the goals required by funders
- Identifying outcomes that are not feasible or not connected to the activities and the organizational/program mission
- Overcomplicating the evaluation

Online resources:
- Management Library
- Kellogg Foundation
- University of Wisconsin Extension
- Children, Youth and Families Education and Research Network
- Innovation Network
- Capacity Building Resource Library
IYI resources:
- IYI Kids Count Data Report
- IYI Virginia Beall Ball Library
- IYI consulting services and links

Contact: Molly Chamberlin (317)