
1 Using a Statewide Evaluation Tool for Child Outcomes & Program Improvement
Terry Harrison, Part C Coordinator
Susan Evans, Autism Project Specialist
New Jersey Early Intervention System, NJ Department of Health and Senior Services

2 A look at New Jersey Part C
 NJ has 21 counties
 Each county has at least one dedicated Targeted Evaluation Team (TET). All eligibility evaluations are done by the TETs.
 Evaluators administer a standardized tool for all children at entry and a percentage of children at exit to answer OSEP Outcome questions 3.A, 3.B, and 3.C

3 Battelle Developmental Inventory, 2nd Edition
 Chosen based on the following criteria:
 Commercially available
 Domains answer the Child Outcome questions
 Reliable and valid
 Can be administered by NJEIS evaluators
 Norm-referenced
 Can be used to help determine eligibility
 Can be used for Part C and 619

4 Exit Plan
 5-6 counties each year, over 4 years, conduct exit evaluations when children leave the system.
 To be assessed at exit, a child has to:
 Have an intake BDI-2
 Be in the system for at least 6 months
 Reside in a county doing exit evaluations
 NJ reported exit data in APR 2008 for 63 children
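The three exit-assessment criteria above amount to a simple yes/no test. A minimal sketch in Python (the function and parameter names are illustrative, not NJEIS terminology):

```python
def assessed_at_exit(has_intake_bdi2: bool, months_in_system: int,
                     county: str, exit_counties: set) -> bool:
    """True only if a child meets all three exit-assessment criteria above."""
    return (has_intake_bdi2
            and months_in_system >= 6
            and county in exit_counties)
```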

5 OSEP APR Reporting

6 Reporting Decisions
For APR indicators 3.B and 3.C, NJEIS makes decisions based on two BDI-2 domains:

OSEP Outcome: BDI-2 Domain(s)
A. Positive social-emotional skills (including social relationships): Personal/Social
B. Acquisition and use of knowledge and skills (including early language/communication): Communication, Cognition
C. Use of appropriate behaviors to meet their needs: Adaptive, Motor
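The crosswalk is small enough to capture as a lookup table; a sketch in Python (the variable name is mine, the domain labels follow the slide):

```python
# OSEP child outcome indicator -> BDI-2 domain(s) used by NJEIS
OSEP_TO_BDI2_DOMAINS = {
    "3.A": ("Personal/Social",),
    "3.B": ("Communication", "Cognition"),
    "3.C": ("Adaptive", "Motor"),
}
```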

7 Standard Score
 NJEIS uses BDI-2 derived standard scores, by domain, as the basis for reporting
 The standard score represents the child's development in relation to children in the same age group
 Mean = 100, SD = 15

8 Standard Score
 Scores of 90 to 100 are considered "average"
 Scores between 80 and 89 are considered "low average"
 Scores below 80 indicate mild to more severe developmental delay

9 Same-age peers
 NJEIS considers children as functioning with same-age peers when their standard score in each domain is 80 or greater.
 Children have to be in the "low average" group or higher.

10 Initial and Exit Scores
NJEIS uses four BDI-2 data elements from each domain to "calculate" a crosswalk to OSEP categories a, b, c, d, e:
 Initial Raw – the raw score at entry
 Initial Standard – the standard score at entry
 Exit Raw – the raw score at exit
 Exit Standard – the standard score at exit
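These four elements per domain can be carried as a small record; a sketch with illustrative field names:

```python
from dataclasses import dataclass

@dataclass
class DomainScores:
    initial_raw: int       # raw score at entry
    initial_standard: int  # standard score at entry
    exit_raw: int          # raw score at exit
    exit_standard: int     # standard score at exit
```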

11 Reporting Categories
 Assignment to a, b, or c is evaluated independently of d and e
 For 3.B & 3.C, assignment to a, b, or c is based on the highest "little letter" assigned to a domain in the indicator (i.e., a is lower than b)
 In the case of 3.A, the score for the single domain is reported

12 Business Rules a, b, c
Report in c: Percentage of children who improved functioning to a level nearer to same-aged peers but did not reach it.
Rule: Exiting Raw > Initial Raw AND Exiting Standard > Initial Standard

13 Business Rules a, b, c
a. Percentage of children who did not improve functioning.
Rule: Exiting Raw <= Initial Raw AND Exiting Standard < 80
b. Percentage of children who improved functioning, but not sufficient to move nearer to functioning comparable to same-aged peers.
Rule: Exiting Raw > Initial Raw AND Exiting Standard <= Initial Standard AND Exiting Standard < 80

14 Example: Outcome 3.B, category c
 Percentage of children who improved functioning to a level nearer to same-aged peers but did not reach it.
Cognitive Domain: Entry Raw = 25, Standard = 55; Exit Raw = 49, Standard = 61
Communication Domain: Entry Raw = 33, Standard = 64; Exit Raw = 57, Standard = 71
Raw and standard scores increased; however, the exiting standard scores are below 80. Therefore, little c.

15 Business Rules d, e
d. Percentage of children who improved functioning to reach a level comparable to same-aged peers.
Rule: Initial Standard < 80 AND Exiting Standard >= 80
e. Percentage of children who maintained functioning at a level comparable to same-aged peers.
Rule: Initial Standard >= 80 AND Exiting Standard >= 80

16 Business Rules d, e
 A child is assigned to d or e only if both domains indicate that the child is comparable to same-aged peers
 If only one of the two domains is comparable to same-aged peers, report in c
 If one domain falls in d and the other in e, the child is assigned to d
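Slides 12-16 together define a complete decision procedure. The sketch below expresses it in Python; the function names are mine, and it makes explicit one assumption the slides leave implicit: categories a-c apply only when the exiting standard score is below 80, since d and e cover every case at or above 80.

```python
def domain_category(initial_raw, initial_standard, exit_raw, exit_standard):
    """Assign an OSEP category (a-e) to one BDI-2 domain (slides 12-15)."""
    if exit_standard >= 80:                      # comparable to peers at exit
        return "e" if initial_standard >= 80 else "d"
    if exit_raw > initial_raw and exit_standard > initial_standard:
        return "c"                               # improved, moved nearer to peers
    if exit_raw > initial_raw:
        return "b"                               # improved, standard score did not rise
    return "a"                                   # raw score did not increase

def indicator_category(cat1, cat2):
    """Combine two domain categories for indicators 3.B and 3.C (slides 11, 16)."""
    cats = {cat1, cat2}
    if cats <= {"d", "e"}:
        return min(cat1, cat2)    # d and e together -> d (the lower letter is used)
    if cats & {"d", "e"}:
        return "c"                # only one domain comparable to peers -> c
    return max(cat1, cat2)        # within a-c, the highest letter is used (slide 11)
```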

17 Example: Outcome 3.C, category d
 Percentage of children who improved functioning to reach a level comparable to same-aged peers.
Adaptive Domain: Entry Raw = 33, Standard = 76; Exit Raw = 44, Standard = 87
Motor Domain: Entry Raw = 96, Standard = 86; Exit Raw = 118, Standard = 102
Adaptive: initial standard score below 80, exiting standard score at or above 80. Therefore, little d.
Motor: initial and exiting standard scores at or above 80. Therefore, little e.
The child is reported in little d because the lower letter is used.
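Run against the worked examples on slides 14 and 17, the sketch above reproduces the reported categories:

```python
# Slide 14 (3.B): both domains improve but exit below 80 -> little c
cognitive     = domain_category(25, 55, 49, 61)      # -> "c"
communication = domain_category(33, 64, 57, 71)      # -> "c"
print(indicator_category(cognitive, communication))  # c

# Slide 17 (3.C): Adaptive reaches peers, Motor maintains -> little d
adaptive = domain_category(33, 76, 44, 87)           # -> "d"
motor    = domain_category(96, 86, 118, 102)         # -> "e"
print(indicator_category(adaptive, motor))           # d
```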

18 Exit 2008 – Outcome #3.B Knowledge and Skills
a) Did not make progress: 1 (2%)
b) Improved but not nearer to peers: 2 (3%)
c) Improved nearer to peers: 16 (25%)
d) Reached peers: 17 (27%)
e) Maintained functioning with peers: 27 (43%)
Total: N = 63 (100%)

19 Exit 2008 – Outcome #3.C Behaviors to Meet Needs
a) Did not make progress: 0 (0%)
b) Improved but not nearer to peers: 1 (2%)
c) Improved nearer to peers: 12 (19%)
d) Reached peers: 10 (16%)
e) Maintained functioning with peers: 40 (63%)
Total: N = 63 (100%)

20 Exit 2008 – Outcome #3.A Social Skills
a) Did not make progress: 3 (5%)
b) Improved but not nearer to peers: 4 (6%)
c) Improved nearer to peers: 3 (5%)
d) Reached peers: 2 (3%)
e) Maintained functioning with peers: 51 (81%)
Total: N = 63 (100%)

21 Applying Technology

22 Part C & BDI-2
 Each evaluator uses a Palm Pilot which contains the full BDI-2
 Results:
 Scoring errors are minimized
 Evaluators sync the Palm to the web
 Agencies have access to reports at the local level

23 Web-based Data System
 The lead agency has access to individual and agency data via the web-based data system
 The lead agency uses the web-based data system to export data for federal reporting
 Data are also used by the lead agency for:
 Procedural safeguards contacts
 Program compliance with the child outcomes project
 Quality control of evaluators via desk audits

24 General Supervision

25 Data
 NJEIS has started to use BDI-2 data as part of its general supervision and monitoring system
 Monitoring:
 Appropriateness of IFSP services based on the initial evaluation
 Eligibility decisions
 Evaluator qualifications and quality assurance

26 General Supervision: Appropriate Services
 NJEIS charted children whose eligibility evaluation showed scores more than 1.5 SD below the mean.
 Compared these data to authorized service hours based on IFSPs.
 These data raise questions about the appropriateness of the type and intensity of service decisions made by IFSP teams.

27 [Chart comparing children scoring more than 1.5 SD below the mean with authorized IFSP service hours; image not transcribed]

28 Next Steps: Appropriate Services
 Compare the areas of need (domains & sub-domains identified by the BDI-2 as more than 1.5 SD below the mean) with the type, frequency, and intensity of services identified on the IFSP
 Monitor appropriate justification of IFSP Team service decisions
 Provide training & technical assistance

29 General Supervision: Eligibility Decisions
 NJEIS teams use the BDI-2 as part of the eligibility decision process
 This is the first statewide use of the same instrument as part of the eligibility process
 Other tools are completed as needed

30 Next Steps: Eligibility
 Pending the Part C final regulations, NJ is considering implementing the screener portion of the BDI-2

31 General Supervision: Evaluators
 Use of the statewide tool & subsequent training activities identified the need to establish minimum standards for qualified NJEIS evaluators.
 The lead agency surveyed TET agencies regarding personnel criteria for their evaluators.

32 Survey Results
 16 TET agencies responded
 6 agencies had specific "evaluator" job descriptions
 The remaining agencies reported having job descriptions related to each discipline that also included evaluation as a job duty

33 Survey Results
 Agency requirement of EI experience:
 6 require 2+ years
 4 require 1 year
 1 requires 400+ hours in EI
 1 requires 1 year for a licensed professional and 2+ years for other disciplines
 4 have no requirements

34 Survey Results
 Most TET agencies do not require coursework or training in evaluation.
 Mentoring plan:
 4 have no mentoring plan
 7 have procedures for mentoring or pairing with experienced evaluators
 6 did not have any plans specific to being an evaluator

35 Next Steps: Evaluators
 Review standard personnel criteria for evaluators established in other states
 Develop NJ standards
 Challenges:
 Quantifying competencies for hiring and monitoring
 Recruitment
 Should the state consider "grandfathering" current evaluators?

36 Child Outcome Costs

37 Implementation Costs
 DHSS supplied all training and materials to agencies, including the technology component. Cost over three years:
 First year: $107,165
 Second year: $151,975
 Third year: $48,210
 Total: $307,350

38 Training/Evaluations
 To date, 236 evaluators & program staff have been trained.
 The average time of an eligibility evaluation has increased by 15 minutes.
 Factors in the increase include:
 Learning curve for new evaluators
 Use of technology
 Use of additional tools in areas where more information is needed

39 Weighing the Costs
 Each evaluator's one-time start-up cost has been approximately $1,300 (materials & training)
 The additional evaluation time (15 min × 2 evaluators) increased costs by an average of $50.00 per evaluation
 To implement the COSF or a similar procedure, the projected cost is $100 per staff member, per hour, to review & note progress on each form for each child included in Child Outcome Reporting
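As a consistency check, the $50-per-evaluation increase matches the $100-per-staff-hour rate quoted for the COSF projection, assuming the same rate applies to evaluator time:

```python
extra_minutes_per_evaluator = 15
evaluators_per_evaluation = 2
rate_per_staff_hour = 100  # dollars; assumption: same rate as the COSF projection

extra_cost = (extra_minutes_per_evaluator / 60) \
    * evaluators_per_evaluation * rate_per_staff_hour
print(extra_cost)  # 50.0 dollars per evaluation
```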

40 Thank you