Developing a Model of Trainer Evaluation
Leslie W. Zeitler, LCSW
May 2010: 13th Annual National Human Services Training Evaluation Symposium, “Problem/Brainstorm Session”

Presentation transcript:

Slide 1: Developing a Model of Trainer Evaluation
Leslie W. Zeitler, LCSW
May 2010: 13th Annual National Human Services Training Evaluation Symposium, “Problem/Brainstorm Session”

Slide 2: Problem/Brainstorm Session
- Not a presentation
- An invitation to contribute to thinking on this topic…

Slide 3: Context
Child Welfare Training Evaluation in CA:
- Oversight by the Macro Evaluation Team (a subcommittee of the Statewide Training & Education Committee)
- Began conceptualization in 2002
- Began actual evaluations in 2005
- 1st Strategic Plan (through 2009): focused on child welfare worker and supervisor knowledge/skill evaluations

Slide 4: Data Currently Collected in CA
Under the 1st strategic plan and continuing through the 2nd strategic plan:
- Trainee demographics
- Trainee satisfaction (at the regional/county level, but not statewide)
- Trainee knowledge (acquisition of knowledge)
- Trainee skill (application of knowledge in the classroom via embedded evaluations)

Slide 5: Current (2nd) Strategic Plan for CW Training Evaluation in CA
In the 2nd strategic plan, starting to look at additional factors that affect trainees and training:
- Attitudes, values
- Stereotype threat pilot
- Item analysis by trainer (see the sketch after this slide)
- Development of a model of trainer evaluation
- Etc.
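As an illustration of what “item analysis by trainer” could involve, here is a minimal Python sketch. The record layout and function name are hypothetical, not taken from any CalSWEC system; the idea is simply to compute per-item proportion-correct on trainee knowledge tests, grouped by trainer, so an item that is unusually hard only in one trainer’s classes can be flagged.

```python
from collections import defaultdict

# Hypothetical record layout: (trainer_id, item_id, correct) -- one row
# per trainee response to one knowledge-test item.
responses = [
    ("trainer_a", "item_1", True),
    ("trainer_a", "item_1", False),
    ("trainer_b", "item_1", True),
    ("trainer_b", "item_2", True),
]

def item_difficulty_by_trainer(responses):
    """Proportion of trainees answering each item correctly, per trainer.

    A consistently low proportion-correct on one item across a single
    trainer's classes may point to a content-delivery issue rather than
    a flawed test item.
    """
    totals = defaultdict(lambda: [0, 0])  # (trainer, item) -> [correct, n]
    for trainer, item, correct in responses:
        totals[(trainer, item)][0] += int(correct)
        totals[(trainer, item)][1] += 1
    return {key: correct / n for key, (correct, n) in totals.items()}

for (trainer, item), p in sorted(item_difficulty_by_trainer(responses).items()):
    print(f"{trainer} {item}: {p:.2f}")
```

In practice this would read from the statewide test database rather than an in-memory list, but the grouping logic is the same.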

Slide 6: Models of Trainer Evaluation?
- Do you have a systematic method or model for evaluating trainers?
- If so, is it a model based in theory?
- If so, which model?

Slide 7: Factors for Consideration
Need to identify:
- Purpose(s) of trainer evaluation
- Elements/dimensions for evaluation
- Methods of evaluation
- Measurement format
- Uses of data

Slide 8: Purpose
Purpose(s) of evaluating trainers:
- Improve quality of training
- Standardize training delivery
- Establish a basis for trainer fees (on a statewide level)
- As part of ongoing professional development for trainers
- For personnel purposes (promotion, probation, firing)

Slide 9: Identify Elements/Dimensions for Evaluation
Dimensions of trainer performance:
- From the 2001 APHSA/NSDTA Trainer Competency Model?
- From the Institute for Human Services competencies?
- From other (competency) models?

Slide 10: Identify Methods of Evaluation
- Observation? By whom?
  - Peers?
  - Supervisors?
- Self-reflection?
- Trainee feedback?
- Aggregate data from trainees’ tests? (a sketch follows this slide)
- What would be the advantages and disadvantages of each method of evaluation?
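To make the “aggregate data from trainees’ tests” option concrete, the Python sketch below (with invented field names) averages post-minus-pre knowledge-test gains across all trainees taught by each trainer. As the slide’s closing question suggests, this method has clear disadvantages: gains also reflect trainee mix, curriculum, and test quality, so it should be one signal among several rather than a stand-alone trainer score.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical records: one dict per trainee per course offering.
records = [
    {"trainer": "trainer_a", "pre": 62, "post": 81},
    {"trainer": "trainer_a", "pre": 70, "post": 88},
    {"trainer": "trainer_b", "pre": 65, "post": 72},
]

def mean_gain_by_trainer(records):
    """Average post-minus-pre test gain per trainer, across all trainees."""
    gains = defaultdict(list)
    for r in records:
        gains[r["trainer"]].append(r["post"] - r["pre"])
    return {trainer: mean(g) for trainer, g in gains.items()}

print(mean_gain_by_trainer(records))
# e.g. {'trainer_a': 18.5, 'trainer_b': 7}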

Slide 11: Identify Measurement Format
- Likert scale?
- Anchors? (How would we go about anchoring any scale? One option is sketched after this slide.)
- What tools have other training evaluators used?
- Sample tools:
  - Kentucky (Debbie Dever)
  - Indiana (Evoke Communications/Quay Kester)
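One possible answer to the anchoring question is a behaviorally anchored rating scale, in which each Likert point on each dimension is tied to an observable behavior rather than a bare number. The dimensions and anchor wording below are invented for illustration and are not drawn from the Kentucky or Indiana tools mentioned above.

```python
# A hypothetical anchored observation rubric: dimension names and anchor
# text are illustrative only, not from any existing tool.
TRAINER_RATING_SCALE = {
    "engagement": {
        1: "Lectures continuously; no trainee participation invited.",
        3: "Asks occasional questions; some trainees participate.",
        5: "Uses varied interactive methods; most trainees participate.",
    },
    "content_accuracy": {
        1: "Presents material inconsistent with the approved curriculum.",
        3: "Covers the curriculum with minor omissions or errors.",
        5: "Covers the curriculum fully and answers questions accurately.",
    },
}

def describe(dimension: str, score: int) -> str:
    """Return the behavioral anchor an observer should match to a score."""
    return TRAINER_RATING_SCALE[dimension][score]

print(describe("engagement", 3))
```

Anchoring each point this way is also what makes ratings comparable across observers (peers, supervisors, or trainees).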

Slide 12: Identify Uses of Collected Data
- How often are trainer evaluations done?
- Who gets to see the trainer evaluation data?
- What decisions could result from ongoing collection and analysis of trainer evaluation data?
  - High stakes?
  - Medium stakes?
  - Low stakes?
- What analyses could be done?
  - In aggregate, to see improvement over time? (a sketch follows this slide)
  - Etc.
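For the “in aggregate, to see improvement over time” analysis, one low-stakes option is to track each trainer’s mean observation rating across review periods. The data layout below is assumed for illustration; a simple first-to-last change is enough for a low-stakes “is coaching helping?” question, while high-stakes uses would need more careful modeling.

```python
from collections import defaultdict

# Hypothetical rows: (trainer_id, review_period, mean_observation_rating)
ratings = [
    ("trainer_a", "2009-H1", 3.2),
    ("trainer_a", "2009-H2", 3.6),
    ("trainer_a", "2010-H1", 4.1),
    ("trainer_b", "2009-H1", 4.0),
    ("trainer_b", "2009-H2", 3.9),
]

def trend_by_trainer(ratings):
    """Per-trainer rating history in period order, plus net change."""
    history = defaultdict(list)
    for trainer, period, score in sorted(ratings, key=lambda r: r[1]):
        history[trainer].append(score)
    return {t: (scores, scores[-1] - scores[0]) for t, scores in history.items()}

for trainer, (scores, delta) in trend_by_trainer(ratings).items():
    print(f"{trainer}: {scores} net change {delta:+.1f}")
```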

Slide 13: Anything Else?
- What else, if anything, needs to be considered when designing a trainer evaluation model?
- Involvement of trainers in the design process?
- Etc.

Slide 14: Additional Thoughts?
Contact Leslie W. Zeitler at the California Social Work Education Center.