Jennifer Coffey, OSEP. Regional Meetings – Evidence Based Professional Development February 3 - Washington, DC - Speaker: Michelle Duda, SISEP February.

1 Jennifer Coffey, OSEP

2 Regional Meetings – Evidence Based Professional Development
February 3 - Washington, DC - Speaker: Michelle Duda, SISEP
February 8 - New Orleans, LA - Speaker: Melissa VanDyke, SISEP
February 15 - Portland, OR - Speaker: Chris Borgmeier, Oregon PBIS Leadership Network

Innovation Fluency
Date: March 24, 3:00-4:30pm ET
Speaker: Karen Blasé, SISEP

Professional Development for Administrators
Date: April 19, 3:00-4:30pm ET
Speakers: Elaine Mulligan, NIUSI Leadscape; Rich Barbacane, Nat'l Association of Elementary School Principals; Steve Goodman, Michigan SPDG

Using Technology for Professional Development
Date: May 18, 2:00-3:30pm ET
Speaker: Chris Dede, Ph.D., Learning Technologies, Harvard

3 According to the thesaurus of the Educational Resources Information Center (ERIC) database, professional development refers to "activities to enhance professional career growth." Such activities may include individual development, continuing education, and inservice education, as well as curriculum writing, peer collaboration, study groups, and peer coaching or mentoring.

Fullan (1991) expands the definition to include "the sum total of formal and informal learning experiences throughout one's career from preservice teacher education to retirement" (p. 326). – North Central Regional Educational Laboratory (NCREL)

4 "Professional development... goes beyond the term 'training' with its implications of learning skills, and encompasses a definition that includes formal and informal means of helping teachers not only learn new skills but also develop new insights into pedagogy and their own practice, and explore new or advanced understandings of content and resources. [This] definition of professional development includes support for teachers as they encounter the challenges that come with putting into practice their evolving understandings about the use of technology to support inquiry-based learning.... Current technologies offer resources to meet these challenges and provide teachers with a cluster of supports that help them continue to grow in their professional skills, understandings, and interests." – Grant (n.d.)

5 Evidence base/best practice
Models:
- Carol Trivette and Carl Dunst - PALS
- NIRN - Implementation Drivers
- Guskey
Preview of the presentation at the Regional Meeting and the PD Series
SPDG/OSEP Program Measures

6 Models of Professional Development
- Julie Morrison, Ohio SPDG Evaluator
- Li Walter and Alan Wood, California SPDG Evaluators

7 "No intervention practice, no matter what its evidence base, is likely to be learned and adopted if the methods and strategies used to teach or train students, practitioners, parents, or others are not themselves effective." – "Let's Be Pals: An Evidence-Based Approach to Professional Development," Dunst & Trivette, 2009

8 Two Types of Evidence-Based Practices

Evidence-Based Intervention Practices
- Insert your SPDG initiative here

Evidence-Based Implementation Practices
- Professional Development
- Staff Competence: Selection, Training, Coaching, and Performance Assessment Drivers
- Adult learning methods/principles
- Evaluation

9 "Adult learning refers to a collection of theories, methods, and approaches for describing the characteristics of and conditions under which the process of learning is optimized."

10 Research synthesis of 79 studies of accelerated learning, coaching, guided design, and just-in-time training:
- 58 randomized control design studies and 21 comparison group studies
- 3,152 experimental group participants and 2,988 control or comparison group participants
- Combination of studies in college and noncollege settings
- Learner outcomes included learner knowledge, skills, attitudes, and self-efficacy beliefs
- Weighted average Cohen's d effect sizes for the posttest differences between the intervention and nonintervention or comparison groups were used to assess the impact of the adult learning methods.

Trivette, C.M. et al. (2009). Characteristics and consequences of adult learning methods and strategies. Winterberry Research Syntheses, Vol. 2, No. 1.
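The synthesis's core quantity, a weighted average Cohen's d, can be illustrated with a short sketch. All numbers and the sample-size weighting below are hypothetical, chosen only to show the arithmetic; they are not taken from the Trivette et al. synthesis:

```python
from math import sqrt

def cohens_d(mean_tx, mean_ctrl, sd_tx, sd_ctrl, n_tx, n_ctrl):
    """Standardized mean difference between two groups, using the pooled SD."""
    pooled_sd = sqrt(((n_tx - 1) * sd_tx**2 + (n_ctrl - 1) * sd_ctrl**2)
                     / (n_tx + n_ctrl - 2))
    return (mean_tx - mean_ctrl) / pooled_sd

def weighted_mean_d(effects, weights):
    """Average of per-study effect sizes, weighted (here) by sample size."""
    return sum(d * w for d, w in zip(effects, weights)) / sum(weights)

# One hypothetical study: posttest scores, intervention vs. comparison group
d1 = cohens_d(82.0, 74.0, 10.0, 10.0, 30, 30)
print(round(d1, 2))  # 0.8

# Hypothetical pooling across three studies of the same practice
print(round(weighted_mean_d([0.8, 0.5, 1.1], [60, 120, 40]), 2))  # 0.69
```

Weighting by sample size means larger studies pull the average toward their estimates, which is the usual rationale for reporting weighted rather than raw means in a synthesis.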

11 To be most effective, training needs to actively involve learners in judging the consequences of their learning experiences (evaluation, reflection, and mastery):
- Learner participation in learning new knowledge or practice
- Learner engagement in judging his or her experience in learning and using new material

12 Planning
- Introduce: Engage the learner in a preview of the material, knowledge, or practice that is the focus of instruction or training
- Illustrate: Demonstrate or illustrate the use or applicability of the material, knowledge, or practice for the learner

Application
- Practice: Engage the learner in the use of the material, knowledge, or practice
- Evaluate: Engage the learner in a process of evaluating the consequence or outcome of the application of the material, knowledge, or practice

Deep Understanding
- Reflection: Engage the learner in self-assessment of his or her acquisition of knowledge and skills as a basis for identifying "next steps" in the learning process
- Mastery: Engage the learner in a process of assessing his or her experience in the context of some conceptual or practical model or framework, or some external set of standards or criteria

Donovan, M. et al. (Eds.) (1999). How people learn. Washington, DC: National Academy Press.

13 Effect Sizes for Introducing Information to Learners

Practice                                    Studies   Effect Sizes   Mean d   95% CI
Pre-class exercises                             9          9          1.02    .63 to 1.41
Out-of-class activities/self-instruction       12         20           .76    .44 to 1.09
Classroom/workshop lectures                    26        108           .68    .47 to .89
Dramatic readings                              18         40           .35    .13 to .57
Imagery                                         7         18           .34    .08 to .59
Dramatic readings/imagery                       4         11           .15   -.33 to .62
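The confidence intervals in tables like the one above can be sanity-checked: under a normal approximation, a 95% CI for a mean effect size is d ± 1.96 × SE. The standard error used below (≈0.20 for the pre-class exercises row) is back-computed from the reported interval, not a figure from the synthesis:

```python
def ci95(d, se):
    """95% confidence interval for an effect size, normal approximation."""
    return (round(d - 1.96 * se, 2), round(d + 1.96 * se, 2))

# Pre-class exercises row: mean d = 1.02; a standard error of ~0.20
# reproduces the reported interval of .63 to 1.41.
print(ci95(1.02, 0.20))  # (0.63, 1.41)
```

This also explains why the "Dramatic readings/imagery" row (CI -.33 to .62) is not a reliable effect: its interval includes zero.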

14 Effect Sizes for Illustrating/Demonstrating the Learning Topic

Practice                                      Studies   Effect Sizes   Mean d   95% CI
Using learner input for illustration              6          6           .89    .28 to 1.51
Role playing/simulations                         20         64           .87    .58 to 1.17
Real-life example/real life + role playing        6         10           .67    .27 to 1.07
Instructional video                               5         49           .33    .09 to .59

15 Effect Sizes for Learner Application

Characteristic                          Studies   Effect Sizes   Mean d   95% CI
Real-life application + role playing        5         20          1.10    .48 to 1.72
Problem-solving tasks                      16         29           .67    .39 to .95
Real-life application                      17         83           .58    .35 to .81
Learning games/writing exercises            9         11           .55    .11 to .99
Role playing (skits, plays)                11         35           .41    .21 to .62

16 Effect Sizes for Learner Evaluation

Practice                          Studies   Effect Sizes   Mean d   95% CI
Assess strengths/weaknesses          14         48           .96    .67 to 1.26
Review experience/make changes       19         35           .60    .36 to .83

17 Effect Sizes for Learner Reflection

Practice                           Studies   Effect Sizes   Mean d   95% CI
Performance improvement                9         34          1.07    .69 to 1.45
Journaling/behavior suggestion         8         17           .75    .49 to 1.00
Group discussion about feedback       16         29           .67    .39 to .95

18 Effect Sizes for Self-Assessment of Learner Mastery

Practice                       Studies   Effect Sizes   Mean d   95% CI
Standards-based assessment        13         44           .76    .42 to 1.10
Self-assessment                   16         29           .67    .39 to .95

19 Engaging learners in a process of self-assessment of their performance, using some type of conceptual or operational framework, proved to be the practice that produced the largest effect.

"Learners are not likely to become experts without instructors engaging them in a process of evaluating their experiences in the context of some framework, model, or operationally defined performance standards or expectations."

20 "The more opportunities a learner has to acquire and use new knowledge or practice, the more frequently those opportunities occur, and the more the learner is engaged in reflection on those opportunities using some external set of standards, the greater the likelihood of optimal benefits."

21

22
- The smaller the number of persons participating in a training (<20), the larger the effect sizes for the study outcomes.
- The more hours of training over an extended number of sessions, the better the study outcomes.
- The practices are similarly effective when used in different settings with different types of learners.

23 Trainers neither direct learning nor encourage only self-directed learning; rather, they guide learning based on observations of learners' experiences, evaluation of learners' use of the practice, and learner self-assessment against standards.

24 PALS learning cycle: PLAN (Introduce and Illustrate) → APPLICATION (Practice and Evaluate) → INFORMED UNDERSTANDING (Reflection and Mastery) → RECYCLE (Identify Next Steps in the Learning Process), built around Active Learner Involvement.

25 "The use of PALS practices has been found to be associated with improved learner knowledge, use, and mastery of different types of intervention practices."

26 Trainer and Trainee Roles in the Different Phases of PALS

Introduction
  Trainer roles: Preview learning topic; describe key elements; provide examples; include trainee input; illustrate/demonstrate application
  Trainee roles: Complete pretraining preview; pre-class/workshop exercises; provide input on the learning topic; in-class/workshop warm-up exercises

Application
  Trainer roles: Facilitate application; observe trainee application; provide in vivo feedback/guidance; facilitate learner assessment of options
  Trainee roles: Provide examples of application; trainee role playing, games, etc.; implement/practice use of the subject matter; evaluate use of the knowledge or practice

Informed Understanding
  Trainer roles: Establish learning standards; engage learners in self-assessment; provide guidance to learners; provide behavioral suggestions
  Trainee roles: Standards-based evaluation; conduct self-assessment; trainer-guided learner reflection; journaling; group discussions of understanding

Repeat Learning Process
  Trainer roles: Joint planning; trainer guidance; trainer/trainee mentoring
  Trainee roles: Joint planning; identify needed information/experiences; trainer/trainee mentoring

