Presentation transcript: "Principles of Program Evaluation" by Christopher R. Gareis, Ed.D.

1 Principles of Program Evaluation
A Workshop for the Southwest Virginia Professional Education Consortium (SWVA Clinical Faculty Consortium)
Christopher R. Gareis, Ed.D., The College of William & Mary

2 Pre-Session Understandings

3 Perspective on Our Time Together Today
The Profession of Education: the regular and expert exercise of learned judgment and skills in service of one of an individual's life needs and one of society's greater purposes.
Zen and the Art of Professional Development
The Principle of Collective Wisdom

4 Let’s graph ‘em

5 Shared Understandings
What is a "program"? A planned, multi-faceted series of activities leading to intended outcomes for individuals, groups, and/or a community.
What is "evaluation"? The use of valid and reliable information/data to make judgments.

6 Definitions of "Program Evaluation"
Judging the worth of a program. (Scriven, 1967)
The process of systematically determining the quality of a school program and how the program can be improved. (Sanders, 2000)
The deliberate process of making judgments and decisions about the effectiveness, direction, and value of educational programs in our schools. (me)

7 DO YOU KNOW OF…?
…a program that you thought was effective but was discontinued without explanation?
…a program that you believe is likely a waste of time, money, and effort, and yet it continues?
We tend not to evaluate programs. If we do evaluate, we tend to do so poorly. If we do evaluate programs effectively, we oftentimes do not act on our inferences.

8 Random Acts of Improvement
Our Intentions Our Efforts

9 Aligned Acts of Improvement
Our Intentions Our Efforts

10 Perspective Planning Evaluation

11 The Role of Evaluative Thinking
Planning → Implementing → Redesigning, with evaluative thinking at the center of the cycle.
Evaluation needs to be part of a project from the beginning; the most astute include an evaluator in the planning phase. Evaluation is a process that begins at the planning stage and feeds into the project at each stage of implementation.
Benefits: increases the chances that you can answer the questions you wish to address and that appropriate measures can be found or developed, so you can establish baseline data.
Evaluation is not an event but a process. (Frechtling, 2007)

12 > 20 major models of program evaluation
Seminal model: Daniel Stufflebeam's CIPP model (1971)
Our focus, all exercises in EVALUATIVE THINKING: creating a logic model, focusing the evaluation, and identifying performance indicators.

13 Our Objectives Today
Understand basic concepts and principles of program evaluation at the school level.
Create a logic model that accurately depicts the essential components of and relationships among input, process, and outcome variables of a school-based educational program.
Select the appropriate focus for the evaluation of a program.
Identify appropriate data sources aligned to the intended educational outcomes of a selected school-level program.
Appreciate the importance of engaging in intentional program evaluation as a teacher leader.

14 Talk About Your Program
What is your program? What does your program do? Who participates in your program? Whom does your program serve? How do they benefit? How do you know, or how would you know, that your program is a success?

15 Think Visually Sketch a visual metaphor of your program.

16 Every educational program:
Uses resources or INPUTS, such as materials, funds, teachers, and students.
Engages in activities or PROCESSES, such as instruction and interventions.
Intends to produce certain results or OUTCOMES, such as learning, achievement, engendering of certain attitudes or values, eligibility for a next step in life, or bringing about some change.

17 Visualizing an Educational Program The Logic Model
Inputs Processes Outcomes

18 What is a logic model?
A diagram with text that illustrates and describes the reasoned relationships among program elements and intended outcomes to be attained. A visual theory of change.
Read left to right (the "how"): We use these resources… for these activities… so that these students/teachers… can produce these results… leading to these changes for the better. Read right to left (the "why").
Modified from a guest lecture by John A. McLaughlin (November 16, 2009) at The College of William & Mary.

19 Simple Logic Model
Inputs → Processes → Initial Outcomes → Intermediate Outcomes → Ultimate Outcomes
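For readers who want to see the chain made concrete, here is a minimal sketch (Python, not part of the original slides; the class and field names are hypothetical) of how the five stages might be encoded for later evaluation planning:

```python
# A minimal sketch (hypothetical names) of the five-stage chain:
# Inputs -> Processes -> Initial -> Intermediate -> Ultimate Outcomes.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    inputs: list = field(default_factory=list)             # resources used
    processes: list = field(default_factory=list)          # activities undertaken
    initial_outcomes: list = field(default_factory=list)   # immediate results
    intermediate_outcomes: list = field(default_factory=list)
    ultimate_outcomes: list = field(default_factory=list)

# The everyday example from slide 20, encoded in the same stages:
antihistamine = LogicModel(
    inputs=["Get an antihistamine"],
    processes=["Take the antihistamine"],
    initial_outcomes=["Feel better"],
)
print(antihistamine)
```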

20 Everyday Logic
Get an antihistamine → Take the antihistamine → Feel better

21 Try This You’re hungry. Sketch a logic model to address your need.
Oh, but one twist: You have no food in the house. Assumptions

22 Greenlawn Middle School Common Math Planning Initiative
Inputs: Core teachers (math); resource teachers (special education, gifted education)
Processes: Common planning by teachers (frequency, enthusiasm)
Initial Outcomes: Student engagement in instruction; student affect for instruction
Intermediate Outcomes: Student achievement (quarterly class grades in math)
Ultimate Outcomes: Student achievement (6th grade Math SOL, 7th grade Math SOL)

23 Now Try This You want to take a camping trip with your family to a state park. Sketch a logic model that depicts your intentions and actions. Purpose

24 The Reach of Intended Outcomes
Time: Initial = immediately or very closely following program activity; Intermediate = end of a term or year, typically; Ultimate = end of a year, several years, or beyond formal K-12 schooling.
People: Initial/Intermediate = targeted individual students and/or aggregate groups; Ultimate = may have an organizational or societal impact.
Indicators: Initial = discrete/finite, measurable and/or observable; Intermediate = may be sets of discrete indicators; Ultimate = may or may not be readily measured or observed.

25 Now Try This The Great Recession has had a broad and, in many places, deep impact on Americans. One “solution” to our economic morass has been a call from policymakers for improved financial literacy of our citizens. K-12 schools are seen as the logical place to teach young citizens financial literacy. Sketch a logic model that depicts this “theory of change” Causality

26 Source: STEM Education Symposium (retrieved 9/16/12).

27

28 A Simple Logic Model: 9th Grade Transition Program
Inputs: 9th grade teachers, guidance counselors, student mentors, new students
Processes: Mentor training, summer orientation, mentor activities, milestone activities
Outcomes: Sense of belonging of 9th graders → academic success of 9th graders → promotion to 10th grade → reduced dropout rates → graduation rate of 9th grade cohorts

29

30 Millett, S. M., & Zelman, S. T. (2005). "Scenario analysis and a logic model of public education in Ohio." Strategy & Leadership, 33(2).

31 Source: SUNY NGLC Grant Proposal (retrieved 9/16/12)

32 A Model for Improving Assessment Literacy
Context & Inputs: Experience & expertise of professional developers; experience & expertise of participants (pre-service teachers, in-service teachers, instructional leaders, administrators); psychometric principles translated into practical terms & skills for teachers; state standardized assessments (the de facto curriculum?); state and/or school district curriculum; district goals, initiatives, or imperatives.
Processes of Professional Development: Explore alignment (C=Ia=A); unpack curriculum (content, cognitive levels); create a Table of Specifications (ToS), or "blueprint"; explore uses of a ToS; critique an assessment for validity & reliability; apply principles to the spectrum of assessment types & practices.
Outcomes for Teachers: Understand the role of C=Ia=A alignment; understand the relationship between assessment & evaluation; understand & apply concepts of validity & reliability; create assessments that yield valid & reliable data; create good assessment items, prompts, assignments, & rubrics; use a ToS to create an assessment, critique & improve an assessment, create a unit plan assessment, plan instruction, and analyze assessment results; use assessment data to make decisions (about student learning, for student learning, about assessments, about instruction); contribute to the design & development of common assessments; provide "opportunity to learn" through aligned instruction.
Impact on Student Learning: Increased student achievement; deeper, more meaningful learning; exposure to the full curriculum (e.g., depth & skill development).
Speaker notes: This is the logic model of the project. "Teacher outcomes" = assessment literacy; the aim is impact on student learning. Guiding questions: If teachers' knowledge and skills associated with classroom-based assessment practices are improved, is student learning qualitatively and quantitatively improved? What are the essential knowledge and skills associated with effective classroom-based assessment practices? With what instruments and/or by what means can we empirically draw valid inferences about the nature and degree of student learning? (Clearly, the SOLs alone don't suffice.) What mode and what "dosage" of professional development is sufficient?

33 Program Action-Logic Model
Inputs → Processes (Activities, Participation) → Outcomes (Short-term, Intermediate-term, Long-term)
I-P-O elements are discretely identifiable: steps, resources, events, people, etc. Lines/arrows = hypothesized causal linkages or assumed relationships.

34 http://www.google.com/imgres?imgurl=http://www.researchutilization

35 Developing a Logic Model
Who: Groups of 3-4
Materials: Chart paper (in "landscape" layout), Post-It notes, & markers
Task: Create a basic logic model for the AVID program
(Related indicators: IIIA06, IIIA07, IIIB06, IIIA17, IID09, IID10, IID11, IIA40)

36 Advancement Via Individual Determination (AVID)
The AVID Student: AVID targets students in the academic middle (B, C, and even D students) who have the desire to go to college and the willingness to work hard. These are students who are capable of completing a rigorous curriculum but are falling short of their potential. Occasionally, they will be the first in their families to attend college, and many are from low-income or minority families. AVID pulls these students out of their unchallenging courses and puts them on the college track: acceleration instead of remediation.
The AVID Elective: Not only are students enrolled in their school's toughest classes, such as honors and Advanced Placement, but also in the AVID elective. For one period a day, they learn organizational and study skills, work on critical thinking and asking probing questions, get academic help from peers and college tutors, and participate in enrichment and motivational activities that make college seem attainable. Their self-images improve, and they become academically successful leaders and role models for other students.
Curriculum: The AVID curriculum, based on rigorous standards, was developed by middle and senior high school teachers in collaboration with college professors. It is driven by the WIC-R method, which stands for writing, inquiry, collaboration, and reading. AVID curriculum is used in AVID elective classes, in content-area classes in AVID schools, and even in schools where the AVID elective is not offered.
Results: A well-developed AVID program improves school-wide standardized test scores, advanced rigorous course enrollments, and the number of students attending college.
Excerpted from

37 Why AVID Works
Between the remedial programs for students who lag far behind and the gifted-and-talented programs for a school's brightest children lies the silent majority: average students, who do "okay" in ordinary classes but, because they don't attract attention to themselves, are left alone. Many of these students hunger for more challenging coursework but fear failure. Their potential lies dormant, waiting to be recognized, encouraged, and supported.
First, AVID identifies these students. The selection criteria include:
Ability: Are the students getting Cs and Bs but capable of more?
Desire and determination: Do they want to attend college? Are they willing to work hard to get there?
Membership in an underserved group: Are they in a low-income household? Will they be the first in their family to attend college?
The AVID program is tailored to the needs of this diverse group of students, and it works for them because it:
Accelerates underachieving students into more rigorous courses.
Offers the intensive support students need to succeed in rigorous courses.
Uses Socratic methods and study groups that specifically target the needs of underachieving students.
Is a school-wide initiative, not a school within a school.
Changes the belief system of an entire school by showing that students in the middle academically, as well as low-income and minority students, can achieve at the highest levels and attend college.
Redefines the role of the teacher from lecturer to advocate and guide, and the role of the counselor from gate-keeper to facilitator.
Creates site teams of administrators and educators from different content areas, encouraging communication and sharing among teachers, counselors, and principals.
Is based on research on tracking (the process by which some children are channeled into challenging courses and others are relegated to remedial ones) and on peer influences on student achievement.

38 TRY THIS
Task: Create a logic model for your program
Materials: Chart paper (in "landscape" layout), Post-It notes, markers, and your answers to the earlier questions about your program
(Related indicators: IIIA06, IIIA07, IIIB06, IIIA17, IID09, IID10, IID11, IIA40)

39 Gallery Walk (with Docents)
Can you read the logic model (without the docent's assistance)?
Questions that the logic model raises for you?
Feature(s) you want to "borrow"?
Suggestions to strengthen the logic model?

40 Why bother with logic modeling?
Seems like a lot of work.
Logic models are too complex: how could I realistically ever know all the variables at play in a complex program?
I don't "think" this way.
Even if we created a logic model, what would we do with it?!

41 Limits of Logic Models
Represent intentions, not reality.
Focus on expected outcomes, which may mean missing beneficial unintended outcomes.
Causal relationships are challenging to know.
Do not address whether what we're doing is right.

42 Benefits of Logic Modeling
McLaughlin, J. A., & Jordan, G. B. (1999). Logic models: A tool for telling your program's performance story. Evaluation and Program Planning, 22(1).
Builds a shared understanding of the program and expectations for inputs, processes, and outcomes.
Helpful for program design (e.g., identifying critical elements for goal attainment and plausible linkages among elements).
Points to a balanced set of key performance indicators.

43 Let's Clarify: A Logic Model…
IS | IS NOT

44 Are you going to New York or by train?

45 Your question IS your FOCUS
Implementation Fidelity. Question: "Are we implementing the program as designed?" Focus: inputs & processes.
Goal Attainment. Question: "Are we seeing evidence that we are achieving our intended outcomes?" Focus: initial, intermediate, and/or ultimate outcomes.

46 ESL Dual-Endorsement Program
Inputs: Approved teacher preparation programs (Elem/Sec/SPED); VDOE regulations for ESL preparation; program approval.
Processes: Recruitment of candidates; coursework (MDLL TESOL/ESL courses; scheduling MDLL courses); advising candidates; field experiences (locating field sites, arranging field experiences, orientation, transportation, supervision; ESL summer school in LEAs).
Outcomes: Dually-endorsed teachers; satisfaction with preparation; impact on student learning.

47 Once you have a logic model and performance indicators, what do you do?
Determine the intent (and "audience"): formative evaluation (e.g., implementation fidelity and improvement, with an assumption of continuation) or summative evaluation (e.g., to continue or discontinue).
Articulate a limited number of relevant evaluation questions.
Identify who is needed to conduct the evaluation.
Determine how the evaluation will be conducted (time, money, data collection & analysis, compilation & reporting).
Discussion prompt: Where would Implementation Fidelity (IF) evaluation occur, and where would Goal Attainment (GA) evaluation occur?

48 What is the focus of the program evaluation that would answer each of the following questions? (i.e., "Implementation Fidelity" or "Goal Attainment")
Do methods used by teachers at Smith High School correspond to the principles of block scheduling introduced last fall?
Did the new reading curriculum result in improved reading abilities among 2nd graders? Increased interest in reading?
How consistently are Instructional Teams developing standards-aligned units of instruction for each subject and grade level (IIA01)?
Which recruitment strategies (of the three that we used) were most effective in involving fathers in Early Head Start?
How effective was the character education program in our middle school?
Did our new guidance program help new ESL students transition socially to our school?
How many AVID teachers have we trained during the past three years?
Adapted from Fitzpatrick, Sanders, & Worthen. (2004). Program evaluation: Alternative approaches and practical guidelines (3rd ed.). Boston: Pearson.

49 What Information Do You Use to Answer Your Questions?

50 Examples of Assessment Sources in Schools
Student attendance rates; teacher attendance; PTA membership; achievement gap among subgroups; inequity of class enrollment (e.g., proportionally fewer minority students in AP classes); special education referral rates; behavior referrals; reading levels; SOL scores; PALS scores; DIBELS scores; benchmark test scores; SOL recovery data; SAT scores; AP scores; surveys; class grades; grade distributions; Algebra I enrollment/completion; college acceptances; summer school rates; dropout rates; retention rates; acceleration rates; identification for gifted services; athletic championships; debate team competitions; student demographics; staff demographics (e.g., years of experience, licensure status); family demographics (e.g., income, educational levels); financial resources (budget); per-pupil costs.

51 What assessment sources could you use to gather relevant, accurate, dependable information/data to answer each question?
Do methods used by teachers at Smith High School correspond to the principles of block scheduling introduced last fall?
Did the new reading curriculum result in improved reading abilities among 2nd graders? Increased interest in reading?
How consistently are Instructional Teams developing standards-aligned units of instruction for each subject and grade level (IIA01)?
Which recruitment strategies (of the three that we used) were most effective in involving fathers in Early Head Start?
How effective was the character education program in our middle school?
Did our new guidance program help new ESL students transition socially to our school?
How many AVID teachers have we trained during the past three years?
Adapted from Fitzpatrick, Sanders, & Worthen. (2004). Program evaluation: Alternative approaches and practical guidelines (3rd ed.). Boston: Pearson.

52 Avoid DRIP (Data-Rich, Information-Poor): Align Your Data Sources to Your Logic Model
For each column of the logic model (inputs, processes, initial outcomes, long-term outcomes), identify the data source(s) to assess attainment or completion.
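One way to operationalize this alignment check is a simple table in code. The sketch below is mine, not the workshop's; the stage names and data sources are hypothetical placeholders. It maps each logic-model column to its data sources and flags any column left data-poor:

```python
# Map each logic-model stage to the data source(s) intended to assess
# attainment or completion; empty lists are the "DRIP" gaps to fix.
alignment = {
    "inputs":             ["budget records", "staff rosters"],
    "processes":          ["lesson-plan review", "observation protocol"],
    "initial_outcomes":   ["student engagement survey"],
    "long_term_outcomes": [],  # no aligned source yet: a gap to fix
}

for stage, sources in alignment.items():
    status = ", ".join(sources) if sources else "NO DATA SOURCE ALIGNED"
    print(f"{stage:>20}: {status}")
```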

53 Aligning Focus and Assessment Sources: Process (P) or Outcome (O) Indicator?
Using a classroom observation protocol to determine % of student engagement
Advanced pass rate on high school end-of-course SOL test
Questionnaire of 4th grade math students to determine degree of "mathphobia"
Review of teacher-made lesson plans over the course of a 9-week period to determine % including explicit phonics instruction
Graduation rates
Number of "leveled" books available in the media center tracked over a 3-year period
Survey of students' attitudes about participating in an after-school tutorial program (e.g., "On a scale of 1-to-5, how much did you enjoy attending the after-school program? How helpful was your after-school tutor?")
3rd grade SOL pass rates
% enrollment of minority students in foreign language courses in the high school
Implementation of an advisory period at the middle school level
Change from a standard 7-period bell schedule to an alternating-day block schedule
Average AP scores
Student attendance rates
Teacher attendance rates
Review of committee meeting agendas to determine % that focus on discussion of achievement data
Budget allocation per student
Grade point averages

54 AVID: Advancement Via Individual Determination

55 (Retrieved 9/16/12)

56 TRY THIS
Tasks: Imagine undertaking a Goal Attainment evaluation of your program. Articulate at least 2 evaluation questions. Identify 1-2 performance indicators (a.k.a. data sources) that could help answer each question. If time allows, try doing the same for an Implementation Fidelity evaluation.
Materials: Your logic model

57 Sage Advice
Remember that not everything that matters can be measured, and not everything that can be measured matters.
Know your means from your ends.

58 To make "evidence-based decisions," you must have data that are:
Valid: necessary for drawing appropriate inferences
Reliable: necessary for avoiding erroneous judgments
Triangulated: necessary to confirm data
Longitudinal: necessary to determine trends and patterns
Disaggregated: necessary to provide "fine-grain" analysis, rather than over-generalizing
How do SOL results hold up as a data source?
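As an illustration of the last two criteria, the sketch below (Python with pandas; the records and column names are invented for the example) computes pass rates longitudinally and disaggregated by subgroup, rather than as a single school-wide figure:

```python
import pandas as pd

# Invented records: one row per student per year; 1 = passed the test.
scores = pd.DataFrame({
    "year":     [2010, 2010, 2010, 2010, 2011, 2011, 2011, 2011],
    "subgroup": ["A", "A", "B", "B", "A", "A", "B", "B"],
    "passed":   [1, 1, 0, 1, 1, 1, 1, 0],
})

# Disaggregated + longitudinal: pass rate per subgroup, per year.
pass_rates = scores.groupby(["year", "subgroup"])["passed"].mean()
print(pass_rates.unstack("subgroup"))  # rows = years, columns = subgroups
```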

59 “Without evaluation, change is blind and must be taken on faith”
-- Sanders (2000)

60 Managing Complex Change
Adapted from Knoster, T. (1991). Presentation at TASH Conference, Washington, DC.
Vision + Skills + Incentives + Resources + Action Plan = CHANGE
Missing Vision = CONFUSION
Missing Skills = ANXIETY
Missing Incentives = RESISTANCE
Missing Resources = FRUSTRATION
Missing Action Plan = TREADMILL
Vision + Skills + Incentives + Resources + Action Plan + Evaluative Feedback = CHANGE for the better
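The matrix is essentially a decision table, so a toy encoding (mine, not the presenter's) makes the lookup explicit:

```python
# Knoster's matrix as a lookup: which element is missing -> predicted result.
KNOSTER = {
    None:          "CHANGE",        # all five elements present
    "vision":      "CONFUSION",
    "skills":      "ANXIETY",
    "incentives":  "RESISTANCE",
    "resources":   "FRUSTRATION",
    "action plan": "TREADMILL",
}

print(KNOSTER["resources"])  # FRUSTRATION
print(KNOSTER[None])         # CHANGE (add evaluative feedback for change for the better)
```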

61 Strengthening a Program Evaluation Plan
Jane Jackson is a curriculum coordinator for Greenlawn County Public Schools and has been working closely with the district's middle school faculty to implement interdisciplinary instruction as a means of improving students' academic engagement and achievement. Within each grade, a team of teachers cooperates to develop lessons that are interdisciplinary. Individual members of the team have been assigned responsibility for the areas of English, mathematics, science, and social studies. The program has been implemented for two full years. Anecdotally, the 7th grade Language Arts teachers (there are four of them, as well as collaboration teachers for both special and gifted education) appear to be engaging in interdisciplinary planning and instruction most enthusiastically and most regularly. Working with the school's assistant principal, Lynnell Perkins, and the School Improvement Team chair, Archie Craun, Ms. Jackson has decided to evaluate the 7th grade Language Arts program. Her evaluation tentatively includes the following:
1. The means of the 6th grade Writing and Language Arts SOL tests from last year compared to the means of the 7th grade Writing and Language Arts SOL tests from this year.
2. Monthly interviews of a random sample of 10% of the 7th grade student population to assess students' reactions to the Language Arts portion of the instructional program.
3. A comparison of the mean, median, mode, and range of quarterly grades in 7th grade Language Arts for this year's 7th grade students, last year's 7th grade students (current 8th graders), and the 7th grade students for the two years immediately prior to implementation of the program (current 9th and 10th graders).
4. Observations by an outside observer twice a month, using a scale she has devised to record pupil interaction during class discussions and activities.
5. A weekly tracking of teacher lesson plans for evidence of planned interdisciplinary instruction.
Given what you know about program evaluation, what are some of the strengths of Ms. Jackson's tentative evaluation plan? What would you advise her to change? What would you advise her to add?
Adapted from Fitzpatrick, Sanders, & Worthen. (2004). Program evaluation: Alternative approaches and practical guidelines (3rd ed.). Boston: Pearson.

62 Greenlawn Middle School Interdisciplinary Instruction Initiative
Inputs: Core teachers (English/LA, math, science, history); resource teachers (special education, gifted education)
Processes: Interdisciplinary planning (lesson plans): frequency, enthusiasm
Initial Outcomes: Student engagement (observation protocol); student affect (interview)
Intermediate Outcomes: Student achievement: quarterly class grades in E/LA (mean, median, mode, & range)
Ultimate Outcomes: Student achievement: 6th grade E/LA SOL, 7th grade E/LA SOL

63 Time to Evaluate
Understand basic concepts and principles of program evaluation at the school level.
Create a logic model that accurately depicts the essential components of and relationships among input, process, and outcome variables of a school-based educational program.
Select the appropriate focus for the evaluation of a program.
Identify appropriate data sources aligned to the intended educational outcomes of a selected school-level program.
Appreciate the importance of engaging in intentional program evaluation as a teacher leader.

64 Empirical Research ≠ Program Evaluation
In program evaluation (as contrasted with empirical research):
Identifying focus: organizational need
Hypothesis testing: comparison of outcomes to intended program outcomes
Value judgments: contextually driven
Replication of results: low likelihood
Data collection: broad and heavily dependent upon feasibility
Control of variables: little to none
Generalizability of results: focus is on dependent variables (i.e., results) more so than on independent variables (i.e., the cause)

65 Investigating the Efficacy of a Clinical Faculty Program
Christopher R. Gareis, Ed.D., & Leslie W. Grant, Ph.D., The College of William & Mary
All 50 states and 43 foreign countries; 25% are students of color; 79% of freshmen graduated in the top 10% of their class; 75% of students participate in community service projects.

66

67 What we already know

68

69 Indicators of Improved Effectiveness of Recruitment, Selection, & Training
Currently Active

70 Percentage of Placements with CT vs CF

71 Indicators of Perceived Effectiveness of Training
Coherence between field and university: "I have had student teachers in the past with absolutely no guidance or direction from the college. I'm not sure how effective I was. This class has been so beneficial, and I feel I am more confident in mentoring this time! Thank you!"
Skill development as a mentor: "I gained awareness in many of the philosophies behind coaching. As I learned techniques, I grew as a mentor." "This program is essential for bringing professionalism to the training of student teachers. Mentoring does not always come naturally, and having techniques and strategies to address problems and guide students is invaluable."
Outcome orientation: "I am prepared to mentor aspiring teachers. I am prepared to help them become independent and effective teachers."
Professional growth as a teacher: "I always feel very good about what I'm doing when I leave class. I like having the professional growth and being able to take a good look at my teaching. I have really enjoyed it. What a great program. I didn't realize how much I didn't know about being a CT."

72 Overall Evaluation of Experiences in the Clinical Faculty Training
(Chart: overall evaluation of training; 1 = Poor, 5 = Excellent)

73 What We Want to Know: Cooperating Teachers vs. Clinical Faculty

74

75 Research Questions
RQ1: To what degree do CF differ from CTs in their sense of self-efficacy for the roles of a cooperating teacher?
RQ2: To what degree do mid-term evaluations of student teachers placed with CF differ from those placed with CTs?
RQ3: To what degree do final evaluations of student teachers placed with CF differ from those placed with CTs?
RQ4: To what degree do new teachers who had been placed with CF differ from those who had been placed with CTs with regard to sense of efficacy for teaching, perceived impact on student learning, and intent to remain in the profession?

76

77 Research Methods: Data Collection
Surveys:
Clinical Faculty & Cooperating Teachers (R1), 1998-2011, n = 101, response rate 37.0%
Graduates of the School of Education (a.k.a. new teachers) (R4), 2005-2010, n = 94, response rate: %
Mid-term & Final Student Teaching Evaluations (R2 & R3), 2008-2011 (n = 319):
Student teacher self-evaluations
CT/CF evaluations
University supervisor evaluations

78 Research Methods: Data Analysis
Unit of analysis: cooperating teacher designation
Clinical Faculty (CF): trained through the School of Education's Clinical Faculty Program
Cooperating Teacher (CT): not trained through the School of Education's Clinical Faculty Program
Statistical analyses: t-tests for significant differences according to CT/CF designation
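For readers who want to see the analysis the slide names, here is a minimal sketch (Python with SciPy; the ratings are invented) of an independent-samples t-test run both ways, matching the "equal variances assumed / not assumed" columns reported on slide 89:

```python
from scipy import stats

cf = [5, 4, 5, 4, 5, 4, 5, 5]   # hypothetical Clinical Faculty ratings
ct = [4, 3, 4, 4, 3, 5, 4]      # hypothetical Cooperating Teacher ratings

# Student's t-test: equal variances assumed.
t_s, p_s = stats.ttest_ind(cf, ct, equal_var=True)
# Welch's t-test: equal variances not assumed.
t_w, p_w = stats.ttest_ind(cf, ct, equal_var=False)

print(f"Student's t = {t_s:.3f}, p = {p_s:.3f}")
print(f"Welch's t   = {t_w:.3f}, p = {p_w:.3f}")
```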

79 RQ 4

80 Research Questions 2 & 3
RQ2: To what degree do mid-term evaluations of student teachers placed with CF differ from those placed with CTs?
RQ3: To what degree do final evaluations of student teachers placed with CF differ from those placed with CTs?

81

82 Statistically Significant Differences in Mid-Term Ratings (RQ2)
Competency | CF Rating | CT Rating
1. Understands subject matter and pedagogy... | 2.30 | 2.44
17. Implements assessments for learning | 2.20 | 2.38
26. Participates in and applies professional development | 2.41 | 2.60
30. Demonstrates potential for teacher leadership | 2.34 | 2.51
(4 out of 30 competencies)

83 Differences in Mid-Term Ratings (RQ2)

84 Statistically Significant Differences in Final Ratings (RQ3)
Competency | CF Rating | CT Rating
16. Creates and selects appropriate assessments for learning | 2.36 | 2.52
(1 out of 30 competencies)

85 Differences in Final Ratings (RQ3)

86 "Overall Teaching Effectiveness" Ratings (R2 & R3)
Rating source | Mid-term CF | Mid-term CT | Final CF | Final CT
Student teacher self-evaluations | 2.10 | 2.12 | 2.37 | 2.48
Cooperating teacher evaluations | 2.29 | 2.39 | 2.63 | 2.69
University supervisor evaluations | 2.07 | - | 2.64 | -

87 Overall Rating Comparison
CF/CT University Supervisor Student Teacher

88 Research Question 1 (RQ1)
To what degree do CF differ from CTs in their sense of self-efficacy for roles of a cooperating teacher?

89 Sense of Efficacy for the Role of CF/CT (R1)
Question | CF (trained, n=76) | CT (non-trained, n=25) | t-test p (equal variances assumed) | t-test p (equal variances not assumed)
Convey role as CF/CT* | 4.51 | 4.04 | 0.007 | 0.038
Four fundamental roles of mentoring* | 4.24 | 3.65 | 0.002 | 0.015
Foster relationship | 4.57 | 4.62 | 0.768 | 0.772
Effectively observe* | 4.59 | 3.92 | 0.001 | -
Use a variety of supervision strategies** | 4.19 | - | 0.066 | 0.111
Effectively conference | 4.47 | 4.23 | 0.169 | 0.198
Summatively evaluate ST* | 4.55 | - | 0.006 | -
Impact professional abilities of ST | 4.49 | 4.38 | 0.423 | 0.447
Provide high-quality field experience | 4.54 | - | 0.784 | -
Likelihood ST positively affects pupil learning | 4.37 | - | 0.941 | 0.939
Likelihood ST enters teaching | 4.31 | - | 0.663 | 0.687
Likelihood ST remains in teaching | 3.99 | - | 0.774 | 0.796
Likelihood ST positively impacts pupil learning in first year | 4.41 | - | 0.454 | 0.51
Likelihood ST emerges as a teacher leader | 4.08 | 4.00 | 0.662 | 0.686
* significance < .05; ** significance < .10

90 To what experiences do CF and CT attribute their mentoring acumen?
Group | Most important experience | % indicating
Cooperating Teachers | Experience as a classroom teacher | 28.0%
Clinical Faculty | Clinical Faculty training* | 38.7%
* 20% of CF selected "experience as a classroom teacher" as most important.

91 RQ 1 RQ 2 RQ 3

92 What could explain...
...differences in student teaching outcomes associated with Clinical Faculty?
...the lack of differences between CF and CTs in both intermediate and long-term outcomes?

93

94 Should we continue, discontinue, or change the W&M Clinical Faculty Program?

95

96 Program Evaluation in Use: Berkeley Middle School SOL Jam & Cram, Spring 1999

97 SOL Jam & Cram
Inputs: Analyzed SOL results and class performance; selected students (C or B average, LPT, 7th grade SOL); analyzed SOL test blueprints; targeted specific SOLs for focused review; volunteer teachers (but paid); collaboratively planned lessons; lunch & door prizes provided.
Processes: Invited 175 "in the middle" students; motivational "recruitment"; review & enrichment activities; high-activity, novel instructional strategies; 8 sessions (1/2 hour each); timely (week prior to SOLs).
Initial Outcomes: Sense of "efficacy" for SOL success (survey of students).
Intermediate Outcomes: SOL assessment results.
Ultimate Outcomes: Academically prepared for success in high school; independent thinker; responsible citizen; lifelong learner.

98 Why Did Students Attend?
7th | 8th | Reason
50% | 32% | It is important to me to do my best in school.
33% | 65% | I want to avoid having to go to summer school.
10% | 3% | My parents made me sign up.
7% | -- | Some of my friends signed up, so I did, too.

99 Why Did Students NOT Attend?
7th | 8th | Reason
4% | -- | I do not enjoy learning.
6% | - | I already know that I will have to attend summer school.
- | - | I don't believe that Jam & Cram would help me on any of the SOL tests.
- | - | I do not feel that I fit into the group of students who were selected to attend.
52% | 56% | I have a conflict on the day of Jam & Cram, so I wouldn't be able to attend.
26% | 25% | I forgot to return the slip on time.
9% | - | I can study fine on my own.
- | - | I didn't want to go. ("It was really stupid, dumb.")
3% | - | I don't like school.

100 Post-Survey
Participants: To what degree do you agree or disagree with the following statement: "Jam & Cram helped me to be better prepared for the SOL tests." [4-pt. scale] 7th Grade: 8th Grade:
Non-Participants:
7th | 8th | Response
35% | 22% | I wish I had attended.
45% | 56% | I'm glad that I didn't attend because I think I did well anyway.
21% | - | Other (mostly explanations of why they didn't attend)

101 Percentage of Students Passing SOL Tests: Berkeley Middle School, Grade 8

102

103

104 An Analysis of SOL Data: What We Can Learn
We did great last year, with gains of percentage points in all areas!
Science is our strongest area: highest pass rate, highest advanced pass rate, and relatively low disparity.
Writing is a strong area: high pass rate and relatively low disparity...but very few advanced passes.
English/Reading is a relatively strong area, with a high pass rate and a very high advanced pass rate.
Computer Technology is a consistently strong area...but has the second-highest disparity.
Math 8 is conquerable...but the disparity is wide, and our overall pass rate is still in shaky territory.
History/Social Studies remains our greatest challenge in terms of overall pass rate and disparity by both ethnicity and gender...BUT we had whopping gains last year!
Our year 2002 class (last year's 6th grade) is academically strong. Our 2003 class (this year's 6th grade) is even stronger!
Jam & Cram was a successful strategy by all anecdotal accounts, but it was not effective statistically speaking. We may want to repeat the program, but with definitive subgroups of students and not as a single cram session.
SOL Resource class appears to have been a success based on pass rates alone (nearly 100%), although it has not been compared to control data.

105

