Presentation on theme: "Principles of Program Evaluation" — Presentation transcript:
1 Principles of Program Evaluation
A Workshop for the Southwest Virginia Professional Education Consortium
Christopher R. Gareis, Ed.D., The College of William & Mary
3 Perspective on Our Time Together Today
The Profession of Education: The regular and expert exercise of learned judgment and skills in service of one of an individual's life needs and one of society's greater purposes.
Zen and the Art of Professional Development
The Principle of Collective Wisdom
5 Shared Understandings
What is a "program"? A planned, multi-faceted series of activities leading to intended outcomes for individuals, groups, and/or a community.
What is "evaluation"? The use of valid and reliable information/data to make judgments.
6 Definitions of "Program Evaluation"
Judging the worth of a program. (Scriven, 1967)
The process of systematically determining the quality of a school program and how the program can be improved. (Sanders, 2000)
The deliberate process of making judgments and decisions about the effectiveness, direction, and value of educational programs in our schools. (me)
7 Do You Know Of…?
…a program that you thought was effective but was discontinued without explanation?
…a program that you believe is likely a waste of time, money, and effort, and yet it continues?
We tend not to evaluate programs.
If we do evaluate, we tend to do so poorly.
If we do evaluate programs effectively, we oftentimes do not act on our inferences.
8 Random Acts of Improvement
Our Intentions
Our Efforts
9 Aligned Acts of Improvement
Our Intentions
Our Efforts
11 The Role of Evaluative Thinking
Planning → Implementing → Redesigning, with evaluative thinking at the center
Evaluation needs to be a part of a project from the beginning, with the most astute including an evaluator in the planning phase.
Evaluation is a process that begins at the planning stage and feeds into the project at each stage of implementation.
Benefit: increases the chances that you can answer the questions you wish to address and that appropriate measures can be found or developed so you can establish baseline data.
Evaluation is not an event but a process.
(Frechtling, 2007)
12 > 20 major models of program evaluation
Seminal model: Daniel Stufflebeam's CIPP model (1971)
Our focus (evaluative thinking):
Creating a Logic Model
Focusing the Evaluation
Identifying Performance Indicators
13 Our Objectives Today
Understand basic concepts and principles of program evaluation at the school level
Create a logic model that accurately depicts the essential components of and relationships among input, process, and outcome variables of a school-based educational program
Select the appropriate focus for the evaluation of a program
Identify appropriate data sources aligned to the intended educational outcomes of a selected school-level program
Appreciate the importance of engaging in intentional program evaluation as a teacher leader
14 Talk About Your Program
What is your program?
What does your program do?
Who participates in your program?
Whom does your program serve?
How do they benefit?
How do you know, or how would you know, that your program is a success?
15 Think Visually
Sketch a visual metaphor of your program.
16 Every educational program:
Uses resources or INPUTS, such as… materials, funds, teachers, students
Engages in activities or PROCESSES, such as… instruction, interventions
Intends to produce certain results or OUTCOMES, such as… learning, achievement, engendering of certain attitudes or values, eligibility for a next step in life, bringing about some change
17 Visualizing an Educational Program: The Logic Model
Inputs → Processes → Outcomes
18 What is a logic model?
A diagram with text that illustrates and describes the reasoned relationships among program elements and intended outcomes to be attained. A visual theory of change.
We use these resources… for these activities… so that these students/teachers… can produce these results… leading to these changes for the better.
(How / Why)
Modified from a guest lecture by John A. McLaughlin (November 16, 2009) at The College of William & Mary
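To make the input-process-outcome structure concrete, a logic model can also be sketched as a simple data structure. The following is a minimal sketch only; the field names and the example program are illustrative assumptions, not a model from the workshop.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """A minimal sketch of a logic model: inputs feed processes,
    which are hypothesized to lead to a chain of outcomes."""
    inputs: list = field(default_factory=list)                 # resources the program uses
    processes: list = field(default_factory=list)              # activities the program carries out
    initial_outcomes: list = field(default_factory=list)
    intermediate_outcomes: list = field(default_factory=list)
    ultimate_outcomes: list = field(default_factory=list)

# Illustrative example (hypothetical program, not from the workshop):
tutoring = LogicModel(
    inputs=["volunteer tutors", "after-school space", "leveled readers"],
    processes=["twice-weekly small-group tutoring"],
    initial_outcomes=["improved reading fluency"],
    intermediate_outcomes=["higher quarterly reading grades"],
    ultimate_outcomes=["on-grade-level reading by end of year"],
)
print(tutoring.inputs, "->", tutoring.ultimate_outcomes)
```

Reading the structure left to right answers "how" the program is expected to work; the arrows between elements are the hypothesized relationships the evaluation will later test.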
20 Everyday Logic
Get an antihistamine → Take the antihistamine → Feel better
21 Try This
You're hungry. Sketch a logic model to address your need.
Oh, but one twist: You have no food in the house.
Assumptions
22 Greenlawn Middle School Common Math Planning Initiative
Inputs: Core teachers (math); resource teachers (special education, gifted education)
Processes: Common planning by teachers (frequency, enthusiasm)
Initial outcomes: Student affect for instruction; student engagement in instruction
Intermediate outcomes: Student achievement: quarterly class grades in math
Ultimate outcomes: Student achievement: 6th grade Math SOL, 7th grade Math SOL
23 Now Try This
You want to take a camping trip with your family to a state park.
Sketch a logic model that depicts your intentions and actions.
Purpose
24 The Reach of Intended Outcomes
Time:
Initial outcomes: immediately or very closely following program activity
Intermediate outcomes: end of a term or year, typically
Ultimate outcomes: end of a year, several years, or beyond formal K-12 schooling
People:
Initial and intermediate outcomes: targeted individual students and/or aggregate groups
Ultimate outcomes: may have an organizational or societal impact
Indicators:
Initial outcomes: discrete/finite; measurable and/or observable
Intermediate outcomes: may be sets of discrete indicators
Ultimate outcomes: may or may not be readily measured or observed
25 Now Try This
The Great Recession has had a broad and, in many places, deep impact on Americans.
One "solution" to our economic morass has been a call from policymakers for improved financial literacy of our citizens.
K-12 schools are seen as the logical place to teach young citizens financial literacy.
Sketch a logic model that depicts this "theory of change."
Causality
30 Millett, S. M., & Zelman, S. T. (2005). Scenario analysis and a logic model of public education in Ohio. Strategy & Leadership, 33(2).
31 Source: SUNY NGLC Grant Proposal (retrieved 9/16/12)
32 A Model for Improving Assessment Literacy (Christopher R. Gareis, Ed.D.)
Context & Inputs:
Experience & expertise of professional developers
Experience & expertise of participants (pre-service teachers, in-service teachers, instructional leaders, administrators)
Psychometric principles, translated into practical terms & skills for teachers
State standardized assessments (de facto curriculum?)
State &/or school district curriculum
District goals, initiatives, or imperatives
Processes of Professional Development:
Explore alignment (C=Ia=A)
Unpack curriculum (content, cognitive levels)
Create a Table of Specifications or "blueprint"
Critique an assessment for validity & reliability
Explore uses of a ToS: create an assessment; critique & improve an assessment; create a unit plan assessment; plan instruction; analyze assessment results
Outcomes for Teachers:
Understand role of C=Ia=A alignment
Create assessments that yield valid & reliable data
Understand relationship between assessment & evaluation
Use assessment data to make decisions: about student learning, for student learning, about assessments, about instruction
Understand & apply concepts of validity & reliability
Use a ToS to: create an assessment; critique & improve an assessment; create a unit plan assessment; plan instruction; analyze assessment results
Contribute to the design & development of common assessments
Apply principles to the spectrum of assessment types & practices
Create good assessment items, prompts, assignments, & rubrics
Provide "opportunity to learn" through aligned instruction
Impact on Student Learning:
Exposure to the full curriculum (e.g., depth & skill development)
Increased student achievement
Deeper, more meaningful learning
Notes on the logic model:
"Teacher Outcomes" = Assessment Literacy; the aim is impact on student learning.
If teachers' knowledge and skills associated with classroom-based assessment practices are improved, then is student learning qualitatively and quantitatively improved?
What are the essential knowledge and skills associated with effective classroom-based assessment practices?
With what instruments and/or by what means can we empirically draw valid inferences about the nature and degree of student learning? (Clearly, the SOLs alone don't suffice.)
What mode and what "dosage" of professional development is sufficient?
33 Program Action Logic Model
Inputs → Processes (Activities, Participation) → Outcomes (Short-term, Intermediate-term, Long-term)
I-P-O: elements are discretely identifiable (steps, resources, events, people, etc.)
Lines/arrows = hypothesized causal linkages or assumed relationships
35 Developing a Logic Model
Who: Groups of 3-4
Materials: Chart paper (in "landscape layout"), Post-It notes, & markers
Task: Create a basic logic model for the AVID program
Indicators: IIIA06, IIIA07, IIIB06, IIIA17, IID09, IID10, IID11, IIA40
36 Advancement Via Individual Determination (AVID)
The AVID Student
AVID targets students in the academic middle (B, C, and even D students) who have the desire to go to college and the willingness to work hard. These are students who are capable of completing rigorous curriculum but are falling short of their potential. Occasionally, they will be the first in their families to attend college, and many are from low-income or minority families. AVID pulls these students out of their unchallenging courses and puts them on the college track: acceleration instead of remediation.
The AVID Elective
Not only are students enrolled in their school's toughest classes, such as honors and Advanced Placement, but also in the AVID elective. For one period a day, they learn organizational and study skills, work on critical thinking and asking probing questions, get academic help from peers and college tutors, and participate in enrichment and motivational activities that make college seem attainable. Their self-images improve, and they become academically successful leaders and role models for other students.
Curriculum
The AVID curriculum, based on rigorous standards, was developed by middle and senior high school teachers in collaboration with college professors. It is driven by the WIC-R method, which stands for writing, inquiry, collaboration, and reading. AVID curriculum is used in AVID elective classes, in content area classes in AVID schools, and even in schools where the AVID elective is not offered.
Results
A well-developed AVID program improves school-wide standardized test scores, advanced rigorous course enrollments, and the number of students attending college.
Excerpted from
37 Why AVID Works
Between the remedial programs for students who lag far behind and the gifted-and-talented programs for a school's brightest children lies the silent majority: average students, who do "okay" in ordinary classes but, because they don't attract attention to themselves, are left alone. Many of these students hunger for more challenging coursework, but fear failure. Their potential lies dormant, waiting to be recognized, encouraged, and supported.
First, AVID identifies these students. The selection criteria include:
Ability: Are the students getting Cs and Bs but capable of more?
Desire and determination: Do they want to attend college? Are they willing to work hard to get there?
Membership in an underserved group: Are they in a low-income household? Will they be the first in their family to attend college?
The AVID program is tailored to the needs of this diverse group of students, and it works for them because it:
Accelerates underachieving students into more rigorous courses.
Offers the intensive support students need to succeed in rigorous courses.
Uses Socratic methods and study groups that specifically target the needs of under-achieving students.
Is a school-wide initiative, not a school within a school.
Changes the belief system of an entire school by showing that students who are in the middle academically, as well as low-income and minority students, can achieve at the highest levels and attend college.
Redefines the role of the teacher from lecturer to advocate and guide. The role of counselor changes from gate-keeper to facilitator.
Creates site teams of administrators and educators from different content areas, encouraging communication and sharing among teachers, counselors, and principals.
Is based on research on tracking (the process by which some children are channeled into challenging courses and others are relegated to remedial ones) and on peer influences on student achievement.
38 Try This
Task: Create a logic model for your program
Materials:
Chart paper (in "landscape layout")
Post-It notes
Markers
Answers to the earlier questions about your program
Indicators: IIIA06, IIIA07, IIIB06, IIIA17, IID09, IID10, IID11, IIA40
39 Gallery Walk (with Docents)
Can you read the logic model (without the docent's assistance)?
What questions does the logic model raise for you?
What feature(s) do you want to "borrow"?
What suggestions do you have to strengthen the logic model?
40 Why bother with logic modeling?
Seems like a lot of work.
Logic models are too complex; how could I realistically ever know all the variables at play in a complex program?
I don't "think" this way.
Even if we created a logic model, what would we do with it?!
41 Limits of Logic Models
Represent intentions, not reality
Focus on expected outcomes, which may mean missing beneficial unintended outcomes
Causal relationships are challenging to know
Do not address whether what we're doing is right
42 Benefits of Logic Modeling
Builds a shared understanding of the program and expectations for inputs, processes, and outcomes
Helpful for program design (e.g., identifying critical elements for goal attainment and plausible linkages among elements)
Points to a balanced set of key performance indicators
McLaughlin, J. A., & Jordan, G. B. (1999). Logic models: A tool for telling your program's performance story. Evaluation and Program Planning, 22.
43 Let's Clarify: A Logic Model…
IS / IS NOT
45 Your Question IS Your Focus
Implementation Fidelity
Question: "Are we implementing the program as designed?"
Focus: Inputs & Processes
Goal Attainment
Question: "Are we seeing evidence that we are achieving our intended outcomes?"
Focus: Initial, Intermediate, and/or Ultimate Outcomes
46 ESL Dual-Endorsement Program
Approved teacher preparation programs
Recruitment of candidates
Elem/Sec/SPED program
Dually-endorsed teachers
Satisfaction with preparation
Coursework: MDLL TESOL/ESL courses; scheduling MDLL courses
Field experiences: advising candidates, arranging field experiences, orientation, transportation, supervision
Impact on student learning
ESL summer school in LEAs
Locating field sites
VDOE regulations for ESL preparation
Program approval
47 Once you have a logic model and performance indicators, what do you do?
Where would implementation fidelity (IF) evaluation occur, and where would goal attainment (GA) evaluation occur?
Determine the intent (and "audience"):
Formative evaluation (e.g., implementation fidelity and improvement, with an assumption of continuation)
Summative evaluation (e.g., to continue or discontinue)
Articulate a limited number of relevant evaluation questions
Identify who is needed to conduct the evaluation
Determine how the evaluation will be conducted (time, money, data collection & analysis, compilation & reporting)…
48 What is the focus of the program evaluation (i.e., "Implementation Fidelity" or "Goal Attainment") that would answer each of the following questions?
Do methods used by teachers at Smith High School correspond to the principles of block scheduling introduced last fall?
Did the new reading curriculum result in improved reading abilities among 2nd graders? Increased interest in reading?
How consistently are Instructional Teams developing standards-aligned units of instruction for each subject and grade level (IIA01)?
Which recruitment strategies (of the three that we used) were most effective in involving fathers in Early Head Start?
How effective was the character education program in our middle school?
Did our new guidance program help new ESL students transition socially to our school?
How many AVID teachers have we trained during the past three years?
Adapted from Fitzpatrick, Sanders, & Worthen. (2004). Program evaluation: Alternative approaches and practical guidelines (3rd ed.). Boston: Pearson.
49 What Information Do You Use to Answer Your Questions?
50 Examples of Assessment Sources in Schools
Student attendance rates, teacher attendance, PTA membership, achievement gap among subgroups, inequity of class enrollment (e.g., proportionally fewer minority students in AP classes), special education referral rates, behavior referrals, reading levels, SOL scores, PALS scores, DIBELS scores, benchmark test scores, SOL recovery data, SAT scores, AP scores, surveys, class grades, grade distributions, Algebra I enrollment/completion, college acceptances, summer school rates, dropout rates, retention rates, acceleration rates, identification for gifted services, athletic championships, debate team competitions, student demographics, staff demographics (e.g., years of experience, licensure status), family demographics (e.g., income, educational levels), financial resources (budget), per pupil costs
51 What assessment sources could you use to gather relevant, accurate, dependable information/data to answer each question?
Do methods used by teachers at Smith High School correspond to the principles of block scheduling introduced last fall?
Did the new reading curriculum result in improved reading abilities among 2nd graders? Increased interest in reading?
How consistently are Instructional Teams developing standards-aligned units of instruction for each subject and grade level (IIA01)?
Which recruitment strategies (of the three that we used) were most effective in involving fathers in Early Head Start?
How effective was the character education program in our middle school?
Did our new guidance program help new ESL students transition socially to our school?
How many AVID teachers have we trained during the past three years?
Adapted from Fitzpatrick, Sanders, & Worthen. (2004). Program evaluation: Alternative approaches and practical guidelines (3rd ed.). Boston: Pearson.
52 Avoid DRIP (Data Rich, Information Poor): Align Your Data Sources to Your Logic Model
For each element of the logic model (Inputs, Processes, Initial Outcomes, Long-term Outcomes), identify the data source(s) to assess attainment or completion.
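One way to keep this alignment explicit is to record it in a simple structure and flag any logic-model element that has no aligned data source. The sketch below is illustrative only; the element names and data sources are assumptions, not drawn from a specific program in the workshop.

```python
# A minimal sketch of aligning data sources to logic-model elements.
# The element names and data sources below are illustrative assumptions.
alignment = {
    "inputs":             ["budget allocation per student", "staff demographics"],
    "processes":          ["lesson plan review", "classroom observation protocol"],
    "initial_outcomes":   ["benchmark test scores", "student engagement survey"],
    "long_term_outcomes": ["SOL pass rates", "retention rates"],
}

# Flag any logic-model element with no aligned data source; a gap here is one
# symptom of DRIP (collecting data that cannot answer your evaluation questions).
for element, sources in alignment.items():
    if not sources:
        print(f"No data source aligned to: {element}")
```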
53 Aligning Focus and Assessment Sources: Process (P) or Outcome (O) Indicator?
Using a classroom observation protocol to determine % of student engagement
Advanced pass rate on high school end-of-course SOL test
Questionnaire of 4th grade math students to determine degree of "mathphobia"
Review of teacher-made lesson plans over the course of a 9-week period to determine % including explicit phonics instruction
Graduation rates
Number of "leveled" books available in the media center tracked over a 3-year period
Survey of students' attitudes about participating in an after-school tutorial program (e.g., "On a scale of 1-to-5, how much did you enjoy attending the after-school program? How helpful was your after-school tutor?")
3rd grade SOL pass rates
% enrollment of minority students in foreign language courses in the high school
Implementation of an advisory period at the middle school level
Change from a standard 7-period bell schedule to an alternating-day block schedule
Average AP scores
Student attendance rates
Teacher attendance rates
Review of committee meeting agendas to determine % that focus on discussion of achievement data
Budget allocation per student
Grade Point Averages
56 Try This
Tasks:
Imagine undertaking a Goal Attainment evaluation of your program.
Articulate at least 2 evaluation questions.
Identify 1-2 performance indicators (aka, data sources) that could help answer each question.
If time allows, try doing the same for an Implementation Fidelity evaluation.
Materials: Your logic model
57 Sage Advice
Remember that not everything that matters can be measured, and not everything that can be measured matters.
Know your means from your ends.
58 To make "evidence-based decisions," you must have data that are:
Valid: necessary for drawing appropriate inferences
Reliable: necessary for avoiding erroneous judgments
Triangulated: necessary to confirm data
Longitudinal: necessary to determine trends and patterns
Disaggregated: necessary to provide "fine-grain" analysis, rather than over-generalizing
How do SOL results hold up as a data source?
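As a concrete illustration of the longitudinal and disaggregated criteria, here is a minimal pandas sketch that computes pass rates by year and by subgroup. The column names and values are invented placeholders for demonstration only, not actual SOL results.

```python
import pandas as pd

# Illustrative pass/fail records; values are placeholders, not real data.
records = pd.DataFrame({
    "year":     [2010, 2010, 2010, 2011, 2011, 2011],
    "subgroup": ["A", "B", "A", "A", "B", "B"],
    "passed":   [1, 0, 1, 1, 1, 0],
})

# Longitudinal view: pass rate by year (trends and patterns over time).
print(records.groupby("year")["passed"].mean())

# Disaggregated view: pass rate by year and subgroup ("fine-grain" analysis).
print(records.groupby(["year", "subgroup"])["passed"].mean())
```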
59 “Without evaluation, change is blind and must be taken on faith” -- Sanders (2000)
60 Managing Complex Change
Vision + Skills + Incentives + Resources + Action Plan = CHANGE
Missing Vision: CONFUSION
Missing Skills: ANXIETY
Missing Incentives: RESISTANCE
Missing Resources: FRUSTRATION
Missing Action Plan: TREADMILL
Add Evaluative Feedback: CHANGE for the better
Adapted from Knoster, T. (1991). Presentation at TASH Conference, Washington, DC.
61 Strengthening a Program Evaluation Plan
Jane Jackson is a curriculum coordinator for Greenlawn County Public Schools and has been working closely with the district's middle school faculty to implement interdisciplinary instruction as a means to improving students' academic engagement and achievement. Within each grade, a team of teachers cooperates to develop lessons that are interdisciplinary. Individual members of the team have been assigned responsibility for the areas of English, mathematics, science, and social studies. The program has been implemented for two full years.
Anecdotally, the 7th grade Language Arts teachers (there are four of them, as well as collaboration teachers for both special and gifted education) appear to be engaging in interdisciplinary planning and instruction most enthusiastically and most regularly. Working with the school's assistant principal, Lynnell Perkins, and the School Improvement Team chair, Archie Craun, Ms. Jackson has decided to evaluate the 7th grade Language Arts program. Her evaluation tentatively includes the following:
1. The means of the 6th grade Writing and Language Arts SOL tests from last year compared to the means of the 7th grade Writing and Language Arts SOL tests from this year.
2. Monthly interviews of a random sample of 10% of the 7th grade student population to assess students' reactions to the Language Arts portion of the instructional program.
3. A comparison of the mean, median, mode, and range of quarterly grades in 7th grade Language Arts for this year's 7th grade students, last year's 7th grade students (current 8th graders), and the 7th grade students for the two years immediately prior to implementation of the program (current 9th and 10th graders).
4. Observations by an outside observer twice a month, using a scale she has devised to record pupil interaction during class discussions and activities.
5. A weekly tracking of teacher lesson plans for evidence of planned interdisciplinary instruction.
Given what you know about program evaluation, what are some of the strengths of Ms. Jackson's tentative evaluation plan? What would you advise her to change? What would you advise her to add?
Adapted from Fitzpatrick, Sanders, & Worthen. (2004). Program evaluation: Alternative approaches and practical guidelines (3rd ed.). Boston: Pearson.
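Item 3 of the plan calls for comparing the mean, median, mode, and range of quarterly grades across cohorts. The following is a minimal Python sketch of that computation; the cohort labels and grade values are illustrative placeholders, not Greenlawn data.

```python
import statistics

# Illustrative quarterly grades by cohort; placeholder values for demonstration.
quarterly_grades = {
    "current 7th graders":            [88, 92, 75, 83, 90, 75],
    "current 8th graders":            [85, 79, 91, 75, 88, 79],
    "pre-implementation (9th/10th)":  [82, 76, 80, 91, 76, 84],
}

for cohort, grades in quarterly_grades.items():
    print(
        cohort,
        "mean:", round(statistics.mean(grades), 1),
        "median:", statistics.median(grades),
        "mode:", statistics.mode(grades),
        "range:", max(grades) - min(grades),
    )
```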
62 Greenlawn Middle School Interdisciplinary Instruction Initiative
Inputs: Core teachers (English/LA, math, science, history); resource teachers (special education, gifted education)
Processes: Interdisciplinary planning (lesson plans): frequency, enthusiasm
Initial outcomes: Student affect (interview); student engagement (observation protocol)
Intermediate outcomes: Student achievement: quarterly class grades in E/LA (mean, median, mode, & range)
Ultimate outcomes: Student achievement: 6th grade E/LA SOL, 7th grade E/LA SOL
63 Time to Evaluate
Understand basic concepts and principles of program evaluation at the school level
Create a logic model that accurately depicts the essential components of and relationships among input, process, and outcome variables of a school-based educational program
Select the appropriate focus for the evaluation of a program
Identify appropriate data sources aligned to the intended educational outcomes of a selected school-level program
Appreciate the importance of engaging in intentional program evaluation as a teacher leader
64 Empirical Research ≠ Program Evaluation
How program evaluation differs from empirical research on key dimensions:
Identifying focus: organizational need
Hypotheses testing: comparison of outcomes to intended program outcomes
Value judgments: contextually driven
Replication of results: low likelihood
Data collection: broad and heavily dependent upon feasibility
Control of variables: little to none
Generalizability of results: focus on dependent variables (i.e., results) more so than on independent variables (i.e., the cause)
65 Investigating the Efficacy of a Clinical Faculty Program
Christopher R. Gareis, Ed.D. & Leslie W. Grant, Ph.D., The College of William & Mary
Students from all 50 states and 43 foreign countries
25% are students of color
79% of freshmen graduated in the top 10% of their class
75% of students participate in community service projects
71 Indicators of Perceived Effectiveness of Training
Coherence between field and university:
"I have had student teachers in the past with absolutely no guidance or direction from the college. I'm not sure how effective I was. This class has been so beneficial and I feel I am more confident in mentoring this time! Thank you!"
Skill development as a mentor:
"I gained awareness in many of the philosophies behind coaching. As I learned techniques, I grew as a mentor."
"This program is essential for bringing professionalism to the training of student teachers. Mentoring does not always come naturally and having techniques and strategies to address problems and guide students is invaluable."
Outcome orientation:
"I am prepared to mentor aspiring teachers. I am prepared to help them become independent and effective teachers."
Professional growth as a teacher:
"I always feel very good about what I'm doing when I leave class. I like having the professional growth and being able to take a good look at my teaching. I have really enjoyed it. What a great program."
"I didn't realize how much I didn't know about being a CT."
72 Overall Evaluation of Experiences in the Clinical Faculty Training
(Chart: overall evaluation of training; 1 = Poor, 5 = Excellent)
73 What We Want to Know:
Cooperating Teachers
Clinical Faculty
75 Research Questions
1. To what degree do CF differ from CTs in their sense of self-efficacy for the roles of a cooperating teacher?
2. To what degree do mid-term evaluations of student teachers placed with CF differ from those placed with CTs?
3. To what degree do final evaluations of student teachers placed with CF differ from those placed with CTs?
4. To what degree do new teachers who had been placed with CF differ from those who had been placed with CTs with regard to sense of efficacy for teaching, perceived impact on student learning, and intent to remain in the profession?
77 Research Methods: Data Collection
Surveys
Clinical Faculty & Cooperating Teachers (R1): 1998-2011, n = 101, response rate 37.0%
Graduates of the School of Education (aka, new teachers) (R4): 2005-2010, n = 94, response rate: %
Mid-term & Final Student Teaching Evaluations (R2 & R3): 2008-2011 (n = 319)
Student teacher self-evaluations
CT/CF evaluations
University supervisor evaluations
78 Research Methods: Data Analysis
Unit of analysis: cooperating teacher designation
Clinical Faculty (CF): trained through the School of Education's Clinical Faculty Program
Cooperating Teacher (CT): not trained through the School of Education's Clinical Faculty Program
Statistical analyses: t-tests for significant differences according to CT/CF designation
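To show what this kind of comparison looks like in practice, here is a minimal sketch of a two-sample t-test run both with and without the equal-variance assumption (as reported in the efficacy table later in the deck), using SciPy. The rating arrays are illustrative placeholders, not the study's data.

```python
from scipy import stats

# Illustrative placeholder ratings for one survey item; not the study's data.
cf_ratings = [5, 4, 5, 4, 5, 4, 5]   # Clinical Faculty (trained)
ct_ratings = [4, 4, 3, 4, 4, 3, 5]   # Cooperating Teachers (non-trained)

# Student's t-test (equal variances assumed)
t_equal, p_equal = stats.ttest_ind(cf_ratings, ct_ratings, equal_var=True)

# Welch's t-test (equal variances not assumed)
t_welch, p_welch = stats.ttest_ind(cf_ratings, ct_ratings, equal_var=False)

print(f"equal variances assumed:     t = {t_equal:.2f}, p = {p_equal:.3f}")
print(f"equal variances not assumed: t = {t_welch:.2f}, p = {p_welch:.3f}")
```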
80 Research Questions 2 & 3
To what degree do mid-term evaluations of student teachers placed with CF differ from those placed with CTs?
To what degree do final evaluations of student teachers placed with CF differ from those placed with CTs?
82 Statistically Significant Differences in Mid-Term Ratings (RQ2)
Significant differences were found on 4 out of 30 competencies (CF rating vs. CT rating):
1. Understands subject matter and pedagogy: CF 2.30, CT 2.44
17. Implements assessments for learning: CF 2.20, CT 2.38
26. Participates in and applies professional development: CF 2.41, CT 2.60
30. Demonstrates potential for teacher leadership: CF 2.34, CT 2.51
88 Research Question 1 (RQ1)
To what degree do CF differ from CTs in their sense of self-efficacy for the roles of a cooperating teacher?
89 Sense of Efficacy for the Role of CF/CT (R1)
Question | Clinical Faculty (trained, n=76) | Cooperating Teachers (non-trained, n=25) | t-test p (equal variances assumed) | t-test p (equal variances not assumed)
Convey role as CF/CT* | 4.51 | 4.04 | 0.007 | 0.038
Four fundamental roles of mentoring* | 4.24 | 3.65 | 0.002 | 0.015
Foster relationship | 4.57 | 4.62 | 0.768 | 0.772
Effectively observe* | 4.59 | 3.92 | 0.001 |
Use a variety of supervision strategies** | 4.19 | | 0.066 | 0.111
Effectively conference | 4.47 | 4.23 | 0.169 | 0.198
Summatively evaluate ST* | 4.55 | | 0.006 |
Impact professional abilities of ST | 4.49 | 4.38 | 0.423 | 0.447
Provide high-quality field experience | 4.54 | | 0.784 |
Likelihood ST positively affects pupil learning | 4.37 | | 0.941 | 0.939
Likelihood ST enters teaching | 4.31 | | 0.663 | 0.687
Likelihood ST remains in teaching | 3.99 | | 0.774 | 0.796
Likelihood ST positively impacts pupil learning in first year | 4.41 | | 0.454 | 0.51
Likelihood ST emerges as a teacher leader | 4.08 | 4 | 0.662 | 0.686
*significance < .05   **significance < .10
90 To what experiences do CF and CT attribute their mentoring acumen?
Cooperating Teachers: experience as a classroom teacher (28.0% indicated this as the most important experience)
Clinical Faculty: Clinical Faculty training* (38.7% indicated this as the most important experience)
* 20% of CF selected "experience as a classroom teacher" as most important.
96 Program Evaluation in Use
Berkeley Middle School SOL Jam & Cram, Spring 1999
97 SOL Jam & Cram: Program Logic Model
(Columns: Inputs, Processes, Outputs, Initial Outcomes, Intermediate Outcomes, Ultimate Outcomes)
Analyzed SOL results and class performance
Selected students: C or B average, LPT, 7th grade SOL
Invited 175 "in the middle" students
Motivational "recruitment"
Analyzed SOL test blueprints
Targeted specific SOLs for focused review
Review & enrichment activities
High-activity, novel instructional strategies
8 sessions (1/2 hour each)
Timely (week prior to SOLs)
Volunteer teachers (but paid)
Collaboratively planned lessons
Lunch & door prizes provided
Sense of "efficacy" for SOL success (survey of students)
SOL assessment results
Academically prepared for success in high school
Independent thinker
Responsible citizen
Lifelong learner
98 Why Did Students Attend? (7th grade / 8th grade)
It is important to me to do my best in school: 50% / 32%
I want to avoid having to go to summer school: 33% / 65%
My parents made me sign up: 10% / 3%
Some of my friends signed up, so I did, too: 7% / --
99 Why Did Students NOT Attend? (7th grade / 8th grade)
I do not enjoy learning: 4% / --
I already know that I will have to attend summer school: 6% /
I don't believe that Jam & Cram would help me on any of the SOL tests:
I do not feel that I fit into the group of students who were selected to attend:
I have a conflict on the day of Jam & Cram, so I wouldn't be able to attend: 52% / 56%
I forgot to return the slip on time: 26% / 25%
I can study fine on my own: 9% /
I didn't want to go. ("It was really stupid, dumb."):
I don't like school: 3% /
100 Post-Survey
Participants: To what degree do you agree or disagree with the following statement: "Jam & Cram helped me to be better prepared for the SOL tests." [4 pt. scale]
7th Grade:
8th Grade:
Non-Participants (7th grade / 8th grade):
I wish I had attended: 35% / 22%
I'm glad that I didn't attend because I think I did well anyway: 45% / 56%
Other (mostly explanations of why they didn't attend): 21% /
101 Percentage of Students Passing SOL Tests Berkeley Middle School Grade 8
104 An Analysis of SOL Data: What We Can Learn
We did great last year with gains of percentage points in all areas!
Science is our strongest area: highest pass rate, highest advanced pass rate, and relatively low disparity.
Writing is a strong area: high pass rate and relatively low disparity…but very few advanced passes.
English/Reading is a relatively strong area with a high pass rate and a very high advanced pass rate.
Computer Technology is a consistently strong area…but has the second highest disparity.
Math 8 is conquerable…but the disparity is wide and our overall pass rate is still in shaky territory.
History/Social Studies remains our greatest challenge in terms of overall pass rate and disparity by both ethnicity and gender…BUT we had whopping gains last year!
Our year 2002 class (last year's 6th grade) is academically strong. Our 2003 class (this year's 6th grade) is even stronger!
Jam & Cram was a successful strategy by all anecdotal accounts, but was not effective statistically speaking. We may want to repeat the program, but with definitive subgroups of students and not in a single cram session.
SOL Resource class appears to have been a success based on pass rates alone (nearly 100%), although it has not been compared to control data.