How Do We Know It’s Working? Creating Evaluations for Technology Projects and Evaluations (part I)
Contact Information
- ext. 204
- www.edtechevaluation.com
  - This presentation will be linked to that site (on the Tools page)
Where Do We Stand?
- Who’s working on an actual project?
  - Current?
  - Anticipated?
- Your expectations for today
Workshop Goals
- To review the key elements of effective program evaluation as applied to technology evaluations
- To consider evaluation in the context of your actual projects
Why Evaluate?
- To fulfill program requirements
  - NCLB, and hence Title IID, carries evaluation requirements
- To realize your investment in technology
  - What sort of “difference” has all of this technology made?
Basis in NCLB
“The application shall include: … A description of the process and accountability measures that the applicant will use to evaluate the extent to which activities funded under this subpart are effective in integrating technology into curricula and instruction, increasing the ability of teachers to teach, and enabling students to meet challenging State academic content and student academic achievement standards.”
NCLB Act, Title II, Part D, Section 2414(11)
- One consistent thread in NCLB is evaluation and assessment
  - How can you document that this “intervention” is making a difference?
- All funded work must be based in reflection and data-driven decision-making
- Naturally, this translates to local district proposals
A Framework for Review
Evaluation
- Helps clarify project goals, processes, and products
- Must be tied to indicators of success written for your project’s goals
- Is not a “test” or a checklist of completed activities
- Qualitatively, are you achieving your goals?
- What adjustments can be made to your project to realize greater success?
The Basic Process
- Evaluation Questions
  - Tied to original project goals
- Performance Rubrics
  - Allow for authentic, qualitative, and holistic evaluation
- Data Collection
  - Tied to indicators in the rubrics
- Scoring and Reporting
  - Role of this committee (the evaluation committee)
Who Evaluates?
- Committee of stakeholders (pg 12)
- Outside facilitator?
- Data collection specialists?
- Task checklist
- Other issues:
  - Honesty
  - Perspective
  - Time-intensive
Evaluation Starts with Goals
- Evaluation should be rooted in your goals for how you are going to use or integrate that technology
  - Is more than an infrastructure plan
  - Focuses on technology’s impact on teachers and students
  - Has clear goals and objectives for what you want to see happen
Evaluation Logic Map
Your Project?
- Using the Evaluation Logic Map, map your:
  - Project purpose/vision
  - Goals
  - Objectives
  - Actions
Goals Lead to Questions
- What do you want to see happen?
  - These are your goals
  - Rephrase goals into questions
- Achieving these goals requires a process that can be measured through a formative evaluation
We Start with Goals…
- To improve student achievement through participation in authentic and meaningful science learning experiences.
- To provide advanced science and technology learning opportunities to all students, regardless of learning styles or abilities.
- To produce high-quality science and technology curriculum in which the integration of technology provides “added value” to teaching and learning activities.
- To increase students’ knowledge of the Connecticut River’s history and geology, and to gain an understanding of its past, present, and possible future environmental issues.
…and move to questions
- Has the project developed technology-enhanced science learning experiences that have been instrumental in improving student mastery of the Skills of Inquiry, understanding of the history/geology/ecology of the Connecticut River, and of the 5–8 science curriculum in general?
- Has the project offered teacher professional development that has resulted in improved teacher understanding of universal design principles and technology integration strategies?
…And Then to Indicators
- What is it that you want to measure?
  - Whether the projects have enhanced learning
  - The relationship between the units and
    - The selected curriculum
    - The process by which they were developed
  - Increases in teacher technology skills (in relation to particular standards)
  - Whether the professional development model met its design expectations
    - Collaborative and sustainable
    - Involves multiple subjects and administrators
- Indicators should reflect your project’s unique goals and aspirations
  - Rooted in proposed work
  - Indicators must be indicative of your unique environment... what constitutes success for you might not for someone else
  - Indicators need to be highly descriptive and can include both qualitative and quantitative measures
Try a Sample Indicator
- Going back to the Logic Map, try to develop a few indicators for your sample project
  - Keep it simple
  - Qualitative and quantitative
  - Will you be able to see the indicator?
To Summarize...
- Start with your proposal or technology plan
- From your goals, develop indicators and a performance rubric
Coming in Part II
- Data Collection
- Reporting
How Do We Know It’s Working? Creating Evaluations for Technology Projects and Evaluations (part II)
A Basic Process
- Evaluation Questions
  - Must be tied to original planning goals
- Performance Rubrics
  - Allow for authentic, qualitative, and holistic evaluation
- Data Collection
  - Tied to indicators in the rubrics
- Scoring and Reporting
Measures?
- Classroom observation, interviews, and work-product review
  - What are teachers doing on a day-to-day basis to address student needs?
- Focus groups and surveys
  - Measuring teacher satisfaction
- Triangulation with data from administrators and staff
  - Do other groups confirm that teachers are being served?
Data Collection
- Review existing data
  - Current technology plan
  - Curriculum
  - District/school improvement plans
- www.sun-associates.com/eval/sample
- Create a checklist for data collection
Surveys
- Creating good surveys
  - Length
  - Differentiation (teachers, staff, parents, community, etc.)
  - Quantitative data
  - Attitudinal data
  - Timing/response rates (getting returns!)
- www.sun-associates.com/eval/samples/samplesurv.html
Classroom Observations
- Using an observation template
- Using outside observers
Other Data Elements?
- Artifact analysis
  - A rubric for analyzing teacher and student work?
- Solicitation of teacher/parent/student stories
  - This is a way to gather truly qualitative data
  - What does the community say about the use and impact of technology?
Dissemination
- Compile the report
- Determine how to share the report
  - School committee presentation
  - Press releases
  - Community meetings
Conclusion
- Build evaluation into your technology planning effort
- Remember, not all evaluation is quantitative
- You cannot evaluate what you are not looking for, so it’s important to develop expectations of what constitutes good technology integration
More Information
- ext. 204
- www.sun-associates.com/evaluation
- www.edtechevaluation.com
  - This presentation is linked to that page