Data Collection Techniques For Technology Evaluation and Planning


Contact Information
- ext. 204
- www.edtechevaluation.com
  - This presentation will be linked to that site (on the Tools page)

Where Do We Stand?
- Who’s working on an actual project?
  - Current?
  - Anticipated?
- Your expectations for today

Objectives
- To review the key elements of effective program evaluation as applied to technology evaluations
- To understand the role of data collection in an overall evaluation process
- To review various data collection strategies

Why Evaluate?
- To fulfill program requirements
  - NCLB, and hence Title IID, carries evaluation requirements
  - One of the seven program requirements for NY Title IID Competitive Grants:
    - “Each grantee will be required to develop ‘process and accountability measures’ to evaluate the extent to which activities funded are effective in (1) integrating technology into curricula and instruction; (2) increasing the ability of teachers to teach; and (3) enabling students to meet challenging State standards. Records relating to these ‘process and accountability measures’ are to be made available on request to the NYS Education Department (or its agents).”

- Project evaluation is also required as an overall part of each proposal:
  - “Describe the plan for evaluating the effectiveness of the competitive grant project. The plan should include clear benchmarks and timelines to monitor progress toward specific objectives and outcome measures to assess impact on student learning and achievement. It must address the extent to which activities funded are effective in (1) integrating technology into curricula and instruction; (2) increasing the ability of teachers to teach; and (3) enabling students to meet challenging State standards.”
- 10% of the points… 10% of the budget?

A Framework for Review

Evaluation
- Helps clarify project goals, processes, and products
- Must be tied to indicators of success written for your project’s goals
- Not a “test” or checklist of completed activities
- Qualitatively, are you achieving your goals?
- What adjustments can be made to your project to realize greater success?

The Basic Process
- Evaluation questions
  - Tied to original project goals
- Performance rubrics
  - Allow for authentic, qualitative, and holistic evaluation
- Data collection
  - Tied to indicators in the rubrics
- Scoring and reporting
  - Role of this committee (the evaluation committee)

Who Evaluates?
- Committee of stakeholders (pg 13)
- Outside facilitator?
- Data collection specialists?
- Task checklist (pg 11)

Data Collection vs. Evaluation
- Evaluation is more than data collection
- Evaluation is about…
  - Creating questions
  - Creating indicators
  - Collecting data
  - Analyzing and using data
- Data collection occurs within the context of a broader evaluation effort

Evaluation Starts with Goals
- Evaluation should be rooted in your goals for how you are going to use or integrate technology
- A logic map can help highlight the connections between your project’s purpose, goals, and actions
  - And actions form the basis for data collection!
  - pg 15

Example Project Logic Map

Goals Lead to Questions
- What do you want to see happen?
  - These are your goals
  - Rephrase goals into questions
- Achieving these goals requires a process that can be measured through a formative evaluation

We Start with Goals…
- To improve student achievement through participation in authentic and meaningful science learning experiences.
- To provide advanced science and technology learning opportunities to all students regardless of learning styles or abilities.
- To produce high-quality science and technology curriculum in which the integration of technology provides “added value” to teaching and learning activities.
- To increase students’ knowledge of the Connecticut River’s history and geology, and to gain an understanding of its past, present, and possible future environmental issues.

…and Move to Questions
- Has the project developed technology-enhanced science learning experiences that have been instrumental in improving student mastery of the Skills of Inquiry, understanding of the history/geology/ecology of the Connecticut River, and of the 5-8 science curriculum in general?
- Has the project offered teacher professional development that has resulted in improved teacher understanding of universal design principles and technology integration strategies?

…And Then to Indicators
- What is it that you want to measure?
  - Whether the projects have enhanced learning
  - The relationship between the units and
    - The selected curriculum
    - The process by which they were developed
  - Increases in teacher technology skills (in relation to particular standards)
  - Whether the professional development model met its design expectations
    - Collaborative and sustainable
    - Involves multiple subjects and administrators

- Indicators should reflect your project’s unique goals and aspirations
  - Rooted in the proposed work
  - Indicators must be indicative of your unique environment... what constitutes success for you might not for someone else
  - Indicators need to be highly descriptive and can include both qualitative and quantitative measures
- You collect data on your indicators
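Since everything downstream hangs on the indicators, it can help to keep the whole goal-to-data chain in one place. Here is a minimal sketch in Python (the field names and the exact wording of the entries are illustrative assumptions, not taken from this deck) of a plan that ties each evaluation question back to a goal and forward to its indicators and data sources:

```python
# Illustrative structure only: one record per evaluation question, so no
# data gets collected without a question and no question lacks indicators.
evaluation_plan = [
    {
        "goal": "Improve student mastery of the skills of inquiry",
        "question": ("Have technology-enhanced science units improved "
                     "student mastery of the skills of inquiry?"),
        "indicators": [
            "Student work shows hypothesis testing and use of evidence",
            "Units integrate technology in ways teachers judge as added value",
        ],
        "data_sources": ["artifact review", "teacher survey", "observation"],
    },
]

# Sanity check: every question must have indicators and a data source.
for item in evaluation_plan:
    assert item["indicators"] and item["data_sources"], item["question"]
```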

Evidence?
- Classroom observation, interviews, and work-product review
  - What are teachers doing on a day-to-day basis to address student needs?
- Focus groups and surveys
  - Measuring teacher satisfaction
- Triangulation with data from administrators and staff
  - Do other groups confirm that teachers are being served?
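To make triangulation concrete, the sketch below (hypothetical indicator text and source names, not drawn from the deck) tallies which sources support each finding and flags anything that rests on a single source:

```python
# Map each indicator to the data sources that currently support it.
evidence = {
    "Teachers adapt lessons to student needs": {"observation", "interview"},
    "Teachers are satisfied with PD support": {"survey"},
}

for indicator, sources in evidence.items():
    verdict = "triangulated" if len(sources) >= 2 else "needs confirmation"
    print(f"{indicator}: {verdict} ({', '.join(sorted(sources))})")
```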

Data Collection
- Review existing data
  - Current technology plan
  - Curriculum
  - District/school improvement plans
  - Others?

Tools and Techniques
- Surveys
- Interviews
- Observations
- Artifact analysis

Surveys
- Online vs. paper
  - Is there sufficient connectivity?
  - Doesn’t have to be at the classroom level
  - Often works best if people complete the instruments all at the same time
  - The same goes for paper surveys
- Online surveys provide immediate data
- Results arrive as spreadsheets that can be exported to a variety of programs for analysis
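Once the tool exports its spreadsheet, a first pass at the data takes only a few lines. A sketch using pandas; the file name and the column names (school, q1_tech_use) are placeholders rather than any particular tool’s export format:

```python
import pandas as pd

# Load the exported responses (CSV assumed; most tools offer this).
responses = pd.read_csv("survey_export.csv")
print(len(responses), "responses received")

# Frequencies for one multiple-choice item...
print(responses["q1_tech_use"].value_counts())

# ...and the same item broken out by school, to spot uneven patterns.
print(responses.groupby("school")["q1_tech_use"].value_counts())
```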

Surveys
- Online
  - VIVED
  - Profiler
  - LoTi
  - Zoomerang
  - SurveyMonkey.com

Make Your Own!
- www.sun-associates.com/neccsurv.html
- Based on a CGI script on your webserver
- Outputs to a text file, readable by Excel
- Works with yes/no, choose-from-a-list, and free-text input (no branching)
- www.sun-associates.com/surveyws/surveys.html
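The linked CGI script is the original; as a rough stand-in showing the same idea with only Python’s standard library (the field names are placeholders, and as in the original there is no branching), a minimal handler that appends each form POST to an Excel-readable CSV might look like this:

```python
import csv
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs

FIELDS = ["school", "q1_yes_no", "comments"]  # placeholder question fields

class SurveyHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Parse the submitted form body into a field -> values mapping.
        length = int(self.headers.get("Content-Length", 0))
        form = parse_qs(self.rfile.read(length).decode("utf-8"))
        # Append one row per submission; Excel opens this file directly.
        with open("responses.csv", "a", newline="") as f:
            csv.writer(f).writerow([form.get(k, [""])[0] for k in FIELDS])
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"Thank you!")

if __name__ == "__main__":
    HTTPServer(("", 8000), SurveyHandler).serve_forever()
```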

Survey Tips
- Keep them short (under 10 minutes)
- Avoid long checklists
- Allow for text comments
- Support anonymity
  - But allow for categorical identification: school, job function, grade, etc.

- Coordinate and support survey administration
  - Avoid the “mailbox stuffer”
  - Work with building leaders
  - Provide clear deadlines

Three Big Points
- Surveys alone mean nothing
  - TRIANGULATE!
- A 100% response rate is virtually impossible
  - On the other hand, nearly 100% is very possible if you follow our tips (see the sketch below)
- Share the data
  - No one wants to fill in forms for no purpose
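One way to push toward that near-100% rate is to compute per-building return rates and follow up where they lag. A small sketch, assuming hypothetical roster counts and the survey_export.csv from the earlier example:

```python
import pandas as pd

# How many surveys went out per building (hypothetical rosters)...
sent = pd.Series({"Elementary": 40, "Middle": 35, "High": 60})

# ...versus how many came back, counted from the export.
returned = pd.read_csv("survey_export.csv")["school"].value_counts()

# NaN here means a building returned nothing at all.
rate = (returned / sent * 100).round(1)
print(rate.sort_values())  # chase the lowest-responding buildings first
```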

Interviews
- Serve to back up and triangulate survey data
- Less anonymous than surveys
  - A mixed blessing...
- Allow for immediate follow-up of interesting findings

Interviewing Tips
- Keep groups as homogeneous as feasible
  - By grade, job function, etc.
- Be attentive to power structures
  - Don’t mix principals with teachers, tech coordinators with teachers, or central office staff with principals

- Use outside interviewers
  - People will explain things to us (because they have to!)
  - We avoid the power-structure issues
  - We’ve done this before
- Structure and focus the interviews
  - Use a well-thought-out and well-designed protocol
  - Only diverge after you’ve covered the basic questions

Three Big Points
- Create protocols after you’ve seen survey data
- Mind homogeneity and power structures
- Use outsiders to conduct your interviews

Observations
- The third leg of your data triangle
  - Surveys - Interviews - Observations
- Familiar yet different
  - You’ve done this before... but not quite
- Progressively less “objective” than surveys and interviews

Observation Tips
- Ensure that teachers understand the point and focus of the observations
  - You’re evaluating a project, not individuals!
- Sample
  - You can’t “see” everything
  - So think about your sample
- You can learn as much from an empty classroom as from an active one
  - Look at the physical arrangement of the room
  - Student materials
  - How is this room being used?
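Because no one can see everything, drawing a small random sample of rooms per building keeps the schedule pointed at the typical rather than the convenient. A sketch with made-up building and room names:

```python
import random

rooms = {  # hypothetical inventory of classrooms per building
    "Elementary": ["E101", "E102", "E103", "E104"],
    "Middle": ["M201", "M202", "M203"],
    "High": ["H301", "H302", "H303", "H304", "H305"],
}

random.seed(42)  # fixed seed so the schedule is reproducible
schedule = {
    building: random.sample(classrooms, k=min(2, len(classrooms)))
    for building, classrooms in rooms.items()
}
print(schedule)
```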

- Outside observers are necessary unless you simply want to confirm what you already know
- Avoid turning observations into a “technology showcase”
  - Showcases have their place, mostly for accumulating and reviewing “artifacts”
  - But the point of observations is to take a snapshot of the typical school and teacher

Three Big Points
- Observe the place as well as the people
- Observations are not intended to record the ideal... rather, the typical
- Use outside observers

Artifact Analysis
- Reviewing “stuff”
  - Lesson plans
  - Teacher materials
  - Student work
- Create an artifact rubric
  - Not the same as your project evaluation indicator rubric
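An artifact rubric can be as lightweight as a fixed set of criteria scored on a shared scale. The criteria names and the 1-4 scale below are illustrative assumptions, not the rubric from the deck:

```python
# Criteria every reviewer scores for every artifact (1 = low, 4 = high).
CRITERIA = [
    "alignment_to_curriculum",
    "added_value_of_technology",
    "evidence_of_student_inquiry",
]

def score_artifact(scores: dict) -> float:
    """Average the ratings, refusing artifacts with unscored criteria."""
    missing = [c for c in CRITERIA if c not in scores]
    if missing:
        raise ValueError(f"unscored criteria: {missing}")
    return sum(scores[c] for c in CRITERIA) / len(CRITERIA)

print(score_artifact({"alignment_to_curriculum": 3,
                      "added_value_of_technology": 2,
                      "evidence_of_student_inquiry": 4}))
```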

10 Tips for Data Collection
- Challenge your assumptions
  - But don’t waste time asking the obvious
- Cast a wide net
  - It’s all about stakeholders
- Dig deep
  - Try to collect the data that can’t easily be observed or counted

- Use confirming sources
  - Triangulate! Surveys alone mean nothing.
- Have multiple writers
  - Stakeholders and different perspectives
- Think before you collect
  - Choose questions carefully and with regard to what you really expect to find

- Set (reasonable) expectations for participation
  - Time and effort
- Forget about mailbox surveys
  - They usually waste more time than they’re worth
- Report back
  - Don’t be a data collection black hole!

- It’s a process, not an event!
  - It does little good to collect data once and then never again
  - Data collection is part of a long-term process of review and reflection
  - Even if the immediate goal is only to get “numbers” for the state forms

Dissemination
- Compile the report
- Determine how to share the report
  - School committee presentation
  - Press releases
  - Community meetings

Conclusion
- Build evaluation into your technology planning effort
- Remember, not all evaluation is quantitative
- You cannot evaluate what you are not looking for, so it’s important to develop expectations of what constitutes good technology integration

- Data collection is not evaluation
  - Rather, it’s an important component of evaluation
- Data must be collected and analyzed within the context of goal-focused project indicators

More Information
- ext. 204
- www.sun-associates.com/evaluation
- www.edtechevaluation.com