
Using survey data for continuous improvement

Presentation transcript:

1 Using survey data for continuous improvement
Phil Chase, MDE/OPPS
Sarah-Kate LaVan, MDE/OPPS
Preston Hicks, MDE/OPPS
2017 Hope College CAEP Conference, 4/6/2017

2 Overview questions
How did we develop the current surveys?
Candidate Supervisor (CS), Teacher Candidate (TC), Cooperating Teacher (CT), Year Out (YO)
How are we collecting, processing, and disseminating the survey data?
How are the surveys aligned to the Interstate Teacher Assessment and Support Consortium (InTASC) Model Core Teaching Standards?
How do we know the surveys are good instruments for data collection?

3 Preview of discussion questions
How are you using the survey data now?
How can the survey content and alignment be improved?

4 Background
May 2008: Professional Standards for Michigan Teachers (PSMT) adopted
2008 to 2013: First version of the teacher candidate and candidate supervisor surveys created by MDE to address the Superintendent's continuous improvement goals for teacher preparation programs; collected on SurveyMonkey
April 2013: InTASC Model Core Teaching Standards approved by the State Board of Education (SBE) to replace the PSMT
May-August 2013: MDE and preparation programs partner to revise the existing survey categories and items to align to InTASC

5 ANNUAL CYCLES OF SURVEY COLLECTION

6 Background WITH Timeline
September 2013: First administration of the newly aligned Teacher Candidate and Candidate Supervisor surveys on SurveyMonkey
October 2013: Revision of the EPI Performance Score along three measurement goals
February 2014: Revision of the Likert scale for the surveys
April 2014: First administration of the Year Out surveys
July 2014: First annual reports from the newly revised EPI Performance Score distributed, which include survey efficacy as a component measure

7 Background WITH Timeline (cont.)
November 2014: MDE enters a contract with Qualtrics to improve survey tools, data capture, and services
April 2015: Second annual EPI Performance Score reports distributed
April 2015: First administration of the Cooperating Teacher surveys
July 2015: First reports of the Year Out and Cooperating Teacher surveys distributed
November 2015: Qualtrics contract extended to include more survey responses
April 2016: Third annual EPI Performance Score reports distributed

8 Background WITH Timeline (cont.)
June 2016: Qualtrics contract upgraded to unlimited survey responses and unlimited dashboard creation on a new product called Vocalize
April 2017: Fourth annual EPI Performance Score reports distributed, for the first time on Vocalize
May 2017: Multiple-year aggregate reports of the Year Out and Cooperating Teacher surveys distributed
June-August 2017: Development of real-time survey data dashboards on Vocalize

9 Current Survey Categories
CURRENT SURVEY SETUP
Survey populations: CS, TC, CT, YO
Categories:
High-Quality Learning Experiences
Critical Thinking
Connecting Real-World Problems
Using Technology
Addressing the Needs of Special Populations
Organizing the Learning Environment
Effective Use of Assessments and Data
Field Experiences and Clinical Practice
K-12 School and College/University Partnerships
College/University Support for Teacher Candidate
Support for Teacher Job Search
Job Search Experiences
Awareness of Educational Initiatives, Laws, and Policies
Refer to "desk copy" handouts for the current wording of the individual items falling under each category. Note that not all categories have the same number of items across the four survey populations. We will focus our discussion today on the "Upper Seven" categories.

10 Candidate Supervisors
SURVEY RELIABILITY
In 2015, the MDE did a short study on the internal consistency of the four surveys using Cronbach's Alpha. Cronbach's Alpha is a way to measure whether a group of test or survey items "agree" that they are measuring the same thing. For each of the four surveys, the items showed a high degree of correlation with one another. Thus we concluded that the surveys reliably measured each unique response group's perception of the teacher preparation that candidates received.
Response Group: Cronbach's Alpha
Candidate Supervisors: .97
Teacher Candidates: .96
Cooperating Teachers:
Year-Out Teachers:
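As a sketch of what that internal-consistency check involves, Cronbach's Alpha can be computed from item and total-score variances. The data below are hypothetical 5-point Likert responses, not the MDE's actual survey data:

```python
from statistics import pvariance

def cronbach_alpha(responses):
    """Cronbach's Alpha: alpha = (k / (k - 1)) * (1 - sum(item variances) / total variance).

    responses: one row per respondent, each row a list of numeric
    answers to the same k survey items.
    """
    k = len(responses[0])                 # number of items
    items = list(zip(*responses))         # transpose: one tuple of scores per item
    item_vars = sum(pvariance(item) for item in items)
    total_var = pvariance([sum(row) for row in responses])
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical responses: four respondents, three Likert items (1-5)
data = [
    [5, 5, 4],
    [4, 4, 4],
    [2, 3, 2],
    [3, 3, 3],
]
print(round(cronbach_alpha(data), 2))  # → 0.96
```

A value near 1 indicates the items move together (as in the .97 and .96 figures above); values below roughly .7 are conventionally read as weak internal consistency.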

11 Prior alignment work
PSMT Standards:
Subject Matter Knowledge Base in General and Liberal Education
Instructional Design and Assessment
Curricular and Pedagogical Content Knowledge Aligned with State Resources
Effective Learning Environments
Responsibilities and Relationships to the School, Classroom, and Student
Responsibilities and Relationships to the Greater Community
Technology Operations and Concepts
InTASC Categories and Standards:
THE LEARNER AND LEARNING (1: Learner Development; 2: Learning Differences; 3: Learning Environments)
CONTENT KNOWLEDGE (4: Content Knowledge; 5: Application of Content)
INSTRUCTIONAL PRACTICE (6: Assessment; 7: Planning for Instruction; 8: Instructional Strategies)
PROFESSIONAL RESPONSIBILITY (9: Professional Learning and Ethical Practice; 10: Leadership and Collaboration)
"Upper Seven" Survey Categories:
High-Quality Learning Experiences
Critical Thinking
Connecting Real-World Problems
Using Technology To Maximize Student Learning
Addressing the Needs of Special Populations
Organizing the Learning Environment
Effective Use of Assessments and Data
Refer to "desk copy" handouts for the current (draft) InTASC alignment.

12 The future of survey reporting

13 Questions for discussion
How are you using the survey data now?
To whom do you distribute reports? For what purpose(s)?
What types of analyses do you create above and beyond what the MDE distributes?
How can the Vocalize dashboards play a role in better using or distributing survey data?

14 Questions for discussion
How can the survey content and alignment be improved?
What are we collecting that we don't need to or can't use?
What are we collecting that we need, but haven't used to maximum benefit?
What are we not collecting that we ought to?
Is there "measurement noise" that can be eliminated or reduced?

