Deconstructing Standard 2c
Dr. Mike Mahan, Gordon College

2c: Use of Data for Program Improvement, p. 2
Key criteria for meeting the expectations of element 2c:
- The unit systematically and regularly uses data to evaluate the efficacy of courses, programs, and clinical experiences.
- Changes in the unit are discussed and made based on systematic use of data.
- Candidate and faculty data are shared with candidates and faculty to encourage reflection and improvement.
- Faculty have access to data or data systems.

2c: Use of Data for Program Improvement
Element 2c focuses on the regular and systematic use of data to evaluate and improve the efficacy and effectiveness of programs and unit operations. Performance assessment data are shared and used to support candidate and faculty growth and development. Data-informed changes are initiated to improve the unit and its programs.

Sub-elements of Standard 2c (1)
The professional education unit regularly and systematically uses data, including candidate and graduate performance information, to evaluate the efficacy of its courses, preparation programs, and clinical experiences.
3 AFIs (areas for improvement) cited

Sub-elements of Standard 2c (2)
The professional education unit analyzes preparation program evaluation and performance assessment data to initiate changes in the program and professional unit operations.
2 AFIs cited

Sub-elements of Standard 2c (3)
Faculty have access to candidate assessment data and/or data systems.
0 AFIs cited

Sub-elements of Standard 2c (4)
Candidate assessment data are regularly shared with candidates and faculty to help them reflect on and improve their performance and preparation programs.
0 AFIs cited

Scenario One
The evidence room contained Program Advisory Committee schedules and minutes spanning three years. The documents outlined the specific dates on which the Committee analyzed and summarized Field Experience Observation data, faculty reviews, Principal Surveys, and candidate GACE (Georgia Assessments for the Certification of Educators) performance data, as well as the Committee's recommendations. During interviews, Program Advisory Committee members confirmed what was found in the evidence room. Members described specific changes made in assigning field experience supervisors after reviewing and analyzing student complaints, a review that occurs each May. When asked how decisions were made regarding changes in the program, members described a recent meeting in which they determined the need for earlier emphasis on classroom management techniques in courses. Members discussed the process of analyzing results from Principal Surveys and Field Experience Observation Instruments, as well as Candidate Action Plans. One member explained that instruction addressing the creation of a classroom management plan now takes place in all junior-level courses.

List of Evidence (Scenario One)
- Analysis of Candidate Action Plans
- Analysis of Field Experience Observation Instruments
- Analysis of Principal Surveys
- Department Meeting Minutes (9/5/11)
- Department Meeting Minutes (10/2/11)
- Department Meeting Minutes (11/4/11)
- Department Meeting Minutes (12/1/11)
- Faculty Assessment Tables
- Program Advisory Committee Meeting Minutes (5/24/11)
- Program Advisory Committee Meeting Minutes (11/15/11)
Unacceptable, Acceptable, or Target?

Scenario Two
The evidence room contained descriptions of the roles and responsibilities of unit faculty and P-12 partners in the administration and evaluation of the unit's assessment system, but interviewees were unable to articulate their roles. The evidence room also housed three years of aggregated data for the Field Experience Observation Instrument. These data indicated that candidates struggled with incorporating technology. When faculty and clinical supervisors were asked about these data during interviews, no one seemed aware of any changes made to courses. When asked in interviews how decisions were made regarding changes in curriculum or assessment, members of the Program Advisory Committee struggled to answer.

List of Evidence (Scenario Two)
- Field Experience Observation Data (3 years)
- Field Experience Observation Form
- Program Advisory Committee Minutes (2/25/11)
- Roles and Responsibilities of Stakeholders
Unacceptable, Acceptable, or Target?

Scenario Three
The Institutional Report described the unit's assessment system in detail. It included dates for assessment completion and the roles of faculty, candidates, and other stakeholders in reviewing specific assessment data. During interviews, faculty and clinical supervisors articulated the changes made to processes and courses because of data reviews. In interviews, current and past candidates discussed their work on Student Study Teams. They described the process of analyzing results from faculty members' evaluations of candidate performance for the past three years. These analyses demonstrated an area of weakness in candidates' use of technology, and candidates had the opportunity to suggest specific ways to make improvements in this area. One of those improvements was to develop a new technology assessment, which is now completed during a pedagogical course (EDUC 3212). In interviews, faculty described the value of the training on the Data Collection System and the Candidate Data Review. They stated that they have a better understanding of how their work fits into the bigger picture of candidate assessment.

List of Evidence (Scenario Three)
- Minutes from Faculty Training: Candidate Data Review (1/25/11)
- Minutes from Faculty Training: Data Collection System (3/30/11)
- Student Study Team Groups (List)
- Student Study Team Minutes (5/18/10)
- Student Study Team Minutes (5/20/11)
- Syllabus: EDUC 3212 (Pedagogy Methods)
- Teacher Education Program Timeline for Assessment ( )
- Teacher Education Program Timeline for Assessment ( )
- Technology Assessment (completed in EDUC 3212)
Unacceptable, Acceptable, or Target?

Scenario Four
The evidence room contained minutes from the most recent Program Advisory Committee meeting, which revealed that the order of coursework was not highly beneficial to candidates in the field; a different order of classes would be more conducive to the candidates' field experiences. All stakeholders were made aware of the concerns, and the process for changing the order of coursework had begun. There were also minutes from a department meeting the previous year showing that GACE data were examined. ECE candidates were consistently struggling to pass the language arts/social studies portion of the test. An analysis of the sub-element data revealed that social studies was the area of concern. The social studies methods courses were revised to reflect a closer correlation to the GACE frameworks. During interviews, current and previous candidates described the process by which their performance assessments were shared with them in End-of-Semester Conferences.

List of Evidence (Scenario Four)
- Candidate Coursework Order (current)
- Candidate Coursework Order (proposed)
- Department Meeting Minutes (8/1/10)
- End-of-Semester Conference Examples (from two candidates)
- End-of-Semester Conference Form
- Program Advisory Committee Meeting Minutes (10/12/11)
- Program Advisory Committee Members (including two candidates and two graduates)
- Syllabus: Social Studies Methods Course (prior to examination of GACE data)
- Syllabus: Social Studies Methods Course (revised)
Unacceptable, Acceptable, or Target?

Rubric for Evaluating Evidence, p. 1
Sub-element 1
- Unacceptable: The unit does not regularly and systematically use data to evaluate its courses, its programs, or its field experiences.
- Acceptable: The unit regularly and systematically uses data to evaluate the courses, the programs, and the field experiences.
- Target: The unit has an identified routine schedule for using data collected on graduates and candidates to evaluate the courses, the programs, and the field experiences.
Sub-element 2
- Unacceptable: The unit does not analyze preparation program and performance assessment data to make changes in program and unit operations.
- Acceptable: The unit analyzes program preparation and performance assessment data to initiate changes in the program and the unit operations.
- Target: The unit regularly schedules analysis of program preparation and performance assessment data with the intention of using the data to make changes in the programs and the unit operations.

Rubric for Evaluating Evidence, p. 2
Sub-element 3
- Unacceptable: Faculty do not have access to candidate assessment data or to data systems.
- Acceptable: Faculty have access to candidate assessment data and/or to data collection system information.
- Target: Faculty have access to, and are trained to understand, candidate assessment data and/or the data collection system.
Sub-element 4
- Unacceptable: Candidate assessment data are not shared with candidates or faculty to help them reflect on and improve their performance and preparation programs.
- Acceptable: Candidate assessment data are shared regularly with candidates and faculty to help them reflect on and improve their performance and preparation programs.
- Target: Candidate assessment data are regularly aggregated and disaggregated by the unit to help faculty and candidates reflect on and improve their performance and preparation programs.

Thank You!