Deconstructing Standard 2a Dr. Julie Reffel Valdosta State University

Element 2a. Assessment System
Element 2a focuses on the design, development, implementation, and evaluation of the unit assessment system. The unit collaborates with members of its professional community in designing and managing an assessment system with multiple, comprehensive measures for monitoring candidate performance, program quality, and unit operations. The assessment system should reflect the conceptual framework and professional and state standards. The professional community regularly evaluates the system to ensure the fairness, accuracy, consistency, and freedom from bias of its assessment procedures and unit operations.

Key criteria for meeting the expectations of Element 2a
The unit's assessment system:
- Reflects the conceptual framework
- Identifies comprehensive/integrated measures
- Monitors candidate performance and unit operations
- Includes multiple assessments and transition points
- Includes fair, accurate, consistent assessments with efforts to eliminate bias (including unit operations)
- Involves the professional education community in the design, development, and evaluation of the assessment system

Sub-elements of Standard 2a (1)
The professional education unit has an assessment system that reflects the conceptual framework and professional and state standards and is regularly evaluated by its professional community. Eight Areas for Improvement (AFIs) were cited under this sub-element from 2007 to 2011.

Sub-elements of Standard 2a (2)
The professional education unit's system includes a comprehensive and integrated set of assessment and evaluation measures to monitor candidate performance and to manage and improve the unit's operations and preparation programs. Four AFIs were cited from 2007 to 2011.

Sub-elements of Standard 2a (3)
Decisions about candidate performance are based on multiple assessments made at admission into preparation programs, at appropriate transition points, and at preparation program completion. No AFIs were cited from 2007 to 2011.
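To make "multiple assessments at transition points" concrete, here is a minimal sketch of a decision gate that combines several independent measures rather than relying on any single score. The field names and cut scores are hypothetical, not drawn from the standard:

```python
# Minimal sketch of a transition-point decision gate combining multiple
# measures. All field names and cut scores are hypothetical.

def admit_to_clinical_practice(candidate: dict) -> bool:
    """Return True only if several independent measures are satisfied."""
    return (
        candidate["gpa"] >= 2.75                   # academic performance
        and candidate["content_exam_passed"]       # content knowledge
        and candidate["dispositions_rating"] >= 3  # dispositions rubric (1-5)
    )

candidate = {"gpa": 3.1, "content_exam_passed": True, "dispositions_rating": 4}
print(admit_to_clinical_practice(candidate))  # True
```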

Sub-elements of Standard 2a (4)
The professional education unit has taken effective steps to eliminate bias in assessments and is working to establish the fairness, accuracy, and consistency of its assessment procedures and unit operations. Five AFIs were cited from 2007 to 2011.

Scenario One
Assessments are conducted by various assessors (site supervising teachers and university site supervisors). Full-time and part-time faculty who are responsible for completing the assessment instruments meet each semester to discuss instrument ratings, expectations for instrument use, and the rating scales. The supervising teachers are invited to meet each semester with the division chair to discuss instrument use. The unit has not yet adopted a process for determining the accuracy and consistency with which its current instruments are used. While there is discussion about what the indicators mean, there have been no inter-rater studies or other activities to confirm consistency of use and accuracy of data. One effort to ensure fairness: when a rating is low (3 or below on a 5-point scale), the DSC supervisor may request that another supervisor conduct an independent review to affirm or disaffirm the original ratings.

Findings (Scenario One)
Unacceptable, Acceptable, or Target? Acceptable with an Area for Improvement: The unit has not developed a systematic method for assuring that assessments are fair, consistent, and free of bias. Rationale: Although varied scorers or a second scorer may be used for specific assessments in some situations, there have been limited attempts to ensure reliability across all assessments. Additionally, assessments have not been examined for bias.
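For context, an inter-rater study of the kind this rationale calls for can start with an agreement statistic on paired ratings. A minimal sketch, assuming ordinal 1-5 rubric scores and using scikit-learn's Cohen's kappa; the ratings below are illustrative, not taken from the scenario:

```python
# Minimal inter-rater agreement check for two raters scoring the same
# candidates on a 1-5 rubric. Ratings here are illustrative only.
from sklearn.metrics import cohen_kappa_score

university_supervisor = [4, 3, 5, 2, 4, 4, 3, 5]
supervising_teacher   = [4, 3, 4, 2, 5, 4, 3, 5]

# Quadratic weights treat a 2-vs-5 disagreement as worse than 4-vs-5,
# which suits an ordinal rubric scale.
kappa = cohen_kappa_score(university_supervisor, supervising_teacher,
                          weights="quadratic")
print(f"Weighted kappa: {kappa:.2f}")  # values near 1.0 indicate strong agreement
```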

Scenario Two
The application of assessments at multiple transition points provides reliable data about candidates. Data are collected regularly through an electronic system for ease in reporting aggregate and individual candidate results. Faculty members discuss findings from candidate assessments and surveys, but the process of using data to inform decisions about candidates and program delivery is not systematic or planned.

Findings (Scenario Two)
Unacceptable, Acceptable, or Target? Acceptable with an Area for Improvement: The program does not have a systematic, planned process for evaluating assessment results, courses, programs, and clinical experiences. Rationale: Faculty report a number of examples where results from candidate assessments have informed their decisions about program matters; however, actual analysis of data to determine implications for programmatic improvement occurs irregularly.
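A "systematic, planned process" can be as lightweight as a script run on a fixed schedule that aggregates each cycle's assessment data and flags items for faculty discussion. A sketch, assuming a flat file of rubric scores; the file name, column names, and threshold are hypothetical:

```python
# Sketch of a recurring data review: aggregate rubric scores by program
# and indicator each semester and flag anything below a review threshold.
# File layout, column names, and threshold are hypothetical.
import pandas as pd

scores = pd.read_csv("candidate_assessments.csv")
# expected columns: semester, program, indicator, score

REVIEW_THRESHOLD = 3.0  # on a 1-5 rubric scale

summary = (scores.groupby(["semester", "program", "indicator"])["score"]
                 .agg(["mean", "count"]))
flagged = summary[summary["mean"] < REVIEW_THRESHOLD]
print(flagged)  # agenda items for the semester data-review meeting
```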

Scenario Three
The unit has identified key assessments to evaluate both unit operations and program quality, and it plans to use the resulting data to inform program and unit improvement. Candidate performance and progress are the focus at the program level; unit operations and program quality are the focus at the unit level. While the unit has identified these key assessments, with frequency indicators for data collection, there is no articulated plan at this time for how the assessments will be collected, analyzed, used, and shared in a systematic manner for program quality and unit operations.

Findings (Scenario Three)
Unacceptable, Acceptable, or Target? Acceptable with an Area for Improvement: The professional education unit has not fully operationalized the information technologies it has identified to maintain its assessment system. Rationale: Even though the unit has identified multiple information technology systems (BANNER, LiveText, and Access), there is not a well-articulated, comprehensive, integrative mechanism for using those systems to collect, analyze, and evaluate data. At the time of the on-site visit, the unit provided an updated data collection schedule and timeline indicating which assessment data will be collected within each system; however, the plan has not been fully integrated or operationalized.

Scenario Four
The assessment system was reviewed to ensure it allowed for effective and efficient student transitions through the program. The evaluation system was adjusted to meet the new timelines for students, but the key transition points remained. Changes were approved by the faculty and taken to the Advisory Council. The unit stated that training is provided to clinical faculty and school personnel in the use of evaluation tools and rubrics; however, there was no consistent evidence in the exhibits or in interviews to validate this statement. While the assessment tools were very consistent in structure around the five domains and used a standardized scoring scale, evidence was lacking that training and scoring guides were used to ensure assessment procedures were fair, accurate, and consistent.

Findings (Scenario Four)
Unacceptable, Acceptable, or Target? Unacceptable at the advanced level.

Rubric for Evaluating Evidence, pg. 1

Characteristic 1
- Unacceptable: The unit has not involved its professional community in the development of its assessment system.
- Acceptable: The unit has an assessment system that reflects the conceptual framework and professional and state standards and is regularly evaluated by its professional community.
- Target: The unit, with the involvement of its professional community, is regularly evaluating the capacity and effectiveness of its assessment system, which reflects the conceptual framework and incorporates candidate proficiencies outlined in professional and state standards.

Characteristic 2
- Unacceptable: The unit's assessment system is limited in its capacity to monitor candidate performance, unit operations, and programs.
- Acceptable: The unit's system includes comprehensive and integrated assessment and evaluation measures to monitor candidate performance and manage and improve the unit's operations and programs.
- Target: The unit regularly examines the validity and utility of the data produced through assessments and makes modifications to keep abreast of changes in assessment technology and in professional standards.

Rubric for Evaluating Evidence, pg. 2

Characteristic 3
- Unacceptable: The assessment system does not reflect professional, state, and institutional standards.
- Acceptable: Faculty have access to candidate assessment data and/or to data collection system information.
- Target: Faculty have access to, and are trained to understand, candidate assessment data and/or the data collection system.

Characteristic 4
- Unacceptable: Candidate assessment data are not shared with candidates or faculty to help them reflect on and improve their performance and preparation programs.
- Acceptable: Candidate assessment data are shared regularly with candidates and faculty to help them reflect on and improve their performance and preparation programs.
- Target: Candidate assessment data are regularly aggregated and disaggregated by the unit to help faculty and candidates reflect on and improve their performance and preparation programs.
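The target-level language about aggregating and disaggregating data maps directly onto routine grouping operations. A minimal sketch with hypothetical columns, disaggregating the same scores by program and by demographic group to surface gaps that a unit-level average would hide:

```python
# Sketch of aggregation vs. disaggregation of candidate assessment data.
# File name and column names are hypothetical.
import pandas as pd

scores = pd.read_csv("candidate_assessments.csv")
# expected columns: indicator, program, demographic_group, score

# Aggregated: one unit-level mean per indicator.
print(scores.groupby("indicator")["score"].mean())

# Disaggregated: the same indicators broken out by program and by
# demographic group, where a persistent gap may signal bias in an instrument.
print(scores.groupby(["indicator", "program"])["score"].mean())
print(scores.groupby(["indicator", "demographic_group"])["score"].mean())
```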

AFIs related to the development of the Assessment System
- The unit assessment system is not aligned with the unit's conceptual framework.
- The assessment system has not been developed in collaboration with the professional community.
- The unit has not implemented procedures to ensure fairness, accuracy, and consistency in the assessment of candidate performance.

AFIs related to the development of the Assessment System (continued)
- The assessment system does not indicate how data will be regularly analyzed to improve candidate performance, program quality, and unit operations.
- The unit assessment system does not clearly address candidate outcomes identified in the conceptual framework.
- Assessments of candidates' knowledge, skills, and dispositions in courses are not clearly and consistently linked to candidate performances described in the conceptual framework.

Thank You!