CONNECT WITH CAEP | Three-Year-Out Review of Assessments (Pending Accreditation Council and CAEP Board Approval)

Presentation transcript:

CONNECT WITH CAEP | Three-Year-Out Review of Assessments (Pending Accreditation Council and CAEP Board Approval)
Stevie Chepko, Sr. VP for Accreditation
Elizabeth Vilky, Sr. Director for Program Review

CONNECT WITH CAEP | Purpose of Three-Year-Out Review
Advancing excellence in educator preparation through evidence-based accreditation and continuous improvement:
- Assessment includes all instruments that provide evidence for meeting a standard
- Mandate to improve the quality of assessments used by EPPs
- Part of CAEP's commitment to capacity building
- Designed to provide feedback to EPPs: no judgment or decision is made on the assessments; feedback is given to EPPs to facilitate improvement of the assessments used across the EPP
(Pending approval of the Accreditation Council and CAEP Board)

CONNECT WITH CAEP | Three-Year-Out Review
- The review is scheduled three years before the scheduled date of the self-study
- Pre-submission allows EPPs to use the feedback to improve the quality of assessments before the self-study submission and site visit
- Data are not submitted with the assessments
- Only the assessments (and scoring guides, when applicable) are submitted for review
- Includes all EPP-wide assessments used across all specialty licensure areas
(Pending approval of the Accreditation Council and CAEP Board)

CONNECT WITH CAEP | Three-Year-Out Review: Proprietary Assessments
Proprietary assessments are assessments external to the EPP whose property rights are held by another agency:
- State licensure exams
- edTPA or PPAT
- State-required assessments (e.g., clinical observation instruments)
- Any required state- or national-level assessment
- Validity and reliability are established by an external source
Proprietary assessments are exempt from the Three-Year-Out Review.
(Pending approval of the Accreditation Council and CAEP Board)

CONNECT WITH CAEP | Proprietary Assessments (cont.)
EPPs will provide context for the use of proprietary assessments:
- When during the program the assessment is used
- Whether the assessment is mandated or elective for the EPP
- How the proprietary assessment aligns with the CAEP standards
- If available, validity/trustworthiness and reliability/consistency data for the instrument
(Pending approval of the Accreditation Council and CAEP Board)

CONNECT WITH CAEP | EPP-Created Assessments
These often include, but are not limited to:
- Clinical observation instruments
- Work sample instruments
- Lesson and/or unit plan instruments
- Dispositional instruments
- Reflection instruments
- Surveys: candidate exit surveys, employer surveys, student surveys, alumni surveys
(Pending approval of the Accreditation Council and CAEP Board)

CONNECT WITH CAEP | Criteria for Evaluating Assessments with Scoring Guides
Assessments align with CAEP Standards and provide evidence for meeting the standards:
- The same or consistent categories of content found in the standards appear in the assessment items
- Assessments are congruent with the complexity, cognitive demands, and skill requirements described in the standards
- The level of respondent effort required, or the difficulty or degree of challenge, is consistent with the standards
(Pending approval of the Accreditation Council and CAEP Board)

CONNECT WITH CAEP | Criteria for Evaluating Assessments with Scoring Guides (cont.)
- The level of respondent effort required is reasonable for candidates who are ready to teach
- Assessment items address the range of knowledge, skills, and dispositions delineated in the standards
- Assessments are free of bias: avoid bias in language and bias in testing situations
(Pending approval of the Accreditation Council and CAEP Board)

CONNECT WITH CAEP | Criteria for Evaluating Assessments with Scoring Guides (cont.)
Questions to be answered:
- Is there a clear basis for judging the adequacy of candidate work?
- Is a rubric or scoring guide used?
- Is there evidence that the assessment measures what it purports to measure (validity)?
- Are results consistent across raters and over time (reliability)?
- Are the criteria in the rubric or scoring guide related to the CAEP standards?
(Pending approval of the Accreditation Council and CAEP Board)

CONNECT WITH CAEP | Criteria for Scoring Guides or Rubrics
Distinct levels of candidate performance must be defined:
- Descriptions of each level describe attributes related to actual performance
- Levels represent a developmental sequence in which each successive level is qualitatively different from the prior level
- It is clear which level represents exit proficiency (ready to practice)
- Levels are clearly distinguishable from one another
- Levels are constructed in parallel with one another in terms of attributes and descriptors
- Scoring guides provide specific feedback to candidates
(Pending approval of the Accreditation Council and CAEP Board)
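
A minimal sketch, assuming a hypothetical lesson-plan criterion and invented level wording (none of this is CAEP-prescribed), of how a rubric meeting these criteria might be represented so that levels are distinct, ordered, parallel, and one level is explicitly marked as exit proficiency:

    from dataclasses import dataclass

    @dataclass
    class PerformanceLevel:
        label: str          # e.g., "Emerging", "Developing", "Proficient", "Exemplary"
        score: int          # ordinal position in the developmental sequence
        descriptor: str     # attributes of actual candidate performance at this level
        exit_proficiency: bool = False  # marks the level that means "ready to practice"

    @dataclass
    class RubricCriterion:
        name: str
        caep_alignment: str             # CAEP standard/component the criterion supports
        levels: list[PerformanceLevel]  # requires Python 3.9+

    # Hypothetical criterion for a lesson-plan instrument (wording is illustrative only)
    criterion = RubricCriterion(
        name="Alignment of objectives, instruction, and assessment",
        caep_alignment="CAEP Standard 1 (content and pedagogical knowledge)",
        levels=[
            PerformanceLevel("Emerging", 1, "Objectives, activities, and assessments are largely unrelated."),
            PerformanceLevel("Developing", 2, "Objectives and activities align, but assessments do not measure the objectives."),
            PerformanceLevel("Proficient", 3, "Objectives, activities, and assessments are aligned for all learners.", exit_proficiency=True),
            PerformanceLevel("Exemplary", 4, "Alignment is explicit and differentiated, with evidence used to adjust instruction."),
        ],
    )

    # Simple sanity checks mirroring the criteria above: exactly one exit-proficiency level,
    # distinguishable descriptors, and a strictly increasing developmental sequence.
    assert sum(l.exit_proficiency for l in criterion.levels) == 1
    assert len({l.descriptor for l in criterion.levels}) == len(criterion.levels)
    assert [l.score for l in criterion.levels] == sorted({l.score for l in criterion.levels})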

CONNECT WITH CAEP | Quality Surveys
Surveys allow EPPs to gather information for program improvement and to reach a broad spectrum of individuals:
- Candidate satisfaction
- Graduate satisfaction
- Employer satisfaction
- Clinical faculty perceptions of candidates' preparedness for teaching
Characteristics of a quality survey:
- Carefully designed
- Allows for systematic collection of data
- Measures the property it claims to measure
(Pending approval of the Accreditation Council and CAEP Board)

CONNECT WITH CAEP | Criteria for Evaluating Surveys
- Surveys include preambles that explain what the respondent is being asked to do
- Any concepts or terms the respondent needs to understand to complete the survey are defined
- Questions should be sorted by themes or categories
- Questions should be simple and direct
- Vague words or terms should be avoided
- Leading questions should be avoided
- A cover letter is included
- Confidentiality of respondents is assured
(Pending approval of the Accreditation Council and CAEP Board)

CONNECT WITH CAEP | Criteria for Evaluating Surveys (cont.)
- Questions should have a single subject and not combine two or more attributes
- Questions should be stated in the positive
- Questions should have a parallel structure
- Response choices should be mutually exclusive and exhaustive
- If frequency terms (e.g., "occasionally") are used, they should be defined in terms of an actual frequency (e.g., "3-5 times")
(Pending approval of the Accreditation Council and CAEP Board)
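
As a small illustration (the item wording, theme, and frequencies below are hypothetical, not CAEP-specified), a survey item that follows these criteria ties each vague frequency term to an actual frequency and keeps the response choices mutually exclusive and exhaustive:

    # Hypothetical employer-survey item; wording is illustrative only.
    item = {
        "theme": "Instructional practice",
        "question": ("During a typical week, how often does the graduate use "
                     "formative assessment data to adjust instruction?"),
        "choices": [
            "Never (0 times per week)",
            "Rarely (1-2 times per week)",
            "Sometimes (3-5 times per week)",
            "Often (6 or more times per week)",
        ],
    }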

CONNECT WITH CAEP | Additional Criteria for Three-Year-Out Review
- Description of how the assessment was developed
- Description of how validity was established for the assessment:
  - Expert validation of the items on the assessment (convergent validity)
  - A measure's ability to predict performance on another measure (predictive validity)
  - Extent to which the evaluation item measures what it claims to measure (construct validity)
(Pending approval of the Accreditation Council and CAEP Board)
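
A minimal sketch, using invented scores and variable names, of how predictive validity is often summarized: the correlation between scores on an EPP-created assessment and a later measure of performance. This is an illustration under those assumptions, not a required CAEP procedure.

    # Predictive validity estimate: correlation between an EPP instrument and a later measure.
    from statistics import correlation  # Python 3.10+

    exit_assessment = [3.2, 2.8, 3.9, 3.5, 2.5, 3.7, 3.0, 3.8]   # EPP instrument scores (hypothetical)
    first_year_eval = [3.0, 2.6, 3.8, 3.3, 2.7, 3.6, 3.1, 3.9]   # later performance measure (hypothetical)

    r = correlation(exit_assessment, first_year_eval)
    print(f"Predictive validity estimate (Pearson r): {r:.2f}")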

CONNECT WITH CAEP | Additional Criteria for Three-Year-Out Review (cont.)
- Validity (cont.):
  - The right attributes are measured in the right balance (content validity)
  - The measure is subjectively viewed as being important and relevant (face validity)
- Description of how the reliability of assessments was established or will be established (a plan is acceptable)
- Reliability in its various forms is supported through evidence of:
  - Agreement among multiple raters of the same event
  - Stability or consistency of ratings over time
  - Internal consistency of measures
(Pending approval of the Accreditation Council and CAEP Board)
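
A minimal sketch, using invented ratings, of two common pieces of reliability evidence named above: inter-rater agreement (Cohen's kappa for two raters scoring the same candidate work) and internal consistency (Cronbach's alpha across rubric items). The data and helper names are assumptions for illustration, not a CAEP-required analysis.

    from collections import Counter
    from statistics import variance

    def cohens_kappa(rater_a, rater_b):
        """Agreement between two raters beyond what chance alone would produce."""
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        counts_a, counts_b = Counter(rater_a), Counter(rater_b)
        expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
        return (observed - expected) / (1 - expected)

    def cronbach_alpha(item_scores):
        """Internal consistency; item_scores is a list of per-item score lists."""
        k = len(item_scores)
        totals = [sum(person) for person in zip(*item_scores)]
        item_var = sum(variance(item) for item in item_scores)
        return (k / (k - 1)) * (1 - item_var / variance(totals))

    rater_a = [3, 2, 4, 3, 3, 2, 4, 1]   # levels assigned by rater A (hypothetical)
    rater_b = [3, 2, 4, 2, 3, 2, 4, 1]   # levels assigned by rater B to the same work
    items = [[3, 2, 4, 3, 3], [3, 3, 4, 2, 3], [2, 2, 4, 3, 3]]  # 3 rubric items x 5 candidates

    print(f"Cohen's kappa:    {cohens_kappa(rater_a, rater_b):.2f}")
    print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")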

CONNECT WITH CAEP | Review Process for the Three-Year-Out Review
- CAEP assigns a lead reviewer and two additional reviewers
- Reviewers are specifically trained on the criteria for quality assessments
- Reviewers provide feedback on:
  - Quality of scoring guides or rubrics, based on the criteria
  - Quality of surveys, based on the criteria
  - Alignment of assessments to CAEP Standards
  - Quality of the evidence for CAEP Standards
  - Quality of the responses to the validity and reliability questions
(Pending approval of the Accreditation Council and CAEP Board)

CONNECT WITH CAEP | Review Process for the Three-Year-Out Review (cont.)
Steps in the review process:
- Each of the three reviewers completes an independent review through AIMS
- After all reports are submitted, the lead reviewer hosts a conference call with the team
- The conference call generates a final team report, which is submitted through AIMS
- CAEP staff completes a tech edit of the final report
- The EPP receives feedback on all submitted assessments by March 1 for the fall cycle and September 1 for the spring cycle
(Pending approval of the Accreditation Council and CAEP Board)

CONNECT WITH CAEP | Steps for Submission of the Three-Year-Out Review
- Step 1: Three years before the due date of the self-study, the EPP requests a shell for submission of assessments
- Step 2: EPPs identify on a chart the proprietary assessments to be submitted as evidence
  - Complete a checklist of which proprietary assessments provide evidence for which CAEP Standards
- Step 3: EPPs attach/upload assessments (identified by number) to the shell
  - Assessments are uploaded exactly as they are used by the EPP
(Pending approval of the Accreditation Council and CAEP Board)

CONNECT WITH CAEP | Steps for Submission of the Three-Year-Out Review (cont.)
- Step 4: In the space provided in the shell, EPPs answer questions on the development of each assessment, describe how validity was established for it, and describe the process by which reliability was established or a plan for establishing it
- Submissions are due by October 1 for the fall cycle and April 1 for the spring cycle
(Pending approval of the Accreditation Council and CAEP Board)

CONNECT WITH CAEP | Evidence for Standards
- Most candidate data from assessments will be submitted as evidence for Standard 1
- Validity and reliability evidence will be submitted as evidence for Standard 5 (Quality Assurance)
- Feedback will be used by EPPs to improve or modify assessments
- Feedback will be used by EPPs to improve or modify validity and reliability processes
- A member of the Three-Year-Out Review team will serve on the site visit team
(Pending approval of the Accreditation Council and CAEP Board)