Developing a Quality Assurance System, Building a Validity Argument for Locally Developed Performance Assessments, and Strategies for Calibrating Instruments

Presentation transcript:

Developing a Quality Assurance System, Building a Validity Argument for Locally Developed Performance Assessments, and Strategies for Calibrating Instruments

Cynthia Conn, PhD, Assistant Vice Provost, Professional Education Programs
Kathy Bohan, EdD, Associate Dean, College of Education
Sue Pieper, PhD, Assessment Coordinator, Office of Curriculum, Learning Design, & Academic Assessment
Matteo Musumeci, MA-TESL, Instructional Specialist, Professional Education Programs

NAU Professional Education Programs

Programs & Enrollment for Your EPP

Please respond to the following questions by raising your hand:
– Do you consider your institution to be large in terms of the number of initial and/or advanced programs (25+) and program enrollment (2,000+)?
– Do you consider your institution to be medium-sized in terms of the number of initial and/or advanced programs (10-25) and program enrollment (1,000 to 2,000)?
– Do you consider your institution to be small in terms of the number of initial and advanced programs (<10) and program enrollment (<1,000)?

Your EPP's Timeline for Accreditation

Please respond to the following questions by raising your hand:
– Is your institution exploring CAEP accreditation or in the candidacy process?
– Has your institution recently been re-accredited through NCATE or TEAC and is now transitioning to CAEP?
– Is your institution planning to write its CAEP Self-Study in the next 1 to 3 years?
– Has your institution submitted the CAEP Self-Study report and will it have a Site Visit this year?

Workshop Objectives

Objectives:
– Discuss strategies for developing a comprehensive Quality Assurance System (CAEP Standard 5.1)
– Discuss a framework and strategies for examining the validity and reliability of the use and interpretation of locally developed performance assessments:
  - Validity Inquiry Process Model
  - Strategies for calibrating performance assessments

Developing a Quality Assurance System

What Is a Quality Assurance System?

Ideas:

Quality Assurance System: Components/Structure

QAS Resources:

Quality Assurance System: Definitions

What is a Quality Assurance System?

CAEP Standard 5.1: The provider's quality assurance system is comprised of multiple measures that can monitor candidate progress, completer achievements, and provider operational effectiveness. Evidence demonstrates that the provider satisfies all CAEP standards.

CAEP Standard 5.3 (required component): The provider regularly and systematically assesses performance against its goals and relevant standards, tracks results over time, tests innovations and the effects of selection criteria on subsequent progress and completion, and uses results to improve program elements and processes.

Quality Assurance System: Strategies

Strategy: High-Level Needs Analysis

Purpose: Document strengths and issues related to your current quality assurance system to help prioritize work.

Examples:
– At NAU, this first strategy was conducted by the Assistant Vice Provost of NAU Professional Education Programs. The work provided the information needed to prioritize tasks and to develop a vision for the Quality Assurance System.
– NAU was collecting data well but needed to improve systematic reporting and access to data.
– We also recognized we needed to improve the quality of our assessment instruments.

Strategy: High-Level Needs Analysis

Activity (partner share/large-group share):
– How have you gathered, or how could you gather, this high-level needs analysis data on your campus? Who did/could you talk to on your campus? What documentation did/could you review?
– Are there other initial approaches your campus took to develop a quality assurance system?
– Who could implement this strategy on your campus?

Strategy: High-Level Needs Analysis

Ideas:
– How have you gathered, or could you gather, this high-level needs analysis data on your campus?
– Are there other initial approaches your campus took to develop a quality assurance system?
– Who could implement this strategy on your campus?

Strategy: High-Level Needs Analysis

Ideas:

Quality Assurance System: Strategies

Strategy: Assessment Audit

Purpose: 1) Develop a detailed listing of current assessment instruments; 2) Document alignment to the CAEP Standards, the quality of the instruments, and the implementation schedule.

Examples:
– Two of NAU's EPP leaders conducted the initial assessment audit and discussed strengths and gaps with Coordinating Council members:
  - The Student Teaching Evaluation and Candidate Work Sample needed to be improved in terms of validity and reliability.
  - NAU's EPP identified gaps in collecting data about graduates.
– Assessment Audit Template

Strategy: Assessment Audit

Assessment Audit Template (CAEP Standard #1: Candidate Knowledge, Skills, and Professional Dispositions). Template columns:
– Standard Component
– Evidence/Assessment Instrument
– Schedule (Implementation, Reporting, Review, Administrations)
– Use of Data
– Validity/Reliability
– CAEP Assessment Review Criteria

QAS Resources:

Strategy: Assessment Audit

Questions? Suggestions? If we aren't able to get to your question, please post it using the following URL. Check back after the presentation for a response.

Strategy: Assessment Audit

Ideas:

Quality Assurance System: Strategies

Strategy: Identify & Implement Data Tools

Purpose: To identify the data tool functions that need to be present for the systematic collection, reporting, and use of data.

Strategy: Identify & Implement Data Tools

Data Tools and Functions Self-Assessment. Worksheet columns:
– Data Tool: name of the tool
– Data Tool Function: which function is aligned with the data tool?
– Audience: who will view and use the data?
– Sustainable and efficient? Yes/No

Identify and address current areas of strength and any potential gaps related to data tools on your campus:
– Types of data collection (e.g., rubric or survey tools)
– Reporting tools
– Sustainability and efficiency
– Audiences

Identify any gaps in relation to data functions in your quality assurance system.

What Data Tool Functions Are Needed?

Ideas:

Strategy: Identify & Implement Data Tools

Ideas:

Developing a Quality Assurance System: Strategies

Strategy: Assessment Policies & Procedures

Purpose: To develop a useful, efficient, and sustainable Quality Assurance System.

Examples:
– Aligning systematic reporting with University and State reporting requirements
– Maintaining and updating Self-Study and SPA report files to eliminate duplicate work
– Developing a Master Assessment Plan and Calendar; Policies & Procedures for Program-Level Assessment Reporting (developed in collaboration with NAU's assessment office); and a Biennial Reporting Chart with expectations

Strategy: Assessment Policies & Procedures

Questions? Suggestions? If we aren't able to get to your question, please post it using the following URL. Check back after the presentation for a response.

Strategy: Assessment Policies & Procedures

Ideas:

CAEP Self-Study Report

An iterative process...
– Evidence file templates available on the QAS Resources website
– Formed CAEP Self-Study Writing Committees
– EPP-level faculty meetings held on a biennial basis to formally review data, using a "speed sharing" technique
– Re-reviewing assessment instruments (the high-level needs analysis strategy) to consider options for streamlining data collection
– Conducting an assessment audit with advanced programs to identify existing data and gaps related to the standards approved in June 2016

Developing a Quality Assurance System

The wonders of inflight construction… [Credit: Eugene Kim & Brian Narelle]

Building a Validity Argument for Locally Developed Performance Assessments

The Purpose of the Validity Inquiry Process (VIP) Model

The Validity Inquiry Process is a component of the Quality Assurance System.

Purpose: The purpose of the Validity Inquiry Process (VIP) Model instruments is to assist in examining and gathering evidence to build a validity argument for the interpretation and use of data from locally developed (faculty-developed) performance assessment instruments.

Leads to making the validity argument:
– Theory-to-practice approach
– Qualitative and reflective
– Efficient

Validity Inquiry Process (VIP) Model Criteria

1. Domain coverage
2. Content quality
3. Cognitive complexity
4. Meaningfulness
5. Generalizability
6. Consequences
7. Fairness
8. Cost and efficiency

(Linn, Baker, & Dunbar, 1991; Messick, 1994)

Validity Inquiry Forms

– Validity Inquiry Form
– Metarubric Form
– Student Survey

Using the Validity Inquiry Form

Cognitive Complexity

Daggett, W. R. (2014). Rigor/relevance framework®: A guide to focusing resources to increase student performance. International Center for Leadership in Education.

The Validity Inquiry Process: An Example

Student Teaching Capstone Assignment: Candidate Work Sample (CWS)

Background:
– Spring 2014: revisions made to the 7-row CWS, which was then used
– August 2014: 19-row rubric developed based on faculty and university supervisor feedback
– July 2015: inter-rater reliability session
– September 2015: Validity Inquiry Process meeting, further revisions; Next Steps Summary
– December 2015: change to CWS Evaluators
– February 2016: implementation; committee met to write the validity argument
– April 2016: CWS Evaluator debrief (with student teachers), additional next steps; summer revisions
– August 2016: CWS Evaluator calibration session (reviewed the revised CWS; inter-rater agreement training)

Instrument development is continuous.

Activity: Using the Validity Inquiry Form

Discuss in pairs or small groups:
– What is the stated purpose of this performance assessment, and is it an effective purpose statement?
– Q3: Using the Rigor/Relevance Framework, identify the quadrant the assessment falls into and provide a justification for this determination.

What are the results of your small group?

Using the Metarubric

Read the example assignment instructions and the Metarubric question.
– Criteria, Q2: Does the rubric criterion align directly with the assignment instructions?

What are the results of your small group?

Faculty Feedback Regarding the Process

"I wanted to thank you all for providing a really productive venue to discuss the progress and continuing issues with our assessment work. I left the meeting feeling very optimistic about where we have come and where we are going. Thank you." – Associate Professor, Elementary Education

"Thanks for your facilitation and leadership in this process. It is so valuable from many different perspectives, especially related to continuous improvement! Thanks for giving us permission to use the validity tools as we continue to discuss our courses with our peers. I continue to learn and grow..." – Assistant Clinical Professor, Special Education

Strategies for Calibrating Instruments

Purpose of Calibrating Instruments

– Purpose of calibrating instruments
– Strategies for calibrating instruments (Frame-of-Reference Training)

Inter-rater Agreement and Reliability

Agreement: measures the consistency of, or differences between, the absolute values of evaluators' scores.
Reliability: measures the variability of scores; the relative ranking/ordering of evaluators' scores.

[Table: example scores from Evaluators 1-4 for Students 1-3, contrasting a "Low Agreement, High Reliability" pattern with a "High Agreement, High Reliability" pattern; the individual scores and agreement values were lost in transcription. Reliability = 1.0.]

Adapted from Graham, Milanowski, & Miller (2012)
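The distinction in the table is easy to verify numerically. Below is a minimal sketch in Python, using hypothetical scores rather than the slide's lost data, that contrasts absolute agreement (the proportion of evaluator pairs assigning identical scores) with a rank-order reliability estimate (the average Pearson correlation between evaluators' score columns); the function names are illustrative, not from the presentation.

```python
# Minimal sketch with hypothetical scores (not the original slide data):
# agreement compares the absolute values of evaluators' scores, while
# reliability compares the relative ordering of students across evaluators.
from itertools import combinations
from statistics import mean

# Rows = students, columns = Evaluators 1-4 (hypothetical 4-point rubric scores)
scores = [
    [2, 3, 2, 3],  # Student 1
    [3, 4, 3, 4],  # Student 2
    [1, 2, 1, 2],  # Student 3
]

def absolute_agreement(rows):
    """Proportion of evaluator pairs giving identical scores, averaged over students."""
    rates = []
    for row in rows:
        pairs = list(combinations(row, 2))
        rates.append(sum(a == b for a, b in pairs) / len(pairs))
    return mean(rates)

def pearson(x, y):
    """Pearson correlation between two evaluators' score columns."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def mean_interrater_correlation(rows):
    """Average correlation across all evaluator pairs (rank-order consistency)."""
    cols = list(zip(*rows))
    return mean(pearson(cols[i], cols[j])
                for i, j in combinations(range(len(cols)), 2))

print(f"Absolute agreement: {absolute_agreement(scores):.2f}")                      # 0.33 (low)
print(f"Mean inter-rater correlation: {mean_interrater_correlation(scores):.2f}")   # 1.00 (high)
```

With these hypothetical scores, every evaluator ranks the three students identically (correlation 1.0) even though evaluator pairs rarely match exactly, which is precisely the "Low Agreement, High Reliability" pattern described above.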

Calibration Training: Frame-of-Reference Training

Elements adapted from Frame-of-Reference Training:
– Explanation of the rating system to evaluators
– Discussion of common evaluator errors and strategies for avoiding them
– Advice for making evaluations
– Practice calibrating a sample paper

Additional elements:
– Considerations for selecting evaluators and expert panelists
– Common expectations for the implementation of the assessment
– Ongoing monitoring of evaluators' ratings during the semester for scoring consistency
– Redesign of the instrument based on data, ad hoc focus groups, and evaluator feedback

Inter-rater Agreement: Calibration Strategies

Calibration strategies:
● Select anchor papers previously scored for the expert panel
● Select expert panel members to score the anchor papers
● Examine data from the anchor papers to determine the strongest paper for the calibration exercise
● Train the group of evaluators

Inter-rater Agreement: Calibration Strategies

Questions for small-group discussion given to evaluators participating in the calibration session (after individual scoring):
1. What evidence (i.e., quantity and quality) can you connect to each indicator of the rubric?
2. What challenges to developing consensus did your group encounter?
3. What qualitative feedback would you provide to help the candidate advance from one performance level to the next higher level?

Inter-rater Agreement: Calibration Strategies

– Whole-group discussion
– Reporting inter-rater agreement data
– Following up with programs

Inter-rater Agreement: Summary of Agreement Data

Summary of inter-rater agreement data (Summer 2016 CWS Evaluator Training & Calibration Session)

Summary of CWS Evaluators' percentages of agreement with the expert panel on the calibration exercise paper:
– Number of evaluators: 15
– Average % absolute agreement: 43.33%
– Average % adjacent agreement: 50.91%
– Overall average agreement (absolute + adjacent): 94.24%
– Cronbach's alpha (internal consistency reliability of the scale): .897
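Note that the "overall average agreement" is simply the sum of the absolute and adjacent rates (43.33% + 50.91% = 94.24%). For EPPs that want to reproduce these statistics for their own calibration sessions, here is a minimal sketch in Python, with hypothetical scores rather than the Summer 2016 data. It assumes one orientation for Cronbach's alpha (rubric rows as scale items, evaluators as observations), uses the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), and all names are illustrative.

```python
# Minimal sketch with hypothetical scores (not the Summer 2016 data):
# calibration-session statistics in the style of the summary above.
from statistics import mean, pvariance

expert = [3, 4, 2, 3, 4]        # expert-panel consensus score per rubric row
evaluators = [                  # one list of rubric-row scores per evaluator
    [3, 4, 2, 3, 4],
    [2, 3, 1, 2, 3],
    [3, 4, 2, 2, 4],
]

def agreement_with_panel(panel, rater):
    """Share of rubric rows scored identically (absolute) or within one level (adjacent)."""
    diffs = [abs(p - r) for p, r in zip(panel, rater)]
    return (sum(d == 0 for d in diffs) / len(diffs),
            sum(d == 1 for d in diffs) / len(diffs))

absolute, adjacent = zip(*(agreement_with_panel(expert, ev) for ev in evaluators))
print(f"Average % absolute agreement: {mean(absolute):.2%}")
print(f"Average % adjacent agreement: {mean(adjacent):.2%}")
print(f"Overall average agreement:    {mean(absolute) + mean(adjacent):.2%}")

def cronbach_alpha(rows):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = len(rows[0])                                   # number of rubric rows (items)
    item_var = sum(pvariance(col) for col in zip(*rows))
    total_var = pvariance([sum(row) for row in rows])
    return (k / (k - 1)) * (1 - item_var / total_var)

print(f"Cronbach's alpha: {cronbach_alpha(evaluators):.3f}")
```

A spreadsheet can compute the same quantities; the code merely makes the arithmetic behind the summary explicit and auditable.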

Inter-rater Agreement: Activity

Group discussion: Complete the worksheet provided by filling in the details for your EPP.
– Choose one unit-level (EPP-wide) assessment to which you could apply these strategies, and discuss the questions on the worksheet.

Spreadsheet template available on the QAS Resources website:

Strategies for Calibrating Instruments

Questions? Suggestions? If we aren't able to get to your question, please post it using the following URL. Check back after the presentation for a response.

Resources & Contact Information

Quality Assurance System Resources website:

Contact Information:
Cynthia Conn, PhD, Assistant Vice Provost, Professional Education Programs
Kathy Bohan, EdD, Associate Dean, College of Education
Sue Pieper, PhD, Assessment Coordinator, Office of Curriculum, Learning Design, & Academic Assessment
Matteo Musumeci, MA-TESL, Instructional Specialist, Professional Education Programs

Definitions

Performance Assessment: An assessment tool that requires test takers to perform—develop a product or demonstrate a process—so that the observer can assign a score or value to that performance. A science project, an essay, a persuasive speech, a mathematics problem solution, and a woodworking project are examples. (See also authentic assessment.)

Validity: The degree to which the evidence obtained through validation supports the score interpretations and uses to be made of the scores from a certain test administered to a certain person or group on a specific occasion. Sometimes the evidence shows why competing interpretations or uses are inappropriate, or less appropriate, than the proposed ones.

Reliability: Scores that are highly reliable are accurate, reproducible, and consistent from one testing [e.g., rating] occasion to another. That is, if the testing [e.g., rating] process were repeated with a group of test takers [e.g., raters], essentially the same results would be obtained.

(National Council on Measurement in Education. (2014). Glossary of important assessment and measurement terms.)

References

Daggett, W. R. (2014). Rigor/relevance framework®: A guide to focusing resources to increase student performance. International Center for Leadership in Education.

Gall, M. D., Borg, W. R., & Gall, J. P. (1996). Educational research: An introduction (6th ed.). White Plains, NY: Longman Publishers.

Graham, M., Milanowski, A., & Miller, J. (2012). Measuring and promoting inter-rater agreement of teacher and principal performance ratings. Center for Educator Compensation Reform.

Kane, M. (2013). The argument-based approach to validation. School Psychology Review, 42(4).

Linn, R. L., Baker, E. L., & Dunbar, S. B. (1991). Complex, performance-based assessment: Expectations and validation criteria. Educational Researcher, 20(8).

Messick, S. (1994). The interplay of evidence and consequences in the validation of performance assessments. Educational Researcher, 23(2).

Pieper, S. (2012, May 21). Evaluating descriptive rubrics checklist.

Stevens, D. D., & Levi, A. J. (2005). Introduction to rubrics: An assessment tool to save grading time, convey effective feedback and promote student learning. Sterling, VA: Stylus Publishing, LLC.