Developing a Quality Assurance System, Building a Validity Argument for Locally Developed Performance Assessments, and Strategies for Calibrating Instruments


1 Developing a Quality Assurance System, Building a Validity Argument for Locally Developed Performance Assessments, and Strategies for Calibrating Instruments
Cynthia Conn, PhD, Assistant Vice Provost, Professional Education Programs
Kathy Bohan, EdD, Associate Dean, College of Education
Sue Pieper, PhD, Assessment Coordinator, Office of Curriculum, Learning Design, & Academic Assessment
Matteo Musumeci, MA-TESL, Instructional Specialist, Professional Education Programs

2 NAU Professional Education Programs

3 Programs & Enrollment for Your EPP
Please respond to the following questions by raising your hand:
– Do you consider your institution to be large in terms of initial and/or advanced programs (25+) and program enrollment (2,000+)?
– Do you consider your institution to be medium-sized in terms of initial and/or advanced programs (10-25) and program enrollment (1,000 to 2,000)?
– Do you consider your institution to be small in terms of initial and advanced programs (<10) and program enrollment (<1,000)?

4 Your EPP's Timeline for Accreditation
Please respond to the following questions by raising your hand:
– Is your institution exploring CAEP accreditation or in the candidacy process?
– Has your institution recently been re-accredited through NCATE or TEAC and is now transitioning to CAEP?
– Is your institution in the process of writing its CAEP Self-Study in the next 1 to 3 years?
– Has your institution submitted the CAEP Self-Study report, with a Site Visit coming this year?

5 Workshop Objectives
– Discuss strategies for developing a comprehensive Quality Assurance System (CAEP Standard 5.1)
– Discuss a framework and strategies for examining the validity and reliability of the use and interpretation of locally developed performance assessments:
  - Validity Inquiry Process Model
  - Strategies for calibrating performance assessments

6 Developing a Quality Assurance System

7 What Is a Quality Assurance System?
Ideas:

8 Quality Assurance System: Components/Structure
QAS Resources: https://nau.edu/Provost/PEP/Quality-Assurance-System/

9 Quality Assurance System: Definitions
What is a Quality Assurance System?
CAEP Standard 5.1: "The provider's quality assurance system is comprised of multiple measures that can monitor candidate progress, completer achievements, and provider operational effectiveness. Evidence demonstrates that the provider satisfies all CAEP standards."
CAEP Standard 5.3 (required component): "The provider regularly and systematically assesses performance against its goals and relevant standards, tracks results over time, tests innovations and the effects of selection criteria on subsequent progress and completion, and uses results to improve program elements and processes."

10 Quality Assurance System: Strategies

11 Strategy: High-Level Needs Analysis
Purpose: Document strengths and issues related to your current quality assurance system to help prioritize work.
Examples:
– At NAU, this first strategy was conducted by the Assistant Vice Provost of NAU Professional Education Programs. The work provided the information needed to prioritize efforts and develop a vision for the Quality Assurance System.
– NAU was collecting data well but needed to improve systematic reporting and access to data.
– We also recognized we needed to improve the quality of our assessment instruments.

12 Strategy: High-Level Needs Analysis
Activity (partner share/large-group share):
– How have you or could you gather high-level needs analysis data on your campus? Who did/could you talk to on your campus? What documentation did/could you review?
– Are there other initial approaches your campus took to develop a quality assurance system?
– Who could implement this strategy on your campus?

13 Strategy: High-Level Needs Analysis
Ideas:
– How have you or could you gather high-level needs analysis data on your campus?
– Are there other initial approaches your campus took to develop a quality assurance system?
– Who could implement this strategy on your campus?

14 Strategy: High-Level Needs Analysis
Ideas:

15 Quality Assurance System: Strategies

16 Strategy: Assessment Audit
Purpose: 1) Develop a detailed listing of current assessment instruments; 2) Document alignment to CAEP Standards, quality of the instruments, and the implementation schedule.
Examples:
– Two of NAU's EPP leaders conducted the initial assessment audit and discussed strengths and gaps with Coordinating Council members:
  - The Student Teaching Evaluation and Candidate Work Sample needed to be improved in terms of validity and reliability.
  - NAU's EPP identified gaps in collecting data about graduates.
– Assessment Audit Template

17 Strategy: Assessment Audit
Assessment Audit Template
CAEP Standard #1: Candidate Knowledge, Skills, and Professional Dispositions

Standard Component | Evidence/Assessment Instrument | Schedule (Implementation, Reporting, Review, Administrations) | Use of Data | Validity/Reliability | CAEP Assessment Review Criteria

QAS Resources: https://nau.edu/Provost/PEP/Quality-Assurance-System/

18 Strategy: Assessment Audit
Questions? Suggestions? If we aren't able to get to your question, please post it using the following URL and check back after the presentation for a response: http://tinyurl.com/QASquestions

19 Strategy: Assessment Audit
Ideas:

20 Quality Assurance System: Strategies

21 Strategy: Identify & Implement Data Tools
Purpose: To identify the data tool functions that must be present for systematic collection, reporting, and use of data.

22 Strategy: Identify & Implement Data Tools
Data Tools and Functions Self-Assessment

Data Tool    | Data Tool Function                            | Audience                        | Sustainable and Efficient?
Name of tool | Which function is aligned with the data tool? | Who will view and use the data? | Yes/No

Identify and address current areas of strength and any potential gaps related to data tools on your campus:
– Types of data collection (e.g., rubric or survey tools)
– Reporting tools
– Sustainability and efficiency
– Audiences
Identify any gaps in relation to data functions in your quality assurance system.

23 What Data Tool Functions Are Needed?
Ideas:

24 Strategy: Identify & Implement Data Tools
Ideas:

25 Developing a Quality Assurance System: Strategies

26 Strategy: Assessment Policies & Procedures
Purpose: To develop a useful, efficient, and sustainable Quality Assurance System.
Examples:
– Align systematic reporting with University and State reporting requirements.
– Maintain and update Self-Study and SPA report files to eliminate duplicate work.
– Develop a Master Assessment Plan and Calendar, Policies & Procedures for Program-Level Assessment Reporting (developed in collaboration with NAU's assessment office), and a Biennial Reporting Chart with expectations.

27 Strategy: Assessment Policies & Procedures
Questions? Suggestions? If we aren't able to get to your question, please post it using the following URL and check back after the presentation for a response: http://tinyurl.com/QASquestions

28 Strategy: Assessment Policies & Procedures
Ideas:

29 CAEP Self-Study Report
An iterative process...
– Evidence file templates are available on the QAS Resources website.
– Formed CAEP Self-Study writing committees.
– EPP-level faculty meetings are held on a biennial basis to formally review data, using a "speed sharing" technique.
– Re-reviewing assessment instruments (the high-level needs analysis strategy) to consider options for streamlining data collection.
– Conducting an assessment audit with advanced programs to identify existing data and gaps related to the standards approved in June 2016.

30 Developing a Quality Assurance System
The wonders of inflight construction… [Credit: Eugene Kim & Brian Narelle, http://eekim.com/eekim/]

31 Building a Validity Argument for Locally Developed Performance Assessments

32 The Purpose of the Validity Inquiry Process (VIP) Model
The Validity Inquiry Process is a component of the Quality Assurance System.
Purpose: The VIP Model instruments assist in examining and gathering evidence to build a validity argument for the interpretation and use of data from locally developed (faculty-developed) performance assessment instruments.
Leads to making the validity argument:
– Theory-to-practice approach
– Qualitative and reflective
– Efficient


34 Validity Inquiry Process (VIP) Model Criteria
1. Domain coverage
2. Content quality
3. Cognitive complexity
4. Meaningfulness
5. Generalizability
6. Consequences
7. Fairness
8. Cost and efficiency
(Linn, Baker, & Dunbar, 1991; Messick, 1994)

35 Validity Inquiry Forms
– Validity Inquiry Form
– Metarubric Form
– Student Survey

36 Using the Validity Inquiry Form: Cognitive Complexity
Daggett, W. R. (2014). Rigor/Relevance Framework®: A guide to focusing resources to increase student performance. International Center for Leadership in Education. Retrieved from http://www.leadered.com/our-philosophy/rigor-relevance-framework.php

37 The Validity Inquiry Process: Example
Student Teaching Capstone Assignment: Candidate Work Sample (CWS)
Background:
– Spring 2014: revisions to the 7-row CWS rubric made and used
– August 2014: 19-row rubric based on faculty and university supervisor feedback
– July 2015: inter-rater reliability session
– September 2015: Validity Inquiry Process meeting, further revisions; Next Steps summary
– December 2015: change to CWS Evaluators
– February 2016: implementation; committee met to write the validity argument
– April 2016: CWS Evaluator (with student teachers) debrief, additional next steps; summer revisions
– August 2016: CWS Evaluator calibration session (reviewed the revised CWS, inter-rater agreement training)
Instrument development is continuous.

38 Activity: Using the Validity Inquiry Form
Discuss in pairs or small groups:
– What is the stated purpose of this performance assessment, and is it an effective purpose statement?
– Q3: Using the Rigor/Relevance Framework (http://www.leadered.com/our-philosophy/rigor-relevance-framework.php), identify the quadrant that the assessment falls into and provide a justification for this determination.
What are the results of your small group?

39 Using the Metarubric
Read the example assignment instructions and the Metarubric question.
– Criteria, Q2: Does the rubric criterion align directly with the assignment instructions?
What are the results of your small group?

40 Faculty Feedback Regarding Process
"I wanted to thank you all for providing a really productive venue to discuss the progress and continuing issues with our assessment work. I left the meeting feeling very optimistic about where we have come and where we are going. Thank you." – Associate Professor, Elementary Education
"Thanks for your facilitation and leadership in this process. It is so valuable from many different perspectives, especially related to continuous improvement! Thanks for giving us permission to use the validity tools as we continue to discuss our courses with our peers. I continue to learn and grow..." – Assistant Clinical Professor, Special Education

41 Strategies for Calibrating Instruments

42 Purpose of Calibrating Instruments
– Purpose of calibrating instruments
– Strategies for calibrating instruments (Frame-of-Reference Training)

43 Inter-rater Agreement and Reliability
Agreement: measures the consistency of, and differences between, the absolute values of evaluators' scores.
Reliability: measures the variability of scores, i.e., the relative ranking/ordering of evaluators' scores.

              Low Agreement, High Reliability    High Agreement, High Reliability
              Evaluator 1    Evaluator 2         Evaluator 3    Evaluator 4
Student 1     1              2                   1              1
Student 2     2              3                   2              2
Student 3     3              4                   3              3
Agreement     0.0                                1.0
Reliability   1.0                                1.0

Adapted from Graham, Milanowski, & Miller (2012)
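To make the distinction concrete, here is a minimal sketch in Python using the example scores from the table above. The helper function and variable names are illustrative, not part of any EPP toolkit, and Pearson correlation stands in here as one simple rank-order reliability index.

```python
# A minimal sketch of the agreement-vs.-reliability distinction, using the
# example scores from the table above. Names are illustrative assumptions.
import numpy as np

evaluator_1 = np.array([1, 2, 3])  # scores for Students 1-3
evaluator_2 = np.array([2, 3, 4])  # always one point higher: no exact matches
evaluator_3 = np.array([1, 2, 3])
evaluator_4 = np.array([1, 2, 3])  # identical scores throughout

def absolute_agreement(a, b):
    """Proportion of students receiving identical scores from both raters."""
    return (a == b).mean()

# Pair 1-2: zero absolute agreement, but the rank ordering matches exactly,
# so the Pearson correlation (a simple reliability index) is 1.0.
print(absolute_agreement(evaluator_1, evaluator_2))   # 0.0
print(np.corrcoef(evaluator_1, evaluator_2)[0, 1])    # 1.0

# Pair 3-4: identical scores give agreement 1.0 and correlation 1.0.
print(absolute_agreement(evaluator_3, evaluator_4))   # 1.0
print(np.corrcoef(evaluator_3, evaluator_4)[0, 1])    # 1.0
```

Other reliability indices (e.g., intraclass correlation) capture the same rank-order idea; the point of the table is that a rater who is systematically one point high hurts agreement but not reliability.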

44 Calibration Training: Frame-of-Reference Training
Elements adapted from Frame-of-Reference Training:
– Explanation of the rating system to evaluators
– Discussion of common evaluator errors and strategies for avoiding them
– Advice for making evaluations
– Practice calibrating a sample paper
Considerations for selecting evaluators and expert panelists:
– Common expectations for the implementation of the assessment
Ongoing monitoring of evaluators' ratings during the semester for scoring consistency
Redesign of the instrument based on data, ad hoc focus groups, and evaluator feedback

45 Inter-rater Agreement: Calibration Strategies
Calibration strategies:
– Select anchor papers previously scored for the expert panel
– Select expert panel members to score the anchor papers
– Examine data from the anchor papers to determine the strongest paper for the calibration exercise
– Train the group of evaluators

46 Inter-rater Agreement: Calibration Strategies
Questions for small-group discussion given to evaluators at the calibration session (after individual scoring):
1. What evidence (i.e., quantity and quality) can you connect to each indicator of the rubric?
2. What challenges to developing consensus did your group encounter?
3. What qualitative feedback would you provide to help the candidate advance from one performance level to the next higher level?

47 Inter-rater Agreement: Calibration Strategies
– Whole-group discussion
– Reporting inter-rater agreement data
– Following up with programs

48 Inter-rater Agreement: Summary of Agreement Data
Summary of inter-rater agreement data (Summer 2016 CWS Evaluator Training & Calibration Session)
Summary of CWS Evaluators' percentages of agreement with the expert panel on the calibration exercise paper:

Number of evaluators                                  15
Average % absolute agreement                          43.33%
Average % adjacent agreement                          50.91%
Overall average agreement (absolute + adjacent)       94.24%
Cronbach's alpha (internal consistency of the scale)  .897
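For readers who want to reproduce statistics like these, here is a minimal sketch assuming each evaluator scored the same calibration paper on an integer rubric scale and the expert panel's consensus score per rubric row is known. The toy scores and all names are illustrative assumptions, not the Summer 2016 CWS data or NAU's actual tooling.

```python
# A minimal sketch of the statistics reported above, on toy data.
import numpy as np

# rows = evaluators, columns = rubric rows, integer rubric scale
scores = np.array([
    [4, 4, 4, 3],
    [3, 3, 3, 3],
    [2, 2, 3, 2],
])
expert = np.array([3, 3, 3, 3])  # expert panel consensus per rubric row

diff = np.abs(scores - expert)
absolute_pct = (diff == 0).mean() * 100  # exact match with the panel
adjacent_pct = (diff == 1).mean() * 100  # off by one scale point
overall_pct = absolute_pct + adjacent_pct

def cronbach_alpha(data: np.ndarray) -> float:
    """Cronbach's alpha, treating rubric rows (columns) as scale items."""
    k = data.shape[1]
    item_variances = data.var(axis=0, ddof=1).sum()
    total_variance = data.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

print(f"Absolute: {absolute_pct:.2f}%")
print(f"Adjacent: {adjacent_pct:.2f}%")
print(f"Overall (absolute + adjacent): {overall_pct:.2f}%")
print(f"Cronbach's alpha: {cronbach_alpha(scores):.3f}")
```

Because every evaluator scores the same number of rubric rows, averaging matches over all cells equals the mean of per-evaluator agreement percentages.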

49 Inter-rater Agreement: Activity
Group discussion: Complete the worksheet provided by filling in the details for your EPP.
– Choose one unit-level (EPP-wide) assessment to which you could apply these strategies, and discuss the questions on the worksheet.
Spreadsheet template available on the QAS Resources website: https://nau.edu/Provost/PEP/Quality-Assurance-System/

50 Strategies for Calibrating Instruments
Questions? Suggestions? If we aren't able to get to your question, please post it using the following URL and check back after the presentation for a response: http://tinyurl.com/QASquestions

51 Resources & Contact Information
Quality Assurance System Resources website: https://nau.edu/Provost/PEP/Quality-Assurance-System/
Contact Information:
– Cynthia Conn, PhD, Assistant Vice Provost, Professional Education Programs, Cynthia.Conn@nau.edu
– Kathy Bohan, EdD, Associate Dean, College of Education, Kathy.Bohan@nau.edu
– Sue Pieper, PhD, Assessment Coordinator, Office of Curriculum, Learning Design, & Academic Assessment, Sue.Pieper@nau.edu
– Matteo Musumeci, MA-TESL, Instructional Specialist, Professional Education Programs, Matteo.Musumeci@nau.edu

52 Definitions
Performance Assessment: An assessment tool that requires test takers to perform—develop a product or demonstrate a process—so that the observer can assign a score or value to that performance. A science project, an essay, a persuasive speech, a mathematics problem solution, and a woodworking project are examples. (See also authentic assessment.)
Validity: The degree to which the evidence obtained through validation supports the score interpretations and uses to be made of the scores from a certain test administered to a certain person or group on a specific occasion. Sometimes the evidence shows why competing interpretations or uses are inappropriate, or less appropriate, than the proposed ones.
Reliability: Scores that are highly reliable are accurate, reproducible, and consistent from one testing [e.g., rating] occasion to another. That is, if the testing [e.g., rating] process were repeated with a group of test takers [e.g., raters], essentially the same results would be obtained.
(National Council on Measurement in Education. (2014). Glossary of important assessment and measurement terms. Retrieved from http://ncme.org/resource-center/glossary/)

53 References
Daggett, W. R. (2014). Rigor/Relevance Framework®: A guide to focusing resources to increase student performance. International Center for Leadership in Education. Retrieved from http://www.leadered.com/our-philosophy/rigor-relevance-framework.php
Gall, M. D., Borg, W. R., & Gall, J. P. (1996). Educational research: An introduction (6th ed.). White Plains, NY: Longman Publishers.
Graham, M., Milanowski, A., & Miller, J. (2012). Measuring and promoting inter-rater agreement of teacher and principal performance ratings. Center for Educator Compensation Reform. Retrieved from http://files.eric.ed.gov/fulltext/ED532068.pdf
Kane, M. (2013). The argument-based approach to validation. School Psychology Review, 42(4), 448-457.
Linn, R. L., Baker, E. L., & Dunbar, S. B. (1991). Complex, performance-based assessment: Expectations and validation criteria. Educational Researcher, 20(8), 15-21.
Messick, S. (1994). The interplay of evidence and consequences in the validation of performance assessments. Educational Researcher, 23(2), 13-23.
Pieper, S. (2012, May 21). Evaluating descriptive rubrics checklist. Retrieved from http://www2.nau.edu/~d-elearn/events/tracks.php?EVENT_ID=165
Stevens, D. D., & Levi, A. J. (2005). Introduction to rubrics: An assessment tool to save grading time, convey effective feedback and promote student learning. Sterling, VA: Stylus Publishing, LLC.

