Assessment Plan Tune-up (presentation transcript)
1 Assessment Plan Tune-up
Fall 2016 Assessment Academy Workshop 1
D. Kent Johnson, PhD, Director of Assessment

2 Today's Objectives
- Review the outline for the Assessment Report (Part 3 of the Annual Assessment Report)
- Discuss how to ensure the assessment plan measures the progression of student learning relative to programmatic SLOs as students progress toward graduation
- Examine how the present assessment plan leads to actionable findings that can be used to improve student success through programmatic interventions and innovations
- Plan an assessment schedule that cycles through assessing, intervening, and reassessing student achievement of all SLOs over a 3- to 4-year cycle
- Briefly discuss how cumulative reporting contributes to the completion of the Academic Program Review (time permitting)

3 HLC Assessment Expectation Statements

"For student learning, a commitment to assessment would mean assessment at the program level that proceeds from clear goals, involves faculty at all points in the process, and analyzes the assessment results; it would also mean that the institution improves its programs or ancillary services or other operations on the basis of those analyses." (Higher Learning Commission, "The Criteria for Accreditation: Guiding Values")

"Qualified faculty should also be aware of whether and how much students learn through the ongoing collection and analysis of appropriate data, because an institution should be able to demonstrate its commitment to educational achievement and improvement through ongoing assessment of student learning." (Higher Learning Commission, "Determining Qualified Faculty through HLC's Criteria for Accreditation and Assumed Practices: Guidelines for Institutions and Peer Reviewers," October 2015)

4 A Review of Department/Program Level Assessment Activities and Results
Assessing, Making Interventions, and Reassessing to Build Programmatic Capacity for Improving Student Learning

5 IPFW Undergraduate Program Assessment Model

6 Annual Academic Program Assessment Report Outline (SD 15-6)
- Overview of Programmatic Student Learning Outcomes (Appendix D, Section I)
- Curricular Maps: Institutional to Baccalaureate Framework, and Programmatic to Core Courses or Specific Curricular Mileposts (Appendix D, Sections II and III)
- Assessment Plan (Appendix D, Section IV)
- Assessment Results (Appendix D, Section V)
- Conclusions, Next Steps, and Communication (Appendix D, Sections VI and VII)

7 Assessment Reporting
- Sections 1, 2, and 3 should be fairly static (require minimal changes) over time
- Sections 3 and 4 are informed by the PLAIR model, emphasizing a process of "assess-intervene-reassess" (Fulcher, Good, Coleman, and Smith, 2015)
- Sections 5 and 6 draw conclusions that demonstrate the improving quality of the program, suggest how the faculty plan to continue improving program support of student learning, and communicate to internal and external constituents what students are learning

8 Assessment Plan Outline*
- Description of the department's assessment model: the plan for assessing student learning (how, when, and where) relative to programmatic SLOs (assess), for using assessment results to improve student learning in the program (intervene), and for evaluating how changes affected student learning (reassess) (Fulcher et al., 2015)
- Identification and description of the assessment measures the department plans to use (e.g., direct assessment of student learning embedded in common course assignments, direct assessment of student portfolios, externally validated professional exams, indirect assessments of student perceptions of learning gains); a plan needs at least two measures, at least one of them direct (see the sketch after this list)
- Description of how assessment products will be evaluated (e.g., rubrics, primary trait analysis, metrics)
- Description of how assessment findings will be used to improve student learning

* For support, see the Assessment Handbook and Workbook (ipfw.edu/assessment)
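A minimal sketch of the "two measures, at least one direct" check, assuming hypothetical measure names (Python is used here purely for illustration):

```python
from dataclasses import dataclass

@dataclass
class Measure:
    name: str     # e.g., "capstone portfolio rubric" (hypothetical)
    direct: bool  # True = direct evidence of learning; False = perception-based

def meets_measure_requirement(measures: list[Measure]) -> bool:
    """Check the plan rule: at least two measures, at least one of them direct."""
    return len(measures) >= 2 and any(m.direct for m in measures)

# Hypothetical measures for a single SLO: one direct, one indirect.
slo_measures = [
    Measure("Embedded common-assignment rubric", direct=True),
    Measure("Senior exit survey of perceived learning gains", direct=False),
]
assert meets_measure_requirement(slo_measures)
```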

9 Developing the Assessment Plan:
Designing the Assessment Plan – Part 1
- Review the curriculum map for your program: does it identify the levels of learning expected of students as they progress through the curriculum? If not, ask whether your programmatic SLOs are written on the basis of expectations at graduation.
- Let's choose one outcome. One way to assess achievement at the programmatic level is to assess somewhere early (a 100- or 200-level course), in the middle (a 300-level course), and toward the end (perhaps a capstone course or experience); see the sketch after this list.
- This strategy provides the evidence you need to identify where learning might be improved.
- For the most reliable assessment, think in terms of the types of products students should produce as evidence of learning; this will begin to determine the measures you will use.
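For illustration, a minimal sketch of the early/middle/end strategy above, with invented course numbers standing in for a real curriculum map:

```python
# Hypothetical curriculum-map entry: one programmatic SLO assessed at an
# early, a middle, and an end point in the curriculum. Course numbers and
# the SLO name are invented for illustration.
assessment_points = {
    "SLO 1: Written communication": [
        ("DEPT 110", "early: 100/200-level course"),
        ("DEPT 330", "middle: 300-level course"),
        ("DEPT 495", "end: capstone course or experience"),
    ],
}

for slo, points in assessment_points.items():
    for course, stage in points:
        print(f"{slo} -> assess in {course} ({stage})")
```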

10 Developing the Assessment Plan:
Designing the Assessment Plan – Part 2
- Direct measures assess student achievement relative to the outcome, based on students demonstrating achievement
- Indirect measures assess a student's or faculty member's perception of achievement relative to the outcome
- Determining the type of measure forms the foundation for selecting or designing the assessment instrument, i.e., the type of product students will produce to be measured
- The expected level of learning might be expressed as a rubric, matrix, objective statement, etc.; this will largely depend on the conventions of your discipline. The plan should define how learning will be measured.
- Describe how and when you will measure student learning, and how you plan to use findings to guide programmatic change (e.g., discussions in department meetings leading to curricular changes): planning interventions and innovations
- Finally, describe how you will reassess the interventions or innovations made

11 Developing the Assessment Plan:
Planning the Assessment Cycle (worksheet columns):

Programmatic SLO | Year Assessed | Intervention Planned (Y/N) | Year Reassessed
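A minimal sketch of the planning table above as a data structure; the SLO names and years are invented, and the Y/N column becomes a boolean:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CycleEntry:
    slo: str                               # Programmatic SLO
    year_assessed: int                     # Year Assessed
    intervention_planned: bool             # Intervention Planned (Y/N)
    year_reassessed: Optional[int] = None  # Year Reassessed, once scheduled

# Hypothetical 3- to 4-year rotation through three SLOs.
schedule = [
    CycleEntry("SLO 1", 2016, intervention_planned=True, year_reassessed=2018),
    CycleEntry("SLO 2", 2017, intervention_planned=False),
    CycleEntry("SLO 3", 2018, intervention_planned=True, year_reassessed=2020),
]

for entry in schedule:
    print(entry)
```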

12 Part 2: Reviewing Your Plan
- Begin with the Assessment Plan Evaluation Rubric from Appendix D
- Work back from it to the SLOs
- Work forward to your assessment reporting

13 Appendix D-IV: Assessment Plan
Scored: Exemplary = 3, Acceptable = 2, Developing = 1

Relationship between assessments and SLOs
- Exemplary (3): Detail is provided regarding the SLO-to-measure match. Specific items included on the assessment are linked to SLOs. The match is affirmed by faculty subject experts.
- Acceptable (2): Description of how SLOs relate to assessment is general but sufficient to show alignment.
- Developing (1): Description of how SLOs relate to assessment is incomplete or too general to provide sufficient information for use in determining progress toward the SLO.

Types of Measures
- Exemplary (3): All SLOs are assessed using at least two measures, including at least one direct measure.
- Acceptable (2): Most SLOs are assessed using at least one direct measure.
- Developing (1): Most SLOs are either assessed using only indirect measures or are not assessed.

Established Results
- Exemplary (3): Statements of desired results (data targets) provide useful comparisons and detailed timelines for completion.
- Acceptable (2): Statements of desired results provide a basic data target and a general timeline for completion.
- Developing (1): Statements of desired results are missing or unrealistic for completion.

Data Collection and Design Integrity
- Exemplary (3): The data collection process is sound, clearly explained, and appropriately specific to be actionable.
- Acceptable (2): Enough information is provided to understand the data collection process, with limited methodological concerns.
- Developing (1): Limited information is provided about the data collection process, or it includes flaws sufficient to nullify any conclusions drawn from the data.

Evidence of Reliability of Measures
- Exemplary (3): Methods used to ensure reliability of findings are clearly explained and consistently support drawing meaningful conclusions.
- Acceptable (2): Methods used to ensure reliability of findings are stated and generally support drawing meaningful conclusions.
- Developing (1): Methods to ensure reliability of findings are insufficient for drawing meaningful conclusions.
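For illustration, a minimal sketch of self-scoring a plan against the Appendix D-IV rubric above; the criterion ratings are invented:

```python
# Hypothetical self-scoring against the Appendix D-IV rubric: each criterion
# is rated 3 (Exemplary), 2 (Acceptable), or 1 (Developing).
scores = {
    "Relationship between assessments and SLOs": 3,
    "Types of Measures": 2,
    "Established Results": 2,
    "Data Collection and Design Integrity": 3,
    "Evidence of Reliability of Measures": 1,
}

assert all(s in (1, 2, 3) for s in scores.values())
total = sum(scores.values())
print(f"Assessment plan score: {total} / {3 * len(scores)}")  # 11 / 15
```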

14 Appendix D-III: Programmatic Curriculum Map
Scored: Exemplary = 3, Acceptable = 2, Developing = 1

Content Alignment
- Exemplary (3): All SLOs are mapped to common classes or learning activities expected of all students completing the program.
- Acceptable (2): Most SLOs are mapped to common classes or learning activities expected of all students completing the program.
- Developing (1): Common classes or learning activities are identified for all students completing the program, but most SLOs are not clearly mapped to classes or activities.

Student Learning Development of SLOs (Learning Benchmarks)
- Exemplary (3): The curricular map clearly identifies the progression of student learning relative to all SLOs at specific points in the curriculum.
- Acceptable (2): The curricular map identifies levels of expected learning relative to most SLOs at specific points in the curriculum.
- Developing (1): The curricular map identifies expected levels of learning for some SLOs at specific points in the curriculum.

Student Engagement
- Exemplary (3): Classes and/or activities engage students in the work outlined in the SLOs.
- Acceptable (2): Classes and/or activities engage students in the work outlined by most of the SLOs.
- Developing (1): Classes and/or activities do not consistently engage students in the work outlined by most of the SLOs.

15 Appendix D-II: Alignment with Baccalaureate Framework
Scored: Exemplary = 3, Acceptable = 2, Developing = 1

IPFW Baccalaureate Framework Alignment
- Exemplary (3): Specific, clearly defined, student-centered Program-Level SLOs are aligned to all foundation areas of the IPFW Baccalaureate Framework.
- Acceptable (2): Generally defined student-centered Program-Level SLOs are aligned to all foundation areas of the IPFW Baccalaureate Framework.
- Developing (1): Program-Level SLOs are aligned to some foundation areas of the IPFW Baccalaureate Framework.

16 Appendix D-I: Clearly Stated Student Learning Outcomes
Scored: Exemplary = 3, Acceptable = 2, Developing = 1

Clarity and Specificity
- Exemplary (3): All SLOs are stated with clarity and specificity, including precise verbs and rich descriptions of the knowledge, skill, and value domains expected of students upon completing the program.
- Acceptable (2): SLOs generally contain precise verbs and rich descriptions of the knowledge, skill, and value domains expected of students.
- Developing (1): SLOs are inconsistently defined for the program; descriptions of the knowledge, skill, and value domains are present but lack consistent precision.

Student-Centered
- Exemplary (3): All SLOs are stated in student-centered terms (i.e., what a student should know, think, or do).
- Acceptable (2): Most SLOs are stated in student-centered terms.
- Developing (1): Some SLOs are stated in student-centered terms.

Expectation Level
- Exemplary (3): SLOs exceed the basic expectations established by the University and other necessary approving organizations required of the submitting unit.
- Acceptable (2): SLOs meet the basic expectations established by the University and other necessary approving organizations required of the submitting unit.
- Developing (1): SLOs meet only a portion of the expectations established by the University or other necessary approving organizations required of the submitting unit.

17 Appendix D-V: Reporting Results
Scored: Exemplary = 3, Acceptable = 2, Developing = 1

Presentation of Results
- Exemplary (3): Results are clearly presented and directly related to SLOs. Results consistently demonstrate student achievement relative to stated SLOs. Results are derived from generally accepted practices for student learning outcomes assessment.
- Acceptable (2): Results are present and related to SLOs. Results generally demonstrate student achievement relative to stated SLOs. Results are derived from generally accepted practices for student learning outcomes assessment.
- Developing (1): Results are provided but do not clearly relate to SLOs. Results inconsistently demonstrate student achievement relative to stated SLOs. Use of generally accepted practices for student learning outcomes assessment is unclear.

Historical Results
- Exemplary (3): Past iterations of results are provided for most assessments to provide context for current results.
- Acceptable (2): Past iterations of results are provided for the majority of assessments to provide context for current results.
- Developing (1): Limited or no prior results are provided.

Interpretation of Results
- Exemplary (3): Interpretations of results are reasonable given the SLOs, desired levels of student learning, and methodology employed. Multiple faculty interpreted the results, including an interpretation of how classes/activities might have affected the results.
- Acceptable (2): Interpretations of results are reasonable given the SLOs, desired levels of student learning, and methodology employed. Multiple faculty interpreted the results.
- Developing (1): Interpretation of results does not adequately refer to stated SLOs or identify expectations for student learning relative to SLOs. The interpretation does not include multiple faculty.

18 Appendix D-VI: Report Dissemination and Collaboration
Scored: Exemplary = 3, Acceptable = 2, Developing = 1

Documents and results are shared with faculty
- Exemplary (3): Information is routinely provided to all faculty, with multiple opportunities for collaboration to build meaningful future plans.
- Acceptable (2): Information is provided to all faculty through an effective mode and with sufficient detail to be meaningful.
- Developing (1): Information is not distributed to all faculty or provides insufficient detail to be meaningful.

Documents and results are shared with other stakeholders
- Exemplary (3): Information is routinely provided to stakeholders (beyond faculty), with multiple opportunities for collaboration to build meaningful future plans.
- Acceptable (2): Information is shared with stakeholders (beyond faculty) through an effective mode and with sufficient detail to be meaningful.
- Developing (1): Information is not distributed to stakeholders (beyond faculty) or provides insufficient detail to be meaningful.

19 College-Level Report Template: Assessment Report
- Section 1: Summary of findings detailing the scores of all academic departments/programs of the college.
- Section 2: Summary of recommendations made to academic departments/programs based on their assessment findings.
- Section 3: Summary of the results of changes made or actions taken in response to prior-year findings, including student learning results and a summary of impact (positive or negative).
- Section 4: Conclusions providing an overall evaluation of assessment in the college and a description of planned process changes to improve the quality of student learning.

20 Part 3: Supporting Programmatic Assessment
The Office of Assessment, IPFW Assessment Academy, and Assessment Council as the Departmental Support Team

21 Overview of Institutional Assessment Team
- Office of Assessment: coordinates and provides support for assessment activities
- The Assessment Council: a shared-governance group of faculty charged with recommending policy and providing oversight of the assessment process
- The IPFW Assessment Academy Leadership Team: a faculty group charged with creating and supporting learning cohorts, programs, workshops, and other activities focused on developing and improving assessment at IPFW

22 Support Available to Academic Departments
- IPFW Assessment Academy: a cohort-based community that supports departments through a full cycle of assessment
- Workshop Series: delivered in general sessions and available as a department/program-focused series, "on demand" and "just in time," based on department/program needs
- Blackboard Course and Assessment Website: supports the Workshop Series and can stand alone
- Assessment Director: provides ongoing and on-demand support, resources, and leadership for programmatic and course-level assessment

