Compiling and Reporting Your Assessment Plan Data
Office of Institutional Assessment & Effectiveness
SUNY Oneonta, Spring 2012


1 Compiling and Reporting Your Assessment Plan Data
Office of Institutional Assessment & Effectiveness, SUNY Oneonta, Spring 2012

2 Providing a Context: Oneonta's Process for Assessing Student Learning
- Fall 2009: Approval by President's Cabinet of APAC Guidelines for Academic Program Assessment
- Spring 2010: Submission by programs of Step 1 (Establishing Objectives) of guidelines
- December 2010: Submission by programs of Step 2 (Activities & Strategies) of guidelines
- June 1, 2011: Submission of Steps 3 (Assessment) and 4 (Closing the Loop) [plans only]
- All plans approved by academic deans by the end of Fall 2011
- 2011-12 academic year: First round of data collection, with reports due June 1, 2012 as part of annual reports

3 Emphases Throughout the Process
- Establishing congruence among institutional goals, programmatic and course objectives, learning opportunities, and assessments
- Linkages to disciplinary (and, as appropriate, accreditation/certification) standards
- Using a variety of assessment measures, both quantitative and qualitative, in search of convergence
- Value of course-embedded assessment
- Course- vs. program-level assessment

4 Course- vs. Program-Level Assessment
- The focus of SUNY Oneonta assessment planning is programmatic student learning objectives, not individual students or faculty
- For the most part, data collection described in approved assessment plans will take place in the context of the classroom (i.e., course-embedded assessment)
- The focus at this point, however, is how to compile and aggregate data across courses and course sections, as appropriate, and to report these results in a meaningful way

5 What You've Done So Far
1. Development of programmatic student learning objectives
2. Curriculum mapping
3. Development of a plan for collecting data across a three-year period (APAC guidelines recommend that programs assess one third of SLOs every year)
4. Development of a plan for reviewing data and using that information to revise courses and programs, as appropriate (i.e., closing the loop)

6 Follow Your Curriculum Map – It's Your Assessment Database!
Designed correctly, your curriculum map tells you what to assess, when to assess, and how to assess it!

7 Collecting Data: Important Points from the Spring 2011 Workshop
Collaborative work among faculty to assure:
- SLOs are being assessed using direct measures
- Measures are of sufficient quality, either by review process or through the use of a checklist that includes good-practice criteria
- A priori success indicators are in place for each measure, as well as common categories for results (e.g., exceeding, meeting, approaching, not meeting standards)
- Decisions on how the issue of multiple course sections will be addressed
A process in place for assuring:
- Reliable scoring of qualitative measures (see the sketch below)
- Courses (and students) to be included in assessment are representative
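To make the "reliable scoring" point concrete, here is a minimal sketch of one common check: exact percent agreement between two raters who independently categorize the same student work. The rater data and artifact count are invented for illustration; the four category labels are the ones used throughout this deck.

    # Minimal inter-rater agreement check (illustrative data, not from the workshop).
    # Two raters independently categorize the same ten student artifacts.
    rater_a = ["Meeting", "Exceeding", "Meeting", "Approaching", "Meeting",
               "Not Meeting", "Meeting", "Exceeding", "Meeting", "Approaching"]
    rater_b = ["Meeting", "Exceeding", "Approaching", "Approaching", "Meeting",
               "Not Meeting", "Meeting", "Meeting", "Meeting", "Approaching"]

    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    agreement = matches / len(rater_a)
    print(f"Exact agreement: {agreement:.0%}")  # 80% on this sample

    # Disagreements are worth discussing before the scores go into the spreadsheet:
    for i, (a, b) in enumerate(zip(rater_a, rater_b), start=1):
        if a != b:
            print(f"Artifact {i}: rater A says {a!r}, rater B says {b!r}")

A program would agree in advance on the agreement threshold that counts as "reliable" and on how disagreements are resolved (e.g., discussion or a third rater).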

8 Value of Assessment Data Will Be Maximized If:
- A variety of assessment measures is used: quantitative and qualitative, course-embedded and stand-alone measures, benchmarking as available
- For each SLO, data are collected from multiple courses/course sections
- Criteria for scoring and determining what meets standards are consistent across courses
Do measures have to be the SAME? No, but…

9 And Perhaps Most Important: There's no point in collecting useless data!

10 Compiling and Reporting Assessment Data Across Courses: Aggregating and Documenting Your Assessment Results

11 Approaching the Task Systematically
1. Start with your curriculum map, which depicts:
   - The courses in which you're covering – and assessing – SLOs
   - And, depending on the detail of your map, the levels at which SLOs are being addressed and the measures being used to assess them
2. Select courses – and faculty – to be included in the assessment, communicate that expectation clearly, and establish a time frame for administering the assessment and submitting results

12 Approaching the Task Systematically (cont.)
3. Collect and compile data (categorized as agreed upon by the department) on a spreadsheet for review
4. Review results collectively, reach conclusions regarding strengths and weaknesses, and make recommendations for change as appropriate
Be sure to document all of this (i.e., write it down)

13 A Step-by-Step Example

14 1. Following Your Curriculum Map
The Simplest Case: Which Courses Cover Which SLOs? To What Extent Do Courses Cover SLOs?

15 1. Following Your Curriculum Map
At What Level Do Courses Cover SLOs? How Do Courses Assess SLOs?

16 2. Selecting Courses to Be Included in Assessment
Do all courses covering a particular SLO need to be included?
- Ideally, yes; remember, each SLO is only assessed every three years
- Especially important if SLOs are being assessed at different levels (in terms of proficiency and/or course levels)
Do all course sections covering a particular SLO need to be included?
- No, but why not? If not, it is important that:
  - An adequate number of students is included in the assessment
  - Sections are selected randomly (see the sketch below)
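For the "sections are selected randomly" point, here is a minimal sketch of one way to draw sections at random until an adequate student count is reached. The section names, enrollments, and 40-student threshold are invented for illustration; each program would set its own.

    import random

    # Hypothetical section roster: section -> enrollment (names and numbers invented).
    sections = {"GER 100-01": 25, "GER 100-02": 22, "GER 150-01": 24, "GER 175-01": 12}
    MIN_STUDENTS = 40  # example a priori threshold for an adequate sample

    random.seed(2012)  # fixed seed so the selection can be documented and reproduced
    order = random.sample(list(sections), k=len(sections))  # all sections, random order

    picked, total = [], 0
    for sec in order:
        picked.append(sec)
        total += sections[sec]
        if total >= MIN_STUDENTS:  # stop once enough students are covered
            break

    print(f"Selected sections: {picked} ({total} students)")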

17 2. Selecting Courses to Be Included in Assessment (cont.)
Once courses are selected, communicate this fact quickly and clearly to faculty, being sure to emphasize:
- SLOs to be assessed and categories to be used in reporting results
- Mapping of assignments – and eventually assessment results – to specific SLOs, and the need for a priori criteria in categorizing student performance (see the sketch below)
- Timeline for submitting results (late enough in the semester for students to demonstrate competence, but early enough for data to be compiled)
Must assignments and a priori criteria be the same? No, but the more comparable they are, the more meaningful the results. Ultimately, it's a programmatic decision.
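As one illustration of a priori criteria, a sketch that fixes the category cut points before any scoring happens. The cut points shown are the test-score splits (90/80/70) used in the sample charts later in this deck; an individual program would agree on its own.

    def categorize(test_pct: float) -> str:
        """Map a test percentage to a reporting category using cut points
        agreed upon in advance (here, the 90/80/70 splits from the sample charts)."""
        if test_pct >= 90:
            return "Exceeding"
        if test_pct >= 80:
            return "Meeting"
        if test_pct >= 70:
            return "Approaching"
        return "Not Meeting"

    # Example: categorize a handful of hypothetical test scores
    for score in (95, 84, 71, 58):
        print(score, "->", categorize(score))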

18 A Few More Hints About Assignments
- Final grades are NEVER an adequate measure for a specific SLO
- Overall assignment grades are rarely – if ever – an adequate measure for a specific SLO
- Often only minor adjustments are needed in existing assignments to gather information on specific SLOs:
  - Limit assignments to one SLO
  - Break assignments into subparts, each of which addresses a specific SLO
  - Identify test questions that map to specific SLOs

19 3. Compiling Data for Review and Discussion
- It is important for all results to be submitted in an agreed-upon format, organized, and maintained in a single place (an Excel spreadsheet is perfect)
- There are different ways to organize results for review and deliberation:
  - Best to start with all data for each SLO, showing each course and its overall results for student performance organized by performance categories
  - The more you can aggregate, the better (i.e., by course, course level, level of proficiency), being attentive to variations within aggregating categories (see the sketch after the sample spreadsheet)
  - Importance of removing identifiers

20 Sample Spreadsheet (for Internal Review by Program Faculty)
All values after the student count are percentages; sub-total and total rows average the percentages across courses (unweighted).

Course #            | # Students | Exceeding | Meeting | Approaching | Not Meeting
GER 100             | 25         | 24        | 56      | 12          | 8
GER 100             | 22         | 95        | 0       | 5           | 0
Sub-Total - GER 100 | 47         | 59.5      | 28      | 8.5         | 4
GER 150             | 24         | 76        | 14      | 10          | 0
GER 175             | 12         | 24        | 42      | 27          | 7
Sub-Total 100-level | 83         | 54.75     | 28      | 13.5        | 3.75
GER 225             | 47         | 53.2      | 29.8    | 12.8        | 4.3
GER 275             | 19         | 15.7      | 73.7    | 5.3         | 5.3
GER 280             | 23         | 61        | 22      | 13          | 4
Sub-Total 200-level | 89         | 43.3      | 41.83   | 10.37       | 4.53
GER 330             | 28         | 21        | 50      | 25          | 4
GER 335             | 28         | 39        | 29      | 27          | 5
GER 350             | 17         | 24        | 46      | 29          | 0
Sub-Total 300-level | 73         | 28        | 41.67   | 27          | 3
GER 481             | 9          | 33        | 66      | 0           | 0
GER 481             | 5          | 0         | 100     | 0           | 0
GER 481             | 21         | 24        | 43      | 33          | 0
Sub-Total 400-level | 35         | 19        | 69.67   | 11          | 0
Total               | 280        | 37.68     | 43.96   | 15.32       | 2.89
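A minimal pandas sketch, assuming the per-course percentages are kept in a DataFrame like the 100-level block above, showing how the sub-total rows can be reproduced: student counts are summed, while the percentage columns are averaged without enrollment weighting, which is how the sub-totals in the sample work out.

    import pandas as pd

    # The 100-level block from the sample spreadsheet above.
    df = pd.DataFrame(
        [("GER 100", "100-level", 25, 24.0, 56.0, 12.0, 8.0),
         ("GER 100", "100-level", 22, 95.0, 0.0, 5.0, 0.0),
         ("GER 150", "100-level", 24, 76.0, 14.0, 10.0, 0.0),
         ("GER 175", "100-level", 12, 24.0, 42.0, 27.0, 7.0)],
        columns=["course", "level", "students",
                 "exceeding", "meeting", "approaching", "not_meeting"],
    )

    # Sum the head count; average the percentage columns across courses.
    subtotal = df.groupby("level").agg(
        students=("students", "sum"),
        exceeding=("exceeding", "mean"),
        meeting=("meeting", "mean"),
        approaching=("approaching", "mean"),
        not_meeting=("not_meeting", "mean"),
    )
    print(subtotal)  # 83 students; 54.75 / 28.0 / 13.5 / 3.75, matching the 100-level row

An enrollment-weighted average is the other defensible choice here (it gives large sections more influence); either way, the program should agree on the method in advance and apply it consistently.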

21 4. Reviewing and Discussing Results and Closing the Loop
Important points of discussion:
- How are the data consistent? Do students at different course levels perform similarly? (Eventually, it will be possible to look at this issue over time.)
- How are they distinctive? Do students perform better on some objectives than others?
- Are assessment results acceptable to the program's faculty?
- What strengths (and weaknesses) are revealed, and what explains them?
- What should (and will) the program do to improve areas of weakness?

22 Incorporating Assessment Data into the Annual Report
At present, directions for the Annual Report include the following:
"Descriptions of student learning outcome assessments conducted for your students during the year. These descriptions should include a summary of the assessment methods that were used and the results, including benchmarking as appropriate. Also, whenever possible, detail course-specific student learning outcomes and their relationship(s) to departmental or programmatic expectations."

23 Incorporating Assessment Data into the Annual Report (cont.)
APAC's recommendation to the Provost and Deans is that Annual Reports include two distinct components with respect to the assessment of student learning, effective June 2012:
- A chart that summarizes the assessments that were done during the academic year and their results
- A narrative that summarizes the program's discussion of the assessment results, overall conclusions, and planned changes based on the assessment data
Programs may also report other relevant data (e.g., results from senior/alumni surveys).

24 Sample Outcomes Assessment Chart – Aggregate Data, Simplest Case (and All That's Required)

Courses | Assessment Measure(s) | Performance Criteria | # of Students | % Exceeding Standards | % Meeting Standards | % Approaching Standards | % Not Meeting Standards
All | Portfolio | 4-5=Exceeding; 3=Meeting; 2=Approaching; 1=Not Meeting | 338 | 18% | 59% | 13% | 10%
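A sketch of how a chart row like the one above can be produced from raw results, assuming portfolio scores on the 1-5 scale and the a priori criteria shown in the chart; the score list itself is made up for illustration.

    from collections import Counter

    # Hypothetical portfolio scores (1-5 scale) for students assessed on one SLO.
    scores = [5, 4, 3, 3, 2, 4, 3, 1, 3, 5, 3, 2, 4, 3, 3, 4, 2, 3, 5, 3]

    # A priori criteria from the chart: 4-5 = Exceeding, 3 = Meeting,
    # 2 = Approaching, 1 = Not Meeting.
    LABELS = {5: "Exceeding", 4: "Exceeding", 3: "Meeting",
              2: "Approaching", 1: "Not Meeting"}

    counts = Counter(LABELS[s] for s in scores)
    n = len(scores)
    print(f"# of Students: {n}")
    for category in ("Exceeding", "Meeting", "Approaching", "Not Meeting"):
        print(f"% {category} Standards: {100 * counts[category] / n:.0f}%")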

25 Sample Outcomes Assessment Charts – More Detailed, By Course Level

Courses | Assessment Measure(s) | Performance Criteria | # of Students | % Exceeding Standards | % Meeting Standards | % Approaching Standards | % Not Meeting Standards
100-Level | Tests | 90%+=Exceeding; 80-89%=Meeting; 70-79%=Approaching; 69% and below=Not Meeting | 203 | 18% | 59% | 13% | 10%
200-Level | Papers | 4=Exceeding; 3=Meeting; 2=Approaching; 1=Not Meeting | 78 | 18% | 59% | 13% | 10%
200-Level | Tests | 90%+=Exceeding; 80-89%=Meeting; 70-79%=Approaching; 69% and below=Not Meeting | 112 | 18% | 59% | 13% | 10%
300-Level | Portfolios | 4-5=Exceeding; 3=Meeting; 2=Approaching; 1=Not Meeting | 96 | 18% | 59% | 13% | 10%

26 Sample Outcomes Assessment Charts – More Detailed, By Competency Level

Courses | Assessment Measure(s) | Performance Criteria | # of Students | % Exceeding Standards | % Meeting Standards | % Approaching Standards | % Not Meeting Standards
Beginning | Tests | 90%+=Exceeding; 80-89%=Meeting; 70-79%=Approaching; 69% and below=Not Meeting | 203 | 18% | 59% | 13% | 10%
Intermediate | Papers | 4=Exceeding; 3=Meeting; 2=Approaching; 1=Not Meeting | 78 | 9% | 61% | 20% | 10%
Intermediate | Tests | 90%+=Exceeding; 80-89%=Meeting; 70-79%=Approaching; 69% and below=Not Meeting | 112 | 11% | 54% | 25% | 10%
Advanced | Portfolios | 4-5=Exceeding; 3=Meeting; 2=Approaching; 1=Not Meeting | 96 | 15% | 54% | 18% | 13%

27 Issues to Be Addressed in the Narrative
- Overall conclusions about student performance on the SLO as revealed by the assessment
- Description of strengths and weaknesses revealed by the data
- Planned revisions for improvement, as appropriate
- Planned revisions to the assessment process itself, as appropriate

28 APAC Members
Paul French, Josh Hammonds, Michael Koch, Julie Licata, Richard Lee, Patrice Macaluso, Anuradhaa Shastri, Bill Wilkerson (Chair), Patty Francis (ex officio)

