
1 Assessment Reports. YSU Office of Assessment, October 9 & 10, 2012.

2 Goals of Workshop
 Current assessment context
 Assessment report evaluation process
 Overview of new Higher Learning Commission Criteria
 Review key items on the assessment report template
 Walk through the new online reporting form

3 Assessment Context

4 Accreditation Context
 Last year participating in the Higher Learning Commission's (HLC) Academy for the Assessment of Student Learning
 New HLC Criteria effective January 1, 2013
Quality of Assessment Processes
 Strengths
 Excellent participation: LO review (100%), curriculum maps (95%)
 Quality of and participation in the reporting process are improving over time, but 100% participation is needed
 Challenges
 Continuous collection of data (no years off!)
 Stepping back for the big picture: what is the impact on learning?

5 Academic Plan and Report Quality

6 Assessment Process
Review Process
 Focused rubric, with emphasis on assessment priorities
 Team of 2 reviewers:
 Assessment Council member
 Assessment volunteer (a good service opportunity)
 Final review by the Director
 Feedback via e-mail and/or meeting:
 Strengths of the plan or report
 Suggestions for next year
 Revisions, if needed
 Quality levels:
 Exemplary, Proficient: high quality
 Progressing: developing expertise
 Inadequate: revision requested

7 Assessment Process, cont.
Assessment Reporting Priorities
 Focus on use of data
 Reflect on changes and their impact on learning
 Continuous data collection (i.e., every year!)
 Streamlined reporting: focus on process vitality, not the form
What's New This Year
 Online reporting
 Fewer questions
 Rubrics focused on priority areas
Future Goals
 Longer reporting cycle
 Possible integration with program review

8 Keeping a Student Learning Archive
Accreditation Archives
 Departments need to keep a student learning archive for 10 years
 Plan and report submissions are kept in the OOA for 10 years
Archive Examples
 Summaries of data on student learning
 Representative examples of student work at different performance levels
 Criteria for evaluating student work (e.g., rubrics)
 Assessment plans and reports
 Newsletters
 Website screenshots
 Meeting minutes on assessment

9 HLC New Criteria for Accreditation
New Criteria at:
Guiding Values include:
 1. Focus on student learning
 4. Culture of continuous improvement
 5. Evidence-based institutional learning and self-presentation
 9. Mission-centered evaluation
The Five Criteria
 1. Mission
 2. Integrity: Ethical and Responsible Conduct
 3. Teaching and Learning: Quality, Resources, and Support
 4. Teaching and Learning: Evaluation and Improvement
 5. Resources, Planning, and Institutional Effectiveness

10 New HLC Criteria Relevant to Practice
4.B. The institution demonstrates a commitment to educational achievement and improvement through ongoing assessment of student learning.
 1. The institution has clearly stated goals for student learning and effective processes for assessment of student learning and achievement of learning goals.
 2. The institution assesses achievement of the learning outcomes that it claims for its curricular and co-curricular programs.
 3. The institution uses the information gained from assessment to improve student learning.
 4. The institution's processes and methodologies to assess student learning reflect good practice, including the substantial participation of faculty and other instructional staff members.

11 New HLC Criteria Relevant to Practice, cont.
 3.A.2: The institution articulates and differentiates learning goals for its undergraduate, graduate, post-baccalaureate, post-graduate, and certificate programs.
 3.A.3: The institution's program quality and learning goals are consistent across all modes of delivery and all locations (on the main campus, at additional locations, by distance delivery, as dual credit, through contractual or consortial arrangements, or any other modality).
 5.C.2: The institution links its processes for assessment of student learning, evaluation of operations, planning, and budgeting.

12 Upcoming Assessment Workshops
 Developing an Assessment Plan: Wednesday, October 10th, 1-2 pm
 Completing the Assessment Report: Wednesday, October 10th, am
Note: the workshops/forms overlap.

13 Assessment Plans vs. Reports
Assessment Plans:
 Plans and methods to cover all SLOs in a 3-4 year cycle
 Criteria for at least the 1st year
 Plans for sharing results with major stakeholders
Assessment Reports:
 Data from the previous year
 Two methods and a data summary for two SLOs
 Analysis of student learning for strengths and challenges
 Action steps based on data
 Sharing of data and results
Both Plans and Reports:
 Engagement of faculty
 Impact on learning from previous action steps

14 Completing the Assessment Report Form: Everything You Need to Know but Didn't Want to Ask

15 Assessment Templates
 Section 1: Identifying and Contact Information
 Section 2: Outside Accreditation
 Section 3: Assessment and Evaluation of Student Learning Outcomes
 Section 4: Use of Data

16 Sections 1 & 2: Identifying/Accreditation Information

17 Note the difference between degree, program, and track:

Degree Program     | Degree Level | Program Tracks
Political Science  | BA           | General; Foreign Affairs; Public Management
Teacher Education  | BSEd         | Early Childhood; Middle Childhood; Secondary; Career/Technical; Multi-age
Biology            | BA           | n/a
Biology            | BS           | n/a
Medical Coding     | Certificate  | n/a

Save time and fill the form out online only!

18 Section 3: Assessment and Evaluation of Student Learning Outcomes

19 Section 3: When SLOs Overlap
Year One: the SLOs are the same for both the B.A. and B.S. in Physics this year, so the department can turn in ONE Section Three table that covers both programs:
 1. Students will learn to model physical systems and interpret experimental and theoretical results.
 2. Students will learn how to measure the physical properties of systems using a variety of test equipment and defend the results of their measurements using the associated accuracy and precision of these measurements.
Year Two: the SLOs are NOT the same for both programs this year, so the department must turn in TWO Section Three tables to cover both the B.A. and the B.S.:
 B.A.: Students will learn to apply the concepts of Classical Physics, Modern Physics, Thermodynamics, and Electrostatics to solve problems and predict numerical results.
 B.S.: In addition to the learning outcomes for the B.A. program in Physics, students of the B.S. program in Physics will further learn to apply the concepts of Electrodynamics and Quantum Mechanics to solve problems and predict numerical results.

20 Section 3: Learning Outcomes
1. State the student learning outcome assessed:
 One per column, for a total of two
 The OOA considers the quality of the SLO, but recognizes limitations such as accreditation restrictions
 For more on revising learning outcomes, see the assessment plan workshop on the OOA website

21 Section 3: Methods
2. What methods/measures did you use to assess student learning?
 Provide two methods for each SLO
 One must be a direct method
 For each method, include/attach:
 Where it was administered (e.g., capstone)
 Performance criteria (e.g., rubrics)
 The same method can span multiple SLOs (e.g., you could use the same method for both SLOs)

22 Assessment Method Definitions
 Direct Measure: provides for the direct examination or observation of [staff, faculty, student] knowledge or skills against measurable learning outcomes. Examples: in-class/embedded assignments, oral presentations, performance appraisals, internship supervisors' evaluations, behavioral observations, etc.
 Indirect Measure: ascertains the opinion or self-report of the extent or value of learning experiences. Examples: written surveys, exit and other interviews, archival records, focus groups, etc.
 Performance Criteria: specific, measurable statements identifying the performance(s) required to meet the outcome; confirmable through evidence. Examples: standards, rubrics*, specifications, outcomes, metrics, objectives.
*For examples of rubrics, see the AAC&U VALUE Rubric Project:

23 Section 3: Data Summary
3. What were the data resulting from these methods? What were your results?
Include:
 Number of students
 Aggregate/group data patterns
Note: data patterns may or may not point to conclusions.

24 Section 3: Data Summary, cont.
Good Example: Of 24 students who were evaluated by site supervisors while enrolled in Internship in , the mean evaluation score for Critical Thinker (knowledge about cultural differences and diversity) was 3.71/4.00; the mean evaluation score for Organize Outreach to Low-Income Families was 3.67/4.00. Open-ended comments by 10 site supervisors revealed 2 themes: students demonstrate professionalism in their work, and students seek more hands-on experience in working with inner-city youth.
Weak Example: Students demonstrated that their knowledge of subject X had increased considerably, and they were able to develop a greater appreciation for different populations.

25 Section 3: Successes and Challenges
4. What successes and challenges do you see in students' learning as a result of these assessments?
 At least one strength and one challenge
 Not about pedagogy or curriculum
 Looks at data patterns (with contextual knowledge) to reach conclusions about students' learning

26 Section 3: Successes and Challenges, cont.
Good Example: Successes: there is consistent evidence that students are mastering the requisite knowledge, skills, and professionalism to work effectively with diverse populations. The challenge is that a small number of students and graduates indicate that faculty may need to help all students acquire a deeper understanding of, and hands-on experience in, working with diverse populations such as inner-city youth, and they seek practical knowledge for promoting social justice.
Weak Example: Students prove in their writing, via embedded essay questions, that they have learned more since they enrolled in the course.

27 Section 3: Action Steps
5. How did you use the data? E.g., what recommendations and action steps for the program have resulted from reviewing the data, and where is the department in this process?
 Any decisions should be based on data
 The data may not indicate the need for change; that is fine, just explain your conclusion
 Pedagogy or curriculum impact or issues could be included here

28 Section 3: Action Steps, cont.
Good Example Action Steps:
 Develop a new course that integrates human development processes, poverty, ethics, and family issues to promote a deeper understanding of social justice and diversity issues.
 Bridges Out of Poverty curricular materials will be introduced to students in Course X.
 We will look at the data over time; since this data set involved a small number of majors, it is too early to make changes.
Weak Example Action Steps:
 Increase computers, lab space, and faculty.

29 Section 4: Use of Data

30 Section 4: Sharing Results
6. How are you sharing the results of the data discussed in Section Three with your students, your college, and other stakeholders?
 Include both internal and external stakeholders
Examples:
 Students' review of aggregate data
 College-wide assessment committees
 Discussion in advisory group meetings
 Sharing with foundational subject departments (e.g., the Engineering Dept. shares findings with the Mathematics Dept.)

31 Section 4: SLOs and Curriculum Maps
7. How did the assessment activities in (i.e., reviewing learning outcomes and completing curriculum maps) impact your program?
 There is no correct answer; just the experience of departments
Questions to consider:
 Did you streamline learning outcomes?
 Did the activities foster faculty discussion?
 Were gaps in learning or assessment practices uncovered?
 Did you find more efficient ways to collect data?

32 Section 4: Impact on Learning
8. In the past several years (e.g., ), you have analyzed data and identified action steps for learning outcomes. Considering action steps from previous years, what has been the impact on student learning as a result of (one of) those action steps?
 Refer to past assessment reports ( )
 Focus on how the action step impacted student learning
 Specific supporting data are not needed at this stage; professional judgment is enough

33 Section 4: Engaging Faculty
9. How is your department working to engage all faculty in the assessment process?
 All department faculty should meet at least once per year to discuss assessment results and decide on action steps
 Assessment is a collective responsibility, not just one person's job

34 Section 4: Additional Information
10. Optional: Is there anything else you would like to share regarding your assessment report, and/or is there any particular area on which you would like assistance or feedback?
 Is there more to "the story" than is reflected in the report?
 Is there something the Office of Assessment or Assessment Council can assist you with?
Examples:
 Involving students in the review of data
 Increasing faculty participation

35 Template and submission link: web.ysu.edu/assessment/templates

36 New Online Reporting Form
Online Assessment Report Submission Form:
Note: if you have new or revised undergraduate learning outcomes, they should also be sent to Jean Engle at

37 To view assessment plan or report forms and scoring rubrics, as well as this presentation, visit:
Contact info: Hillary Fuhrman, x2453; Office of Assessment, x2014
Thank you for your participation!

