1 Developing Academic Program Assessment Plans
UAA Faculty Senate Academic Assessment Committee

2 What is an assessment plan?
– Start with your program student learning outcomes (this is what needs to be assessed)
– Decide which evaluation tools you want to use for each outcome and how often you will collect the data (this is your assessment plan)
– Collect the data, then get together and figure out what the results mean and what you want to do about them (this is your assessment)

3 Good assessment plan characteristics
– Is your process systematic (as opposed to ad hoc)?
– Is it sustainable? Can it continue to run if the person in charge leaves?
– Is it robust, or is it met with faculty apathy?

4 Where to start
Program goals/mission statement
Student learning outcomes answer the question "what should students be able to do upon completion of your program?"
– SLOs relate to the knowledge and skills that students acquire as they progress through your program
– If you are externally accredited, these have probably been given to you

5 Ways of gathering assessment data
Formative vs. summative
– Formative: undertaken while student learning is taking place; its purpose is to improve teaching and learning; designed to capture students' progress
– Summative: obtained at the end of a course or program; its purpose is to document student learning; designed to capture students' achievement at the end of their program of study
Direct vs. indirect
– Direct: evidence of student learning that is tangible, visible, and self-explanatory (examples: performances, creations, results of research, responses to questions or prompts)
– Indirect: evidence that provides signs that students are learning, but the evidence of exactly what they are learning is less clear and convincing (examples: student satisfaction surveys, alumni surveys)

6 Direct and indirect assessment
Direct assessment methods
– Published/standardized tests
– Locally developed tests
– Embedded assignments and course activities
– Competence interviews/practica
– Portfolios
Indirect assessment methods
– Surveys
– Interviews
– Focus groups
– Reflective essays

7 Commonly used tools at UAA
– Embedded course-level assessment
– Standardized tests (if your discipline has one)
– Alumni/employer surveys
– Professional portfolios/e-portfolios
– Field instructor assessments/practical examinations

8 When forming your plan
– Capitalize on what you are already doing
– More data are not necessarily better: do not try to assess every course every semester, and it is generally not considered good practice to try to assess every outcome every year
– Don't wait for perfection: it generally takes 2–3 full assessment cycles to get your process nailed down

9 When and how to collect data
– Student learning is cumulative over time: what students learn in one course, they use, practice, and develop in other courses
– We are not interested in assessing individual students, faculty, or courses; we are evaluating programs
– The focus of data collection in program assessment should be on the cumulative effect of student learning
– With this in mind, you can determine when to collect data and how often, from whom to collect data, and how to interpret results
Source: ABET Advanced Program Assessment Workshop

10 Course-level assessment
– Course-embedded assessments look at actual work produced by students in our courses
– May be separate from graded work in the course (but often are not)
– The purpose of this assessment is to assess the particular learning outcome, not to grade the student (although this work can contribute to the student's grade)
– May evaluate the student by assigning a grade, but each student can be additionally evaluated for the purpose of assessing the outcome


12 Performance indicators
– Not required, but considered a best practice
– PIs are specific, measurable statements identifying the student performance(s) required to meet the SLO, confirmable through evidence
– Three characteristics of good PIs:
 – Subject content that is the focus of instruction
 – One action verb (indicates level, e.g. Bloom's taxonomy)
 – Value free (don't use descriptors like "few" or "many"); we can add value by creating rubrics
– Rule of thumb: each SLO should have no fewer than 2 PIs and no more than 4

13 Example: writing PIs
SLO: an ability to communicate effectively
– Communicates information in a logical, well-organized manner
– Uses graphics effectively to illustrate concepts
– Presents material that is factually correct, supported with evidence, explained in sufficient detail, and properly documented
– Listens and responds appropriately to questions (for oral communication)
Source: UAA ME Department

14 Creating rubrics for PIs
Outcome g: an ability to communicate effectively

1. Communicates information in a logical, well-organized manner
– Poor: Communication is particularly poorly organized, or grammar and usage are particularly poor
– Developing: Organization of communication is limited
– Satisfactory: Communicates information in a way that is satisfactorily well-organized
– Excellent: Communicates information in an exceptionally well-organized manner

2. Uses graphics effectively to illustrate concepts
– Poor: Does not attempt to clarify ideas with graphics, or graphics are inappropriate to the idea being expressed
– Developing: Limited attempts to clarify ideas with graphics, or graphics are of limited effectiveness
– Satisfactory: Makes satisfactory use of graphics to illustrate concepts
– Excellent: Makes exceptional use of graphics to illustrate concepts

3. Presents material that is factually correct, supported with evidence, explained in sufficient detail and properly documented
– Poor: Much of the material presented is factually incorrect, poorly supported, and/or documented incorrectly
– Developing: Some of the material presented is factually incorrect, poorly supported, and/or documented incorrectly
– Satisfactory: Factually correct material is satisfactorily supported with evidence, explained in sufficient detail, and properly documented
– Excellent: Factually correct material is supported with an exceptional amount of evidence or explained particularly well

4. Listens and responds appropriately to questions (for oral communication)
– Poor: Does not respond to questions appropriately or does not listen to questions
– Developing: Makes limited attempts to respond to questions
– Satisfactory: Provides satisfactory responses to questions
– Excellent: Provides exceptional responses to questions

Source: UAA ME Department

15 Streamlining the process
For each course (semester, year), record the course outcome assessed, the criterion, and the evidence
Evidence should be: not too much, substantive, rubric-based, and reported as counts (not means)
Source: James Allert, Department of Computer Science, University of Minnesota Duluth

16 Example: completed course-level rubric
Course title: ME 313; Instructor: Brock; Number of students: 24; Semester: Spring 2012
Outcome e: an ability to identify, formulate, and solve engineering problems

1. Identifies relevant known and unknown factors
– Poor: Does not demonstrate understanding of known and unknown factors
– Developing: Demonstrates limited understanding of known and unknown factors
– Satisfactory: Identifies expected known and unknown factors
– Excellent: Demonstrates exceptional insight in identifying known and unknown factors

2. Provides appropriate analysis of elements of the solution
– Poor: Is unable to provide analysis of the problem
– Developing: Provides limited analysis of the problem
– Satisfactory: Provides satisfactory analysis of the problem
– Excellent: Provides analysis of the problem which exceeds expectations

3. Assesses the validity of the solution based on mathematical or engineering insight
– Poor: Makes no attempt to validate the solution, or validation method is completely incorrect
– Developing: Makes limited attempts to validate the solution
– Satisfactory: Assesses the validity of the solution using an appropriate technique
– Excellent: Uses multiple techniques to assess validity of solution

Number of students achieving each level (assessment method: project):
– PI 1: Poor 1, Developing 2, Satisfactory 4, Excellent 5 (75% scoring 3 or 4)
– PI 2: Poor 0, Developing 3, Satisfactory 3, Excellent 6 (75% scoring 3 or 4)
– PI 3: Poor 0, Developing 5, Satisfactory 6, Excellent 1 (58% scoring 3 or 4)

Direct assessment action: Students were assigned one of three design problems in which they were asked to optimize a thermodynamic cycle for either refrigeration or power generation. They worked in groups of two, and their project reports were assessed.
Comments and proposed improvement:
Source: UAA ME Department
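The "counts, not means" reporting above reduces to simple arithmetic: for each performance indicator, count the assessed work at each rubric level and report the share scoring Satisfactory (3) or Excellent (4). A minimal sketch, using the counts from the ME 313 example (24 students in pairs, so 12 project reports; the function name is illustrative, not from the source):

```python
def percent_meeting(counts):
    """Percent of assessed work scoring Satisfactory (3) or Excellent (4).

    counts: list of four ints, ordered Poor(1), Developing(2),
    Satisfactory(3), Excellent(4).
    """
    total = sum(counts)
    meeting = counts[2] + counts[3]  # Satisfactory + Excellent
    return round(100 * meeting / total)

# Counts per performance indicator from the ME 313 example above
pi_counts = {
    "PI 1": [1, 2, 4, 5],
    "PI 2": [0, 3, 3, 6],
    "PI 3": [0, 5, 6, 1],
}

for pi, counts in pi_counts.items():
    print(pi, percent_meeting(counts))  # 75, 75, 58
```

Keeping raw counts (rather than averaging rubric scores) lets later readers recompute any threshold they like, which is the point of the "counts (not means)" guidance on the previous slide.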

17 Example results
Direct assessment measures:
– ME A414: project report, PI 1: 100%; project presentation, PI 2: 88%; project report, PI 3: 100%; project presentation, PI 4: 88%
– ME A441: lab report, PI 1: 54%; lab report, PI 2: 100%; lab report, PI 3: 38%
– ES A341L: lab reports, PIs 1, 2, 3: 100% each
– ME A438: final presentation, PIs 1–4: 94%; final report, PI 1: 100%; final report, PI 2: 82%; final report, PI 3: 100%
Indirect assessment measures:
– Senior exit survey: 100%
Overall attainment level (80/20 weight factor for direct vs. indirect measures): 91%
Source: UAA ME Department
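The 80/20 roll-up above can be reproduced with a short calculation. This is an illustrative reconstruction, not the department's actual procedure: the deck does not state how the individual direct measures are aggregated, so this sketch assumes an unweighted average of the direct percentages before applying the 80/20 direct/indirect weighting.

```python
# Direct-measure attainment percentages from the slide, grouped by course
direct = [100, 88, 100, 88,   # ME A414
          54, 100, 38,        # ME A441
          100, 100, 100,      # ES A341L
          94, 100, 82, 100]   # ME A438
indirect = [100]              # senior exit survey

direct_avg = sum(direct) / len(direct)          # simple average (assumption)
indirect_avg = sum(indirect) / len(indirect)

# 80/20 weight factor for direct vs. indirect measures
overall = 0.8 * direct_avg + 0.2 * indirect_avg
print(round(overall))  # 91
```

Under this assumption the direct average is about 88.9%, and the weighted overall comes out to 91%, matching the figure reported on the slide.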

18 Example results (chart)
Source: UAA ME Department

19 Example results
Attainment by outcome, combining measures from the CLA (direct and indirect), ME A438 Capstone Design, Senior Exit Survey, and FE Exam:
(a) an ability to apply knowledge of mathematics, science and engineering: 66%, 100%; overall 77%
(b) an ability to design and conduct experiments, as well as analyze and interpret data: 76%, 91%, 100%; overall 82%
(c) an ability to design a system, component, or process to meet desired needs within realistic constraints such as economic, environmental, social, political, ethical, health and safety, manufacturability, and sustainability: 75%, 88%, 78%; overall 82%
(d) an ability to function on multi-disciplinary teams: 90%, 100%; overall 92%
(e) an ability to identify, formulate, and solve engineering problems: 67%, 89%, 100%; overall 81%
(f) an understanding of professional and ethical responsibilities: 72%, 80%, 100%; overall 79%
(g) an ability to communicate effectively: 87%, 94%, 100%; overall 91%
(h) the broad education necessary to understand the impact of engineering solutions in a global, economic, environmental and societal context: 85%, 100%; overall 88%
(i) a recognition of the need for, and the ability to engage in, life-long learning: 59%, 100%; overall 67%
(j) a knowledge of contemporary issues: 75%, 100%; overall 80%
(k) an ability to use the techniques, skills, and modern engineering tools necessary for engineering practice: 87%, 93%; overall 89%
Source: UAA ME Department

20 Available resources
Upcoming workshops in the assessment series:
– Norming Your Academic Assessment Rubrics: Friday, March 20, 10:30–11:30 am, RH 303
– ePortfolios and Academic Assessment: Friday, April 3, 10:30–11:30 am, RH 303
UAA Academic Assessment Committee webpage: c_assessment_committee/index.cfm
UConn Assessment Primer

