Developing Academic Program Assessment Plans
UAA Faculty Senate Academic Assessment Committee



What is an assessment plan?
Start with your program student learning outcomes (this is what needs to be assessed)
Decide what evaluation tools you want to use for which outcome and how often you'll collect the data (this is your assessment plan)
Collect the data, then get together and figure out what the results mean and what you want to do about it (this is your assessment)

Good assessment plan characteristics
Is your process systematic (as opposed to ad hoc)?
Is it sustainable? Can it continue to run if the person in charge leaves?
Is it robust, or is it met with faculty apathy?

Where to start
Program goals/mission statement
Student learning outcomes answer the question "what should students be able to do upon completion of your program?"
– SLOs relate to the knowledge and skills that students acquire as they progress through your program
– If you are externally accredited, these have probably been given to you

Ways of gathering assessment data
Formative vs. Summative
– Formative – undertaken while student learning is taking place; its purpose is to improve teaching and learning; designed to capture students' progress
– Summative – obtained at the end of a course or program; its purpose is to document student learning; designed to capture students' achievement at the end of their program of study
Direct vs. Indirect
– Direct – evidence of student learning which is tangible, visible, self-explanatory (examples: performances, creations, results of research, responses to questions or prompts)
– Indirect – evidence that provides signs that students are probably learning, but the evidence of exactly what they are learning is less clear and convincing (examples: student satisfaction surveys, alumni surveys)

Direct and indirect assessment
Direct assessment methods
– Published/standardized tests
– Locally-developed tests
– Embedded assignments and course activities
– Competence interviews/practica
– Portfolios
Indirect assessment methods
– Surveys
– Interviews
– Focus groups
– Reflective essays

Commonly used tools at UAA
Embedded course-level assessment
Standardized tests (if your discipline has one)
Alumni/employer surveys
Professional portfolios/e-portfolios
Field instructor assessments/practical examinations

When forming your plan
Capitalize on what you are already doing
More data are not necessarily better
– Do not try to assess every course every semester
– It is generally not considered good practice to try to assess every outcome every year
Don't wait for perfection – it generally takes 2-3 full assessment cycles to get your process nailed down

When and how to collect data
Student learning is cumulative over time
– What students learn in one course, they use, practice and develop in other courses
We are not interested in assessing individual students, faculty or courses; we are evaluating programs
The focus of data collection in program assessment should be on the cumulative effect of student learning
With this in mind, you can determine
– When to collect data, and how often
– From whom to collect data
– How to interpret results
Source: ABET Advanced Program Assessment Workshop

Course-level assessment
Course-embedded assessments that look at actual work produced by students in our courses
May be separate from graded work in the course (but often are not)
The purpose of this assessment is to assess the particular learning outcome, not the grade of the student (although this work can contribute to the grade of the student)
May evaluate the student by assigning a grade, but each student can be additionally evaluated for the purpose of assessing the outcome


Performance Indicators
Not required, but considered a best practice
PIs are specific, measurable statements identifying student performance(s) required to meet the SLO, confirmable through evidence
Three characteristics of good PIs
– Subject content that is the focus of instruction
– One action verb (indicates level, e.g. Bloom's taxonomy)
– Value free (don't use descriptors like "few" or "many") – we can add value by creating rubrics
Rule of thumb: each SLO should have no fewer than 2 PIs and no more than 4

Example: writing PIs
SLO: an ability to communicate effectively
– Communicates information in a logical, well-organized manner
– Uses graphics effectively to illustrate concepts
– Presents material that is factually correct, supported with evidence, explained in sufficient detail and properly documented
– Listens and responds appropriately to questions (for oral communication)
Source: UAA ME Department

Creating rubrics for PIs
Outcome g: an ability to communicate effectively

1. Communicates information in a logical, well-organized manner
– Poor: Communication is particularly poorly organized, or grammar and usage is particularly poor
– Developing: Organization of communication is limited
– Satisfactory: Communicates information in a way that is satisfactorily well-organized
– Excellent: Communicates information in an exceptionally well-organized manner
2. Uses graphics effectively to illustrate concepts
– Poor: Does not attempt to clarify ideas with graphics, or graphics are inappropriate to the idea being expressed
– Developing: Limited attempts to clarify ideas with graphics, or graphics are of limited effectiveness
– Satisfactory: Makes satisfactory use of graphics to illustrate concepts
– Excellent: Makes exceptional use of graphics to illustrate concepts
3. Presents material that is factually correct, supported with evidence, explained in sufficient detail and properly documented
– Poor: Much of the material presented is factually incorrect, poorly supported and/or documented incorrectly
– Developing: Some of the material presented is factually incorrect, poorly supported and/or documented incorrectly
– Satisfactory: Factually correct material is satisfactorily supported with evidence, explained in sufficient detail and properly documented
– Excellent: Factually correct material is supported with an exceptional amount of evidence or explained particularly well
4. Listens and responds appropriately to questions (for oral communication)
– Poor: Does not respond to questions appropriately or does not listen to questions
– Developing: Makes limited attempts to respond to questions
– Satisfactory: Provides satisfactory response to questions
– Excellent: Provides exceptional response to questions

Source: UAA ME Department

Streamlining the process
Record for each course: course, semester and year, course outcome, criterion, and evidence
Evidence should be:
– not too much
– substantive
– rubric-based
– counts (not means)
Source: James Allert, Department of Computer Science, University of Minnesota Duluth
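To make that record concrete, here is a minimal sketch in Python of what one streamlined, rubric-based assessment record might look like; the class and field names are hypothetical illustrations, not part of the UAA or UMD process.

```python
from dataclasses import dataclass, field
from typing import Dict

# Hypothetical record for one rubric-based, course-embedded assessment of a
# performance indicator. Evidence is stored as counts per rubric level (not means).
@dataclass
class PIAssessmentRecord:
    course: str                 # e.g. "ME 313"
    semester: str               # e.g. "Spring 2012"
    outcome: str                # program outcome being assessed, e.g. "e"
    performance_indicator: int  # which PI of that outcome (1, 2, 3, ...)
    evidence: str               # e.g. "Project report"
    counts: Dict[str, int] = field(default_factory=dict)  # rubric level -> count

    def percent_satisfactory_or_better(self) -> float:
        """Share of assessed work scoring Satisfactory or Excellent."""
        total = sum(self.counts.values())
        achieved = self.counts.get("Satisfactory", 0) + self.counts.get("Excellent", 0)
        return 100.0 * achieved / total if total else 0.0
```

A department could then aggregate records like this by outcome when closing the loop each assessment cycle.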

Course title: ME 313    Instructor: Brock
Number of students: 24    Semester: Spring 2012
Outcome e: an ability to identify, formulate, and solve engineering problems

1. Identifies relevant known and unknown factors
– Poor: Does not demonstrate understanding of known and unknown factors
– Developing: Demonstrates limited understanding of known and unknown factors
– Satisfactory: Identifies expected known and unknown factors
– Excellent: Demonstrates exceptional insight in identifying known and unknown factors
2. Provides appropriate analysis of elements of the solution
– Poor: Is unable to provide analysis of the problem
– Developing: Provides limited analysis of the problem
– Satisfactory: Provides satisfactory analysis of the problem
– Excellent: Provides analysis of the problem which exceeds expectations
3. Assesses the validity of the solution based on mathematical or engineering insight
– Poor: Makes no attempt to validate the solution, or validation method is completely incorrect
– Developing: Makes limited attempts to validate the solution
– Satisfactory: Assesses the validity of the solution using an appropriate technique
– Excellent: Uses multiple techniques to assess validity of solution

Number of students achieving each level (assessment method: project)
– PI 1: Poor (1): 1, Developing (2): 2, Satisfactory (3): 4, Excellent (4): 5 – 75% of students scoring 3 or 4
– PI 2: Poor (1): 0, Developing (2): 3, Satisfactory (3): 3, Excellent (4): 6 – 75% of students scoring 3 or 4
– PI 3: Poor (1): 0, Developing (2): 5, Satisfactory (3): 6, Excellent (4): 1 – 58% of students scoring 3 or 4

Direct Assessment Action: Students were assigned one of three design problems where they were asked to optimize a thermodynamic cycle for either refrigeration or power generation. They worked in groups of two. Their project reports were assessed.
Comments and Proposed Improvement:
Source: UAA ME Department
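The "% of students scoring 3 or 4" figure is simply the number of Satisfactory and Excellent scores divided by the total number of assessed work products (the counts above sum to 12, consistent with 24 students working in pairs on one report each). A quick check in Python, using the counts from the table above:

```python
# Rubric counts from the ME 313 example: (Poor, Developing, Satisfactory, Excellent)
counts_by_pi = {
    1: (1, 2, 4, 5),
    2: (0, 3, 3, 6),
    3: (0, 5, 6, 1),
}

for pi, (poor, developing, satisfactory, excellent) in counts_by_pi.items():
    total = poor + developing + satisfactory + excellent
    pct = 100 * (satisfactory + excellent) / total
    print(f"PI {pi}: {pct:.0f}% scoring Satisfactory (3) or Excellent (4)")

# Prints 75%, 75%, and 58%, matching the table.
```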

Example results
Direct assessment measures:
– ME A414: Project report, PI 1 – 100%; Project presentation, PI 2 – 88%; Project report, PI 3 – 100%; Project presentation, PI 4 – 88%
– ME A441: Lab report, PI 1 – 54%; Lab report, PI 2 – 100%; Lab report, PI 3 – 38%
– ES A341L: Lab reports, PI 1 – 100%; Lab reports, PI 2 – 100%; Lab reports, PI 3 – 100%
– ME A438: Final presentation, PIs 1-4 – 94%; Final report, PI 1 – 100%; Final report, PI 2 – 82%; Final report, PI 3 – 100%
Indirect assessment measures:
– Senior exit survey – 100%
Overall attainment level (80/20 weight factor for direct vs. indirect measures): 91%
Source: UAA ME Department
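The overall figure combines the direct and indirect measures using the 80/20 weighting stated on the slide. Assuming the direct measures are simply averaged with equal weight (an assumption on my part; the slide does not spell out the aggregation), the 91% is reproduced by the following sketch:

```python
# Attainment levels from the table above
direct = [100, 88, 100, 88,   # ME A414
          54, 100, 38,        # ME A441
          100, 100, 100,      # ES A341L
          94, 100, 82, 100]   # ME A438
indirect = [100]              # senior exit survey

def mean(values):
    return sum(values) / len(values)

# 80/20 weighting of direct vs. indirect measures
overall = 0.8 * mean(direct) + 0.2 * mean(indirect)
print(f"Overall attainment: {overall:.0f}%")  # -> 91%
```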

Example results (figure)
Source: UAA ME Department

Example results
Measures: CLA (direct), CLA (indirect), ME A438 Capstone Design, Senior Exit Survey, FE Exam. Attainment levels are listed per outcome, with the overall attainment last.
(a) an ability to apply knowledge of mathematics, science and engineering: 66%, 100%; overall attainment 77%
(b) an ability to design and conduct experiments, as well as analyze and interpret data: 76%, 91%, 100%; overall attainment 82%
(c) an ability to design a system, component, or process to meet desired needs within realistic constraints such as economic, environmental, social, political, ethical, health and safety, manufacturability, and sustainability: 75%, 88%, 78%; overall attainment 82%
(d) an ability to function on multi-disciplinary teams: 90%, 100%; overall attainment 92%
(e) an ability to identify, formulate, and solve engineering problems: 67%, 89%, 100%; overall attainment 81%
(f) an understanding of professional and ethical responsibilities: 72%, 80%, 100%; overall attainment 79%
(g) an ability to communicate effectively: 87%, 94%, 100%; overall attainment 91%
(h) the broad education necessary to understand the impact of engineering solutions in a global, economic, environmental and societal context: 85%, 100%; overall attainment 88%
(i) a recognition of the need for, and the ability to engage in, life-long learning: 59%, 100%; overall attainment 67%
(j) a knowledge of contemporary issues: 75%, 100%; overall attainment 80%
(k) an ability to use the techniques, skills, and modern engineering tools necessary for engineering practice: 87%, 93%; overall attainment 89%
Source: UAA ME Department
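Each row can be read the same way as the single-outcome example above. If D_j denotes the set of direct measures reporting on outcome j, I_j the set of indirect measures, and a_{m,j} the attainment level reported by measure m for outcome j, one plausible reading of the 80/20 weighting (assuming equal weights within each group, which the slides do not state explicitly) gives the overall attainment as:

```latex
A_j = 0.8 \cdot \frac{1}{|D_j|} \sum_{m \in D_j} a_{m,j}
    + 0.2 \cdot \frac{1}{|I_j|} \sum_{m \in I_j} a_{m,j}
```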

Available resources
Upcoming workshops in the assessment series
– Norming Your Academic Assessment Rubrics, Friday, March 20, 10:30 – 11:30am, RH 303
– ePortfolios and Academic Assessment, Friday, April 3, 10:30 – 11:30am, RH 303
UAA Academic Assessment Committee webpage: c_assessment_committee/index.cfm
UConn Assessment Primer, available at