ABET Assessment: One Department's Experience
Paul H. Schimpf
penguin.ewu.edu/~pschimpf
Eastern Washington University Assessment Colloquium
April 26, 2013

Caveats
• This represents just one person's experiences and opinions
– I am not an assessment specialist, nor do I wish to be
– feel free to disagree
– debate assessment strategy and methods in your units
• We are all assessment experts
– most of us have probably spent more time thinking formally about student assessment, as opposed to course or program assessment
– but most of us DO perform course and program assessment informally all the time
• There are many ways to successfully go about assessment

Some Guiding Principles
• Make it sensible and meaningful
– I personally would rather see a refusal than a fake
• Make it flexible
– a valid outcome of assessment is making changes to assessment (it's a meta-process and can be applied to itself)
• Minimize the workload
– try to use data that you are already collecting
– data analysis will add some work, so keep it flexible
– don't forget that we have that little item called "Service"
• Empower the faculty
– they are likely assessing their coursework already
– don't force them to change what they do; just ask them to document it, with as little impact on them as possible

What is Assessment?
• Tests and grades constitute assessment, right?
– yes, but they focus on assessment of students
– they do not provide the data that we are looking for in this context
• Surely all faculty members, while grading an exam, have encountered this situation:
– "Hmmm, very few have answered this particular question correctly. Even many that I think are my better students got it wrong, but this is an important concept."
– "Perhaps I didn't cover this clearly? Maybe I should try covering this again, with a quick review of a pre-req, or perhaps from a different angle, or with different examples that start with something they may be more familiar with."
• Or maybe this situation:
– "Everyone got this correct. Maybe that means I can trim out a little time spent on it in order to cover this other concept I've been wanting to add to this class."

What is Assessment?
• Whenever a faculty member is refining the content or delivery of a course, assessment is happening
– It is about whether the conduct of the course could or should change. This creates one possible avenue for continual improvement.
– It is about specific objective(s) of the course. Grade statistics for the entire course do not provide this information, nor do grade statistics for an entire exam (unless perhaps the exam covers only a single objective of the course).
– It is about whether students are adequately mastering or understanding some objective. An average numerical score across students may obscure that a few did very well and a whole lot more performed inadequately.
– It is more powerful to know what percentage of students scored "adequately" than to know the average score.
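A toy calculation makes the last point concrete. This is a sketch only: the scores and the adequacy threshold of 7 out of 10 are invented for illustration, not taken from the talk.

```python
# Hypothetical per-student scores on one exam question tied to a
# single course objective (0-10). Invented for illustration.
scores = [10, 10, 9, 4, 3, 3, 2, 2, 1, 1]

average = sum(scores) / len(scores)          # 4.5 out of 10
adequate = sum(1 for s in scores if s >= 7)  # assumed threshold: 7/10
pct_adequate = 100 * adequate / len(scores)

print(f"average score:            {average:.1f}/10")
print(f"scored adequately (>= 7): {pct_adequate:.0f}%")
```

Here the average of 4.5 looks merely mediocre, while the 30% adequacy rate makes it plain that a few strong students are masking a widespread problem.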

What do you expect ME to do?
• "Most of my students failed this. What am I supposed to do about it? I just got a bunch of crappy students."
– That is one possibility, but is it the only possibility?
– Is this a one-time occurrence or a recurring theme?
– Is this with regard to something extremely fundamental to the course, or something esoteric?
– Is this a basic concept, or an advanced concept?
– Are you assuming any prerequisite knowledge that isn't explicit in the course pre-reqs?
– Is it possible that these students came out of a pre-req without the knowledge you assumed?
– Have you tried taking another approach?

Waste of Time?
• "I don't believe anyone reads this and I don't get any feedback"
– "I thus conclude that it is a waste of time"
• First and foremost, the reports DO need to be read by someone in your unit
– if the department is not interested in continual improvement against its own objectives, then it is indeed meaningless
• Don't expect detailed feedback from an outside assessment enforcer
– unless you aren't doing any assessment at all
– or are not demonstrating that you are using assessment as part of a process for continual improvement
– or are making claims in your assessment report that look suspicious

Who Closes the Loop?
• Be careful what you wish for!
• Where do the domain experts reside?
• Who are the most important evaluators of assessment reports?
• Who should recommend changes to programs, course structure, or assessment practices, based on assessment data?
• Do you want to put that into the hands of a non-expert in your domain?
• Close the loop yourselves, and document it so others can see that you are doing so.

How to Get Started
• I describe a "top down" approach
– but a bottom-up analysis, working from existing course objectives to the program outcomes they lead to, might also be useful
• Start by defining some meaningful outcomes for your program
– things that you want your students to understand, or be able to do, or be prepared for, by the time they complete the program
– you will need to think about measurability, but at the program level you need not worry too much about that
• Examine the various events that students go through to identify possible points of assessment
– many assessment opportunities will naturally fall into the classroom, but those are not your only opportunities

Some Assessment Opportunities
• Professional or Other Terminal Exams
• Advancement Exams
• Senior Capstone Presentations
– you can have observers rate various aspects of performance using a rubric (a summary sketch follows this list)
• Internship Reports
• … ?
• and yes, coursework
• Your objective is to assess each program outcome in at least one place
– two places would be better …
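For the rubric item above, the same percentage-based summary applies. A minimal sketch of aggregating observer ratings, assuming a 1-4 scale; the aspect names and numbers are invented:

```python
# Hypothetical capstone-presentation rubric ratings from observers
# (1 = poor ... 4 = excellent). Aspect names are invented examples.
ratings = {
    "organization":      [4, 3, 2, 4, 3],
    "technical content": [3, 2, 2, 3, 4],
    "delivery":          [4, 4, 3, 3, 2],
}

for aspect, scores in ratings.items():
    ok = sum(1 for s in scores if s >= 3)  # treat 3 or 4 as "adequate"
    print(f"{aspect}: {100 * ok / len(scores):.0f}% rated adequate")
```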

A Suggestion for Courses
• With the help of the course instructor(s):
– create some specific learning objectives that are appropriate for the course, and …
– that support (are demonstrations of) some program outcome,
– and are stated as something measurable
• IMPORTANT
– make sure it is understood that these are not necessarily the only objectives of the course
– an instructor can add other objectives, and can choose to assess them for personal use or not
• Example learning objectives
– students will know how to titrate an appropriate amount of ethanol, …
– students will know how to program responses to various GUI events, such as button presses, menu events, …
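To illustrate why the second example objective is measurable: a student can demonstrate it directly with a few lines of working code. A minimal tkinter sketch, with invented widget names and handler (not from any actual course material):

```python
import tkinter as tk

def on_button_press():
    # Respond to a GUI event by updating the interface.
    status.config(text="Button pressed")

root = tk.Tk()
status = tk.Label(root, text="Waiting for a press...")
status.pack()
tk.Button(root, text="Press me", command=on_button_press).pack()
root.mainloop()
```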

Empower the Instructor
• Create a simple AND flexible report form
• State the agreed-upon learning objective AND the program outcome that it supports
• Allow the instructor to explain how they assess it, and what the results are, in whatever form they wish
• Allow the instructor to explain any conclusions they drew from their assessments and any actions they took or suggest to the department
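One way such a form might look (a sketch only; the actual CS form appears on the next slide, and these field names are guesses at its spirit):

```
Course / term:         <e.g., CSCD 350, Spring 2013 -- hypothetical>
Learning objective:    <the agreed-upon objective, as stated>
Supports outcome:      <the program outcome it demonstrates>
How assessed:          <exam question(s), project rubric, etc. -- instructor's choice>
Results:               <e.g., 78% of students scored adequately>
Conclusions / actions: <the instructor's interpretation, and any changes
                        made or suggested to the department>
```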

CS Course Assessment Form …

IMPORTANT
• It is NOT a bad thing to report …
• That an objective was not met
– hopefully some thoughts and suggestions on the matter are offered
• That an objective was not measured because the instructor feels it is an inappropriate objective for the course
– hopefully this will lead to a discussion amongst the course instructors as to appropriate objectives
• That all objectives were met
– this is the end goal, after all

Other Suggestions
• Create an Assessment Plan, and follow it
– It should list your program outcomes
– It should explain where your assessment points are, which outcomes they assess, and what manner of assessment is used
– It should explain the process you follow, including:
    how often you perform each assessment
    how you process the data
    what documents are produced and how they are processed
    how long it takes to cycle through assessment of all of your program outcomes
– and perhaps more:
    the CS plan also covers "Educational Goals" and how they tie to "Program Outcomes"
    the CS plan also explains how our Program Outcomes relate to ABET "attributes" of program outcomes
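The outcome-to-assessment-point mapping in such a plan can be as simple as a small table. A hypothetical sketch; the outcomes, instruments, and cycle lengths below are all invented:

```
Outcome                            Assessment point(s)                 Cycle
O1: design and implement software  capstone project rubric;            yearly
                                   course objective reports
O2: communicate technical results  capstone presentation rubric;       every 2 years
                                   internship reports
O3: apply mathematics and theory   course objective reports from       every 2 years
                                   the theory/algorithms sequence
```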

CS Assessment Process