Where We Are in the Cycle: Types of Assessment Measures and Associated Assessment Instruments


Where We Are in the Cycle

Types of Assessment Measures and Associated Assessment Instruments

Summative & Formative

Qualitative vs. Quantitative

Direct & Indirect
Note: Direct assessment methods are preferred over indirect.


Applied Experiences Assessment
Learning environments that provide in-the-field, hands-on learning and training experiences can provide valuable assessment information. Applied experiences integrate what the student has learned in their academic studies with “real life” situations and further develop the student as a professional, citizen, and lifelong learner.
Examples of Applied Experiences include:
- Practicum
- Service Learning
- Internship
- Experiential Learning
- Field Experience
- Student-teaching
- Co-op
Applied Experiences Assessment Instruments:
- Journal
- Videotape of student skills
- Progress Reports
- Portfolio
- Midterm Evaluation
- Cumulative Report
- Reflective Papers
- Performance evaluation by mentor
- Student evaluation of internship experience
- Final project/creative project/presentation

Newer Wave of Assessment Instruments
- Rubrics
- Learning contracts
- Observations with documentation
- Reflective journals
- Reflective conversations/writings
- Case studies
- Student interviews
- Videotaping

Rubrics
Several sample rubrics have been put into the Sample Rubric folder on the SharePoint site. These are examples only, and they don’t match the format of the Core Curriculum rubric used by our academic counterparts. The CAS Domains and Dimensions rubric has been put into your Assessment Tools folder. This rubric was modified to match the format of the Core Curriculum and has been used by other institutions as a way to map learning outcomes. Additionally, the CampusLabs presentation “Rubrics 101: A Tool to Assess Learning” has been put into your Assessment Tools folder. This presentation provides valuable information on the use of rubrics as a tool to measure learning directly and objectively.

Rubrics: The Preferred Format
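A minimal sketch of that preferred grid format, using purely hypothetical criteria, performance levels, and descriptors (not drawn from the Core Curriculum rubric or the CAS rubric):

Criterion | Beginning (1) | Developing (2) | Proficient (3)
Organization | Ideas appear in no clear order | Some logical ordering, but weak transitions | Clear introduction, body, and conclusion
Use of evidence | Claims are unsupported | Some claims are supported | Every claim is tied to specific evidence

The cell-by-cell descriptors are what let diverse raters score the same work consistently, which is why this grid format is preferred.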

Guidelines for Selecting an Assessment Method and Instrument
- Select a method that is appropriate for your goals and objectives (the method that will provide the most useful and relevant information). Not all methods work for all areas or are appropriate to all outcomes.
- Use the information you already have available.
- Choose an assessment method that allows you to assess the strengths and weaknesses of the program. Effective methods of assessment provide both positive and negative feedback; finding out what is working well is only one goal of your assessment work.
- Remember, the data you collect must have meaning and value to those who will be asked to make changes based on the findings.

Guidelines for Selecting an Assessment Method and Instrument
Use multiple methods to assess each learning outcome; many outcomes will be difficult to assess using only one measure. The advantages of selecting more than one method include:
1. Multiple measures can assess different components of a complex task.
2. There is no need to try to design a complicated all-purpose method.
3. Greater accuracy and authority are achieved when several methods of assessment produce similar findings.
4. You have the opportunity to pursue further inquiry when methods contradict each other.
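For instance (a hypothetical pairing, not one prescribed by this workshop): to assess a Written Communication outcome, you might score reflective essays with a rubric (a direct measure) and also survey students on their perceived writing growth (an indirect measure). Agreement between the two findings adds authority; contradiction tells you where to pursue further inquiry.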

- Alumni surveys
- Culminating assignments
- Content analysis
- Course-embedded assessment
- Curriculum analysis
- Delphi technique
- ePortfolio
- Employer surveys
- Focus groups
- Institutional data
- Matrices
- Observation
- Performance assessment
- Portfolio evaluations
- Pre/post evaluation
- Quasi-experiments
- Reflective essays
- Rubrics
- Standardized test instrument
- Student self-efficacy
- Surveys
- Syllabus analysis
- Transcript analysis

- You give course grades
- The grades measure the specific outcome
- The courses are consistent over time
Suggested for ROTCs

- Your observers are experts
- The setting is controlled
- The outcome is Ethical Reasoning
Suggested for CAPS & ODoS Counseling

- Diverse people will do the assessments
- You want to enforce consistency
- You need to document the criteria

- You want to measure cumulative effects
- Your students will write the essays
- The outcome can be expressed in an individual’s own words
- The outcome is Written Communication
Suggested for OSRR, DRC & HORIZONS

- The outcome is best measured in the eyes of others
- The outcome is Leadership and Teamwork
- You have the opportunity to gather these evaluations (ideally at the close of a seminar or event)
Suggested for leadership training programs

- You conduct group sessions
- The outcome can be measured as an answer to a question
- Some students won’t know the answer at the start
- You can use Clickers
Suggested for CCO workshops & intramural sports training
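A hypothetical illustration of this pre/post pattern: open an intramural sports training session with a Clicker question tied to the outcome (for example, “Which warm-up is safest before a distance run?”), then pose the same question at the close. The shift in the distribution of answers between the two polls is your evidence of learning.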

- The location is unusual
- The learning is best measured in the moment
- You don’t care about getting a valid sample
- These surveys can be facilitated:
  - by Clickers
  - by Campus Labs Baseline w/ iPhone or iPod

- You are fishing for details
- The outcome is Creative Thinking
- The outcome is Oral Communication

- You have lots of time
- You are determined to get lots of detail
- The outcome is Integrative Learning
- You trust method more than people
  - Experiments may create ethical concerns

- The portfolios exist
- The portfolios have value
- Criteria exist for assessing the portfolios
- Staff capacity exists to review students’ efforts
Suggested for future assessments

Instrument/Tool Mapping Process

Now, review your original Student Learning Outcome mappings. Although you may have indicated several outcomes students take away after participating in your program, you now want to decide which of them you can assess for 2012 (this academic year). Your original program worksheet will be where you retain all of the learning outcomes for your program and participants. You will want to map only those learning outcomes you can assess in 2012. Over time, the worksheet will document the year in which you assess each of the outcomes you have mapped.
For each program you originally mapped a Student Learning Outcome for, your SharePoint folder will contain an Assessment Tool spreadsheet that will provide you with a way to indicate which tool you want to use to assess each of the outcomes. Columns A, B, C, and D will be pre-populated with information on each of the Student Learning Outcomes you mapped back to the CAS dimensions. (Refer to the mapping spreadsheet handout #1.)
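As a purely hypothetical illustration (the column meanings here are assumptions for the sake of example; the authoritative definitions are in handout #1): a pre-populated row might carry “Leadership retreat | Participants will articulate a personal leadership style | Intrapersonal Development,” and in the remaining columns you would record your chosen tool, such as “rubric applied to a closing reflective essay.”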

Instrument/Tool Mapping Process
If, after mapping your SLOs, you determine you can’t assess one of the outcomes, or you aren’t going to be able to assess the outcome in 2012, remove it from your mapping tool spreadsheet. This mapping tool spreadsheet will become your authoritative source for information on the mapping of your outcomes back to the CAS dimensions, the Purdue Core Competencies, and the Strategic Plan.

Timeline
Review your current student learning outcome statements (the ones you created in Phase 2). Make sure you have indicated, in the Student Learning Outcomes section, only those outcomes you will assess in 2012. Your review will need to be complete by May 25. At that point, and for a period of one week, all information in your folders will be frozen so that work can begin on moving your information into the Tool Mapping worksheet. The information will be moved into the Tool Mapping worksheet for you by June 4, and you will have the entire month of June to complete the information requested in the remaining columns of the Tool Mapping worksheet. All mappings will need to be completed by June 29. At any point along the way, if you have questions or need assistance, please call or email either Dan or Andy.
Dan:
Andy:

Closing the Loop