Making Your Assessments More Meaningful Flex Day 2015

Why do we assess?
• To study the learning process in the discipline and in the larger learning arenas, and to discover new ways and methods of teaching.
• To determine the extent to which the curriculum is working (design and implementation of the curriculum).
• To inform decisions as to where time, energy, and/or money should be repurposed for continuous improvement of learning.
• To help demonstrate our quality assurance to the community we serve.

Today we’ll talk about…
• Creating and/or revising your means of assessment to ensure that your results are more meaningful.
• New ways to analyze your data.

Making your Means of Assessment more Meaningful: Ensure Consistency Across Course Sections
• When conducting assessment, the same assessment instrument should be used across course sections to assess an SLO.
• If you use exam questions to assess: the same SLO questions should be used on the exams across all sections of the course.
• If you use a rubric to assess: everyone needs to use the same rubric, and the instructions and task for the essay or skill performance used for the assessment need to be the same across sections.
• Why? You can’t account for as much variance if you don’t standardize the assessment. In other words, it becomes more difficult to tell why students’ scores differed on the assessment.

Making your Means of Assessment more Meaningful: Ensure a Consistent Time Frame for Conducting the Assessments
• Timing is important! Plan for all instructors teaching sections of a course to give the assessment on approximately the same date.
• For pre-tests/post-tests: the pre-test should be administered in all sections on the first day of class, and the post-test in all sections on the final day of class.
• For end-of-semester exams: the exam should be given on the last day of class.
• For essays or skill performances: the assignment should be administered during the same week.

Making your Means of Assessment more Meaningful: Sampling Techniques (Sample Size)
• What is sampling? The process of selecting a sample of individuals (students) from a population (course population or program population). Through sampling, you attempt to estimate what is likely to be the situation for the total population.
• NOTE: There will always be a difference between the scores of your sample of students and the scores of the population of students for a course or program.

Making your Means of Assessment more Meaningful: Sampling Techniques (Sample Size)
What if I asked you to flip a quarter three times?
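
With only three flips, you could easily get all heads even though the true rate is 50 percent; very small samples routinely misrepresent the population they came from. The quick simulation below (a sketch added here, not part of the original slides) makes that concrete:

```python
import random

def heads_proportion_range(n_flips, n_trials=10_000):
    """Flip a fair coin n_flips times, n_trials times over; return the
    lowest and highest proportion of heads observed across trials."""
    proportions = []
    for _ in range(n_trials):
        heads = sum(random.random() < 0.5 for _ in range(n_flips))
        proportions.append(heads / n_flips)
    return min(proportions), max(proportions)

for n in (3, 30, 300):
    low, high = heads_proportion_range(n)
    print(f"{n:>3} flips: proportion of heads ranged from {low:.2f} to {high:.2f}")
```

Three flips span the full range from 0.00 to 1.00, while 300 flips stay close to 0.50; assessment samples behave the same way.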

Making your Means of Assessment more Meaningful: Sampling Techniques (Sample Size)
• Your sample needs to be large enough that you can be confident it represents your population.
• When in doubt, the bigger the sample, the better.
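
For estimating a percentage (for example, the share of students meeting an SLO), a standard rule of thumb (not given in the original slides) for the sample size needed is

$$ n \approx \frac{z^2\,p(1-p)}{e^2} $$

where $z$ is the z-score for your confidence level, $p$ is the expected proportion (use $0.5$ for the worst case), and $e$ is the tolerable margin of error. At 95% confidence ($z = 1.96$), $p = 0.5$, and $e = 0.05$, this gives $n \approx \frac{3.84 \times 0.25}{0.0025} \approx 384$ students.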

Making your Means of Assessment more Meaningful: Sampling Techniques (Sample Design)
• Your sample of students will be biased if the sampling is done in a non-random way. Selection can be consciously or unconsciously influenced by human choice, and it is biased if each student is not equally likely to be selected into the sample.
• What can you do? Choose a random sample across all course sections and all students enrolled in the course.
• How can you do this? Use Excel’s random number generator, or a short script like the sketch below.
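
If you prefer a script to Excel, here is a minimal Python sketch; the roster file name and layout are assumptions for illustration, not part of the original presentation:

```python
import csv
import random

# Assumed format: roster.csv with one student ID per row (hypothetical file)
with open("roster.csv", newline="") as f:
    students = [row[0] for row in csv.reader(f) if row]

sample_size = 30  # choose based on your population size
# random.sample draws without replacement, so every student is equally likely
sample = random.sample(students, min(sample_size, len(students)))
print(sample)
```

Drawing the sample mechanically like this keeps conscious and unconscious human choice out of the selection.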

Making your Means of Assessment more Meaningful: Revisit your Exam Questions
• SLOs have changed over the years, and sometimes the questions used to test those SLOs should be revisited.
• When you meet with the instructors who teach your course (or all the faculty in your program), ask the group:
  - Are we really testing students on all aspects of this SLO?
  - Have we included enough questions in the assessment to adequately measure the SLO?

Making your Means of Assessment more Meaningful: Revisit your Rubric
• A well-designed rubric should allow evaluators to efficiently and effectively assess their students’ learning.
• When choosing a rubric for assessment, make sure that the rubric’s categories are well-defined and that each scoring level is consistent in its increments.
• For assessment, the rubric needs to be extremely clear and leave little room for interpretation.

Making your Means of Assessment more Meaningful: Revisit your Rubric (example rubric shown on the slide)

Analyzing your Data: Ensuring Inter-Rater Reliability with Rubric Scoring
• When using an assessment tool with a rubric, it is important to achieve inter-rater reliability when scoring the assignment.
• Inter-rater reliability: the degree of agreement in scoring among the faculty raters of the assignment.
• How can you achieve this?
  - Select a random sample of the student assignments used to assess the SLO.
  - Make multiple copies of a selected number of student submissions.
  - Pass out the same student’s assignment to all the raters and have them score it based on the rubric.
  - Discuss how everyone rated that student’s work and come to agreement on how it should have been scored. Repeat until everyone is on the same page, then pass out the rest to be scored individually by each rater.
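
After the calibration round, it helps to put a number on agreement. The sketch below (the scores are illustrative, not from the original presentation) computes simple percent agreement and Cohen’s kappa, which corrects agreement for chance, for two raters:

```python
from collections import Counter

def percent_agreement(rater_a, rater_b):
    """Fraction of papers on which the two raters gave the same score."""
    return sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Agreement corrected for chance: (p_o - p_e) / (1 - p_e)."""
    n = len(rater_a)
    p_o = percent_agreement(rater_a, rater_b)
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: probability both raters pick a category independently
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n)
              for c in set(rater_a) | set(rater_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical rubric scores (1-4) from two raters on ten shared papers
a = [3, 4, 2, 3, 3, 1, 4, 2, 3, 3]
b = [3, 4, 2, 2, 3, 1, 4, 2, 3, 4]
print(f"Percent agreement: {percent_agreement(a, b):.0%}")
print(f"Cohen's kappa: {cohens_kappa(a, b):.2f}")
```

A kappa around 0.7 or higher is generally read as substantial agreement; lower values suggest another calibration discussion is needed before scoring independently.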

Analyzing your Data: Digging Deeper into the Data
• An overall percentage of how many students successfully met the level of achievement for an SLO will not tell you much. Consider looking at:
  - Differences in results between course sections offered at different times of the day (morning, afternoon, night).
  - Differences in results between full-time and part-time faculty.
  - Differences between student populations (by overall success rates at the college, overall GPA, socio-economic status).
  - Indirect assessments combined with your direct assessments of learning.
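
If section results live in a spreadsheet, disaggregating them takes only a few lines. Here is a sketch using pandas, where the file name, column names, and the 70% threshold are all assumptions for illustration:

```python
import pandas as pd

# Hypothetical columns: student_id, score, section_time (morning/afternoon/night),
# faculty_status (full-time/part-time)
df = pd.read_csv("results.csv")

passing = 0.70  # your SLO achievement threshold
df["met_slo"] = df["score"] >= passing

# Success rate broken out by time of day and by faculty status
print(df.groupby("section_time")["met_slo"].mean())
print(df.groupby("faculty_status")["met_slo"].mean())
```

Large gaps between groups are exactly the kind of finding an overall percentage hides, and they point to where a dialogue about teaching and support should start.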

How Often Should you Assess?
• I recommend collecting data for as many SLOs, for as many courses, and for as many programs as often as possible.
• The ASLO subcommittee recommends that you close the loop for each course SLO at least twice during the course review cycle. The same goes for program SLOs.
• By the time you go through course review, all of your course SLOs should have been assessed, and the same holds for program review. You want the results to inform your reviews!