How Institutions Use Evidence of Assessment: Does It Really Improve Student Learning? Natasha Jankowski, NILOA Higher Education Collaborative October 22, 2013

Presentation transcript:

How Institutions Use Evidence of Assessment: Does It Really Improve Student Learning? Natasha Jankowski, NILOA Higher Education Collaborative October 22, 2013

 Institutions of higher education are increasingly asked to show the value of attending, i.e., their impact  The public and policymakers want assurance of the quality of higher education  Regional accreditors are asking institutions to show evidence of student learning and instances of use Value

 Assessment is the systematic collection, review, and use of information about educational programs undertaken for the purpose of improving student learning and development. (Palomba & Banta, 1999) Assessment

 Institutions have the greatest difficulty with the "closing the loop" stage of the assessment cycle (Banta, 2009)  SLOs → Select measures → Gather data → Analyze the data → Use to improve Closing the Loop
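
A minimal sketch, invented for illustration and not part of the original slides, of how the cycle above might be tracked for a single program so that an "open" loop is visible; the stage names mirror the slide, and the example program is hypothetical.

# Stage names taken from the slide above; everything else here is hypothetical.
ASSESSMENT_CYCLE = [
    "Define SLOs",
    "Select measures",
    "Gather data",
    "Analyze the data",
    "Use to improve",
]

def loop_closed(completed_stages):
    """The loop counts as 'closed' only when every stage, including use, is done."""
    return all(stage in completed_stages for stage in ASSESSMENT_CYCLE)

# Hypothetical program that stalls after analysis -- the difficulty the slide describes.
completed = {"Define SLOs", "Select measures", "Gather data", "Analyze the data"}
print(loop_closed(completed))  # False: results were never used to improve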

 With the majority of institutions in the US engaged in assessing student learning, why is there still such a large gap between gathering assessment results and actually using them? Why the lack of use?

 Most assessment literature states that the reason or purpose of engaging with assessment in the first place is to USE the results to IMPROVE student learning (Banta, 2007; Ewell, 2010; Suskie, 2009; Walvoord, 2004)  But what does that really mean? Using Results

 The ability to make causal claims about our impact on students and their learning  Institutional structures and support + student = enhanced learning Causal Statements

 Mobility of students  Untracked changes  Whether changes in courses add up to program-level change  Levels at which use occurs  Cycles longer than a year  Loosely coupled relationships  Life Difficulty of Causal Statements

 Why do we think the changes we make will lead to better outcomes?  What do the changes we select assume about how students understand and navigate higher education? Theories of Change

 A process of outlining causal pathways by specifying what is needed for goals to be achieved (Carman, 2012)  It requires articulating underlying assumptions which can be tested and measured (Jones, 2010)  A Theory of Change provides a roadmap outlining how one thinks or conceptualizes getting from point A to point B over time (Ployhart & Vandenberg, 2010) Theory of Change (TOC)
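
As an illustration only, here is a minimal sketch of how a theory of change could be written down so that its underlying assumptions are explicit and testable, in the spirit of the definitions above; the field names and example content are hypothetical, not drawn from the presentation.

from dataclasses import dataclass, field
from typing import List

@dataclass
class TheoryOfChange:
    # Point A -> point B over time, plus the assumptions linking them.
    current_state: str
    desired_outcome: str
    planned_changes: List[str] = field(default_factory=list)
    assumptions: List[str] = field(default_factory=list)   # testable beliefs about root causes
    indicators: List[str] = field(default_factory=list)    # evidence that would confirm or refute them

# Hypothetical example.
toc = TheoryOfChange(
    current_state="Students score below the threshold on a program outcome",
    desired_outcome="Students meet the outcome by the capstone",
    planned_changes=["Redesign key assignments in the most strongly aligned courses"],
    assumptions=["Low scores stem from too few authentic practice opportunities"],
    indicators=["Capstone rubric scores on the outcome improve over two cycles"],
)
print(toc.assumptions)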

 Coverage and content  Opportunities and support  Intentional, coherent, aligned pathways  Within each of these is a belief about a root cause (why students were not learning or not meeting the outcome) and the mechanism by which the institution can help them succeed For instance…

 A process of exploring not only that something happened, but why it happened the way that it did (Rooney & Vanden Heuvel, 2004)  Moves beyond surface-level problem identification and examines assumptions in order to prevent recurrence (Taitz, Genn, Brooks, Ross, Ryan, Shumack, Burrell, & Kennedy, 2010) Root Cause

Toulmin (2003): Evidence → Claim, supported by a Warrant  Warrants  Arguments But…

 Case Studies and Cross-case report What does it look like in practice?

Evidence of student learning is used in support of claims or arguments about improvement and accountability told through stories to persuade a specific audience (Jankowski, 2012) Evidence-based Storytelling

 Making sense of results – Meaning Making  Multiple individuals across the institution critically engaging with assessment data  Make sense of data to determine what, if anything, to do  70%  Examine multiple data points  Group data by theme, not by method Discussion and Reflection
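
A minimal sketch of the "group data by theme, not by method" idea; the evidence records and theme labels below are invented for illustration.

from collections import defaultdict

# Hypothetical evidence records: each notes the method that produced it
# and the theme (outcome area) it speaks to.
evidence = [
    {"method": "rubric", "theme": "written communication", "finding": "below threshold"},
    {"method": "survey", "theme": "written communication", "finding": "students want more feedback"},
    {"method": "rubric", "theme": "quantitative reasoning", "finding": "meets threshold"},
]

by_theme = defaultdict(list)
for record in evidence:
    by_theme[record["theme"]].append(record)  # grouped by theme, not by method

for theme, records in by_theme.items():
    print(theme, "->", [r["finding"] for r in records])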

Tell the Institutional Story

A faculty chair in business examined the results of program outcomes for learners who completed the program capstone course and found that on one of the outcomes, learners were performing below what he regarded as the minimum threshold. Through the curriculum maps and alignments linking learning activities in individual courses to program outcomes in the capstone, he was able to identify across the entire program which courses had the strongest alignment to the outcome in question. From there, he was able to delve deeper into individual learning activities, to combine that information with additional data including course evaluations, and from the combined data to make detailed changes in specific courses and specific learning activities or assignments within courses. By the time participants in the revised courses and learning activities completed the capstone course, there was a measurable improvement in the particular outcome in question. The faculty chair involved in the process stated, “The concept of having an outcomes-based approach and having a strong theory of alignment all the way down to individual learning activities helps facilitate the use of assessment data.” The Brian Barton Story
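
The kind of curriculum-map lookup described above can be sketched roughly as follows; the course names, alignment strengths, threshold, and scores are all invented, and the actual process in the case relied on the institution's own mapping and course-evaluation data.

# Hypothetical curriculum map: outcome -> {course: alignment strength, 0-3}.
curriculum_map = {
    "ethical reasoning": {"BUS 201": 3, "BUS 310": 1, "BUS 450 (capstone)": 2},
}

# Hypothetical capstone results: outcome -> mean rubric score.
capstone_scores = {"ethical reasoning": 2.1}
THRESHOLD = 2.5  # assumed minimum acceptable score

for outcome, score in capstone_scores.items():
    if score < THRESHOLD:
        # Rank courses by how strongly they align with the weak outcome, so review
        # can move from the program level down to individual learning activities
        # in the most relevant courses.
        ranked = sorted(curriculum_map[outcome].items(),
                        key=lambda pair: pair[1], reverse=True)
        print(f"{outcome}: {score} < {THRESHOLD}; review courses in order {ranked}")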

Veterinary technology students did not score as well as needed in quantitative reasoning, for example, so veterinary technology faculty redesigned several key assignments to build and document that competency in students. Whereas previously students only read an article to learn about monitoring glucose levels in felines, the new assignment asked them to read the article, to take a reading of a cat’s glucose level, and then to use both sources to write an analytical report. This curriculum redesign created a more robust and discipline-specific quantitative reasoning experience for students and a richer set of documents to be collected and examined through ePortfolio. Addressing general education requirements throughout the program, according to the veterinary technology program director, means that “programs need to decide where they are addressing general education within the curriculum,” and using student artifacts collected through the ePortfolio “brings assessment to the forefront of the classroom.” Veterinary Technology

The religion department wanted to know if their students were writing at a desired level, and so the faculty developed a writing rubric, gathered a random collection of student essays, and had a faculty panel rate them. A report was generated from the rating that outlined where students demonstrated or fell short on the outcomes in question. Areas where students fell short were used to refocus teaching and also to rethink the sequence of courses and assignments within courses so as to better reinforce the desired outcomes and help students improve. A faculty member involved in this effort remarked, “It seems so modest to state it now – we identified an intended learning outcome, made rubrics, looked at essays, and altered teaching – but that fairly modest process generated a holistic view of what students were doing well and what they were not doing so well, which allowed for minor adjustments. In a year or two these adjustments showed that students are doing better on a given outcome.” Writing Across the Curriculum
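
A minimal sketch of the reporting step in the story above, with invented criteria, ratings, and target level, showing how panel scores might be summarized to flag where students fall short.

from statistics import mean

# Hypothetical panel ratings (1-4) for a sample of essays, keyed by rubric criterion.
ratings = {
    "thesis and argument": [3, 4, 3, 3],
    "use of sources": [2, 2, 3, 2],
    "mechanics": [3, 3, 4, 3],
}
TARGET = 3.0  # assumed desired level of performance

for criterion, scores in ratings.items():
    avg = mean(scores)
    status = "meets target" if avg >= TARGET else "falls short"
    print(f"{criterion}: {avg:.2f} ({status})")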

 NILOA: Learningoutcomesassessment.org Questions?