
ASSESSMENT CERTIFICATE CULMINATING PROJECT: ASSESSING STUDENT LEARNING OUTCOMES Presented by: Shujaat Ahmed and Kaitlin Fitzsimons

Objective 1: Assess student learning outcomes for required L1 competence

OBJECTIVE 1: L1 COMPETENCE AND CRITERIA
L1 Competence statement: Can use independent learning skills and strategies to organize, initiate, and document prior, current, and future college-level learning.
Competence Criteria:
– Describe strategies for independent and experiential learning.
– Use strategies to surface prior experiential learning in personal, professional, and academic settings and integrate these experiences with new learning.
– Demonstrate skills in planning, organizing, assessing, and documenting competence-based learning.

Objective 2: Compare three course designs (1.0, 1.5, 2.0) of the Independent Learning Seminar

OBJECTIVE 2
The Independent Learning Seminar was a new course created to help students structure their independent learning experiences (V. 1.0).
After the initial offering of the course, one of the instructors decided to adjust the original course design (V. 1.5).
This fall, a third course design was developed, replacing all of the 1.0 version offerings (V. 2.0).
Our study focuses on determining whether the three course designs achieve the L1 competence criteria, and whether one design meets certain criteria better than the others.

STEPS TAKEN TO ASSESS L1 COMPETENCE (OBJ 1)
1. Developed a rubric to assess ILS projects.
2. Randomly selected 33 assignments from the 1.0, 1.5, and 2.0 sections (N=11 per section) for assessing L1.
3. Eight SNL TLA members rated the selected assignments.
4. Analyzed the rubric data by ILS section.
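As a rough illustration of step 2, a stratified random sample of 11 assignments per section could be drawn with pandas. This is only a sketch: the assignment IDs, section labels, and column names below are placeholders, not the study's actual records.

```python
# Hypothetical sketch of step 2: draw 11 assignments at random from each
# ILS course version. All data and column names here are assumed.
import pandas as pd

assignments = pd.DataFrame({
    "assignment_id": range(1, 91),           # 90 placeholder assignments
    "section": ["1.0", "1.5", "2.0"] * 30,   # three course versions
})

# Stratified random sample: 11 assignments per section, N = 33 in total.
sample = assignments.groupby("section").sample(n=11, random_state=42)
print(sample["section"].value_counts())
```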

ILP RUBRIC (OBJ 1)

Criterion: Describes strategies for independent and experiential learning.
– Below Average (1): Does not identify, or only minimally identifies, strategies for independent and experiential learning.
– Satisfactory (2): Superficially describes strategies for independent and experiential learning.
– Above Average (3): Clearly describes strategies for independent and experiential learning.
– Excellent (4): Clearly describes, in detail, an understanding of strategies for independent and experiential learning across multiple contexts.

Criterion: Uses strategies to surface prior experiential learning in personal, professional, or academic settings.
– Below Average (1): Does not apply strategies to surface prior learning in personal, professional, or academic settings.
– Satisfactory (2): Minimally applies strategies to surface prior learning in personal, professional, or academic settings.
– Above Average (3): Clearly applies strategies to surface prior learning in personal, professional, or academic settings.
– Excellent (4): Reflects upon the value of strategies to surface prior learning in personal, professional, or academic settings.

Criterion: Integrates learning experiences with new learning.
– Below Average (1): Does not connect or integrate learning experiences.
– Satisfactory (2): Minimally connects learning experiences with new learning.
– Above Average (3): Clearly connects learning experiences with new learning.
– Excellent (4): Draws multiple connections between learning experiences across contexts and time.

Criterion: Demonstrates skills in documenting learning.
– Below Average (1): Documentation of learning is unclear or incomplete.
– Satisfactory (2): Documents learning but does not emphasize learning strategies or future learning.
– Above Average (3): Documents learning with an emphasis on learning strategies OR an emphasis on future learning.
– Excellent (4): Documents learning effectively with an emphasis on learning strategies AND their application to future learning.
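An analytic rubric like this is essentially a criteria-by-levels structure. The hedged sketch below shows one way such a rubric could be encoded and used to average the ratings that several raters assign on a single criterion; the criterion names are abbreviated and the example scores are invented.

```python
# Hypothetical encoding of the four-point analytic rubric; names are
# abbreviated from the slide and the example ratings are placeholders.
from statistics import mean

LEVELS = {1: "Below Average", 2: "Satisfactory", 3: "Above Average", 4: "Excellent"}
CRITERIA = [
    "Describes strategies",
    "Surfaces prior learning",
    "Integrates learning experiences",
    "Documents learning",
]

def average_rating(ratings: list[int]) -> float:
    """Average one criterion's ratings across raters, rejecting off-scale scores."""
    if any(r not in LEVELS for r in ratings):
        raise ValueError("ratings must be on the 1-4 scale")
    return mean(ratings)

# Example: eight raters scoring one assignment on a single criterion.
print(average_rating([3, 3, 2, 4, 3, 3, 2, 3]))  # 2.875
```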

ANALYSIS OF RUBRIC DATA (OBJ 1)
Table: Means and standard deviations for Versions 1.0, 1.5, and 2.0 (N=11 per version) on the four L1 criteria: Learning Strategies, Surfacing Prior Learning, Applying Learning to Future Learning, and Documenting Learning.
Note. Bold values in red represent the highest mean across versions; an asterisk (*) marks the highest mean within each version.
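A summary table of this kind can be produced outside SPSS as well. The sketch below groups criterion ratings by course version and reports means and standard deviations; the ratings and column names are made up for illustration.

```python
# Hypothetical sketch: per-version means and standard deviations of rubric
# ratings. The numbers are placeholders, not the study's data.
import pandas as pd

ratings = pd.DataFrame({
    "version":         ["1.0"] * 3 + ["1.5"] * 3 + ["2.0"] * 3,
    "surfacing_prior": [2, 2, 3, 3, 3, 4, 3, 2, 3],
    "documenting":     [2, 3, 2, 3, 2, 3, 2, 3, 3],
})

summary = ratings.groupby("version").agg(["mean", "std"])
print(summary.round(2))
```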

ANALYSIS OF RUBRIC DATA (OBJ 1)
A 3 x 4 MANOVA was conducted to determine whether the mean differences between the three sections were statistically significant. Results revealed a significant multivariate main effect for section, Wilks' λ = .540, F(8, 54) = 2.435, p < .10, η² = .27, with 27% of the multivariate variance in the L1 criteria associated with the grouping factor, section. Examining the univariate between-subjects effects, only surfacing prior learning differed significantly across sections, F(2, 30) = 5.064, p = .013. Post hoc tests using the Bonferroni procedure indicated that, for surfacing prior learning, only the comparison between version 1.0 (M = 2.18) and version 1.5 (M = 3.09) was statistically significant (p < .025).
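The same kind of analysis can be reproduced outside SPSS. The sketch below is a hedged illustration only: it uses randomly generated placeholder ratings (not the study's data), runs a one-way MANOVA with statsmodels (whose output includes Wilks' lambda), and follows up with Bonferroni-corrected pairwise t-tests on one criterion using scipy.

```python
# Hypothetical re-creation of the analysis pipeline; the ratings are random
# placeholders, not the actual rubric data.
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "section":             np.repeat(["v1_0", "v1_5", "v2_0"], 11),
    "learning_strategies": rng.integers(1, 5, 33),
    "surfacing_prior":     rng.integers(1, 5, 33),
    "future_learning":     rng.integers(1, 5, 33),
    "documenting":         rng.integers(1, 5, 33),
})

# One-way MANOVA: the four L1 criteria as dependent variables, section as the factor.
manova = MANOVA.from_formula(
    "learning_strategies + surfacing_prior + future_learning + documenting ~ section",
    data=df,
)
print(manova.mv_test())  # multivariate tests, including Wilks' lambda

# Bonferroni-corrected pairwise comparisons on a single criterion.
pairs = [("v1_0", "v1_5"), ("v1_0", "v2_0"), ("v1_5", "v2_0")]
alpha = 0.05 / len(pairs)  # adjusted significance threshold
for a, b in pairs:
    t, p = stats.ttest_ind(
        df.loc[df.section == a, "surfacing_prior"],
        df.loc[df.section == b, "surfacing_prior"],
    )
    print(f"{a} vs {b}: t = {t:.2f}, p = {p:.3f}, significant = {p < alpha}")
```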

Applying Assessment Concepts
Assessment promotes continuous improvement
– The goal of the project is to determine whether the different course designs effectively achieve the L1 competence and to provide information for future decisions on selecting or improving course designs.
Measuring student learning requires direct assessment
– We chose a random sample of student work and assessed each piece using a rubric.
A mixed methods approach to assessment provides a full picture, bringing together the "What" and the "Why"
– We used quantitative methods to analyze the rubric ratings, running a MANOVA to see whether there were significant differences between the sections.
– We used qualitative methods to analyze the open-ended student comments from the online teaching evaluations for each section of the Independent Learning Seminar.

What We Learned
In developing the L1 rubric, we used Bloom's Extended List of Cognitive Action Verbs to fill in the cells that describe below average, satisfactory, above average, and excellent demonstrations of the L1 criteria.
– Because some L1 competence criteria were "double-barreled," we split those criteria so that each measured a single outcome. [Writing and Revising Learning Outcomes Workshop; Direct vs. Indirect Assessment Workshop]
At this stage of the project, analysis of the L1 rubric ratings was exclusively quantitative. We used SPSS to compare the different ILS sections by computing means and standard deviations and running a MANOVA. [Quantitative Workshop]

What We Learned
In analyzing survey data from instructors who taught each version of the Independent Learning Seminar, we wanted to gauge faculty perceptions of their teaching experiences as well as of student learning. We recognize that faculty perceptions constitute indirect evidence, so the survey was only one piece of our mixed methods approach, which also included direct evidence from the rubric data analysis. [Direct vs. Indirect Assessment Workshop]
In analyzing the student online teaching evaluations for each section of the Independent Learning Seminar, we examined open-ended student comments and identified several themes based on the frequency with which they appeared. Not surprisingly, students in the 1.0 version reported that the course focused on the development of an ILP, while students in versions 1.5 and 2.0 emphasized the class structure. [Qualitative Workshop]
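As a rough illustration of that qualitative step, coded themes from open-ended comments can be tallied per course version; the themes and counts in the sketch below are invented purely for illustration.

```python
# Hypothetical sketch: count coded themes in open-ended evaluation comments
# per course version. Themes and frequencies are illustrative only.
from collections import Counter

coded_comments = {
    "1.0": ["ILP development", "ILP development", "workload", "ILP development"],
    "1.5": ["class structure", "peer feedback", "class structure"],
    "2.0": ["class structure", "class structure", "ILP development"],
}

for version, themes in coded_comments.items():
    top = Counter(themes).most_common(2)
    print(f"Version {version}: {top}")
```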

RELEVANCE TO WORK
The Assessment Center provides analysis and expertise so that SNL can make data-driven decisions.
For last year's Annual Assessment Project, SNL completed a similar study of four different versions of the Advanced Project course.
The findings from that report were presented to College leadership and informed decisions about curricular offerings.
SNL is increasingly relying on rubrics as a direct assessment tool to objectively measure student outcomes.

REFLECTION AND MOVING FORWARD
Working on the Annual Assessment Project has given us the opportunity to apply what we've learned through the ACP workshops to improving SNL curricular offerings.
We will continue to incorporate assessment best practices for data collection, data analysis, and reporting findings in our positions at SNL.
Employing formal inquiry as a means of problem solving extends beyond assessment and can be applied in a variety of practice settings.

APPENDIX