Evidence of Student Learning: What It Is, Where to Find It, and How to Use It
Dr. Renay M. Scott, Interim Executive Vice President & Provost, Owens Community College

Two Fundamental Questions What evidence do you have that students achieve your stated learning outcomes? In what ways do you analyze and use evidence of student learning?

Defining Evidence Information that tells you something directly or indirectly about the topic of interest Evidence is neutral -- neither “good” nor “bad” ▫Requires context to be meaningful Two types of assessment evidence ▫Direct and Indirect

Direct Evidence Students show achievement of learning goals through performance of knowledge, skills (demonstrate their learning): ▫Scores and pass rates of licensure/certificate exams ▫Capstone experiences  Individual research projects, presentations, performances  Collaborative (group) projects/papers which tackle complex problems ▫Score gains between entry and exit ▫Ratings of skills provided by internship/clinical supervisors ▫Substantial course assignments that require performance of learning ▫Portfolios

Indirect Evidence Students share their perceptions of their learning (talk about learning): Questionnaires Focus Groups Interviews Alumni Surveys Employer Surveys Graduation Rates Job Placement Length of Time to Degree

Finding Evidence: An Evidence Inventory Lets you discover the evidence you already have, such as: ▫Institutional Research data ▫Student Life data ▫Exit Surveys (seniors) ▫Alumni Surveys Start with the obvious … but don’t stop there

Finding Evidence: Perils and Pitfalls Institutional history ▫“We’ve already done that, and it didn’t tell us anything!” Territory; Politics ▫Fighting for scant resources Institutional policy/culture about sharing information ▫“I don’t want somebody ‘policing’ my classrooms!”

Finding Evidence: Appropriateness Does the evidence address student learning issues appropriate to the institution? Does the evidence tell you something about how well the institution is accomplishing its mission and goals? ▫The questions you have about student learning should guide your choice of appropriate existing evidence and identify gaps where a new type of evidence might be needed

What About Grades? Grades are generally not acceptable evidence of student learning at the program level. A course grade measures narrowly defined outcomes, and grades are often arrived at differently from section to section, so they do not reflect the same evidence or the same evaluation of that evidence. As a result, grades cannot support generalizations about student learning.

Assisting Academic Departments Faculty are intensely interested in what students are learning Assessment occurs in classrooms and academic departments every day Evidence of student learning already exists in academic departments The challenge is not to convince academic departments to gather evidence, but rather to help them recognize and use evidence they already have

Assisting Academic Departments: Addressing Common Barriers “This is a lot of work!” ▫Use some sort of evidence inventory to help faculty understand how existing academic practices yield evidence ▫Keep expectations reasonable, given limited time and resources Remember: it is not necessary to gather all the evidence all of the time

Assessment Inventory: One Example
Inventory of Written Statements and Plans
1. Do you have a written mission statement or statement of purpose?  yes  no
   If yes, please attach a copy or reference where this can be found: ________________________________________________________
2. Do you have a written statement of intended educational outcomes describing what a student should know or be able to do when they have completed this program?  yes  no
3. Do you have a written method of assessment for measuring student outcomes?  yes  no
4. Does your program have a separate accreditation process?  yes  no

Assessment Inventory: One Example
Direct Methods of Assessment
1. ________ Comprehensive Examinations
2. ________ Writing Proficiency Examinations
3. ________ National Examinations assessing subject matter knowledge
4. ________ Graduate Record Exam General Test
5. ________ Graduate Record Exam Subject Test
6. ________ Certification Examinations
7. ________ Licensure Examinations
8. ________ Locally developed pre-test or post-test for subject matter knowledge
9. ________ Major paper/project
10. ________ Program/course portfolios
11. ________ Capstone coursework
12. ________ Audio/video tape of presentations/performances

Assisting Academic Departments: Addressing Common Barriers “How do I know you won’t use this against me?” ▫Be consistent and firm in the message that assessment is not faculty evaluation and that results will be reported only in the aggregate ▫Partner with faculty willing to engage in the process and make their evidence public ▫Link assessment results to the allocation of resources, ideally through a strategic planning process

Assisting Academic Departments: Addressing Common Barriers “My students pass the tests. Why isn’t that good enough?” ▫Tests often measure only content knowledge ▫Learning = what students know (content knowledge) + what they can do with what they know (performance) ▫Grades are generally not linked to specific learning outcomes Modify course tests to measure learning outcomes by adding performance assessments

Modifying Tests to Gather Direct Evidence of Learning Identify questions on the test that provide evidence of a learning outcome: ▫Five questions that require the use of deductive reasoning to arrive at the right answer ▫Open-ended questions that require students to solve a unique problem given knowledge/skills learned Isolate those questions and look for patterns of performance: ▫The average grade in the class was a “B,” but 85% of the students missed four of the questions requiring deductive reasoning ▫70% of students were able to use a particular theory/approach to resolve the problem
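The item-level analysis sketched above can be done in a few lines of Python. This is a minimal sketch, not a prescribed tool: the question IDs, the mapping of questions to the deductive-reasoning outcome, and the scored responses below are all hypothetical, assuming each answer is recorded as 1 (correct) or 0 (incorrect).

```python
# Hypothetical mapping: the five questions tied to the deductive-reasoning outcome.
deductive_items = ["Q3", "Q7", "Q12", "Q15", "Q18"]

# Hypothetical scored responses: one dict per student (1 = correct, 0 = incorrect).
responses = [
    {"Q3": 0, "Q7": 1, "Q12": 0, "Q15": 0, "Q18": 0},
    {"Q3": 1, "Q7": 0, "Q12": 0, "Q15": 1, "Q18": 0},
    {"Q3": 0, "Q7": 0, "Q12": 1, "Q15": 0, "Q18": 0},
]

# Per-question miss rate: the share of students who answered incorrectly.
for item in deductive_items:
    misses = sum(1 - s[item] for s in responses)
    print(f"{item}: {misses / len(responses):.0%} missed")

# Share of students who missed four or more of the five target questions --
# the kind of pattern a course grade alone would hide.
heavy_miss = sum(
    1 for s in responses if sum(1 - s[q] for q in deductive_items) >= 4
)
print(f"{heavy_miss / len(responses):.0%} of students missed 4+ deductive items")
```

The point of the sketch is that the aggregation happens per question and per outcome, not per student, so a class that earns solid grades overall can still show a clear weakness on one targeted skill.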

Meaningful Evidence Situated within the institutional mission and context Addresses relevant questions Analyzed and interpreted in relation to other evidence

Meaningful Evidence: Facts + Context Fact: ▫National survey data indicate that seniors do not feel a sense of engagement and belonging on our campus.

Meaningful Evidence: Facts + Context Fact: ▫Seniors feel disengaged from our campus (national survey data) Fact: ▫Seniors would recommend this institution to other people (senior exit surveys)

Meaningful Evidence: Facts + Context Fact: ▫Seniors feel disengaged from our campus (national survey data) Fact: ▫Seniors would recommend this institution to other people (senior exit surveys) Context: ▫Over the past five years, an average of 82% of first-year alumni donated to the institution

Recognizing Meaningful Evidence How compelling is your evidence? Does it make you want to do something? Will it make others want to do something? How relevant is your evidence? To what is it linked: departmental mission, institutional initiatives? How trustworthy is your evidence? How was it gathered? Who does it represent? Is it one piece? Several pieces?

Recognizing Meaningful Evidence
           Compelling   Relevant   Trustworthy
High
Medium
Low

1. Read the Brinymead case study from the Accounting Department.
2. Complete the first grid for the capstone project.
3. Complete the second grid using the information from the CPA exam.

Meaningful Evidence: Example Senior exit surveys: ▫Indicate dissatisfaction with the amount of time spent on clinical skills Departmental assessment of skill ability and development finds that, of the critical skills required: ▫Students are outstanding on three, satisfactory on two, and not acceptable on two ▫Internship evaluations from supervisors consistently cite lack of ability in clinical skills

Meaningful Evidence: Qualitative Data Appropriate uses: ▫Exploring an issue in more depth ▫Answering specific questions about individual experience:  Ex: How are you different now than you were before?  Ex: How did living with a host family inform your understanding of the culture? ▫Including student voices

Qualitative Data Analysis: Open-Ended Questions Read the data Strip and code the data, looking for themes and patterns Present the data thematically -- it will “lead” you somewhere ▫Academic Advising ▫General Education ▫Student perceptions of particular courses
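Once the coding pass is done by hand, the tally-and-present step is mechanical. A minimal Python sketch, assuming each open-ended response has already been read and tagged with one or more theme labels (the comments and labels below are hypothetical):

```python
from collections import Counter

# Hypothetical coded comments: each response carries the themes
# assigned to it during the hand-coding pass.
coded_responses = [
    {"text": "My advisor never had time to meet.", "themes": ["Academic Advising"]},
    {"text": "Gen ed courses felt disconnected.", "themes": ["General Education"]},
    {"text": "220 assumed skills we never learned.",
     "themes": ["Course Preparation", "General Education"]},
]

# Tally how often each theme appears so patterns surface.
theme_counts = Counter(
    theme for response in coded_responses for theme in response["themes"]
)

# Present thematically, most frequent themes first.
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
```

The frequency ordering is what "lets the data lead you somewhere": the themes that recur most often become the headings under which you report the student voices.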

Qualitative Data Example “220 was a last semester course but I felt like a freshman! There was no way I knew all of that stuff.” “I thought I was going to fail 220 and I’m a good student.” “I didn’t know how to do anything in 220 and the instructor didn’t care. We kept saying we didn’t know but he just kept going. It was ridiculous.”

Qualitative Data Example Drill down into the data by asking pertinent questions: ▫What are the learning goals of 220? ▫How did students perform in 220? ▫What are the assumptions about students entering 220?  Skill level?  Knowledge base? Analyze the program curriculum map ▫Where do students learn prerequisite skills and/or knowledge? ▫How and where are program and course learning outcomes (expectations) assessed? Are they assessed?

Using Assessment Results Inform policy decisions Strategic allocation/reallocation of resources ▫Make changes in curriculum ▫Support new initiatives Accountability ▫Inform stakeholders about expectations and results ▫Improve teaching and learning on campus

Presenting Assessment Results Consider audience ▫Who are they? What’s important to them? How will they use assessment information in their lives? Appropriate presentation ▫Present data thematically ▫Link data and interpretations to institutional initiatives or departmental strategic planning (provide a context)

Assessing and Improving Assessment Reflect on your assessment Were the assessments reasonable and manageable? Did they answer your questions? Did they tell you something about student learning? Were you able to use the evidence you gathered?

Helpful Sources
Diamond, Robert M. (1998). Designing and Assessing Courses & Curricula.
Huba, Mary E., and Jann E. Freed (2000). Learner-Centered Assessment on College Campuses.
Maki, Peggy L. (2004). Assessing for Learning: Building a Sustainable Commitment Across the Institution.
Suskie, Linda (2004). Assessing Student Learning: A Common Sense Guide.
Walvoord, Barbara E. (2004). Assessment Clear and Simple.