ASSESSMENT OF CORE SKILLS/ GENERAL EDUCATION OUTCOMES Angelina Hill, PhD Associate Director, Office of Academic Assessment.


Levels of Assessment
 University-level outcome statements
 Program-level outcome statements
 Course/educational experience outcome statements

Activity  Identify the level of assessment that applies to your project  Roughly state the core skills that you aim to assess  Be as specific as you think is necessary

Activity  Program assessment:  What courses address these outcomes, and at what level (introduce, reinforce, enhance/master)?  Course assessment:  Where in the course is the outcome addressed (e.g., activities, assignments, tests), and at what level?

Assessment Instruments
 Design assessment instruments around the type of learning emphasized in the course.
 What do you want your students to learn?

Direct Methods of Assessment
 Course-embedded assessment (e.g., homework assignments, essays, locally developed tests)
 Grading with criteria or rubrics
 Comprehensive exams
 Senior thesis or major project
 Portfolio evaluation
 Pre- and post-tests
 Reflective journals
 Capstone projects
 Internal/external juried review of performances and exhibitions
 Internship and clinical evaluations
 National Major Field Achievement Tests
 GRE subject exams
 Certification and licensure exams

Indirect Methods of Assessment
 Survey of perceived outcome attainment by course (survey students or faculty)
 Departmental survey
 Exit interviews
 Alumni survey
 Employer survey
 Focus groups
 Job placement statistics
 Graduation and retention rates
 Percentage of students who study abroad

Assessment Instruments
 Some combination of indirect and direct measures is best (in an ideal world)
 Allows for converging evidence
 Example: students report low confidence in their ability to formulate a hypothesis, and they also perform below expectations on an embedded assignment asking them to formulate one

Assessment Instruments
 What measures do you use already?
 What additional measure(s) do you think would be useful?

Assessing Large Classes
 Challenges include:
 Giving rich individual feedback
 Managing the quantity of grading (yourself, and coordinating assistants)
 Avoiding testing that fosters shallow learning
 Assessing a diverse mix of students
 Avoiding plagiarism

Assessing Large Classes
 Some strategies:
 Scoring rubrics: complex products or behaviors can be examined efficiently
Identify the characteristics of what you are assessing
Describe the best work you could expect based on these (top category) and the worst (lowest category)
Develop descriptions of intermediate-level products that are meaningful to you
Use a scale such as 1 to 3 (novice, competent, exemplary) or 1 to 5 (unacceptable, marginal, competent, very competent, outstanding)
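A rubric like the one above can also be kept as data so that scoring is consistent across graders. The sketch below is a minimal illustration, not a prescribed tool: the level labels are the 1-to-5 labels named on the slide, while the criteria names and scores are hypothetical.

```python
# 5-level rubric labels from the slide; criteria and scores below are hypothetical.
LEVELS = {1: "unacceptable", 2: "marginal", 3: "competent",
          4: "very competent", 5: "outstanding"}

def overall_rating(criterion_scores):
    """Map the average of per-criterion scores (1-5) to a rubric label."""
    avg = sum(criterion_scores.values()) / len(criterion_scores)
    return LEVELS[round(avg)]

# One student artifact scored on three hypothetical criteria.
scores = {"thesis": 4, "evidence": 3, "organization": 5}
print(overall_rating(scores))  # average 4.0 -> "very competent"
```

Keeping the descriptors in one place makes it easy to hand the same rubric to multiple graders or teaching assistants.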

Assessing Large Classes
 Some strategies:
 Using samples of student work: random or performance-based
 Technology: clickers, web-campus quizzes and surveys, online discussion boards
 Automating the analysis process: setting up Excel templates; structuring exams to easily capture information
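The automation bullet above can be done with a short script instead of (or alongside) an Excel template, for example computing how many students answered each exam item correctly. The items, answer key, and responses below are hypothetical.

```python
from collections import defaultdict

def item_difficulty(responses, answer_key):
    """Proportion of students answering each item correctly.

    responses: list of dicts mapping item -> chosen answer.
    answer_key: dict mapping item -> correct answer.
    """
    correct = defaultdict(int)
    for resp in responses:
        for item, key in answer_key.items():
            if resp.get(item) == key:
                correct[item] += 1
    n = len(responses)
    return {item: correct[item] / n for item in answer_key}

# Hypothetical exam data: three students, two items.
key = {"Q1": "B", "Q2": "D"}
students = [
    {"Q1": "B", "Q2": "D"},
    {"Q1": "B", "Q2": "A"},
    {"Q1": "C", "Q2": "D"},
]
print(item_difficulty(students, key))  # each item answered correctly by 2 of 3 students
```

Structuring the exam so responses export cleanly (one row per student, one column per item) is what makes this kind of analysis nearly free.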

Assessing Large Classes
 Group projects
 Peer and self evaluations
 Evaluations by graduate assistants, faculty committees, etc.
Ensure that grading materials are understood by all staff
Run training sessions where they evaluate various levels of student artifacts

Assessing Large Classes
 Assess background knowledge early in the semester
 Use cumulative tasks, with formative feedback that guides effort on the next task
 Ask students to consider how topics relate to their discipline area

Assessing Multiple Sections
 Direct Measures
 Embed common questions in an exam for all sections
 Create a common writing assignment for all sections
 Create a multiple-choice test to give to all students at the end of the semester

Assessing Multiple Sections
 Direct Measures (cont.)
 Create a pre- and post-test to give to all students
 Compare different modes of delivery!
 Indirect Measures
 Survey students' perceived learning (SALG)
 Survey faculty
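A common pre- and post-test across sections reduces to a small calculation: the average score gain per section, which can then be compared across modes of delivery. The section names and scores below are hypothetical.

```python
from statistics import mean

def section_gains(scores):
    """Average post-minus-pre gain for each section.

    scores: dict mapping section name -> list of (pre, post) score pairs.
    """
    return {sec: mean(post - pre for pre, post in pairs)
            for sec, pairs in scores.items()}

# Hypothetical pre/post scores (percent correct) for two sections.
data = {
    "Sec A": [(40, 70), (55, 80)],
    "Sec B": [(50, 65), (45, 60)],
}
print(section_gains(data))  # Sec A gains 27.5 points on average, Sec B gains 15
```

The same gains can just as easily be grouped by delivery mode (face-to-face vs. online) rather than by section.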

Assessing Multiple Sections
 How do you get agreement in your department?

Assessing On-line Courses
 Formative assessment in on-line classes is key
 Assessment processes in on-line classes should:
 Enable students to self-monitor their progress
Stop the lecture after a set time; ask students to reflect, write down insights, and submit feedback as short notes

Assessing On-line Courses
 Gather regular feedback from students
Pose a question about teaching via e-mail and invite students to respond
Students can respond with personal e-mail
 Give regular feedback to students
One-sentence summaries, minute papers
Paper or project prospectus (a brief, structured first-draft plan)

Developing a Timeline
 How frequently can you feasibly measure each outcome?
 Or, how frequently can you feasibly use a specific instrument to measure an outcome?

Establishing Criteria
 Criteria help you make sense of your results
 May not be “right” the first time
 Expectations that are very low or very high yield less meaningful results

Closing the Loop
 Using results to make improvements is the ultimate goal!
 Can you do anything about met or unmet expectations? If not, why check whether they are met?