AASCU Academic Affairs Summer Meeting, Portland, Oregon ▪ July 28-31, 2011. Christine M. Keller, VSA Executive Director



An initiative by public 4-year universities, sponsored by AASCU & APLU, to supply comparable, transparent information on the undergraduate student experience through a common web report – the College Portrait.

– Provide a streamlined college search tool for prospective students, families, and high school counselors
– Provide a mechanism for public institutions to demonstrate transparency and accountability
– Support institutions in the measurement and reporting of student learning outcomes through original research and by providing a forum for collaboration and exchange

SLO pilot: Directly measure and publicly report student learning gains at the institution level using a common method
– Skills: critical thinking, analytic reasoning, problem solving, written communication
– Tests: CAAP, CLA, ETS Proficiency Profile
– Reported: At, Above, or Below Expected
2012: 4-year trial period ends; evaluate what works, what doesn't work, and next steps

SLO measurement and reporting will continue to be an essential element of the VSA. How can the section be improved to better meet the needs of participating institutions and external stakeholders within the current environment?

– Interviews with key internal and external stakeholders – National Institute for Learning Outcomes Assessment (NILOA)
– Focus groups and conference sessions
– Surveys
– Advisory panels and working groups

As of July 2011, 109 College Portraits (34%) have published SLO results: CLA N=80, ETS Proficiency Profile N=18, CAAP N=11.
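These figures can be cross-checked with a quick calculation (a minimal sketch; the ~320 total College Portraits is inferred here from the 34% figure and is not stated on the slide):

```python
# Published SLO results by test, as reported in July 2011
published_by_test = {"CLA": 80, "ETS Proficiency Profile": 18, "CAAP": 11}

# Total portraits with published SLO results
published = sum(published_by_test.values())
print(published)  # 109

# 109 portraits are said to be 34% of all College Portraits,
# implying roughly 320 portraits overall (inferred, not on the slide)
implied_total = round(published / 0.34)
print(implied_total)
```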

Short survey to gauge progress and gather feedback from VSA contacts; ~200 responses in June/July
– 61% institutional research
– 16% academic affairs
– 13% assessment
– 10% other

83% administered at least one test
– 23% CAAP
– 37% ETS Proficiency Profile
– 51% CLA
60% posted results on the College Portrait
92% plan to continue using one of the tests

Selected comments, VSA SLO Survey, 7/5/2011:
– "Helpful as overall indicator of student progress … powerful stimulus to other more local efforts"
– "Benchmarks to compare skill levels of our students with other campuses"
– "Good faith effort to be transparent"
– "Dovetails with institutional priorities"

Selected comments, VSA SLO Survey, 7/5/2011:
– "Lack of faculty buy-in … small sample sizes provide limited value at operational level"
– "Major limitation of VSA, please change this section … add AAC&U rubrics"
– "Testing protocol is negative because of cost (cash/staff time) with no useful results"
– "Unfortunately I don't think many people look at this section"

Measure                                 Used               Effective
AAC&U VALUE Rubrics                     39% (52 of 135)    53% (43 of 80)
Electronic Portfolios                   68% (105 of 155)   58% (71 of 122)
ETS Major Field Test                    61% (90 of 147)    64% (68 of 102)
Professional Licensure/Certification    93% (161 of 173)   87% (135 of 155)
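The percentages follow from the raw counts; a minimal sketch recomputing them (values are rounded to the nearest whole percent, so one or two may differ by a point from the slide's own rounding):

```python
# Raw counts from the slide: (measure, used_yes, used_n, effective_yes, effective_n)
rows = [
    ("AAC&U VALUE Rubrics", 52, 135, 43, 80),
    ("Electronic Portfolios", 105, 155, 71, 122),
    ("ETS Major Field Test", 90, 147, 68, 102),
    ("Professional Licensure/Certification", 161, 173, 135, 155),
]

for name, used, used_n, eff, eff_n in rows:
    # Recompute each share as a whole percent
    print(f"{name}: used {used/used_n:.0%}, rated effective {eff/eff_n:.0%}")
```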

– Gather feedback on efficacy of the SLO pilot study (Summer/Fall 2011)
– Assemble brainstorming group to consider learning outcomes in the current context (Fall 2011)
– Convene technical workgroup to evaluate options (early Spring 2012)
– Work with senior advisory panel and VSA Board to recommend next steps and future directions (late Spring 2012)

– Questions about the evaluation plan or the current status of VSA/SLO
– Your input on strengths, weaknesses, and recommendations for the future of student learning measurement and reporting within the VSA

– What are the benefits of the current VSA approach to measuring and reporting SLO? Has it been effective with external stakeholder groups?
– What are some of the challenges or barriers of the current approach?
– How can the current approach be expanded or modified to better address current needs, challenges, and external demands?

Websites
Christine Keller, VSA Executive Director
Wendell Hall, VSA Assistant Director
Elspeth Payne, VSA Project Coordinator