Leading by Example: Assessing Institutional Research Office Outcomes
Rachel Baker & Elizabeth Dayton, Graduate School of Education, Stanford University
Andrew LaManque, Foothill-De Anza CCD
Katherine McLain, Cosumnes River College

Background
Institutional Research is becoming increasingly important for measuring institutional quality, as evidenced by new reporting requirements from federal and state governments and from accreditors. The senior institutions under WASC have a standard requiring an evaluation of the IR office (see "Periodic Reviews of the IR Office and WASC Criterion for Review 4.2," Heather Brown, Sutee Sujitparapitaya, Rebecca Sorell, and Michael Wrona, CAIR, 2013). While ACCJC does not have such an explicit requirement, the new standards assume a highly functional IR office.

Satisfaction, Outputs, and Outcomes
- Satisfaction: How do people feel about our work?
- Outputs: How many projects did we complete?
- Outcomes: Campus leaders know how to interpret and apply research; improvements are made based on results; a culture of inquiry develops.

Purposes of Assessing IR Outcomes
- Promotes improvement
- Provides confidence when advising others
- Assures quality to the campus community

FHDA IR AUOs 2012
1. Articulate 3 questions important to ask when starting a research project at FHDA.
2. Interpret and draw correct conclusions from a cross tabulation of descriptive statistics, such as course success rates by ethnicity.
3. Using data provided by the IR office, identify and describe 3 key attributes of students served by the employee's assigned work unit.
4. Utilize data provided by the IR office to enhance programs and services.
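The second AUO asks employees to read a cross tabulation correctly. As a minimal, hypothetical sketch (the records, group labels, and coding below are invented, not FHDA data), such a table can be built with pandas:

```python
import pandas as pd

# Invented enrollment records: one row per enrollment, with a success flag.
records = pd.DataFrame({
    "ethnicity": ["Group A", "Group A", "Group B", "Group B", "Group B", "Group C", "Group C"],
    "succeeded": [1, 0, 1, 1, 0, 1, 1],  # 1 = course success (hypothetical coding)
})

# Cross tabulation: share of successful enrollments within each group (rows sum to 1.0).
rates = pd.crosstab(records["ethnicity"], records["succeeded"], normalize="index")
print(rates)
```

Reading the result correctly, for example noting that Group B succeeds in 2 of 3 enrollments (about 67%) and that the counts behind each rate are small, is the kind of interpretation the AUO asks employees to demonstrate.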

IR AUO Assessment Cycle
1. Developed AUOs
2. Surveyed clients
3. Reflected on results / assessment of outcomes
4. Made changes / set new benchmarks
(Notes on the cycle: asked "test" questions; dropped one AUO.)

Structured Interviews (Spring)
We pursued two overarching questions:
1. Is the information received from IR useful? And specifically, is it presented in a way that is easily understood and applied?
2. How is information received from IR used to make decisions?

In discussing these questions, we often asked:
- How often are you in touch with IR? In what capacity? (e.g., meetings, presentations, emails)
- What kind of information do you tend to receive from IR? (i.e., tell me about the kinds of things you tend to learn from IR; do you have a recent example?)
- How do you tend to use the information you receive from IR? What decisions have you made based on information you have received from IR? Can you think of a specific example?
- How could the information you receive from IR be more useful? Are there ways in which the information IR provides you is mismatched with the kinds of questions you tend to have, or the kinds of decisions you need to make? Are there specific kinds of information you would especially appreciate from IR?
- Has your understanding of data, and how best to use it, developed through your interactions with IR? Do you feel you've learned to use data differently?
- What do you think IR does particularly well? Where do you think IR could best expand or improve its campus interactions?

Interview Findings
IR informs decision making (heard in 10/10 interviews)
- Scheduling and curriculum decisions: 6 interviews
- Hiring decisions: 1 interview
- By complementing faculty and administrators' "qualitative" knowledge: 8 interviews
There are opportunities to grow (10/10 interviews)
- IR is perceived to benefit some departments more than others: 4 interviews
- More longitudinal data would be valuable, both before and after enrollment: 4 interviews
- More readily accessed data could be valuable: 6 interviews
IR provides additional value (9/10 interviews)
- Supporting grants and funding: 6 interviews
- Providing individualized support: 8 interviews
- Facilitating productive communication: 2 interviews
IR identifies strengths and points of vulnerability (7/9 interviews)
- Curriculum and programs: 7 interviews
- Faculty: 2 interviews
- Students: 3 interviews

Assessment of Outcomes
Campus leaders know how to interpret and apply research
- 7/9 of those interviewed cited examples of IR skills that they had learned. (Goal: 90%)
IR data used to make improvements
- 9/9 of those interviewed cited examples of improvements made using IR data. (Goal: 100%)
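As a minimal sketch of the benchmark check behind this slide, the observed proportions can be compared with the stated goals; the outcome labels and helper code below are illustrative, not from the presentation:

```python
# (observed count, interviews, goal) for each outcome, taken from the slide above.
outcomes = {
    "Leaders can interpret and apply research": (7, 9, 0.90),
    "IR data used to make improvements":        (9, 9, 1.00),
}

for name, (hits, n, goal) in outcomes.items():
    rate = hits / n
    status = "met" if rate >= goal else "not met"
    print(f"{name}: {hits}/{n} = {rate:.0%} (goal {goal:.0%}) -> {status}")
```

Run as written, this shows that 7/9 is roughly 78%, short of the 90% goal, while 9/9 meets the 100% goal.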

Establishing a Culture of Inquiry
- Definition of culture: the behaviors and beliefs characteristic of a particular group.
- Definition of inquiry: the systematic investigation into a matter.
- Working definition of a culture of inquiry: an organization with a culture of inquiry is characterized by the regular and systematic investigation of practices and outcomes across the organization.
- Related outcome: the college community has the information needed to adequately assess and improve the effectiveness of its instructional and non-instructional programs and services.

Two Assessments
- Matrix to analyze data dissemination (BRIC TAP Inquiry Guide, p. 13)
- Administrator survey derived from the Strategic Data Project at Harvard University (survey rubric: resources/files/news-events/sdp-rubric-self-asssessment.pdf)

Data Integration Matrix
[Figure: Data Integration Strategy Matrix, arranging data-dissemination activities along two axes, IMPACT (low to high) and SCOPE (low to high). Activities plotted include: website postings, Research Office newsletter, action research teams, college research committee, sharing of actions taken based on data, information sessions, presentations, technical assistance, meetings, briefings, data integration workshops, research and assessment methods workshops, and research agenda development and implementation.]

Application of the Matrix
[Figure: Data Integration Strategy Matrix applied to the research office's own activities, again arranged by IMPACT (low to high) and SCOPE (low to high). Activities placed on the matrix include: research briefs, research web site, assessment reports, research studies, distribution of research, research projects, innovation grants, innovation grant reports, research consultation, research presentations, technical assistance resources (presentations and materials), assessment methods workshops, and research briefings to inform planning and to inform or follow up on management meeting dialog.]
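As an illustrative sketch only, the matrix can be treated as a simple classification of activities by impact and scope; the quadrant assignments below are placeholders, since the original slide layout is not recoverable from this transcript:

```python
from collections import defaultdict

# (activity, impact, scope); the ratings here are invented examples, not the presenters' placements.
activities = [
    ("Research briefs", "high", "high"),
    ("Research web site", "low", "high"),
    ("Assessment methods workshops", "high", "low"),
    ("Innovation grant reports", "low", "low"),
]

matrix = defaultdict(list)
for name, impact, scope in activities:
    matrix[(impact, scope)].append(name)

for (impact, scope), names in sorted(matrix.items()):
    print(f"impact={impact:<4} scope={scope:<4}: {', '.join(names)}")
```

Grouping activities this way makes it quick to spot empty or crowded quadrants, which speaks to the "quickly identifies possible areas of improvement" point on the next slide.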

Reflection on the Assessment Tool
Pros:
- Easy to apply
- Can be applied in a variety of contexts (types of activities, activity list/log, research agenda)
- Provides a new way to track and evaluate activities
- Quickly identifies possible areas of improvement
Cons:
- Evaluates outputs, not their impact
- May not span all research office activities
- Some activities span classification areas

Administrator Survey
- Questions derived from the 7 principles about the strategic use of data in the Strategic Use of Data Rubric
- Perception survey about: application of these principles at the college; evaluation of personal application of these principles; relative importance of Research Office data dissemination
- Pilot-tested this semester with the leadership team (n = 16)
- Survey instrument available at _Office/Research_Office_Web-based_Surveys.htm
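As a hedged sketch of how such a pilot might be summarized to surface relative strengths and challenges, mean ratings can be computed per principle; the principle names, the 1-5 scale, and the ratings below are placeholders, not the actual survey data:

```python
import random
import statistics

random.seed(1)
principles = [f"Principle {i}" for i in range(1, 8)]  # stand-ins for the 7 SDP principles

# Placeholder ratings: 16 respondents, each rating every principle on a 1-5 scale.
responses = {p: [random.randint(1, 5) for _ in range(16)] for p in principles}

for principle, ratings in responses.items():
    print(f"{principle}: mean rating = {statistics.mean(ratings):.2f}")
```

The lowest-rated principles would be flagged as potential areas of change and the highest-rated as relative strengths, which mirrors how the results are read on the following slides.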

Application of the Survey
Institution
- Strength: using data to set goals and monitor college performance
- Challenge: using data for program enhancement
Individual
- Challenge: using data to adjust budgets
Important research office activities
- Internal research disseminated in response to a request or related to a topic/issue being discussed
- Website, data briefings, and discussions about actions taken in response to research
Relatively unimportant research office activities
- External research related to a topic/issue being discussed

Reflection on the Survey Tool
Pros:
- Can be customized to different groups
- First implementation can set a baseline
- Can identify relative strengths and potential areas of change
- Open-ended questions provide a wealth of data
- Survey can lead to individual learning and meaningful dialog
Cons:
- Diversity in the responses may reflect diversity of roles
- Not appropriate for institution-wide assessment
- Indirect assessment
- Identifies potential growth areas, not specific interventions
- Many inputs contribute to the outcome

Questions?