
1 Data: What Do We Mean and What Kind Do We Need?
Facilitator: Chris Hill, Grossmont College
Presenters: Elaine Kuo, Foothill College; Mallory Newell, De Anza College

2 State of Accreditation Today
American Council on Education recommendations on accreditation:
1. Transparency – accreditation documents; student outcomes
2. Metrics and measures – improvement of data; common, comparative measures
3. Defined pathways to student success – increased focus on matriculation and student support; incentives for completion
4. New pedagogies and new entities – MOOCs; assure comparable quality of online learning
5. Adaptive accreditation reviews – flexibility for expedited reviews of sound institutions
6. Enhanced gatekeeping – more pressure on accreditors and institutions to maintain quality between site visits
From ACCJC News, Fall 2012

3 State of Accreditation Today
From June 2011 to June 2012, the ACCJC took the following institutional actions:
1. 28 institutions had a comprehensive evaluation
2. 12 institutions were reaffirmed
3. 10 institutions were placed on warning
4. 5 institutions were placed on probation
5. 1 institution was placed on show cause
From ACCJC Report of Institutional Actions from the June 2011, January 2012, and June 2012 Commission Meetings

4 WHAT BRINGS YOU HERE?

5 Overview of Session
◦ What does ACCJC say about data and evidence?
◦ What types of data/evidence are needed?
◦ Where and how are data discussed?
◦ How are data used for continuous improvement?

6 WHAT DOES ACCJC SAY ABOUT DATA AND EVIDENCE?

7 Characteristics of Evidence
Good evidence used in evaluations has the following characteristics:
1. It is intentional, and a dialogue about its meaning and relevance has taken place.
2. It is purposeful, designed to answer questions the institution has raised.
3. It has been interpreted and reflected upon, not just offered up in its raw or unanalyzed form.
4. It is integrated and presented in a context of other information about the institution that creates a holistic view of the institution or program.
5. It is cumulative and is corroborated by multiple sources of data.
6. It is coherent and sound enough to provide guidance for improvement.
Source: ACCJC/WASC Guide to Evaluating Institutions, 2009

8 Data 101 – Principles
1. Use longitudinal data when possible
2. Use data in context
3. Look for both direct and indirect data
4. Do not oversimplify cause and effect of data
5. Use appropriate levels of data for appropriate levels of decisions
6. Perception is the reality within which people operate
7. Use of data should be transparent
8. Consider carefully when to aggregate or disaggregate data
9. Focus on data that are actionable
10. Consider implications and the "What if?"
From Data 101: Guiding Principles for Faculty (ASCCC, 2010)

9 WHAT TYPES OF DATA ARE NEEDED?

10 Data Sources
External:
◦ Accountability Reporting for California Community Colleges (ARCC)
◦ Community College Survey of Student Engagement (CCSSE)
◦ CSU Analytics / UC Statfinder
◦ Department of Education – DataQuest
◦ Career Technical Education Outcomes Survey
◦ CCCCO DataMart
◦ NCES (National Center for Education Statistics)
◦ CPEC (California Postsecondary Education Commission)
◦ CalPASS
Internal:
◦ Annual Master Plan Updates
◦ Degree and Transfer Data
◦ Annual Factsheets by Program
◦ Annual Program Review Data
◦ Student Learning and Service Outcomes Assessments
◦ Key Performance Indicators (KPIs)
◦ Environmental Scans
◦ Surveys
◦ Ad hoc research studies

11 Information Capacity Challenges
Building an Evidence-based Infrastructure
◦ Thinking about everything you do as a research study
◦ Working with your researcher and department to collect data
◦ Sharing the data with your department and/or campus
Keeping Up with the Demand
◦ Setting aside a time each year to discuss the data in your area
◦ Linking research to (resource) planning
◦ Making changes to your program based on the data
Turning Data into Action
◦ Ask questions, share it, refine it, ask for more
◦ Keep track of all the great things you are doing with the data
◦ Share with others who may be data-weary
From Fulks, Hasson, and Mahon – 2010 AI presentation

12 Helpful Hints
◦ Disaggregate the data by ethnicity! (A minimal code sketch follows this list.)
◦ Collect data and evidence on an annual basis to inform your accreditation cycle; you don't have to collect it all in year 6.
◦ Monitor the college's progress on completing its planning agendas on an annual basis.
◦ Assess the college's institution-level outcomes on an annual basis to stay on track within your cycle, and don't forget about your AUOs as well.
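
How such disaggregation might look in practice is sketched below, assuming a hypothetical table of student records with an "ethnicity" column and a binary "completed" outcome; the column names and data are illustrative placeholders, not drawn from any college's system.

    # Hypothetical sketch: disaggregating an outcome rate by ethnicity.
    # Column names and values are illustrative placeholders.
    import pandas as pd

    records = pd.DataFrame({
        "ethnicity": ["African American", "Asian/PI", "Latino", "White", "Latino"],
        "completed": [1, 1, 0, 1, 1],
    })

    overall_rate = records["completed"].mean()

    # Rate and headcount per group, plus each group's gap from the overall rate.
    by_group = records.groupby("ethnicity")["completed"].agg(rate="mean", n="count")
    by_group["gap_vs_overall"] = by_group["rate"] - overall_rate

    print(f"Overall completion rate: {overall_rate:.1%}")
    print(by_group)

Reporting the headcount n alongside each rate matters: rates computed from small subgroups swing widely from year to year and should be interpreted with caution.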

13 Examples of evidence related to student learning
Information on Student Achievement (i.e., student progress through the institution):
Is there evidence that the college has the capacity to collect, does collect, and uses in its own evaluation and planning processes, data on student achievement? Is there evidence that the college does so regularly? Is this data disaggregated by subpopulations where appropriate? This includes data on:
◦ Student demographics
◦ Student preparedness for college, including performance on placement tests and/or placement
◦ Student needs (i.e., local employment training needs, transfer education needs, basic skills needs, etc.)
◦ Course completion data
◦ Data on student progression to the next course/next level of course
◦ Data on student program (major) completion
◦ Data on student graduation rates
◦ Data on student transfer to a four-year institution
◦ Data on student job placements
◦ Data on licensure exams (scores, pass rates)

14 Examples of evidence related to student learning (cont'd)
Information on Student Learning Outcomes (i.e., student mastery of the knowledge, skills, abilities, competencies, etc. identified by those designing the educational experience of the college):
Is there evidence that student learning outcomes are defined?
◦ By course
◦ By program
◦ By degree (including General Education requirements)
Is there evidence there was dialogue about the SLOs?
◦ Prior to development
◦ As part of developing integrated educational services and courses/programs
◦ As part of institutional self-evaluation, planning, and improvement
◦ At the appropriate level of inclusion for the SLOs for courses, programs, and degrees (is it evident that SLOs are "tracked" from courses, through programs, and to certificates and degrees?)
◦ In terms of how institutional processes can be oriented to better support learning

15 Examples of evidence related to student learning (cont'd)
Information on Student Learning Outcomes (i.e., student mastery of the knowledge, skills, abilities, competencies, etc. identified by those designing the educational experience of the college):
Is there evidence the SLOs are measured and the measurements are analyzed in order to:
◦ Inform pedagogy and help improve the educational services
◦ Evaluate institutional effectiveness and plan institutional improvements
Such evidence might include:
◦ The rubrics created to describe SLOs and related measurement strategies
◦ The ways in which specific pedagogical practices are changed in response to analyses of SLO attainment
◦ Analyses of SLO attainment used in the Program/Unit Review process to improve student learning, programs, and services
Is there evidence that students are learning?
◦ Samples of student work
◦ Copies of summary data on measured student learning outcomes

16 Course-Level Data: Math Performance Success – Course Success Rates by Ethnicity
(Chart: 2001-02 through 2011-12. From FHDA IR&P)

17 Institution-Level Data: Online Course Success Rates by Ethnicity

18 Survey-Based Data – CCSSE: How many hours do you spend in a 7-day week preparing for class?
(Chart: full-time students who spend over 21 hours per week preparing for classes – studying, homework, rehearsing, reading, writing – split by students who have taken or plan to take a developmental course. N: Developmental, DA=619, FH=290; Non-developmental, DA=306, FH=302.)

19 Survey-Based Data – CCSSE: How often do you use academic advising/planning?
(Chart: includes students who selected "often" or "sometimes." "Other" includes Native American, Other, and Decline to state. Total respondents: DA=1,286, FH=904. DA: African American=25, Asian/PI=663, Latino=215, White=224, Other=139. FH: African American=33, Asian/PI=329, Latino=154, White=286, Other=102. From FHDA IR&P, CCSSE 2012.)

20 Program-Level Data: Labor Projections and Program Completions – Radiologic Technology
(Chart from EMSI, 2012)

21 Service-Area-Level Data: Vocational Training Projections
One-third of jobs will require training involving up to one year's worth of experience, training, and/or instruction.
From Tim Nadreau, EMSI

22 HOW ARE DATA AND ASSESSMENT RESULTS COMMUNICATED, DISCUSSED, AND USED FOR CONTINUOUS IMPROVEMENT?

23 Institutional Data Fills Two Important Gaps
(Diagram: classroom and service-area assessment informs a Strategic Planning Function that closes the Knowledge Gap, raising the institution from a standard to an elevated level of organizational awareness (poor planning to good planning); institutional outcomes and benchmarks inform an Institutional Effectiveness & Student Success Function that closes the Performance Gap (good planning to good performance), yielding improved processes and performance.)
From Fulks, Hasson, and Mahon – 2010 AI presentation

24 Review of data and evidence is most meaningful when it informs decision making at the proper place of practice.
1,000 ft Perspective: Institutional Strategies; Resource Allocation; Institutional Policies; System Structures
100 ft Perspective: Program Improvements; Program Alignment; Program Redesign; Program Curriculum
On the Ground: Classroom Innovation; Pedagogy; Course Redesign; Innovations in Learning
From Gregory Stoup, Pinpointing Areas to Improve (2012)

25 Where are data communicated and discussed?
Department: Professional Development Activities; Department Meetings; Program Review; Student Learning Outcomes Assessment; Annual Planning Documents
College and/or District: Committee/Council meetings; Institutional Planning Documents; Email/Newsletters; Factsheets/Environmental Scans; Institutional Website

26 Cañada College Student Performance and Equity Dashboard, developed and maintained by the Office of Planning, Research and Student Success
From Gregory Stoup, Pinpointing Areas to Improve (2012)

27 DETAILED TABLES
1. Successful Course Completion Rates
2. Fall-to-Spring Persistence
3. Fall-to-Fall Persistence
4. Student Success Rates during their first year
5. Success Rates in Gen Ed Courses
6. Success Rates in CTE Courses
7. Success Rates in Pre-Transfer Courses
8. Success Rates in ESL Courses
9. Six-Year Degree Completion Rates
10. Six-Year Certificate Completion Rates
11. Median Number of Years to Degree
12. Average # of Credits Accumulated after 1 Year
13. Average # of Credits Accumulated after 2 Years
14. Pct Placed into BS Math & taking BS math in first term
15. Pct Placed into BS Math & taking BS math in first term
16. Pct Placed into BS Math & taking BS math in first term
From Gregory Stoup, Pinpointing Areas to Improve (2012)

28 Five-Year Trend in Successful Course Completion Rates (Cañada College, 2007/08 – 2011/12)
Why this matters: Course completion is perhaps the most widely used and reported indicator of student academic achievement. Higher levels of course completion are associated with higher levels of degree and certificate completion. This report highlights that for Cañada students, course completion rates vary widely by both student ethnicity and student age.
Course Completion Rate: also referred to as the college-wide course pass rate or the college success rate, an aggregation of student course-taking performance. The success rate is the percentage of grades awarded that indicate successful course completion, namely a grade of A, B, C, P, or CR.
From Gregory Stoup, Pinpointing Areas to Improve (2012)
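
Since the definition above is simple arithmetic (successful grades divided by all grades awarded), a minimal sketch of the computation follows; the DataFrame and its "year" and "grade" columns are hypothetical stand-ins for a college's grade records, not any real dataset.

    # Hypothetical sketch: course success rate per the definition above,
    # i.e., the share of awarded grades that are A, B, C, P, or CR.
    import pandas as pd

    SUCCESS_GRADES = {"A", "B", "C", "P", "CR"}

    grades = pd.DataFrame({
        "year":  ["2007/08", "2007/08", "2008/09", "2008/09", "2008/09"],
        "grade": ["A", "D", "C", "W", "CR"],
    })

    # Note: whether symbols like W count in the denominator follows local
    # reporting rules; here every recorded grade row is counted.
    grades["successful"] = grades["grade"].isin(SUCCESS_GRADES)

    # College-wide rate, then the trend by academic year.
    print(f"Overall success rate: {grades['successful'].mean():.1%}")
    print(grades.groupby("year")["successful"].mean())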

29 Feedback from college constituencies: College-wide Successful Course Completion Rate
Summary of the discussion of findings (College Planning Council and Student Equity Committee, reviewed September 2012):
◦ Why the overall decline over this period? What are the underlying forces driving the trend, and what are their various magnitudes? How much of this is due to demographic changes? Budget? Policy? Hiring patterns? Process reengineering?
◦ The 25-percentage-point gap by ethnicity is not acceptable. We must understand where we can best target efforts to pull up the lowest-performing groups.
◦ We need to embark on a path of modest gains in terms of both closing the gap and improving the overall college average.
Next steps (College Planning Council and Student Equity Committee):
◦ OPRSS will identify completion rate trends within the college's most enrolled courses. How do the rates of different courses compare? Did student enrollment shift toward courses with traditionally lower completion rates over this period?
◦ CPC will review Program Review data that provide completion rate trends by department.
◦ OPRSS will provide completion information on cross-sectional groups, including ethnicity by gender, ethnicity by age, ethnicity by geography, and ethnicity by financial aid status, the goal being to identify specific populations where focused interventions can be deployed and be most effective.
From Gregory Stoup, Pinpointing Areas to Improve (2012)

30 Setting objectives related to college mission and goals

31 Documenting your process and progress

32 Example – Planning Agenda Progress and Completion Template

33 Action Research Guided Questions
Developing the Research Agenda
1. What and who will be researched?
2. How is research tied to college plans, goals, initiatives, and/or activities?
3. How will the information be used, by whom, and how often?
4. Which methodology or approach will be used?
Turning Data into Information
1. What do the data tell us?
2. Which questions were fully answered by the research, and which need more exploration?
3. What are reasonable benchmarks based on the research?
Taking Action on the Information
1. What interventions or strategies do we need to deploy in order to move the needle?
2. How should this information be shared and applied across the college?
From Fulks, Hasson, and Mahon – 2010 AI presentation

34 WHAT IS YOUR BIGGEST TAKEAWAY FROM THIS SESSION?

