What to do with your data?

What to do with your data? Anne Carroll and John Ward

Two purposes:
1. To help programs demonstrate, for Middle States, evidence that data are used for program improvement.
2. To approach continuous improvement in a way that is meaningful for stakeholders.

Data related to:
Student learning outcomes: what students will know and be able to do when they have completed a program.
- Direct measures assess student work directly, such as tests, papers, presentations, and performances.
- Indirect measures capture findings about student perceptions, such as surveys.
Program outcomes: other goals for programs beyond learning outcomes, such as student retention, student participation in internships, student satisfaction with services, post-graduation employment, and faculty scholarship.

Assessment Cycle https://assessment.gwu.edu/assessment-cycle

The Assessment Template - Represents One Cycle

The Assessment Template - Represents One Cycle (mostly on Response)

The Assessment Template - Represents One Cycle (15-20 minutes at this point)

Two examples of measures:
- Program outcome example: Increase 1st-year retention. Benchmark: 76%.
- SLO (ILO) example: Apply skills in critical analysis and reasoning for the interpretation of data.
Discussion at tables, then whole group. These two examples are not our point, but they set the context for responding to data.
- How do these measures and benchmarks differ?
- In what ways do these measures and benchmarks lead to different kinds of responses?
(Facilitation note: John, get the rubric handout for critical thinking from GEAC. 5 minutes.)

What is the role of success criteria (also called targets or benchmarks)?
- Helps select priorities
- Helps keep you honest
- Not an absolute decision rule for action or inaction
- Success criteria should be set so as to "pass the embarrassment test" (Linda Suskie)
- Setting criteria should be an iterative process
(Now at 30-35 minutes. Whole group discussion after the title question.)

Success criteria: clear guidance for programs
- Clear: 90% of students will achieve 60% or higher on the test. Not clear: Students will achieve 60% or higher on the test.
- Clear: The program average on the standardized test will be above the state average. Not clear: Students will achieve the cut score on the standardized exam.
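
A clear criterion is one that can be checked mechanically against the data. A minimal sketch in Python, using a hypothetical list of test scores (not data from the workshop):

```python
# Checking a clear success criterion mechanically.
# Scores below are hypothetical illustration data.
scores = [72, 58, 81, 64, 90, 55, 77, 68, 83, 61]

# Clear criterion: "90% of students will achieve 60% or higher on the test."
passing_share = sum(s >= 60 for s in scores) / len(scores)
criterion_met = passing_share >= 0.90

print(f"n = {len(scores)}, share at 60%+ = {passing_share:.0%}, "
      f"criterion met: {criterion_met}")
```

The unclear version ("Students will achieve 60% or higher") cannot be translated into a check like this, because it never says how many students must clear the bar.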

Criteria help set priorities

The Assessment Template - Represents One Cycle (mostly on Response)

Summarizing the data: suggestions for the template
- Should be sufficient to convey essential information to a broad range of audiences
- Should be concise; remember that your report will be combined with reports from other programs
- Most reports will include a mean or some form of frequency count, the n, and whether the success criterion was met
- Add detail as needed to connect to your response to the data
(Sample handout here.)
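
As a concrete illustration, here is a minimal Python sketch of the summary most reports will include: a frequency count, the n, and whether the success criterion was met. The ratings and the 75% target are hypothetical, not workshop data:

```python
# Producing a one-line template summary: count, n, criterion met.
# Ratings are hypothetical illustration data.
ratings = ["Proficient", "Acceptable", "Proficient", "Proficient",
           "Acceptable", "Proficient", "Not Met", "Proficient"]

n = len(ratings)
proficient = sum(r == "Proficient" for r in ratings)
share = proficient / n
target = 0.75  # e.g., "75% of candidates will be rated Proficient"

print(f"Summary of findings: {proficient}/{n} rated Proficient ({share:.0%}); "
      f"success criterion (>= {target:.0%}): "
      f"{'met' if share >= target else 'not met'}")
```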

What do you like / dislike about the data summary examples? (Pages 3-4 of the handout.)
(At 50 minutes. Small groups, 5 minutes.)

The Assessment Template - Represents One Cycle (55 minutes)

Retention example or your own data
Goal: Increase 1st-year retention. Benchmark: 76%.
1. How would you represent this in "Summary of Findings"?
2. Are we meeting the criterion? Is this an item for action?
3. What do the data tell us about a response? What can be done in your program / area to help meet this goal?
(55 minutes. Group response, 10 minutes. Our example or your data.)
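
For question 2, the check itself is simple arithmetic. A minimal sketch, assuming hypothetical cohort counts (the workshop supplies only the 76% benchmark):

```python
# Is the 76% retention benchmark met? Cohort counts are hypothetical.
cohort_size = 520   # first-time students who enrolled in fall
retained = 388      # still enrolled the following fall

rate = retained / cohort_size
benchmark = 0.76

print(f"1st-year retention: {rate:.1%} (benchmark {benchmark:.0%}); "
      f"item for action: {rate < benchmark}")
```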

Legitimate responses to data (page 5 of the handout)
- Do nothing / continue to monitor the data
- Dig deeper into the data / triangulate with other data (external and internal)
- Revise assessments
- Change success criteria
- Change holistic rubrics to more analytic rubrics
- Change the measure
- Investigate best practices
- Make changes
- Analyze the effect of previous changes
- Identify changes you are already making that are consistent with the data
(Now at 65 minutes. Possibly part of the handout.)

All answers are good (except maybe not)
- Data are not useful for blame
- Don't let the perfect be the enemy of the good
- If you are perfect, we don't really believe it
- Discussion of a problem is a step toward action, not a goal in itself

Types of changes: the Middle States list
a. assisting students in improving their learning;
b. improving pedagogy and curriculum;
c. reviewing and revising academic programs and support services;
d. planning, conducting, and supporting a range of professional development activities;
e. planning and budgeting for the provision of academic programs and services;
f. informing appropriate constituents about the institution and its programs;
g. improving key indicators of student success, such as retention, graduation, and transfer;
h. implementing other processes designed to improve educational programs and services.

What questions, challenges, and frustrations have you experienced in using data for improvement?
(75 minutes. Small group, 5 minutes.)

Response to Data: common questions and issues
- What if the data are not credible or don't provide clarity?
- Build on strengths or correct problems?
- Should we use data to justify changes or to brainstorm needed changes?
- Who should be involved in different stages of the process?
- Outcome data show where to improve, but not how.
- We are already perfect; why improve?
- Should we make decisions based on data with a low n?
(Pick one at your table and develop an answer.)

Top-line principle: it is important to show both success and ways to improve (no matter how good we are).
- Demonstrate success. Standard V: "Assessment of student learning and achievement demonstrates that the institution's students have accomplished educational goals consistent with their program of study ..."
- Continuous improvement. Standard V.3: "consideration and use of assessment results for the improvement of educational effectiveness."
(Skip if getting late.)

What are your next steps?
You are tasked with writing the report, you have your summary of the data, and the report is due in one week. What are your next steps?
(Whole group.)

Response to data should include process
- The Summary of Findings should be communicated to stakeholders.
- The people who will actually enact change must be part of the discussion on the Response to Data.
- Example: "We will discuss the data at the department meeting and develop an action plan by March 1st."
- There should be accountability for follow-through on plans.
(Pick one at your table and develop an answer.)

Thank you

Low N - What is your response to the data?
SLO #6 (Instructional Strategies): The teacher candidate will be able to understand and use a variety of instructional strategies.
Year evaluated: 2017-2018 and annually thereafter.
Assessment instruments and methods: Measure 1 - final evaluation instruments from both supervisors and mentors for item 3b, Engaging Students.
Success criteria: 75% of candidates will be rated Proficient (top rating) on related items.
Summary of findings: In the spring semester we had only 3 student teachers. Of the 3, 2 were rated "Acceptable" and 1 was rated "Not Met."
Response to data: (this is the discussion question for your table)
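
One way to frame the low-n discussion quantitatively: a confidence interval for the observed proportion shows how little three observations pin down. The Wilson score interval used below is a standard statistical tool we add for illustration; it is not part of the workshop materials:

```python
# How fragile is a 0-of-3 result? Sketch using the Wilson score interval.
import math

def wilson_interval(successes: int, n: int, z: float = 1.96):
    """Approximate 95% Wilson score interval for a binomial proportion."""
    p_hat = successes / n
    denom = 1 + z**2 / n
    center = (p_hat + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2)) / denom
    return max(0.0, center - half), min(1.0, center + half)

low, high = wilson_interval(successes=0, n=3)  # 0 of 3 rated Proficient
print(f"Observed: 0% Proficient (n=3); 95% interval: {low:.0%} to {high:.0%}")
```

With n=3 the interval runs from 0% to roughly 56%: the data are far too thin to estimate the true rate precisely, though even the upper bound sits below the 75% criterion.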