Multi-State Collaborative
Ashley Finley ◦ Senior Director of Assessment and Research – AAC&U ◦ Finley@aacu.org
Bonnie Orcutt ◦ Director of Learning Outcomes Assessment – MA Dept of Higher Education ◦ Borcutt@bhe.mass.edu
Peter Ewell ◦ Vice President – National Center for Higher Education Management Systems (NCHEMS)
Gary Pike ◦ Director, Information Management and Institutional Research ◦ Professor of Higher Education and Student Affairs – IUPUI
Represent a population of students ◦ In order to make statements or draw inferences about that population
Represent specific populations of students
Ask important questions about who should be represented in a sample
To develop a campus-level process for building representative samples over time
To develop a protocol for samples representative enough to move the process forward…
◦ Decent sample size
◦ Decent amount of randomization
◦ Decent amount of diversification across courses and faculty
Managing campus- and state-level expectations
Reasonable levels of representation
Recognizing sources of bias and ways to improve the process – currently and moving forward
Sampling (process) vs. sample (noun)
Sampling frame ◦ A list from which you can draw a sample
Element = Artifact = Student work
How it all fits together: Population → Sampling Frame → Sample → Elements
Students who have completed a minimum of 75% of credits toward graduation
Student work must be completed in fall 2014 (but can be completed at any time during the semester)
Who's eligible:
◦ Students in associate or bachelor's degree programs
◦ Transfer and non-transfer students
◦ Full- and part-time students
◦ Enrollment in day and evening courses
◦ Enrollment in online, hybrid/blended, and face-to-face courses
Targeted minimum of 75-100 independent artifacts
◦ Limit of one outcome assessed per artifact
◦ Limit of 7-10 artifacts TOTAL from any one faculty member
◦ Limit of one artifact per student
Institutions in a consortium must agree on a common sampling procedure
Institutions willing and able to collect a larger sample are encouraged to do so
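These parameters lend themselves to a mechanical check once candidate artifacts have been identified. A minimal sketch (not part of the MSC materials) is below, assuming hypothetical record fields student_id and faculty_id on each candidate artifact:

```python
import random

MAX_PER_FACULTY = 10   # upper end of the 7-10 artifact limit per faculty member
TARGET_MIN = 75        # low end of the 75-100 artifact target

def draw_artifacts(candidates, seed=None):
    """Randomly draw artifacts while enforcing the per-student and per-faculty limits."""
    rng = random.Random(seed)
    pool = list(candidates)
    rng.shuffle(pool)                       # randomize the order of candidate artifacts

    seen_students, faculty_counts, sample = set(), {}, []
    for artifact in pool:
        sid, fid = artifact["student_id"], artifact["faculty_id"]
        if sid in seen_students:
            continue                        # limit of one artifact per student
        if faculty_counts.get(fid, 0) >= MAX_PER_FACULTY:
            continue                        # limit of artifacts from any one faculty member
        sample.append(artifact)
        seen_students.add(sid)
        faculty_counts[fid] = faculty_counts.get(fid, 0) + 1

    if len(sample) < TARGET_MIN:
        print(f"Warning: {len(sample)} artifacts drawn; target minimum is {TARGET_MIN}")
    return sample
```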
Start with students
◦ Identify a list of students who are eligible for the pilot
◦ Then find courses containing those students (course schedules)
◦ Then ask faculty teaching those courses to participate
Start with faculty
◦ Identify a list of faculty who teach a large number of students eligible for the pilot
◦ Then identify courses taught by those faculty and ask them to participate
◦ Then identify eligible students in those courses
Start with courses
◦ Identify a list of courses that contain the highest number of students eligible for the pilot
◦ Then identify the faculty who teach those courses and ask them to participate
◦ Then identify eligible students in those courses
Eligible Student Population: students who have completed 75%+ of the credits required to graduate, as of a campus-determined census date
Sampling Frame: identify courses in which eligible students are enrolled with willing faculty
Sampling Process: simple random sampling using a computerized program, manual, or systematic approach…consistent with the sampling parameter limitations
Sample Elements: student work completed by the students in your sample
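Read as a procedure, the pipeline is: filter the population to eligible students, build the frame from courses with willing faculty and eligible enrollees, then draw a simple random sample. A minimal sketch under assumed field names (credits_completed, credits_required, faculty_id, enrolled_student_ids), none of which come from the MSC documents:

```python
import random

def eligible_population(students, cutoff=0.75):
    """Eligible student population: 75%+ of the credits required to graduate
    (the campus-determined census date is assumed to be applied upstream)."""
    return [s for s in students
            if s["credits_completed"] >= cutoff * s["credits_required"]]

def build_sampling_frame(courses, eligible_ids, willing_faculty_ids):
    """Sampling frame: courses taught by willing faculty that enroll eligible students."""
    return [c for c in courses
            if c["faculty_id"] in willing_faculty_ids
            and any(sid in eligible_ids for sid in c["enrolled_student_ids"])]

def simple_random_sample(artifacts, n, seed=2014):
    """Sampling process: a simple random sample of n elements (student artifacts)."""
    rng = random.Random(seed)
    return rng.sample(artifacts, min(n, len(artifacts)))
```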
Sample student characteristics for MSC pilot institutions:
◦ Gender
◦ Race/ethnicity
◦ Major or program
◦ Pell eligibility
◦ Age
Institutions may opt to collect additional student characteristics beyond what the MSC requests. Links back to general sampling goals.
Stages for Development
Stage 1:
◦ Submit draft of plans by June 9, 2014
◦ Feedback by June 30
Stage 2:
◦ Final sampling plans due by July 20, 2014
Stage 3:
◦ Submit documentation detailing the sampling process, difficulties of implementation, where the sampling process deviated from the plan and why, plans for improvement, and other observations
Stage 1 and Stage 2: Reporting Document and Evaluative Tools
Is our sampling plan implementable?
Does our plan build in mechanisms for generating a diverse and representative sample?
Where might sampling biases be introduced?
Stage 3: Reporting Documents and Evaluative Tools
Overview of the implemented sampling process & difficulties encountered
What facilitated or impeded reaching or exceeding the targeted sample size?
Where did the sampling process deviate from the plan, and why?
What biases does your sample reflect?
What might be learned from the actual sample generated?
What are the characteristics of the sample relative to the characteristics of the total eligible population?
What was learned from the sampling process?
What plans for improvement?
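The sample-versus-population comparison for the Stage 3 report can be generated from simple proportion tables over the MSC characteristics. A sketch, assuming each record is a dict keyed by hypothetical field names such as race_ethnicity and pell_eligible:

```python
from collections import Counter

CHARACTERISTICS = ["gender", "race_ethnicity", "major", "pell_eligible", "age_band"]

def proportions(records, field):
    """Share of records falling in each category of one characteristic."""
    counts = Counter(r[field] for r in records)
    total = sum(counts.values()) or 1
    return {category: n / total for category, n in counts.items()}

def compare_sample_to_population(sample, eligible_population):
    """Print sample vs. eligible-population proportions for each characteristic,
    highlighting where the sample may be unrepresentative."""
    for field in CHARACTERISTICS:
        samp = proportions(sample, field)
        pop = proportions(eligible_population, field)
        for category in sorted(set(samp) | set(pop), key=str):
            gap = samp.get(category, 0) - pop.get(category, 0)
            print(f"{field}/{category}: sample {samp.get(category, 0):.1%}, "
                  f"population {pop.get(category, 0):.1%}, gap {gap:+.1%}")
```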
What if all of the assignments come from only one or two departments?
What if we can't gather 75 artifacts for one of the outcomes?
What if an assignment doesn't address all of the elements of the rubric?
What if the sample does not look at all like the total eligible student population?
Who can my campus contact with questions?
What types of support are available to assist campuses in developing and/or implementing sampling plans and processes?
Additional MSC Webinars
Sampling for the Pilot Study ◦ May 21, 4-5pm (ET) and May 22, 4-5pm (ET)
Multi-State Assessment: IRB & Student Consent ◦ May 28, 5-6pm (ET) and June 3, 4-5pm (ET)
VALUE Rubrics ◦ Date and Time TBD
Coding, Formatting, Submitting: Using Taskstream ◦ Date and Time TBD (late summer)
Webinars will be recorded and posted to: http://www.sheeo.org/msc
Webinars already posted: Welcome to the MSC, Pilot Study Overview, Assignment Design
Questions?