Multi-State Collaborative for Learning Outcomes Assessment (MSC): an initiative sponsored by the Association of American Colleges and Universities and the State Higher Education Executive Officers


Multi-State Collaborative for Learning Outcomes Assessment (MSC)
An initiative sponsored by the Association of American Colleges and Universities and the State Higher Education Executive Officers
Pilot Year Aggregate Results

NOTE: These tables represent aggregate results of a pilot study involving samples of student work from 29 two-year and 24 four-year institutions in 9 states. Work products were assessed by faculty members using common rubrics developed as part of AAC&U's VALUE project; see the AAC&U VALUE project website for more information and the rubrics used. Because this was a pilot study, the results are not generalizable to all students in each participating state or nationally. The pilot sample included work products from 2,642 students sampled from 53 institutions in 9 states.

MSC Pilot by the Numbers
– 53 institutions uploaded artifacts (some uploads were handled by consortia): 52 public, 1 private
– Institutions in all 9 MSC states were represented
– By sector: 24 four-year, 29 two-year
– By Carnegie type: Associate's = 31, Master's = 15, Research = 7

MSC Pilot by the Numbers
– 7,215 pieces of student work were submitted; students had to be 75% of the way to completion
– 2,642 artifacts were scored twice (36.6%)
– Approximately 5% of artifacts (5.2%) were marked unscorable
– More than 1,100 assignments were submitted
– More than 100 scorers participated

Reporting
To come: online reporting interface
This week:
– Cover sheet
– Institution-level raw data in Excel
– Aggregate results
– Toolkit
– SPSS syntax

Taskstream Demo
What is to come...

This Week
– Cover sheet
– Aggregate results in PowerPoint
– Institutional data file
◦ Before analyzing institutional data
◦ Structure of the institutional data file
– Analyzing institutional data
◦ Research questions
◦ Excel
◦ SPSS

MSC Aggregate Tables
– Overview of the project (see also the cover sheet)
– Data show the MSC sample is relatively representative of MSC participants
◦ Slightly overrepresents Hispanic and traditional-age students
◦ Not representative of the nation or the states, and may or may not be similar to your institution's sample
– Breakout of assignment subject and artifacts by learning outcome

Things to Check About Your Data
– Is the sample representative of the population from which it was drawn?
◦ Demographic characteristics (race/ethnicity, gender, age, Pell status, etc.)
◦ Student characteristics (major, degree level, full-time/part-time status, etc.)
– Is the sample at your institution similar to the project sample? (using the same breakouts as above)
– What types of assignments were submitted?
– What is the distribution of the data? (see the syntax sketch after this list)
◦ Is it normally distributed around a mean?
◦ Are there outliers?
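Because the toolkit ships SPSS syntax, checks like these can be scripted. The following is a minimal sketch, not the toolkit's actual syntax; the variable names (race, gender, pell_status, cr_score) are invented placeholders for whatever your institutional data file actually contains.

* Frequencies for demographic breakouts (variable names are placeholders).
FREQUENCIES VARIABLES=race gender pell_status
  /ORDER=ANALYSIS.
* Distribution checks for one rubric-dimension score: descriptives plus histogram, normality plot, and boxplot.
EXAMINE VARIABLES=cr_score
  /PLOT BOXPLOT HISTOGRAM NPPLOT
  /STATISTICS DESCRIPTIVES.

The boxplot and the normality plot from EXAMINE answer the last two questions directly: obvious outliers show up in the boxplot, and departures from the diagonal in the normal Q-Q plot signal non-normality.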

Questions You May Consider Asking
– How did certain artifacts score on the dimensions for each learning outcome?
– How did certain assignments score on the dimensions for each learning outcome?
– How did faculty intent match scorer perceptions?
– How did students as a whole perform on the dimensions for each learning outcome?
– How do students at my institution compare to all students involved in the project?
– Is there a difference between how different groups performed? (independent-samples t-tests; see the sketch below)
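For the group-comparison question, here is a minimal independent-samples t-test sketch, again with invented names: sector (hypothetically coded 1 = two-year, 2 = four-year) and cr_score for one rubric-dimension score.

* Independent-samples t-test comparing mean scores across two groups.
T-TEST GROUPS=sector(1 2)
  /VARIABLES=cr_score
  /CRITERIA=CI(.95).

SPSS reports Levene's test alongside the t-test; check it before deciding whether to read the equal-variances or unequal-variances row of the output.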

Institutional Data File Structure
– Assignment data: data included in the assignment cover sheet. There are multiple artifacts for each assignment.
– Artifact data: data included in the artifact upload file. There are multiple score rows for each artifact.
– Score data: scores by dimension submitted by scorers. There is only one score per rubric dimension.
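A minimal sketch of what this nested, long-format structure could look like when read into SPSS; every identifier, dimension label, and score below is invented for illustration.

* Hypothetical long-format rows: one row per scorer per rubric dimension.
DATA LIST LIST (",")
  /assignment_id (F8.0) artifact_id (F8.0) scorer_id (F8.0)
  dimension (A24) score (F1.0).
BEGIN DATA
101, 5001, 9, "Explanation of issues", 3
101, 5001, 9, "Evidence", 2
101, 5001, 14, "Explanation of issues", 4
101, 5001, 14, "Evidence", 1
END DATA.
* Average the two scorers' ratings up to the artifact level.
AGGREGATE OUTFILE=* MODE=ADDVARIABLES
  /BREAK=artifact_id dimension
  /mean_score=MEAN(score).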

Excel Example

SPSS Example
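In the same spirit, a one-sample t-test sketch for the "how does my institution compare to the project?" question from the list above; cr_score is again a placeholder variable, and 2.5 is a placeholder standing in for whatever aggregate mean the MSC reports for that dimension.

* One-sample t-test against a published aggregate mean (placeholder value).
T-TEST TESTVAL=2.5
  /VARIABLES=cr_score
  /CRITERIA=CI(.95).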