 Data file preparation
 Item and scale scores
 Missing data
 Reporting the CEQ
 Change in 2010
 CEQ Q&A

 Course experience perceptions of graduates who completed coursework degrees
◦ Research degree graduates respond to the PREQ
 Feedback on up to two majors
◦ More responses than respondents
 Eleven scales underpinned by 49 Likert-type items
◦ Three core scales consisting of 13 items

 Good Teaching Scale (GTS) [6]
 Generic Skills Scale (GSS) [6]
 Overall Satisfaction Item (OSI) [1]
 Clear Goals and Standards Scale (CGS) [4]
 Appropriate Workload Scale (AWS) [4]
 Appropriate Assessment Scale (AAS) [3]
 Intellectual Motivation Scale (IMS) [4]
 Student Support Scale (SSS) [5]
 Graduate Qualities Scale (GQS) [6]
 Learning Resources Scale (LRS) [5]
 Learning Community Scale (LCS) [5]
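The scales and their bracketed item counts can be captured in a small lookup table; summing the counts confirms the 49 Likert-type items mentioned earlier. This is a minimal sketch — the variable name and layout are illustrative, not the official data file structure:

```python
# Illustrative lookup of CEQ scales to their item counts (from the slide above).
CEQ_SCALES = {
    "GTS": 6,   # Good Teaching Scale
    "GSS": 6,   # Generic Skills Scale
    "OSI": 1,   # Overall Satisfaction Item
    "CGS": 4,   # Clear Goals and Standards Scale
    "AWS": 4,   # Appropriate Workload Scale
    "AAS": 3,   # Appropriate Assessment Scale
    "IMS": 4,   # Intellectual Motivation Scale
    "SSS": 5,   # Student Support Scale
    "GQS": 6,   # Graduate Qualities Scale
    "LRS": 5,   # Learning Resources Scale
    "LCS": 5,   # Learning Community Scale
}

# Eleven scales underpinned by 49 Likert-type items in total.
assert len(CEQ_SCALES) == 11
assert sum(CEQ_SCALES.values()) == 49
```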

 Remove PREQ cases
 Remove cases with no CEQMAJ
◦ Imputation possible from MAJ1 and MAJ2
 Remove cases with no LEVEL
 Remove cases that do not fulfil these conditions:
◦ valid response to OSI, or
◦ at least four GTS item responses, or
◦ at least four GSS item responses
 All collection methods retained for 2010 CEQ
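The final retention rule above — keep a case only if it has a valid OSI response, or at least four GTS item responses, or at least four GSS item responses — can be sketched as a simple filter. Field names here are assumptions for illustration, not the official file layout:

```python
def keep_case(case):
    """Sketch of the CEQ case-retention rule.

    `case` is assumed to be a dict mapping item codes (e.g. "OSI",
    "GTS01", "GSS06") to a 1-5 response, or None for non-response.
    The PREQ, CEQMAJ, and LEVEL removals happen upstream of this check.
    """
    valid = (1, 2, 3, 4, 5)

    def n_answered(prefix):
        # Count valid responses to items whose code starts with `prefix`.
        return sum(1 for k, v in case.items()
                   if k.startswith(prefix) and v in valid)

    return (case.get("OSI") in valid
            or n_answered("GTS") >= 4
            or n_answered("GSS") >= 4)
```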

 113,523 respondents  131,603 responses GDSCEQ1CEQ2

 Common five-point response scale
◦ 1 = strongly disagree
◦ 2 = disagree
◦ 3 = neither agree nor disagree
◦ 4 = agree
◦ 5 = strongly agree
 Some items reverse coded
 CEQ reporting metrics:
◦ 1 = -100; 2 = -50; 3 = 0; 4 = 50; 5 = 100 (CEQ)
◦ 1 = 0; 2 = 0; 3 = 0; 4 = 100; 5 = 100 (PA)
◦ 1 = 0; 2 = 0; 3 = 100; 4 = 100; 5 = 100 (BA)
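The three reporting recodes can be expressed as small functions over the common five-point response. This is a sketch of the mappings listed above; the function names are mine, and I assume reverse-coded items are flipped (response → 6 − response) before these recodes are applied:

```python
def ceq_metric(response):
    """CEQ metric: linear recode of 1..5 onto -100..100."""
    return {1: -100, 2: -50, 3: 0, 4: 50, 5: 100}[response]

def percentage_agreement(response):
    """PA: 'agree' and 'strongly agree' count as 100, all else 0."""
    return 100 if response >= 4 else 0

def broad_agreement(response):
    """BA: the midpoint, 'agree', and 'strongly agree' count as 100."""
    return 100 if response >= 3 else 0
```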

 Scale score = mean of item scores
 Minimum number of item responses required:
◦ 1 for OSI
◦ 2 for AAS
◦ 3 for AWS, CGS, IMS
◦ 4 for GQS, GSS, GTS, LCS, LRS, SSS
 Item scores removed if scale score not computed
 CEQ scores are normally distributed
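Putting the two rules together — mean of the answered items, but only when the scale's minimum response count is met — gives a short scoring sketch. The minimums come from the slide above; everything else (names, None-for-missing convention) is an assumption for illustration:

```python
# Minimum item responses required to compute each scale score (per the slide).
MIN_ITEMS = {"OSI": 1, "AAS": 2, "AWS": 3, "CGS": 3, "IMS": 3,
             "GQS": 4, "GSS": 4, "GTS": 4, "LCS": 4, "LRS": 4, "SSS": 4}

def scale_score(scale, item_scores):
    """Mean of the answered item scores, or None if too few were answered.

    `item_scores` holds already-recoded values (e.g. -100..100), with
    None marking item non-response. Per the slide, when a scale score
    cannot be computed the contributing item scores are discarded too.
    """
    answered = [s for s in item_scores if s is not None]
    if len(answered) < MIN_ITEMS[scale]:
        return None  # scale score not computed
    return sum(answered) / len(answered)
```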

 Planned:
◦ Optional scale not included
◦ From < 0.1% to 1.8% of responses
 Unplanned:
◦ Item non-response

 START
◦ Resource library  Data files  CEQ  2010

 START
◦ Resource library  Reports  2010

 START
◦ Resource library  Data files  CEQ  2010
◦ Save target as…
◦ Save link as…

 Concern that graduates were misreading the direction of the response scale
◦ Positive comments accompanying negative scores: “It was awesome!!!”

 Previous scale: endpoint-only labels
 All points labelled following a resolution by the SRG at its July 2009 meeting

 Yearly changes in CEQ scores are typically incremental
 Sample composition consistent between years
 Response scale change flagged as a potential cause
 Discussion paper prepared
◦ Core items
◦ Hardcopy/online responses

 Denoting agreement/disagreement makes the valence of a positive/negative response more explicit (Weijters, 2010)
◦ Respondents generally have a desire to be agreeable (McClendon, 1991)
◦ A fully-labelled response scale results in an upward shift in the response distribution (Guterbock and Hubbard, 2000)
 Greater accessibility of labelled response categories is likely to cause a shift away from the midpoint (Simonson, 1989)

 ‘Neither agree nor disagree’ or ‘Undecided’?
 Ambiguity of an unlabelled midpoint

[Table: N, mean, and standard deviation for each of the thirteen core items (GTS01, GTS03, GTS10, GTS15, GTS16, GTS27; GSS06, GSS14, GSS23, GSS32, GSS42, GSS43; OSI49), reported in three column groups; N ≈ 71,000+ per item]

 Shift in response distribution likely due to labelling
◦ Upward shift from the midpoint (‘N’) to the fourth point (‘A’)
◦ Consistent with the literature
 2010 begins a new CEQ time series
 A positive development for the CEQ:
◦ More consistent responses
◦ More in line with the PREQ response scale
 CEQ review: response scale likely to be reassessed