Student Engagement Survey Middle East (SESME) Implementation Workshop Smarter Learning Symposia Shangri-La Hotel, Dubai Wednesday 24 April 2013 Professor Hamish Coates & Dr Sarah Richardson
Linking global insights into higher learning
Educational principles Academic standards Innovation Evidence Productivity Value added Participation Outcomes Work readiness Curriculum Capabilities Student focused Focused support Pathways Career-ready Multi-disciplinary Global education Human potential
Imagine an institution…
that identifies key indicators of quality – the things that really count
sets externally referenced and context-relevant standards of performance
collects quantitative data on performance
uses that data to highlight areas of strength and improve areas of weakness
provides information to potential students in an inspiring fashion
assures the public that minimum standards of performance are being met
Plan Act Evaluate Improve Hunch
institutional, educational, epistemological, conceptual, disciplinary, industrial, transnational, commercial, cultural, professional, practical, supranational, universal, ontological, metaphysical, pedagogical, situational, organisational, interpersonal, historical, aesthetic, political…
Little data Happiness data Effectiveness data Commitment to effectiveness Universal Elite Mass
Spotting areas of risk Producing a cultivating climate Profiling groups Responding to individuality Identifying unexpectedness Stimulating change Tracking change Change perspectives
An emerging global measure
Global benchmarking: student engagement measured systematically in the United States, Canada, Australia, New Zealand, Ireland, the United Kingdom, China, Japan, South Africa, etc.
Nearly 4 million students have participated in the National Survey of Student Engagement (NSSE)
Over 1,500 North American colleges and universities have participated in NSSE
More than 150,000 students have participated in the Australasian Survey of Student Engagement (AUSSE)
After lunch…
Refine definition of ‘student engagement’ and how we will measure this in the Middle East
Look at reporting for institutions
Review approach to pilot implementation
Chart next steps
Continuing consultative development Initial meetings in Qatar in November 2011 and October 2012 Constitute Advisory Group to oversee development Establish communications architectures Students, staff, institutions and other stakeholders will provide feedback Invite institutions to participate in the pilot survey fieldwork
What is student engagement? Student-centred perspective of learning Encompasses both academic and ‘beyond classroom’ activities and conditions Assumption that individuals learn and develop through involvement with key educational practices and activities Grounded in decades of research Links between student engagement and retention, completion and positive learning outcomes
Developing questionnaire for the pilot MESEQ will be based on current version of NSSE MESEQ will be adapted to be appropriate for the Middle East context Feedback will be sought from the sector and Advisory Group to further develop the MESEQ Survey will be discussed with students in focus groups, interviews and small scale tests Based on feedback and findings from consultation and focus groups, MESEQ will be further revised
Student life cycle: Shaping aspirations → Admission and integration → Involvement and retention → Graduate transitions
Outcome measures: Higher-Order Thinking, General Learning, General Development, Career Readiness, Average Overall Grade, Departure Intention, Overall Satisfaction, Future Intentions
Engagement scales: Academic Challenge, Active Learning, Student and Staff Interactions, Enriching Educational Experiences, Supportive Learning Environment, Work Integrated Learning
Academic Challenge Extent to which expectations and assessments challenge students to learn Time spent preparing for class Amount of reading and writing Institutional expectations Focus of coursework
Academic Challenge Hours spent preparing for class
Active Learning Students’ efforts to actively construct their knowledge Asking questions/contributing to discussions Giving presentations Working with other students Participating in community-based learning
Active Learning Frequently participated in active learning
Student and Staff Interactions Level and nature of students’ contact with teaching staff Receiving feedback Discussing grades and assignments Discussing ideas from classes Discussing career plans Working with teaching staff
Student and Staff Interactions ‘Never’ interacted with staff
Enriching Educational Experiences Participation in broadening educational activities Interacting with people of different backgrounds Participating in extracurricular activities Taking part in a practicum or internship Doing volunteer work Studying a foreign language Participating in a learning community
Enriching Educational Experiences Participated in broadening activities
Supportive Learning Environment Students’ feelings of support within the university community Quality of relationships with students and staff Academic support provision Non-academic support provision Support to socialise
Supportive Learning Environment Quality of relationships
Demographics and contexts Demographic and context questions Students’ gender Year level Mode and type of study Discipline Residency/citizenship status Home language Ethnicity Living arrangements Age
Demographics and contexts Average engagement by mode of study
Demographics and contexts Average engagement by field of study
Localisation for equivalency
Source → Translation → National review → Verification → National review → Final check
Shaping continuing improvement…? Your reactions to the current draft instrument? Further improvements you recommend? Language and cultural issues? Value of different questions?
How is data on engagement used? Responding to accountability and transparency Assessing and improving student learning, curriculum and teaching practices Creating partnerships through benchmarking Improving student retention and success Institutional audits and performance reviews Evidence-based conversations about improving student engagement
Benchmarking results Within an individual institution Between individual institutions Between groups of institutions With national level findings
Benchmarking between institutions Average Active Learning scale scores
Benchmarking between countries Average scale scores
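As a minimal sketch of how between-institution benchmarks of this kind could be computed, the snippet below averages scale scores per institution and scale. The institution names, scale label, and scores are entirely hypothetical, and real benchmarking would of course involve weighting and far larger samples.

```python
from collections import defaultdict

# Hypothetical response records: (institution, engagement scale, score out of 100)
responses = [
    ("Institution A", "Active Learning", 52.0),
    ("Institution A", "Active Learning", 61.0),
    ("Institution B", "Active Learning", 47.5),
    ("Institution B", "Active Learning", 55.5),
]

# Accumulate a running (sum, count) per institution-and-scale pair
sums = defaultdict(lambda: [0.0, 0])
for institution, scale, score in responses:
    totals = sums[(institution, scale)]
    totals[0] += score
    totals[1] += 1

# Average scale score per institution, the quantity plotted in benchmark charts
benchmarks = {key: total / count for key, (total, count) in sums.items()}

for (institution, scale), average in sorted(benchmarks.items()):
    print(f"{institution} | {scale}: {average:.1f}")
```

The same grouping logic extends directly to benchmarking by country, mode of study, or field of study by changing the grouping key.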
Pilot fieldwork
Following finalisation of the MESEQ, the survey will be piloted with participating institutions
Pilot survey will be conducted in English and Arabic
Survey will be conducted primarily online, with some paper survey forms used as required
All data returned to ACER for processing and reporting
Psychometric analysis and reporting
Following fieldwork, ACER prepares the overall data file
ACER conducts psychometric analysis of the survey and its scales
Each participating institution receives its own data file and customised benchmark report
An overall report will be prepared and published
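One standard check in psychometric analysis of a multi-item scale is internal-consistency reliability, commonly summarised by Cronbach's alpha. The sketch below is illustrative only: the three-item scale and the 0–3 frequency ratings are hypothetical, and this is not a description of ACER's actual analysis pipeline.

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).

    item_scores: one inner list per item, each holding one score per respondent.
    """
    k = len(item_scores)                      # number of items in the scale
    n = len(item_scores[0])                   # number of respondents
    item_variances = [pvariance(item) for item in item_scores]
    totals = [sum(item[i] for item in item_scores) for i in range(n)]
    total_variance = pvariance(totals)
    return k / (k - 1) * (1 - sum(item_variances) / total_variance)

# Hypothetical responses from five students to a three-item scale (0-3 ratings)
items = [
    [3, 2, 3, 1, 2],
    [2, 2, 3, 0, 2],
    [3, 1, 3, 1, 1],
]
print(f"alpha = {cronbach_alpha(items):.2f}")
```

Values of alpha near or above 0.7–0.8 are conventionally read as acceptable internal consistency for a scale, though judgements always depend on the scale's purpose and length.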
Review of SESME pilot Based on the psychometric analyses some changes may be made to the instrument Based on feedback from participating institutions, survey methods will be refined All participating institutions will be asked to provide feedback on their experience of being involved in the pilot All feedback and analyses will feed into the development of the next SESME
SESME pilot timeline (April to February)
24 April: Smarter Learning Symposia
Consultation on instrument
Focus groups
Finalise instrument
Language adaptation and translation
Pre-pilot preparations
Pilot fieldwork (students participate in survey)
Data file and report preparation
Review of SESME pilot