Programme level outcomes
Increased usage of appropriate technology-enhanced assessment and feedback, leading to:
– Change in the nature of assessment
– Efficiencies, and improvement of assessment quality
– Enhancement of the student and staff experience
Clearly articulated business cases
Models of sustainable institutional support, and guidance on costs and benefits
Evidence of impact – on staff and students, workload and satisfaction
Strand A goals and objectives
– Improved student learning and progression
– Increased efficiency
– Enhanced learning and teaching practice
– Integrated strategies, policies & processes
Overarching goals from Strand A projects, synthesised from their bid documents.
Deliverables
A:
– Baseline report
– Summary of previous work in the area
– Evaluation report
– Range of assets – evidence of impact
– Guidance and support materials
B:
– Evaluation report
– Range of assets – evidence of impact
– Short briefing paper summarising the innovation and benefits
C:
– Description of user scenarios
– Descriptions of the technical model
– Open source widgets and code
– Developer guidelines
– Documentation for users
– Active community of users
– Short summary of the innovation
www.jisc.ac.uk/assessmentandfeedback #JISCASSESS
Programme and support team
Programme Support Team
– Critical Friends
– Evaluation Support
– Synthesis
– Programme Team
– Support Co-ordinator
What are we learning about technology-enhanced assessment and feedback practices?
Why baseline? Programme level
– View of landscape & direction of travel
– Validate aims & rationale
– Shared understanding
– Identify synergies with other work
– Deliver effective support
Why baseline? Project level
– View of landscape & direction of travel
– Validate scope
– Confirm/identify challenges
– Identify stakeholders
– Manage & communicate scope
– Challenge myths
– Identify readiness for change
– Show evidence of improvement
An important stage of engagement/ownership
Sources of baseline evidence
– structured and semi-structured interviews (some video)
– workshops and focus groups
– process maps
– rich pictures
– institutional (and devolved) strategy & policy documents
– institutional QA documentation
– reports by QAA, OFSTED & external examiners
– course evaluations
– student surveys
– quantitative analysis of key data sets
– data from research projects
– questionnaires
Issues: strategy / policy / principles
– Formal strategy/policy documents lag behind current thinking
– Educational principles are rarely enshrined in strategy/policy
– Devolved responsibility makes it difficult to achieve parity of learner experience
Issues: stakeholder engagement
– Learners are not often actively engaged in developing practice
– Assessment and feedback practice does not reflect the reality of working life
– Administrative staff are often left out of the dialogue
Findings: assessment and feedback practice
– Traditional forms such as essays/exams still predominate
– Timeliness of feedback is an issue
– Curriculum design issues inhibit longitudinal development
Activity
Decide whether you agree or disagree with each of the statements on the previous slides (as being representative of mainstream practice in the sector).
– If you agree: give examples of what can be done about it
– If you disagree: give examples of evidence to the contrary
Evidence and evaluation projects – Strand B
– EBEAM – University of Huddersfield
– EEVS – University of Hertfordshire
– EFFECT – University of Dundee
– The evaluation of Assessment Diaries and GradeMark – University of Glamorgan
– OCME – University of Exeter
– MACE – University of Westminster
– SGC4L – University of Edinburgh
Timings
– 11.15 – 11.35: Participants move round all 3 rooms to look at the 7 posters and have short introductory discussions with projects. Identify 3 projects you'd like to know more about.
– 11.35 – 11.50: Discussion with Project 1
– 11.50 – 12.05: Discussion with Project 2
– 12.05 – 12.20: Discussion with Project 3
Rooms
Proceed:
– Evaluating the Benefits of Electronic Assessment Management (EBEAM project), Cath Ellis, University of Huddersfield
– Online Coursework Management Evaluation (OCME project), Anka Djordjevic, University of Exeter
– The Evaluation of Assessment Diaries and GradeMark at the University of Glamorgan, Karen Fitzgibbon and Sue Stocking, University of Glamorgan
Propel 1:
– Making Assessment Count Evaluation (MACE project), Gunter Saunders and Peter Chatterton, University of Westminster; Mark Kerrigan, University of Greenwich; and Loretta Newman-Ford, Cardiff Metropolitan University
– Evaluating feedback for e-learning: centralized tutors (EFFECT project), Aileen McGuigan, University of Dundee
Propel 2:
– Student-Generated Content for Learning: Enhancing Engagement, Feedback and Performance (SGC4L project), Judy Hardy, University of Edinburgh
– Evaluating Electronic Voting Systems for Enhancing Student Experience (EEVS project), Amanda Jefferies, University of Hertfordshire