Presentation on theme: "The use of a computerized automated feedback system Trevor Barker Dept. Computer Science."— Presentation transcript:

1 The use of a computerized automated feedback system. Trevor Barker, Dept. of Computer Science

2 Contents Feedback considerations Approaches to feedback Automated feedback – Previous research Examples Discussion

3 Chickering and Gamson’s Seven Principles: Good practice in higher education… 1. Encourages contact between students and lecturers 2. Develops reciprocity and cooperation among students 3. Encourages active learning 4. Gives prompt feedback 5. Emphasises time on task 6. Communicates high expectations 7. Respects diverse talents and ways of learning

4 Gives prompt feedback Feedback must be prompt, but it must also be good, i.e. – Appropriate – Useful – Accurate – Individual – Fast – Facilitates feed-forward

5 Reasons for automated approaches to testing and learning Vast investment in infrastructure Availability of MLE systems such as UH Studynet Changes in the nature of Higher Education Online and distance education Increase in student numbers (SSR) Increasing pressures on time and cost

6 Previous research Computer-Adaptive Testing based on Item Response Theory (IRT) If a student answers a question correctly, the estimate of his/her ability is raised and a more difficult question is presented If a student answers a question incorrectly, the estimate of his/her ability is lowered and an easier question follows
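The raise-on-correct, lower-on-incorrect rule can be sketched with a simple Rasch (one-parameter IRT) model. The function names and the `step` size below are illustrative assumptions, not details of the system described in these slides:

```python
import math

def p_correct(ability: float, difficulty: float) -> float:
    """Rasch (1-parameter IRT) probability of answering correctly."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

def update_ability(ability: float, difficulty: float, correct: bool,
                   step: float = 0.5) -> float:
    """Raise the ability estimate after a correct answer and lower it
    after an incorrect one, in proportion to how surprising the
    outcome was under the current estimate."""
    observed = 1.0 if correct else 0.0
    return ability + step * (observed - p_correct(ability, difficulty))
```

A correct answer to a question at the student's estimated level moves the estimate up (e.g. from 0.0 to 0.25 with the default step); an incorrect one moves it down by the same amount.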

7 Previous research Computer-Adaptive Testing Computer-Based Tests (CBTs) mimic aspects of a paper-and-pencil test – Accuracy and speed of marking – Predefined set of questions presented to all participants, and thus questions are not tailored for each individual student Computer-Adaptive Tests (CATs) mimic aspects of an oral interview – Accuracy and speed of marking – Questions are dynamically selected according to student performance
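The dynamic selection that distinguishes a CAT from a CBT can be sketched as choosing the unasked item whose difficulty lies closest to the current ability estimate (the most informative item under a Rasch model). The item-bank structure here is a hypothetical illustration:

```python
# Hypothetical item bank: each question has an id and a Rasch difficulty.
ITEM_BANK = [
    {"id": 1, "difficulty": -1.0},
    {"id": 2, "difficulty": 0.0},
    {"id": 3, "difficulty": 1.5},
]

def next_question(ability: float, item_bank: list, asked: set) -> dict:
    """Pick the unasked question whose difficulty is closest to the
    student's current ability estimate."""
    candidates = [q for q in item_bank if q["id"] not in asked]
    return min(candidates, key=lambda q: abs(q["difficulty"] - ability))
```

In contrast, a CBT would simply iterate over a predefined question list regardless of performance.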

8 Benefits of the adaptive approach Questions that are too easy or too difficult are likely to – Be de-motivating – Provide little or no valuable information about student knowledge The CAT level identifies a unique boundary between what the student knows and what he or she does not know

9 Providing individual feedback based on CAT An application of the CAT approach is the provision of automated individual feedback This approach has been in operation for several years at the University of Hertfordshire in two BSc Computer Science modules Recently this model has been extended to make it easier to use on other modules

10 About the Feedback Learners received feedback on: – Overall proficiency level; – Performance in each topic; – Recommended topics for revision; – Cognitive level (Bloom) Feedback on assessment performance was initially made available to learners via a web-based application
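A minimal sketch of how a feedback report covering the items above (overall level, per-topic performance, revision recommendations, Bloom level) might be assembled; all names and the 60% revision threshold are assumptions for illustration, not the system's actual logic:

```python
def build_feedback(name: str, overall_level: int, topic_scores: dict,
                   bloom_level: str, threshold: float = 0.6) -> str:
    """Assemble a plain-text feedback report; topics scoring below
    the threshold are recommended for revision."""
    weak = [t for t, s in sorted(topic_scores.items()) if s < threshold]
    lines = [
        f"Feedback for {name}",
        f"Overall proficiency level: {overall_level}",
        f"Cognitive level reached (Bloom): {bloom_level}",
        "Performance by topic:",
    ]
    lines += [f"  {t}: {s:.0%}" for t, s in sorted(topic_scores.items())]
    if weak:
        lines.append("Recommended topics for revision: " + ", ".join(weak))
    return "\n".join(lines)
```

Such a report could then be rendered by a web application or, as in the later prototypes described below, sent by email.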

11 Bloom’s taxonomy

12 Example questions

13

14 Performance Summary

15 Points for Revision

16 Results: tutors’ opinions Tutors consider that, in many cases, the fast feedback provided by a CAT is as good as or better than that currently provided The link to Bloom’s levels was seen as positive The approach was considered efficient, possibly freeing time for other activities CAT was considered best as a formative tool, rather than for summative assessment Some tutors were concerned that the approach was ‘impersonal’ There is a need for a monitoring role for tutors, for practical and ethical reasons

17 Recent research The CAT automated feedback system has been extended from objective testing to include written and practical tests Testing and evaluation of the new system with approximately – 350 first-year BSc students (1 final practical test), – 120 second-year BSc students (2 written and practical tests), – 80 final-year BSc students (2 final practical tests) and – 70 MSc students (2 written tests)

18

19 First prototype

20

21 Detailed marking scheme for one question showing feedback

22 Converted manually into a simple database file
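The slides do not specify the file format; one plausible shape for such a simple database file is a flat CSV mapping each question section and mark band to its canned feedback. The schema, rows, and function names below are a hypothetical sketch:

```python
import csv
import io

# Hypothetical flat-file marking scheme: one row per question section
# and mark band, with the canned feedback text for that band.
SCHEME_CSV = """question,section,min_mark,max_mark,feedback
1,a,0,4,Revise the definition of normalisation.
1,a,5,10,Good grasp of normalisation.
"""

def load_scheme(text: str) -> list:
    """Read the flat-file marking scheme into a list of dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def feedback_for(scheme: list, question: str, section: str, mark: int) -> str:
    """Return the canned feedback for the band the awarded mark falls into."""
    for row in scheme:
        if (row["question"] == question and row["section"] == section
                and int(row["min_mark"]) <= mark <= int(row["max_mark"])):
            return row["feedback"]
    return ""
```

This matches the workflow described on the later "Added features" slide, where the system reads feedback from the assignment's database file based on the mark awarded in each section.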

23 Output from system: email
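Emailing the generated feedback might look like the following sketch using Python's standard library; the addresses and subject line are placeholders, not details taken from the system:

```python
from email.message import EmailMessage

def feedback_email(student_addr: str, body: str,
                   sender: str = "feedback@example.ac.uk") -> EmailMessage:
    """Wrap the generated feedback text in an email message.
    (Addresses here are placeholders; actually sending the message
    would use smtplib.SMTP(...).send_message(msg).)"""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = student_addr
    msg["Subject"] = "Assessment feedback"
    msg.set_content(body)
    return msg
```

Delivering feedback by email rather than only via the web application pushes the results to students without requiring them to log in and check.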

24

25

26 Later prototype

27 Final summary screen

28 Added features Markers are able to comment on the completeness of the hand-in – In this version, the hand-in information is presented to the marker, who may then make additional comments on the completeness or nature of the hand-in Feedback was determined by the system from the mark awarded in each section of a question, read from the database file for the assignment After all the question sections had been marked, the system presented a final summary screen so that the marker could check that the marks had been awarded accurately The marker can add further feedback at the end

29

30 Results Student attitude to feedback was good irrespective of score on the test – Useful – Fair – Convenient – Quantity – Quality The internal moderator was happy with the feedback Suggestions from the moderator were included in the next prototype

31 Latest version

32 Modifications The feedback database is easy to set up automatically Tutors can modify and add to the feedback for each question Additions to feedback are saved for re-use later

33 In summary Larger class sizes and greater use of online and distance assessment mean that feedback is often too slow and too general to be of any real use to learners Personalised automated feedback is likely to become increasingly important in the future; it is currently used in four modules at UH Learners and tutors accept the need for automated feedback, and most appreciate the benefits of such systems The system is being further developed to make it simpler for general use

