Presentation on theme: "A Department of Geographical and Life Sciences b Learning and Teaching Enhancement Unit"— Presentation transcript:

1 IMPLEMENTING CAA IN CHEMISTRY: A CASE STUDY
Emilia Bertolo (a), Glenis Lambert (b)
a Department of Geographical and Life Sciences, emilia.bertolo@canterbury.ac.uk
b Learning and Teaching Enhancement Unit, glenis.lambert@canterbury.ac.uk

2 Summary of this talk
What we did:
- A CAA assignment (chemistry)
Why / how / when:
- Student feedback from the previous year
- Assessment structure
- Examples of CAA questions
What students thought:
- Data from 2005 (conference proceedings) and 2006 (new)
What we learnt:
- What did and didn't work in 2005; how we approached the 2006 exercise

3 Abstract
Level 1 Skills for Forensics Investigators: a CAA exercise focused on chemistry concepts.
Aims:
- Rapid feedback
- Enhance student engagement with the subject
- Reduce the lecturer's marking load

4 The problem
Could the assessment be improved?
The 2004 cohort had completed a paper-based assignment (short questions), BUT:
- The large number of students involved meant a considerable gap between the assignment and the lecturer's feedback
- Informal student feedback highlighted their difficulties in engaging with the subject

5 Was CAA the answer?
Jenkins (2004) Learning and Teaching in HE, 1, 67-80
- Advantages: repeatability, close connection between activity and feedback, flexibility of access, increased student motivation
- Pitfalls: development time, potential risks associated with hardware, software and administration, and the need for students to have appropriate computing skills
Lowry (2005) Chem. Ed. Research & Practice, 6 (4), 198-203
- A CAA system to provide chemistry support to Environmental Science students

6 What we did (I)
- Two tests: formative + summative, with several question types (some included images)
- The formative test could be taken several times, so as to:
  - familiarise students with the question types
  - spot unforeseen technical problems
  - allow students to enhance their chemistry knowledge (through feedback)

7 What we did (II)
- The summative test could only be accessed once (no feedback)
- Marks were released a few days after the end of the assessment period
- Student evaluation was carried out via on-line questionnaires
- 2005 (see conference proceedings): 83 students (feedback: 32 respondents, 38.5%)
- 2006 (new data): 73 students (feedback: 25 respondents, 34.2%)
A toy sketch of this two-test attempt policy follows.
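Purely as an illustration, here is a minimal Python sketch of the attempt policy described on the last two slides: unlimited formative attempts with feedback, a single summative attempt without. The class and identifiers are our own invention, not the actual VLE configuration.

```python
# Minimal sketch (not the actual VLE configuration) of the attempt policy
# described above: unlimited formative attempts, one summative attempt.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TestPolicy:
    name: str
    max_attempts: Optional[int]          # None means unlimited attempts
    show_feedback: bool
    attempts: dict = field(default_factory=dict)  # student id -> attempt count

    def can_attempt(self, student: str) -> bool:
        taken = self.attempts.get(student, 0)
        return self.max_attempts is None or taken < self.max_attempts

    def record_attempt(self, student: str) -> None:
        self.attempts[student] = self.attempts.get(student, 0) + 1

formative = TestPolicy("formative", max_attempts=None, show_feedback=True)
summative = TestPolicy("summative", max_attempts=1, show_feedback=False)

formative.record_attempt("s001")      # students may retry the formative test
formative.record_attempt("s001")
summative.record_attempt("s001")      # ...but get one summative attempt only
print(formative.can_attempt("s001"))  # True
print(summative.can_attempt("s001"))  # False
```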

8 The challenges (the learning technologist's perspective)
- Students could take the test off campus
- Students could take the test at any time
- There was no real way of knowing whether the person who took the test was in fact the student

9 The challenges (the lecturer's perspective)
- Lack of experience in designing online assessment
- The initial time investment necessary to design the tests
- Designing pedagogically sound questions
In 2006 the assessment was prepared using Respondus (in QTI format); a sketch of what such an item can look like follows.
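Respondus generates the QTI XML itself; purely for illustration, here is a minimal sketch of a QTI 1.2 multiple-choice item built with Python's standard library. The element names follow the QTI 1.x specification, but the identifiers, question text and scoring are our own, not the actual exported test.

```python
# Illustrative sketch of a QTI 1.2 multiple-choice item; identifiers and
# question text are our own, not the assessment the authors exported.
import xml.etree.ElementTree as ET

item = ET.Element("item", ident="chem_q1", title="Biological molecule in hair")

# Question stem
presentation = ET.SubElement(item, "presentation")
material = ET.SubElement(presentation, "material")
ET.SubElement(material, "mattext", texttype="text/plain").text = (
    "What is the main biological molecule present in hair?"
)

# Single-choice response block with three options
response = ET.SubElement(presentation, "response_lid",
                         ident="response1", rcardinality="Single")
choices = ET.SubElement(response, "render_choice")
for ident, label in [("A", "Keratin (a protein)"),
                     ("B", "Cellulose (a carbohydrate)"),
                     ("C", "Triglyceride (a lipid)")]:
    option = ET.SubElement(choices, "response_label", ident=ident)
    mat = ET.SubElement(option, "material")
    ET.SubElement(mat, "mattext", texttype="text/plain").text = label

# Scoring: award 1 point when option A is selected
resprocessing = ET.SubElement(item, "resprocessing")
outcomes = ET.SubElement(resprocessing, "outcomes")
ET.SubElement(outcomes, "decvar", varname="SCORE", vartype="Decimal")
condition = ET.SubElement(resprocessing, "respcondition")
condvar = ET.SubElement(condition, "conditionvar")
ET.SubElement(condvar, "varequal", respident="response1").text = "A"
ET.SubElement(condition, "setvar", action="Set", varname="SCORE").text = "1"

print(ET.tostring(item, encoding="unicode"))
```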

10 The test
The original questions (in 2004):
- What is the main biological molecule present in hair?
- What bonds are broken in the hydrolysis of this biological molecule?
- At the end of the hydrolysis, what is present in the solution (as solvent and as solute)?
One possible conversion into objective, auto-marked items is sketched below.
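To illustrate how open-ended paper questions of this kind can become objective CAA items with instant feedback, here is a hypothetical Python sketch. The wording, options and data structure are ours; the slides do not reproduce the actual online test.

```python
# Hypothetical conversion of the 2004 paper questions into objective items
# with per-question feedback; wording and structure are illustrative only.
QUESTIONS = [
    {"stem": "What is the main biological molecule present in hair?",
     "options": ["Keratin (a protein)", "Cellulose", "DNA"],
     "answer": 0,
     "feedback": "Hair is mostly keratin, a fibrous structural protein."},
    {"stem": "Which bonds are broken when this molecule is hydrolysed?",
     "options": ["Peptide bonds", "Glycosidic bonds", "Hydrogen bonds only"],
     "answer": 0,
     "feedback": "Hydrolysis of a protein cleaves its peptide (amide) bonds."},
    {"stem": "After complete hydrolysis, the solution contains:",
     "options": ["Water (solvent) and amino acids (solute)",
                 "Water (solvent) and monosaccharides (solute)",
                 "Amino acids (solvent) and water (solute)"],
     "answer": 0,
     "feedback": "The polypeptide is broken into free amino acids in water."},
]

def mark(responses):
    """Return the score and the feedback shown after a formative attempt."""
    score = sum(r == q["answer"] for r, q in zip(responses, QUESTIONS))
    return score, [q["feedback"] for q in QUESTIONS]

score, notes = mark([0, 1, 0])
print(score)   # 2 of 3 correct
print(notes)   # feedback strings released immediately in the formative test
```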

11 The e-learning advantage
- For the lecturer, the time invested in preparing the assessment was considerable, but it was spent in a creative way, as opposed to routine marking
- Students valued the short gap between the assessment and the release of the marks
- Students embraced the opportunity to use the formative test, trying it an average of four times before moving on to the summative test
Overall, the experience was positive for both staff and students

12 "The practice test was useful for understanding concepts from the lectures" [survey results chart]
Feedback included comments such as: "I thought the on-line chemistry test was extremely useful and it helped me understand chemistry more"

13 "I liked doing a computer assessment" [survey results chart]

14 "I would have preferred a paper-based assessment" [survey results chart]

15 "The assessment was well supported by staff" [survey results chart]

16 Evaluation of the software & students' computing skills
Data for 2005 (2006 results very similar) [survey results charts]

17 Some pitfalls
- Setting up the assessment was very time consuming
The assessment ran surprisingly smoothly, BUT:
- In 2005, 3 students tried to access the summative test before they were ready to take it
- Several technical problems arose from students accessing the assessment from outside the university network
- There were some issues regarding students' "last minute" working practices

18 Number of tests taken over the period of availability
[chart: number of tests taken per day, 03/11/2005 to 17/11/2005]
A sketch of how such daily counts can be tallied from an attempt log follows.
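As a hypothetical illustration of how a chart like this can be produced, the following Python sketch tallies attempts per calendar day from a list of timestamps. The log format and sample values are assumptions, not the 2005 data.

```python
# Hypothetical sketch: count test attempts per calendar day from ISO-format
# timestamps, as a VLE attempt log might provide. Sample values are invented.
from collections import Counter
from datetime import datetime

def attempts_per_day(timestamps):
    """Map each calendar day to the number of attempts logged on it."""
    days = (datetime.fromisoformat(ts).date() for ts in timestamps)
    return Counter(days)

log = [
    "2005-11-03T10:12:00",
    "2005-11-15T23:51:00",   # "last minute" attempts cluster late
    "2005-11-15T23:58:00",
]
for day, count in sorted(attempts_per_day(log).items()):
    print(day, count)
```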

19 Recommendations
- Check the assessment carefully: time consuming (and boring), but definitely worthwhile
- Discuss what to do if something goes wrong, but do not panic unnecessarily; if the assessment is robust, the process can run quite smoothly
- Establish clear boundaries regarding staff availability
- Do not make rushed judgements; wait until the assessment period has ended
- Consider whether you'll be able to support the test from outside the university network

20 …and finally
E. Bertolo would like to thank the staff from the Learning and Teaching Enhancement Unit for the support provided, and Dr Simon Clay for his help in revising the assessment.
- Good communication between the lecturer and the learning technologist is essential
- Always keep students informed

