Issues in assessing alternative final year dissertations and capstone projects: the PASS perspective Peter Hartley, University of Bradford


1 Issues in assessing alternative final year dissertations and capstone projects: the PASS perspective
Peter Hartley, University of Bradford
p.hartley@bradford.ac.uk
http://www.brad.ac.uk/educational-development/aboutus/team/Full_details_27414_en.php

2 This input
o Programme-based ideas and approaches.
o Programme-based assessment (the PASS project).
o Comparing assessment environments (the TESTA project).
o Applications and implications, e.g. work at Brunel.
o Specific implications for the final year ‘capstone’.

3 Assessment is a problem
See the PASS Project Issues Paper:
http://www.pebblepad.co.uk/bradford/viewasset.aspx?oid=260486&type=file
o Please comment, feed back and use.
Would highlight:
o Assessment ‘drives and channels’ learning.
o What/why are we measuring: the ‘slowly learnt’ problem.
o Limitations of grading (e.g. marks are not numbers).
o Implications for course structures/regulations.

4 Programme-based assessment: PASS
NTFS group project over 3 years:
o Two years of development and investigation and one year of implementation.
Consortium:
o Led by Bradford;
o two CETLs – ASKE and AfL;
o plus Exeter, Plymouth and Leeds Met;
o plus critical friends.

5 What are we investigating?
How to design an effective, efficient, inclusive and sustainable assessment strategy that delivers the key course/programme outcomes.

6 Learning from interesting friends

7 TESTA project
NTFS group project with 4 partners:
‘aims to improve the quality of student learning through addressing programme-level assessment.’
Starting from an audit of current practice on nine programmes:
o surveyed students using focus groups and the AEQ (Assessment Experience Questionnaire – Graham Gibbs et al.);
o also used a tool to identify programme-level ‘assessment environments’ (Gibbs).

8 Consistent practice?
Characterising programme-level assessment environments that support learning, by Graham Gibbs and Harriet Dunbar-Goddet.
Published in: Assessment & Evaluation in Higher Education, Volume 34, Issue 4, August 2009, pages 481–489.

9 Data from TESTA

10–11 The need for strategy
An example finding from Gibbs:
o ‘greater explicitness of goals and standards was not associated with students experiencing the goals and standards to be clearer’
And what did make a difference?
o Formative-only assessment;
o more oral feedback;
o students ‘came to understand standards through many cycles of practice and feedback’.

12 PASS outputs to date
o General literature review.
o Students’ views of assessment strategies.
o Assessment issues paper.
o Medical school case study.
o Inclusive assessment.
o Survey of staff attitudes.

13 Outputs in progress
o Assessment types at professional level.
o Survey of practice across the UK and an international perspective.
o Further case studies.
o ‘Manifesto’/position paper.

14 Issues to disentangle include:
o Defining PBA (programme-based assessment).
o Assessment environments and their impact.
o Effective regulatory frameworks.
o Staff perspectives and workload.
o Student perceptions and expectations.
o How to develop an effective strategic approach.
o Grading and credit (and the ‘best’ relationships between them).

15 Defining assessment: a challenge
Programme outcomes “need to be assessed in complex, multidimensional student performances” (Rogers, Mentkowski, & Reisetter Hart, 2006, p. 498).
How do students define/perceive their performance?
o e.g. what makes you a ‘First Class Engineer’?

16 Student perceptions and concerns
o Perceptions of ‘the course’ are variable;
o assessment is experienced as ‘fragmented’.
BUT there are anxieties about the move to more integrated assessment:
o perceived risk in terms of performance;
o concerns about feedback and timing.

17 Searching for types

19 An example: Peninsula Medical School
Case study already available. Includes:
o four assessment modules that run through the five-year undergraduate medical programme and are not linked directly to specific areas of teaching;
o a focus on high-quality learning (Mattick and Knight, 2007).

20 Further case studies being explored
Brunel:
o new regulations which separate study and assessment blocks.
Liverpool Hope:
o new regulations which ‘abandon modules’ in all undergraduate programmes;
o ‘Key Honours Assessment’.

21 Brunel: the regs
o 120 credits per year of study.
o A course/programme can include a mix of study, assessment and modular blocks.
o Option blocks must be modular.
o Blocks must be in multiples of 5 credits.
o The maximum assessment block is 40 credits.

22–23 Examples from Brunel
Biomedical Sciences:
o study and assessment blocks in all years;
o assessment load cut by two thirds, generating more time for class contact;
o a synoptic exam in all three years.
Mathematics:
o conventional modules in the final year only;
o improved understanding and ‘carry-over’ of ‘the basics’ into year 2.

24 And back to rethinking the diss?
o Where does it fit into the overall assessment strategy and environment?
o How/where does it integrate?
o How far does it succeed as a capstone?
o Do the institutional rules/regulations help or hinder? Can they flex/change?
o How do students see and approach it?

25 And finally… the assessment/identity interface
Students as ‘conscientious consumers’ (Higgins et al., 2002).
But personal identity acts as a ‘mediator’:
o e.g. the apprentice (‘feedback is a useful tool’) cf. the victim (‘feedback is another burden’).
So do we need to change the mindsets of some students?

26 And finally on PASS…
Visit the website:
o www.pass.brad.ac.uk
Contact us at:
o pass@bradford.ac.uk

