
1 VET in Schools: Strengthening delivery and assessment outcomes A POWERPOINT PRESENTATION DEVELOPED FOR THE NATIONAL QUALITY COUNCIL TO SUPPORT INFORMATION SESSIONS FOR VET IN SCHOOLS PRACTITIONERS April 2011

2 Contact NQC Secretariat TVET Australia Level 21/390 St Kilda Road Melbourne VIC 3004 Telephone: +61 3 9832 8100 Email: nqc.secretariat@tvetaustralia.com.au Web: www.nqc.tvetaustralia.com.au

3 Disclaimer This work has been produced on behalf of the National Quality Council with funding provided through the Australian Government Department of Education, Employment and Workplace Relations and state and territory governments. The views expressed herein are not necessarily those of the Australian Government or state and territory governments.

4 Acknowledgement This PowerPoint presentation was designed to support the interactive information sessions that formed part of the NQC’s communication and dissemination strategy to support VET in Schools: Strengthening delivery and assessment outcomes. The presentation was developed by Shelley Gillis, Victoria University; Andrea Bateman, Bateman & Giles Pty Ltd; and Chloe Dyson, Dyson & Associates Pty Ltd. The full report may be downloaded from the NQC website at: www.nqc.tvetaustralia.com.au/nqc_publications/publications/assessment

5 Setting the Scene  Concerns about the quality of assessments and comparability of standards across the VET sector (including VETiS)  OECD (2008) Reviews of VET (Australia)  NQC (2008) Industry Expectations of VET  Skills Australia (2010) Creating a future direction for VET  Service Skills SA (2010) VETiS Project

6 Service Skills SA VETiS Report Methodology  Consultation with VETiS providers and employers across Australia  Aim to examine how to maximise career pathways and employment options from VETiS programs Findings  Concern about quality of VETiS delivery and employment outcomes  Highlighted inconsistencies and lack of agreed standards within and between jurisdictions across Australia

7 Service Skills SA

8 Our Consultations CONSULTATIONS  ACCI, AMWU, AIG, DEEWR, STA, ISC FINDINGS Assessment Related  Teachers need help to understand that assessment in VET is different from school-based assessment (ACCI).  Assessments should be against industry standards, not curriculum or academic standards (AMWU).  Students should be assessed against a range of conditions, not just those required by the local work placement (AMWU).  Even where schools cannot enter the workplace, they should look beyond the classroom and replicate the conditions and context of the workplace as closely as possible (AIG).

9 Consultations Curriculum based  At the national level, greater consultation and liaison is required between ACARA and ISCs when developing VET-based curriculum to ensure industry standards are met (DEEWR).  VETiS should not be seen as a mechanism simply to engage students at risk of dropping out; otherwise standards may fall (STA). Industry Engagement  Although some teachers will be experienced with validating school-based assessments, this is different to validating assessments against industry standards (ACCI).  Greater engagement of industry in validation of assessments is needed.  Distinction between industry and employer – note that a local employer/supervisor is not representative of industry (AMWU).

10 Consultations Comparability of Standards  Anecdotal evidence suggests that employers do not consider VETiS qualifications to have the same parity of esteem as those delivered elsewhere – hence there is a lack of confidence in school-based assessments (DEEWR, STA) Partnering  Encourage schools to partner with other organisations (eg business enterprises and/or other RTOs such as TAFEs) to help ensure  appropriate access to staff  conditions for assessment are satisfied (eg in terms of access to equipment and facilities) (STA)

11 Today’s workshop  Assessment – an Overview  Developing Assessment Tools  Assessment Quality Management Framework  Changes in Training Packages

12 What is assessment?  A purposeful process of systematically gathering, interpreting, recording and communicating to stakeholders information on student performance.

13 Key Stages  identify and describe the purposes for the assessment  identify the assessment information that can be used as evidence of competence/learning  identify a range of possible methods that might be used to collect assessment information  define the contexts for interpreting assessment information in ways that are meaningful for both assessor and candidate  determine the decision making rules  define procedures for coding and recording assessment information  identify stakeholders in the assessment and define their reporting needs.

14 Assessment Purposes  Evaluative  designed to provide information to evaluate institutions and curriculum/standards – primary purpose is accountability  Diagnostic  Produce information about the candidate’s learning  Formative  Produce evidence concerning how and where improvements in learning and competency acquisition are required  Summative  Used to certify or recognise candidate achievement or potential

15 Assessment Purposes  Assessment for learning occurs when teachers use inferences about student progress to inform their teaching (formative)  Assessment as learning occurs when students reflect on and monitor their progress to inform their future learning goals (formative)  Assessment of learning occurs when teachers use evidence of student learning to make judgements on student achievement against goals and standards (summative)  http://www.education.vic.gov.au/studentlearning/assessment/preptoyear10/default.htm

16 Gathering purposeful evidence  Competency assessment  Places greater importance on the application of knowledge and skills in practical situations  Curriculum based assessment  Tends to place greater emphasis on an individual’s knowledge and understanding This has led to the emergence of a general rule in CBA – evidence should be the most direct and relevant to the performance being assessed

17 Gathering Evidence  Performance based  requires candidate to actively generate or create a response/product that demonstrates their knowledge or skills (portfolio, simulations, role plays, practical demonstrations, workplace observations, open ended questions, peer/self/supervisor ratings, oral presentations etc)  Objective Testing  paper based testing, where the candidate selects a response from a range of alternatives established by the task developers (eg MCQ, T/F)

18 Objective Testing  Perceived Advantages  ease and cost efficiency of scoring  ease of assessing a group of candidates at one time  appearance of objectivity - hence thought to reduce possible assessor bias  standardised administration conditions that can be easily established and maintained  allows multiple ways to assess underpinning knowledge and understanding  ease of determining validity through statistical procedures  ease of determining estimates of reliability

19 Objective Testing Perceived Disadvantages  Lack face validity (Shannon 1991; Wiggins 1991)  Limited to assessing knowledge and understanding (Wiggins 1991)  Require high level skills in item writing, test construction and data analysis (Taylor 1993; Linn, Baker & Dunbar 1991; Messick 1992; Anastasi 1988)

20 Performance tasks  Require candidates to perform a task, generate a response, or create a product to demonstrate their knowledge and skills.  Involve direct observation of the candidate’s behaviour and/or inspection of a product.  Can range from simple constructed responses (e.g., open ended written and/or oral questions) to comprehensive demonstrations of collections of work over time (e.g., portfolio)  Characterised by the need for the assessor/marker to make a judgement.

21 Performance Assessment Perceived Advantages  Greater face validity (due to ‘authentic’ nature of tasks)  Overcomes test wiseness  Greater relevance and direct evidence of competence  Greater flexibility to contextualise tasks  Increased fairness as tasks can be designed to cater for individual needs  Empowerment of candidate in the process  Opportunities for assessment of process, as well as product

22 Judgement error  The halo effect  Assessor Bias and inconsistencies  First impression or primacy error  Spill over effect  Same as me or different from me  Restricted Range  Central tendency

23 Potential Sources of Error in the Assessment  Within candidate  Within the assessment environment  Assessment procedures (in general)  Format of tasks  MCQ  Essay  Observations

24 Activity 1: Errors  In groups, for each source of error, brainstorm as many potential factors as possible that may affect students’ performance.

25 How do we make sense of assessment information?

26 Interpretation Frameworks  Norm referenced  Criterion referenced  Ipsative referenced

27 Criterion Referencing  Means of interpreting student performance by making comparisons directly against pre-established criteria.  Underpins CBT  Training Packages should form the benchmarks for assessment of students in VETiS programs

28 Session 2: Developing Assessment Tools NQC Products Guide for Developing Assessment Tools Assessment Facts Sheets Simulated Assessment Making Assessment Decisions Peer Assessment and Feedback Quality Assuring Assessment Tools Assessor Guide: Validation and Moderation http://www.nqc.tvetaustralia.com.au/nqc_publications

29 Reliability – Validity – Assessment tool – Validation – Moderation Impact: changes to definitions within the NQC publications, the AQTF 2010 User Guide documentation, and the Training Package Development Handbook

30 Essential Characteristics - Assessment Tool An assessment tool includes the following components:  The context and conditions for the assessment  The tasks to be administered to the candidate  An outline of the evidence to be gathered from the candidate  The evidence criteria used to judge the quality of performance (i.e., the assessment decision making rules); as well as  The administration, recording and reporting requirements.

31 Ideal Characteristics  The context  Competency mapping  The information to be provided to the candidate  The evidence to be collected from the candidate  Decision making rules  Range and conditions  Materials/resources required  Assessor intervention  Reasonable adjustments  Validity evidence  Reliability evidence  Recording requirements  Reporting Requirements

32 Competency Mapping  The components of the Unit(s) of Competency covered by the tool should be described. This could be as simple as a mapping exercise between the components within a task (eg each structured interview question) and the components within a Unit or cluster of Units of Competency. The mapping will help determine the sufficiency of the evidence to be collected as well as the content validity.

33 Competency Mapping (detailed)

34 Competency mapping (moderate)

35 Competency Mapping: Steps in the process  Step 1: Unpack the unit of competency to identify its critical components.  Step 2: For each assessment method, list the tasks to be performed by the candidate.  Step 3: For each assessment method, map the critical components of the unit to each assessment task. Refer to NQC Assessor Guide: Validation and Moderation
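
The three steps above can be pictured as building a simple mapping table. The sketch below is purely illustrative: the unit components, task descriptions and method names are hypothetical, and the coverage check is just one way of probing sufficiency and content validity; it is not drawn from the NQC materials.

```python
# Illustrative sketch only: hypothetical unit components and tasks,
# not taken from any actual Training Package.

# Step 1: critical components identified by unpacking the unit of competency
unit_components = {
    "PC1.1": "Follow workplace safety procedures",
    "PC1.2": "Select and use appropriate tools",
    "PC2.1": "Communicate with clients and team members",
    "PC2.2": "Complete workplace documentation",
}

# Steps 2 and 3: tasks per assessment method, each mapped to the components it evidences
mapping = {
    "observation": {
        "Task A: complete a supervised work order": ["PC1.1", "PC1.2"],
    },
    "structured_interview": {
        "Question 1: explain the safety checks performed": ["PC1.1"],
        "Question 2: describe how client requests are handled": ["PC2.1"],
    },
}

# Coverage check: components with no evidence across all methods flag
# gaps in sufficiency and content validity.
covered = {c for tasks in mapping.values() for comps in tasks.values() for c in comps}
uncovered = set(unit_components) - covered
print("Components without evidence:", uncovered or "none")   # -> {'PC2.2'}
```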

36 Level of specificity – Risk Assessment Risk can be determined by consideration of:  Safety (eg potential danger to clients from an incorrect judgement)  Purpose and use of the outcomes (eg selection purposes)  Human capacity (eg level of expertise and experience of the assessors)  Contextual (eg changes in technology, workplace processes, legislation, licensing requirements and/or training packages)

37 Decision Making Rules  The rules to be used to:  Check the quality of the evidence (i.e. the rules of evidence)  Judge how well the candidate performed on the task according to the standard expected  Synthesise evidence from multiple sources to make an overall judgement
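
One way to picture how these three kinds of rules fit together is as a small, explicit procedure. The sketch below is a hypothetical illustration only, assuming a simple competent/not-yet-competent decision with invented evidence sources, component codes and field names; it is not an NQC-prescribed algorithm.

```python
# Hypothetical decision-making rules: all names and structures are illustrative.

def check_rules_of_evidence(item):
    """Rule 1: only accept evidence that is valid, sufficient, current and authentic."""
    return all(item.get(k, False) for k in ("valid", "sufficient", "current", "authentic"))

def overall_judgement(evidence, required_components):
    """Rules 2 and 3: accepted evidence must meet the expected standard, and every
    required component must be evidenced before an overall judgement is made."""
    accepted = [e for e in evidence if check_rules_of_evidence(e)]
    demonstrated = {c for e in accepted if e["meets_standard"] for c in e["components"]}
    return "competent" if required_components <= demonstrated else "not yet competent"

evidence = [
    {"source": "observation", "components": ["PC1.1", "PC1.2"], "meets_standard": True,
     "valid": True, "sufficient": True, "current": True, "authentic": True},
    {"source": "interview", "components": ["PC2.1"], "meets_standard": True,
     "valid": True, "sufficient": True, "current": True, "authentic": True},
]
print(overall_judgement(evidence, {"PC1.1", "PC1.2", "PC2.1"}))   # -> competent
```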

38 Reasonable Adjustments  This section of the assessment tool should describe the guidelines for making reasonable adjustments to the way in which evidence of performance is gathered without altering the expected performance standards (as outlined in the decision making rules).

39 Validity Evidence  Validity is concerned with the extent to which an assessment decision about a candidate, based on the performance by the candidate, is justified. It requires determining conditions that weaken the truthfulness of the decision, exploring alternative explanations for good or poor performance, and feeding them back into the assessment process to reduce errors when making inferences about competence.  Evidence of validity (such as face, construct, predictive, concurrent, consequential and content) should be provided to support the use of the assessment evidence for the defined purpose and target group of the tool.

40 Reliability Evidence  Reliability is concerned with how much error is included in the evidence.  If using a performance based task that requires professional judgement of the assessor, evidence of reliability could include:  The level of agreement between two different assessors who have assessed the same evidence of performance for a particular candidate (i.e., inter-rater reliability).  The level of agreement of the same assessor who has assessed the same evidence of performance of the candidate, but at a different time (i.e., intra-rater reliability).  For objective test items (e.g. multiple choice tests), other forms of reliability should be considered, such as the internal consistency of a test (i.e. internal reliability) as well as the equivalence of two alternative assessment tasks (i.e. parallel forms).
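
The inter-rater reliability point can be made concrete with a small worked example. The sketch below is illustrative only: the ratings are invented, and percent agreement together with Cohen's kappa is just one common way of quantifying agreement between two assessors, not a method prescribed by the NQC materials.

```python
# Invented ratings: two assessors judging the same ten candidate performances
# as competent (C) or not yet competent (N).
assessor_1 = ["C", "C", "N", "C", "C", "N", "C", "C", "N", "C"]
assessor_2 = ["C", "C", "N", "C", "N", "N", "C", "C", "C", "C"]

n = len(assessor_1)

# Simple percent agreement between the two assessors
observed = sum(a == b for a, b in zip(assessor_1, assessor_2)) / n

# Cohen's kappa corrects the observed agreement for agreement expected by chance
categories = set(assessor_1) | set(assessor_2)
expected = sum((assessor_1.count(c) / n) * (assessor_2.count(c) / n) for c in categories)
kappa = (observed - expected) / (1 - expected)

print(f"Percent agreement: {observed:.2f}")   # 0.80
print(f"Cohen's kappa:     {kappa:.2f}")      # 0.52
```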

41 Examples  Write → Portfolio  Say → Interview  Do → Observation  Create → Product Refer to NQC Implementation Guide: Validation and Moderation pages 9-23

42 Activity 2: Self Assessment  In groups of 3, review the assessment tool using the self assessment checklist from the NQC (2009) Implementation Guide (Template A.1, p. 45).  Identify any gaps in the tool.  Discuss the pros and cons of including such additional information within the tool.

43 Tool Review  Has clear, documented evidence of the procedures for collecting, synthesising, judging and recording outcomes (i.e., to help improve the consistency of assessments across assessors [inter-rater reliability]).  Has evidence of content validity (i.e., whether the assessment task(s), as a whole, represent the full range of knowledge and skills specified within the Unit(s) of Competency).  Reflects work-based contexts, specific enterprise language and job-tasks and meets industry requirements (i.e., face validity).  Adheres to the literacy and numeracy requirements of the Unit(s) of Competency (construct validity).  Has been designed to assess a variety of evidence over time and contexts (predictive validity).  Has been designed to minimise the influence of extraneous factors (i.e., factors that are not related to the unit of competency) on candidate performance (construct validity).

44 Tool Review  Has clear decision making rules to ensure consistency of judgements across assessors (inter-rater reliability) as well as consistency of judgements within an assessor (intra-rater reliability).  Has clear instructions on how to synthesise multiple sources of evidence to make an overall judgement of performance (inter-rater reliability).  Has evidence that the principles of fairness and flexibility have been adhered to.  Has been designed to produce sufficient, current and authentic evidence.  Is appropriate in terms of the level of difficulty of the task(s) to be performed in relation to the skills and knowledge specified within the relevant Unit(s) of Competency.  Has outlined appropriate reasonable adjustments that could be made to the gathering of assessment evidence for specific individuals and/or groups.  Has adhered to the relevant organisation’s assessment policy.

45 Quality Checks  Panel  Pilot  Trial

46 Activity 3: Engaging Industry  In your groups, discuss what input employers (you might wish to specify a vocational area) could provide to develop valid assessment tools and processes.  For the following scenarios, note down 2–3 questions you could ask employers and how the responses will inform the development or review of assessment tools and/or processes. Relevant NQC support materials: Industry Enterprise & RTO Partnership Assessment Fact Sheets: Assessor Partnerships Assessor Guide: Validation and Moderation

47 Session 3: Assessment Quality Management NQC Products Code of Professional Practice: Validation & Moderation Implementation Guide: Validation and Moderation Assessment Facts Sheets Quality Assuring Assessment Tools Systematic Validation Assessor Guide: Validation and Moderation http://www.nqc.tvetaustralia.com.au/nqc_publications

48 Session 3: Assessment Quality Management Quality Assurance Quality Control Quality Review Refer to Handout

49 Assessment Quality Management

50 Validation  Validation is a quality review process. It involves checking that the assessment tool produced valid, reliable, sufficient, current and authentic evidence to enable reasonable judgements to be made as to whether the requirements of the relevant aspects of the Training Package or accredited course had been met. It includes reviewing and making recommendations for future improvements to the assessment tool, process and/or outcomes. NQC Implementation Guide: Validation and Moderation 2009

51 Outcomes of validation Recommendations for future improvements  Context and conditions for the assessment  Task/s to be administered to the candidates  Administration instructions  Criteria used for judging the quality of performance (e.g. the decision making rules, evidence requirements etc)  Guidelines for making reasonable adjustments to the way in which the evidence of performance was gathered to ensure that the expected standard of performance specified within the Unit(s) of Competency has not been altered  Recording and reporting requirements.

52 Moderation  Moderation is the process of bringing assessment judgements and standards into alignment. It is a process that ensures the same standards are applied to all assessment results within the same Unit(s) of Competency. It is an active process in the sense that adjustments to assessor judgements are made to overcome differences in the difficulty of the tool and/or the severity of judgements. NQC Implementation Guide: Validation and Moderation 2009

53 Outcomes of moderation  Recommendations for future improvement and adjustments to assessor judgements (if required)  Recommendations for improvement to the assessment tools  Adjusting the results of a specific cohort of candidates prior to the finalisation of results  Requesting copies of final candidate assessment results in accordance with recommended actions.

54 Validation Versus Moderation

55 Types of Approaches – Assessor Partnerships  Validation only  Informal, self-managed, collegial  Small group of assessors  May involve:  Sharing, discussing and/or reviewing one another’s tools and/or judgements  Benefit  Low costs, personally empowering, non-threatening  Weakness  Potential to reinforce misconceptions and mistakes

56 Types of Approaches - Consensus  Typically involves assessors reviewing their own and colleagues’ assessment tools and judgements as a group  Can occur within and/or across organisations  Strength  Professional development, networking, promotes collegiality and sharing  Weakness  Less quality control than external and statistical approaches as they can also be influenced by local values and expectations  Requires a culture of sharing

57 Types of Approaches - External  Types  Site Visit Versus  Central Agency  Strengths  Offer authoritative interpretations of standards  Improve consistency of standards across locations by identifying local bias and/or misconceptions (if any)  Educative  Weakness  Expensive  Less control than statistical

58 External Representation Vs External Validation  Having an external member on the consensus validation panel is not an example of an ‘external validation’ approach  External validation requires coordination by an external body to review and monitor the assessment processes and outcomes at the RTO.  The external body has overarching authority to make recommendations for changes to the tool for future use and is responsible for monitoring whether such changes have been implemented.  It is not a model commonly used in the Australian VET system in which assessments are conducted for credentialing purposes.

59 Types of Approaches - Statistical  Limited to moderation  Yet to be pursued at the national level in VET  Requires some form of common assessment task at the national level  Adjusts the level and spread of RTO-based assessments to match the level and spread of the same candidates’ scores on a common assessment task  Maintains RTO-based rank ordering but brings the distribution of scores across groups of candidates into alignment  Strength  Strongest form of quality control  Weakness  Lacks face validity, may have limited content validity
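
The adjustment of level and spread described above is essentially a linear rescaling of scores. The sketch below is a minimal illustration with invented scores, assuming a simple mean-and-standard-deviation alignment; actual statistical moderation schemes are more elaborate, but the effect on rank order and distribution is the same.

```python
import statistics

# Invented scores for one RTO's candidate group: internal (RTO-based) results and
# the same candidates' scores on a common assessment task.
rto_scores    = [62, 70, 75, 81, 88]   # rank order to be preserved
common_scores = [55, 60, 64, 69, 77]   # common assessment task, same group

rto_mean, rto_sd = statistics.mean(rto_scores), statistics.pstdev(rto_scores)
com_mean, com_sd = statistics.mean(common_scores), statistics.pstdev(common_scores)

# Shift and rescale the RTO distribution so its level (mean) and spread (sd) match
# the group's performance on the common task; the RTO-based rank order is unchanged.
moderated = [com_mean + (x - rto_mean) * com_sd / rto_sd for x in rto_scores]

print([round(m, 1) for m in moderated])
```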

60 Activity 3: Assessment Quality Management

61 Session 4: Streamlining Training Packages

62 VET Products for the 21st Century  COAG policy direction  “Reforming training products… to meet a more client driven system”  VET Products for the 21st Century - June 2009  Joint NQC/COAG report  Themes  Flexibility  Streamlining  Competency and knowledge  Foundation skills

63 Recommendations 16 and 17: Streamlining Training Packages  Separate performance standards from guidance and supporting information  Restructure and streamline Training Package content  Simplify endorsed components by reducing specification and detail; put this into companion volumes  Eliminate unnecessary info and consolidate repetitive info  Divide info into components

64 Key characteristics: simplified, shortened, segmented

65 New design model Training Package

66 New design model Companion Volume

67 Associate Professor Shelley Gillis Deputy Director Work-based Education Research Centre Victoria University Email: shelley.gillis@vu.edu.au Phone: 0432 756 638 Andrea Bateman Director Education Consultant Bateman Giles Pty Ltd Email: andrea@batemangiles.com.au Phone: 0418 585 754 Chloe Dyson Director VET Consultant CDA Consulting Email: chloed@alphalink.com.au Phone: 0408 124 825

