Peter Lenz, IBE Seminar, Warsaw, 20/10/2011: A Language Assessment Kit – Relating to the CEFR – for French and English


1 Peter Lenz, IBE Seminar, Warsaw, 20/10/2011: A Language Assessment Kit – Relating to the CEFR – for French and English

2 Overview of the presentation
1. Context
2. Development
3. Product / Use
4. Looking back and forward / some thoughts


4 2001 – European Year of Languages: Launch of CEFR & ELP 15+ in CH
In 2001 the Swiss Conference of Cantonal Ministers of Education recommended that the cantons
- consider the CEFR
  - in curricula (objectives and levels)
  - in the recognition of diplomas
- facilitate wide use of the ELP 15+
  - make the ELP accessible to learners
  - help teachers to integrate the ELP in their teaching
- develop ELPs for younger learners

5 Common European Framework of Reference (CEFR)
A common reference for many foreign-language professionals:
- Course providers
- Curriculum/syllabus developers
- Materials authors
- Teacher trainers
- Examination providers, etc.
A basis for the description of
- Objectives
- Contents
- Methods
The CEFR isn't prescriptive but asks the right questions and favors certain answers.

6 An action-oriented approach and Reference levels
The CEFR favors an action-oriented approach (language use in context); main objectives relate to communicative language proficiency.
The CEFR describes 6 reference levels, A1 through C2:
A1, A2 (Basic User) – B1, B2 (Independent User) – C1, C2 (Proficient User)
Means of description:
- Descriptors of communicative language activities
- Descriptors of "competences" (or "language resources", or qualitative aspects of language use)

7 Core elements of CEFR & ELP: scaled descriptors
Proficiency or can-do descriptors, e.g.:
"I can deal with most situations likely to arise whilst travelling in an area where the language is spoken. I can enter unprepared into conversation on topics that are familiar, of personal interest or pertinent to everyday life (e.g. family, hobbies, work, travel and current events)."

8 Core elements of CEFR: scaled descriptors
Descriptors of competences or qualitative aspects, e.g.:
"Consistently maintains a high degree of grammatical accuracy; errors are rare, difficult to spot and generally corrected when they do occur."

9 The Concept of Illustrative Descriptors
Illustrative descriptors may be considered as spotlights illuminating small areas of competence/proficiency while other areas remain in the dark. Descriptors outline and illustrate competence/proficiency levels but never define them exhaustively.
[Diagram: descriptors D1, D2, D3, D4 ... D17 spread across the skills Listening, Reading, Spoken Interaction, Spoken Production and Writing; example descriptor: "Can briefly give reasons and explanations for opinions, plans and actions."]

10 European Language Portfolios
For the hands of the learners: 3 parts – 2 main functions
Parts: Language Passport, Language Biography, Dossier
Functions: documentation; facilitation of learning

11 From the ELP 15+ to an ELP for learners aged 11 to 15?
Teachers' wish list:
- More descriptors tailored to young learners' needs
- Less abstract formulations
- Self-assessment grid and checklists with finer levels
- Tools facilitating "hard" assessment – beyond an ELP's reach:
  - Test tasks relating to descriptors
  - Marked and assessed learner texts
  - Assessed spoken learner performances on video
  - Assessment criteria for Speaking (and Writing) relating to finer levels

12 The initiators
German-speaking cantons of Switzerland
Principality of Liechtenstein (FL)

13 The authorities' rationale
Promotion of the quality and effectiveness of school-based foreign-language teaching and learning by improving the quality, coherence and transparency of assessment:
- CEFR as a basis; further elaboration of the Reference levels
- Assessment and self-assessment instruments building upon descriptors
- Teacher-training material and early involvement of teachers to prepare dissemination and introduction of the instruments in the school context

14 Overview of the presentation
1. Context
2. Development
3. Product / Use
4. Looking back and forward / some thoughts

15 Overview of expected products
- Bank of target-group-specific descriptors (levels A1.1-B2.1)
- Bank of validated test tasks (5 "skills"; C-tests)
- Benchmark performances (Speaking, Writing)
- Assessment criteria (Speaking, Writing)
- (Self-)assessment grid & checklists for the ELP 11-15
- Ready-made "diagnostic" test sets

16 Developing a Descriptor Bank
Bank of target-group-specific descriptors (levels A1.1-B2.1)

17 Reduced but subdivided range of levels

18 How were the new can-do descriptors developed?
1) Collect descriptors from written sources (ELPs, textbooks, other sources)
2) Validate and complement the collection in teacher workshops
   - Teachers decide on relevance for the target learners and on suitability for assessment
   - Teachers complement the collection
3) Fine-tune and select descriptors
   - Make formulations unambiguous and accessible; add examples
   - Select descriptors to cover the whole range of levels A1.1-B2.1
   - Represent a wide range of skills and tasks
   - ~330 descriptors for the empirical phase
Development of the descriptors

19 Data collection – Teachers assess their pupils
Following Schneider & North's methodology for the CEFR

20 Scaling: linked and anchored assessment questionnaires of 50 descriptors each, for different levels
- 2 parallel sets of descriptors of similar difficulty per assumed level
- Identical descriptors as links (& sometimes CEFR anchors)
- Too few learners at B2
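The idea behind the linked design can be sketched in a few lines: when two questionnaire forms share identical link descriptors, the difficulty estimates from separately analysed forms can be placed on one common scale by shifting one form by the mean difference observed on the shared items. The sketch below is an illustrative simplification with invented descriptor names and logit values; the actual analysis used full Rasch scaling, not this mean-shift shortcut.

```python
# Minimal sketch of common-item linking between two questionnaire forms.
# Descriptor names ("D1", "L1", ...) and logit values are illustrative.

def link_forms(difficulties_a, difficulties_b, link_items):
    """Shift form B's difficulty estimates onto form A's scale using the
    mean difference on the shared (link) descriptors."""
    shift = sum(difficulties_a[d] - difficulties_b[d] for d in link_items) / len(link_items)
    return {d: b + shift for d, b in difficulties_b.items()}

# Two forms calibrated separately; descriptors L1 and L2 appear in both.
form_a = {"D1": -1.0, "D2": 0.0, "L1": 0.5, "L2": 1.0}
form_b = {"L1": -0.3, "L2": 0.2, "D3": 0.8, "D4": 1.5}

linked_b = link_forms(form_a, form_b, ["L1", "L2"])
print(linked_b)  # D3 and D4 are now expressed on form A's scale
```

With more than two forms, the same shift logic is applied along the chain of overlapping questionnaires, and CEFR anchor descriptors tie the whole chain to the external reference scale.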

21 Statistical analysis and scale-building (A1.1 - B1.2)

22 Self-assessment Grid and Checklists
- Bank of target-group-specific descriptors (levels A1.1-B2.1)
- (Self-)assessment grid & checklists
- ELP 11-15

23 Reformulations: I can...
1) Some Can-do's are transformed into I-can's
   - Classes use descriptors for self-assessment and give feedback: can learners understand them?
2) The whole bank of Can-do's is transformed into I-can statements

24 Self-assessment tools for the ELP

25 Overview of products
- Bank of target-group-specific descriptors (levels A1.1-B2.1)
- Bank of validated test tasks (5 "skills"; C-tests)
- (Self-)assessment grid & checklists for the ELP 11-15

26 Test Tasks
1) Test tasks relating to communicative language proficiency
   - Speaking tasks (production and interaction)
   - Writing tasks
   - Listening tasks
   - Reading tasks
   - Test tasks correspond to (or operationalize) one or more descriptors.
   - All test tasks were field-tested and attributed to CEFR levels using pupils' self-assessment or teacher assessment (common-person equating).
2) C-Tests (integrative tests)
   - C-Tests are a special type of cloze test.
   - C-Tests are said to provide reliable information on a learner's linguistic resources.
   - C-Tests are quick.
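Common-person equating, as mentioned above, rests on learners who are measured on both instruments: their ability estimates on the already-calibrated reference scale and on the new field-test tasks determine the shift needed to bring the new tasks onto the reference scale. The sketch below is a deliberately simplified mean-shift illustration with invented pupil and task values, not the project's actual Rasch analysis.

```python
# Sketch of common-person equating: place new test tasks on an existing
# scale via learners measured on both instruments. All values illustrative.

def equate_by_common_persons(abilities_ref, abilities_new, task_difficulties_new):
    """Shift new-instrument estimates onto the reference scale using the
    mean ability difference of the persons who took both instruments."""
    common = abilities_ref.keys() & abilities_new.keys()
    shift = sum(abilities_ref[p] - abilities_new[p] for p in common) / len(common)
    return {t: d + shift for t, d in task_difficulties_new.items()}

# Abilities on the calibrated (self-assessment) scale vs. the new
# field-test scale, for the same pupils; then two hypothetical tasks.
ref = {"p1": -0.5, "p2": 0.4, "p3": 1.1}
new = {"p1": -1.0, "p2": -0.1, "p3": 0.6}
tasks = {"listening_t1": -0.8, "reading_t2": 0.3}

print(equate_by_common_persons(ref, new, tasks))
```

Once the task difficulties sit on the reference scale, each task can be attributed to the CEFR level band its difficulty falls into.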

29 Criteria and Benchmark Performances
- Bank of validated test tasks (5 "skills"; C-tests)
- Benchmark performances (Speaking, Writing)
- Bank of target-group-specific descriptors (levels A1.1-B2.1)
- Assessment criteria (Speaking, Writing)
- (Self-)assessment grid & checklists for the ELP 11-15

30 CEFR Table 3 – the point of departure
Descriptors of qualitative aspects of performance, e.g.:
"Consistently maintains a high degree of grammatical accuracy; errors are rare, difficult to spot and generally corrected when they do occur."

31 Assessment criteria for Speaking
Where did the new qualitative criteria come from? – Steps taken:
1) Collect criteria
   - Collect criteria from various sources: CEFR, examination schemes...
2) Generate & select criteria: teachers assess spoken performances
   - Teachers bring video recordings
   - Teachers describe differences between learner performances they can watch on video – criteria emerge
   - Teachers select and apply descriptors from the existing collection
   - Teachers agree on essential categories (e.g. Vocabulary Range, Pronunciation/Intonation) and agree on a scale for each analytical category
3) Prepare empirical validation (experts)
   - Decide on the categories of criteria to be retained
   - Revise and complete the proposed scales of analytical criteria
   - ... and produce performances to apply the criteria to

32 Phase IV: Producing video recordings of spoken performances
- One learner – different tasks in various settings
- 10 learners of English, 11 learners of French

33 Validation of criteria for Speaking – Methodology
A total of 35 teachers (14 Fr, 21 En) apply
- 58 analytical criteria (some from the CEFR) belonging to 5 categories
- 28 task-based can-do descriptors (matching the tasks performed)
to 10 or 11 video-taped learners per language, each performing 3-4 spoken tasks.
Analytical criteria categories: Interaction; Vocabulary range; Grammar; Fluency; Pronunciation & Intonation

34 Scaling the criteria for Speaking
Criteria and questionnaires – a linked and anchored design:
- Three assessment questionnaires for three different learner levels
- Links between questionnaires; CEFR anchors
Rating categories:
- "Statement applies to this pupil but s/he can do clearly better"
- "Statement generally applies to this pupil"
- "Statement doesn't apply to this pupil"

35 Criteria for Speaking – analysis: teacher severity and consistency
- Consistency: 5 out of 35 raters were removed from the analysis due to misfit (infit mean square up to 2.39).
- Severity: Some extreme raters (severe or lenient) show a strong need for rater training, although every criterion makes a meaningful (but somewhat abstract) statement on mostly observable aspects of competence.
[Map for English]
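The infit mean square used here to flag inconsistent raters is an information-weighted ratio: the sum of squared residuals (observed minus model-expected ratings) divided by the sum of model variances. The sketch below uses a dichotomous Rasch model with invented ability and difficulty values purely for illustration; the actual analysis of severity and consistency would be done with a many-facet Rasch model (e.g. in FACETS), not this simplification.

```python
import math

# Sketch of the infit mean-square fit statistic used to flag misfitting
# raters. All theta (ability) and b (difficulty) values are illustrative.

def rasch_p(theta, b):
    """Probability of a positive rating under a dichotomous Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def infit_mnsq(observations):
    """Information-weighted mean square: sum of squared residuals divided
    by the sum of model variances. Values near 1 indicate good fit;
    values well above 1 suggest inconsistent (misfitting) rating."""
    num = den = 0.0
    for x, theta, b in observations:
        p = rasch_p(theta, b)
        num += (x - p) ** 2
        den += p * (1.0 - p)
    return num / den

# One rater's ratings: (observed 0/1, pupil ability, criterion difficulty).
# The last rating contradicts the model (able pupil, easy criterion, yet 0),
# which pushes the infit above 1.
ratings = [(1, 1.2, 0.0), (0, -0.8, 0.3), (1, 0.5, 0.5), (0, 1.5, -0.5)]
print(round(infit_mnsq(ratings), 2))
```

A rater whose infit stays near 1 rates in line with the model's expectations; the five raters removed from the analysis would show values well above that.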

36 Criteria for Speaking – outcomes
Statistical analysis indicates
- that we have good-quality criteria
- which may be used to assess learners from A1.1 to B2.
Statistical analysis also indicates
- which of the video-taped learners are the least or most able
- which raters (teachers) were severe or lenient
- which raters rated consistently or inconsistently
– useful findings for teacher training on the basis of these videos.
The assessment criteria for written performances were developed using a very similar methodology.

37 Ready-made sets of test tasks
- Bank of validated test tasks (5 "skills"; C-tests)
- Benchmark performances (Speaking, Writing)
- Bank of target-group-specific descriptors (levels A1.1-B2.1)
- Ready-made "diagnostic" test sets
- Assessment criteria (Speaking, Writing)
- (Self-)assessment grid & checklists for the ELP 11-15

38 Ready-made sets of test tasks
- Ready-made, class-specific bundles of test tasks for Listening, Reading, Speaking and Writing
- Information and advice for teachers regarding preparation, use and scoring/score interpretation

39 Overview of the presentation
1. Context
2. Development
3. Product / Use
4. Looking back and forward / some thoughts

40 The Kit: Ring-binder and Database – limited, non-personal licence

41 Elements: Overview

42 Elements: Descriptors

43 Elements: Test tasks
- Test tasks building upon descriptors
- C-Tests

44 Elements: Benchmark performances

45 Example: Listening tasks

46 Example: Listening task

47 Instructions in German, the local L1

48 Example: Listening task
- Interpretation of scores in relation to CEFR levels
- Answer key

49 Example: Spoken interaction task For use by teachers and also by learners

50 Example: Spoken interaction task For learner A

51 Example: Spoken interaction task For learner B

52 Example: Spoken interaction task For learner B Instructions for learner B

53 Example: Assessment of Spoken interaction – profile and levels
- Type 1 descriptors: quality of language use
- Type 2 descriptors: can-do descriptors
- Resulting profile

54 Example: C-test
- C-test texts are constructed according to a set of rules.
- A C-test consists of 4 or 5 texts, each containing a fixed number of blanks.
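The slide does not spell the construction rules out; a common variant (not necessarily the exact one used in the Kit) leaves the first sentence intact and then deletes the second half of every second word, skipping one-letter words, until the target number of blanks is reached. A minimal sketch under that assumption:

```python
import re

# Sketch of a classic C-test construction rule (one common variant; the
# project's exact rules may differ): first sentence intact, then the
# second half of every second word is replaced by underscores.

def make_ctest(text, n_blanks=20):
    sentences = re.split(r'(?<=[.!?]) ', text)
    out = [sentences[0]]          # first sentence stays intact
    blanks = 0
    word_count = 0
    for sent in sentences[1:]:
        words = []
        for token in sent.split():
            core = re.sub(r'\W', '', token)   # strip punctuation
            if len(core) > 1 and blanks < n_blanks:
                word_count += 1
                if word_count % 2 == 0:       # every second word
                    keep = len(core) - len(core) // 2  # keep first half
                    token = token.replace(core, core[:keep] + "_" * (len(core) - keep))
                    blanks += 1
            words.append(token)
        out.append(" ".join(words))
    return " ".join(out)

sample = "The weather was fine. We decided to walk to the lake and have a picnic."
print(make_ctest(sample))
```

For the sample text this produces items such as "deci___" and "wa__"; the test-taker reconstructs each mutilated word from context, which is why C-tests are taken to tap general linguistic resources.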

55 Applications
What can the instruments be used for? Among other things:
- Illustrate expected language proficiency and competences (e.g. for pupils and parents)
- Help develop a sense of the (adapted) CEFR reference levels
- Develop self-assessment and planning skills
- Assemble level-related (proficiency) tests (or use ready-made sets)
- Establish learners' proficiency profiles (self-assessment; tests)
- Check learners' readiness for external examinations
- Diagnose strengths and weaknesses with regard to different skills and competences in order to focus on individual goals for a term
- ...

56 Online: use the live demo

57 online

58 Overview of the presentation
1. Context
2. Development
3. Product / Use
4. Looking back and forward / some thoughts

59 If I could start again... Some food for thought and discussion
What reference framework would I use? How close should it be to classroom teaching and learning?
- far: CEFR/theory-related?
- intermediate: curriculum- or syllabus-related?
- close: textbook-related?

60 If I could start again... Some food for thought and discussion
What objectives would I focus on?
- language proficiency (can do)?
- linguistic resources (vocabulary, grammar, phonology...)?
- ability to communicate across the language program / the curriculum as a whole?
- language awareness?
- (inter-)cultural skills and knowledge?
- ...

61 If I could start again... Some food for thought and discussion
What purposes would I try to meet?
- summative assessment? – including certification?
- formative assessment?
- diagnostic assessment?
  - how fine-grained?
  - would explicit feedback be provided? If yes – to whom?
  - would repeated assessments lead to an individual roadmap or profile of learning progression?
- ...

62 If I could start again... Some food for thought and discussion
What roles would computers and the Internet play?
- Would pupils work online? What contributions could teachers make?
- Would assessment results be fed back into the system? If yes – by the teachers?
- Would the system provide diagnostics, profiling and feedback?
If you want to improve a product or monitor its quality, you need data. Answers entered online are a unique (and cheap) data source.

63 If I could start again... Some food for thought and discussion
What would I try to improve with regard to craftsmanship and technical quality?
- What role should the L1 play in task construction?
- What effort needs to be made to have more validity evidence and a better understanding of the assessment instruments?
  - a principled assessment design program?
  - combine assessment delivery and assessment research?
- ...

64 Thank you for your interest … and your patience!


