CREOLE 21 October, 2009 Using English in final examinations: Rationales and realities SPINE team.

2 What is SPINE? Student Performance in National Examinations: the dynamics of language in school achievement (SPINE) (ESRC/DfID RES-167-25-0263). Bristol team: Rea-Dickins, Yu, Afitska, Sutherland, Olivero, Erduran, Ingram, Goldstein. Zanzibar team: Z. Khamis, Mohammed, A. Khamis, Abeid, Said, Mwevura. LTA is high stakes: more than 50% of school-aged children leave school at the end of Basic Education as unsuccessful.

3 Case 1: context. Khadija is 15 years old and in Form 2 of secondary school. Her learning in primary school was entirely through the medium of Kiswahili, with English taught as a subject. She experienced an abrupt shift away from Kiswahili as the medium of instruction on transition to secondary school: in Form 1 she was expected to learn all her subjects through English. In reality, however, both her L1 (Kiswahili) and L2 (English) are used when she is taught Maths, Science, Geography and her other subjects. Yet at the end of Form 2 she is required to demonstrate her subject learning in formal school examinations entirely through the medium of English.

4 [SPINE diagram]

5 The Policy & the Politics: historical perspectives. Continuous assessment (CA) was introduced in Tanzanian schools in 1976, after the Musoma Resolution (NECTA Guidelines 1990:1). In the Musoma Resolution, Nyerere (1974) (see Lema et al. 2006:111-112) criticised the examination system that had existed since independence and argued for the need to change it. He maintained that the assessment procedures overvalued competitive, written, theoretically and intellectually focused examinations.

6 The Policy & the Politics: historical perspectives. He saw the need to make assessment procedures more friendly, less competitive and more practically focused. Accordingly, he directed that students' assessment must:
- include work that enables students to function effectively in their environments;
- cover students' performance in both theoretical and practical work;
- be carried out throughout the year.

7 The Policy & the Politics: historical perspectives. The National Examinations Council of Tanzania (NECTA) started to include projects, character assessment, exercises and tests in the assessment system. Teachers were then given responsibility to:
- measure students' ability to use their acquired knowledge and skills in their local environments;
- keep records of their progress continuously.

8 Tanzania and Zanzibar education policies. Both the Tanzania Education and Training Policy (1990:79) and the Zanzibar Education Policy (2006:28) stated that continuous assessment (based on teacher assessment), combined with the final exam, would be the basis for the awarding of certificates at secondary education levels. The Zanzibar Education Master Plan of 1996-2006 emphasised consistency between teacher assessment and national examinations for selection purposes and quality control (p.43).

9 Language policy and assessment. Students' assessment is affected by language policy because students have to demonstrate their knowledge and abilities through a language ratified by the system. The Zanzibar Education Policy of 1995 (Kiswahili version, Section 5.72, p.28), the Zanzibar Education Master Plan of 1996-2006 (p.43) and the Education Policy of 2006 (p.38) all emphasised the use of English as the medium of instruction for all subjects in secondary education except Kiswahili and Islamic Studies.

10 Language policy and assessment. As a corollary, English, as the language of instruction for science at secondary level, has automatically become the language of assessment, for both continuous assessment and examinations. This implies that students' opportunities to demonstrate their knowledge in examinations are limited if they are not proficient in the language.

11 Coursework assessment as part of national examining: implementation issues. At Form II level, Teacher Assessment (TA) was implemented based on the following assessment procedures:
- class work
- oral questions
- homework
- weekly tests
- terminal tests

12 Continuous Assessment format & guidelines

Term: ---------  Class: ---------  Subject: ---------

Pupil's name | Month 1 (10%) | Month 2 (10%) | Month 3 (10%) | Month 4 (10%) | Total c/work (40%) | Exam (60%) | Grand total (100%)
X | 5 | 6 | 3 | 8 | 22 | 30 | 52

13 Continuous Assessment format & guidelines. Note: coursework assessment will be obtained from class work, homework and weekly tests, or as otherwise directed by the Department or the Ministry. All the marks scored by the pupil in the mentioned activities within a particular month will be combined and converted into a percentage out of ten (10%) before being entered in the relevant space in this form. Source: Ministry of Education Working Guidelines: Guideline No. 9 (1995).

14 Continuous Assessment format & guidelines. Additional notes from another version: if the term is shorter or longer than four (4) months, four activities chosen by the teacher should be picked at equal intervals. The teacher is advised to give as many tests as he or she can in a term, but should choose only four for recording. In computing pupils' marks for the terminal exam in each subject, the exam itself should contribute 60% and classroom assessments (coursework assessments) should contribute 40%.
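Read literally, the 40/60 weighting reduces to simple addition once each monthly mark has been scaled out of 10. A minimal sketch (the function name is ours, not from the guidelines) that reproduces pupil X's row from the sample form:

```python
def grand_total(monthly_marks, exam_mark):
    """Combine coursework and exam marks per the 1995 guidelines:
    four monthly marks, each already scaled to a maximum of 10,
    give a coursework total out of 40; the terminal exam is out of 60,
    so the grand total is out of 100."""
    assert len(monthly_marks) == 4, "the form records four monthly marks"
    coursework = sum(monthly_marks)            # out of 40
    return coursework, coursework + exam_mark  # (coursework, grand total)

# Pupil X from the sample form: monthly marks 5, 6, 3, 8; exam 30
print(grand_total([5, 6, 3, 8], 30))  # (22, 52)
```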

15 Coursework assessment as part of national examining: implementation issues. The language to be used is as indicated in the lesson plan. Source: Ministry of Education Working Guidelines: Guideline No. 9 (1995). These notes appear to contradict the first version, which may lead to confusion: one version requires the teacher to average the marks within a month, while the other requires the teacher to choose a single activity from which the marks are recorded.

16 Coursework assessment as part of national examining: implementation issues. Homework, class work, oral questions and weekly tests make up 40%, and the terminal exam makes up 60%, which combined make 100%. The combined percentage is sent to the Department of Curriculum and Examination for further processing. There was variability among teachers on:
- the specific activities that constitute teacher assessment marks

17 Coursework assessment as part of national examining: implementation issues (continued):
- the number of activities marks should be taken from each month
- decisions on absent students
- how to derive the 10% for each month: some picked the activity on which students did better, some picked randomly, and some averaged
- understanding of what assessment is to be done
- awareness of the assessment guidelines

18 Coursework assessment as part of national examining: implementation issues (continued):
- the percentage of marks that goes to the final examination
- sharing among teachers within schools of what they know and practise about the assessment guidelines
- the level of monitoring of Teacher Assessment by head teachers
As a result, students from the same school, and from different schools, are assessed in different ways.

19 Overall, teachers were observed using L1 during lessons in 49% of cases. Specifically, they did so some of the time in 20% of cases, most of the time in 11% of cases, and rarely in 18% of cases.

20 Teachers presented concepts clearly most of the time (28% of cases) or some of the time (62% of cases). Teachers' presentation of concepts was unclear (6%) or confusing (3%) in 9% of cases.

21 In 42% of cases observed, teachers rarely probed pupils' comprehension, and in 24% they did not probe it at all. Teachers frequently probed pupils' comprehension in 8% of cases and sometimes in 24%.

22 Teachers provided feedback to learners using a wide range of strategies in only 1% of cases, with hardly any or no feedback in 19% of cases. In 78% of cases, teachers were observed using either some range (24%) or a limited range (54%) of feedback strategies.

23 In 83% of cases, learners rarely (28%) or almost never/never (55%) provided extended responses; 16% of pupils provided extended responses some of the time; only 2% of learners were observed providing extended responses most of the time.

24 Findings

25 Findings

26 Student performance on exam items. English reading comprehension question: how whales resemble man. 45 students took this item: no answer = 35.6%; wrong answer = 26.7%; partially correct answer = 28.9%; correct answer = 8.8%.

27 Interview: D1, who didn't answer Q3, explains
D1: because I did not understand by this this … resemble (lines 115-117)
Int: If I tell you that resemble means to look like … can you do the question now?
D1: Yes
Int: OK so what's the answer?
D1: Man … is warm blooded … and whales also … whales have lungs and man also have lungs … (lines 122-133)

28

29 Maths: original & modified question

30 Linguistic accommodations make a difference. Modifications to the item: i.e. changing the word "below" to "under"/"younger than". In the original, "below 14 years" (referring to where the age appears in the item) was interpreted in 3 different ways:
- as including the 14-year-olds (3+2+5+4+2=16)
- as the cells to the left of the cell containing 14 (ages 10, 11, 12, 13)
- as the cell below the cell containing 14 (which says 2)

31 Biology: responses to original item. No answer = 67.4%; wrong answer = 21.7%; partially correct answer = 6.5%; correct answer = 4.4%.

32 Biology: on locusts

33 Biology: on locusts

34 Biology: on locusts. Questions: a) In which picture do you think the locust will/may die? b) Why do you think it will/may die?

35 Original item modified:
- greater contextualisation
- simplification of the instructions
- 2 structured parts: A & B
- visual clues to support information retrieval
- rephrasing of the item
- altering the item layout

36 Results
Original item:
- only 32.6% of students wrote an answer
- just under 11% gave a partially correct or correct answer
Modified item:
- 100% responded to this item
- 42% gave a partially correct answer to Part A
- 53% gave a partially correct answer to Part B

37 Nzige (locust): changes in student response

Pupil | Original score | Written response on modified item | Modified score | Comment
H3 | 0 | "In picture A the locust may/will die. I think it will/may dies because the locust get its breathing by using its body" | 2 + 2 | Understands that locusts breathe using the body
C2 | 1 | "I think it will/may die because the boy is dipping the locust in the water to all the bodies with its trachea that used to respiration as a respiratory surface of a locust" | 2 + 2 | Getting very close to a very complex answer. I think a very able pupil indeed. The right answer, but with great difficulty in expressing this in English

38 Nzige (locust): changes in student response

Pupil | Original score | Written response on modified item | Modified score | Comment (Neil Ingram, biologist)
G2 | 0 | "I think its because of its body covered (immersed) completely in the water, and its terrestrial not an aquatic. It can" | 2 + 1 | Test taker locates a locust as not naturally aquatic and therefore unlikely to survive if immersed in water. Evidence of a learner using everyday (informal) knowledge beyond the curriculum to answer the question, but this level of insight would not be recognised in the original scoring protocol.

39 Means of Form II exam results (MoEVT)

Year | KISW | Islamic studies | BIO | CHEM | MATH | PHYS
2004 | 49 | 37.1 | 18.4 | 30.8 | 21.1 | 25.8
2005 | 46 | 44.3 | 22.6 | 34.2 | 15.3 | 25.4
2006 | 36 | 47.9 | 21.5 | 34.3 | 15.5 | 25.1
2007 | 35.5 | 44.5 | 21.5 | 32.0 | 14.3 | 23.9
2008 | 50.6 | 47.5 | 24.5 | 34.8 | 16.4 | 29.1

40 ENG and MATH
Cons (intercept-only) model: total = 90.4, school = 30.75%, pupil = 69.25%
Model with ENGLISH: total = 64.717, school = 29.18%, pupil = 70.82%
ENGLISH explains (90.4 - 64.717)/90.4 = 28.41% of the maths total variance

41 ENG and BIO
Cons model: total = 114.968, school = 18.79%, pupil = 81.21%
Model with ENGLISH: total = 65.646, school = 18.17%, pupil = 81.83%
ENGLISH alone explains (114.968 - 65.646)/114.968 = 42.90% of the total variance in BIOLOGY

42 ENG and CHEM
Cons model: total = 275.993, school = 22.05%, pupil = 77.95%
Model with ENGLISH: total = 158.281, school = 23.06%, pupil = 76.94%
ENGLISH explains (275.993 - 158.281)/275.993 = 42.65% of the total CHEM variance

43 How about KISWAHILI & ARABIC? Although the other two languages (Kiswahili and Arabic) are also significant predictors of students' performance in maths, biology and chemistry, they explain less of the variance than ENGLISH. KISWAHILI explains (275.993 - 188.954)/275.993 = 31.54% of the CHEM total variance, (114.968 - 76.585)/114.968 = 33.39% of the BIO total variance, and (90.4 - 75.678)/90.4 = 16.29% of the MATH total variance. ARABIC explains (275.993 - 203.452)/275.993 = 26.28% of the CHEM total variance, (114.968 - 88.569)/114.968 = 22.96% of the BIO total variance, and (90.4 - 71.885)/90.4 = 20.48% of the maths total variance.
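All the variance-explained figures on these slides follow the same proportional-reduction calculation: the drop in total variance between the cons (intercept-only) model and the model with the language score as sole predictor, divided by the cons-model total. A minimal sketch (the function name is ours) reproducing the reported values for ENGLISH:

```python
def variance_explained(total_cons, total_with_predictor):
    """Proportional reduction in total (school + pupil) variance when a
    predictor is added to the intercept-only ("cons") multilevel model."""
    return (total_cons - total_with_predictor) / total_cons

# Reported totals: cons model vs model with ENGLISH as the sole predictor
print(round(100 * variance_explained(90.4, 64.717), 2))      # MATH: 28.41
print(round(100 * variance_explained(114.968, 65.646), 2))   # BIO: 42.9
print(round(100 * variance_explained(275.993, 158.281), 2))  # CHEM: 42.65
```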

44 Summary of the multilevel models (a). It is very clear that ENGLISH is a significant and substantial predictor of students' performance in MATH, BIO & CHEM. The school-level variances in the cons models, as well as in the models including ENGLISH as the single explanatory variable, demonstrate that a substantial proportion of the variance is attributable to school factors.

45 Summary of the multilevel models (b). There is not much improvement in the fit of the models (measured by the change in the share of school-level variance in the total variance). It is therefore essential to collect further school- and pupil-level data to examine which factors (e.g. English language learning opportunities at home and at school, academic English proficiency) account for the variances, and to what extent (in the tradition of school effectiveness studies). This is our plan for nationwide data collection using pupil and headteacher questionnaires and a vocabulary knowledge test.

46 High stakes classroom assessment: the realities

Impact/Potential disadvantage:
- Teachers do not use the full range of LTA procedures & processes
- Use of inappropriately constructed assessment frameworks
- Inaccurate CWA of learners
- CWA implemented as a series of tests

Consequences/Injustice:
- Learners not fully supported in their language & content knowledge development & fail to reach their potential
- Test performance valued over learning
- Learners leave school with poor educational outcomes
- Inadequate database for decision making about students & learning progression

47 Using English in high stakes examinations (SSA, UK, TIMSS, PISA): the realities

Impact/Potential disadvantage (examples):
- Learners do not engage, or respond poorly, in examinations
- Learners are not given a fair chance to show their abilities
- The subject-area construct (e.g. Biology, maths) can only be assessed once the linguistic construct has been successfully negotiated

Consequences/Injustice (examples):
- Loss of self-esteem & motivation for learning
- Learners do not achieve their potential (glass ceiling effect)
- Learners leave school as unsuccessful learners at the end of Basic Education (in some countries, the end of the Primary Phase)
- Unskilled individuals unable to join the workforce, in turn leading to social & economic deprivation

48
