Standardizing Testing in NATO Peggy Garza and the BAT WG Bureau for International Language Co-ordination.


1 Standardizing Testing in NATO Peggy Garza and the BAT WG Bureau for International Language Co-ordination

2 Background STANAG 6001, Ed 2 Language Testing Seminar (LTS) Ongoing Efforts Benchmark Advisory Test (BAT) Bureau for International Language Co-ordination

3 Background: What is the issue? “…English language is the foundation of interoperability…” Gen Ralston, former SACEUR Testing is a national responsibility NATO became more concerned with enforcing SLP requirements –SHAPE LTC –Partner/Force Goals

4 Background: What is the issue? PfP nations begin to establish national testing programs based on NATO standards –Asked for clarification on STANAG 6001 In 1999 BILC Steering Committee approved formation of WG –To interpret and elaborate the STANAG –Produced STANAG 6001, Ed 2

5 Custodian of STANAG 6001 In 2003, STANAG 6001, Ed 2 was promulgated by the NATO Standardisation Agency NTG/JSSG asked BILC to –Periodically review and update STANAG 6001 –Develop a seminar to familiarize Partner nations with the STANAG 6001 standards and the methodology for testing to the standards

6 Language Testing Seminar (LTS) BILC collaborative effort –Developed by testing experts from 5 BILC member nations Aim: To give NATO/PfP language teaching professionals the opportunity to practice developing test items based on STANAG 6001 Facilitators from Canada, Denmark, Germany, the Netherlands, Romania, Hungary, UK, USA

7 Language Testing Seminar (LTS) Two-week seminar conducted since 2000 –Beginning 2007, 3 times a year More than 200 participants from NATO/PfP nations to date 35 NATO and PfP nations –Med Dialog nations recently added

8

9 Ongoing Efforts Conferences –Teaching and Testing to Level 3 (Riga, 2004) Study Groups/Working Groups –Panel on test administration issues (Strasbourg, 2004) –Review of STANAG 6001 level titles (Budapest, 2006) –BILC mentoring of new testing programs (San Antonio, 2007) –Working Group on Testing (since 1999) Bilateral and regional cooperation on item development and piloting

10 Benchmark Advisory Test (BAT) BILC conducted a survey of nations on their interest in a benchmark test Benchmark test would provide a standard against which national tests can be calibrated –Purpose: to help foster equivalency of national tests Advisory in nature

11 Benchmark Advisory Test (BAT) In 2005, NATO/ACT approved concept of a benchmark test without financial support Relying totally on Voluntary National Contributions (VNC) –BAT WG was established –Items were solicited

12 Benchmark Advisory Test (BAT) In 2006, NATO/ACT directed that the benchmark test be developed –Dec 2006 ACT offered financial support to augment BILC’s plan to use VNC –Established a development timeline and a requirement for periodic status reports

13 Benchmark Advisory Test (BAT) BAT WG –Members: Bulgaria, Canada, Denmark, Hungary, the Netherlands, Romania, USA –Collaboration via e-mail –Face-to-face meetings in Budapest, Tallinn, and San Antonio –Developed test specifications, reviewed and validated items

14 BILC Benchmark Advisory Test (BAT) for Reading –Reading proficiency test –Multiple-choice test –Three-level test: L1-L2-L3 –60 items (20 per level) –Delivered and scored online –Item pool (may be expanding) from your voluntary national contributions (VNC) –Reviewed and validated by BAT WG –Estimated completion date: Dec 2007 –Modified Angoff method basis for validation

15 “Modified” Modified Angoff Method –25 “judges” from BAT WG and ELC faculty blindly rate passage and task levels (TASK = TEXT): % at level, % at next level, % below level –Two-pass method with discussion in between –Judges’ averages determine the cut-off scores for a reader at a given level (L1, L2, and L3): “What percentage of users at this level would answer the item correctly?” –Statistical support obtained through piloting at ELC –Very low degree of disagreement among the judges

16 Results of Modified Angoff Method

              L1              L2              L3
          Average  SD    Average  SD    Average  SD
1st Pass    76%    2.6    73.2%   2.1    73%     2.2
2nd Pass    75%    4.2    73%     3.5    73%     3.8
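The per-level Average/SD figures above come from averaging judges' item ratings. A minimal Python sketch of that arithmetic follows; it is an illustration only, not the BAT WG's actual tooling, and the judge ratings shown are hypothetical.

```python
from statistics import mean, stdev

def angoff_cut_score(ratings):
    """Summarize modified-Angoff judge ratings for one level.

    ratings: one list per judge, each holding that judge's estimate
    (0..1) of the probability that a borderline reader at the target
    level answers each item correctly.
    Returns (average percent across judges, SD across judges),
    mirroring the Average/SD columns reported per level.
    """
    judge_means = [100 * mean(items) for items in ratings]
    return mean(judge_means), stdev(judge_means)

# Hypothetical ratings from three judges on four Level 1 items
judges = [
    [0.80, 0.75, 0.70, 0.79],
    [0.78, 0.72, 0.74, 0.76],
    [0.82, 0.70, 0.72, 0.72],
]
avg, sd = angoff_cut_score(judges)  # → (75.0, 1.0)
```

The two-pass procedure described on slide 15 would simply run this summary once before and once after the judges' discussion, so the drop in spread between passes can be compared.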

17 Benchmark Advisory Test (BAT) Listening test –Development procedure will be similar to reading test –Listening specifications –Request VNC for item pool –First deadline 15 July 2007 Estimated completion date: March 2008

18 Benchmark Advisory Test (BAT) Speaking test –Oral proficiency interviews conducted by telephone –Digitized sound files scored by trained raters –Workshops for training raters Estimated completion date: March 2008

19 Benchmark Advisory Test (BAT) Writing test –Delivered online –Digitized files scored by trained raters –NATO writing topics and tasks –Workshop for training raters Estimated completion date: March 2008

20 Benchmark Advisory Test (BAT) Beta testing –NATO School and SHAPE LTC –Reading August or September 2007 –Listening December 2007 or January 2008

21 Benchmark Advisory Test (BAT) Conclusion –Need your assistance: more VNC –Listening items –NATO writing topics and tasks –Reading items to expand the item pool Contact: garzap@marshallcenter.org

