
BILC Standardization Efforts and a Preview of the Advanced Language Testing Seminar (ALTS)

This article provides an overview of the history and standardization efforts of STANAG 6001, as well as a preview of the Advanced Language Testing Seminar (ALTS).



Presentation Transcript


  1. Bureau for International Language Co-ordination. BILC Standardization Efforts and a Preview of the Advanced Language Testing Seminar (ALTS). Peggy Garza, Associate BILC Secretary for Testing Programs

  2. Bureau for International Language Co-ordination. Standardization Efforts • History of STANAG 6001 • STANAG 6001, Edition 2 • Language Testing Seminar (LTS) • STANAG 6001, Edition 3 • Benchmark Advisory Test (BAT) • Other Initiatives • Advanced Language Testing Seminar (ALTS)

  3. HISTORY OF STANAG 6001

  4. STANAG 6001 • Language proficiency descriptions developed by BILC • Adopted by NATO in 1976 - SLP (Standardized Language Profile) - Listening, speaking, reading, and writing - SLPs in job descriptions, prerequisites for training, Force Goals, Partner Goals

  5. Standardization Challenges • “…English language is the foundation of interoperability…” Gen Ralston, former SACEUR • Testing is a national responsibility • NATO became more concerned with enforcing SLP requirements

  6. Standardization Challenges • STANAG 6001 had been used without modification since 1976, although language proficiency descriptions in the field had been refined considerably since then • NATO/PfP interoperability issues brought a common understanding of STANAG 6001 to the forefront • Testing protocols diverged greatly among NATO and PfP countries

  7. BILC Response • Emphasis on testing at PfP Seminars • BILC Steering Committee approved formation of a Working Group (WG) on testing in 1999 • Aim: to interpret and elaborate the STANAG 6001 language descriptions by multilateral consensus • Testing Working Group (WG) involved academics from 11 countries • STANAG 6001 Interpretation and Elaboration document produced

  8. STANAG 6001, EDITION 2

  9. Custodian of STANAG 6001 • Interpretation and Elaboration Document became STANAG 6001, Edition 2 • Promulgated by the NATO Standardisation Agency (NSA) in 2003 • NATO Training Group (NTG) asked BILC to • Periodically review and update STANAG 6001 • Develop a seminar to familiarize Partner nations with the STANAG 6001 standards and the methodology for testing to the standards

  10. LANGUAGE TESTING SEMINAR (LTS)

  11. Language Testing Seminar (LTS) • BILC collaborative effort • Developed by testing experts from 5 BILC member nations • Aim: To give NATO/PfP language teaching professionals the opportunity to practice developing test items based on STANAG 6001 • Started in Fall 2000 • Sponsored by Allied Command Transformation (ACT) • Hosted at the Partner Language Training Center Europe of the George C. Marshall Center

  12. Language Testing Seminar (LTS) • Two-week graduate-level seminar on language testing • Faculty from Canada, Denmark, Germany, Hungary, Italy, Romania, the Netherlands, the Slovak Republic, the UK, and the USA • Since 2000, almost 300 attendees from 38 nations and 4 NATO offices

  13. Language Testing Seminar June 2009

  14. STANAG 6001, EDITION 3

  15. Custodian of STANAG 6001 • NATO Training Group (NTG) asked BILC to update STANAG 6001 • BILC Study Group reviewed STANAG 6001 in 2006 and recommended new labels for each level • BILC Testing WG developed descriptors for plus levels

  16. STANAG 6001 Label Harmonization • Former labels: 0 No Proficiency, 1 Elementary, 2 Limited Working, 3 Minimum Professional, 4 Full Professional, 5 Native/Bilingual • Current labels: 0 No Proficiency, 1 Survival, 2 Functional, 3 Professional, 4 Expert, 5 Highly-articulate Native

  17. Plus Level Descriptors • Lack of intermediate levels in STANAG 6001 was brought to the attention of the Steering Committee • The range between Level 2 and Level 3 is significant • Descriptors developed by a multinational WG of BILC testing experts • Use is optional per STANAG 6001, Edition 3

  18. STANAG 6001, Edition 3 • Ratified by the nations and promulgated by the NATO Standardisation Agency (NSA) in February 2009 • English and French versions • Can be downloaded from the BILC website www.bilc.forces.gc.ca

  19. BENCHMARK ADVISORY TEST (BAT)

  20. Benchmark Advisory Test (BAT) • NTG tasking to standardize testing • BILC proposed a benchmark test against which national tests can be calibrated • Purpose: to ensure equivalency of national tests • Advisory in nature

  21. Benchmark Advisory Test (BAT) • History • Survey conducted on need for a NATO-wide Benchmark Advisory Test (BAT) • Items collected from BILC member nations (Voluntary National Contributions) • Allied Command Transformation (ACT) funding for BAT was approved in December 2006

  22. Benchmark Advisory Test (BAT) • Is a general language proficiency exam • Everyday survival and work-related topics • Some NATO-related topics at higher levels • Not job-specific • The Reading and Listening tests • Computer-delivered, computer-scored

  23. Benchmark Advisory Test (BAT) • The Speaking test • Telephonic, person-to-person • Human-scored • The Writing test • Computer-delivered • Human-scored • Levels reported: 0, 1, 2, 3, with optional plus levels 0+ to 2+

  24. Benchmark Advisory Test (BAT) • Multinational BILC WG wrote test specifications • All countries invited to submit items • Web collaboration • Items posted to a secure website • BILC WG and leadership team met twice a year • 14 members from 8 nations and SHAPE • Contributions from many other nations

  25. Benchmark Advisory Test (BAT) • Validation of Listening and Reading tests • Modified-modified Angoff method • Conducted by BAT WG • Speaking and Writing tests • Tester/rater norming workshop for BAT WG
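
For readers unfamiliar with Angoff-style standard setting, the sketch below illustrates only the general calculation: each judge estimates, item by item, the probability that a borderline candidate at the target level would answer correctly, and the averaged estimates are summed into a recommended raw cut score. The judge ratings and item counts shown are hypothetical and are not drawn from the BAT working group's actual data or exact procedure.

```python
# Illustrative sketch of an Angoff-style cut-score calculation.
# All ratings below are hypothetical examples, not BAT data.
from statistics import mean

# judge_ratings[j][i] = judge j's estimated probability that a borderline
# candidate at the target level answers item i correctly
judge_ratings = [
    [0.60, 0.75, 0.40, 0.85],  # judge 1
    [0.55, 0.70, 0.50, 0.80],  # judge 2
    [0.65, 0.80, 0.45, 0.90],  # judge 3
]

# Average the judges' estimates for each item, then sum across items
# to obtain the recommended raw cut score for the test form.
item_means = [mean(item) for item in zip(*judge_ratings)]
cut_score = sum(item_means)

print("Per-item means:", [round(m, 2) for m in item_means])
print(f"Recommended raw cut score: {cut_score:.2f} out of {len(item_means)}")
```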

  26. OTHER INITIATIVES

  27. ADVANCED LANGUAGE TESTING SEMINAR (ALTS)

  28. Advanced Language Testing Seminar • Need for continuing professional development for test developers identified • Stakeholders’ meeting in September 2008 • Result: tentative plan for the Advanced Language Testing Seminar (ALTS)

  29. Advanced Language Testing Seminar • Target audience: LTS graduates who continue to work on STANAG 6001 testing • STANAG 6001 Edition 3 standardization sessions • Focus on L3 for practical exercises • Opportunity to exchange best practices • Modular approach to seminar

  30. Advanced Language Testing Seminar • Three-week seminar • Module 1: Productive skills: Speaking and Writing (one week) • Module 2: Receptive skills: Listening and Reading (one week) • Module 3: Test Analysis (3 days) • Module 4: Managerial Issues (2 days)

  31. Advanced Language Testing Seminar • Development (Jan-Aug 2009) • Each module is being developed by a two-person multinational team of BILC testing experts • Validation (24 Aug-11 Sept 2009) • Module developers will teach their modules • ALTS participants will provide feedback on the objectives, content and delivery of the modules

  32. ADVANCED LANGUAGE TESTING SEMINAR (ALTS) TOPICS AND ACTIVITIES

  33. Discussion Questions 1. What should the job qualifications be for members of a STANAG 6001 testing team? 2. How should work responsibilities be divided among testing team members? Discuss the tasks of test developers vs. test administrators. 3. Make recommendations for recruiting and retaining testing team members. 4. Discuss the relationship between the testing team and the teachers. 5. Make recommendations for training, norming, and renorming members of a testing team. 6. How can conflict arise within a testing team? Make recommendations for dealing with such conflict.
