ENHANCING TRAINING AND TESTING EFFECTIVENESS THROUGH LESSONS LEARNED

Presentation Transcript


  1. MINISTRY OF DEFENSE, REPUBLIC OF BULGARIA. ENHANCING TRAINING AND TESTING EFFECTIVENESS THROUGH LESSONS LEARNED. Emilia Nesheva, Ministry of Defense, Bulgaria; Greta Keremidchieva, Rakovski National Defense Academy, Bulgaria.

  2. ENGLISH LANGUAGE TRAINING AND TESTING – OFFICIAL POLICY OF THE MOD. BASED ON: DOCUMENTATION; TRAINING PROGRAMS; QUALIFIED TEACHERS – “SOFT POWER”; BOOKS AND EQUIPMENT – “HARDWARE”.

  3. REGULATING DOCUMENTS: 1. Order of the Minister of Defense №130/21.02.2011 regarding the Strategy for development of the English language training system; 2. Order of the Minister of Defense №760/04.10.2011 regarding selection of candidates, preliminary testing, and conducting English courses; 3. Regulation №Н-19/25.05.2010 regarding planning and development of language courses.

  4. LANGUAGE TRAINING CENTERS: RAKOVSKI NATIONAL DEFENSE ACADEMY – SOFIA; NATIONAL MILITARY UNIVERSITY – VELIKO TARNOVO; LANGUAGE TRAINING CENTER – SHUMEN (part of the Individual Training and Education Programme – ITEP, NATO Tier 1 Smart Defence 1.7 Project); NAVAL MILITARY ACADEMY – VARNA; MILITARY UNITS.

  5. ENGLISH LANGUAGE TRAINING IN RAKOVSKI NATIONAL DEFENSE ACADEMY: training of officers in the Master’s degree program; training of military personnel and civilians in General and Specialized English courses.

  6. Target audience in Rakovski National Defense Academy: officers in the Master’s Program – Entry: now 1111, next year (2222); Exit: now 2222, next year (3 2+3 2) – and officers, NCOs, and civilians in English language courses. NEW

  7. Training program in Rakovski National Defense Academy: the Master’s Program starts with a preliminary block of 360 hours of intensive English training before the officers’ first semester of military studies. Advantage: some military subjects can then be studied in English. NEW

  8. General and Specialized English language training NEW

  9. ASSESSMENT AND CONTROL: Preliminary Testing (ALCPT before Level II); Entry Tests; book quizzes and Progress Tests; Final Test, ALCPT, STANAG 6001.

  10. FACULTY MEMBERS: Teachers – 10: university graduates with specializations in different countries and institutions; experienced; participants in national and international conferences. Test Team members – 5: university graduates; trained; they design, pilot, and administer STANAG tests nationwide.

  11. BOOKS & EQUIPMENT: ALC Books 1–34; specialized textbooks – Campaign, Military English, Army, Air Force, Navy, etc.; language labs; a Self-Access Center; audio and multimedia programs.

  12. FINANCIAL RESOURCES: the MOD program provides financial resources for the accomplishment of Force Goal E 0356, for improvement of infrastructure and learning technologies, and for conferences and seminars.

  13. Enhancing Testing Effectiveness through Computer Testing

  14. Action Plan – Ministerial Order №857/10.11.2011
  • by 1 March 2012 – approbation (trial run) of the computer-assisted “writing” module;
  • by 1 July 2012 – analysis of the results;
  • from exam session Autumn 2012 – optional computer-assisted “writing” module;
  • by 1 March 2013 – approbation of the computer-assisted “reading” and “listening” modules;
  • by 1 July 2013 – analysis of the results;
  • from exam session Autumn 2013 – optional computer-assisted “writing”, “reading”, and “listening” modules;
  • from exam session Autumn 2014 – all candidates are to be tested on computers for the “writing”, “reading”, and “listening” modules. End of pen-and-paper STANAG 6001 tests.

  15. Approbation of the computer-assisted “writing” module: in February 2012, two groups of volunteers – military and civilian personnel – sat for the approbation of the writing test. The only requirement for taking part in the approbation was to hold a STANAG 6001 certificate issued during the exam session “Autumn 2011”.

  16. Comparability of handwritten and computer-based writing assessments – some findings in the specialized literature

  17. Questionnaires (excerpt)
  • Did you find the font of the tasks friendly enough?
  • Did you encounter any technical difficulties during the computer-assisted “writing” test?
  • Do you think that your typing speed is decisive for your success on the computer-assisted “writing” test?

  18. Recommendations from the Questionnaires (a sketch of the word-count and tabulation features follows this list)
  • The typing font should be Times New Roman 12 or 14.
  • Candidates should be able to go back to the previous task to revise or correct mistakes.
  • A word count should be available.
  • Tabulation for indicating a new paragraph should be available.
  • Five minutes for a spell check should be added to the time allotted for the writing test.
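
Two of these recommendations – the live word count and Tab-key paragraph indentation – are standard text-editing behavior. As a rough illustration only, the TypeScript sketch below shows one way a browser-based writing module might provide them; the element IDs and the whitespace-based counting rule are assumptions for this example, not details of the MOD’s actual testing software.

```typescript
// Minimal sketch, assuming a page with <textarea id="essay"> and a
// <span id="count"> status element. The IDs and the counting rule are
// illustrative assumptions, not taken from the MOD's testing software.

const essay = document.getElementById("essay") as HTMLTextAreaElement;
const counter = document.getElementById("count") as HTMLSpanElement;

// Count whitespace-separated tokens as words.
function wordCount(text: string): number {
  return text.trim().split(/\s+/).filter(Boolean).length;
}

// Keep the word count visible while the candidate types.
essay.addEventListener("input", () => {
  counter.textContent = `${wordCount(essay.value)} words`;
});

// Let the Tab key insert an indent instead of moving keyboard focus,
// so a new paragraph can be marked the way it would be on paper.
essay.addEventListener("keydown", (event) => {
  if (event.key === "Tab") {
    event.preventDefault();
    essay.setRangeText("    ", essay.selectionStart, essay.selectionEnd, "end");
  }
});
```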

  19. Actions Taken
  • The commercial company that had developed the testing software was tasked to correct the faults noticed during the approbation and to add the functions recommended in the questionnaires.
  • The STANAG testing team developed detailed administrative instructions for the computer-assisted writing test.

  20. Approbation Results

  21. Analysis: special attention was paid to the candidates who attained a higher or lower STANAG 6001 level on the computer-assisted writing test. Possible reasons analyzed:
  • Was the rating of their pen-and-paper test subject to discussion? How many raters had assessed the papers?
  • Additional English language studies?
  • Different “pre-writing” habits?
  • Rater effects in scoring handwritten and computer-based writing assessments?

  22. LESSONS LEARNT (possible scoring biases)
  • Typed essays generally appear to raters shorter than identical handwritten responses.
  • Errors are more evident when typed.

  23. LESSONS LEARNT (cont.) (the importance of the interface; see the sketch after this list)
  • Traditional text-editing tools – cut, copy, and paste – need to be designed to support less experienced computer users.
  • Space management – “word count” and “print preview” – should be available.
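
On the first point, one low-barrier option is to expose the editing tools as visible buttons rather than relying on keyboard shortcuts. The sketch below, in the same hypothetical TypeScript setting as the earlier example, wires Cut / Copy / Paste buttons to the essay textarea through the asynchronous Clipboard API; the button IDs are assumptions, and a real test client would also need to handle clipboard permissions and fallbacks.

```typescript
// Minimal sketch, assuming <textarea id="essay"> plus three buttons with
// ids "copyBtn", "cutBtn", and "pasteBtn" (all illustrative assumptions).

const area = document.getElementById("essay") as HTMLTextAreaElement;

// Return the candidate's current selection inside the textarea.
function selectedText(): string {
  return area.value.slice(area.selectionStart, area.selectionEnd);
}

document.getElementById("copyBtn")!.addEventListener("click", async () => {
  await navigator.clipboard.writeText(selectedText());
});

document.getElementById("cutBtn")!.addEventListener("click", async () => {
  await navigator.clipboard.writeText(selectedText());
  // Remove the cut text and leave the caret at the cut point.
  area.setRangeText("", area.selectionStart, area.selectionEnd, "start");
});

document.getElementById("pasteBtn")!.addEventListener("click", async () => {
  const text = await navigator.clipboard.readText(); // may prompt for permission
  area.setRangeText(text, area.selectionStart, area.selectionEnd, "end");
});
```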

  24. Questions?

  25. This presentation uses quotations from:
  • Breland, Lee, & Muraki, 2005;
  • Bridgeman & Cooper, 1998;
  • Davis, Strain-Seymour, Lin, & Kong, 2008;
  • Horkay et al., 2006;
  • Pearson, 2010;
  • Russell & Plati, 2001;
  • Russell & Haney, 1997;
  • Way & Fitzpatrick, 2006.
