
As slippery as an eel? Assessing speaking and writing Part One

Presentation Transcript


  1. As slippery as an eel? Assessing speaking and writing, Part One. Ülle Türk, University of Tartu / Estonian Defence Forces. 23rd CSW, Tampere, 27-29 March 2009

  2. Testing writing? • Fill in the gaps with suitable words so that the text is true for you. • Re-write the text in the future tense. • Fill in the form with the information given in the box. • Read the letter and write an answer. • Write an essay on the topic “Why study English?” • Study the pictures, put them in the order you think best and write the story. • Read the text and write a short summary of it. • You bought a new dictionary yesterday, but found later that several pages were missing. Write a letter to the manager of the shop informing him of the problem and telling him what you want him to do about it. • Read the basic facts about Australian history and then write a short report.

  3. Questions • What is it exactly that we assess when we say we assess students’ speaking and writing skills? • How do we arrive at a common understanding of what is ‘good’ writing, what is a ‘good’ oral presentation or what constitutes ‘good’ spoken or written communication?

  4. Terms • Assessment • Formal vs. informal • Continuous vs. fixed-point • Formative vs. summative • Testing • Achievement • Proficiency • Diagnostic • Placement • High-stakes vs. low-stakes

  5. Assessment/ test quality • Validity • Reliability • Authenticity • Washback • Practicality

  6. Validity: definitions • A good test needs to be valid. = It must test what it is meant to test. • an integrated evaluative judgement of the degree to which empirical evidence and theoretical rationales support the adequacy and appropriateness of inferences and actions based on test scores. • S. A. Messick (“Validity” in R. L. Linn (ed.) Educational Measurement. 1989, p. 13)

  7. Validity • Does the test match the curriculum, or its specifications? • Is the test based adequately on a relevant and acceptable theory? • Does the test yield results similar to those from a test known to be valid for the same audience and purpose? • Does the test predict a learner’s future achievements?

  8. Validity • Content validity • Construct validity • Criterion-related validity • Predictive validity • Construct validity is indeed the unifying concept that integrates criterion and content considerations into a common framework for testing rational hypotheses about theoretically relevant relationships. • Messick, S. A. “Test validity and the ethics of assessment.” American Psychologist 35, 1980, p. 1015

  9. Threats to test validity • construct-irrelevant variance • construct under-representation

  10. Factors affecting validity • Lack of specifications • Lack of training of item/test writers • Lack of / unclear criteria for marking • Lack of piloting/pre-testing • Lack of detailed analysis of items/tasks • Lack of feedback to candidates and teachers

  11. Communicative competence 1 • Canale & Swain (1980), Bachman (1990): language knowledge types • Linguistic knowledge • Discourse knowledge • Sociolinguistic knowledge

  12. Grabe & Kaplan (1996): Model ofWriting • Components of language knowledge relevantto writing • linguistic knowledge: written code, morphology,vocabulary, syntax • discourse knowledge: cohesion, structure, genre • sociolinguistic knowledge: functional uses ofwriting, register, situational parameters • Influential in teaching and testing of writing(e.g., Weigle, 2002)

  13. Communicative competence 2 • Bachman & Palmer (1996): communicative language ability • Language knowledge • Strategic competence

  14. Douglas (2000): Specific Purpose Language Ability • Language knowledge • grammatical knowledge • textual knowledge • rhetorical organization • cohesion • functional knowledge • sociolinguistic knowledge • Strategic competence • assessment • goal setting • planning • control of execution • Background knowledge • discourse domain

  15. Common European Framework of Reference for Languages: Learning, teaching, assessment (2001) • General competences • Communicative language competences

  16. General competences

  17. Communicative language competences

  18. Communicative language activities and strategies • productive activities and strategies • receptive activities and strategies • interactive activities and strategies • mediating activities and strategies • non-verbal communication • practical actions • paralinguistics • paratextual features

  19. Oral production • public address (information, instructions, etc.) • addressing audiences (speeches at public meetings, university lectures, sermons, entertainment, sports commentaries, sales presentations, etc.) • reading a written text aloud • speaking from notes, or from a written text, or from visual aids • acting out a rehearsed role • speaking spontaneously • singing

  20. Spoken interaction • transactions; • casual conversation; • informal discussion; • formal discussion; • debate; • interview; • negotiation; • co-planning; • practical goal-oriented co-operation.

  21. Oral mediation • simultaneous interpretation (conferences, meetings, formal speeches, etc.) • consecutive interpretation (speeches of welcome, guided tours, etc.) • informal interpretation: • of foreign visitors in own country • of native speakers when abroad • in social and transactional situations for friends, family, clients, foreign guests, etc. • of signs, menus, notices, etc.

  22. CEFR levels The Common European Framework of Reference (Council of Europe 2001) defines communicative proficiency • at six levels, arranged in three bands: A1 A2 B1 B2 C1 C2 • in relation to six skills: listening, reading, spoken interaction, spoken production, written interaction, written production • in the form of “can do” statements

  23. Getting to know the levels • The self-assessment grid is not enough • More specific scales: • CEFR Ch 4: descriptors of communicative activities • CEFR Ch 5: descriptors of linguistic competence • The ELP (European Language Portfolio) • Manual: Relating Language Examinations to the Common European Framework of Reference for Languages: Learning, Teaching, Assessment (CEFR)

  24. Self-assessment grid (CEFR and standard adult passport): I can deal with most situations likely to arise whilst travelling in an area where the language is spoken. I can enter unprepared into conversation on topics that are familiar, of personal interest or pertinent to everyday life (e.g. family, hobbies, work, travel and current events).

  25. CercleS ELP: goal-setting and self-assessment checklists

  26. Questions to ask • What competences should my students have in • Spoken interaction • Spoken production • Written interaction • Written production • What tasks should they be able to perform to demonstrate their mastery of the competences? • How well should they be able to perform them?

  27. Reliability • A test needs to be reliable. = It must produce consistent results at different times. • NB! A test that is not reliable cannot, by definition, be valid.

  28. Reliability • If I take the test again tomorrow, will I get the same result? • If I take a different version of the test, will I get the same result? • If the test had had different items, would I have got the same result? • Do all markers agree on the mark I got? • If the same marker marks my test paper again tomorrow, will I get the same result?
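
  The questions on slide 28 correspond to standard reliability checks (test-retest, parallel forms, internal consistency, marker agreement), each of which is typically answered with a quantitative coefficient. As a minimal sketch of the last two questions only, assuming pass/fail marking by two markers and Cohen's kappa as the agreement statistic (neither choice is specified in the slides), marker agreement could be estimated like this:

  # Sketch only: quantifying inter-rater agreement with Cohen's kappa.
  def cohens_kappa(ratings_a, ratings_b):
      """Agreement between two markers, corrected for chance agreement."""
      assert len(ratings_a) == len(ratings_b)
      n = len(ratings_a)
      categories = set(ratings_a) | set(ratings_b)
      # Observed agreement: proportion of scripts both markers rated identically.
      p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
      # Expected agreement if the two markers had rated independently.
      p_e = sum((ratings_a.count(c) / n) * (ratings_b.count(c) / n) for c in categories)
      return (p_o - p_e) / (1 - p_e)

  # Hypothetical data: two markers rate 20 writing scripts pass/fail and agree on 16.
  marker_a = ["pass"] * 12 + ["fail"] * 8
  marker_b = ["pass"] * 9 + ["fail"] * 3 + ["pass"] * 1 + ["fail"] * 7
  print(cohens_kappa(marker_a, marker_b))  # 0.6

  A kappa close to 1 would suggest the markers apply the rating criteria consistently; values well below that point back to the marker training, standardisation and monitoring issues listed on slide 29.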

  29. Factors affecting reliability • Poor administration conditions – noise, lighting, cheating • Lack of information beforehand • Lack of specifications • Lack of marker training • Lack of standardisation • Lack of monitoring

  30. References • Bachman, Lyle F. (1990) Fundamental Considerations in Language Testing, Oxford University Press, Oxford. • Bachman, Lyle F. and Palmer, Adrian (1996) Language Testing in Practice, Oxford University Press, Oxford. • Cushing Weigle, Sara (2002) Assessing Writing, Cambridge University Press, Cambridge. • Douglas, Dan (2000) Assessing Languages for Specific Purposes, Cambridge University Press, Cambridge.
