
Standards, Structure, & Field Testing of ACCESS for ELLs™


Presentation Transcript


  1. Standards, Structure, & Field Testing of ACCESS for ELLs™
  Jessica Motz, Center for Applied Linguistics, Washington, DC
  January 2005
  Illinois 28th Annual Statewide Conference for Teachers Serving Linguistically and Culturally Diverse Students

  2. Outline
  • Background on the WIDA Project
  • WIDA Standards for English Language Proficiency – language vs. content
  • Structure of the ACCESS for ELLs™ Test
  • Sample Items
  • Pilot Test Results
  • Field Testing Update
  • Training for ACCESS for ELLs™
  • Other Issues/Questions/Discussion

  3. Origins of the WIDA Consortium
  • 2002: U.S. Dept. of Education Enhancement Grant Competition
  • 3 Original States: Wisconsin, Delaware, Arkansas
  • Early Additions: District of Columbia, Rhode Island, Maine, New Hampshire, Vermont
  • Later Additions: Illinois, Alabama
  • Ten states, representing approx. 275,000 ELLs

  4. Multiple WIDA Components
  • Language Proficiency
  • English Language Proficiency Standards
  • Large-Scale Assessment (ACCESS for ELLs)
  • Classroom Instruction & Assessment
  • Academic Content
  • Alternate Assessment (Alternate ACCESS)
  • Professional Development
  • Standards
  • Assessment
  • Instruction
  • Validation and Research

  5. WIDA Partners
  • State Leadership (Steering Committee): Tim Boals, Wisconsin DPI, Chair
  • Lead Developer (Standards and Project PI): Margo Gottlieb, Illinois Resource Center
  • Item Specification Development: Fred Davidson, UIUC
  • Test Development: Language Testing Division, CAL
  • Professional Development: Lorraine Valdez-Pierce, George Mason University
  • Technical Applications (Database, Desire2Learn): University of Wisconsin - Oshkosh

  6. Centrality of the ELP Standards
  [Diagram: the English Language Proficiency Standards & Performance Definitions sit at the center, feeding both the Classroom Assessment Framework (Model Performance Indicators: Classroom) and the Large-Scale Assessment Framework (Model Performance Indicators: Large-Scale).]

  7. Overall Organization of Standards
  • Frameworks for Classroom & Large-Scale Assessment (2)
  • English Language Proficiency Standards (5)
  • Language Domains (4)
  • Grade Level Clusters (4)
  • Language Proficiency Levels (5)
  • Model Performance Indicators (Model PIs are the lowest level of expression of the standards)

  8. The WIDA ELP Standards
  • Standard 1 (SI): English language learners communicate in English for social and instructional purposes in the school setting.
  • Standard 2 (LA): English language learners communicate information, ideas and concepts necessary for academic success in the content area of Language Arts.
  • Standard 3 (MA): English language learners communicate information, ideas and concepts necessary for academic success in the content area of Math.
  • Standard 4 (SC): English language learners communicate information, ideas and concepts necessary for academic success in the content area of Science.
  • Standard 5 (SS): English language learners communicate information, ideas and concepts necessary for academic success in the content area of Social Studies.

  9. The Levels of English Language Proficiency
  1 ENTERING
  2 BEGINNING
  3 DEVELOPING
  4 EXPANDING
  5 BRIDGING
  6 Formerly LEP
  7 Never LEP

  10. Criteria for Proficiency Level Definitions (across Levels 1 ENTERING through 5 BRIDGING)
  • Comprehension and use of the technical language of the content areas
  • Extent of language (text or discourse) control
  • Development of phonological, syntactic, and semantic understanding or usage

  11. Large-Scale Standards: SC Reading 11

  12. Large-Scale Standards: SC Reading Classify living organisms (such as birds and mammals) by using pictures or icons 12

  13. Large-Scale Standards: SC Reading Interpret data presented in text and tables in scientific studies 13

  14. Large-Scale Standards: SC Reading 14

  15. Assessment Forms
  • Non-secure form for initial screening (July 1)
    • One for each grade level cluster, with items at all 5 proficiency levels
    • Kindergarten form: individually administered
  • Secure forms for annual testing
    • Two (initially: 100 and 200) for each grade level cluster
    • Tier A: Proficiency levels 1-3
    • Tier B: Proficiency levels 2-4
    • Tier C: Proficiency levels 3-5

  16. Annual ACCESS for ELLs Tier Alignment with Proficiency Levels
  [Chart: proficiency levels 1 ENTERING through 5 BRIDGING across the top, with Tier A spanning levels 1-3, Tier B spanning levels 2-4, and Tier C spanning levels 3-5.]
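The alignment shown in this chart is essentially a lookup from a student's estimated proficiency level to the tiers whose item range covers it. The following is a minimal sketch of that lookup, assuming a hypothetical helper function (eligible_tiers) that is not part of the WIDA materials; it simply restates the ranges given on slide 15.

```python
# Minimal sketch (not part of the WIDA materials): map an estimated proficiency
# level (1-5) to the tiers whose targeted range covers it, per the alignment
# above (Tier A: levels 1-3, Tier B: 2-4, Tier C: 3-5).
TIER_LEVEL_RANGES = {
    "A": range(1, 4),   # Entering through Developing
    "B": range(2, 5),   # Beginning through Expanding
    "C": range(3, 6),   # Developing through Bridging
}

def eligible_tiers(proficiency_level):
    """Return the tiers whose targeted proficiency levels include the given level."""
    return [tier for tier, levels in TIER_LEVEL_RANGES.items()
            if proficiency_level in levels]

print(eligible_tiers(2))  # ['A', 'B']: a level-2 student could take Tier A or B
```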

  17. WIDA Steering Committee Administration Decisions
  • Listening (15%): 20-25 minutes, machine scored
  • Reading (35%): 35-40 minutes, machine scored
  • Writing (35%): up to 1 hour, rater scored
  • Speaking (15%): up to 15 minutes, administrator scored

  18. Grade Level and Tier Structure of ACCESS for ELLs
  Grade cluster | Tiers
  K             | A (adaptive)
  1-2           | A, B, C
  3-5           | A, B, C
  6-8           | A, B, C
  9-12          | A, B, C
  Domains:
  • Listening: group admin, machine scored
  • Reading: group admin, machine scored
  • Speaking: individual admin, adaptive, TA scored
  • Writing: group admin, rater scored
  Forms:
  • 100 (roll-out Spring 2005)
  • 999 (used to produce screener)
  • 200 (roll-out Spring 2006)
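The grade-cluster, domain, and form information on this slide can also be captured as plain data. The dictionaries below are only an illustrative restatement of the slide; the names (GRADE_CLUSTER_TIERS, DOMAINS, FORMS) are my own, not official WIDA terminology.

```python
# Illustrative restatement of the ACCESS for ELLs structure described above.
GRADE_CLUSTER_TIERS = {
    "K": ["A (adaptive)"],
    "1-2": ["A", "B", "C"],
    "3-5": ["A", "B", "C"],
    "6-8": ["A", "B", "C"],
    "9-12": ["A", "B", "C"],
}

DOMAINS = {
    "Listening": {"administration": "group", "scoring": "machine scored"},
    "Reading": {"administration": "group", "scoring": "machine scored"},
    "Speaking": {"administration": "individual, adaptive", "scoring": "TA scored"},
    "Writing": {"administration": "group", "scoring": "rater scored"},
}

FORMS = {
    "100": "roll-out Spring 2005",
    "999": "used to produce screener",
    "200": "roll-out Spring 2006",
}
```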

  19. Item Creation Process (Fall 2003 – present)
  • Item specifications drafted
  • Item writers assembled from nominated ESL teachers in consortium states
  • Item writers trained using Blackboard distance-learning management software
  • Item writers submitted items electronically
  • Items reviewed by External Reviewers, also trained via Blackboard
  • Items reviewed and revised internally & organized into themes…

  20. Item Creation Process, continued… (Fall 2003 – present)
  • Thematic Folders of items arranged onto pilot test forms
  • Forms piloted in 5 WIDA districts
  • Pilot analysis & feedback incorporated; items created, revised by ESL teachers
  • Thematic Folders of items arranged onto field test forms
  • Forms field tested in 8 WIDA states
  • Field test analysis occurring presently
  • For more information on the item development process: Jim Bauman’s session

  21. Pilot Testing Results 21

  22. Pilot Test Participation
  • April – May 2004
  • 5 Districts: Kenosha & Milwaukee, WI; Chicago & Cicero, IL; Washington, DC
  • Approx. 1100 students in grades 1-12

  23. Listening Test
  • Multiple choice
  • 20-25 minutes

  24. Sample Items: Listening, Science 1-2 24

  25. Science Listening 1-2 P1 PI: “identify objects according to chemical or physical properties from pictures and oral statements” SCRIPT: “A seed is small. Find the small seed.” 25

  26. Science Listening 1-2 P2 PI: “match objects with their chemical or physical properties from pictures and oral statements” SCRIPT: “One day the seed will grow into something large, round, and heavy. Find what the seed grows into.” 26

  27. Science Listening 1-2 P3 PI: “identify and group objects according to chemical or physical properties from oral statements” SCRIPT: “Seeds grow into plants. Find something else that grows.” 27

  28. Percent Correct on Each Item (Grades 1-2: n = 173)
  [Bar chart of mean proportion correct on the sample Listening items: P1scored 0.94, P2scored 0.89, P3scored 0.79.]
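The values in this chart are classical item p-values, i.e. the mean of 0/1 item scores across examinees. Below is a hedged sketch of that computation; the response records and the item keys (P1scored, P2scored, P3scored) are invented stand-ins, not the actual pilot data.

```python
# Sketch: percent correct (item p-value) per item from 0/1 scored responses.
# The records below are invented examples, not actual pilot data.
responses = [
    {"P1scored": 1, "P2scored": 1, "P3scored": 0},
    {"P1scored": 1, "P2scored": 0, "P3scored": 1},
    {"P1scored": 1, "P2scored": 1, "P3scored": 1},
    {"P1scored": 0, "P2scored": 1, "P3scored": 1},
]

def percent_correct(records, item):
    """Mean of the 0/1 scores for one item across all examinees."""
    return sum(r[item] for r in records) / len(records)

for item in ("P1scored", "P2scored", "P3scored"):
    print(item, round(percent_correct(responses, item), 2))
```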

  29. Percent Correct on Each Item (by Tier) 29

  30. Percent Correct on Each Item (by Grade Level) 30

  31. Average Listening Item Difficulty Across Grade Clusters (by Proficiency Levels)
  [Line chart: mean ITEMDIFF (y-axis, approx. 200-1000) plotted against PLEVEL 1-5 (x-axis), with separate lines for grade clusters g12, g35, g68, and g912.]
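This chart summarizes average calibrated item difficulty by grade cluster and proficiency level. A minimal pandas sketch of that kind of summary is shown below, assuming a hypothetical item-level table; the numbers are invented for illustration, not the actual calibration results.

```python
import pandas as pd

# Hypothetical item-level table: one row per Listening item, with its grade
# cluster (GRADELEV), targeted proficiency level (PLEVEL), and calibrated
# difficulty (ITEMDIFF). Values are invented for illustration.
items = pd.DataFrame({
    "GRADELEV": ["g12", "g12", "g35", "g35", "g68", "g912"],
    "PLEVEL":   [1, 3, 2, 4, 3, 5],
    "ITEMDIFF": [310, 480, 420, 610, 560, 820],
})

# Mean difficulty per grade cluster at each proficiency level, mirroring the chart.
summary = items.groupby(["GRADELEV", "PLEVEL"])["ITEMDIFF"].mean().unstack("PLEVEL")
print(summary)
```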

  32. Average Reading Item Difficulty by Grade Level Cluster across P-Levels 32

  33. Reading Test
  • Multiple choice
  • 35-40 minutes

  34. Examining Reading Items Across a Strand for the Different Tiers Language Arts, Reading, Grades 3-5 34

  35. Sample Item (Field Test): Reading, Lang. Arts, Grades 3-5, Tier C 35

  36. Items Tied to Performance Indicators: Tier C Items
  • #11: “To Jessica, he was the best dog in the world.” This sentence shows Jessica’s opinion. Which of the following also shows an opinion?
    PI (p3): Identify language associated with stating opinions found in fiction or non-fiction text.
  • #12: When Jessica saw the sign for the lost dog, why did she believe it was Blue?
    PI (p4): Differentiate between statements of fact and opinion found in various reading selections.
  • #13: Why does the woman say at the end of the story, “He’s your dog, all right!”?
    PI (p5): Identify author’s reasons or intent for selecting facts or opinions found in fiction or non-fiction from grade-level language arts text.

  37. Reading Items Adapted for Tier A
  • Simpler text and more graphic support
  • Items at proficiency levels 1, 2, and 3
  • For example:
    1. Which is Blue?
       PI (p1): Match labels or identify facts from pictures and phrases.
    2. “I know he has white spots.” Which words in this sentence tell you it is a fact?
       PI (p2): Identify language associated with stating facts found in short fiction or non-fiction text supported by pictures or graphics.
    3. [Same item from Tier C] “To Jessica, he was the best dog in the world.” This sentence shows Jessica’s opinion. Which of the following also shows an opinion?
       PI (p3): Identify language associated with stating opinions found in fiction or non-fiction text.

  38. Annual ACCESS for ELLs Tier Alignment with Proficiency Levels on Test Forms (L, R, W)
  [Chart: proficiency levels 1 ENTERING through 5 BRIDGING, with Tier A spanning levels 1-3, Tier B spanning levels 2-4, and Tier C spanning levels 3-5.]

  39. Writing Test
  • Up to 1 hour
  • 4 tasks per tiered form: SI, MA, SC, LA/SS
  • Model provided to give background and structure for the task

  40. Sample Item: Writing Grades 6-8 Lang. Arts Writing 6-8 P5 PI: “defend positions or stances using original ideas with supporting details” 40

  41. [Image-only slide]

  42. Speaking Test: Adaptive Format 42

  43. Sample Item: Speaking 43

  44. Task (Proficiency Level) 1 Example: Grades 3-5
  First let’s talk about things people do outside. This is a picture of people in a park. I’m going to ask you some questions about this picture.
  Q1: (Point to TREE) What is this?
  Q2: (Point to BALL) What is this?
  Q3: (Point to DOG) What is this?
  Q4: (If necessary) What else do you see in this picture (OR) What other things do you see in this picture?
  SI Speaking 3-5: P1 “Respond to WH- questions”

  45. Task (Proficiency Level) 2 Example:
  Now listen carefully. I’ve just asked you some questions about this picture. Now I want you to ask me some questions about it. (OR) Pretend you are the teacher and want to ask me some questions about this picture. For example, you could ask me, “Where are the people?” OK?
  Q1: (Point to BOY ON BIKE) What do you want to know about him? (OR) Ask me a question about him.
  Q2: (Point to PICNIC TABLE) What do you want to know about this? (OR) Ask me a question about this.
  Q3: What other things do you want to know about this picture? (OR) What’s another question you can ask me about (anything in) this picture? (Answer student’s question.)
  SI Speaking 3-5: P2 “Ask and respond to questions”

  46. Task (Proficiency Level) 3 Example:
  Now let me tell you something about these children. (Point to CHILDREN PLAYING CATCH) Their names are Alex and Leticia. They like to play catch.
  Q1: Do you like to play catch?
  Q2: (If “Yes”) What else do you like to do?
  Q3: (If “No”) What do you like to do?
  Q4: What do you like about __________? (OR) Tell me something about ___________.
  Q5: (If necessary) Tell me more.
  SI Speaking 3-5: P3 “Exchange personal information”

  47. Field Testing: Overview 47

  48. Field Test Participation
  Request:
  • Group-administered sections: 600 per form (minimum 400 per form)
  • Individually-administered sections: 225 per form
  • Includes 8 WIDA states
  • Proportional representation of states (approx. 5.5%)
  • Approx. 8700 students total (~3500 from Illinois)

  49. Initial Field Test Analyses
  • Concurrent calibration of MC items (Rasch)
  • Separate scale construction for each domain
  • Raw score to scale score on the screener
  NOTE: The Steering Committee determined the following weights when a single level designation is needed:
  • Writing 35% (e.g. level 2)
  • Reading 35% (e.g. level 2)
  • Listening 15% (e.g. level 5)
  • Speaking 15% (e.g. level 4)
  • Composite = 2.75 (not yet 3)
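The weighting note implies a simple weighted average across the four domains. The sketch below reproduces the slide's example arithmetic (Writing 2, Reading 2, Listening 5, Speaking 4 gives a composite of 2.75); the function name composite_level is illustrative, not WIDA's.

```python
# Weighted composite proficiency level using the Steering Committee weights
# quoted above: Writing 35%, Reading 35%, Listening 15%, Speaking 15%.
WEIGHTS = {"Writing": 0.35, "Reading": 0.35, "Listening": 0.15, "Speaking": 0.15}

def composite_level(domain_levels):
    """Weighted average of the four domain proficiency level designations."""
    return sum(WEIGHTS[domain] * level for domain, level in domain_levels.items())

example = {"Writing": 2, "Reading": 2, "Listening": 5, "Speaking": 4}
print(round(composite_level(example), 2))  # 2.75, matching the slide's example
```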

  50. Sample Feedback Received to Date
  Input from test administrators and coordinators:
  • Test Length: The test is taking longer to administer than anticipated.
  • Test Difficulty: The test is more difficult than anticipated.
  • Grades 1-2 Test: Too challenging for fall first graders; these students should take the K test.
  • Grades 9-12 Test: There is not enough authentic literature on the Reading Test.
  CAL and DPI responded to these concerns via e-mail and the D2L (online training) discussion board.
