
From Clinic to Mandates: The Praxis of Teaching Reading after Reading Clinic/Literacy Lab. National Reading Conference, Austin, TX, November 2007


Presentation Transcript


  1. From Clinic to Mandates: The Praxis of Teaching Reading after Reading Clinic/Literacy Lab. National Reading Conference, Austin, TX, November 2007

  2. 2007 Researchers: B. Laster, Towson Univ. • L. McEnery, Univ. of Houston-Clear Lake • T. Deeney, Univ. of Rhode Island • C. Dozier, Univ. at Albany • S. Sargent, Northeastern State Univ. • J. Cobb, Coastal Carolina Univ. • D. Gurvitz, National Louis University • A. Morewood, West Virginia University • S. McAndrews, Southern Illinois Univ. - Edwardsville • D. Gaunty-Porter, Vanguard Univ. • L. Dubert, Boise State University • C. Barnes, West Virginia University. Inspiration: P. Freppon, Univ. of Cincinnati

  3. Purpose • Follow up on the 2004 survey (n=150) and initial analysis of interviews (n=28) with much more depth • Examine the current roles of clinic/lab program graduates • Find out whether graduates use various practices introduced in the clinical setting, and with what level of confidence • Discover whether clinics/labs prepare teachers for various school-based roles (e.g. teaching skills vs. leadership)

  4. Methodology - Data Collection • IRB approval at each site • Located and notified 2-3 typical graduates of each clinic program, representing different populations, lengths of experience, positions, etc. • In arranging the interview, sent a note asking participants to collect three artifacts that reflect their teaching of literacy • Interviewed on site at the teacher’s school, not at the university or elsewhere • Took notes on the classroom environment: books, environmental print, room organization, student work on walls, etc. • Audiotaped the interview • Transcribed the interview

  5. Methodology - Data Collection • Prompts focused on five main areas: • Assessment practices • Instruction • Leadership • Coaching • Technology • A holistic prompt was also used: “Talk about a child/teacher you are currently working with. Talk about strengths/needs. Talk about surprises. Talk about your thinking in how you assist them in their development.”

  6. Methodology - Preliminary Data Analysis 2006 • For Theme Analysis: Categories were refined, collapsed, and redefined during subsequent readings and discussions within the teams and with the larger group of researchers until the categories encompassed all of the data for that theme. • For Site Analysis: We compiled a chart of the key activities and philosophies of each participating Reading Clinic/Literacy Lab. This snapshot of the sites allowed linkages to be made between the clinical experience and what the professionals in the field reported about their daily job expectations and experiences. • Summaries of findings were reported to a central researcher who compiled them.

  7. Interviews at 11 Institutions, n=28 • Boise State University, ID • Eastern New Mexico University, NM • National Louis University, Chicago IL • Northeastern State Univ., OK • Southern Illinois Univ. – Edwardsville, IL • Southern Utah University, UT • Towson University, MD • University of Houston, Clear Lake, TX • University of Pittsburgh, PA • University of Rhode Island, RI • Vanguard University, CA

  8. Methodology - Secondary Analysis, 2007 • Short organizational meeting, April 2007 • Two-day writing retreat, May 2007 • Reread transcripts and reexamined codes • Renegotiated specific categories within each area • Re-analyzed all data • Conferred across themes; noted alignment • Teams wrote up findings • Compiled across themes into emerging manuscript

  9. Emerging Themes • Mandates • Assessment • Instruction • Technology • Artifacts • Coaching and Leadership

  10. Navigating Mandates: Teachers Face “Troubled Seas” • Jeanne Cobb, Coastal Carolina University; Stephan Sargent, Northeastern State University; Chitlada Patchen, University of North Texas

  11. National Mandates • Four of the eleven sites interviewed (NM, ID, MD, and IL) mentioned No Child Left Behind (NCLB). • Multiple participants also mentioned Adequate Yearly Progress (AYP).

  12. State Mandates • State-mandated achievement tests were identified frequently. Seven of the eleven sites referred to such measures (e.g. TEKS, ISAT, CRT, MSA, HAS, Illinois Snapshot of Early Literacy, and BEAR by Riverside). • State-mandated curriculum competencies and/or benchmarks were mentioned several times (e.g. NM, OK).

  13. Local, District, and School Mandates. Discussions centered on these categories: • Mandated commercial assessment instruments were described by nine of the eleven sites; DIBELS was the most frequently mentioned. (Note: DIBELS may fall under multiple categories for purposes of this presentation.) • Locally-mandated curricula were described by nine of the eleven sites (e.g. curriculum alignment). • Mandated professional development programs were mentioned in several interviews. • Mandated commercial reading programs were described by eight of the eleven sites (e.g. Accelerated Reader).

  14. Continuum of Responses Discovered in Deeper Analysis • Analysis revealed a “clash” between clinical instruction and “real-world mandates” • A continuum of stances emerged based on the extent to which the teachers were able to blend mandated assessment & instruction with the university reading clinic methodology.

  15. Continuum of Responses Discovered in Deeper Analysis • “Yes!” - One End of the Continuum… The educator who truly agrees with the mandates and simply follows the mandate(s) exactly • “Yes, and!” - On the Other End… The teacher who follows his/her instincts and not only supplements and modifies what is mandated, but often finds innovative and creative solutions to the dissonance encountered between his/her own philosophical belief system/university methodology and the restrictions of the accountability system • “Yes, but!” - In the Middle… The teacher who follows the mandated instruction/assessment exactly, although he/she realizes children’s needs may not be met with this plan alone

  16. The Case of J.A. – “Yes!” • J.A. is a reading coach/specialist who states that her pull-out instructional program focuses on skill and drill. • “The kids…they get tired of it. You know, I get tired of it. That doesn’t matter.” J.A. has adopted a stance of acceptance and resignation. • J.A. follows precisely the mandates thrust upon her and appears to agree with the requirements. • J.A. mentions trying to find time on Fridays for read-alouds, implying that mandated instructional programs leave little time for developmentally appropriate strategies. • J.A. appears to be a follower of the path laid out for her – choosing simply to comply, no questions asked.

  17. The Case of B.G. – “Yes, and!” • B.G. follows the required mandates (e.g. Target Teach, State Benchmark Testing, etc.), but also supplements with multiple reading assessment/instructional techniques to ensure the best possible literacy instruction for all children. • “There had been so much pressure on us to fix things in the past couple of years. I’m not sure it’s sunk in yet, but maybe we can get back to teaching like we know is best and not just focus on the ‘list.’” • B.G. mentioned, “I use the materials from clinic all the time in my school.” She noted that materials such as the Basic Reading Inventory are quite helpful to supplement mandated measures. • After describing a score report from the BEAR Test, B.G. noted that one child was often in trouble; he had even been suspended and in trouble with the law. This specific example illustrates how a teacher can follow mandates but still meet individual needs through supplementation.

  18. The Case of C.B. – “Yes, but” • Role: An instructional support teacher in a middle school • Planning • Training • Practice: • Transferred learning from coursework • Difficulties with training others • Teachers don’t use assessments to guide instruction

  19. The Case of C.B. – “Yes, but” • Impact: complies with district, state, and federal mandates, but… • Teachers: lack resources, experience a bumpy path, and are obstructed from being creative thinkers • Students: less engaged, with more behavior problems due to the worksheet orientation • Curriculum: highly test-driven; ignores students’ and teachers’ social, developmental, and practical learning needs.

  20. Summary. Interviews revealed: • Stress and anxiety (related to mandates and students’ acceptable performance) were apparent for both teachers and students • Participants noted a lack of resources to implement all the components of NCLB • Those interviewed possess a keen awareness of individual needs and attempt to focus on individual assessment when possible • Many participants share a concern about a lack of time to incorporate the naturalistic, authentic assessment strategies learned in clinic because of the testing mandates • Participants showed awareness that clinic courses provided knowledge of naturalistic assessments and confidence in their ability to provide for children’s needs, and they strive to supplement the mandated assessments with their own assessments

  21. Assessment • Terry Deeney, Univ. of Rhode Island; Stephanie McAndrews, Southern Illinois Univ. - Edwardsville

  22. Prior Findings: Assessment Transfer from Clinic to Classroom. Three main themes of transfer: • Knowledge: Empowerment • Skill: Assessment as diagnosis (“What’s going on”); Choice: what assessment to give; Analysis: how to interpret assessment data • Materials: Matching assessment and instructional programs

  23. Current Study: Clinic Goals for Assessment. Knowledge: • Administering and interpreting a wide range of assessments: standardized and informal (tools, observation, notes) • Evaluating assessments (what they are, what they do/do not do) • Purpose of assessment is to inform instruction; assessment and instruction are recursive and on-going • Triangulating data. Skills: • Knowledgeable professionals who make informed decisions • Recognize and address the constraints and affordances of local schools • Critical consumers and reflective decision makers • Using assessment to inform instruction and for problem solving • Choosing/selecting assessments to match purpose. Dispositions/Stances: • Descriptive versus evaluative (comparative) • Agentive • Collegial and collaborative problem solving with multiple stakeholders

  24. Data and Analyses. Assessment questions: • Talk about your role in the assessment process in your classroom or school • Talk about how assessment informs your instruction • How did the clinic experience prepare you for your role in the assessment process? Other portions of transcripts: • All transcripts read and coded for any mention of assessment • Cut/pasted into new transcripts of just assessment discussion. Analyses: • Data read and re-read • Instances of clinic goals cut/pasted into a file for each goal • Assessment transcript coded for themes not identified a priori

  25. Findings: Clinic Goals • Assessment as recursive and on-going (20) • Assessment as descriptive (15) • Triangulation (7) • Choosing/selecting assessments (24) • Problem solving (19) • Collaboration (15) • Grouping and lesson planning (15)

  26. Recursive/on-going • I am just going around conferring with them or checking their work or the dip stick assessment, that is what drives my instruction. I know what my curriculum is, what I have to ultimately have taught, but if they don’t have some of the basics and I realize that when I am going around looking at their work, watching them in a group, with their conferences with each other, when they are conferring with me, checking out their homework, just what they produce, if they’re not getting it, then that is what’s telling me what to do. I use assessment quite a bit to help me drive my instruction, it is not a big formal assessment. I do that, but a lot of informal assessment goes on whether it be conversations or me looking over their shoulders, it drives it. • My assessment is happening throughout the day, informally conferencing with students, pulling them aside, seeing where they’re at with writing, the gaps, and that’s constantly guiding my instruction.

  27. Role of Assessment • “I can use the assessments I have learned in the practicum courses. I analyze their reading and writing during authentic activities like reading and writing articles and poems. This helps me to do mini lessons on what the students need.” • “I really learned to look at how they were reading differently. I realized the students needed the grapho-phonics, syntax and semantic clues to read. I know how to prompt them to problem solve. I now focus on the students’ needs not just the curriculum.” • For writing assessment, “I use the rubric you gave us (modified from six traits). I really have them focus on the content. We wait until the very end to edit the mechanics. They spend a lot of time adding important information and organizing their ideas. They have to do research too.” • “I can see how important engaging the kids in something that they are interested in is important. They are reading and writing much more! I don’t worry about the (standardized) test scores as much because they are going up.”

  28. Role of Assessment. Descriptive: • Identifying strengths and weaknesses • Looking “in-depth” • Looking at the whole child • Test data versus performance • Diagnostic approach. Triangulation: • Adding assessments to the “have to” list • Putting different pieces together • Comparing data from one assessment to another

  29. Role of Assessment. Choosing assessments: • Supplementing/following up on mandated assessments • Choosing based on other data • Types chosen • Avoiding bias. Problem solving: • What happens as a result of assessments? • Realizing the need to work with teachers to translate data • What are the reasons for scores? • How does this inform instruction? • Working collaboratively to tackle the “big problems” identified through assessment • Constraints

  30. Collaboration • Working with parents, other teachers, support professionals (special ed, reading, psychology) • Interpreting/understanding assessments • Instruction based on assessments • Figuring out “what’s going on” • Formal plans (such as individualized plans as per NCLB) • Working as a school group • Choosing assessments • Planning instruction (curriculum) based on assessments • Having more to “bring to the table,” being asked for assistance (whether or not role is as specialist) • Providing professional development • Coordinating assessment schedules (reading specialists/coaches)

  31. Grouping • Differentiating instruction based on assessment results • Using available personnel to assist with groups • Groups changed based on on-going data

  32. Are we meeting our own goals? Knowledgeable: • Know and use a variety of assessments • Discuss choices in assessment based on knowledge of the assessment’s purpose, or find assessments to match purpose. Agentive stance: • Living within the mandates • Tailoring instruction does make a difference • “Yeah, and…” versus “Yeah, but…” • Working with multiple stakeholders to advance instruction. Reflective decision makers: • Informed decisions about assessment and instruction based on data gathered initially and on-going • Pulling together multiple pieces of data to make decisions

  33. Transfer of Knowledge from Reading Clinics to Classroom Instruction • Aimee Morewood, West Virginia University; Dolores Gaunty-Porter, Vanguard University

  34. Framework for the Study: • Merge Theory and Practice • Teachers Cope with Ambiguities • Make Confident Research-Based Decisions

  35. Analysis: General and Specific Codes

  36. Findings: Main Components

  37. Findings: Comprehension

  38. Findings: Phonics

  39. Findings: Fluency

  40. Findings: Instructional Strategies used in Multiple Categories

  41. Conclusions • Respondents generally spoke about the five main components of literacy instruction (NRP, 2001) and writing instruction. • Respondents are transferring knowledge and instructional practices about comprehension, phonics, and fluency instruction from reading clinic/literacy lab settings to their classrooms. • Reading clinics/literacy labs should provide teachers with more support in transferring knowledge and instructional practices for writing, fluency, vocabulary, phonemic awareness, spelling, and emergent literacy. • Respondents are transferring a variety of instructional practices from reading clinics/literacy labs that can be used in more than one area of literacy instruction.

  42. Tracks of Technology • Lee Dubert, Boise State University; Barbara Laster, Towson University • Definition: Technology = media that support our work (tape recorders, computers, the Internet, ICTs)

  43. Tracks of Technology: Categories that Emerged • technologies for literacy assessment • technologies designed to replace instruction • technologies that support instruction • assistive technologies (those for students with severe reading difficulties) • technologies used for planning and management

  44. What Interviewees Reported Happens in Schools/Classrooms [chart of percent reported by category: student independent use, planning & management, literacy assessment, replace instruction, support instruction, assistive]

  45. Technology Used for Literacy Assessment • Nearly half (13/28) of the interviewees referred to some form of technology-based assessment: • Accelerated Reader (STAR) • Hand-held computers (Wireless Generation) for administration, scoring, and reporting of DIBELS data • Note: other capabilities of the Wireless Generation support system were not mentioned (running records and other assessment systems like PALS); this is a clear link to Reading First or NCLB • Other computer-based state assessments • Simply computerized versions of paper/pencil tests • No evidence of more sophisticated assessments of higher-level reading/comprehension skills • No evidence of use for evaluation or scoring of student writing

  46. Student Independent Uses of Technology • 23/28 interviewees reported student independent use of technology, including but not limited to: • MySpace • YouTube • Blogs • Word processing • Video productions (sometimes in place of traditional reports) • Presentation software (PowerPoint) • Independent research (not teacher-structured web-quests) • Publications and on-line collaboration for class projects • Independent literacy activities (e.g. fanfiction.com, library-sponsored book discussions) • On-line discourse communities (pen pals, international “sister” classrooms, etc.) • Games

  47. Student Independent Uses of Technology • Most reading teachers reflected a positive perception of the value of student independent uses of technology. • A few expressed concerns: • safety issues with MySpace • the value of student uses “not in an arcade”

  48. Technologies to Support Instruction • Eleven of the 28 interviewees reported some form of technology included in lessons they designed for students: • PowerPoint • Internet searches to develop background knowledge and improve comprehension • Class newspapers • Smart Boards • Digital Language Experience Stories • Phonics development software/sites (www.starfall.com, funded by the Blue Mountain Cards CEO) • United Streaming • Blogs • Class web pages • Web-quests

  49. Technologies to Replace Instruction • Nearly half of the interviewees (12/28) mentioned software programs used to replace teacher instruction: • Reading motivation/reward/award software (Accelerated Reader or Reading Counts) • Reading Academy, Read Naturally, SuccessMaker, Reading Intervention Central, and audio recordings of books

  50. Assistive Technologies • Limited use (2/17) • Text-to-speech/speech-to-text • Expanded keyboards • Assisted writing programs
