
Simple Stats Say It All

Simple Stats Say It All. Eileen E. Schroeder schroede@uww.edu E. Anne Zarinnia zarinnie@uww.edu.


Presentation Transcript


  1. Simple Stats Say It All Eileen E. Schroeder schroede@uww.edu E. Anne Zarinnia zarinnie@uww.edu

  2. “One of the proposed strategies is to drastically reduce the number of school librarians in the area claiming that school libraries can be effectively run by aides to ensure services are provided and the library remains open. This is despite the fact that I have hundreds of students in the library each day, and teach in the classroom regularly. I have voiced my objection, but I am told that such reductions will not impact on student learning in any way.” Todd, R. Evidence-Based Practice: Findings of Australian Study, 2002-2003. AASL 2005

  3. Data-driven decision making • Need to convince decision makers that the library media program enhances the school mission: • Students competent to enter the information age • Students achieve at higher levels due to the LMS • Students achieve at higher levels because of a quality library media program Berkowitz, R. From Indicators of Quantity to Measures of Effectiveness.

  4. Purposes of Data Collection • Baseline data (describe program) • Document need for a program or idea • Raise awareness about a problem, condition or solution • Evaluate progress • Identify strengths and weaknesses A Planning Guide to Information Power: Building Partnerships for Learning. AASL, 1999. AASL 2005

  5. Communicate with decision makers • Recognize financial concerns • Acknowledge concerns about student performance • Speak to concerns in ways decision makers understand and value • Educate about LMS role and responsibilities • Provide visible evidence of positive impact of program Berkowitz, R. From Indicators of Quantity to Measures of Effectiveness AASL 2005

  6. Evidence-based practice • Demonstrate • Outcomes of making and implementing sound decisions in daily work • Impact of decisions on organization goals and objectives Todd, R. Evidence-Based Practice: Findings of Australian Study, 2002-2003. AASL 2005

  7. What data is most valuable? • Demonstrate the difference the LM program makes in: • Content learning • Information literacy skills • Technology skills • Reading • Collaborative planning and teaching • Demonstrate impact of resources Loertscher, D. California Project Achievement.

  8. Student Learning Data Gathering Techniques • Tallies and counts • Schedules / calendars • Logs and anecdotal records • Observations • Interviews • Performance assessment • Products A Planning Guide to Information Power: Building Partnerships for Learning. AASL, 1999. AASL 2005

  9. Student Learning • Outcomes: • Content learning • Achievement in coursework • Standardized test achievement • Critical thinking • Independent thinking • Interaction with others • Personal responsibility for learning • Motivation • Data sources: • Teacher tests • Test item analysis • Rubrics, checklists • Performance assessments • Student or teacher interviews or focus groups • Observations

  10. Student Learning Standardized Tests: WKCE • LMS reinforces skills taught in the classroom • (Mis-)Alignment to standards (1998 match to Terra Nova, Form A items) • Some standards not appropriate for a paper-and-pencil test (e.g., media and research standards) • Grade 10: 1 item matches each of LA Standards E and F • Grade 8: 1 item matches LA Standard F • Grade 4: 1 item matches each of LA Standards E and F Wisconsin Knowledge and Concepts Examinations: Alignment to Wisconsin Model Academic Standards, 1998 (http://www.dpi.state.wi.us/oea/alignmnt.html)

  11. Student Learning WKCE Sample Question AASL 2005 WKCE 8th Grade Reading Sample Question (http://www.dpi.state.wi.us/oea/read8itm.html)

  12. Student Learning WKCE Sample Question WKCE 10th Grade Social Studies Sample Question (http://www.dpi.state.wi.us/oea/ss8items.html) AASL 2005

  13. Student Learning Standardized Tests: Reading • Assessment Framework for Reading • Objectives supported by the library program: • Analyzing literary and informational text • Evaluating and extending literary and informational text • Use of framework: • Match local curriculum to framework • Engage in discussions on where skills are taught and reinforced • Examine problem areas Assessment Framework for Reading in Grades 3 through 8 and 10, 2005 (www.dpi.state.wi.us/dpi/oea/wkce-crt.html)

  14. Student Learning • Provide resources • Story hours • Research projects Assessment Framework for Reading in Grades 3 through 8 and 10, 2005 (www.dpi.state.wi.us/dpi/oea/wkce-crt.html)

  15. I & TL Information and Technology Literacy • Student mastery of grade level benchmarks • Information literacy skills • Technology skills • Independent use of skills • Ability to transfer skills to new problems • Student spontaneous and persistent use of information literacy skills and inquiry AASL 2005

  16. I & TL Collecting Information and Technology Literacy Data • Mastery • Teachers: assess skills in curricular projects; discuss or interview on student skills; review sample products or portfolios • LMS: information literacy skills in lessons; upcoming 8th grade technology assessment; standardized tests (analyze items linked to information literacy skills) • Students: self-assess skills; research logs; conferences (reflect on work, skills and benefits); interviews, surveys or focus groups • Matrix dimensions (teacher and student): Assess, Interpret, Enough, Need / Strategies, Thinking Information Power (1988) pp. 176-181 and Loertscher & Todd (2003) We Boost Achievement, pp. 115-118

  17. I & TL Collecting Information and Technology Literacy Data • Indicators: independent use; spontaneous and persistent use • Data sources: teacher discussions or interviews on student skills; feedback from teachers and/or LMS at the next level; student interviews, surveys or focus groups; LMS or teacher observations Information Power (1988) pp. 176-181 and Loertscher & Todd (2003) We Boost Achievement, pp. 115-118

  18. I & TL More Rubric Sites • Oak Harbor, Washington, Information Skills Rating Scale • DigiTales: Digital Media Scoring Guides • NCRTEC Scoring Guide for Student Products • Research Report Rubric Generator • Rubric generator sites • http://school.discovery.com/schrockguide/assess.html AASL 2005

  19. I & TL Mastery: Online Self-Assessment • Student checklist of skills used during research • Locally developed web form • Tie to database (FileMaker, Access) • Email submission from form using CGI script • AASL Power Learner Survey File AASL 2005
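A locally developed checklist form needs a back end that stores each submission. As a minimal sketch in Python (standing in for the FileMaker/Access or CGI email options named above), the handler below appends each submitted checklist to a CSV file; the file name and checklist fields are hypothetical placeholders, not the slide's actual form.

```python
import csv
from datetime import date
from pathlib import Path

LOG = Path("self_assessments.csv")
# Hypothetical checklist fields; real columns would mirror the local form.
FIELDS = ["date", "student", "used_keywords", "evaluated_sources", "cited_sources"]

def record_response(response: dict) -> None:
    """Append one submitted checklist to the CSV log, adding a header row
    the first time the file is created."""
    is_new = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new:
            writer.writeheader()
        writer.writerow({"date": date.today().isoformat(), **response})

# One invented submission, as a web form handler might pass it along.
record_response({"student": "A12", "used_keywords": "yes",
                 "evaluated_sources": "no", "cited_sources": "yes"})
```

A flat CSV like this can later be imported into whichever database or spreadsheet the program already uses.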

  20. I & TL Mastery: Online Assessments • Locally created test • Discovery School example (http://school.discovery.com/quizzes31/eileenschroeder/Research.html) • College online assessments • ETS’s ICT Literacy Assessment • being tested, http://www.ets.org/ictliteracy/index.html • Cal Poly - Pomona • http://www.csupomona.edu/~library/InfoComp/instrument.htm • Raritan Valley Community College • http://library.raritanval.edu/InfolitTest/infoLitTest.html • Cabrillo College • http://www.topsy.org/InfoLitAssess.html AASL 2005

  21. I & TL Mastery: Technology Skills Assessments • 8th grade tech assessment • NETS Online Technology Assessment • NCREL / NETS for Students: Extended Rubric • Bellingham Technology Self-Assessments AASL 2005

  22. I & TL NETS Online Assessment AASL 2005

  23. I & TL AASL 2005

  24. I & TL Observations • Focus and limit the scope • What can’t be measured in a product • Observe small groups • Checklists and rubrics for consistency • Different subject areas and groups • Options: • Deep: Several observations over short period • Broad: Single observations on regular basis • Partner with others to do observations • Plan ahead and prepare Improve Your Library: A Self-Evaluation Process for Secondary School Libraries and Learning Resource Centres. Department for Education and Skills. AASL 2005

  25. I & TL AASL 2005

  26. I & TL Spreadsheet or Database? • Organize • Fields • Subfields • Scores (rubrics for scoring) • Gather • Laptop? • PDA? • Paper? AASL 2005
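Whichever tool gathers the scores, the reporting step is simple per-field averaging. A minimal Python sketch, with invented skill fields and rubric scores standing in for locally gathered data:

```python
from statistics import mean

# Hypothetical rubric scores (1 = beginning ... 4 = exemplary),
# one list per skill field; subfields would mirror the local rubric.
scores = {
    "locating":   [3, 4, 2, 3, 4],
    "evaluating": [2, 2, 3, 1, 2],
    "citing":     [4, 3, 4, 4, 3],
}

# Report the sample size and mean score for each field.
for skill, values in scores.items():
    print(f"{skill:<10} n={len(values)}  mean={mean(values):.1f}")
```

The same loop works whether the scores arrive from a laptop, a PDA sync, or hand-entered paper tallies.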

  27. I & TL AASL 2005

  28. I & TL Creating Surveys • What do you really want to know? • Who will have the answer? Who will you survey? How many? • Are the questions and instrument as brief as possible? • Will answers be selected from options (yes/no, ranking, rating) or open-ended? • Are questions and instructions clear, not open to multiple interpretations? • Do questions ask for personal opinion? Do you want that? • Are embarrassing or leading questions excluded? Examples: Survey on Information Literacy; Technology Self-Assessment; Independent Use (Survey Monkey) A Planning Guide to Information Power: Building Partnerships for Learning. AASL, 1999, and Improve Your Library: A Self-Evaluation Process for Secondary School Libraries and Learning Resource Centres. Department for Education and Skills.
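Closed-option answers of the kind described above tabulate mechanically. A short Python sketch, using an invented question and response list rather than a real survey export:

```python
from collections import Counter

# Hypothetical closed-option answers to one survey question
# ("How often do you use the library?"); a web survey tool's
# export would supply the real list.
responses = ["weekly", "daily", "weekly", "rarely", "weekly", "daily"]

tally = Counter(responses)
total = sum(tally.values())
# Print each option with its count and share of all responses.
for option, count in tally.most_common():
    print(f"{option:<7} {count}  ({count / total:.0%})")
```

Percentages like these drop directly into a chart for decision makers.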

  29. I & TL Survey: Independent Use AASL 2005

  30. I & TL Online Survey Tools • Survey Monkey • $200 annually for 1000 responses per month • http://www.surveymonkey.com/ • Zoomerang • $350 annually • http://info.zoomerang.com • WebSurveyor • More sophisticated (help on survey design and analysis, use of passwords) • $250 for single surveys, $1500 annually • FileMaker Pro • VivED • Limited version free for K-12 educators • http://www.vived.com/ AASL 2005

  31. I & TL Interviews / Focus Groups • Make purpose clear • Script questions, but adapt language • Have follow-up questions ready • Record answers (tape or by hand), but don’t let this interfere with your attention to respondent • Select interview location free from interruption • Get a range of students / teachers (users and non-users, grade levels, abilities, genders, ethnicity, subject areas) • May be more useful to interview students in groups • May get more honesty if LMS does not do interviews Improve Your Library: A Self-Evaluation Process for Secondary School Libraries and Learning Resource Centres. Department for Education and Skills. AASL 2005

  32. I & TL Teacher Interview Questions: Use • Do students appear confident in working in the library? • Are the students self-motivated and able to work independently, or do they need assistance to find information? • Do students choose methods of working best suited to the information-seeking task?

  33. I & TL Interview versus Survey • Interview • Extended, open-ended answers • Adaptive questions • Reach small number but in more depth • Survey • Closed questions - range of possible answers known • Can use branching • Can reach larger number of people • Easier to conduct and tabulate AASL 2005

  34. Reading • Indicators: choice to read voluntarily; enjoyment of reading; amount read (voluntarily, as part of curriculum); access to reading materials (suitably challenging and varied selections); impact on reading comprehension • Data sources: • Choice to read: student surveys, interviews or focus groups; reader self-assessments; snapshot of reader advisory • Amount read: reading inventories (pre/post); student reading logs; circulation statistics, ILL requests; involvement in reading incentive activities • Access: collection mapping • Comprehension: analysis of library involvement in teacher unit plans for reading; teacher surveys, interviews or focus groups; standardized or local reading test scores; Accelerated Reader / Reading Counts points Loertscher, D. California Project Achievement.

  35. Reading Reading Habits • KMMS Reading Inventory Online • (http://ms.kmsd.edu/%7Emsimc/reader_survey.html) • Independent Reading Rubric (Cornwell) • Print version: (http://www.indianalearns.org/readersindependent.asp) • Online survey based on Cornwell (in Zoomerang) • Reading Log • Power Reader Survey (AASL) AASL 2005

  36. Reading Reading Survey AASL 2005

  37. Collaboration • Indicators: • Time and frequency of collaboration • Number and range of teachers collaborating • Level of collaborative activity and LMS support (gather resources for a unit; provide lesson ideas; integrate information and technology literacy skills in the curriculum; teach information or technology skills) • Quality of learning experience (types of assignments that require higher-level thinking; teachers use an information problem-solving model) • Impact on content learning and information skills (integration of info and tech literacy skills; greater use of resources; level of student engagement) • Data sources: schedules; collaborative planning records; prepared bibliographies; unit / lesson plans; curriculum maps; post-unit reflections; interviews, focus groups, surveys; student assessment (content knowledge, information skills, motivation)

  38. Collaboration Planning Sheets: Wisconsin DPI AASL 2005

  39. Collaboration Planning Sheets Stacy Fisher and Jane Johns. Milton Middle School

  40. Collaboration Post-Unit Review • Unit title: • Timeframe for unit: • Teacher: • # of students: • What worked well? • Suggestions for improvement: • Time spent on teaching information literacy / technology: • Information & technology skills / standards learned: • From both the LMS's and the teacher's point of view, was the unit enhanced by collaboration? Yes / No. Why? • Was the unit successful enough to warrant doing it again? Yes / No. Why? • How well was the unit supported by the collection and by the web resources (5=excellent, 4=above average, 3=average, 2=below average, 1=poor)? Rate each on: diversity of formats; recency; number of items; reading level; technology • What materials / technology will we need if we plan the unit again? • Attach a list of resources used and/or found useful. Adapted from Loertscher and Achterman (2003). Increasing Academic Achievement through the Library Media Center, p. 17.

  41. Collaboration Tracking Collaborative Units • Input forms: 1. Skills; 2. Collaboration Stats; 3. Collaboration Goals; 4. Activities • Reports: Impact!; collaboration profile; activities; hours spent; learning venues; difficulty level of units; content area profile; resource profile; research skills profile (3-9 skills); collaboration timeline; coverage; hours and places; timeline

  42. Collaboration Log sheets Stacy Fisher and Jane Johns. Milton Middle School AASL 2005

  43. Collaboration AASL 2005

  44. Collaboration Filemaker Database • Reservations • Planning • Collaboration • Evaluation AASL 2005

  45. Resources: Actual and Perceived • Indicators: range, appropriateness, level, and amount of resources for curricular needs and student interests; organization, accessibility and use of resources, space, and technology by staff and students (in the LMC, classroom, over the network, from home; during and outside school hours); circulation of resources; use of online resources; staff expertise and availability • Data sources: collection mapping tied to curriculum; post-unit assessment of resources; post-unit student assessment; library and lab sign-ups; circulation statistics; logs of online resource use; interviews or focus groups; satisfaction surveys

  46. Resources Circulation Statistics from Winnebago Spectrum
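Raw circulation counts become more persuasive as trends. A small Python sketch, with made-up monthly totals standing in for an automation system's report, computes percent change against a baseline month:

```python
# Hypothetical monthly circulation totals; the numbers are invented
# for illustration, not taken from any actual report.
circ = {"Sep": 412, "Oct": 538, "Nov": 495, "Dec": 301}

baseline = circ["Sep"]
# Show each month's total and its change relative to September.
for month, count in circ.items():
    change = (count - baseline) / baseline
    print(f"{month}: {count:4d}  ({change:+.0%} vs. Sep)")
```

A one-line trend like "+31% vs. September" often lands better with decision makers than a column of raw counts.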

  47. Resources Use Tracking: Day Sample Val Edwards. Monona Grove High School. AASL 2005

  48. Resources Use Tracking: Quarterly Sample Val Edwards. Monona Grove High School. AASL 2005

  49. Resources Use Tracking: Week Sample AASL 2005

  50. Room Scheduling AASL 2005
