

  1. Assessment and Communication Journey of One Institution: Lessons from SPA Reviews, Rubrics, Data Collection and Reporting

  2. Assessment Issues • Specialty Professional Associations (SPAs) reported weaknesses in our rubrics. • Need to provide better feedback to candidates. Download the presentation at: http://education.indiana.edu/aacte2011

  3. The Issue: • “…the absence of tabled scores by category or a description of the range of obtained scores did not allow reviewers to determine whether ALL candidates had achieved at least a score of proficient on ALL standards.” • The Solution: • iRubric reporting format; candidate performance by indicator and performance level. • The Journey: • Ongoing collaboration with iRubric in the development of meaningful reporting to reflect candidate performance by indicator/SPA standard.
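
The reviewers' concern above is, in effect, a completeness check over the score data: did every candidate reach at least "proficient" on every standard? The sketch below illustrates that check under assumed data structures (a candidate-by-standard mapping and an ordered list of performance levels); it is not IU's or iRubric's actual implementation.

```python
# Hypothetical completeness check: flag any candidate/standard pair
# that falls below "proficient". Level names and the score layout
# {candidate_id: {standard: level}} are assumptions for illustration.
LEVELS = ["unsatisfactory", "basic", "proficient", "distinguished"]
PROFICIENT = LEVELS.index("proficient")

scores = {
    "cand_001": {"standard_1": "proficient", "standard_2": "distinguished"},
    "cand_002": {"standard_1": "basic",      "standard_2": "proficient"},
}

def below_proficient(scores):
    """Return (candidate, standard, level) triples that fall below proficient."""
    return [
        (cand, std, level)
        for cand, by_standard in scores.items()
        for std, level in by_standard.items()
        if LEVELS.index(level) < PROFICIENT
    ]

gaps = below_proficient(scores)
print("All candidates proficient on all standards:", not gaps)
for cand, std, level in gaps:
    print(f"{cand} scored '{level}' on {std}")
```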

  4. The Issue: • “Data tables need to be submitted that reflect the number/percentage of candidates scoring at each element of the assessment rubric rather than average scores.” • The Solution: • iRubric’s reports and analysis tools allow administrators to generate detailed and aggregate reports on individual standards and competencies in real time. • The Journey: • Faculty introduction to and use of iRubric reports and analysis tools; recognition of the value of these reports and tools.
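
The kind of table the reviewers asked for can be produced by counting candidates at each performance level for each rubric element and converting the counts to percentages, instead of reporting averages. The following sketch is illustrative only; the record layout and level names are assumptions, not iRubric's export format.

```python
# Tally the number and percentage of candidates at each performance
# level for each rubric element (rather than a single average score).
from collections import Counter

LEVELS = ["unsatisfactory", "basic", "proficient", "distinguished"]

# One (element, level) pair per candidate per rubric element (made-up data).
records = [
    ("planning", "proficient"), ("planning", "basic"),
    ("planning", "distinguished"), ("assessment", "proficient"),
    ("assessment", "proficient"), ("assessment", "unsatisfactory"),
]

def level_table(records):
    """Build {element: {level: (count, percent)}} from raw records."""
    by_element = {}
    for element, level in records:
        by_element.setdefault(element, Counter())[level] += 1
    table = {}
    for element, counts in by_element.items():
        total = sum(counts.values())
        table[element] = {
            level: (counts[level], round(100 * counts[level] / total, 1))
            for level in LEVELS
        }
    return table

for element, row in level_table(records).items():
    print(element, row)
```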

  5. Include detailed descriptors in the cells to give students an idea of what is expected. New sections with different grade scales can be added.

  6. Click on the iRubric icon to grade this assignment using the attached rubric.

  7. Click on the cells and the rubric is automatically tallied and scored. Add comments to give the candidate tips for improvement. The candidate gets clear feedback on areas for improvement and, by seeing the rubric, also knows what is expected. It was this level of detail that the SPAs cited as an area for improvement in Bloomington.
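
The click-to-score behavior described above amounts to summing the point value of one selected cell per criterion and attaching an optional comment to each. The sketch below models that idea with made-up criteria, point values, and comments; it is not iRubric's internal logic.

```python
# Minimal rubric-tally sketch: one selected cell per criterion,
# points summed automatically, comments collected for the candidate.
rubric = {
    "organization":    {"unsatisfactory": 1, "basic": 2, "proficient": 3, "distinguished": 4},
    "use of evidence": {"unsatisfactory": 1, "basic": 2, "proficient": 3, "distinguished": 4},
}

# The grader's clicked cells plus optional feedback (illustrative values).
selections = {
    "organization":    ("proficient", "Clear structure; tighten the introduction."),
    "use of evidence": ("basic", "Cite at least two sources per claim."),
}

def tally(rubric, selections):
    """Sum the point value of each selected cell and collect comments."""
    total = sum(rubric[criterion][level] for criterion, (level, _) in selections.items())
    possible = sum(max(levels.values()) for levels in rubric.values())
    comments = {criterion: note for criterion, (_, note) in selections.items()}
    return total, possible, comments

score, possible, comments = tally(rubric, selections)
print(f"Score: {score}/{possible}")
for criterion, note in comments.items():
    print(f"{criterion}: {note}")
```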

  8. Instructor/Candidate Survey • Purpose • To aid in understanding trends in instructor/candidate communication about course assignments, expectations, and earned grades.

  9. Instructor/Candidate Survey • Survey Process • Pre- and post-iRubric-use surveys given to instructors and their candidates. • Surveys distributed electronically.

  10. Candidate Survey Results

  11. Candidate Survey Results

  12. Candidate Responses: What, if anything, did you like about the use of iRubric? • “I like that we are not wasting paper, and all of the iRubrics are found in the same place. They are also easy to follow.” • “I like the way iRubric popped up in its own window and clearly showed the grade I earned in highlights.” • “I like how you can read the expectations and then read your teacher’s comments on how you lost points.” • “It was clear and easy to see the expectations for the assignments. When they were graded, I knew exactly why the professor gave me a particular score.”

  13. Candidate Responses: Would you like to see iRubric utilized for grading in your future courses? • “Yes I would – having access to a rubric before completing the assignment is a great tool and advantage, and this particular form of rubric is very handy for the student and the instructor because of its simplicity and availability.” • “Yes, I would like to see iRubric utilized for grading in my future courses because it was easy to obtain and understand how it worked. It was always in the same place and just a simple, but efficient program.” • “Yes, I would, because I appreciate the clarity of feedback and the ease of access online.”

  14. Instructor Responses: How, if at all, did iRubric(s) help save you time and effort in grading, communication, meeting course objectives, etc.? • “Using the iRubric helped in grading extensively since I did not have to worry about being unfair and ensuring that each component matched the assignments. I did spend some time highlighting the rubrics in class so that students were aware of the requirements.” • “It made my grades very transparent and I did not get as many complaints with regards to grading assignments.”

  15. Instructor Responses: What, if anything, have been the top few contributions that iRubric has made to your classroom? • “They provide a nice format for discussing (and impetus to review) requirements for projects.” • “Made me be clearer about my expectations and grading rationale to students.” • “More efficient grading. Clear feedback.”

  16. Closing the Communication Loop • Through iRubric’s Analysis and Reporting engine, administrators have access to detailed aggregated/disaggregated performance data in real time. • Faculty save time by scoring rubrics with a few mouse clicks; iRubric calculates the score and saves the data. • Candidates are always informed: they have access to the rubric before submitting their work and to the scored rubric after submission.
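
The aggregated/disaggregated reports mentioned above can be thought of as the same tally run unit-wide or grouped by a field such as program. The sketch below assumes a simple record layout with hypothetical field names; it does not reflect iRubric's reporting engine.

```python
# Aggregated vs. disaggregated counts over assumed score records.
from collections import Counter

records = [
    {"program": "Elementary Ed",  "standard": "standard_1", "level": "proficient"},
    {"program": "Elementary Ed",  "standard": "standard_1", "level": "basic"},
    {"program": "Secondary Math", "standard": "standard_1", "level": "distinguished"},
]

def report(records, group_by=None):
    """Count performance levels overall (aggregated) or per group (disaggregated)."""
    counts = {}
    for rec in records:
        key = rec[group_by] if group_by else "all programs"
        counts.setdefault(key, Counter())[rec["level"]] += 1
    return counts

print(report(records))                      # aggregated, unit-wide counts
print(report(records, group_by="program"))  # disaggregated by program
```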

  17. Lessons Learned • Use the accreditation assessment process as a catalyst for making improvements. • Build easy-to-use solutions within faculty’s normal toolsets. • Get the support of teaching committees to encourage use and create incentives. Word of mouth was an important factor in expanded use.

  18. Contact Us • Jill Shedd • jshedd@indiana.edu • Larry Riss • lriss@indiana.edu • Ramesh Sabetiashraf • rs@reazon.com

  19. Indiana University School of Education Assessment and Communication Journey of One Institution: Lessons from SPA Reviews, Rubrics, Data Collection and Reporting From learning to navigate new report formats and performance-based data requirements, as well as the requirements of each specialty professional association (SPA), the teacher education unit has learned a variety of lessons, including ways to: • refine key assessments; • promote more robust communication among faculty and teaching candidates; and • document for outside evaluators the performance of candidates and their collective ability to positively impact student learning among a diverse student population. This presentation, given at the 2011 AACTE National Conference Roundtables Session, describes how the IU School of Education effected a significant change that came as a result of SPA reviews – the design and implementation of an integrated process using the iRubric Assessment & Outcomes System to refine the rubrics used for key assessments within programs and to integrate the technology for developing and using rubrics within the Indiana University course management system. Read about the journey the unit has taken through the SPA program review process, its implications for our assessment practices, and how it demonstrates even greater potential to improve the quality and clarity of communication of standards and expectations between faculty and teaching candidates, and subsequently to external reviewers when needed. Download the presentation at: http://education.indiana.edu/aacte2011
