This guide provides an overview of the IDEA Diagnostic Form Report and how to interpret and use the information to improve instruction. It discusses the reliability and validity of the report, average and converted scores, and compares ratings with those of other teachers. It also helps identify areas of success and areas for improvement in facilitating progress on class objectives.
Understanding the Diagnostic Guide Office of Institutional Research, Planning and Assessment January 24, 2011
Introduction • Effective teaching is complex • The purpose of student ratings is to improve instruction • Student ratings do not provide all of the information an instructor needs to improve instruction • Student ratings should not account for more than 50% of an instructor's annual review
Diagnostic form report The IDEA Diagnostic Form Report is designed to respond to five questions: • 1. Overall, how effectively was this class taught? • 2. How does this compare with the ratings of other teachers? • 3. Were you more successful in facilitating progress on some class objectives than on others? • 4. How can instruction be made more effective? • 5. Do some salient characteristics of this class and its students have implications for instruction?
Reliability and validity Reliability – the consistency of a set of measurements or of a measuring instrument. Validity – whether a study (or instrument) answers the questions it is intended to answer. Example – Bathroom scale • If someone who weighs 200 pounds steps on the scale 10 times and gets readings of 15, 250, 95, 140, and so on, the scale is not reliable • If the scale consistently reads 150, it is reliable but not valid • If it reads 200 each time, the measurement is both reliable and valid Are the findings of your diagnostic report reliable? Look at the top of the report in the shaded area. Even if the findings are not reliable, they may still be useful as feedback
Average and converted scores • Average scores – based on a five-point rating scale • See the small box on the left side of page one • Criterion-referenced • The margin of error for classes of 15–24 students is ±0.2; it is slightly higher for smaller classes and lower for larger classes • Converted scores (also called standard scores) – all have an average of 50 and a standard deviation (a measure of variability) of 10, for comparative purposes • See the large box on the right side of page one • Norm-referenced • Both average and converted scores are presented in "raw" (unadjusted) and "adjusted" forms
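The relationship between average and converted scores can be sketched as a standard T-score transformation (mean 50, SD 10). This is an illustrative sketch only; the norm-group mean and standard deviation below are hypothetical values, not figures from the IDEA database:

```python
def to_converted(raw_avg, norm_mean, norm_sd):
    """Rescale a raw 5-point average into a converted (standard) score
    with mean 50 and SD 10, relative to a comparison group.
    norm_mean and norm_sd are hypothetical norm-group statistics."""
    return 50 + 10 * (raw_avg - norm_mean) / norm_sd

# A class averaging 4.2 against an assumed norm mean of 3.8 (SD 0.5)
print(round(to_converted(4.2, 3.8, 0.5), 1))  # 58.0
```

A converted score of 58 would place this class a little less than one standard deviation above the comparison group's average.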
Overall, how effectively was this class taught? • Examine student ratings of progress on Important or Essential objectives • The average rating provides a good indication of effective teaching, especially if at least 75% of enrollees responded and at least 10 students provided ratings • Progress is rated on a 5-point scale: 1 = no progress, 2 = slight progress, 3 = moderate progress, 4 = substantial progress, 5 = exceptional progress • An average of 4.0 indicates that "substantial progress" is an appropriate summary of progress
Overall index of teaching effectiveness • Progress on Relevant Objectives combines ratings of progress on the objectives identified by the instructor as Important (weighted 1) or Essential (weighted 2) • The IDEA Center regards this as its single best estimate of teaching effectiveness
Summary of teaching effectiveness • Progress on Relevant Objectives (A) • Relevant objectives are those selected by the instructor on the FIF • Weighted average of student ratings of progress on "Important" or "Essential" objectives • Overall ratings • Average student rating that the teacher was excellent (B) • Average student rating that the course was excellent (C) • Average of B and C (D) • Summary Evaluation: average of A and D
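The arithmetic behind the summary can be sketched as follows. All ratings here are hypothetical; an actual report computes them from the full set of student responses:

```python
def progress_on_relevant(objectives):
    """Weighted average of progress ratings (A): weight 2 for Essential
    objectives, 1 for Important ones; Minor/None are excluded."""
    weights = {"Essential": 2, "Important": 1}
    total = sum(weights[level] * rating for level, rating in objectives)
    return total / sum(weights[level] for level, _ in objectives)

# Hypothetical class: two Essential objectives and one Important one
a = progress_on_relevant([("Essential", 4.1), ("Essential", 3.9), ("Important", 4.4)])
b, c = 4.2, 4.0          # avg. "excellent teacher" (B) / "excellent course" (C)
d = (b + c) / 2          # overall ratings (D)
summary = (a + d) / 2    # Summary Evaluation: average of A and D
print(round(a, 2), round(summary, 2))  # 4.08 4.09
```

Note how the Essential objectives count twice as much as the Important one in A, matching the weighting described above.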
How do your ratings compare with those of other teachers? • Refer to the comparisons shown on the right-hand side of Page 1 of the IDEA Diagnostic Form Report • Converted averages are compared to three groups: all classes in the standard IDEA database, all classes in the same discipline, and all classes at RSU • Institutional and disciplinary norms are updated annually and include the most recent five years of data • The IDEA database is updated periodically
Were you more successful in facilitating progress on some class objectives than on others? • Refer to the upper portion of Page 2 of the IDEA Diagnostic Form Report • The main purpose of this table is to help you focus your improvement efforts • Twelve objectives are listed, with ratings shown for those the instructor identified as Important or Essential on the FIF • Ratings for objectives marked Minor or None are not included • The last column shows the percentage of students rating in the two lowest categories, 1 or 2 (no apparent progress or slight progress), and the percentage rating in the two highest categories, 4 or 5 (substantial progress and exceptional progress)
Progress on Relevant Objectives compared to group averages • Converted scores appear in the right-hand section and are compared with three norm groups: all classes in the IDEA database, the discipline (IDEA data), and RSU (institutional data) • The status of each score relative to other classes in the comparison group: • Much higher (highest 10%) • Higher (next 20%) • Middle (middle 40%) • Lower (next 20%) • Much lower (lowest 10%)
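The five status labels partition the comparison group by percentile rank. A minimal sketch (band names are from the report; the percentile cutoffs follow directly from the 10/20/40/20/10 split):

```python
def status_band(percentile):
    """Map a class's percentile rank within a comparison group
    to the report's five status bands."""
    if percentile >= 90:
        return "Much higher"   # highest 10%
    if percentile >= 70:
        return "Higher"        # next 20%
    if percentile >= 30:
        return "Middle"        # middle 40%
    if percentile >= 10:
        return "Lower"         # next 20%
    return "Much lower"        # lowest 10%

print(status_band(85))  # Higher
```

For example, a class at the 85th percentile of its norm group falls in the "Higher" band, while one at the 50th percentile is "Middle".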
How can instruction be made more effective? • Refer to Page 3 of the IDEA Diagnostic Form Report • The main purpose of instruction is to facilitate progress on the objectives the instructor selects as Important or Essential • Progress is affected by many factors in addition to teaching methods (e.g., student motivation and willingness to work hard) • Teaching methods are nonetheless of critical importance in facilitating progress • Teaching methods are grouped into five categories tied to the relevant objectives selected by the instructor • Review your average score, the percentage of students rating 4 or 5, and the suggested action
Suggested Action column • "Strength to retain" – retain these methods regardless of other changes you may make in teaching strategy • "Consider increasing use" – implies that increasing use of these methods may result in more success in facilitating progress • "Retain current use or consider increasing" – methods currently employed with typical frequency; increasing their frequency may positively affect learning outcomes
Do some salient characteristics of this class and its students have implications for instruction? • Refer to the bottom portion of Page 2 of the IDEA Diagnostic Form Report • Course characteristics: students described the class by comparing it to other classes they have taken in terms of (1) the amount of reading, (2) the amount of work in non-reading assignments, and (3) difficulty • Average ratings are compared with "All classes" in the IDEA database; if sufficient data were available, comparisons are also made with classes in the broad discipline group in which this class was categorized and with all other classes at your institution • Because relatively large disciplinary differences have been found on these three characteristics, the disciplinary comparison may be especially helpful
Do some salient characteristics of this class and its students have implications for instruction? • Student characteristics: students described their motivation by making self-ratings on the three items listed at the bottom of Page 2 • These characteristics have been found to affect student ratings of progress
Detailed statistical summary Page 4 of the report provides a detailed statistical summary of student responses to each item on the IDEA form, as well as to any optional, locally devised items.