
Peer-Assessment: No marks required, just feedback? Evaluating the Quality of Computerized Peer-Feedback compared with Computerized Peer-Marking
Phil Davies, School of Computing, University of Glamorgan, “Super U”


Presentation Transcript


  1. Peer-Assessment: No marks required, just feedback? Evaluating the Quality of Computerized Peer-Feedback compared with Computerized Peer-Marking. Phil Davies, School of Computing, University of Glamorgan, “Super U”

  2. Definition of Peer-Assessment? • In describing the teacher .. “A tall b******, so he was. A tall thin, mean b******, with a baldy head like a lightbulb. He’d make us mark each other’s work, then for every wrong mark we got, we’d get a thump. That way – he paused – ‘we were implicated in each other’s pain’.” McCarthy’s Bar (Pete McCarthy, 2000, page 68)

  3. Why Peer-Assessment? • Community of Practice!! • Working together to learn (and be assessed) • Perceptions • Staff: lecturer getting an easy life by not marking • Students: lecturer getting an easy life by not marking • Student awareness of benefits …. IMPORTANT TANGIBILITY • Success dependent upon scalability (computerisation) .. “Student Numbers have risen dramatically since 1991 without a concomitant increase in resources” (Pond et al, 1995)

  4. Computerised Peer-Assessment • CAP System • Permits students to mark and comment on the work of other students (normally 8) • Also an initial self-assess stage (reflection) … used as a standard of expectation • An Internet, not Web-based, system (developed in Visual Basic / Access)
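
To make that workflow concrete, here is a minimal sketch of the kind of records a CAP-style round might produce: one self-assessment per student (the "standard of expectation") plus a mark-and-comments entry for each of the ~8 peer essays reviewed. The actual CAP system was built in Visual Basic / Access; the Python class and field names below are hypothetical, not taken from that system.

```python
# Hypothetical sketch of the data a CAP-style peer-assessment round records.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Review:
    essay_id: str
    mark: int                              # percentage mark awarded to the peer's essay
    comments: List[str] = field(default_factory=list)

@dataclass
class MarkerRecord:
    marker_id: str                         # anonymised in the real system
    self_assessment: int                   # reflection stage: mark given to own essay
    reviews: List[Review] = field(default_factory=list)

fred = MarkerRecord("marker-01", self_assessment=52)
fred.reviews.append(Review("essay-17", mark=58,
                           comments=["Good structure", "References missing"]))
```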

  5. Having done the marking, what next? • Students should receive feedback • What feedback? • Marks • Comments • Which is more important? • To students or to staff?

  6. AUTOMATICALLY EMAIL THE MARKER .. ANONYMOUS

  7. What should the marker do? Reflect • Look at the essay again • Take into account the essay owner’s comments • Further clarification (if it is needed, then is this a ‘black mark’ against the marker?) • Try to ‘appease’ the essay owner? • Modify the mark based upon reflection? • Give more feedback

  8. Markers must be rewarded for doing the marking process .. a ‘mark for marking’ based on quality • How to judge? • Standard of expectation (self-assessment) • Marking consistency • Commenting quality, measured against the mark • Discussion element • Need for additional comments – a black mark? • Reaction to requests / further clarification

  9. Standard of Expectation • Self-Assess = 52% • Peer-Assessed as = 58%

  10. How easy is it to get a mark for marking? • Statistically fairly easy to create a mark for marking based upon the marks • Take into account high and low markers • Standard of expectation • Consistency … judged against the final mark awarded for an essay (compensated median) • What about the comments?
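
As a rough illustration of how a mark for marking might be derived purely from the marks awarded, the sketch below scores a marker by their average deviation from each essay's median peer mark. The median is an assumed stand-in for the compensated median, and the formula and function names are illustrative only, not the method used in the CAP system.

```python
# Illustrative (not the CAP system's actual formula): a marker who tracks
# each essay's final mark closely earns a high mark for marking.
from statistics import median

def mark_for_marking(marks_by_essay, marker_marks, max_mark=100):
    """marks_by_essay: {essay_id: [all peer marks for that essay]}
       marker_marks:   {essay_id: mark this marker awarded}"""
    deviations = []
    for essay_id, awarded in marker_marks.items():
        final = median(marks_by_essay[essay_id])   # stand-in for the compensated median
        deviations.append(abs(awarded - final))
    mean_dev = sum(deviations) / len(deviations)
    # Consistent markers (small mean deviation) score close to max_mark.
    return max(0.0, max_mark * (1 - mean_dev / max_mark))

essays = {"A": [55, 58, 60, 52], "B": [40, 42, 45, 44]}
print(mark_for_marking(essays, {"A": 57, "B": 43}))   # close to 100: a consistent marker
```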

  11. Feedback Index • Produce an index that reflects the quality of commenting • Produce an average feedback index for an essay • Compare each marker against it in a similar manner to the marks analysis • Where does this feedback index come from and is it valid?

  12. How to get the feedback index? • Develop an application?? • C-Rater? • Spelling mistakes • Similar meanings? e.g. ‘That was cool’, ‘Really Choc’, ‘Really Good Essay’ • Manually

  13. Commonality!! • In the 67 essays that were marked • Only 96 comments • 44% positive and 56% negative • Highly critical if something is not explained properly (21% of total comments, of which 73% were negative) • Better students were more critical than weaker students .. • Better understanding permitting criticism? • Confidence? • Hostility? • Comments grouped into 10 categories • Need to QUANTIFY these comments .. feedback index • Create a database holding positive & negative comments (by category)
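
A hedged sketch of how such a database of categorised comments could feed a feedback index follows. The category labels and the coverage-based formula are placeholders for illustration only; the study's own ten categories and its index calculation are not reproduced here.

```python
# Illustrative only: each comment is logged as positive or negative against
# one of ten (placeholder) categories, and this toy index rewards breadth
# of coverage, so richer feedback scores higher.
from collections import Counter

CATEGORIES = ["explanation", "structure", "references", "spelling",
              "depth", "examples", "argument", "presentation",
              "originality", "conclusion"]        # hypothetical labels

def feedback_index(comments):
    """comments: list of (category, polarity) pairs, polarity in {'+', '-'}."""
    tallies = Counter(comments)                   # e.g. ('explanation', '-'): 2
    covered = {category for category, _ in comments}
    # Placeholder index: percentage of the ten categories commented on at all.
    return len(covered) / len(CATEGORIES) * 100, tallies

index, tallies = feedback_index([("explanation", "-"), ("structure", "+"),
                                 ("references", "-")])
print(index)   # 30.0 for a marker who touched 3 of the 10 categories
```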

  14. Time Consuming? • Can we formulate the marking process? • Take away the need for the quantification step of analysing comments • Is it still peer-assessment if the students are told what to say?

  15. STUDENT FRED — REFERENCES: Positive ………  Negative …….  Personal Valuation  5, 3, 2, 1   3, 1, 2
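
If the numbers on this slide are read as the identifiers of pre-defined positive and negative comments that a student ticks for each category (an assumption; the slide does not state this explicitly), the structured form could be recorded roughly as below, making the feedback index computable without manual analysis of free text. All names are hypothetical.

```python
# Hypothetical encoding of the structured feedback form sketched on slide 15:
# per category, the student selects numbered pre-defined positive and negative
# comments and may add an optional free-text personal valuation.
from dataclasses import dataclass, field
from typing import List

@dataclass
class CategoryFeedback:
    category: str
    positive: List[int] = field(default_factory=list)   # selected comment numbers
    negative: List[int] = field(default_factory=list)
    personal_valuation: str = ""                         # optional free text

fred_references = CategoryFeedback(
    category="References",
    positive=[5, 3, 2, 1],
    negative=[3, 1, 2],
)
# With selections stored like this, the feedback index can be derived directly
# from the database, with no manual quantification of free-text comments.
```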

  16. Some points outstanding • What should students do if they identify plagiarism? • Is it ethical to get students to mark the work of their peers? • Is a computerised solution valid for all? • At what age / level can we trust the use of peer assessment? • How do we assess the time required to perform the marking task?

  17. Peer-Assessment: No marks required, just feedback? • Not there yet • Feedback index results are very positive • More evaluation is required before a totally automated system is possible • Getting there .. removing subjectivity in marking
