
Computerised Peer-Assessment



  1. Computerised Peer-Assessment Lecturer getting out of doing marking Phil Davies University of Glamorgan South Wales

  2. Need for assessment? • As tutors we are trying to “separate” the sheep from the goats via the assessment process. • This can often be difficult with the time constraints imposed on tutors, in what Chanock describes as “more goat-friendly times” (Chanock, 2000). • Problem …. Feedback against time!

  3. Defining Peer-Assessment • In describing the teacher … 'A tall b******, so he was. A tall, thin, mean b******, with a baldy head like a lightbulb. He'd make us mark each other's work, then for every wrong mark we got, we'd get a thump. That way' – he paused – 'we were implicated in each other's pain.' McCarthy's Bar (Pete McCarthy, 2000, page 68)

  4. Why Peer-Assessment? • Working together to learn (and be assessed) • Overcoming perceptions • Staff: lecturer getting an easy life by not marking • Student: lecturer getting an easy life by not marking • Student awareness of benefits …. IMPORTANT TANGIBILITY • Success is dependent upon scalability (computerisation) … "Student numbers have risen dramatically since 1991 without a concomitant increase in resources" (Pond et al, 1995)

  5. Computerised Peer-Assessment • CAP System (2000) • Permits students to mark and comment on the work of other students (normally 6-8) • Also an initial self-assess stage (reflection) … used as a standard of expectation • Internet-based, not a Web-based system (developed in Visual Basic / Access)

  6. Having done the marking, what next? • Students should receive feedback • What feedback? • Marks • Comments • Which is most important? • To students, or to staff?

  7. AUTOMATICALLY EMAIL THE MARKER … ANONYMOUSLY

  8. What should the marker do? Reflect • Read the essay again • Take into account the essay owner's comments • Further clarification (if it is needed, then is this a 'black mark' against the marker?) • Try to 'appease' the essay owner? • Modify the mark based upon reflection? • Give more constructive feedback

  9. Exercise One • In the top box … what are the inherent advantages and disadvantages of peer-assessment? • Middle box … mark (/10) and comment to support it • Bottom box … mark the marker (/10) and comment to support it • Now give the work back! • Which of the above required higher-order skills?

  10. Markers must be rewarded for the 'mark for marking' process … based on quality • How to judge? • Standard of expectation (self-assessment) • Marking consistency • Commenting quality, measured against the mark given • Discussion element • Need for additional comments – a black mark? • Reaction to requests for further clarification

  11. How easy is it to get an automated 'mark for marking'? • Statistically it is fairly easy to create a mark for marking based upon the marks • Take into account high and low markers • Standard of expectation • Consistency … judged against the final mark awarded for an essay (compensated median) • What about the comments?
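The marks-only 'mark for marking' described above can be sketched in code. This is a minimal illustration under stated assumptions: the slides do not give the CAP System's actual formulas, so here the "compensated median" is taken to mean the median of peer marks after correcting each marker's overall high/low bias, consistency is penalised linearly, and all data, names and scaling factors are hypothetical.

```python
# Sketch of an automated 'mark for marking' based on marks alone.
from statistics import mean, median

# peer_marks[essay][marker] = mark out of 10 (illustrative data only;
# marker_3 is a consistently low marker)
peer_marks = {
    "essay_A": {"marker_1": 7, "marker_2": 8, "marker_3": 4},
    "essay_B": {"marker_1": 6, "marker_2": 7, "marker_3": 3},
    "essay_C": {"marker_1": 8, "marker_2": 9, "marker_3": 5},
}

def marker_bias(marks):
    """How far each marker sits above/below the raw essay medians."""
    raw_medians = {e: median(ms.values()) for e, ms in marks.items()}
    bias = {}
    for e, ms in marks.items():
        for m, mark in ms.items():
            bias.setdefault(m, []).append(mark - raw_medians[e])
    return {m: mean(ds) for m, ds in bias.items()}

def compensated_medians(marks):
    """Final essay marks after removing each marker's high/low tendency."""
    bias = marker_bias(marks)
    return {e: median(mark - bias[m] for m, mark in ms.items())
            for e, ms in marks.items()}

def consistency(marks):
    """Mark for marking: mean absolute gap between a marker's
    bias-corrected marks and the final essay marks, scaled to /10."""
    bias = marker_bias(marks)
    finals = compensated_medians(marks)
    gaps = {}
    for e, ms in marks.items():
        for m, mark in ms.items():
            gaps.setdefault(m, []).append(abs((mark - bias[m]) - finals[e]))
    return {m: round(max(0.0, 10 - 2 * mean(gs)), 1) for m, gs in gaps.items()}
```

Note how the bias correction handles the slide's "take into account high and low markers" point: marker_3 marks three points low everywhere, yet after compensation scores the same consistency as the others.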

  12. Feedback Index • Produce an index that reflects the quality of commenting • Produce an average feedback index for an essay (also compensated?) • Compare against the marker in a similar manner to the marks analysis • Where does this feedback index come from, and is it valid?

  13. How to get the feedback index? • Develop an application?? • C-Rater? • Spelling mistakes • Similar meanings? • 'That was cool' • 'Really choc' • 'Really good essay' • Manually

  14. Commonality!! • In the 67 essays that were marked • Only 96 comments • 44% positive and 56% negative • Highly critical if something was not explained properly • Comments grouped into 10 categories • Need to QUANTIFY these comments … feedback index • Create a database holding positive & negative comments (by category)
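Quantifying the categorised comments into a feedback index might look like the following. The slides say only that comments fell into 10 categories, split positive/negative, and were held in a database; the category names, weights, and the averaging scheme here are illustrative assumptions, not the CAP System's actual values.

```python
# Sketch: a 'database' of comment categories with weights /10, and a
# feedback index computed as the average weight of a marker's comments.

# (category, polarity) -> weight out of 10; critical (negative) comments
# are weighted higher, echoing the finding that better students were critical
comment_weights = {
    ("explanations", "positive"): 6,
    ("explanations", "negative"): 8,
    ("references",   "positive"): 5,
    ("references",   "negative"): 7,
    ("examples",     "positive"): 5,
    ("examples",     "negative"): 7,
    ("conclusions",  "positive"): 5,
    ("conclusions",  "negative"): 7,
}

def feedback_index(comments):
    """Average weight of a marker's comments (0.0 if none were made)."""
    if not comments:
        return 0.0
    scores = [comment_weights.get(c, 0) for c in comments]
    return sum(scores) / len(scores)

# a hypothetical marker's comments on one essay
made = [("explanations", "negative"), ("references", "negative"),
        ("examples", "positive")]
index = feedback_index(made)  # averages the weights 8, 7 and 5
```

An essay's average feedback index, and the "compensation" of high and low commenters, could then be derived from these per-marker indices in the same manner as the marks analysis.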

  15. General points of note • Main criteria used for peer-marking: explanations, conclusions, references, examples • Better students (upper quartile) were critical • Better understanding permitting criticism? • Confidence? • Hostility? • Weaker students in 'cloud cuckoo' land … consistently • Consistency of markers in marks & comments: 30-34, 35-39, 55-59, 65-69, 70-74 • Note: the 70-74 band OVER-comment & mark

  16. Time Consuming? • Can we formularise the marking process? • Take away the need for the quantification process of analysing comments • Is it still peer-assessment if the students are told what to say?

  17. Exercise Two • I think you’ve missed out a big area of the research • You’ve included a ‘big chunk’ that you haven’t cited • There aren’t any examples given to help me understand • Grammatically it is not what it should be like • Your spelling is atroceious • You haven’t explained anything to me • You’ve directly copied my notes as your answer to the question • Most of what you’ve said isn’t about the question

  18. STUDENT FRED • References: Positive … 5, 3, 2, 1 • Negative … 3, 1, 2 • Personal Valuation
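Slide 18 tallies which numbered comments from the standard list FRED used (positive: 5, 3, 2, 1; negative: 3, 1, 2) alongside a personal valuation. One hedged sketch of how such a tally might be blended into a single marking score follows; the weights, the coverage idea, and the function itself are illustrative assumptions, as the slides do not define this combination.

```python
# Sketch: blend a marker's comment tally and personal valuation into one
# score /10. All weightings below are hypothetical, not from the CAP System.

def marking_score(positive_ids, negative_ids, personal_valuation,
                  w_pos=0.4, w_neg=0.4, w_personal=0.2, n_standard=8):
    """Weighted blend of comment coverage and the marker's own valuation.
    Coverage = fraction of the 8 standard comments used, scaled to /10."""
    pos = 10 * len(set(positive_ids)) / n_standard
    neg = 10 * len(set(negative_ids)) / n_standard
    return round(w_pos * pos + w_neg * neg + w_personal * personal_valuation, 1)

# FRED's tally from the slide, with a hypothetical personal valuation of 7
fred = marking_score([5, 3, 2, 1], [3, 1, 2], personal_valuation=7)
```

Any such formula would need the same scrutiny as the feedback index: whether rewarding coverage of a fixed comment list still measures higher-order marking skill.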

  19. Is it my job to teach students how to write essays, etc.? • Assessment MUST be directed at subject skills • Why bother writing essays, doing exam questions, etc. if they don't relate to the needs or learning outcomes of the subject? • Post HND … N-tier … assess the essays of the final year (last year) • Preparation/research: judge knowledge against last year's results … both marks & comments • Mistake!!

  20. Who benefited the most by doing this exercise? Cured plagiarism?

  21. Can peer-assessment benefit other subject areas? • Java programming with CourseMarker • Stuart Lewis' idea • Students create a solution to a programming assignment • Submission(s) • Peer-evaluate other solutions • Comments … marks for marking (weightings)

  22. CourseMarker (CM): Computer-Assisted Teaching and Assessment (architecture diagram; supports Modula-2, Java and C) • Teacher side: exercise development system (assignments, exercises, notes, questions, exercise setup, test methods, solution template, marking scheme) • Student side: exercise environment (edit, compile, link, run, submission) backed by a file storage system • Marking & evaluation systems: immediate feedback and mark, final mark, position in class, course statistics, flagging-up of 'problem cases', immediate support for comments/questions • Features: runs on UNIX (Linux), Windows and Mac; assessment of text I/O assignments only (no marking of graphical output); remote student/teacher access (distance learning, open all hours) • Advantages: re-usability, automated marking (fair, frees time), plagiarism check, immediate feedback, fast support • Disadvantages: steep learning curve, difficult setup (but it's getting easier), additional overheads

  23. PeerMarker Screen

  24. Student while marking • Exposure to different solutions • Development of critical evaluative skills • Useful experience of reading code for future employment situations • Plagiarism? … Good solution / No understanding

  25. Student while reviewing feedback from peers • Range of subjective marking • Confirmation of objective automated marking • Anonymous discussion between marker and marked

  26. Current position • Test system working • Changes following beta test in progress • Plans to try sample study again (at a more convenient time, and with added rewards!) • Employ 2nd Placement Student • Graphical Interface

  27. Some Points Outstanding, or Outstanding Points • What should students do if they identify plagiarism? • Is it ethical to get students to mark the work of their peers? • Is a computerised solution valid for all? • At what age / level can we trust the use of peer-assessment? • How do we assess the time required to perform the marking task? • What split of the marks between creation & marking? • BEST STORY

  28. Contact Information • pdavies@glam.ac.uk Phil Davies School of Computing University of Glamorgan • Innovations in Education & Teaching International • ALT-J • CAA Conference site (excellent PDF resource)

  29. Time for questions?
