
Presentation Transcript


  1. Computerised Peer-Assessment that supports the rewarding of evaluative skills in Essay Writing (C.A.P.) & Programming (CourseMarker). Phil Davies & Stuart Lewis, School of Computing, University of Glamorgan

  2. Need for Assessment? • As tutors we are trying to “separate” the sheep from the goats via the assessment process. • This can often be difficult with the time constraints imposed on tutors, in what Chanock describes as “more goat-friendly times” (Chanock, 2000). • The problem … feedback against time!

  3. Defining Peer-Assessment • In describing the teacher … ‘A tall b******, so he was. A tall thin, mean b******, with a baldy head like a lightbulb. He’d make us mark each other’s work, then for every wrong mark we got, we’d get a thump. That way’ – he paused – ‘we were implicated in each other’s pain.’ McCarthy’s Bar (Pete McCarthy, 2000, p. 68)

  4. What functionality do we require in a computerised peer-assessment system? • Method to peer-mark & COMMENT • Method to allow students to view comments • Method to permit anonymous conversation between marker and marked • Method to take into account high/low markers … fair to all • Method to qualitatively assess the marking & commenting processes (higher-order skills) • Method to permit a student to make comments within a framework that is understandable to both them and the essay’s owner • Security / recognise and avoid plagiarism / flexibility (a minimal data-model sketch follows this list)
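To make these requirements concrete, here is a minimal data-model sketch in Python. It is our illustration only (the class and field names are hypothetical, not the actual C.A.P. schema): every participant acts under an opaque alias, so marker and marked can hold the anonymous conversation the list above asks for.

    import uuid
    from dataclasses import dataclass, field

    def make_alias():
        """Opaque pseudonym so identities are never revealed in a review."""
        return uuid.uuid4().hex[:8]

    @dataclass
    class Comment:
        author_alias: str      # pseudonym, never the real student ID
        text: str

    @dataclass
    class Review:
        reviewer_alias: str    # anonymised marker
        mark: int              # peer mark awarded
        comments: list = field(default_factory=list)  # structured feedback
        thread: list = field(default_factory=list)    # anonymous follow-up discussion

    @dataclass
    class Submission:
        owner_alias: str       # anonymised author of the essay/program
        reviews: list = field(default_factory=list)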

  5. AUTOMATICALLY EMAIL THE MARKER … ANONYMOUSLY

  6. Markers must be rewarded for doing the ‘mark for marking’ process … based on quality • How to judge? • Standard of expectation (self-assessment) • Marking consistency • Commenting quality, measured against the mark given • Discussion element • Need for additional comments – a black mark? • Reaction to requests / further clarification

  7. Feedback Index • Produce an index that reflects the quality of commenting • Produce an average feedback index for an essay (also compensated?) • Compare each marker against that average, in a similar manner to the marks analysis • Where does this feedback index come from, and is it valid? (see the sketch below)
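As an illustration of where such an index could come from, the sketch below assigns each standard comment a weight, scores a review as the mean weight of the comments it uses, and averages those scores per essay. The comment names and weights here are hypothetical, not the published C.A.P. values.

    from statistics import mean

    # Hypothetical comment bank and weights (illustrative only).
    COMMENT_WEIGHTS = {
        "missed_research_area": 0.9,
        "uncited_material": 0.8,
        "no_examples": 0.7,
        "poor_grammar": 0.4,
        "poor_spelling": 0.3,
        "unexplained_acronyms": 0.5,
        "copied_notes": 0.9,
        "off_topic": 0.8,
    }

    def feedback_index(comments_used):
        """Index reflecting the quality of one review's commenting."""
        if not comments_used:
            return 0.0
        return mean(COMMENT_WEIGHTS[c] for c in comments_used)

    def essay_average_index(reviews):
        """Average feedback index over every review an essay received."""
        return mean(feedback_index(r) for r in reviews)

A marker whose index sits above the essay average has arguably given richer feedback, mirroring the compensation used in the marks analysis.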

  8. CAA Conference 2003 Future Work • Students marking work tend to use only a subset of these comments. • Their feedback suggests they weight each comment differently when judging the quality of an essay.

  9. Exercise • I think you’ve missed out a big area of the research • You’ve included a ‘big chunk’ that you haven’t cited • There aren’t any examples given to help me understand • Grammatically it is not what it should be like • Your spelling is atrocious • You haven’t explained your acronyms to me • You’ve directly copied my notes as your answer to the question • 50% of what you’ve said isn’t about the question

  10. Each student uses a different set of comments … these new weightings MAY give a better feedback index? Currently being evaluated (a re-weighting sketch follows).
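A minimal sketch of such re-weighting, assuming (our assumption, for illustration only) that a student's or cohort's own ratings override the default weights, with unrated comments falling back to a neutral value:

    from statistics import mean

    def reweighted_index(comments_used, student_weights, neutral=0.5):
        """Recompute a feedback index using weights elicited from
        students; comments they did not rate fall back to `neutral`."""
        if not comments_used:
            return 0.0
        return mean(student_weights.get(c, neutral) for c in comments_used)

    # Example: a cohort that rates spelling remarks as low-value feedback.
    cohort = {"poor_spelling": 0.1, "off_topic": 0.9}
    print(reweighted_index(["poor_spelling", "off_topic"], cohort))  # 0.5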

  11. Is it my job to teach students how to write essays, etc.? • Assessment MUST be directed at subject skills • Why bother writing essays, doing exam questions, etc., if they don’t relate to the needs or learning outcomes of the subject? • Post-HND … N-tier … assess the essays of the final year (last year) • Preparation/research: judge knowledge against last year’s results … both marks & comments • Mistake!!

  12. e.g. a group of marking differences +4, -22, +16, -30, +8, +12 would result in an average difference of -12 / 6 = -2 (taking 0 as the expected difference). The absolute differences from this value of -2 are 6, 20, 18, 28, 10, 14, giving an average consistency valuation of 16 (96/6): poor consistency by this marker. Compare this with a student whose marking differences were +4, -4, -10, -8, +6, 0. The average difference for this marker is again -2 (-12/6), but the absolute differences from that value are 6, 2, 8, 6, 8, 2, giving a consistency valuation of 5.33 (32/6). This student deserves much more credit for their marking, even though the average difference of the two sets of markings was the same. The fact that a student always high- or low-marked is now removed, as it is the absolute deviation from their own average that is being compared (see the sketch below).
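The calculation is a mean absolute deviation around each marker's own average difference. A minimal Python sketch (ours for illustration, not the system's code) reproduces the figures above:

    from statistics import mean

    def consistency_valuation(diffs):
        """Mean absolute deviation of a marker's differences from their
        own average difference: a low value means consistent marking,
        even if the marker is biased high or low overall."""
        avg = mean(diffs)                       # the marker's personal offset
        return mean(abs(d - avg) for d in diffs)

    print(consistency_valuation([+4, -22, +16, -30, +8, +12]))  # 16.0
    print(consistency_valuation([+4, -4, -10, -8, +6, 0]))      # 5.333...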

  13. Who benefited the most by doing this exercise? • Cured plagiarism?

  14. Can the same principles be applied in other subject areas? • Java programming with CourseMarker • Stuart Lewis’ idea • Students create a solution to a programming assignment • Submission(s) • Peer-evaluate other solutions • Comments … marks for marking (weightings)

  15. CourseMarker (CM): Computer Assisted Teaching and Assessment (supports Modula-2, Java and C) • Teacher side: Exercise Development System … exercise setup, test methods, solution template, marking scheme • Student side: Student Exercise Environment … edit, compile, link, run, submit, receive feedback and mark • CourseMarker Core with File Storage System … assignments, exercises, notes, questions • Marking & Evaluation Systems … final mark, position in class, course statistics, flagging-up of ‘problem cases’ • Immediate support via comments/questions between student and teacher • Features: UNIX (Linux), Windows and Mac … all platforms; assessment of text I/O assignments only (no marking of graphical output); remote student/teacher access … distance learning, open all hours • Advantages for the teacher: re-usability, automated marking (fair, frees time), plagiarism check; disadvantages: steep learning curve, difficult setup (but it’s getting easier) • Advantages for students: immediate feedback, fast support; disadvantages: additional overheads

  16. PeerMarker Screen

  17. Student while marking • Exposure to different solutions • Development of critical evaluative skills • Useful experience of reading code for future employment situations • Plagiarism? … Good solution / No understanding

  18. Student while reviewing feedback from peers • Range of subjective marking • Confirmation of objective automated marking • Anonymous discussion between marker and marked

  19. Current position • Test system working • Changes following beta test in progress • Plans to try sample study again (at a more convenient time, and with added rewards!) • Employed 2nd Placement Student • Graphical Interface

  20. Some Points Outstanding or Outstanding Points • What should students do if they identify plagiarism? • Is it ethical to get students to mark the work of their peers? • Is a computerised solution valid for all? • At what age / level can we trust the use of peer assessment? • How do we assess the time required to perform the marking task? • What split of the marks between creation & marking? • BEST STORY

  21. Contact Information • Phil Davies / Stuart Lewis, School of Computing, University of Glamorgan • pdavies@glam.ac.uk • sflewis@glam.ac.uk • Innovations in Education & Teaching International • ALT-J
