
Welcome to the ITEAM Project Stakeholders Event





  1. Welcome to the ITEAM Project Stakeholders Event

  2. ITEAM Project: Integrating Technology Enhanced Assessment Methods for Student Support and Self-regulation
     Academic Skills Tutor • Personal Tutor • Module Leader • Programme Tutor
     Electronic Voting Systems • Student Dashboard • Questionmark Perception • Electronic Submission & Online Feedback

  3. Assessment for Learning Principles

  4. Assessment for Learning Principles
     These principles suggest that good practice in Assessment for Learning:
     1. Engages students with the assessment criteria
     2. Supports personalised learning
     3. Ensures feedback leads to improvement
     4. Focuses on student development
     5. Stimulates dialogue
     6. Considers student and staff effort

  5. Guidance and support… • Knowledge exchange • Guidance toolkit

  6. Developing good practice… Workshops to embed the principles into the assessment process, and accessible grading criteria for all students

  7. The resource calculator for a summative essay

  8. The comparison

  9. Electronic Voting Systems

  10. Electronic Voting Systems
      To:
      • Encourage active student participation
      • Engage the whole class
      • Check student knowledge
      • Give prompt feedback
      • Give quiet students a voice
      • Add interest and fun
      UH bought:
      • 3845 in 2010-11
      • 3500 in 2011-12
      • 2975 in 2012-13

  11. Many cogs to the wheel…
      Learning and Teaching:
      • Workshops
      • Online resources
      • ADL+Ts Champions
      • Promoting pedagogic use
      • Supporting staff
      • Communication
      Logistics (The EVS Life Cycle):
      • Purchasing
      • Distribution
      • Registration
      • On-going support
      • Return

  12. Working with students
      To simulate the experience of using EVS in an assessment, in order to trigger a dialogue about good practice and develop student-informed guidance for teachers.
      Participants: 11 students from 7 disciplines (Engineering, Computer Science, Creative Arts, Life Science, Business, Law, Psychology)
      Two quizzes of similar content and length were delivered, one without a pre-reading paper and one with.
      Discussion about the experience of using EVS; recommendations.

  13. Future • Procurement window (open now!) • Upgrade summer 2013 • Software/Receivers • Staff support

  14. QuestionMark Perception

  15. QuestionMark Perception
      • What is it?
      • Development between IH and LTI Assessment Team
      • User group
      • Student perspective
      • Staff perspective
      • The future

  16. Development between IH and LTI assessment team
      IH Helpdesk:
      • Request for staff access
      • Request for student participant group
      • Other aspects
      LTI assessment team:
      • Pedagogical aspect of QMP usage
      Staff training jointly run by IH and LTI:
      • 20th June 2013
      • 17th July 2013

  17. QMP user group
      Members:
      • IH system management
      • IH Helpdesk
      • LTI assessment team
      • Academic staff users
      • Professional staff users
      • Educational Technologists
      Next meeting: 4th September 2013

  18. Student perspective With QMP I think • It is easy to handle • Neutral • It is difficult to handle

  19. Future • More and more staff have expressed their interest • Two training sessions are fully booked, with people on a waiting list • Comments: ‘When it works it’s brilliant’; ‘There are issues, but it is worth doing it’

  20. Online submission and feedback

  21. Grading Criteria
      • Supporting consistency across the University
      • Based on ‘top level’ principles of good practice
      • Aligned to UH grade descriptors (UPR 14)
      • Exemplar StudyNet amendments
      • Individualised to School/Department needs
      • Ready for the 2013/14 academic year

  22. Student Progress Dashboard

  23. Student Progress Dashboard
      An IH development which pulls together information about StudyNet (module-level) engagement and assessment performance data and displays it in a single place within StudyNet.
      • To help students with the self-regulation of their studies
      • To help staff identify where support for individuals or groups of students might be needed
      LTI’s role: to support further development by facilitating conversations and gathering feedback from staff and students

  24. Dashboard view: average grade • modules • traffic lights

  25. Dashboard view: filter students • set threshold
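The traffic-light and "filter students / set threshold" features on these dashboard slides can be sketched in code. This is a hypothetical illustration only: the function names, the data shape, and the threshold values are all assumptions made for the example, not the actual StudyNet/IH implementation.

```python
# Hypothetical sketch of the dashboard's traffic-light and filtering logic.
# Grades are assumed to be percentages (0-100); thresholds are invented.

def traffic_light(average_grade, amber_threshold=40, green_threshold=60):
    """Map a module average grade to a traffic-light status."""
    if average_grade >= green_threshold:
        return "green"
    if average_grade >= amber_threshold:
        return "amber"
    return "red"

def filter_students(students, threshold):
    """Return students whose average grade falls below a staff-set threshold."""
    return [s for s in students if s["average_grade"] < threshold]

students = [
    {"name": "Student A", "average_grade": 72},
    {"name": "Student B", "average_grade": 45},
    {"name": "Student C", "average_grade": 33},
]

print([traffic_light(s["average_grade"]) for s in students])  # ['green', 'amber', 'red']
print([s["name"] for s in filter_students(students, 40)])     # ['Student C']
```

Lowering the threshold narrows the filtered list to the students most likely to need support, which is the staff-facing use case the slides describe.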

  26. Informing the development
      Student input: development meetings; focus groups (25 students: 22 male, 3 female; F/T and P/T; UG and PG); student view pilot (running now, 22 students); feedback meeting in May
      Staff input: development meetings; pilot of staff view in Semester A (Business, Computer Science, Health, Engineering, Education, LMS)
      School projects (now): conversations and dissemination (Business, Health, Education, LMS, Engineering)

  27. Student comments
      Students…
      • wanted to know where they were in relation to others, as this would be a motivator to do better (they did not feel it would de-motivate them to know they were at the lower end of the league table)
      • said the currency/completeness of data is important if the dashboard is to be meaningful to them
      • wanted reassurance that their information would only be seen by ‘those needing it’
      • were interested in what else could go in there, e.g. book loans, attendance, module pass/fail rates (a one-stop-shop approach)
      • would like a predictor or calculator for working out their final classification

  28. Staff input • Staff pilot • Bristol Online Survey and some follow-up interviews • School projects

  29. Design Principles: visual appeal • link to other systems • data accuracy • ease of access • staff support • access levels • completeness • data currency • clarity • signposting

  30. Feedback and Questions

  31. How effective do you feel the communication between the LTI team and yourself has been? • Very good • Good • Neutral • Poor • Very poor • Not applicable

  32. Do you feel you have had the right level of involvement in the project? • Yes • No • Not sure

  33. Are you looking forward to lunch? • Quite looking forward to it • Really looking forward to it • Not looking forward to it at all • Don’t mind either way

  34. Thank you for listening and for being part of the ITEAM project! Any questions?
