Joint Information Systems Committee

Presentation Transcript


  1. Myles Danson, John Winkley
     Programme Manager – e-Learning
     e-Assessment
     Joint Information Systems Committee
     Supporting education and research

  2. Assessment in JISC’s e-Learning Programme
     • Personalised learning
     • Lifelong learning
     • Learning & Teaching Practice
     • Technology & Standards
     • Learning Resources and Activities
     • E-Assessment
     • Technology Enhanced Learning Environments
     • E-Administration
     • E-Portfolios
     • Strategy & Policy
     • Work-based learning
     • Widening participation

  3. E-Assessment Strand – Evaluation Session
     AIM
     • To share findings from our evaluation work to date
     • To help projects with their internal evaluation work
     • To obtain projects’ input to blueprints for future project work
     ACTIVITIES
     • Feedback from the current evaluation work – process
     • Supporting projects in their evaluation
     • Baseline, blueprint and future projects

  4. “Classifying” projects

  5. Current projects

  6. Programme Development activity
     Context:
     • Toolkit projects completed
     • Major programmes of curriculum delivery and design commencing
     • Evidence that e-assessment is “broadening” as a field
     • Evidence that e-assessment is not yet as well embedded as had been hoped
     Timely opportunity to consider:
     • Where are we now? [Evaluation/Baseline] – done (mostly)
     • Where are we trying to get to? [Vision] – draft
     • What should we do next? [Blueprint/Roadmap] – next!
     …sounds familiar?

  7. Undertaking Evaluation – our approach
     • Qn A: What have we done/built/achieved, to what quality and how efficiently?
     • Qn B: What has been learned or confirmed through the development activities?
     • Qn C: How has the learning been acted on and disseminated?

  8. Undertaking Evaluation – our approach (2)
     • Innovations in process and practice
     • Sustainable institutional change
     • Tangible benefits
     • Technical developments
     • Unanticipated outcomes
     • Relevance to/response from sector
     • Lessons learned/increased knowledge

  9. Group Session 1 – Evaluation and Vision
     Your evaluation work:
     • Is your approach formative or summative?
     • What sort of questions are you asking? (Does your approach map easily onto Qns A, B and C?)
     • What evidence are you gathering to support the evaluation findings?
     • What aspects of the evaluation are you finding easy and difficult?
     Is our vision for e-assessment:
     • Complete?
     • Ambitious?
     • Achievable?
     • Measurable?

  10. Innovations in process and practice
     • The REAP project has provided powerful new evidence of the use of formative assessment within HEIs.
     • The technologies required to support formative assessment are broader than the traditional view of e-assessment (MCQs, on-screen testing systems).
     • E-assessment remains a priority, but the focus of implementation projects has shifted. Effective assessment remains at the heart of the learning process, and applying ICT to it now demands more work on pedagogy, management and cultural aspects than on the technology itself, which appears to be largely adequate.

  11. Sustainable institutional change
     • JISC has produced substantial resources to support e-assessment embedding, and there are additional resources available in the sector (e.g. from the HEA).
     • There are clearly understood methods for embedding e-assessment in institutions.
     • E-assessment is not yet widely embedded in institutions.
     • JISC’s activities are unlikely to achieve this embedding on their own.
     • Summative e-assessment has achieved a higher level of embedding within UK FE and schools.

  12. Tangible benefits
     Aim: to explore and develop effective practice in the use of e-assessment systems and tools through the development of standards and the piloting of e-assessment-related technologies, and to provide guidance for institutions on effective practice in this area.
     Evaluation findings:
     • The IMS QTI 1 and 2 specifications have been thoroughly explored, and the UK HEI e-assessment community maintains a leading position in their development and implementation.
     • The WebPA assessment tool is being used across a range of institutions for peer and group assessment (a simplified sketch of this kind of peer moderation follows this slide).
     • The REAP project and the e-assessment case studies have demonstrated that e-assessment can bring effectiveness and efficiency benefits to a wide range of HEI programmes.
     • A small but active community of developers and educationalists has been created and sustained in the area of e-assessment.
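For readers unfamiliar with peer-moderated group assessment, the sketch below shows the general idea: each group member rates everyone's contribution, each assessor's ratings are normalised to distribute a fixed total, and the resulting factors scale a shared group mark. This is a simplified illustration, not the actual WebPA algorithm; the function name, ratings and marks are invented for the example.

```python
# Simplified sketch of peer-moderated group marking (not the exact WebPA
# algorithm). All names and numbers below are invented for illustration.
def peer_factors(ratings):
    """ratings: {assessor: {member: raw score}} -> {member: moderation factor}."""
    members = sorted({m for given in ratings.values() for m in given})
    totals = dict.fromkeys(members, 0.0)
    for given in ratings.values():
        denom = sum(given.values()) or 1          # guard against all-zero ratings
        for member, score in given.items():
            totals[member] += score / denom       # each assessor distributes 1.0
    scale = len(members) / len(ratings)           # factors then average to 1.0
    return {m: totals[m] * scale for m in members}

group_mark = 65
ratings = {
    "ann": {"ann": 3, "bob": 4, "col": 2},
    "bob": {"ann": 3, "bob": 3, "col": 3},
    "col": {"ann": 4, "bob": 4, "col": 1},
}
for member, factor in peer_factors(ratings).items():
    print(member, round(group_mark * factor, 1))  # individual moderated marks
```

With these invented ratings the shared mark of 65 is redistributed to roughly 72.2, 79.4 and 43.3, so the factors preserve the group total while rewarding the members rated as contributing more.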

  13. Technical developments
     • The technical aspects of IMS QTI 2.x development have continued successfully, but questions remain about the purpose and future applicability of such work (an illustrative QTI item is sketched below).
     • E-assessment core toolkit components are now largely complete to “proof of concept” stage.
     • Experiences with OSS and SOA development have been largely positive; most projects have gained valuable knowledge and experience by working with these methods.
     • The E-Framework and FREMA are not widely engaged with by projects.
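Since QTI recurs throughout these findings, a concrete item may help readers unfamiliar with the specification. The sketch below shows the general shape of an IMS QTI 2.1 multiple-choice item and how such an item can be read with Python's standard library. The question text and identifiers are invented for illustration; this is not output from any project mentioned here.

```python
# Illustrative only: a minimal QTI 2.1 choice item, parsed with the Python
# standard library. Item content and identifiers are invented.
import xml.etree.ElementTree as ET

QTI_NS = "http://www.imsglobal.org/xsd/imsqti_v2p1"

ITEM_XML = f"""
<assessmentItem xmlns="{QTI_NS}" identifier="choice-demo"
                title="Demo choice item" adaptive="false" timeDependent="false">
  <responseDeclaration identifier="RESPONSE" cardinality="single" baseType="identifier">
    <correctResponse><value>B</value></correctResponse>
  </responseDeclaration>
  <itemBody>
    <choiceInteraction responseIdentifier="RESPONSE" shuffle="true" maxChoices="1">
      <prompt>Which body funded this e-assessment programme?</prompt>
      <simpleChoice identifier="A">HEA</simpleChoice>
      <simpleChoice identifier="B">JISC</simpleChoice>
    </choiceInteraction>
  </itemBody>
</assessmentItem>
"""

root = ET.fromstring(ITEM_XML)
ns = {"qti": QTI_NS}
# Read the declared correct response and the available choices.
correct = root.find("qti:responseDeclaration/qti:correctResponse/qti:value", ns).text
choices = {c.get("identifier"): c.text for c in root.findall(".//qti:simpleChoice", ns)}
print(f"Correct response: {correct} ({choices[correct]})")
```

The point of the specification is that any conformant delivery system can render and score an item like this, which is why interoperability questions dominate the technical discussion.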

  14. Unanticipated outcomes
     • The technical requirements for an e-assessment system have evolved as the various technically focused projects have progressed.
     • Implementation problems with QTI 2.0 have been fed back to the IMS team.
     • The operational challenges of implementing e-assessment, particularly in multi-institutional settings, are greater than expected.
     • http://www.frema.ecs.soton.ac.uk/db/

  15. Relevance to/response from sector
     • JISC’s wider stakeholder engagement is largely undertaken by the programme manager and an expert consultant.
     • A sustainable e-assessment community comprising technologists and educationalists has emerged.
     • Interest in e-assessment generally appears to remain high. Wider consultation with the sector has been limited (this has not been the role of the majority of commissioned projects), but there have been some notable successes that demonstrate continued wider interest in e-assessment.
     • The National Student Survey continues to highlight both the importance of quality, timely feedback and the fact that many students feel it is not being provided to them.
     • Where projects make efforts to promote their work, the effort appears to be rewarded with interest.
     • Subject clusters appear to provide a locus for sustainable activity.
     • The group of technical e-assessment developers is well served by JISC’s work.
     • Wider demand for QTI (in HEIs and elsewhere) is weak.
     • There is confusion about the technical/educational purpose of toolkit developments.

  16. Group Session 2 – Evaluation and Blueprint
     • Do you agree with our findings? Pick one or two areas that are most familiar to you. (15 mins discussion)
     • What should JISC include in its blueprint for future work? Our tentative thoughts are shown on the last page. (10 mins discussion)

  17. Myles Danson – e-Learning Programme Manager, JISC
     m.danson@jisc.ac.uk | 07796 336319
     John Winkley – Programme Consultant, AlphaPlus Consultancy
     john.winkley@alphaplusconsultancy.co.uk | T 01274 412490 | M 07973 744617
     www.alphaplusconsultancy.co.uk
