
FREMA: e-Learning Framework Reference Model for Assessment






  1. FREMA: e-Learning Framework Reference Model for Assessment • David Millard, Yvonne Howard • IAM, DSSE, LTG, University of Southampton, UK

  2. Introduction • What is FREMA? • A JISC-funded project between Southampton, Strathclyde and Hull • Part of the e-Learning Framework (ELF) effort • What is the ELF? • A Service-Oriented Architecture for e-learning systems • Layered (Domain Services over Common Services) • Dynamic and evolving • FREMA will develop a Reference Model for the Assessment domain • What is a Reference Model? • A description of how services behave within a particular domain • A community resource

  3. E-Learning • In this context the term should be interpreted broadly • A Common view: • Using computers to deliver course material • Making (adaptive) content available • Automatic assessment • The Broader View: • Using computers to support all aspects of teaching and learning • Organising learning • Designing (scheduling, timetabling, etc) • Run-time (workflow, communication, etc) • Supporting virtual community (study groups, classes, etc) • Virtual organisations (UoM) • Facilitating Quality Assurance

  4. The Assessment Domain • Fundamental part of learning • Formative • Summative • Supporting design-time activities • Locate assessment items for courses • Supporting run-time activities • Marking • Reporting to learners • Plagiarism detection • Virtual organisations and lifelong learning • Information may need to be kept for a long time • Learner portfolios • A degree from one institution, courses from another • Trust and integrity issues
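To make "domain services" concrete, here is a hypothetical Java sketch (invented for this transcript, not part of FREMA) of the kind of interface a run-time marking and reporting service might expose; the names MarkingService, markResponse and Result are illustrative assumptions only.

    import java.util.List;

    // Hypothetical sketch: FREMA does not define this interface.
    // It illustrates the run-time activities listed above (marking,
    // reporting to learners) as operations on a domain service.
    public interface MarkingService {

        // The outcome of marking one candidate response (invented type).
        record Result(String itemId, double score, String feedback) {}

        // Mark a candidate's response against a given assessment item.
        Result markResponse(String itemId, String candidateResponse);

        // Report a learner's accumulated results, e.g. for a portfolio
        // that outlives any single course or institution.
        List<Result> resultsForLearner(String learnerId);
    }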

  5. E-Learning Framework (ELF) • Part of the JISC e-learning programme • A Service-Oriented Architecture for e-learning • Based on Web Services • Needs to bring in existing standards and systems • Evolutionary, not designed up front • JISC's strategy is to fund overlapping projects and see what sticks! • The services are multi-layered: User Agents sit on Learning Domain Services, which sit on Common Services
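As a minimal sketch of what "based on Web Services" means in practice, the following self-contained Java example publishes a toy SOAP endpoint using JAX-WS (the API bundled with Java SE 6–10); the ItemBank service and its single operation are invented for illustration and are not taken from ELF.

    import javax.jws.WebMethod;
    import javax.jws.WebService;
    import javax.xml.ws.Endpoint;

    // Toy SOAP service in the style ELF assumes: the container derives
    // the WSDL contract from the annotated class.
    @WebService
    public class ItemBank {

        // A single invented operation: fetch an assessment item by id.
        @WebMethod
        public String getItem(String itemId) {
            return "<assessmentItem id=\"" + itemId + "\"/>";
        }

        public static void main(String[] args) {
            // Publish the endpoint; its generated WSDL is served at
            // http://localhost:8080/itembank?wsdl
            Endpoint.publish("http://localhost:8080/itembank", new ItemBank());
        }
    }

A layered framework like ELF then standardises which such operations belong to which service, so that independently written clients and services interoperate.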

  6. The ELF Wall [diagram: the framework's services drawn as bricks in a wall]

  7. Identifying Domains • JISC are building the services of the framework (bricks of the wall) by focusing on domains • Assessment • Learning content • Enterprise • Personal Development Planning • Personal Learning Environment • Resource Repositories • The objective is to identify services that should work together within the domain

  8. Assessment and ELF • Not enough to describe and define these services • Need a proper audit trail of decision making • Start by defining the domain • Work up through the services to a reference implementation • This is an ELF Reference Model

  9. Anatomy of a Reference Model [diagram: the layers of the model — Assessment Domain Definition, Common Usage Patterns, Use Cases, Gap Analysis, Service Profiles, Reference Implementation] • Domain Definition • Overview of the domain, and how projects and standards fit within it • Identifying Common Usage Patterns • Scoping the FREMA project • Developing Use Cases • Formal descriptions of usage patterns • Gap Analysis • Mapping of Use Cases to the Services in ELF • Service Profiles • Formal descriptions of those services • Reference Implementation • Of key/core services • Examples • Validation

  10. What does it look like? • An evolving, cross-referenced, searchable web site • Indexed resources and narrative descriptions of the domain • UML Use Cases and Scenario documents • Service descriptions, narrative and WSDL • Service implementations to download (Java/.NET) • Different gateways into the model according to how you want to use it [diagram: the reference-model layers from slide 9]

  11. How might you use it? • Use the Reference Implementation • Build on some or all of the developed services • Use the Service Profiles • To develop your own services that will fit into the framework • Use the Use Cases • To help understand usage patterns within the domain • Develop new Service Profiles and thus Services • Use the Domain Definition • To develop a context for your own work • Understand how existing work fits together • Identify standards • Locate experts [diagram: the reference-model layers from slide 9]

  12. Road Map to a Reference Model • Incremental • Evolutionary • Agile • Community Open Source • Components • Using existing open source infrastructures and web services, e.g. • Web servers, authentication services

  13. 4 work packages, 11 deliverables, 1 year • Work Package 1 • Domain Definition • Define our ‘footprint’ in the domain • Work Package 2 • Use cases and scenarios • Web Service profiles • Reference implementation of a core model • Work Package 3 • Review and evolve use cases from WP2 • Extend the core use cases • Evolve and extend the web service profiles • Evolve and extend the reference implementation [timeline milestones: July ‘05, October ‘05, April ‘06]

  14. WP4: Engagement and Dissemination • Working with our Domain Experts • CETIS Assessment SIG • Research Projects • TOIA – Technologies for Interoperable Assessment • ASSIS – Assessment Sequencing • APIS – Assessment Provision through Interoperability • … and many others • Standards Bodies • In Assessment – IMS, OSIDs … • In Web Services – SOAP, WSDL, WSRF, W3C … • Working with the e-learning community to disseminate and evolve the web service framework • Toolkit developers – ASAP • Application developers

  15. ELF Bricks • We can start to identify a set of infrastructure services that are already available in the distributed services world

  16. Using the OMII stack [stack diagram, top to bottom: Applications → application services → PBAC (Process Based Access Control) authorisation service → WS-Security authentication → Tomcat and Axis web server and SOAP message handler]
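To show where the authentication layer of such a stack sits, here is a hedged sketch of a SOAP handler that rejects requests lacking a WS-Security header. It is a simplification and an assumption: it uses the generic JAX-WS handler API rather than OMII's actual Axis handler chain, and a real WS-Security layer would also verify tokens, signatures and timestamps.

    import java.util.Collections;
    import java.util.Set;
    import javax.xml.namespace.QName;
    import javax.xml.soap.SOAPException;
    import javax.xml.ws.handler.MessageContext;
    import javax.xml.ws.handler.soap.SOAPHandler;
    import javax.xml.ws.handler.soap.SOAPMessageContext;

    // Simplified sketch: a pipeline handler that insists on a
    // wsse:Security header before a request reaches the application
    // service above it in the stack.
    public class RequireWsSecurityHandler implements SOAPHandler<SOAPMessageContext> {

        private static final QName WSSE_SECURITY = new QName(
            "http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd",
            "Security");

        @Override
        public boolean handleMessage(SOAPMessageContext ctx) {
            if ((Boolean) ctx.get(MessageContext.MESSAGE_OUTBOUND_PROPERTY)) {
                return true; // only inspect incoming requests
            }
            try {
                // Continue processing only if the header is present.
                return ctx.getMessage().getSOAPHeader() != null
                    && ctx.getMessage().getSOAPHeader()
                          .getChildElements(WSSE_SECURITY).hasNext();
            } catch (SOAPException e) {
                return false; // treat malformed messages as unauthenticated
            }
        }

        @Override
        public boolean handleFault(SOAPMessageContext ctx) { return true; }

        @Override
        public void close(MessageContext ctx) {}

        @Override
        public Set<QName> getHeaders() {
            return Collections.singleton(WSSE_SECURITY);
        }
    }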

  17. Software Engineering Research Questions for e-learning • Reliability of long running, distributed transactions • Using modelling and simulation for orchestrations of distributed web services • Compensation models for long running transactions
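As an illustration of the compensation idea behind that last research question, here is a minimal, self-contained Java sketch (invented for this summary, not FREMA code): each step that completes registers an undo action, and a failure part-way through runs the recorded compensations in reverse order, saga-style.

    import java.util.ArrayDeque;
    import java.util.Deque;

    // Minimal compensation model for a long-running transaction: there is
    // no global lock or rollback, so completed steps are undone by
    // explicit compensating actions, most recent first.
    public class CompensatingTransaction {

        private final Deque<Runnable> compensations = new ArrayDeque<>();

        // Run a step and remember how to undo it if a later step fails.
        public void step(Runnable action, Runnable compensation) {
            action.run();
            compensations.push(compensation);
        }

        // Undo all completed steps in reverse order.
        public void compensate() {
            while (!compensations.isEmpty()) {
                compensations.pop().run();
            }
        }

        public static void main(String[] args) {
            CompensatingTransaction tx = new CompensatingTransaction();
            try {
                tx.step(() -> System.out.println("reserve exam slot"),
                        () -> System.out.println("release exam slot"));
                tx.step(() -> { throw new RuntimeException("marking service down"); },
                        () -> System.out.println("cancel marking request"));
            } catch (RuntimeException e) {
                System.out.println("failure: " + e.getMessage());
                tx.compensate(); // prints "release exam slot"
            }
        }
    }

Real orchestrations add the hard parts the slide alludes to: persisting the compensation log so it survives crashes, and deciding what happens when a compensation itself fails.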

  18. Conclusions • E-learning is: • Distributed systems • Information management • Workflow and communication • Service-oriented framework (ELF) • Reference models • Define the domain services • But also rely on common services • Successful e-learning services should be underpinned by good software engineering practices • Ensure the reliability, security and integrity of the resources and services
