
Methodologies evaluation


Presentation Transcript


  1. Methodologies evaluation
  AgentLink III AOSE TFG, Budapest, 17 Sep. 2005

  2. Evaluation framework for AOSEM
  • Towards an evaluation framework for AOSEM
  • Previous approaches
  • Questionnaire results
  • Review
  • Outline and plan for document on AOSEM evaluation framework

  3. An evaluation framework for AOSEM
  • Context
    • Diverse scope of application of methodologies
      • Several aspects: analysis, design, implementation, deployment, validation, verification, etc.
      • Several application domains: from closed systems to open systems, web support, etc.
    • Tool support
      • Tools for modelling and code generation
      • Some methodologies have no tool support at all (or only in a very experimental state)
    • Development process not always defined
    • Different notations
    • Different agent concepts
  • Standardization efforts
    • Several approaches to integration:
      • A common standard agent specification language: which one?
      • Fragments: method engineering

  4. An evaluation framework for AOSEM
  • Evaluation of AOSEM can help towards the success of AOSE:
    • Clarification of concepts => towards some standardization
    • Integration of fragments
    • Definition of AOSE processes: from heavyweight to lightweight approaches
    • Promotion of tools

  5. Inputs for AOSEM evaluation
  • A. Sturm, O. Shehory, D. Dori (2004). Evaluation of Agent-Oriented Methodologies. In: AL3 TF1-AOSE TFG.
  • Q.N. Tran, G. Low (2005). Comparison of ten agent-oriented methodologies. In: B. Henderson-Sellers and P. Giorgini (eds.), Agent-Oriented Methodologies. Idea Group Publishing, Chapter XII, pp. 341-367.
  • C. Bernon et al. (2004). A Study of Some Multi-Agent Meta-Models. In: Proc. AOSE 2004 (to appear in LNCS, Springer-Verlag).
  • L. Cernuzzi, G. Rossi (2002). On the evaluation of agent-oriented methodologies. In: Proc. of the OOPSLA 2002 Workshop on Agent-Oriented Methodologies.
  • L. Cernuzzi, M. Cossentino, F. Zambonelli (2005). Process Models for Agent-Based Development. Engineering Applications of Artificial Intelligence (EAAI), Elsevier (in press?).

  6. Questionnaire
  • Originally from Michael Winikoff and modified by Massimo Cossentino
  • Aim: assess an AOSE methodology against a range of criteria. The criteria fall into a number of areas (see the data sketch below):
    • Concepts/properties: the ideas that the methodology deals with, basically the ontology
    • Modelling: the models that are constructed and the notations used to express the models
    • Process: the phases and steps that are followed as part of the methodology
    • Pragmatics: practical issues that arise when adopting a methodology (e.g., the availability of training materials and courses, the existence and cost of tools, etc.)
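As an illustration of this structure, the sketch below models the four criteria areas and a filled-in answer sheet as simple data types. It is a hypothetical representation, not tooling from the slides; all names (Question, Questionnaire, etc.) are invented for the example.

```python
# A minimal sketch (assumed, not from the slides) of the questionnaire's
# structure: questions grouped into four criteria areas, each answered on a
# fixed scale, per methodology and per respondent role.
from dataclasses import dataclass, field

AREAS = ("Concepts/Properties", "Modelling", "Process", "Pragmatics")

@dataclass
class Question:
    area: str     # one of AREAS
    text: str
    scale: tuple  # allowed answers, e.g. ("SD", "D", "N", "A", "SA")

@dataclass
class Questionnaire:
    methodology: str
    respondent_role: str                          # e.g. "creator" or "user"
    answers: dict = field(default_factory=dict)   # question text -> answer

q = Question(area="Pragmatics",
             text="Training materials and courses are available",
             scale=("SD", "D", "N", "A", "SA"))
sheet = Questionnaire(methodology="INGENIAS", respondent_role="creator")
sheet.answers[q.text] = "A"
```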

  7. Questionnaire
  • Answers from:
    • ADELFE (Carole Bernon, creator)
    • INGENIAS (Jorge Gómez-Sanz & Juan Pavón, creators)
    • OPEN Process Framework (OPF) (Brian Henderson-Sellers, creator)
    • Prometheus-ROADMAP (Lin Padgham, creator)
    • Gaia (Giancarlo Fortino / Alfredo Garro: users!!!)
    • PASSI (M. Cossentino: creator; L. Sabatucci, V. Seidita, PhD students: users, doing research on it; 8 graduating students: users)
    • TROPOS (3 students)
  • Others are always welcome!!!
  • Answers from users (rather than creators) can provide a more critical view of methodologies

  8. Questionnaire
  • Looking at the results of the questionnaire
  • It can be useful to consider changes to the questionnaire:
    • Subjective interpretation of questions and answers
    • Not-applicable questions
    • Missing questions
    • Useful? Clarifying?
  • Identification of methodology challenges
  • Let's see what the results are and discuss…

  9. Questionnaire – Concepts & Properties
  [Results table, by respondent (creator / PhD students / graduating students). Legend: N: None, L: Low, M: Medium, H: High]

  10. Questionnaire – Concepts & Properties
  [Results table. Legend: SD: Strongly Disagree, D: Disagree, N: Neutral, A: Agree, SA: Strongly Agree]
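To compare such tables across methodologies, one simple approach is to map each rating scale to numbers and average per methodology. The mappings below are an assumption for illustration; the slides do not define any numeric scoring.

```python
# Hedged sketch: possible numeric mappings for the two rating scales used in
# the result tables. These mappings are assumed; the questionnaire itself
# does not prescribe any scoring.
LIKERT = {"SD": 1, "D": 2, "N": 3, "A": 4, "SA": 5}   # agreement scale
LEVEL = {"N": 0, "L": 1, "M": 2, "H": 3}              # None/Low/Medium/High scale

def mean_score(answers, scale):
    """Average the answers that fall on the given scale; 'NA' is skipped."""
    values = [scale[a] for a in answers if a in scale]
    return sum(values) / len(values) if values else None

# e.g. one methodology's answers to a block of agreement-scale questions
print(mean_score(["A", "SA", "N", "NA", "A"], LIKERT))  # 4.0
```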

  11. Questionnaire – Modelling & Notation
  Respondent comment: "A methodology is really notation-independent. Yes, there is a need for a modelling language, and in the FAME project we have FAML (the FAME modelling language), although not yet a notation. So we can't really answer these notation-specific questions (i.e., 21-26)."
  [Results table, by respondent (creator / PhD students / graduating students). Legend: SD: Strongly Disagree, D: Disagree, N: Neutral, A: Agree, SA: Strongly Agree, NA: Not Applicable]

  12. Questionnaire – Modelling & Notation
  [Results table. Legend: SD: Strongly Disagree, D: Disagree, N: Neutral, A: Agree, SA: Strongly Agree]

  13. Questionnaire – Process
  [Results table. Legend: C: Clear definition of activities, E: Examples given, H: Heuristics given, P: Partial]

  14. Questionnaire – Process
  [Results table. Legend: SD: Strongly Disagree, D: Disagree, N: Neutral, A: Agree, SA: Strongly Agree]

  15. Questionnaire – Pragmatics
  [Results table. Legend: SD: Strongly Disagree, D: Disagree, N: Neutral, A: Agree, SA: Strongly Agree]

  16. Questionnaire – Pragmatics
  [Results table]

  17. Evaluation framework revisited
  • Taking the experience of this questionnaire:
    • Review the evaluation framework criteria and their organization
    • Review the evaluation method: questionnaire, case study development, ...
    • Refine the questionnaire
    • Define case studies
    • Review metrics
    • How to avoid subjectivity (one simple measure is sketched below)
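One standard way to make the subjectivity concern measurable is inter-rater agreement between respondents who rated the same methodology, e.g., its creator versus its users. The sketch below computes plain percent agreement; a chance-corrected statistic such as Cohen's kappa would be a natural refinement. All data values are invented.

```python
# Illustrative sketch (not from the slides): percent agreement between two
# respondents answering the same questions about one methodology. Low
# agreement flags questions whose interpretation is subjective.
def percent_agreement(ratings_a, ratings_b):
    """Fraction of questions on which the two respondents answered the same."""
    assert len(ratings_a) == len(ratings_b)
    same = sum(a == b for a, b in zip(ratings_a, ratings_b))
    return same / len(ratings_a)

creator_view = ["SA", "A", "A", "N", "SA"]  # invented example answers
user_view = ["A", "A", "N", "N", "A"]
print(percent_agreement(creator_view, user_view))  # 0.4
```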

  18. Evaluation framework revisited
  Criteria for AOSEM evaluation:
  • Modelling: autonomy, society, …; abstraction; modularity; domain-specific concepts; knowledge/skills; scalability
  • Process: deliverables; activities; team work; domain-specific methods
  • Tools: features; complexity; domain
  • Pragmatics
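For the refinement and extension steps discussed on slide 20, the same criteria tree can be kept as data. The grouping of sub-criteria below follows the slide's flattened diagram and is partly an assumption about its original layout.

```python
# The slide's criteria diagram kept as a nested structure, so criteria can
# be refined, extended, or marked not applicable programmatically. The
# grouping of sub-criteria is partly an assumption about the diagram layout.
CRITERIA = {
    "Modelling": ["autonomy, society, ...", "abstraction", "modularity",
                  "domain-specific concepts", "knowledge/skills", "scalability"],
    "Process": ["deliverables", "activities", "team work",
                "domain-specific methods"],
    "Tools": ["features", "complexity", "domain"],
    "Pragmatics": [],
}
```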

  19. Towards an AOSEM evaluation framework
  • The evaluation framework should allow:
    • Criteria refinement and extensions
    • Criteria metrics depending on the domain (see the weighting sketch below)
      • E.g., agents in a web service or in robotics
    • Definition of standard case studies for evaluation
      • Evaluating documentation and filling in questionnaires is not enough
    • …
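One way to read "criteria metrics depending on the domain" is domain-specific weighting of the same criterion scores. The weights and scores below are invented purely to illustrate the idea.

```python
# Hypothetical illustration of domain-dependent metrics: the same criterion
# scores combined with different weights per application domain. All numbers
# are invented for the example.
WEIGHTS = {
    "web services": {"Modelling": 0.2, "Process": 0.3, "Tools": 0.3, "Pragmatics": 0.2},
    "robotics": {"Modelling": 0.4, "Process": 0.2, "Tools": 0.2, "Pragmatics": 0.2},
}

def weighted_score(criterion_scores, domain):
    """Combine per-criterion scores (in [0, 1]) using the domain's weights."""
    weights = WEIGHTS[domain]
    return sum(weights[c] * s for c, s in criterion_scores.items())

scores = {"Modelling": 0.8, "Process": 0.6, "Tools": 0.5, "Pragmatics": 0.7}
print(weighted_score(scores, "robotics"))  # ~0.68
```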

  20. Towards an AOSEM evaluation framework
  • The framework can be based on the definition and use of evaluation models:
    • Case studies for putting the methodologies to work
    • Organized by criteria
    • For each criterion, define metrics
    • Criteria can be refined to gain more insight or to be more specific
      • For instance, agent behaviour, depending on whether a BDI, neural network, CBR, reactive, or other model is used
    • New criteria can be added
    • Some criteria may be considered not applicable
    • Associate criteria to case studies (see the sketch below)
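A minimal sketch of such an evaluation model, under assumed names: each criterion carries a metric, and each case study lists the criteria it exercises. The artefact dict stands in for whatever a methodology produces when applied to the case study.

```python
# Minimal sketch (assumed names throughout) of an evaluation model: criteria
# carry metrics, and case studies list the criteria they exercise.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Criterion:
    name: str
    metric: Callable[[dict], float]  # maps an evaluation artefact to a score

@dataclass
class CaseStudy:
    name: str
    criteria: list  # the criteria this case study exercises

# A refined criterion, e.g. agent behaviour specialized to BDI models
behaviour = Criterion("agent behaviour (BDI)",
                      metric=lambda artefact: artefact.get("bdi_coverage", 0.0))
auction = CaseStudy("online auction (open system)", criteria=[behaviour])

artefact = {"bdi_coverage": 0.75}  # stands in for a methodology's output
for criterion in auction.criteria:
    print(criterion.name, "->", criterion.metric(artefact))
```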

  21. Outline and plan for document on AOSEM evaluation framework
  • Outline
  • Participants
  • Plan
