
The SEALS Platform for Scholarly Research





Presentation Transcript


  1. The SEALS Platform for Scholarly Research. Asun Gómez-Pérez, asun@fi.upm.es, Ontology Engineering Group, Universidad Politécnica de Madrid. Acknowledgements to Miguel Esteban Gutiérrez, Raul García-Castro, Nandana Mihindukulasooriya, and the SEALS consortium. http://www.seals-project.eu/

  2. Requirements
  • Asun Gómez-Pérez: a kind of system that helps me to reproduce the experiments that appear in research papers and to compare my technology with other technologies.
  • Ed Hovy: a kind of framework that graphically depicts the whole scholarly environment/workflow; on this, a set of markers, like Post-it notes, that each indicate a technology available for use at the relevant point; also on this, another set of Post-it notes that each indicate some kind of bottleneck or gap.

  3. Scenario: scholarly workflows. [Diagram: several workflows built from tools (Tool1–Tool7), each tool marked as suitable (✔) or unsuitable (✖).] How do I know which tools are appropriate? By reproducing experiments (unbiased, repeatable, …) that combine tools, test data, evaluations, and results with my tool and my results.

  4. Heterogeneity in the tools to be compared
  • Data sets: levels of complexity; scalability; synthetic, hand-crafted, and real-world data.
  • Tools: execution environment; functionalities.
  • Criteria to evaluate: scalability, interoperability, etc.
  • Metrics: precision/recall, information loss of data, etc.
  • Evaluations: fully automatic; the user is in the loop; hybrid approaches; extrinsic and intrinsic.
  • Starting point: from scratch, or extend and adapt preliminary approaches.
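The metrics named above can be made concrete with a small sketch. The following is a minimal, illustrative implementation of precision and recall over sets of results; the function name and the example items are hypothetical and not part of the SEALS platform itself.

```python
def precision_recall(retrieved, relevant):
    """Precision and recall of a tool's output against a gold standard.

    retrieved: items the tool returned
    relevant:  items in the reference (gold-standard) answer set
    """
    retrieved, relevant = set(retrieved), set(relevant)
    true_positives = len(retrieved & relevant)
    precision = true_positives / len(retrieved) if retrieved else 0.0
    recall = true_positives / len(relevant) if relevant else 0.0
    return precision, recall

# Example: a tool returns 4 answers, 3 of them correct, out of 6 expected.
p, r = precision_recall({"a", "b", "c", "d"}, {"a", "b", "c", "e", "f", "g"})
# p == 0.75, r == 0.5
```

Guarding the empty-set cases matters in practice, since a tool that returns nothing would otherwise divide by zero rather than score 0.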

  5. Dimensions of evaluation design: tools selection, tool features selection, execution requirements selection, criteria selection, goals selection, assumptions selection, metrics selection, interpretation selection, test data selection, evaluation services implementation, and evaluation workflows implementation.

  6. SEALS platform for technology evaluation. SEALS Platform features:
  • Open: everybody can use it.
  • Scalable: to users and data size.
  • Extensible: to more tests, different technology, more measures.
  • Sustainable: beyond SEALS.
  • Independent: unbiased.
  • Repeatable: evaluations can be reproduced.

  7. The SEALS platform. [Diagram: the SEALS Platform manages test data, tools, evaluations, and results, and feeds results comparisons and the Semantic Technology Recommendation Framework.]

  8. SEALS logical architecture. Evaluation organisers, technology providers, and technology adopters access the platform through the SEALS Portal or via software agents. The SEALS Service Manager coordinates the Runtime Evaluation Service and the SEALS Repositories: the Test Data, Tools, Results, and Evaluation Descriptions repository services. Virtualization provides two execution environments.
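The relationship between the Service Manager and the four repository services can be sketched as follows. This is an in-memory stand-in for illustration only: the real SEALS components are remote services, and the class and method names here are assumptions, not the platform's actual API.

```python
class RepositoryService:
    """Stand-in for one SEALS repository service (hypothetical interface)."""

    def __init__(self, kind):
        self.kind = kind          # e.g. "tools", "test-data", "results"
        self._items = {}

    def store(self, item_id, metadata):
        """Register an artifact with its metadata."""
        self._items[item_id] = metadata

    def retrieve(self, item_id):
        """Look up a previously stored artifact."""
        return self._items[item_id]


class ServiceManager:
    """Sketch of the SEALS Service Manager coordinating the four repositories."""

    def __init__(self):
        self.repositories = {
            kind: RepositoryService(kind)
            for kind in ("test-data", "tools", "results", "evaluation-descriptions")
        }


manager = ServiceManager()
manager.repositories["tools"].store("tool1", {"name": "Tool1", "version": "1.0"})
tool = manager.repositories["tools"].retrieve("tool1")
```

The point of the sketch is the uniform interface: a technology provider uploads a tool, an evaluation organiser uploads test data and evaluation descriptions, and results land in a repository of their own, all behind the same manager.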

  9. Evaluation execution process at 10,000 ft. [Diagram of the evaluation execution process.]

  10. Evaluation services:
  • Reproduce experiments: evaluate my tool against the platform's test data and evaluations, and compare my results with the stored results.
  • Exploit results: use your own dataset to evaluate others' tools with the stored evaluations.
  • Define your own experiment: my tool, my test data, my evaluation, my results.
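The three scenarios above differ only in which ingredients the researcher supplies and which come from the SEALS repositories. A minimal sketch, with a placeholder execution function and entirely hypothetical names:

```python
def run_evaluation(tool, test_data, evaluation):
    """Placeholder for the platform's evaluation execution (illustrative only)."""
    return {"tool": tool, "test_data": test_data, "evaluation": evaluation,
            "results": f"results of {tool} on {test_data}"}

# Scenario 1 — reproduce experiments: everything comes from the repositories.
reproduced = run_evaluation("RepoTool", "RepoData", "RepoEval")

# Scenario 2 — exploit results: your own dataset, others' tools and evaluations.
own_data = run_evaluation("RepoTool", "MyData", "RepoEval")

# Scenario 3 — define your own experiment end to end.
own_experiment = run_evaluation("MyTool", "MyData", "MyEval")
```

Seen this way, the platform's value is that all three calls share one execution path, so results stay comparable regardless of who supplied each ingredient.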

  11. Achievements (23.09.2014):
  • Specification of the platform architecture: Repository Front-end, Execution, Infrastructure Management, and Platform Administration subsystems.
  • Implementation and evaluation of the repositories: the Test Data, Tools, Results, and Evaluation Descriptions repository services.
  • Evaluation workflows implemented in BPEL.
  • SEALS entities (datasets, tools, evaluations, and results) are described using ontologies: http://www.seals-project.eu/ontologies/
  • Results comparisons and the Semantic Technology Recommendation Framework.
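To make the ontology-based descriptions concrete, here is a small sketch that emits an RDF/Turtle description of a tool. The `seals:` prefix, class, and property names below are assumptions for illustration; the actual vocabulary lives under http://www.seals-project.eu/ontologies/ and may use different terms.

```python
def describe_tool(tool_uri, name, version):
    """Return a Turtle snippet describing a tool entity.

    The seals: vocabulary used here is hypothetical, chosen only to
    illustrate how a SEALS entity could be described in RDF.
    """
    return (
        "@prefix seals: <http://www.seals-project.eu/ontologies/seals#> .\n"
        f"<{tool_uri}> a seals:Tool ;\n"
        f'    seals:name "{name}" ;\n'
        f'    seals:version "{version}" .\n'
    )

snippet = describe_tool("http://example.org/tools/tool1", "Tool1", "1.0")
print(snippet)
```

Describing datasets, evaluations, and results the same way is what lets the repositories query and link the four kinds of entities uniformly.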

  12. SEALS support to scholarly workflows. [Diagram: a Scholarly Portal over the scholarly workflows (tools marked ✔/✖ as in slide 3) connects to the SEALS Service Manager, the Runtime Evaluation Service, and the SEALS Repositories, extended with a Scholarly Workflow Repository alongside the Test Data, Tools, Results, and Evaluation Descriptions repositories.]

  13. Thank you for your attention!
