
Feedback on OPM






Presentation Transcript


  1. Feedback on OPM. Yogesh Simmhan, Microsoft Research. Synthesis of pairwise conversations with: Roger Barga and Satya Sahoo (Microsoft Research); Beth Plale and Abhijit Borude (Indiana University).

  2. Roles • The use of “role” annotations in OPM is not well defined… • Named relationships are used as first-class objects, as defined in the RDF model • Roles affect the way inferences are made • Are roles semantically meaningful or not?

  3. (Diagram: two versions of the OPM baking example. The process Bake, controlled by John, has edges used(flour) to Flour, used(eggs) to an egg artifact, wasGeneratedBy(cake) from Cake, and a wasGeneratedBy(unused) edge. The first version labels the egg artifacts Eggs(1) and Eggs(3); the second uses a collection Eggs(A, B, C) with member Eggs(A).)
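The baking example can be sketched as in-memory triples. This is a minimal illustration, not an official OPM API: the edge list, the role strings, and the helper name `inputs_of` are all invented here. It shows the design question from the Roles slide: roles kept as annotations on a generic `used` edge, rather than folded into role-specific first-class predicates.

```python
# Hypothetical in-memory encoding of the baking provenance graph.
# Roles are annotations on generic edges, not separate predicates.
edges = [
    # (source, edge type, target, role annotation)
    ("Bake", "used", "Flour", "flour"),
    ("Bake", "used", "Eggs(A)", "eggs"),
    ("Cake", "wasGeneratedBy", "Bake", "cake"),
    ("Bake", "wasControlledBy", "John", "baker"),
]

# Because "used" stays one generic edge type, a single query finds all
# inputs regardless of role; a role-per-predicate design ("usedFlour",
# "usedEggs") would force the query to enumerate every role.
def inputs_of(process):
    return sorted(t for (s, e, t, _) in edges if s == process and e == "used")

print(inputs_of("Bake"))  # ['Eggs(A)', 'Flour']
```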

  4. Accounts • Composite processes identified in OPM • Different granularity? • Different “views” (client vs. service)? • Service/workflow composition using alternate accounts? • Should we specify composition more explicitly in edges, as edge types? Subclasses? (Diagram: Baker, Customer A, Customer B, and Observer views of the Baking process as alternate accounts.)

  5. Data Collections • OPM does not seem to support the idea of granularity for data products • Alternate accounts are more suited to process granularity, less so to data granularity • Process types for data de/compositions? Subclasses?
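One workaround for the missing collection construct can be sketched as follows. The `wasPartOf` edge type is hypothetical (it is not defined by OPM); the sketch only shows how member artifacts could be related to a collection artifact such as Eggs(A,B,C).

```python
# Hypothetical "wasPartOf" edges relating member artifacts to a
# collection artifact -- a workaround, not part of the OPM model.
was_part_of = [
    ("Eggs(A)", "Eggs(A,B,C)"),
    ("Eggs(B)", "Eggs(A,B,C)"),
    ("Eggs(C)", "Eggs(A,B,C)"),
]

def members(collection):
    """Decompose a collection artifact into its member artifacts."""
    return sorted(m for (m, c) in was_part_of if c == collection)

print(members("Eggs(A,B,C)"))  # ['Eggs(A)', 'Eggs(B)', 'Eggs(C)']
```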

  6. Annotations • Causality is not the only relationship between provenance entities • Relevant domain-specific relationships are needed to answer a scientist’s query • Are subclasses a stronger form of annotations? How are they different? • Subclasses are part of the model; annotations depend on the representation? Extensibility mechanisms?

  7. Representation/Serialization • OPM maps exactly to the Resource Description Framework (RDF), the W3C-recommended standard for representing metadata • An OPM graph is essentially an RDF graph with different naming • XML, RDF, CSV…
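The mapping to RDF can be sketched by serializing OPM edges as N-Triples. The namespace `http://example.org/opm#` is made up for illustration; no official OPM vocabulary URIs are assumed.

```python
# Serialize OPM edges as RDF N-Triples, using a placeholder namespace.
OPM = "http://example.org/opm#"

def n_triple(subject, predicate, obj):
    # One RDF statement per OPM edge: subject, predicate, object.
    return f"<{OPM}{subject}> <{OPM}{predicate}> <{OPM}{obj}> ."

edges = [("Bake", "used", "Flour"), ("Cake", "wasGeneratedBy", "Bake")]
for s, p, o in edges:
    print(n_triple(s, p, o))
```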

  8. Time • The OPM approach of using a time interval to represent an instantaneous event is not well defined • The query result will vary based on the granularity of <t> values • The accuracy of timestamps affects inference • Logical timestamps? • Do we need time ranges? • Long-running processes (provenance is “past”, not “current”)…
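The granularity point can be made concrete with a small sketch: two events ordered at second granularity become indistinguishable when timestamps are truncated to the minute, so any inference that relies on their order changes. The timestamp values are invented for illustration.

```python
from datetime import datetime

# Two events with second-granularity timestamps (values are invented).
t_used = datetime(2009, 6, 1, 10, 15, 30)       # P1 used A1
t_generated = datetime(2009, 6, 1, 10, 15, 45)  # A2 wasGeneratedBy P1

def to_minute(t):
    # Truncate a timestamp to minute granularity.
    return t.replace(second=0, microsecond=0)

print(t_used < t_generated)                        # True: order is visible
print(to_minute(t_used) < to_minute(t_generated))  # False: order is lost
```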

  9. Agent • A loose form of control flow? • The workflow engine? • The command line invoking the workflow engine? • The researcher who starts the command line? • The previous component that triggers the next component? • Where do we use TriggeredBy and where do we use ControlledBy?

  10. (Diagram: alternative assignments of the Agent role among User, Client, WF Engine, and Service for a workflow run over a WF document, input data, and output data; question marks indicate the unclear choice of agent edges.)

  11. Vagueness in Inferences • Edge count limits? • Weak and strong semantics: • P1 used A1 • Strong: P1 MUST have used A1 • Weak: P1 MAY have used A1 • P1 used A1; A2 wasGeneratedBy P1 • Strong: A2 MUST have been derived from A1 • Weak: A2 MAY have been derived from A1 • Weak semantics are the lowest common denominator • mayHaveBeenUsed <= mustHaveBeenUsed… subclass?
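The weak-semantics inference above can be sketched directly: from "P1 used A1" and "A2 wasGeneratedBy P1" we may only conclude that A2 MAY have been derived from A1, never that it MUST. The function name and the set-of-pairs encoding are invented for illustration.

```python
# Hypothetical encoding of the two base assertions from the slide.
used = {("P1", "A1")}              # (process, artifact)
was_generated_by = {("A2", "P1")}  # (artifact, process)

def may_have_been_derived_from(artifact, source):
    # Weak inference: A may derive from S if some process that
    # generated A also used S. Nothing stronger can be concluded.
    return any(
        (process, source) in used
        for (a, process) in was_generated_by
        if a == artifact
    )

print(may_have_been_derived_from("A2", "A1"))  # True
```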
