
Recording the Context of Action for Process Documentation



Presentation Transcript


  1. Recording the Context of Action for Process Documentation Ian Wootten Cardiff University, UK I.M.Wootten@cs.cf.ac.uk

  2. Context
  • Definitions:
    • Circumstances forming the setting for an event, statement or idea [Oxford English Dictionary, 2008]
    • User environment elements a computer knows about [Brown, 1995]
    • Characterisation of the situation of entities [Dey and Abowd, 2000]
  • Properties which can support/dispute evidence of actions
    • More than component interaction
    • More informed judgements can be made
  • Subjective in nature
  • Ad-hoc documentation between applications
    • Records of data with unknown relationships could be useful
    • May help at a later date

  3. Process Distinction
  • Provenance is about processes
    • “The process which led to some data” [Groth et al. 2006]
    • Sequences of actions
    • How did this come to be the way it is?
  • Achieved by:
    • Documenting relationships, component interaction
    • Evidence
  • If actions in a process are the same, locating distinct traces becomes more difficult
    • e.g. I invoke this workflow multiple times: are any records unique? Were they performed in different situations?

  4. Context Uses
  • Automatic assertion in legacy actors
    • e.g. long-running data mining services
  • Prediction of future actor properties
    • Record context and actions
    • Probabilistic model constructed
  • Similarity of past process traces
    • Context recorded and compared for two provenance traces
  • And others…

  5. Documenting Process
  • Cannot answer all provenance queries with documentation of interaction alone
    • e.g. What was happening to cause such behaviour? Why does execution of the same workflow result in different execution times? How do we know an action is subject to the same conditions?
  • We know nothing of the context under which assertions are made
  • Answers can be given by entities themselves (e.g. using PReServ)
  • Particular focus on deriving context from measurable values

  6. Time Series Knowledge Representation (TSKR)
  • Properties and states for an actor are represented using the TSKR [Moerchen 2006]
  • Series extracted from several numerical variables
    • Segmentation, shape-based
  • Coincidence intervals found
    • Resultant series shows time intervals when multiple conditions occur (states)
  • Monitored variables specified by service administrator
  • States represented in transition table
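The pipeline on this slide can be sketched in a few lines: segment each monitored variable into labelled intervals, combine per-variable labels into composite states (coincidences), and count state transitions. This is a minimal illustration, not Moerchen's full TSKR mining; the variables and the threshold-based segmentation are hypothetical.

```python
def segment(series, threshold):
    """Label each sample 'high' or 'low' relative to a threshold
    (a stand-in for TSKR's shape-based segmentation)."""
    return ["high" if v >= threshold else "low" for v in series]

def coincidences(*labelled):
    """Combine per-variable labels into one composite state per step."""
    return [tuple(labels) for labels in zip(*labelled)]

def transition_table(states):
    """Count observed transitions between successive states."""
    table = {}
    for prev, nxt in zip(states, states[1:]):
        table.setdefault(prev, {})
        table[prev][nxt] = table[prev].get(nxt, 0) + 1
    return table

# Hypothetical monitored variables for one actor.
cpu = [0.2, 0.8, 0.9, 0.3, 0.7]
mem = [0.5, 0.6, 0.9, 0.4, 0.8]

states = coincidences(segment(cpu, 0.5), segment(mem, 0.5))
table = transition_table(states)
```

Each composite state here is a tuple such as `('high', 'high')`, i.e. a time interval in which several conditions hold at once, and `table` is the transition table the slide refers to.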

  7. Documenting Context
  • Provide a mechanism to specify and automatically record environmental context for any application
  • Capture using process documentation as assertions of actor state, using PReServ
  • Operate according to a particular owner-defined policy
    • Triggered on service execution through service wrappers
    • Reuse existing monitoring resources (Ganglia, Nagios) through plug-ins
  • Our policy configuration
    • Gathers monitoring data and mines states using TSKR
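A service wrapper of the kind described above might look as follows. This is a hedged sketch only: `gather_metrics`, `documentation`, and `with_context` are illustrative names, not the PReServ API or an actual Ganglia/Nagios plug-in.

```python
import time
from functools import wraps

def gather_metrics():
    """Stand-in for a monitoring plug-in (e.g. a Ganglia/Nagios reader)."""
    return {"timestamp": time.time(), "load": 0.42}

# Stand-in for the process documentation store (PReServ in the talk).
documentation = []

def with_context(service):
    """Wrapper triggered on service execution: records actor state
    alongside the interaction itself."""
    @wraps(service)
    def wrapper(*args, **kwargs):
        documentation.append({"type": "actor-state", "state": gather_metrics()})
        result = service(*args, **kwargs)
        documentation.append({"type": "interaction", "result": repr(result)})
        return result
    return wrapper

@with_context
def my_service(x):
    return x * 2

my_service(3)
```

The point of the design is that the service itself stays unchanged; the owner-defined policy decides which metrics the wrapper gathers and when.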

  8. Experimentation
  • Ran two services from the Provenance Challenge 1000 times
    • Context recorded as actor state assertions
    • Action recorded as interaction assertions
  • TSKR series patterns can be used for comparison of states
    • Where a series is segmented
    • Where vast collections of data need to be explored
    • Based upon:
      • Context component distances
      • Maximum distance possible
  • TSKR transition history mapped to a transition table
    • Used as a predictive tool
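The two uses above, prediction from a transition table and similarity from component distances, can be illustrated as below. The measures are assumptions for the sketch (most-frequent successor; distances normalised by the maximum possible distance), not the exact ones used in the experiments.

```python
def predict_next(table, state):
    """Most frequently observed successor of `state`, or None."""
    successors = table.get(state)
    if not successors:
        return None
    return max(successors, key=successors.get)

def similarity(ctx_a, ctx_b, max_dist):
    """1 - (sum of per-component distances / maximum possible distance)."""
    dist = sum(abs(ctx_a[k] - ctx_b[k]) for k in ctx_a)
    return 1.0 - dist / max_dist

# Hypothetical transition history for one actor.
table = {"busy": {"busy": 7, "idle": 3},
         "idle": {"idle": 9, "busy": 1}}
nxt = predict_next(table, "busy")

# Two context records; both components range over [0, 1], so the
# maximum possible distance is 2.0.
a = {"cpu": 0.9, "mem": 0.8}
b = {"cpu": 0.7, "mem": 0.4}
score = similarity(a, b, max_dist=2.0)
```

A score near 1 would indicate two traces captured under very similar conditions, even when the recorded interactions are identical.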

  9. Prediction Results
  • Three approaches:
    • State prediction (TSKR)
    • Random (with history)
    • Random

  10. Similarity Results
  • Indicates small subsets of documentation

  11. Conclusions
  • Context helps to understand evidence
    • For processes realised using SOA, understanding records of action
    • Actions may be the same but performed in different circumstances
  • TSKR is a good fit for context measurement
  • Registry approach assists in context capture
    • Automates the collection of actor state
  • Demonstrated as:
    • Good predictor of state
    • Useful identification of state properties
