
Monitoring



  1. Monitoring 27-Feb-08 - H. Vincent - TCF - Template for slideshows

  2. Outline • Step 1: SemEUsE • Objective • Context • Requirements • Features • Architecture • Step 2: constraint resolution (simple) • Step 4: configurability of the monitoring architecture • (the steps are those indicated in Philippe's document)

  3. Objective • SemEUsE's deliverable (step 1): • a monitoring service for domain-specific QoS data

  4. Use case: fire fighting (pervasive computing). [Diagram: a business process (select fire-men, select policies, select trucks) performs late binding over applications 1…n, which pull QoS properties (fire-men, policies, trucks, …) from a data collector application built on a data mediation framework; the collector pushes/pulls domain data (fuel level of truck #i, heart frequency of fire-man #i, …) from sensors behind gateways at Vizilles, Eybens and Gières.]

  5. [Diagram: M4LB scope. The business process's late-binding applications pull from a data collector application (local or distributed mediation based on a data mediation framework) that exposes M4LB views; the M4LB handles data buffering, data coherency and data adaptation. On each gateway, sender(s) and probe(s) push/pull data; local mediation produces reports on truck #i (fuel level, …) and fire-man #i (heart frequency, …) from the sensors at Vizilles, Eybens and Gières.]

  6. Requirements • Late binding requirements: • R1: Rapid access: Late Binding decisions require rapid access to QoS data (monitoring should not block process execution); • R2: Data coherency/QoI: Late Binding decisions require a temporal correlation between measured QoS data of different QoS dimensions, collected from probes associated with the candidate services; • R3: Domain-specific QoS: Late Binding is interested in the instantaneous QoS level of a service, which depends on technical indicators (e.g. response time) and on domain-specific indicators (e.g. the cardiac rhythm of a fireman); • R4: Data adaptation: Late Binding requires an adaptation of the QoS data provided by the data source when the metrics of the QoS dimensions differ. • Furthermore, we consider two optimizations: • O1: Share QoS data across multiple Late Binding processes. • O2: Aggregate QoS data at the data source.

  7. Features • Data buffer: a local cache providing rapid access (R1) and shared QoS data (O1) • QoI/data coherency (R2): age (A) and coherency (C). Example: select the first tuple conforming to the QoI by a sliding window (di.ts is the timestamp of QoS datum di, t is the current time). [Diagram: Late Binding queries the Monitoring APIs; a buffer system (speed-buffer, essence-buffer, water-buffer) is fed by probes from the truck's sensors (speed, essence, water); the window slides back from `now`, here with coherency = 10 s and a valid data age of 1 min.]
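Slide 7's sliding-window selection can be sketched as follows. This is a minimal illustration, not the project's implementation: the `Reading` record, the newest-first buffer layout, and the exhaustive scan over tuple combinations are all assumptions made for clarity.

```python
from dataclasses import dataclass
from itertools import product

@dataclass
class Reading:
    value: float
    ts: float  # timestamp of the QoS datum (di.ts on the slide)

def select_coherent_tuple(buffers, now, max_age, max_coherency):
    """Slide a window over the per-dimension buffers (newest first) and
    return the first tuple whose readings are fresh enough (age) and
    close enough to each other in time (coherency)."""
    dims = list(buffers)
    for combo in product(*buffers.values()):
        stamps = [r.ts for r in combo]
        age = now - min(stamps)                # age of the oldest reading
        coherency = max(stamps) - min(stamps)  # temporal spread of the tuple
        if age <= max_age and coherency <= max_coherency:
            return dict(zip(dims, combo))
    return None  # no tuple in the buffers satisfies the requested QoI
```

With speed and essence buffers for a truck, the first (newest) tuple is returned as soon as its readings lie within the age and coherency bounds, which is what lets the cache answer without blocking the late-binding process (R1).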

  8. Features • Data adaptation (R4): different metrics between providers and consumers • Aggregation of QoS data at the data source (O2): saves bandwidth and connections. Example: aggregate multiple QoS elements from two sensors into a batch and periodically push the updates to the monitoring. [Diagram: after semantic contract negotiation and semantic matching, the monitoring (data buffering, processing, disassociation processing) serves Late Binding over QoS probes; on the gateway, an adaptor generator dynamically generates code for a data adaptor that batches remote data updates from the sensors of the data provider.]

  9. Dataflow example • Dataflow co-localization (R1). [Diagram: the Late Binding consumer reads non-blocking from views (coherency filter + buffer) for speed and essence on the truck gateway; a temperature view pulls, through an adaptor converting temperature (°F) to temperature (°C), from the weather data provider; views, adaptors, buffers and probes are Fractal components; probes push data from the providers, which block on the sensors.]

  10. Architecture • View: receives QoS data queries and wraps a QoI processor; parameters: coherency, age • Abstract view builder: handles sharing; parameter: age • Abstract probe; parameter: period • The Strategy pattern is applied to the processing components: QoI processor, data buffer, data adaptation.
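The slide names the Strategy pattern for the processing components; a minimal sketch of what that looks like for the QoI processor is below. The class names (`QoIProcessor`, `AgeFilter`, `AcceptAll`, `View`) are illustrative assumptions, not the actual component names.

```python
from abc import ABC, abstractmethod

class QoIProcessor(ABC):
    """Strategy interface: a view delegates QoI checking to a pluggable processor."""
    @abstractmethod
    def accept(self, reading_age): ...

class AgeFilter(QoIProcessor):
    """Concrete strategy: keep only readings younger than a maximum age."""
    def __init__(self, max_age):
        self.max_age = max_age
    def accept(self, reading_age):
        return reading_age <= self.max_age

class AcceptAll(QoIProcessor):
    """Concrete strategy: no QoI constraint."""
    def accept(self, reading_age):
        return True

class View:
    def __init__(self, strategy):
        self.strategy = strategy  # strategies swap without touching View
    def filter(self, ages):
        return [a for a in ages if self.strategy.accept(a)]
```

The same shape would apply to the data buffer and data adaptation components: the view holds an abstract strategy and concrete policies are substituted at configuration time.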

  11. Step 1: SemEUsE • Step 2: constraint resolution (simple) • Objective & requirements • Roles • Definitions • Approach • Example • Step 4: configurability of the monitoring architecture

  12. Objective • Objective – step 2: centralized QoI coherency • Monitoring scope: as before (slide 4), the M4LB is co-located with its client

  13. Roles [Diagram: data consumers request data from the M4LB (data acquisition, buffering, processing), which is configured by an administrator and fed by data providers.]

  14. Requirements • New requirement, in addition to the previous ones: • R5: QoI-aware data requests: multiple data consumers require different QoI • Each data consumer requires a different, explicit QoI. • Each data consumer relates to multiple services → a "multiple data consumers – multiple services" relationship (a graph). • Target role: data consumer • Allow data consumers to request different QoI • Examples: • QoI-aware user preferences ("extended" fire fighting), e.g. Late Binding specifies the QoI explicitly in a data request • Monitoring levels (normal, detailed) for fault diagnostics [4] • Classes of users (gold, silver, bronze), à la QMON [5]

  15. Definitions • Network latency (NL): the time to propagate a new update from a sensor to the M4LB • Computational latency (CL): the time to process data inside the M4LB; assumption: computational latency is negligible in a co-located context • Bandwidth (B): B = (no. of messages per time unit) × mean(message size); B is proportional to the update frequency, i.e. to 1/(update period) • Age (A) & update period (UP): see slide 7 (age, coherency). AP is the age offered by a probe, AV the age required by a view: AP = UP + NLmax (1) • Notes: a sensor is a remote element that generates a QoS data stream; a probe is an M4LB element that remotely synchronizes with a sensor or another probe; a view is an M4LB element that is an access point for its clients and represents a data request. [Timeline: an element changes at sensor s, then is updated at the M4LB after the network latency; its age at the M4LB grows until the next update, bounded by the update period plus the network latency.]
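Equation (1) and the satisfaction test it implies can be written out directly. A small sketch, with hypothetical function names:

```python
def age_offered(update_period, nl_max):
    """Worst-case age a probe can offer: AP = UP + NLmax (equation (1))."""
    return update_period + nl_max

def probe_satisfies_view(view_age, update_period, nl_max):
    # the probe satisfies the view when the age it offers fits: AP <= AV
    return age_offered(update_period, nl_max) <= view_age
```

For instance, a probe updating every 50 s over a link with at most 5 s latency offers an age of 55 s, which satisfies a view requiring 120 s but not one requiring 30 s.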

  16. Approach • Multiple QoI-aware data requests (views V1…VN) over different data sources (probes P1…PM) • The connections between views and probes form a graph, called the monitoring graph • Input: views V1…VJ connect to Pi; Vj parameters = <AVj> for all j = 1…N; Pi parameters = <UPPi, NLmaxPi> for all i = 1…M • Constraint to resolve: APi ≤ AVj, j = 1…J (1), i.e. UPrequest = min {AVj − NLmaxPi}, j = 1…J, and R = (UPPi ≤ UPrequest) (2) • Algorithm: for each Pi, determine R (does Pi satisfy all Vj, j = 1…J?): if R = YES, set UPPi = UPrequest; if R = NO, fail to configure (*) • Note: (*) apply adaptation policies (Jacques's remarks in step 3)
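The per-probe resolution above can be sketched in a few lines. This is an illustrative reading of the algorithm, assuming the probe's offered UP is the smallest update period it can achieve:

```python
def configure_probe(up_min, nl_max, view_age_reqs):
    """Step-2 resolution for one probe Pi: compute
    UP_request = min_j (AVj - NLmax); if the probe's smallest achievable
    update period fits (UP_min <= UP_request), configure it to UP_request,
    otherwise fail (adaptation policies would then apply)."""
    up_request = min(av - nl_max for av in view_age_reqs)
    if up_min <= up_request:
        return up_request  # set UP_Pi: every connected view's age bound holds
    return None            # R = NO: fail to configure
```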

  17. Example • For Pi: UPrequest = min {AVj − NLmaxPi}, j = 1…J • Optimization: if AV ≫ AP, one can aggregate multiple updates and send a data batch; e.g. AV = 120 s, AP = 55 s → aggregate up to 2 updates • Example 1: for a view requiring A = 120 s with NLmax = 5 s, UPrequest = 115 s; both P1 (offer: UP = 10 s, NLmax = 5 s) and P2 (offer: UP = 50 s, NLmax = 5 s) satisfy it → setting UP = 115 s, YES • Example 2: for a view requiring A = 30 s, UPrequest = 25 s; P1 (UP = 10 s ≤ 25 s) → setting UP = 25 s, YES; P2 (UP = 50 s > 25 s) → NO
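The batching optimization mentioned above has a simple bound. As an assumption (the slide does not give the formula), the number of updates that can be aggregated into one message is floored at AV / AP:

```python
def max_batchable_updates(view_age, probe_age):
    """Rough bound on how many probe updates fit in one batch while still
    meeting the view's age requirement AV: floor(AV / AP)."""
    return int(view_age // probe_age)
```

This reproduces the slide's figure: AV = 120 s and AP = 55 s give up to 2 updates per batch.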

  18. Step 1: SemEUsE • Step 2: constraint resolution (simple) • Step 4: configurability of the monitoring architecture • Objective & requirements • Definitions • Approach

  19. Objective • Objective – step 4: the M4LB can (re-)configure itself and is self-adaptive • Monitoring scope: as before (slide 4), the M4LB is co-located with its client

  20. Requirements • New requirement, in addition to the previous ones: • R6: guarantee optimal QoI under resource constraints. This requirement addresses: • easily assigning resources (e.g. 5% of total traffic) to monitoring • robustness to a dynamic monitoring workload, e.g. when resources become critical, monitoring should still reply optimally to data requests • Target role: administrator • Allow the administrator to configure the M4LB • Example: bursty workloads (e.g. a huge number of Late Binding processes/data requests, or a huge number of sensor updates)

  21. Definitions • Value precision (VP): an arithmetic filter [2,3,6,7]; an abstract/open concept. Example: |update_new − update_old| > threshold → send the update; otherwise the consumer inter/extrapolates a new value and updates the timestamp. Age and value precision are commonly used in the state of the art to trade precision against performance [1,2,3] • Intuitively, the bandwidth BPi can be statistically estimated from a given <APi, VPPi, NLPi> • Incoming bandwidth (Bin): the bandwidth for QoS data input (from sensors); Bin = SUM(BPi), i = 1…N, where N is the number of probes • Cost (Co): an abstract/open concept associated with each element (sensor, probe, data request/view); for a data request in a view, CoV = f(QoIV) = f(<CV, AV, VPV>), called the QoI-based cost function • QoI properties = {C, A, VP}. [Timeline: an element changes at sensor s; some updates are not sent because of the VP setting, so the age at the probe grows until the next forwarded update arrives, within the maximum network latency.]
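The arithmetic filter defined above is easy to sketch. A minimal illustration (the class name and API are assumptions, not the slide's component names):

```python
class ValuePrecisionFilter:
    """Arithmetic filter: forward an update only when the value has moved
    by more than `threshold` since the last update actually sent."""
    def __init__(self, threshold):
        self.threshold = threshold
        self.last_sent = None
    def offer(self, value):
        if self.last_sent is None or abs(value - self.last_sent) > self.threshold:
            self.last_sent = value
            return True   # send: the change exceeds the precision bound
        return False      # suppress: the consumer inter/extrapolates instead
```

Suppressed updates are exactly what lowers the probe's bandwidth, which is why VP appears in the bandwidth-estimation function of the next slide.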

  22. Approach • Trade off QoI against resource consumption; the principal idea is to: • define cost-estimation and resource-estimation functions • find, by constraint programming, the optimal QoI such that the total resource budget is met and the total cost is minimal • Example for bandwidth • Input: Bin, the total incoming bandwidth allowed for monitoring; CoVj = f(QoIVj) = f(<CVj, AVj, VPVj>), where f is the cost-estimation function; BPi_estimate = g(<UPPi, NLmaxPi, VPPi>), where g is the bandwidth-estimation function; Vj parameters = <CVj, AVj> for all j = 1…N; Pi parameters = <UPPi, NLmaxPi> for all i = 1…M • Algorithm: 1. compute the VPPi and UPPi that satisfy all Vj, j = 1…J (as in step 2); 2. estimate the bandwidth of each Pi: BPi_estimate = g(<UPPi, NLmaxPi, VPPi>); 3. if Bin < Bin_estimate = SUM(BPi_estimate), trade off: reduce QoIVi to QoI'Vi, i = 1…N, so that each Pi's bandwidth is reduced to B'Pi, under the constraints SUM(B'Pi) ≤ Bin with SUM(Co'Vi) minimal • Idea for this example: reducing the bandwidth of probes that have fewer connected views is more efficient • Note (in general): the cost-estimation function is flexible with respect to the trade-off target/context
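The slide proposes constraint programming; as a hedged, much simpler stand-in, the "reduce probes with fewer views first" idea can be sketched as a greedy loop. The bandwidth estimate g(·) = msg_size / update_period and the 1.5× stretch factor are assumptions made purely for illustration:

```python
def fit_bandwidth(probes, b_in, stretch=1.5):
    """Greedy sketch of the step-3 trade-off: while the estimated total
    bandwidth exceeds the budget B_in, stretch the update period of the
    probe serving the fewest views (the cheapest QoI loss), which lowers
    its message rate. probes: list of dicts {'views', 'up', 'msg_size'}."""
    def bw(p):
        return p["msg_size"] / p["up"]   # assumed bandwidth-estimation function g
    while sum(bw(p) for p in probes) > b_in:
        victim = min(probes, key=lambda p: p["views"])
        victim["up"] *= stretch          # coarser updates: less bandwidth, lower QoI
    return probes
```

A real solver would instead search the QoI space for the assignment minimizing SUM(Co'Vi) subject to SUM(B'Pi) ≤ Bin, but the greedy loop shows why degrading lightly-shared probes is the cheap move.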

  23. References • [1] S. Shah, K. Ramamritham, P. Shenoy, "Resilient and Coherence Preserving Dissemination of Dynamic Data Using Cooperating Peers." • [2] N. Jain, M. Dahlin, Y. Zhang, "WebView: Scalable Information Monitoring for Data-Intensive Web Applications"; preceded by N. Jain, D. Kit, P. Mahajan, P. Yalagandula, M. Dahlin, Y. Zhang, "PRISM: PRecision-Integrated Scalable Monitoring," TR06-22, UTCS Technical Report. • [3] C. Olston, J. Jiang, J. Widom, "Adaptive Filters for Continuous Queries over Distributed Data Streams," Proc. ACM SIGMOD 2003. • [4] "Leveraging Many Simple Statistical Models to Adaptively Monitor Software Systems," http://www.springerlink.com/content/4jv08475vw80026w/ • [5] S. Agarwala, Y. Chen, D. Milojicic, K. Schwan, "QMON: QoS- and Utility-Aware Monitoring in Enterprise Systems," Autonomic Computing, ICAC '06. • [6] V. Kumar, B. F. Cooper, S. B. Navathe, "Predictive Filtering: A Learning-Based Approach to Data Stream Filtering." • [7] A. Jain, E. Y. Chang, Y.-F. Wang, "Adaptive Stream Resource Management Using Kalman Filters."

  24. Features [Diagram: aggregation of QoS data. On the gateway, sensors produce truck1.speed, truck1.essence, truck2.speed and truck2.essence; the data provider aggregates them into a single batch <truck1.speed, truck1.essence, truck2.speed, truck2.essence> sent to a probe, which hands it to the monitoring (data buffering, processing) for disassociation processing.]

  25. Roles [Diagram: the monitoring graph connects the monitoring elements. Data consumer role — views V1…Vn; data consumer parameters: QoI {coherency CV, age AV, value precision VPV}, cost CoV. Monitoring role — data buffering, processing. Data source role — probes P1…Pm; data provider parameters: bandwidth BP, latency LP, age AP, value precision VPP. Administrator role — administrator parameters: CPU, memory, bandwidth_in, throughput, computational latency, …]
