
Methodologies for the MAIS environment

Methodologies for the MAIS environment. ICSOC’04, Dec. 16, 2004. M. Comerio, F. De Paoli, S. Grega, C. Batini (UniMiB); D. Ardagna, C. Cappiello (Polimi). A methodology should address: MAIS proposal (run-time negotiation), design of services, redesign of services.




Presentation Transcript


  1. Methodologies for the MAIS environment ICSOC’04 – Dec. 16, 2004 M. Comerio, F. De Paoli, S. Grega, C. Batini – UniMiB; D. Ardagna, C. Cappiello – Polimi

  2. A methodology should address … • MAIS proposal → run-time negotiation • Design of services • Redesign of services

  3. Relevant inputs to methodologies
     • Functional requirements
     • Quality requirements: negotiable/non-negotiable, checkable/observable → Registry of QoS
     • User profiles and preferences: preferences, body functions, relational capabilities, skills, activity participation → Registry of UP
     • Context specs: new available technologies/channels (e.g. localization services)
     • MAIS architecture and tools: reflective architecture, atomic interaction units, …
     • Costs

  4. Layered MAIS model (top to bottom): USER MODEL, SERVICE MODEL, EXTENDED REFLECTIVE ARCHITECTURE, PLAIN REFLECTIVE ARCHITECTURE, DEVICE, NETWORK, MAC, PHYSICAL

  5. Details on the user and service model

  6. Furthermore, it should consider two points of view…
     • User of service: choice of the best service
     • Provider of service: monopolistic, existing market, coalitions/integrators

  7. Services are of two types • Atomic • Composite

  8. Underlying strategy • Heuristics • Formal model • Knowledge based • Logic • Mathematical • Probabilistic

  9. MAIS groups addressing methodological issues (order not meaningful)
     • Bicocca group on redesign methodologies
     • Bicocca/Polimi (Ardagna/Cappiello) on quality classification and quality composition
     • Polimi (Ceri/Florian group) on extending the WebML methodology to multichannel applications/web sites
     • Polimi/Roma1 results on feasibility study/pricing/market
     • RomaTre methodology for user preferences
     • Roma1 methodology for usability evaluation and improvement
     • Roma1 tools on atomic interaction units and user preferences
     • Isufi et al. registry of user profiles
     • Roma1/Firenze/Polimi/Bicocca classifications of functions
     • Bicocca reflective architecture
     • Polimi (Paolini) contributions on improving dialogues in using web sites
     • Others…

  10. Current results

  11. Bicocca methodology for Redesign
     • Atomic services
     • Qualities (partially)
     • No user profiles
     • No costs
     • Tools: atomic interaction units, reflective architecture (to some extent)
     • Heuristics

  12. The enrichment of the redesign methodology with QoS and UPs • The methodology has been conceived for the redesign environment • It is made of five steps • See M. Comerio, F. De Paoli, C. De Francesco, A. Di Pasquale, S. Grega, C. Batini, “A Service Re-design Methodology for Multi-channel Adaptation”. In Proceedings of the 2nd International Conference on Service Oriented Computing (ICSOC), New York City (USA), November 15-18, 2004.

  13. High level description

  14. Milestones of the methodology
     • Phase 1 – Actual service model: specification of the logical and operational structure of the service under redesign (UML diagrams)
     • Phase 2 – High level redesign: identification of quality requirements created by new channels, user & domain characteristics; definition of the UML diagrams to address QoS
     • Phase 3 – Behavior modeling: identification of the atomic interaction units (AIU); definition of the activity diagrams
     • Phase 4 – Customization (4.a channel customization, 4.b service customization): identification and validation of technological scenarios supported by the reflective architecture and user profiles; validation and specialization of UML diagrams

  15. The four phases of the methodology

  16. What is new wrt BPR methodologies?

  17. Bicocca methodology for redesign – short description

  18. Actual service modelling • GOAL: reconstruct the abstract characteristics of an existing macro-service. • INPUT: • Actual service: the specification of the service under re-design in terms of interfaces and interactions; • Logical data diagram: the definition of the data structure (e.g. E-R schemas). • OUTPUT: • UML diagrams: highlight the logical and operational structure of the actual service.

  19. High level redesign • GOAL: redesign the service architecture, described by UML diagrams, in the light of new requirements enabled by the new channels and by domain characteristics • INPUT: • New services provided by new channels: requirements derived from the characteristics of the chosen channels; • User quality requirements: negotiable QoS related to the user. • OUTPUT: • Enriched UML diagrams: highlight the logical and operational structure of the redesigned service.

  20. Behaviour modelling • GOAL: aims to model the interface of the analyzed service and the interaction between the user and the service by the use of the Atomic Interaction Units (AIU) • INPUT: • Atomic Interaction Units (AIU): used to decompose a single service into smaller atomic units that describe the interaction between user and system in an abstract way (independently from the user device). • OUTPUT: • Activity Diagrams: highlight the interactions between the user and the system in terms of AIUs.

  21. Customization • GOAL: define the characteristics and the architecture of the new enriched multi-channel service considering the technical characteristics of the channels and the user profile. • INPUT: • MAIS reflective architecture: is used to gather information (e.g. the user’s location). • OUTPUT: • Enriched service models: are the UML diagrams that describe the new service.

  22. Channel customization • GOAL: aims to specialize AIUs chosen in the previous phase considering the technical characteristics of the channels. • INPUT: • Technical characteristics of new channels: are quality requirements related to the channels chosen to support the multi-channel access; • OUTPUT: • Specialized Activity diagrams: the interactions between the user and the system in terms of AIUs are specialized for each channel.

  23. Service customization • GOAL: service adaptation to the users’ different needs. • INPUT • User profile: static (e.g. role) and dynamic (e.g. historical information captured by the system) characteristics of the user.

  24. Extension of the Bicocca methodology for redesign to QoS and user profiles – short description

  25. Bicocca methodology for Redesign
     • Atomic services
     • Qualities
     • User profiles
     • No costs
     • Tools: reflective architecture, atomic interaction units
     • Heuristics

  26. Steps of the enriched methodology for QoS + UP
     • Define new functional and non-functional requirements;
     • Step 2: High level redesign revisited
       • Classify extracted qualities in terms of the MAIS classification;
       • Compare user quality requirements with the MAIS QoS Registry (QR), and extract relevant QoS dependency sub-trees;
       • Examine the application conceptual schema/domain ontology, and enrich the extracted QoS tree with domain-dependent QoS, resulting from refinement of qualities in the Registry or else definition of new domain qualities
     • Step 4: Channel/Service customization revisited
       • Check the compatibility of quality requirements with negotiable and non-negotiable qualities of the MAIS infrastructure;
       • Compare user quality requirements with the MAIS UP Registry;
       • Build the QoS/UP matrix on the basis of intrinsic dependencies and application domain relevance;
       • Check compatibility of QoS/UP with negotiable and non-negotiable qualities of the MAIS infrastructure;
       • TBD: Build an analytical model of quality composition
       • TBD: Use the model to refine the compatibility check

  27. Phase 2: High level redesign revisited

  28. Steps of the enriched methodology for QoS + UP
     • Define new functional and non-functional requirements;
     • Step 2: High level redesign revisited
       • Classify extracted qualities in terms of the MAIS classification;
       • Compare user quality requirements with the MAIS QoS Registry (QR), and extract relevant QoS dependency sub-trees;
       • Examine the application conceptual schema/domain ontology, and enrich the extracted QoS tree with domain-dependent QoS, resulting from refinement of qualities in the Registry or else definition of new domain qualities
     • Step 4: Channel/Service customization revisited
       • Check the compatibility of quality requirements with negotiable and non-negotiable qualities of the MAIS infrastructure;
       • Compare user quality requirements with the MAIS UP Registry;
       • Build the QoS/UP matrix on the basis of intrinsic dependencies and application domain relevance;
       • Check compatibility of QoS/UP with negotiable and non-negotiable qualities of the MAIS infrastructure;
       • TBD: Build an analytical model of quality composition
       • TBD: Use the model to refine the compatibility check

  29. Phase 2: High-level redesign. The activities performed in this phase are described by the following steps: 2.1 UQR analysis; 2.2 QoS classification; 2.3 Merge and matching; 2.4 New service modeling; 2.5 QoS modeling. Inputs include the user quality requirements (UQR), domain knowledge, the new service provided by new channels and the UML diagrams; the classification step distinguishes negotiable QoS (provider and device characteristics), non-negotiable QoS (provider) and negotiable user-profile QoS; the outputs are the new service requirements and the enriched UML diagrams.

  30. Representation of integrated trees in the MAIS layers. USER MODEL tree (count 9, total 11): e-service cost, e-service confidentiality, e-service accuracy, service availability, completeness, authorization, e-service data encryption, flexibility, capacity, cost, supported standard. SERVICE MODEL tree (count 13, total 15): session availability, price, service availability, data timeliness, data completeness, channel availability, response time, data accuracy, data reliability, bandwidth. EXTENDED REFLECTIVE ARCHITECTURE: strategies.

  31. Quality tree extraction. Hotel quality: hotel efficiency decomposes into always-on connection (service availability, channel availability) and response time (bandwidth); performance indices (QoS): delay, throughput, bit rate, power consumption.
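The sub-tree extraction shown on this slide can be sketched in code. A minimal sketch, assuming the QoS dependency registry is encoded as a child-list dictionary; the structure and quality names (taken from the hotel example) are illustrative, not the actual MAIS QoS Registry schema.

```python
# Hypothetical child-list encoding of the hotel quality tree above.
QOS_TREE = {
    "hotel_quality": ["h_efficiency"],
    "h_efficiency": ["always_on_connection", "response_time"],
    "always_on_connection": ["service_availability", "channel_availability"],
    "response_time": ["bandwidth"],
    "service_availability": [],
    "channel_availability": [],
    "bandwidth": [],
}

def extract_subtree(tree, root):
    """Return the QoS dependency sub-tree rooted at the given quality."""
    return {root: [extract_subtree(tree, child) for child in tree.get(root, [])]}

sub = extract_subtree(QOS_TREE, "always_on_connection")
# {"always_on_connection": [{"service_availability": []}, {"channel_availability": []}]}
```

A real registry would carry extra attributes per node (negotiability, value ranges); a nested dictionary keeps the dependency shape visible.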

  32. Channel and service customization revisited

  33. Steps of the enriched methodology for QoS + UP
     • Define new functional and non-functional requirements;
     • Step 2: High level redesign revisited
       • Classify extracted qualities in terms of the MAIS classification;
       • Compare user quality requirements with the MAIS QoS Registry (QR), and extract relevant QoS dependency sub-trees;
       • Examine the application conceptual schema/domain ontology, and enrich the extracted QoS tree with domain-dependent QoS, resulting from refinement of qualities in the Registry or else definition of new domain qualities
     • Step 4: Channel/Service customization revisited
       • Check the compatibility of quality requirements with negotiable and non-negotiable qualities of the MAIS infrastructure;
       • Compare user quality requirements with the MAIS UP Registry;
       • Build the QoS/UP matrix on the basis of intrinsic dependencies and application domain relevance;
       • Check compatibility of QoS/UP with negotiable and non-negotiable qualities of the MAIS infrastructure;
       • TBD: Build an analytical model of quality composition
       • TBD: Use the model to refine the compatibility check

  34. Phase 4a: Channel customization. Steps: 4a.1 analyze relationships among QoS and extract the relevant QoS tree (inputs: user quality requirements, the MAIS reflective architecture, and the technical characteristics of the new channels); 4a.2 set a weight for every QoS; 4a.3 set a range of values for every QoS; 4a.4 assumption evaluation (possibly followed by assumption revision); 4a.5 UML diagrams revision (output: enriched service models).

  35. Phase 4a: Channel customization (phases 4a.1-4a.3). Example quality tree: USABILITY decomposes into COMPREHENSIBILITY, PLEASANTNESS, LEARNABILITY and OPERABILITY with weights 0.2, 0.4, 0.3, 0.1; COMPREHENSIBILITY depends on SCREENQoS (weight 0.2) and NETWORKINTERFACEQoS (weight 0.8); the leaves of SCREENQoS are device characteristics with value ranges, e.g. RESOLUTION [800x600…1024x640], SIZE [1.0”…19.0”], COLORDEPTH [16 bit…64 bit], plus CONTRAST and BRIGHTNESS.

  36. Phase 4a: Channel customization (phase 4a.4). Each characteristic is mapped onto a normalized value: SCREENQoS takes a value in [0…1] (0.2 in the example), derived from RESOLUTION [800x600…1024x640], SIZE [1.0”…19.0”], CONTRAST, BRIGHTNESS and COLORDEPTH [16 bit…64 bit].
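The mapping of raw device characteristics onto the [0, 1] scale can be sketched as a linear normalization over the declared range. This is an assumption: the deck shows the ranges and the resulting score but does not specify the exact mapping function.

```python
def normalize(value, lo, hi):
    """Linearly map a raw device characteristic onto [0, 1].

    Assumes hi > lo; values outside the declared range are clamped.
    """
    return min(max((value - lo) / (hi - lo), 0.0), 1.0)

# e.g. a hypothetical 4.6" screen on the SIZE range [1.0"..19.0"]
size_score = normalize(4.6, 1.0, 19.0)  # ≈ 0.2
```

A per-leaf aggregation (e.g. a weighted average of size, resolution, color depth scores) would then yield the single SCREENQoS value of the slide.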

  37. Phase 4a: Channel customization. Phase 4a.4: the Simple Additive Weighting method evaluates the value of each QoS starting from the technical characteristics. For example, given the tuples Tk SCREENQoS = 0.7 and Tj NETWORKINTERFACEQoS = 0.3 and the weights 0.2 and 0.8: COMPREHENSIBILITY = 0.2*0.7 + 0.8*0.3 = 0.38; USABILITY = 0.2*LEARNABILITY + 0.4*0.38 + …
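The Simple Additive Weighting computation on this slide can be sketched directly; the function and dictionary names are illustrative.

```python
def saw(weights, scores):
    """Simple Additive Weighting: weighted sum of normalized child-QoS scores."""
    return sum(w * scores[q] for q, w in weights.items())

comprehensibility = saw(
    {"SCREENQoS": 0.2, "NETWORKINTERFACEQoS": 0.8},
    {"SCREENQoS": 0.7, "NETWORKINTERFACEQoS": 0.3},
)
# 0.2*0.7 + 0.8*0.3 = 0.38, matching the slide
```

Applying `saw` again one level up (with the USABILITY weights 0.2, 0.4, 0.3, 0.1) propagates the aggregated values to the root of the quality tree.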

  38. Phase 4a: Channel customization. Phase 4a.4: assumption evaluation. Phase 2.1 required that the service be easy to perceive, learn and use for every user of the domain; phase 2.2 classified usability (value = 0.8) as a negotiable QoS related to the user profile. If the computed USABILITY is below 0.8, the assumption is not satisfied and it is necessary to go back and revise the service; if it is at least 0.8, the assumption is satisfied and no revision is needed.
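The assumption-evaluation check reduces to a threshold comparison between the aggregated QoS value and the requirement fixed in phase 2; a minimal sketch, with an assumed name:

```python
def assumption_satisfied(computed_qos, required_value):
    """Phase 4a.4 check: does the computed QoS meet the phase-2 requirement?"""
    return computed_qos >= required_value

# Usability requirement from the slide example: 0.8
assumption_satisfied(0.38, 0.8)  # False: go back and revise the service
assumption_satisfied(0.85, 0.8)  # True: no revision needed
```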

  39. Phase 4b: Service customization revisited. Steps: 4b.1 UP-QoS mapping (input: user profile (UP); output: mapping table); 4b.2 set a weight for each mapping (output: table with weights); 4b.3 assumption evaluation (possibly followed by assumption revision); 4b.4 UML diagrams revision.

  40. Phase 4b: Service customization. Phase 4b.1, UP-QoS mapping: a matrix crossing QoS dimensions with user-profile attributes marks (with an X) which attributes influence each QoS.

  41. Phase 4b: Service customization. Phase 4b.2, set a weight for each mapping: each marked cell of the UP-QoS matrix is assigned a weight in [0, 1] (values such as 0.1, 0.2, 0.3 and 0.5 in the example).

  42. Phase 4b: Service customization. Phase 4b.3: assumption evaluation. User profile attributes are values in the range [0, 1]; each QoS is associated with a set of weights, one per attribute: COMPREHENSIBILITY = 0.1*IS_ITL_pref + 0.4*IS_ICF_BODY_FUNCT + 0.3*EXPERTISE + 0.2*DELIVERYPREF. For example: COMPREHENSIBILITY = 0.1*0.2 + 0.4*1 + 0.3*0.8 + 0.2*0.8 = 0.82. The channel customization sub-phase set COMPREHENSIBILITY = 0.38 < 0.82, so it is advisable to realize an alternative version of the service or to revise the assumptions made in the previous phases.
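The weighted combination of user-profile attributes above can be sketched as follows; the attribute names come from the slide, the function name is an assumption.

```python
def qos_from_profile(weights, profile):
    """Weighted combination of user-profile attributes (all in [0, 1])."""
    return sum(w * profile[attr] for attr, w in weights.items())

weights = {"IS_ITL_pref": 0.1, "IS_ICF_BODY_FUNCT": 0.4,
           "EXPERTISE": 0.3, "DELIVERYPREF": 0.2}
profile = {"IS_ITL_pref": 0.2, "IS_ICF_BODY_FUNCT": 1.0,
           "EXPERTISE": 0.8, "DELIVERYPREF": 0.8}

required = qos_from_profile(weights, profile)    # 0.82, as on the slide
channel_value = 0.38                             # from channel customization
needs_alternative = channel_value < required     # True: revise or offer an alternative
```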

  43. Extension of the Bicocca methodology for redesign to QoS and user profiles – detailed description

  44. Phase 2: High level redesign revisited

  45. Steps of the enriched methodology for QoS + UP
     • Define new functional and non-functional requirements;
     • Classify extracted qualities in terms of the MAIS classification;
     • Compare user quality requirements with the MAIS QoS Registry (QR), and extract relevant QoS dependency sub-trees;
     • Enrich the extracted QoS tree with domain-dependent QoS, resulting from refinement of classes in the application conceptual schema/domain ontology
     • Check the compatibility of quality requirements with negotiable and non-negotiable qualities of the MAIS infrastructure;
     • Compare user quality requirements with the MAIS UP Registry;
     • Build the QoS/UP matrix on the basis of intrinsic dependencies and application domain relevance;
     • Check compatibility of QoS/UP with negotiable and non-negotiable qualities of the MAIS infrastructure;
     • TBD: Build an analytical model of quality composition
     • TBD: Use the model to refine the compatibility check

  46. Example: checks itinerary • Phase 2.1: UQR analysis • Sensitive data must be available only to selected users. • Efficient planning of controls. • The service must be easy to perceive, learn and use for every user of the domain. • Phase 2.2: QoS classification • Security (value = 0.8): non-negotiable quality related to the provider. • Precision of localization (value = 0.7): intrinsic quality of the new service; it is a negotiable quality related to user device characteristics. • Usability (value = 0.8): negotiable quality related to the user profile. • Phase 2.3: Merge and matching • Precision of localization → localization techniques. New service requirements: these techniques should be used to supply the vet with an itinerary to follow for herd controls; this itinerary is planned considering the actual position of the vet on the territory.

  47. Phase 2: High-level redesign • Goal: redesign the service architecture, described by UML diagrams, in the light of new requirements enabled by the new channels and by domain characteristics. • Input: • New services provided by new channels: new opportunities derived from the characteristics of the chosen channels; • User quality requirements: QoS related to user and domain needs. • Output: • Enriched UML diagrams: highlight the logical and operational structure of the redesigned service.

  48. New services provided by new channels • They are new opportunities for the enrichment of the service under re-design, derived from the technical characteristics of the available channels. • For example, localization techniques introduced by mobile devices make it possible to know the position of a user on the territory; this information supports the specification of new requirements for the enrichment of the service. • According to these new requirements, the UML diagrams produced in the previous phase to describe the actual service need to be revised and modified: new classes, attributes and use cases must be introduced for the description of the new service.

  49. Phase 2: High-level redesign - 2 • Goal: redesign the service architecture, described by UML diagrams, in the light of new requirements enabled by the new channels and by domain characteristics. • Input: • New services provided by new channels: new opportunities derived from the characteristics of the chosen channels; • User quality requirements: QoS related to user and domain needs. • Output: • Enriched UML diagrams: highlight the logical and operational structure of the redesigned service.

  50. Phase 2: High-level redesign - 2 • These quality dimensions are considered in an abstract way; they will be verified in the customization phase. • Negotiable dimensions are used in the negotiation phase performed at run time and are classified in two categories: • Negotiable quality dimension – provider: quality dimensions related to the resources used by the provider to offer services (for example: provide the service with high usability); • Negotiable quality dimension – user: quality dimensions required by the final users or related to the user device characteristics (for example: knowledge of the position of a user on the territory). • Non-negotiable quality dimensions characterize the service from the provider side and are classified in two categories: • Intrinsic service quality dimension (for example: provide the service with high traceability); • Service delivery quality dimension (for example: safe data distribution).
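The classification of quality dimensions on this slide can be captured in a small data structure, e.g. to filter the negotiable dimensions that feed the run-time negotiation step. A sketch under stated assumptions: the field and example names are illustrative, not the MAIS model's actual attributes.

```python
from dataclasses import dataclass

@dataclass
class QualityDimension:
    """One quality dimension classified along the slide's two axes."""
    name: str
    negotiable: bool
    side: str            # "provider" or "user"
    category: str = ""   # for non-negotiable dimensions: "intrinsic" or "delivery"

dims = [
    QualityDimension("usability", True, "provider"),
    QualityDimension("user position", True, "user"),
    QualityDimension("traceability", False, "provider", "intrinsic"),
    QualityDimension("safe data distribution", False, "provider", "delivery"),
]

# Only negotiable dimensions enter the run-time negotiation phase
negotiable = [d.name for d in dims if d.negotiable]
```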
