
All hat, no answers




  1. All hat, no answers Some issues related to the evaluation of architecture John Wroclawski, USC/ISI NSF FIA Meeting, March 19, 2013

  2. Why think about this? • NSF 13-538: “An evaluation plan…” • Guide and inform your work (feedback) • Explain to and convince others • Advance the field – a {hard, transformational} research area in itself.

  3. Architecture • “high level design principles that guide the technical development of a system, especially the engineering of its protocols and algorithms” • Two distinct levels: • A specification of system modularity, functional decomposition, interface locations and definitions, etc. – the “what”. • A set of fundamental structuring principles that drive the choices of the first bullet – the “why”.

  4. Architecture versus System versus Prototype

  5. System “A realized instantiation of a design, that meets specific requirements” • Well studied discipline - multiple design approaches: Waterfall… • Requirements analysis • Detailed system engineering • Detailed component engineering • Construction… Agile… Some evaluation criteria: • “Meets Requirements” “Performance” “Lifecycle Cost” …

  6. Prototype “A partial system development focusing on key components or issues” • Proof of concept • Learn what’s missing • Expose ideas and convince others • Test key system functions quickly • … • Some (meta?) evaluation criteria • “Focused on the key open questions” “Learn what you need to” “Not misleading” “Leverage existing resources” “Quick turnaround” • (See Craig Partridge’s comments in summary from last meeting)

  7. Observation • Design objectives, and hence evaluation processes, are significantly different for each of these things. • All are useful. • But: need to tease them apart. Need to discuss each separately.

  8. Network* Architecture • Distinction between requirements / properties of the architecture and requirements / properties of a system. • Many systems can share one architecture • Basic technical requirements and properties • Evaluation • Coordination in space – multiple independent implementers • Coordination and evolution in time – multiple, evolving technologies and uses *or maybe “Internet”…

  9. Diversion - “Elegance” • Architects talk about it a lot. Means what? • One view – economy of mechanism, minimalist, “clean” • Another view (or maybe not..): structural principles intuitively apparent – architecture conveys understanding, not just rules. “Feels like it makes sense.” • Why does this matter? Guidance for evolution in space and time. • An evaluation point: are the architecture’s structuring principles clear, intuitive, and apparent?

  10. Evaluation • Step 1: Clarify the proposed architecture – separate from system and prototype • Step 2: Identify and execute evaluation strategies

  11. Evaluation Strategies • Today, we have basically two. • Socratic discourse • Build it, achieve success, and wait 30 years • In fact, this – particularly the first – has worked rather better than my snide tone might suggest • With the right structure and context, people have demonstrated good ability to reason about architectural principles • Telephone vs IP – strengths of each • ATM QoS vs IP QoS – a clear lesson in simplicity, coordination in space, and more • Discussion point: Can we foster this structure and context? How?

  12. Evaluation Strategies II • Still, we would like more. Let’s talk about two: • Moving towards Theoretical Frameworks – increased rigor and more structured understanding • Experimental Research – contrasted with prototype deployment

  13. Theoretical Frameworks • We do not today have a “theory of architecture” – (very) far from it • But – there are glimmers. From • Game theory • Optimization theory • Various bits of economics • … • A possible synthesis principle for increased architectural evaluability (among other benefits…): • Choose modularities with intent to bring emerging theory into scope • Allows evaluation of sub-architectures using these tools
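One concrete glimmer from the optimization-theory direction is the network utility maximization (NUM) formulation. It is not written out on the slide; this is the standard statement, sketched here: each source s chooses a sending rate x_s to maximize aggregate utility subject to link capacities,

```latex
\max_{x \ge 0} \;\; \sum_{s} U_s(x_s)
\quad \text{subject to} \quad
\sum_{s \,:\, \ell \in \mathrm{path}(s)} x_s \;\le\; c_\ell
\quad \text{for every link } \ell .
```

The Lagrange multipliers on the capacity constraints play the role of “congestion prices,” which is what lets sub-architectures built on this modularity be evaluated with these tools.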

  14. An example: Theoretically Derived Architectural Modularity • Network resource allocation formulated as a global optimization problem • Primal-dual decomposition generates a set of dual problems/algorithms/modules: Applications / Congestion control / Routing / Scheduling / Channel • Modules are local (except scheduling), tied together through “congestion prices” – the cost per unit flow of sending data along a link to a destination • Utility function U_s(x_s): a strictly concave function of the sending rates • (Sub)-system architecture traceable to theoretically provable optimality • Source: “Optimal Cross-Layer Congestion Control, Routing, and Scheduling Design in Ad Hoc Wireless Networks,” Lijun Chen, Steven H. Low, Mung Chiang†, John C. Doyle (Caltech and †Princeton)
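The decomposition the slide describes can be sketched in a few lines. This is a hypothetical toy, not the paper’s algorithm: the topology, weights, and step size are invented, and routing and scheduling are held fixed. It uses the standard dual algorithm for NUM with U_s(x_s) = w_s log(x_s): each source sets its rate from only the prices on its path, and each link adjusts its price from only its own observed load.

```python
# Toy dual-decomposition congestion control (illustrative; all values invented).
# With U_s(x) = w_s * log(x), each source's local optimum is w_s / (path price sum).
capacities = {"L1": 1.0, "L2": 2.0}                     # link -> capacity
routes = {"A": ["L1"], "B": ["L1", "L2"], "C": ["L2"]}  # source -> fixed path
weights = {"A": 1.0, "B": 1.0, "C": 1.0}                # utility weights w_s

prices = {l: 1.0 for l in capacities}   # dual variables: per-link congestion prices
step = 0.05                             # gradient step size for the price update

for _ in range(5000):
    # Source layer: each source solves its local problem from its path prices only.
    rates = {s: weights[s] / sum(prices[l] for l in routes[s]) for s in routes}
    # Link layer: each link raises its price if overloaded, lowers it if idle.
    for l in capacities:
        load = sum(rates[s] for s in routes if l in routes[s])
        prices[l] = max(1e-6, prices[l] + step * (load - capacities[l]))
```

On this toy topology the link loads converge to the capacities, and no module ever sees more than its local prices or load – the “coordination in space” happens entirely through the congestion prices, which is the architectural point of the slide.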

  15. Experimental Research • …as distinct from prototype construction and deployment • Focus on • Worst case experiments • Intentional perturbation • Accelerated evolution • Comparison (implies repeatability?) • Given current state of the art, likely to involve simulation of key system components, modeling of users, etc.
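The “intentional perturbation” and “comparison (implies repeatability?)” bullets can be made concrete with a minimal sketch – everything here is invented for illustration. The idea: fix the random seed so the baseline run and the perturbed run see the identical offered load, then flip exactly one condition and compare.

```python
import random

def run(seed, link_up=True, steps=1000):
    """Toy single-link simulation: random work arrives into a queue that
    drains 1 unit per step when the link is up. Returns mean queue length."""
    rng = random.Random(seed)          # fixed seed -> identical arrivals per run
    queue, total = 0.0, 0.0
    for t in range(steps):
        queue += rng.expovariate(1 / 0.8)   # mean 0.8 units of work per step
        if link_up or t % 10 != 0:          # perturbation: link down on 10% of steps
            queue = max(0.0, queue - 1.0)
        total += queue
    return total / steps

baseline = run(seed=42)
perturbed = run(seed=42, link_up=False)  # same arrivals, one intentional perturbation
```

Because both runs replay the same arrival sequence, any difference in mean queue length is attributable to the perturbation alone – the repeatability that comparison experiments require.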

  16. Takeaway Points • Clarify the domain – Architecture vs System vs Prototype • Each domain both requires and deserves separate evaluation • Clarify the architecture – both underlying principles and resulting structure • Mixed mode evaluation for a still-empirical discipline • Evaluate through discourse (driven by reasoning) • Evaluate through analysis • Evaluate through experiment (driven by increasing rigor)
