All hat, no answers
Some issues related to the evaluation of architecture
John Wroclawski, USC/ISI
NSF FIA Meeting, March 19, 2013
Why think about this?
• NSF 13-538: “An evaluation plan…”
• Guide and inform your work (feedback)
• Explain to and convince others
• Advance the field – a {hard, transformational} research area in itself
Architecture
“High-level design principles that guide the technical development of a system, especially the engineering of its protocols and algorithms.”
Two distinct levels:
• A specification of system modularity, functional decomposition, interface locations and definitions, etc. – the “what”.
• A set of fundamental structuring principles that drive the choices of the first bullet – the “why”.
System
“A realized instantiation of a design that meets specific requirements.”
A well-studied discipline, with multiple design approaches:
Waterfall…
  • Requirements analysis
  • Detailed system engineering
  • Detailed component engineering
  • Construction
…or Agile…
Some evaluation criteria:
• “Meets requirements”  • “Performance”  • “Lifecycle cost”  …
Prototype
“A partial system development focusing on key components or issues.”
• Proof of concept
• Learn what’s missing
• Expose ideas and convince others
• Test key system functions quickly
• …
Some (meta?) evaluation criteria:
• “Focused on the key open questions”  • “Learn what you need to”  • “Not misleading”  • “Leverage existing resources”  • “Quick turnaround”
(See Craig Partridge’s comments in the summary from the last meeting.)
Observation
• Design objectives, and hence evaluation processes, are significantly different for each of these things.
• All are useful.
• But: we need to tease them apart, and discuss each separately.
Network* Architecture
• Distinction between the requirements/properties of the architecture and the requirements/properties of a system.
• Many systems can share one architecture.
Basic technical requirements and properties for evaluation:
• Coordination in space – multiple independent implementers
• Coordination and evolution in time – multiple, evolving technologies and uses
*or maybe “Internet”…
Diversion – “Elegance”
• Architects talk about it a lot. What does it mean?
• One view: economy of mechanism, minimalist, “clean”.
• Another view (or maybe not…): the structural principles are intuitively apparent – the architecture conveys understanding, not just rules. “Feels like it makes sense.”
• Why does this matter? Guidance for evolution in space and time.
• An evaluation point: are the architecture’s structuring principles clear, intuitive, and apparent?
Evaluation
• Step 1: Clarify the proposed architecture – separate it from system and prototype.
• Step 2: Identify and execute evaluation strategies.
Evaluation Strategies
• Today, we have basically two:
  • Socratic discourse
  • Build it, achieve success, and wait 30 years
• In fact, this – particularly the first – has worked rather better than my snide tone might suggest.
• With the right structure and context, people have demonstrated good ability to reason about architectural principles:
  • Telephone vs. IP – strengths of each
  • ATM QoS vs. IP QoS – a clear lesson in simplicity, coordination in space, and more
• Discussion point: can we foster this structure and context? How?
Evaluation Strategies II
• Still, we would like more. Let’s talk about two, moving towards:
  • Theoretical frameworks – increased rigor and more structured understanding
  • Experimental research – contrasted with prototype deployment
Theoretical Frameworks
• We do not today have a “theory of architecture” – (very) far from it.
• But there are glimmers, from:
  • Game theory
  • Optimization theory
  • Various bits of economics
  • …
• A possible synthesis principle for increased architectural evaluability (among other benefits…):
  • Choose modularities with intent to bring emerging theory into scope.
  • This allows evaluation of sub-architectures using these tools.
An Example: Theoretically Derived Architectural Modularity
• Network resource allocation formulated as a global optimization problem.
• Primal-dual decomposition generates a set of dual problems/algorithms/modules:
  • Local (except scheduling)
  • Tied together through congestion prices
• The (sub)system architecture is traceable to theoretically provable optimality.
[Figure: layered stack – applications, congestion control, routing, scheduling, channel – with cross-layer interaction in the form of “congestion prices” (cost per unit flow of sending data along a link to a destination); utility function U_s(x_s), a strictly concave function of the sending rates.]
Source: “Optimal Cross-Layer Congestion Control, Routing, and Scheduling Design in Ad Hoc Wireless Networks,” Lijun Chen, Steven H. Low, Mung Chiang†, John C. Doyle (Caltech and †Princeton).
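A sketch of the mathematics behind this modularity, following the standard network utility maximization setup that this line of work builds on (the cited paper’s exact decomposition differs in detail):

```latex
% Each source s picks a sending rate x_s; U_s is strictly concave.
% L(s) is the set of links source s uses; c_l is link capacity.
\max_{x_s \ge 0} \; \sum_s U_s(x_s)
\quad \text{s.t.} \quad \sum_{s \,:\, l \in L(s)} x_s \le c_l
\quad \forall \text{ links } l

% Relaxing the capacity constraints with multipliers \lambda_l
% (the "congestion prices") separates the problem by source:
L(x,\lambda) = \sum_s \Big( U_s(x_s) - x_s \!\sum_{l \in L(s)}\! \lambda_l \Big)
             + \sum_l \lambda_l \, c_l

% Each source solves its own term locally; each link updates its
% price from purely local load, e.g. by projected gradient ascent:
\lambda_l \leftarrow \Big[ \lambda_l
  + \gamma \Big( \sum_{s \,:\, l \in L(s)} x_s - c_l \Big) \Big]^{+}
```

The congestion-control, routing, and scheduling modules fall out of this decomposition, coupled only through the prices – which is exactly the “theoretically traceable” modularity the slide describes.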
Experimental Research
• …as distinct from prototype construction and deployment.
• Focus on:
  • Worst-case experiments
  • Intentional perturbation
  • Accelerated evolution
  • Comparison (implies repeatability?)
• Given the current state of the art, likely to involve simulation of key system components, modeling of users, etc. – a sketch of the flavor appears below.
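A minimal illustrative sketch of this experimental style – deliberately perturb a simulated system and compare alternatives under fixed seeds so the comparison is repeatable. The toy single-bottleneck model, the two rate-adaptation policies, and the backlog score are all invented for illustration, not taken from the talk:

```python
# Toy experiment: one bottleneck link whose capacity is halved
# mid-run (the intentional perturbation). Fixed seeds make each
# run repeatable; the policies and model are hypothetical.
import random

def simulate(policy, seed, steps=1000, capacity=10.0, fail_at=500):
    """Run one perturbed experiment; return average backlog as a
    crude badness score (lower is better)."""
    rng = random.Random(seed)            # fixed seed => repeatable run
    rate, backlog, total_backlog = 1.0, 0.0, 0.0
    for t in range(steps):
        cap = capacity / 2 if t >= fail_at else capacity
        offered = rate * rng.uniform(0.5, 1.5)   # modeled user demand
        backlog = max(0.0, backlog + offered - cap)
        rate = policy(rate, backlog)             # policy adapts its rate
        total_backlog += backlog
    return total_backlog / steps

# Two hypothetical rate-adaptation policies to compare.
def aimd(rate, backlog):
    """Additive increase, halve on any backlog."""
    return rate / 2 if backlog > 0 else rate + 0.5

def fixed(rate, backlog):
    """Non-adaptive: always send at a fixed rate."""
    return 8.0

if __name__ == "__main__":
    for name, policy in [("aimd", aimd), ("fixed", fixed)]:
        # Same seeds for both policies => a fair, repeatable comparison.
        scores = [simulate(policy, seed) for seed in range(5)]
        print(name, sum(scores) / len(scores))
```

Running both policies over the same seeds means differences in score reflect the policies themselves rather than the random draw of modeled user behavior – the “comparison implies repeatability” point above.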
Takeaway Points
• Clarify the domain – architecture vs. system vs. prototype. Each domain both requires and deserves separate evaluation.
• Clarify the architecture – both underlying principles and resulting structure.
• Mixed-mode evaluation for a still-empirical discipline:
  • Evaluate through discourse – driven by reasoning
  • Evaluate through analysis – driven by increasing rigor
  • Evaluate through experiment – driven by increasing rigor