
CSC 2920 Software Development & Professional Practices





Presentation Transcript


  1. CSC 2920Software Development & Professional Practices Fall 2010 Dr. Chuck Lillie

  2. Chapter 8 Design Characteristics and Metrics

  3. Characterizing Good Design • Besides the obvious - the design should match the requirements - there are two fundamental characteristics: • Consistency across the design: • Common UI look and logical flow • Common error processing • Common reports • Common system interfaces • Common help • All design carried to the same depth level • Completeness of the design: • All requirements are accounted for • All parts of the design are carried to their completion, to the same depth level

  4. Intuitively, Complexity is Related to “Good” Design • Some legacy characterizations of design complexity: • Halstead complexity metrics • McCabe’s cyclomatic complexity metric • Henry-Kafura information flow metrics • Card and Glass design complexity metrics

  5. Halstead Metrics • Developed by Maurice Halstead of Purdue in the 1970s, mainly to analyze the complexity of program source code. • Uses 4 fundamental counts taken from the code: • n1 = number of distinct operators • n2 = number of distinct operands • N1 = total occurrences of operators • N2 = total occurrences of operands • Program vocabulary, n = n1 + n2 • Program length, N = N1 + N2 • From these, he defined 4 metrics: • Volume, V = N * log2(n) • Potential volume, V* = (2 + n2*) log2(2 + n2*), where n2* is the number of input/output operands • Program implementation level, L = V* / V • Effort, E = V / L • Note: the Halstead metrics really measure only the lexical complexity, not the structural complexity, of source code.
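The formulas above can be sketched in Java. The class and method names are illustrative, and the counts fed in at the bottom are made-up inputs, not measurements of a real program:

```java
// Sketch of the Halstead formulas; names are illustrative only.
public class Halstead {
    // n1, n2: distinct operators/operands; bigN1, bigN2: total occurrences
    static int vocabulary(int n1, int n2) { return n1 + n2; }
    static int length(int bigN1, int bigN2) { return bigN1 + bigN2; }
    static double log2(double x) { return Math.log(x) / Math.log(2); }

    // Volume V = N * log2(n)
    static double volume(int n1, int n2, int bigN1, int bigN2) {
        return length(bigN1, bigN2) * log2(vocabulary(n1, n2));
    }
    // Potential volume V* = (2 + n2*) * log2(2 + n2*)
    static double potentialVolume(int n2Star) {
        return (2 + n2Star) * log2(2 + n2Star);
    }

    public static void main(String[] args) {
        // Made-up counts: 10 distinct operators, 5 distinct operands,
        // 40 operator occurrences, 20 operand occurrences, 3 I/O operands.
        double v = volume(10, 5, 40, 20);
        double vStar = potentialVolume(3);
        double level = vStar / v;   // L = V* / V
        double effort = v / level;  // E = V / L
        System.out.printf("V=%.1f  L=%.4f  E=%.1f%n", v, level, effort);
    }
}
```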

  6. Halstead Metrics -- Example

    public class Class1 {
      public void x(boolean v) throws Exception {
        int i;
        if (v) { i = 1; }
        else   { i = 2; }
        switch (i) {
          case 1:
          case 2:
          default: ;
        }
        try {
          while (v) {
            v = false;
            int r = 1;
            Boolean b = new Boolean(true);
            i = i*i + r;
            break;
          }
        } catch (Exception e) {
          throw e;
        }
      }
    }

  Counting rules: n1 counts an operator the first time it appears; N1 counts every occurrence of an operator. n2 counts an operand the first time it appears; N2 counts every occurrence of an operand.

  7. McCabe’s Cyclomatic Complexity • T.J. McCabe’s cyclomatic complexity metric is based on the belief that program quality is related to the complexity of the program control flow. • Cyclomatic complexity = E - N + 2p, where: • E = number of edges • N = number of nodes • p = number of connected components (usually 1) • For the control-flow graph on the slide (7 edges, 6 nodes, 2 closed regions): 7 edges – 6 nodes + 2 = 3 • The cyclomatic complexity number can also be computed as: • number of binary decisions + 1 • number of closed regions + 1
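Both ways of computing the cyclomatic number described above can be checked with a small sketch (the class and method names are mine):

```java
// Cyclomatic complexity two ways: V(G) = E - N + 2p,
// and V(G) = number of binary decisions + 1.
public class Cyclomatic {
    static int fromGraph(int edges, int nodes, int components) {
        return edges - nodes + 2 * components;
    }
    static int fromDecisions(int binaryDecisions) {
        return binaryDecisions + 1;
    }
    public static void main(String[] args) {
        // The flow graph on the slide: 7 edges, 6 nodes, 1 component.
        System.out.println(fromGraph(7, 6, 1));   // 3
        // The same graph contains 2 binary decisions.
        System.out.println(fromDecisions(2));     // 3
    }
}
```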

  8. Henry-Kafura (Fan-in and Fan-out) • The Henry and Kafura metric measures inter-modular flow, which includes: • Parameter passing • Global variable access • Inputs • Outputs • Fan-in: number of inter-modular flows into a module • Fan-out: number of inter-modular flows out of a module • Module complexity, Cp = (fan-in × fan-out)² • For the example module P, with fan-in × fan-out = 4: Cp = 4² = 16

  9. McCabe’s Cyclomatic Complexity • Measures structural design complexity • Applied to design and code risk analysis • Also used to determine the number of test cases needed to drive through the linearly independent paths in the system • The larger the number, the more risk, and the more testing required • 1 – 10: low risk and simple • > 50: high risk • Keep control flow simple enough that cyclomatic complexity stays well below 50

  10. Henry-Kafura (Fan-in and Fan-out) • Measures intermodular flow • Based on flow of information in and out of a module • Includes • Parameter passing • Global variable access • Inputs • Outputs

  11. Henry-Kafura (Fan-in and Fan-out) • Total Henry-Kafura structural complexity: the sum of the Henry-Kafura structural complexity over all n program modules: CT = Σ (i = 1..n) Cpi, where Cp = (fan-in × fan-out)² • Cp for Mod-A = (3 × 1)² = 9 • Cp for Mod-B = (4 × 2)² = 64 • Cp for Mod-C = (2 × 3)² = 36 • Cp for Mod-D = (2 × 2)² = 16 • CT for the entire program = 9 + 64 + 36 + 16 = 125
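The module sums above can be verified with a quick sketch (class and variable names are illustrative):

```java
// Total Henry-Kafura complexity: CT = sum of Cp over all modules,
// with Cp = (fan-in * fan-out)^2.
public class HenryKafura {
    static int cp(int fanIn, int fanOut) {
        int f = fanIn * fanOut;
        return f * f;
    }
    public static void main(String[] args) {
        // fan-in/fan-out pairs for Mod-A..Mod-D from the slide.
        int[][] mods = { {3, 1}, {4, 2}, {2, 3}, {2, 2} };
        int total = 0;
        for (int[] m : mods) total += cp(m[0], m[1]);
        System.out.println(total);   // 9 + 64 + 36 + 16 = 125
    }
}
```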

  12. Card and Glass (Higher-Level Complexity) • Card and Glass used the same concepts of fan-in and fan-out to describe design complexity: • Structural complexity of module x: Sx = (fan-out)² • Data complexity: Dx = Px / (fan-out + 1), where Px is the number of variables passed to and from the module • System complexity: Cx = Sx + Dx • Note: except for its contribution to Px, fan-in is not considered here
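The three Card and Glass formulas can be sketched as follows; the module values in main are hypothetical:

```java
// Card and Glass: Sx = fan-out^2, Dx = Px / (fan-out + 1), Cx = Sx + Dx.
public class CardGlass {
    static int structural(int fanOut) {                    // Sx
        return fanOut * fanOut;
    }
    static double data(int px, int fanOut) {               // Dx
        return (double) px / (fanOut + 1);
    }
    static double system(int px, int fanOut) {             // Cx = Sx + Dx
        return structural(fanOut) + data(px, fanOut);
    }
    public static void main(String[] args) {
        // Hypothetical module: fan-out of 3, 8 variables passed in/out.
        System.out.println(system(8, 3));   // 9 + 8/4 = 11.0
    }
}
```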

  13. “Good” Design Attributes • Easy to: • Understand • Change • Reuse • Test • Integrate • Code • Believe that we can get many of these “easy to’s” if we consider: • Cohesion • Coupling

  14. Modularity • A concept closely tied to abstraction • Modularity supports independence of modules • Modules support abstraction in software • Supports hierarchical structuring of programs • Modularity enhances design clarity and eases implementation • Reduces the cost of testing, debugging, and maintenance • Cannot simply chop a program into modules to get modularity • Need some criteria for decomposition

  15. Coupling • Independent modules: one can function completely without the presence of the other • Independence between modules is desirable • Modules can be modified separately • Can be implemented and tested separately • Programming cost decreases • In a system, not all modules can be independent • Modules must cooperate with each other • The more connections between modules, the more dependent they are, and the more knowledge about one module is required to understand the other • Coupling captures this notion of dependence

  16. Coupling… • Coupling between modules is the strength of interconnections between modules • In general, the more we must know about module A in order to understand module B the more closely connected is A to B • "Highly coupled" modules are joined by strong interconnection • "Loosely coupled" modules have weak interconnections

  17. Coupling… Uncoupled - no dependencies Loosely coupled - some dependencies Highly coupled - many dependencies

  18. Coupling… • Goal: modules as loosely coupled as possible • Where possible, have independent modules • Coupling is decided during architectural design • It cannot be reduced during implementation • Coupling is an inter-module concept • Major factors influencing coupling: • Type of connection between modules • Complexity of the interface • Type of information flow between modules

  19. Coupling… • Complexity and obscurity of interfaces increase coupling • Minimize the number of interfaces per module • Minimize the complexity of each interface • Coupling is minimized if: • Only the defined entry point of a module is used by others • Information is passed exclusively through parameters • Coupling increases if: • Indirect and obscure interfaces are used • The internals of a module are used directly • Shared variables are employed for communication

  20. Coupling… • Coupling increases with the complexity of interfaces, e.g., the number and complexity of parameters • Interfaces are needed to support required communication • Often more is passed than needed, e.g., an entire record when only one field is required • Keep the interface of a module as simple as possible

  21. Coupling… • Coupling depends on type of information flow • Two kinds of information: data or control. • Transfer of control information • Action of module depends on the information • Makes modules more difficult to understand • Transfer of data information • Module can be treated as input-output function

  22. Coupling… • Lowest coupling: interfaces with only data communication • Highest: hybrid interfaces (some data, some control)

  Coupling | Interface complexity  | Type of connection   | Type of communication
  Low      | Simple, obvious       | To module by name    | Data
  High     | Complicated, obscure  | To internal elements | Control / Hybrid

  23. Coupling • Coupling addresses the attribute of “degree of interdependence” between software units, modules, or components. • Levels of coupling, from highest (worst) to lowest - the lower the better: • Content coupling - accessing the internal data or procedural information of another module • Common coupling • Control coupling • Stamp coupling • Data coupling - passing only the necessary information • No coupling - ideal, but not practical

  24. Coupling… (from high to loose to low) • Content coupling - one component modifies the internal data of another: not good • Common coupling - data organized in a common data store: better, but dependencies still exist • Control coupling - one component passes parameters to control the behavior of another component • Stamp coupling - a data structure is passed from one component to another • Data coupling - only data is passed, not a data structure • Uncoupled - a change in one component will not affect any other component
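To make the stamp-versus-data distinction concrete, here is a hypothetical Java contrast; the Employee type and method names are invented for illustration:

```java
// Hypothetical example contrasting stamp coupling (passing a whole
// record) with data coupling (passing only the needed field).
public class CouplingDemo {
    static class Employee {   // a record with several fields
        String name; double salary; String address;
        Employee(String n, double s, String a) { name = n; salary = s; address = a; }
    }
    // Stamp coupling: the callee receives the whole structure even
    // though it only needs the salary field.
    static double annualSalaryStamp(Employee e) { return e.salary * 12; }
    // Data coupling: only the elementary data item is passed.
    static double annualSalaryData(double monthlySalary) { return monthlySalary * 12; }

    public static void main(String[] args) {
        Employee e = new Employee("Ada", 5000.0, "10 Main St");
        System.out.println(annualSalaryStamp(e));        // 60000.0
        System.out.println(annualSalaryData(e.salary));  // 60000.0
    }
}
```

The data-coupled version is easier to test and shields the callee from changes to the rest of the record.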

  25. Content Coupling (figure) • Component B branches directly to label D1 inside component D, even though D is under the control of C.

  26. Common Coupling (figure) • Global variables A1, A2, A3 and V1, V2 live in a common data area with shared variable names. • Components X, Y, and Z all make changes to V1 (V1 = V2 + A1; increment V1; change V1 to zero). • That could be a problem.

  27. Cohesion • Coupling characterizes the inter-module bond • It is reduced by minimizing relationships between elements of different modules • Another way to achieve this is to maximize relationships between elements of the same module • Cohesion considers this relationship • We are interested in determining how closely the elements of a module are related to each other • In practice, both coupling and cohesion are used

  28. Cohesion… • The cohesion of a module represents how tightly bound the elements of the module are • It gives a handle on whether the different elements of a module belong together • High cohesion is the goal • Cohesion and coupling are interrelated • The greater the cohesion of modules, the lower the coupling between modules • The correlation is not perfect, however

  29. Levels of Cohesion • There are many levels of cohesion: • Coincidental • Logical • Temporal • Procedural • Communicational • Sequential • Functional • Coincidental is the lowest, functional the highest • The scale is not linear • Functional cohesion is considered very strong

  30. Cohesion • Cohesion of a unit, module, object, or component addresses the attribute of “degree of relatedness” within that unit, module, object, or component. • Levels of cohesion, from highest (best) to lowest - the higher the better: • Functional - performing one single function • Sequential • Communicational • Procedural • Temporal • Logical • Coincidental - performing more than one unrelated function

  31. Levels of Cohesion… (high to low) • Functional - ideal cohesion: every processing element is essential to the performance of a single function • Sequential - output from one element of the component is input to another element of the component • Communicational - functions grouped in a single component just because they produce or use the same data set • Procedural - functions grouped in a single component just to ensure the proper ordering of execution • Temporal - functions related by timing issues only: hard to make changes • Logical - logically related functions or data elements placed in the same component • Coincidental - components whose parts are unrelated to one another

  32. Levels of Cohesion… (figures) • Coincidental: parts unrelated (functions A, B, C bundled together) • Logical: similar functions grouped (A, A’, A” selected by logic) • Temporal: functions related only by time (at T0, T0 + X, T0 + 2X) • Procedural: functions related by order of execution (A, then B, then C) • Communicational: functions access the same data • Sequential: output of one part is input to the next (function A, part 1 → part 2 → part 3) • Functional: sequential, with complete, related functions
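A hypothetical Java contrast between the top and bottom of the scale; the method names and contents are invented for illustration:

```java
// Functional cohesion versus coincidental cohesion, sketched.
public class CohesionDemo {
    // Functional cohesion: every line serves "compute the average".
    static double average(int[] xs) {
        double sum = 0;
        for (int x : xs) sum += x;
        return sum / xs.length;
    }
    // Coincidental cohesion: unrelated jobs bundled into one method.
    static String miscUtilities(String s, int[] xs) {
        java.util.Arrays.sort(xs);          // sorting...
        System.out.println("log: " + s);    // ...logging...
        return s.toUpperCase();             // ...and string formatting.
    }
    public static void main(String[] args) {
        System.out.println(average(new int[] { 1, 2, 3, 4 }));   // 2.5
    }
}
```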

  33. Determining Cohesion • Describe the purpose of the module in a sentence • Then perform the following tests: 1. If the sentence has to be a compound sentence, or contains more than one verb, the module is probably performing more than one function; it probably has sequential or communicational cohesion. 2. If the sentence contains words relating to time, like "first", "next", "after", "start", etc., the module probably has sequential or temporal cohesion.

  34. Determining Cohesion… 3. If the predicate of the sentence does not contain a single specific object following the verb, the module is probably logically cohesive; e.g., "edit all data" suggests logical cohesion, while "edit source data" may indicate functional cohesion. 4. Words like "initialize" and "clean up" often imply temporal cohesion. • A functionally cohesive module can always be described by a simple statement

  35. Scope of Control (figures: System 1 and System 2, each with components A through G) • Scope of control of a component: the component itself and all the components it controls. • Scope of effect of a decision: all the components affected by that decision. • No component should be in the scope of effect if it is not in the scope of control. • If the scope of effect is wider than the scope of control, it is almost impossible to guarantee that a change will not destroy the system. • System 1 may be better than System 2 for this reason.

  36. Using Program and Data Slices to Measure Program Cohesion • Bieman and Ott introduced a measure of program cohesion using the following concepts from program and data slices: • A data token is any variable or constant in the program • A slice within a program is the collection of all the statements that can affect the value of some specific variable of interest • A data slice is the collection of all the data tokens in the slice that will affect the value of a specific variable of interest • Glue tokens are the data tokens in the program that lie in more than one data slice • Super-glue tokens are the data tokens in the program that lie in every data slice of the program • Program cohesion is then measured through 2 metrics: • weak functional cohesion = (# of glue tokens) / (total # of data tokens) • strong functional cohesion = (# of super-glue tokens) / (total # of data tokens)

  37. A Pseudo-Code Example of the Functional Cohesion Measure • Finding the maximum and the minimum values:

    procedure MinMax(z, n)
      integer end, min, max, i;
      end = n;
      max = z[0];
      min = z[0];
      for (i = 0; i <= end; i++) {
        if z[i] > max then max = z[i];
        if z[i] < min then min = z[i];
      }
      return max, min;

  • Data tokens (33): z1 n1 end1 min1 max1 i1 end2 n2 max2 z2 01 min2 z3 02 i2 03 i3 end3 i4 z4 i5 max3 max4 z5 i6 z6 i7 min3 min4 z7 i8 max5 min5 • Slice max (22): z1 n1 end1 max1 i1 end2 n2 max2 z2 01 i2 03 i3 end3 i4 z4 i5 max3 max4 z5 i6 max5 • Slice min (22): z1 n1 end1 min1 i1 end2 n2 min2 z3 02 i2 03 i3 end3 i4 z6 i7 min3 min4 z7 i8 min5 • Glue tokens (11): z1 n1 end1 i1 end2 n2 i2 03 i3 end3 i4 • Super-glue tokens (11): z1 n1 end1 i1 end2 n2 i2 03 i3 end3 i4
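The pseudocode can be rendered as runnable Java. Note that the pseudocode's loop condition `i <= end` (with `end = n`) would read past the last of the n array elements, so this sketch uses `i < n` instead:

```java
// A direct Java rendering of the MinMax pseudocode above.
public class MinMax {
    // Returns { max, min } of the first n elements of z.
    static int[] minMax(int[] z, int n) {
        int max = z[0], min = z[0];
        for (int i = 0; i < n; i++) {          // i <= n would overrun the array
            if (z[i] > max) max = z[i];
            if (z[i] < min) min = z[i];
        }
        return new int[] { max, min };
    }
    public static void main(String[] args) {
        int[] z = { 4, -1, 7, 2 };
        int[] r = minMax(z, z.length);
        System.out.println(r[0] + " " + r[1]);   // 7 -1
    }
}
```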

  38. Example of Pseudo-Code Cohesion Metrics • For the example of finding min and max, the glue tokens are the same as the super-glue tokens: • Super-glue tokens = 11 • Glue tokens = 11 • The data slice for min and the data slice for max each contain the same number of tokens, 22 • The total number of data tokens is 33 • The cohesion metrics for the min-max example are: • weak functional cohesion = 11 / 33 = 1/3 • strong functional cohesion = 11 / 33 = 1/3 • If we had computed only one function (e.g., max), then: • weak functional cohesion = 22 / 22 = 1 • strong functional cohesion = 22 / 22 = 1

  39. Chidamber and Kemerer (C-K) OO Metrics • Weighted Methods per Class (WMC) • Depth of Inheritance Tree (DIT) • Number of Children (NOC) • Coupling Between Object Classes (CBO) • Response for a Class (RFC) • Lack of Cohesion in Methods (LCOM) • Note that LCOM is a negative measure: high LCOM indicates low cohesion and possibly high complexity. Cohesion here is based on the association of methods with common instance variables.
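One common reading of LCOM (sometimes called LCOM1: the number of method pairs sharing no instance variables minus the number of pairs sharing at least one, floored at zero) can be sketched as below. The per-method variable sets here are supplied by hand rather than extracted from real code:

```java
import java.util.*;

// Rough LCOM1 sketch: P = method pairs sharing no instance variables,
// Q = pairs sharing at least one; LCOM = max(P - Q, 0).
public class Lcom {
    static int lcom(List<Set<String>> varsUsedByMethod) {
        int p = 0, q = 0;
        for (int i = 0; i < varsUsedByMethod.size(); i++)
            for (int j = i + 1; j < varsUsedByMethod.size(); j++) {
                Set<String> shared = new HashSet<>(varsUsedByMethod.get(i));
                shared.retainAll(varsUsedByMethod.get(j));
                if (shared.isEmpty()) p++; else q++;
            }
        return Math.max(p - q, 0);
    }
    public static void main(String[] args) {
        // Three methods: two share variable "a"; the third touches only "c".
        List<Set<String>> methods = List.of(
            Set.of("a"), Set.of("a", "b"), Set.of("c"));
        // Pairs: (1,2) share "a"; (1,3) and (2,3) share nothing -> P=2, Q=1.
        System.out.println(lcom(methods));   // 1
    }
}
```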

  40. Cohesion and Coupling (figure) • Strong cohesion combined with loose coupling corresponds to high-level (good) design; weak cohesion combined with tight coupling corresponds to low-level design.

  41. User Interface • Mandel’s 3 “golden rules” for UI design: • Place the user in control • Reduce the user’s memory load (G. Miller’s 7 ± 2) • Make the interface consistent (relates to the earlier design completeness and consistency) • Shneiderman and Plaisant’s 8 rules for design: • Strive for consistency • Provide shortcuts for frequent (or experienced) users • Offer informative feedback • Design dialogues to yield closure • Strive for error prevention and simple error handling • Permit easy reversal of actions (“undo”) • Support an internal locus of control • Reduce short-term memory load

  42. UI Design Prototype and “Test” • UI design prototypes: • Low fidelity (with cardboards) • High fidelity (with “story board” tools) • Usability “laboratories” and statistical analysis • # of subjects who can complete the tasks within some specified time • Length of time required to complete different tasks • Number of times “help” functions needed • Number of times “redo” used and where • Number of times “short cuts” were used

  43. Origin of the Law of Demeter • A design “guideline” for OO systems that originated from the Demeter System project at Northeastern University in the 1980s • The same group later worked on aspect-oriented programming • Addresses the design coupling issue by placing constraints on messaging among objects • Limits the sending of messages to objects that are directly known to the sender

  44. Law of Demeter • An object should send messages to only the following kinds of objects: • the object itself • the object’s attributes (instance variables) • the parameters of the methods in the object • any object created by a method in the object • any object returned from a call to one of the methods of the object • any object in any collection that is one of the above categories
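A hypothetical illustration of the rule under its common "don't talk to strangers" reading; the Customer/Wallet names are invented:

```java
// Law of Demeter sketch: the chained call reaches through an object
// obtained from a collaborator, while the compliant version asks the
// direct collaborator to do the work itself.
public class DemeterDemo {
    static class Wallet {
        private double cash = 20.0;
        double getCash() { return cash; }
    }
    static class Customer {
        private final Wallet wallet = new Wallet();
        Wallet getWallet() { return wallet; }                 // invites chaining
        double availableCash() { return wallet.getCash(); }   // Demeter-friendly
    }
    public static void main(String[] args) {
        Customer c = new Customer();
        // Chained form: the caller messages the Wallet, an object it
        // neither created nor received as a parameter.
        System.out.println(c.getWallet().getCash());   // 20.0
        // Compliant form: the message goes only to the direct collaborator.
        System.out.println(c.availableCash());         // 20.0
    }
}
```

Wrapping the chain in `availableCash()` keeps callers coupled only to `Customer`, so the `Wallet` internals can change without rippling outward.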
