
Complexity Metrics & Quality



Presentation Transcript


  1. Complexity Metrics & Quality
  • Complexity measures exist for:
  • Process
  • Project (broader than just process; e.g., it includes people)
  • Product
  • Tools
  • We will focus on Product.

  2. Product Complexity/Size Metrics
  • Lines of Code (LOC) (discussed earlier)
  • Halstead
  • Cyclomatic (McCabe)
  • Other constructs
  • Inter-module (Fan-in and Fan-out)
  • Functional (Function Points) (lectured earlier)
  • Object complexity metrics (the C-K metrics from your reading)

  3. Lines of Code
  • A lines-of-code count is an indirect measurement of complexity; it is really a measurement of "size" or "volume."
  • It is better suited to estimating the time or effort required to develop the software, though it is not perfect for that either.
  • In some studies, module size (LOC) and defect density appear to have a curvilinear relationship:
  • The larger the module (in LOC), the higher the defect density.
  • On the other side, very small modules (in LOC) can also show very high defect density.
  • (*** Is there a middle, optimal module size (LOC) at which defect density is minimal? ***) This is possibly related to optimal human performance capacity.
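The curvilinear idea above can be illustrated with a small defect-density computation. The module data below are entirely hypothetical; they are shaped only to show how a U-shaped size/density relationship would look:

```python
# Hypothetical (loc, defects) pairs for three modules of different sizes.
modules = {"small": (120, 3), "medium": (800, 10), "large": (2500, 60)}

# Defect density expressed as defects per KLOC.
density = {name: defects / (loc / 1000) for name, (loc, defects) in modules.items()}

# With these made-up numbers, the medium-sized module has the lowest density,
# matching the hypothesized curvilinear (U-shaped) relationship.
best = min(density, key=density.get)
```

Whether such a "sweet spot" exists for real products is exactly the open question the slide raises.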

  4. Halstead Measure
  • Maurice Halstead of Purdue University attempted to tie physical, syntactic objects and psychological measurements together with a set of definitions:
  • n1 = number of unique operators in a program
  • n2 = number of unique operands in a program
  • N1 = total number of occurrences of the operators
  • N2 = total number of occurrences of the operands
  • and equations or relationships:
  • Volume of program: V = N * log2(n), where N = N1 + N2 and n = n1 + n2. (Volume represents the number of mental comparisons needed to write the program.)
  • Level of program: L = V* / V, where V is the volume and V* is the minimum volume needed to write the program (an estimated value, which is a possible problem).

  5. Halstead Measure (cont.)
  • Difficulty of program = 1/Level = V/V*
  • Faults of program = V/3000, based on the psychological assumption that a person can make 3000 decisions before making an error.
  • The problem with the Halstead measure is that the psychological assumptions, along with the projected numbers, have been subject to dispute. There is little supporting evidence, except possibly for the estimated total length: n1*log2(n1) + n2*log2(n2).
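These formulas are simple enough to sketch directly. The counts below are made up; in practice n1, n2, N1, N2 would come from tokenizing a real program:

```python
import math

def halstead(n1, n2, N1, N2):
    """Compute basic Halstead measures from operator/operand counts."""
    n = n1 + n2   # vocabulary: unique operators + unique operands
    N = N1 + N2   # length: total operator + operand occurrences
    volume = N * math.log2(n)                             # V = N * log2(n)
    est_length = n1 * math.log2(n1) + n2 * math.log2(n2)  # Halstead's length estimate
    est_faults = volume / 3000  # rests on the disputed 3000-decisions-per-error assumption
    return volume, est_length, est_faults

# Hypothetical counts for a small program.
V, N_hat, B = halstead(n1=10, n2=20, N1=50, N2=70)
```

Note that only the length estimate has found much empirical support; the fault estimate in particular should be treated with skepticism, per the slide.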

  6. Cyclomatic Measure
  • This measure was introduced by McCabe to count the number of unique paths to test.
  • It was initially based on graph theory, but it can also be obtained as: M = (number of binary decisions) + 1, where multi-way decision constructs such as CASE are also counted (an n-way CASE is equivalent to n-1 binary decisions).
  • This seems too simplistic a measure of complexity, since not all internal structures are considered (e.g., different types of loop structures).
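The "binary decisions + 1" formulation can be sketched for Python source by counting decision points in the AST. Exactly which constructs count as decisions is a design choice (the slide's own caveat); the set below is one plausible convention, not a standard tool:

```python
import ast

def cyclomatic(src: str) -> int:
    """Approximate McCabe's M = (# binary decisions) + 1 for Python source."""
    decisions = 0
    for node in ast.walk(ast.parse(src)):
        if isinstance(node, (ast.If, ast.While, ast.For, ast.IfExp)):
            decisions += 1
        elif isinstance(node, ast.BoolOp):
            # short-circuit and/or: n operands contribute n-1 binary decisions
            decisions += len(node.values) - 1
        elif isinstance(node, ast.Try):
            decisions += len(node.handlers)  # each except clause adds a branch
    return decisions + 1

src = """
def f(x):
    if x > 0:
        while x > 10:
            x -= 1
    elif x < -5:
        x = -x
    return x
"""
# Two if-decisions (the if and the elif) plus one while-decision: M = 3 + 1 = 4
```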

  7. Cyclomatic Measure (cont.)
  • There have been numerous studies looking for correlations between cyclomatic complexity and defect rate.
  • The results have been much more positive than for the Halstead measure.
  • Some studies associated McCabe's cyclomatic measure with inspection results and found some correlation.

  8. Other Constructs
  • Other relationships have been studied at the module level:
  • Field defects = a + b(LOC) + c(unique operands)
  • Field defects = a(if-then) + b(calls)
  • Field defects = a + b(do-while) + c(case) + d(if-then)
  • All of these showed a certain amount of significant correlation; however, the coefficients a, b, c, and d may not be generally applicable and are product-specific.
  • Complexity metrics for the various programming constructs are still not fully understood, and they are certainly far from being an agent for predicting defect rates.
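Because the coefficients are product-specific, such a model is only usable after regression against one product's own defect history. Evaluating the first model with purely hypothetical coefficients looks like:

```python
# Hypothetical, product-specific coefficients (in practice these would come
# from a regression fit on one product's historical defect data).
a, b, c = 1.2, 0.003, 0.05

def predicted_field_defects(loc: int, unique_operands: int) -> float:
    """Field defects = a + b*(LOC) + c*(unique operands), the slide's first model."""
    return a + b * loc + c * unique_operands

estimate = predicted_field_defects(loc=400, unique_operands=60)  # 1.2 + 1.2 + 3.0
```

The caution in the slide applies directly: the same a, b, c would almost certainly not transfer to a different product.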

  9. "Inter-Module" Measure
  • Given a module X:
  • Fan-in: a count of the number of modules that call module X.
  • Fan-out: a count of the number of modules called by module X.
  • From experience, one may observe that a small module X performing a common function will have a high fan-in number. The module may or may not be complex itself, but it contributes to potentially high inter-module complexity.
  • A large module X that performs multiple functions may have a high fan-out number, but again, module X itself may not be very complex.
  • A module with both a high fan-in and a high fan-out number may be a problem, with possible design flaws.
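Both counts fall out directly from a static call graph. A minimal sketch with a toy call graph as an adjacency list (the module names are made up):

```python
# Toy call graph: module -> modules it calls (hypothetical names).
calls = {
    "main":   ["parse", "report"],
    "parse":  ["util"],
    "report": ["util"],
    "util":   [],
}

# Fan-out: distinct modules called by each module.
fan_out = {m: len(set(callees)) for m, callees in calls.items()}

# Fan-in: distinct modules calling each module.
fan_in = {m: 0 for m in calls}
for caller, callees in calls.items():
    for callee in set(callees):
        fan_in[callee] += 1

# "util" plays the small common-function role from the slide:
# high fan-in (2) relative to this tiny system, fan-out 0.
```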

  10. Inter-Module Relationships
  [Diagram: fan-in shown as several modules connecting into Module X by (a) passing control and (b) passing data; fan-out shown as Module X connecting out to several modules by (a) passing control and (b) passing data.]

  11. Fan-in vs. Fan-out
  • There is an analytical difference in complexity between fan-in and fan-out.
  • There has been a positive statistical correlation between defect rate and the number of fan-outs.
  • There has also been an insignificant relationship between fan-in and defect rate.
  • This area of inter-module design complexity versus defect rates is still open to further study.
  • Strong cohesion
  • Loose coupling

  12. Further Definitions of Complexity
  • Structure complexity = (Fan-in * Fan-out)**2
  • System complexity = S + D, where:
  • S (structural complexity) = SUM(f(i)**2) / n
  • f(i) = fan-out of module i
  • n = number of modules in the system
  • D (data complexity) = SUM[v(i) / (f(i) + 1)] / n
  • v(i) = number of I/O variables in module i
  • f(i) = fan-out of module i
  • n = number of (new) modules in the system (excludes reused modules)
  • Using system complexity, the following model was found to have some positive correlation with defect rate:
  • Defect rate = 5.2 + 0.4 * (System complexity)
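These definitions compose directly. A small sketch with hypothetical per-module fan-outs and I/O-variable counts for a four-module system:

```python
# Hypothetical per-module data for a 4-module system.
fan_out = [2, 1, 1, 0]   # f(i): fan-out of module i
io_vars = [3, 2, 2, 1]   # v(i): I/O variables of module i
n = len(fan_out)

S = sum(f ** 2 for f in fan_out) / n                        # structural complexity
D = sum(v / (f + 1) for v, f in zip(io_vars, fan_out)) / n  # data complexity
system_complexity = S + D

# Model quoted on the slide; the coefficients are study-specific, not universal.
defect_rate = 5.2 + 0.4 * system_complexity
```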

  13. An IBM (Rochester Lab) Study
  • One comparative study in IBM's AS/400 organization used a 70-KLOC component, written in a PL/1-like language, that had a high error rate.
  • The following showed significant correlation with defects:
  • PTR: past error-proneness
  • DCR: number of design changes
  • Fan-out: number of calls to external routines
  • Cyclomatic: the module's internal decision complexity

  14. Some OO Metrics: the Original C-K Metrics by Chidamber and Kemerer (1994)
  • Weighted Methods per Class (WMC)
  • A weight is assigned to each method based on the perceived difficulty of implementing it.
  • WMC = SUM(wi), for i = 1, 2, ..., n methods
  • Coupling Between Objects (CBO)
  • The number of other classes to which a given class is coupled.
  • Coupling includes inheritance relationships and invocation relationships.
  • Depth of Inheritance Tree (DIT)
  • The length of the longest path from a class to the root class in the inheritance hierarchy.
  • Number of Children (NOC)
  • The number of immediate child classes that inherit from a given class.
  • Response For a Class (RFC)
  • The number of methods that can potentially be invoked in response to a message received by an object of the class.
  • Lack of Cohesion in Methods (LCOM)
  • The count of method pairs whose similarity is zero, minus the count of method pairs whose similarity is not zero (see the next slide).
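A few of these metrics can be sketched against Python's own introspection facilities. The Shape/Circle hierarchy below is hypothetical, WMC uses unit weights (i.e., a plain method count), and `object` is excluded when measuring depth:

```python
class Shape:  # hypothetical root of the hierarchy
    def area(self): ...
    def perimeter(self): ...

class Circle(Shape):
    def area(self): ...

# WMC with unit weights (wi = 1): the number of methods defined on the class.
wmc_shape = sum(1 for v in vars(Shape).values() if callable(v))

# NOC: immediate subclasses of the class.
noc_shape = len(Shape.__subclasses__())

# DIT: inheritance edges from the class up to the root, excluding `object`.
dit_circle = len(Circle.__mro__) - 2
```

CBO, RFC, and weighted (non-unit) WMC need call-site and design information that introspection alone does not provide, which is one reason C-K metrics are usually computed by dedicated static-analysis tools.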

  15. Lack of Cohesion Metric
  • Lack of cohesion metric (LCOM) of a class:
  • Let m1, ..., mn be the methods in a class.
  • Let I1, ..., In be the sets of instance variables used by the respective methods.
  • Let P be the set of method pairs with no common instance variables:
  • P = { (Ii, Ij) | Ii ∩ Ij = ∅ }
  • Let Q be the set of method pairs with common instance variables:
  • Q = { (Ii, Ij) | Ii ∩ Ij ≠ ∅ }
  • LCOM = #P - #Q if #P > #Q; LCOM = 0 otherwise.
  • Example: consider a class with 3 methods m1, m2, m3 whose respective instance-variable sets are I1 = {a, b, c, s}, I2 = {a, b, z}, and I3 = {f, h, w}. Then I1 ∩ I2 = {a, b}; I1 ∩ I3 = ∅; I2 ∩ I3 = ∅. So #P = 2, #Q = 1, and LCOM = 2 - 1 = 1.
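The worked example translates almost directly into Python sets:

```python
from itertools import combinations

# Instance-variable sets per method, from the slide's example.
ivars = {
    "m1": {"a", "b", "c", "s"},
    "m2": {"a", "b", "z"},
    "m3": {"f", "h", "w"},
}

pairs = list(combinations(ivars.values(), 2))
p = sum(1 for x, y in pairs if not (x & y))  # pairs sharing no instance variables
q = sum(1 for x, y in pairs if x & y)        # pairs sharing at least one
lcom = p - q if p > q else 0                 # LCOM = #P - #Q = 2 - 1 = 1
```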

  16. Some Early Sources for OO Metrics
  • S. R. Chidamber and C. F. Kemerer, "A Metrics Suite for Object Oriented Design," IEEE Trans. Software Engineering, vol. 20, pp. 476-493, 1994.
  • W. Li and S. Henry, "Object-Oriented Metrics that Predict Maintainability," J. Systems and Software, vol. 23, pp. 111-122, 1993.
  • R. Subramanyam and M. S. Krishnan, "Empirical Analysis of CK Metrics for Object-Oriented Design Complexity: Implications for Software Defects," IEEE Trans. Software Engineering, vol. 29, pp. 297-310, 2003.
