
Software Design Analysis of Algorithms




Presentation Transcript


  1. Software Design: Analysis of Algorithms. i206 Fall 2010. John Chuang. Some slides adapted from Glenn Brookshear, Marti Hearst, or Goodrich & Tamassia.

  2. [Course concept map: the i206 topic landscape, from Data Representation (bits & bytes, binary/decimal/hexadecimal numbers, data compression, info entropy & Huffman codes), Boolean Logic (AND, OR, NOT, XOR, NAND, NOR; truth tables, Venn diagrams, DeMorgan's Law), and Circuits (gates, adders, decoders, memory latches, ALUs), up through Machine Instructions (op-codes, operands, instruction set architecture), CPU and the memory hierarchy (registers, cache, main memory, secondary storage), Operating System (processes vs. threads, context switches, locks and deadlocks, I/O), Program Analysis (algorithms, data structures such as stacks, queues, maps, trees, and graphs; searching, sorting, Big-O; compilers/interpreters), Networking (TCP/IP, sockets, standards & protocols), Security (confidentiality, integrity, authentication; cryptography, RSA), and Distributed Systems (client/server, P2P, caching). This lecture covers the Analysis of Algorithms node.]

  3. Traveling Salesman Problem http://xkcd.com/399/

  4. Which Algorithm is Better? What Do We Mean by "Better"?
  Sorting algorithms: bubble sort, insertion sort, shell sort, merge sort, heapsort, quicksort, radix sort, …
  Search algorithms: linear search, binary search, breadth-first search, depth-first search, …

  5. Analysis • Characterizing the running times of algorithms and data structure operations • Secondarily, characterizing the space usage of algorithms and data structures

  6. Running Time • In general, running time increases with input size n • Running time is also affected by: • Hardware environment (processor, clock rate, memory, disk, etc.) • Software environment (operating system, programming language, compiler, interpreter, etc.)

  7. Quantifying Running Time • Experimentally measure running time • Requires a full implementation and execution • Alternatively, analyze the pseudo-code

  8. Pseudo-code Analysis Example
  Algorithm arrayMax(A, n):
    Input: an array A storing n >= 1 integers
    Output: the maximum element in A
    currentMax = A[0]
    for i = 1 to n-1 do
      if currentMax < A[i] then
        currentMax = A[i]
    return currentMax
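As a runnable counterpart to the pseudo-code above, here is a minimal Python sketch (the function name array_max is my own, not from the slides):

```python
def array_max(A):
    """Return the maximum element of a non-empty array A (n >= 1)."""
    current_max = A[0]
    for i in range(1, len(A)):  # i = 1 to n-1
        if current_max < A[i]:
            current_max = A[i]
    return current_max
```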

  9. The Seven Common Functions • Constant • Linear • Quadratic • Cubic • Exponential • Logarithmic • n-log-n • (Constant, linear, quadratic, and cubic are all polynomial functions)

  10. The Seven Common Functions • Constant: f(n) = c • E.g., adding two numbers, assigning a value to some variable, comparing two numbers, and other basic operations • Linear: f(n) = n • Do a single basic operation for each of n elements, e.g., reading n elements into computer memory, comparing a number x to each element of an array of size n

  11. The Seven Common Functions • Quadratic: f(n) = n^2 • E.g., nested loops with inner and outer loops over n elements • E.g., the insertion sort algorithm • Cubic: f(n) = n^3 • More generally, all of the above are polynomial functions: • f(n) = a_0 + a_1·n + a_2·n^2 + a_3·n^3 + … + a_d·n^d • where the a_i's are constants, called the coefficients of the polynomial, and d is its degree (Brookshear Figure 5.19)
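To make the quadratic example concrete, here is a Python sketch of insertion sort in one standard formulation (the course follows Brookshear's version, which may differ in details):

```python
def insertion_sort(A):
    """Sort list A in place; the nested loops give O(n^2) worst-case time."""
    for i in range(1, len(A)):       # outer loop: n-1 passes
        key = A[i]
        j = i - 1
        # inner loop: shift larger elements right to open a slot for key
        while j >= 0 and A[j] > key:
            A[j + 1] = A[j]
            j -= 1
        A[j + 1] = key
    return A
```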

  12. The Seven Common Functions • Exponential: f(n) = b^n • where b is a positive constant, called the base, and the argument n is the exponent • In algorithm analysis, the most common base is b = 2 • E.g., a loop that starts with one operation and doubles the number of operations with each iteration • The total work is the geometric sum: • 1 + 2 + 4 + 8 + … + 2^(n-1) = 2^n - 1 • BTW, 2^n - 1 is the largest unsigned integer that can be represented in binary notation using n bits
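Both the geometric-sum identity and the n-bit claim are easy to sanity-check in Python:

```python
# Check 1 + 2 + 4 + ... + 2^(n-1) = 2^n - 1, and that 2^n - 1
# (a string of n ones in binary) is the largest n-bit unsigned value.
for n in range(1, 21):
    assert sum(2 ** k for k in range(n)) == 2 ** n - 1
    assert int('1' * n, 2) == 2 ** n - 1
```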

  13. The Seven Common Functions • Logarithmic: f(n) = log_b n • Intuition: the number of times we can divide n by b until we get a number less than or equal to 1 • E.g., the binary search algorithm has a logarithmic running time • The base is omitted for the case b = 2, which is the most common in computing: • log n = log_2 n (Brookshear Figure 5.20)
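The "repeated division" intuition can be expressed directly in code (division_count is a hypothetical helper name, not from the slides):

```python
def division_count(n, b=2):
    """Count how many times n can be divided by b before the result
    is <= 1: the intuition behind log_b(n)."""
    count = 0
    while n > 1:
        n = n / b
        count += 1
    return count
```

For exact powers of b this equals log_b(n); otherwise it rounds up, e.g. division_count(1024) is 10.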

  14. The Seven Common Functions • n-log-n function: f(n) = n log n • The product of the linear and logarithmic functions • E.g., the quicksort algorithm has n log n running time on average (but n^2 in the worst case)
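As an illustration, a short Python sketch of quicksort, using a simple list-comprehension formulation with a random pivot (not necessarily the variant the course presents):

```python
import random

def quicksort(A):
    """Average-case O(n log n); worst case O(n^2) under unlucky pivots."""
    if len(A) <= 1:
        return A
    pivot = random.choice(A)
    less    = [x for x in A if x < pivot]
    equal   = [x for x in A if x == pivot]
    greater = [x for x in A if x > pivot]
    return quicksort(less) + equal + quicksort(greater)
```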

  15. Comparing Growth Rates • 1 < log n < n^(1/2) < n < n log n < n^2 < n^3 < b^n http://www.cs.pomona.edu/~marshall/courses/2002/spring/cs50/BigO/

  16. Polynomials • Only the dominant terms of a polynomial matter in the long run. Lower-order terms fade to insignificance as the problem size increases. http://www.cs.pomona.edu/~marshall/courses/2002/spring/cs50/BigO/

  17. Example: Counting Primitive Operations
  Algorithm arrayMax(A, n):
    Input: an array A storing n >= 1 integers
    Output: the maximum element in A
    currentMax = A[0]            (two operations: index into array, assign value to variable)
    for (i=1; i<=n-1; i++)       (i=1: one assign; each loop test: subtract and compare, repeated n times)
      if currentMax < A[i] then  (index and compare)
        currentMax = A[i]        (index and assign, only when the test succeeds)
    return currentMax            (one operation: return value of variable)
  The loop body costs four or six operations, repeated (n-1) times: index and compare (for the if statement), index and assign (for the then statement, if necessary), and addition and assign (for the increment).
  Total: 2 + 1 + 2n + 4(n-1) + 1 = 6n (best case)
  Total: 2 + 1 + 2n + 6(n-1) + 1 = 8n - 2 (worst case)
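The counting model above can be checked by instrumenting the loop. The helper below is my own construction mirroring the slide's per-statement costs; it reproduces the 6n and 8n - 2 totals:

```python
def array_max_op_count(A):
    """arrayMax instrumented with the slide's operation-counting model."""
    n = len(A)
    ops = 2          # currentMax = A[0]: index + assign
    ops += 1         # i = 1: assign
    current_max = A[0]
    for i in range(1, n):
        ops += 2     # loop test i <= n-1: subtract + compare
        ops += 2     # if test: index A[i] + compare
        if current_max < A[i]:
            current_max = A[i]
            ops += 2 # then branch taken: index + assign
        ops += 2     # i++: add + assign
    ops += 2         # final (failing) loop test
    ops += 1         # return
    return ops

n = 10
assert array_max_op_count(list(range(n))) == 8 * n - 2      # ascending: worst case
assert array_max_op_count(list(range(n, 0, -1))) == 6 * n   # max first: best case
```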

  18. Big-O Notation • Asymptotic analysis: behavior as n becomes large and approaches infinity • Let f(n) and g(n) be functions mapping non-negative integers to real numbers • Then f(n) is O(g(n)), or "f(n) is big-Oh of g(n)", • if there is a real constant c > 0 and an integer constant n_0 >= 1 such that • f(n) <= c·g(n) for all n >= n_0 • Example: the function f(n) = 8n - 2 is O(n) • The running time of algorithm arrayMax is O(n)
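The definition can be checked numerically for the example f(n) = 8n - 2 by exhibiting witness constants (a finite spot-check over a range, not a proof):

```python
# f(n) = 8n - 2 is O(n): witnesses c = 8 and n0 = 1 satisfy
# f(n) <= c * n for every n >= n0 (spot-checked below).
c, n0 = 8, 1
assert all(8 * n - 2 <= c * n for n in range(n0, 100_000))
```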

  19. Example 1: Search • Given: • A physical phone book • Organized in alphabetical order • A name you want to look up • An algorithm in which you search through the book sequentially, from first page to last: Sequential Search or Linear Search • What is the order of: • The best case running time? • The worst case running time? • The average case running time? • What is: • A better algorithm? • The worst case running time for this algorithm?

  20. Example 1: Search • This better algorithm is called Binary Search • What is its running time? • First you look in the middle of n elements • Then you look in the middle of n/2 = ½·n elements • Then you look in the middle of ½ · ½·n elements • … • Continue until you find the target element, or until there is only 1 element left • Say you did this m times: ½ · ½ · … · ½ · n = (½)^m · n • Then the number of repetitions is the smallest integer m such that (½)^m · n <= 1

  21. Binary Search Pseudo-code (Brookshear Figure 5.14)
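Since the pseudo-code itself lives in the Brookshear figure, here is an iterative Python sketch of binary search, instrumented to count loop iterations so the logarithmic behavior is visible:

```python
from math import ceil, log2

def binary_search(A, target):
    """Binary search over sorted list A.
    Returns (index or -1, number of loop iterations)."""
    low, high = 0, len(A) - 1
    steps = 0
    while low <= high:
        steps += 1
        mid = (low + high) // 2   # look in the middle of the remaining range
        if A[mid] == target:
            return mid, steps
        elif A[mid] < target:
            low = mid + 1         # discard the left half
        else:
            high = mid - 1        # discard the right half
    return -1, steps

A = list(range(1000))
idx, steps = binary_search(A, 999)
assert idx == 999 and steps <= ceil(log2(len(A))) + 1
```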

  22. Analyzing Binary Search • In the worst case, the number of repetitions is the smallest integer m such that (½)^m · n <= 1 • We can rewrite this as follows: • Multiply both sides by 2^m: n <= 2^m • Take the log of both sides: log_2 n <= m • Since m is the worst case number of repetitions, the algorithm is O(log n)

  23. Example 2: Insertion Sort is O(n^2) (Brookshear Figures 5.10, 5.11, and 5.19)

  24. Example 3: Prefix Average • Compute the running average of a sequence of numbers: A[i] = (X[0] + X[1] + … + X[i]) / (i+1)
  Input X:  5    10    15    20    25    30
  Output A: 5/1  15/2  30/3  50/4  75/5  105/6
  There are two straightforward algorithms: the first is easy but wasteful; the second is more efficient, but requires insight into the problem.

  25. 1st Algorithm
  Algorithm prefixAverages1(X):
    Input: an n-element array X of numbers
    Output: an n-element array A of numbers such that A[i] is the average of elements X[0], …, X[i]
    Let A be an array of n numbers
    for i ← 0 to n-1 do
      a ← 0
      for j ← 0 to i do
        a ← a + X[j]
      A[i] ← a / (i+1)
    return array A
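A direct Python rendering of prefixAverages1 (the snake_case name is mine), checked against the running example:

```python
def prefix_averages1(X):
    """Quadratic algorithm: recompute each prefix sum from scratch."""
    n = len(X)
    A = [0.0] * n
    for i in range(n):
        a = 0
        for j in range(i + 1):   # O(i) work inside an O(n) loop -> O(n^2)
            a += X[j]
        A[i] = a / (i + 1)
    return A

assert prefix_averages1([5, 10, 15, 20, 25, 30]) == [5.0, 7.5, 10.0, 12.5, 15.0, 17.5]
```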

  26. 2nd Algorithm
  Algorithm prefixAverages2(X):
    Input: an n-element array X of numbers
    Output: an n-element array A of numbers such that A[i] is the average of elements X[0], …, X[i]
    Let A be an array of n numbers
    s ← 0
    for i ← 0 to n-1 do
      s ← s + X[i]
      A[i] ← s / (i+1)
    return array A
  A useful tool: store intermediate results in a variable! This uses space to save time. The key: keep s as the undivided running sum, dividing only the value stored in A[i]. Eliminating a for loop is always a good thing to do.
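The linear-time version in Python (again, my naming), producing the same output with a single loop:

```python
def prefix_averages2(X):
    """Linear algorithm: carry the running sum s across iterations."""
    A = [0.0] * len(X)
    s = 0
    for i in range(len(X)):
        s += X[i]            # reuse prior work instead of re-summing
        A[i] = s / (i + 1)
    return A

assert prefix_averages2([5, 10, 15, 20, 25, 30]) == [5.0, 7.5, 10.0, 12.5, 15.0, 17.5]
```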

  27. Summary: Analysis of Algorithms • A method for determining the asymptotic running time of an algorithm • Here asymptotic means "as n gets very large" • Useful for comparing algorithms • Useful also for determining tractability • E.g., exponential-time algorithms are usually intractable

  28. Traveling Salesman Problem http://xkcd.com/399/
