Divide and Conquer
Recall
• Divide the problem into a number of sub-problems that are smaller instances of the same problem.
• Conquer the sub-problems by solving them recursively. If the sub-problem sizes are small enough, however, just solve the sub-problems in a straightforward manner.
• Combine the solutions to the sub-problems into the solution for the original problem.
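The three steps can be illustrated with merge sort (a minimal Python sketch, not the textbook pseudocode):

```python
def merge_sort(a):
    """Divide-and-conquer sort: divide, conquer recursively, combine."""
    if len(a) <= 1:                  # small enough: solve directly
        return a
    mid = len(a) // 2                # divide into two smaller instances
    left = merge_sort(a[:mid])       # conquer each half recursively
    right = merge_sort(a[mid:])
    merged, i, j = [], 0, 0          # combine: merge the sorted halves
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```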
Recurrences
• A recurrence is an equation or inequality that describes a function in terms of its value on smaller inputs.
• For example, the running time of MERGE-SORT is defined by the recurrence T(n) = 2T(n/2) + Θ(n).
• We will study three methods for solving recurrences, that is, for obtaining asymptotic Θ or O bounds on the solution:
• Substitution method
• Recursion-tree method
• Master method
Strassen's algorithm for matrix multiplication
• If A = (aij) and B = (bij) are square n x n matrices, then in the product C = A x B we define the entries cij, for i, j = 1, 2, ..., n, by
cij = Σ_{k=1..n} aik · bkj
• We must compute n^2 matrix entries, and each is the sum of n values.
Matrix multiplication procedure
• For two square n x n matrices, the straightforward procedure computes their product with three nested loops, one for each of i, j, and k.
• Time complexity: Θ(n^3).
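The procedure itself did not survive the slide export; the standard triple loop can be sketched as follows (assuming matrices are given as lists of lists):

```python
def square_matrix_multiply(A, B):
    """Naive Theta(n^3) product of two n x n matrices."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            # each entry c_ij is the sum of n products a_ik * b_kj
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
    return C
```

The three nested loops over n indices each are what give the Θ(n^3) running time.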
Analysis
• T(1) = Θ(1)
• In lines 6–9, we recursively call SQUARE-MATRIX-MULTIPLY-RECURSIVE a total of eight times, so the time taken by all eight recursive calls is 8T(n/2).
• We also must account for the four matrix additions in lines 6–9. Each of these matrices contains n^2/4 entries, so each of the four matrix additions takes Θ(n^2) time.
• This gives the recurrence T(n) = 8T(n/2) + Θ(n^2), whose solution is T(n) = Θ(n^3).
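A Python sketch of the divide-and-conquer version (assuming n is a power of 2, and copying quadrants for simplicity rather than indexing sub-ranges in place as the textbook procedure does):

```python
def smm_recursive(A, B):
    """Recursive n x n product: eight recursive calls on n/2 x n/2
    quadrants plus four quadrant additions, i.e. T(n) = 8T(n/2) + Theta(n^2).
    Assumes n is a power of 2."""
    n = len(A)
    if n == 1:                                  # base case: scalar product
        return [[A[0][0] * B[0][0]]]
    h = n // 2
    quad = lambda M, r, c: [row[c:c + h] for row in M[r:r + h]]
    add = lambda X, Y: [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]
    A11, A12, A21, A22 = quad(A, 0, 0), quad(A, 0, h), quad(A, h, 0), quad(A, h, h)
    B11, B12, B21, B22 = quad(B, 0, 0), quad(B, 0, h), quad(B, h, 0), quad(B, h, h)
    # the eight recursive calls and four additions
    C11 = add(smm_recursive(A11, B11), smm_recursive(A12, B21))
    C12 = add(smm_recursive(A11, B12), smm_recursive(A12, B22))
    C21 = add(smm_recursive(A21, B11), smm_recursive(A22, B21))
    C22 = add(smm_recursive(A21, B12), smm_recursive(A22, B22))
    return ([r1 + r2 for r1, r2 in zip(C11, C12)] +
            [r1 + r2 for r1, r2 in zip(C21, C22)])
```

The eight recursive calls and the Θ(n^2) quadrant additions from the analysis are visible directly in the code.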
Algorithm (1)
• Strassen's method has four steps:
• Step 1: Divide each of A, B, and C into four n/2 x n/2 submatrices.
• Step 2: Create 10 matrices S1, ..., S10, each n/2 x n/2, as sums or differences of submatrices of A and B (Θ(n^2) time).
• Step 3: Recursively compute seven matrix products P1, ..., P7, each n/2 x n/2.
• Step 4: Compute the quadrants of C by adding and subtracting combinations of the Pi (Θ(n^2) time).
• This gives the recurrence T(n) = 7T(n/2) + Θ(n^2), whose solution is T(n) = Θ(n^lg 7) = O(n^2.81).
Solving Recurrences
We have three methods to solve recurrence equations:
• Substitution method
• Recursion-tree method
• Master method
Substitution Method
The substitution method for solving recurrences comprises two steps:
• Guess the form of the solution.
• Use mathematical induction to find the constants and show that the solution works.
Example
Consider the recurrence T(n) = 2T(floor(n/2)) + n.
• We guess that the solution is T(n) = O(n lg n).
• The substitution method requires us to prove that T(n) ≤ cn lg n for some constant c > 0.
• We start by assuming that this bound holds for all positive m < n, in particular for m = floor(n/2).
• This gives T(floor(n/2)) ≤ c floor(n/2) lg(floor(n/2)).
• Substituting into the recurrence yields
T(n) ≤ 2(c floor(n/2) lg(floor(n/2))) + n
≤ cn lg(n/2) + n
= cn lg n − cn lg 2 + n
= cn lg n − cn + n
≤ cn lg n, for c ≥ 1.
Example - continued
Now we must show that the solution holds for the boundary conditions.
• Let us assume, for the sake of argument, that T(1) = 1.
• For n = 1 the bound fails: T(1) = 1, but c · 1 · lg 1 = 0.
• We only need the bound to hold for all n ≥ some n0; for n > 3, the recurrence does not depend directly on T(1).
• This is the distinction between the base case of the recurrence and the base case of the induction.
• We derive from the recurrence that T(2) = 4 and T(3) = 5.
• We can complete the inductive proof that T(n) ≤ cn lg n by choosing c large enough that T(2) ≤ c · 2 lg 2 and T(3) ≤ c · 3 lg 3.
• Any c ≥ 2 suffices.
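A quick numeric sanity check of the guess (a sketch assuming T(1) = 1 and the floor form of the recurrence, with c = 2 as above):

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(1) = 1, T(n) = 2*T(floor(n/2)) + n."""
    return 1 if n == 1 else 2 * T(n // 2) + n

# the bound T(n) <= 2 * n * lg n (c = 2) should hold for all n >= 2
assert all(T(n) <= 2 * n * math.log2(n) for n in range(2, 1025))
```

Note that the bound is tight at the base case: T(2) = 4 = 2 · 2 lg 2, which is why c = 2 is needed.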
Another Example
• Make sure you prove the same exact form of the inductive hypothesis when doing a substitution proof.
• Consider the recurrence T(n) = 8T(n/2) + Θ(n^2).
• For an upper bound: T(n) ≤ 8T(n/2) + cn^2. Guess: T(n) ≤ dn^3. Then
T(n) ≤ 8d(n/2)^3 + cn^2
= 8d(n^3/8) + cn^2
= dn^3 + cn^2,
which is not ≤ dn^3; the guess doesn't work!
Another Example - continued
• Remedy: subtract off a lower-order term. Guess: T(n) ≤ dn^3 − d'n^2. Then
T(n) ≤ 8(d(n/2)^3 − d'(n/2)^2) + cn^2
= 8d(n^3/8) − 8d'(n^2/4) + cn^2
= dn^3 − 2d'n^2 + cn^2
= dn^3 − d'n^2 − d'n^2 + cn^2
≤ dn^3 − d'n^2, if −d'n^2 + cn^2 ≤ 0, i.e. d' ≥ c.
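Taking the Θ(n^2) term to be exactly n^2 (c = 1) and T(1) = 1 (both assumptions for the sketch), the strengthened guess with d = 2, d' = 1 is not just a bound but exact for this instance, which a short check confirms:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(1) = 1, T(n) = 8*T(n/2) + n^2, for n a power of 2."""
    return 1 if n == 1 else 8 * T(n // 2) + n * n

# with c = 1, the guess d*n^3 - d'*n^2 holds with equality for d = 2, d' = 1
assert all(T(2**k) == 2 * (2**k)**3 - (2**k)**2 for k in range(11))
```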
Yet another example
• T(n) = cn + 3T(2n/3)
• How about F(n) = n lg n? Substituting kF(n) into the right-hand side:
cn + 3kF(2n/3)
= cn + 3k(2n/3) lg(2n/3)
= cn + 2kn lg n + 2kn lg(2/3)
= cn + 2kn lg n − 2kn lg(3/2)
• There is no way to choose k to make the left side, kn lg n, dominate the 2kn lg n term on the right.
• Therefore, n lg n is not the correct guess.
Yet another example - continued
• Try a higher order of growth, like n^2 or n^3, but which one? Try n^x in general.
• We can solve for the correct exponent x by plugging in kn^x:
cn + 3T(2n/3) becomes cn + 3k(2/3)^x n^x.
• This is asymptotically less than kn^x as long as 3(2/3)^x < 1, which requires x > log_{3/2} 3.
• Let a = log_{3/2} 3; then our algorithm is O(n^(a+ε)) for any positive ε.
• Let's try O(n^a) itself. The RHS after substituting kn^a is
cn + 3(2/3)^a kn^a = cn + kn^a ≥ kn^a.
• This tells us that kn^a is an asymptotic lower bound on T(n): T(n) is Ω(n^a). So the complexity is somewhere between Ω(n^a) and O(n^(a+ε)). It is in fact Θ(n^a).
• To show the upper bound, we will try F(n) = n^a + bn, where b is a constant to be filled in later.
Yet another example - continued
• The idea is to pick b so that the bn term compensates for the cn term that shows up in the recurrence.
• Because bn is O(n^a), showing T(n) is O(n^a + bn) is the same as showing that it is O(n^a). Substituting kF(n) for T(n) in the RHS of the recurrence, we obtain:
cn + 3kF(2n/3)
= cn + 3k((2n/3)^a + b(2n/3))
= cn + 3k(2n/3)^a + 3kb(2n/3)
= cn + kn^a + 2kbn
= kn^a + (2kb + c)n
• The substituted LHS of the recurrence is kn^a + kbn, which is at least kn^a + (2kb + c)n as long as kb ≥ 2kb + c, i.e. b ≤ −c/k. There is no requirement that b be positive, so choosing k = 1 and b = −c satisfies the recurrence.
• Therefore T(n) = O(n^a + bn) = O(n^a), and since T(n) is both O(n^a) and Ω(n^a), it is Θ(n^a).
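A numeric sketch supports the Θ(n^a) conclusion (assuming c = 1, T(1) = 1, and floor(2n/3) for integer arguments, all assumptions for illustration): the ratio T(n)/n^a stays within a constant-factor band.

```python
import math
from functools import lru_cache

A = math.log(3, 1.5)   # a = log_{3/2} 3, approximately 2.71

@lru_cache(maxsize=None)
def T(n):
    """T(1) = 1, T(n) = n + 3*T(floor(2n/3)), taking c = 1."""
    return 1 if n == 1 else n + 3 * T(2 * n // 3)

# for Theta(n^a), the ratio T(n)/n^a must stay within a constant-factor band
# (the floors make it oscillate, but it neither vanishes nor blows up)
assert all(0.02 < T(n) / n**A < 20 for n in range(2, 2001))
```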
Substitution method - warning
• Be careful when using asymptotic notation.
• A false proof for the recurrence T(n) = 4T(n/4) + n, claiming T(n) = O(n):
T(n) ≤ 4(c(n/4)) + n = cn + n = O(n)   wrong!
• Because we haven't proven the exact form of the inductive hypothesis (which is that T(n) ≤ cn), this proof is false: cn + n is not ≤ cn.
Recursion tree method
• Use the tree to generate a guess; then verify by the substitution method.
• T(n) = T(n/3) + T(2n/3) + Θ(n).
• For an upper bound, rewrite as T(n) ≤ T(n/3) + T(2n/3) + cn;
• for a lower bound, as T(n) ≥ T(n/3) + T(2n/3) + cn.
• By summing across each level, the recursion tree shows the cost at each level of recursion (excluding the costs of recursive calls, which appear in subtrees).
Recursion tree method
• There are log_3 n full levels, and after log_{3/2} n levels the problem size is down to 1.
• Each level contributes ≤ cn.
• Lower-bound guess: T(n) ≥ dn log_3 n = Ω(n lg n) for some positive constant d.
• Upper-bound guess: T(n) ≤ dn log_{3/2} n = O(n lg n) for some positive constant d.
• Then prove by substitution.
Recursion tree method
Upper bound: Guess T(n) ≤ dn lg n.
Substitution:
T(n) ≤ T(n/3) + T(2n/3) + cn
≤ d(n/3) lg(n/3) + d(2n/3) lg(2n/3) + cn
= (d(n/3) lg n − d(n/3) lg 3) + (d(2n/3) lg n − d(2n/3) lg(3/2)) + cn
= dn lg n − d((n/3) lg 3 + (2n/3) lg(3/2)) + cn
= dn lg n − d((n/3) lg 3 + (2n/3) lg 3 − (2n/3) lg 2) + cn
= dn lg n − dn(lg 3 − 2/3) + cn
≤ dn lg n, if −dn(lg 3 − 2/3) + cn ≤ 0, i.e. d ≥ c / (lg 3 − 2/3).
Lower bound?
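A numeric check of the n lg n guess (a sketch assuming a cost of exactly n per call, floored sub-problem sizes, and unit base cases):

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = T(floor(n/3)) + T(floor(2n/3)) + n, with base T(1) = T(2) = 1."""
    return 1 if n <= 2 else T(n // 3) + T(2 * n // 3) + n

# for Theta(n lg n), the ratio T(n)/(n lg n) must stay within a constant band
assert all(0.2 < T(n) / (n * math.log2(n)) < 3 for n in range(4, 3001))
```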
Another example
• T(n) = 3T(n/4) + cn^2
Another example
• The sub-problem size for a node at depth i is n/4^i.
• Thus, the sub-problem size hits n = 1 when n/4^i = 1 or, equivalently, when i = log_4 n.
• Thus, the tree has log_4 n + 1 levels (at depths 0, 1, 2, ..., log_4 n).
• The number of nodes at depth i is 3^i.
• Each node at depth i, for i = 0, 1, 2, ..., log_4 n − 1, has a cost of c(n/4^i)^2.
• The total cost at depth i is (3/16)^i cn^2.
• The bottom level, at depth log_4 n, has 3^(log_4 n) = n^(log_4 3) nodes, each with cost T(1), for a total cost of Θ(n^(log_4 3)).
Another example
• Summing the costs over all levels:
T(n) = Σ_{i=0..log_4 n − 1} (3/16)^i cn^2 + Θ(n^(log_4 3))
• Taking advantage of a decreasing geometric series:
T(n) < Σ_{i=0..∞} (3/16)^i cn^2 + Θ(n^(log_4 3))
= (1 / (1 − 3/16)) cn^2 + Θ(n^(log_4 3))
= (16/13) cn^2 + Θ(n^(log_4 3))
= O(n^2)
Master Method
• Used for many divide-and-conquer recurrences of the form
T(n) = aT(n/b) + f(n),
where a ≥ 1, b > 1, and f(n) is asymptotically positive.
• Based on the master theorem, which compares f(n) against n^(log_b a).
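A small sketch applying that comparison to the recurrences from this lecture (the case classifications below are the standard master-theorem results):

```python
import math

def critical_exponent(a, b):
    """The master theorem compares f(n) against n^(log_b a);
    this returns the exponent log_b a."""
    return math.log2(a) / math.log2(b)

# T(n) = 8T(n/2) + Theta(n^2): log_2 8 = 3 > 2, so f is polynomially
# smaller and T(n) = Theta(n^3)
assert critical_exponent(8, 2) == 3.0

# T(n) = 3T(n/4) + Theta(n^2): log_4 3 < 2, so f dominates
# and T(n) = Theta(n^2)
assert critical_exponent(3, 4) < 2

# MERGE-SORT, T(n) = 2T(n/2) + Theta(n): log_2 2 = 1, f matches,
# so T(n) = Theta(n lg n)
assert critical_exponent(2, 2) == 1.0
```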