
Karmarkar Algorithm


Presentation Transcript


  1. Karmarkar Algorithm Anup Das, Aissan Dalvandi, Narmada Sambaturu, SeungMin, and Le Tuan Anh

  2. Contents • Overview • Projective transformation • Orthogonal projection • Complexity analysis • Transformation to Karmarkar’s canonical form

  3. LP Solutions • Simplex • Dantzig 1947 • Ellipsoid • Khachiyan 1979 • Interior Point • Affine Method 1967 • Log Barrier Method 1977 • Projective Method 1984 • Narendra Karmarkar from AT&T Bell Labs

  4. Linear Programming • General LP: Minimize $c^T x$ subject to $Ax \ge b$, $x \ge 0$ • Karmarkar’s Canonical Form: Minimize $c^T x$ subject to $Ax = 0$, $e^T x = 1$, $x \ge 0$ • Minimum value of $c^T x$ is 0

  5. Karmarkar’s Algorithm • Step 1: Take the initial point $x^{(0)} = e/n$, the center of the simplex • Step 2: While $c^T x^{(k)} > \epsilon$ • 2.1 Transformation: apply $T_{x^{(k)}}$ such that $T_{x^{(k)}}(x^{(k)})$ is the center of the simplex. This gives us the LP problem in the transformed space • 2.2 Orthogonal projection and movement: Project the objective onto the feasible region and move in the steepest descent direction. This gives us the point $y^{(k+1)}$ • 2.3 Inverse transform: $x^{(k+1)} = T_{x^{(k)}}^{-1}(y^{(k+1)})$. This gives us $x^{(k+1)}$ • 2.4 Use $x^{(k+1)}$ as the start point of the next iteration • Set $k \leftarrow k + 1$

  6. Step 2.1 • Transformation: apply $T_{x^{(k)}}$ such that $T_{x^{(k)}}(x^{(k)})$ is the center of the simplex. Why? Centering the current iterate lets us take a large step inside the inscribed sphere without leaving the feasible region.

  7. Step 2.2 • Projection and movement: Project the objective onto the feasible region and move in the steepest descent direction. Why? Moving along the projected direction keeps the iterate feasible while decreasing the objective as fast as possible. (Figure: projection of the objective vector onto the feasible region.)

  8. Big Picture

  9. Karmarkar’s Algorithm (recap of the steps from slide 5; the next slides detail step 2.1, the projective transformation)

  10. Projective Transformation • Transformation $T_{x^{(k)}}$ such that $T_{x^{(k)}}(x^{(k)})$ is the center of the simplex. (Figure: the transform maps the current iterate to the center.)

  11. Projective transformation • We define the transform such that $T_{x^{(k)}}(x^{(k)}) = e/n$ • We transform every other point with respect to the point $x^{(k)}$. Mathematically: • Defining $D = \mathrm{diag}(x^{(k)}_1, \ldots, x^{(k)}_n)$ • projective transformation: $y = T_{x^{(k)}}(x) = \dfrac{D^{-1}x}{e^T D^{-1}x}$ • inverse transformation: $x = T_{x^{(k)}}^{-1}(y) = \dfrac{Dy}{e^T Dy}$
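A minimal NumPy sketch of this transformation and its inverse; the function names are illustrative and the vectors are assumed to lie on the simplex with strictly positive entries.

import numpy as np

def projective_transform(x, x_k):
    # y = D^{-1} x / (e^T D^{-1} x) with D = diag(x_k); x_k itself is mapped to the center e/n.
    y = x / x_k
    return y / y.sum()

def inverse_transform(y, x_k):
    # x = D y / (e^T D y): map a point of the transformed space back to the original simplex.
    x = x_k * y
    return x / x.sum()

x_k = np.array([0.5, 0.3, 0.2])
print(projective_transform(x_k, x_k))          # the center [1/3, 1/3, 1/3]
print(inverse_transform(np.ones(3) / 3, x_k))  # back to x_k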

  12. Problem transformation • Applying the projective transformation $y = \dfrac{D^{-1}x}{e^T D^{-1}x}$ (with inverse $x = \dfrac{Dy}{e^T Dy}$) to: Minimize $c^T x$ s.t. $Ax = 0$, $e^T x = 1$, $x \ge 0$ gives the transformed problem: Minimize $\dfrac{c^T Dy}{e^T Dy}$ s.t. $ADy = 0$, $e^T y = 1$, $y \ge 0$ • The new objective function is not a linear function.

  13. Problem transformation • Since the minimum value of $c^T x$ is 0 and $e^T Dy > 0$, minimizing $\dfrac{c^T Dy}{e^T Dy}$ is equivalent to minimizing the linear objective: Minimize $(Dc)^T y$ s.t. $ADy = 0$, $e^T y = 1$, $y \ge 0$
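The linearized problem can be assembled directly from the current iterate; a small sketch under the assumption that the data are already in canonical form (names are illustrative).

import numpy as np

def transformed_problem(A, c, x_k):
    # Data of the transformed LP: minimize (Dc)^T y  s.t.  (AD) y = 0, e^T y = 1, y >= 0.
    D = np.diag(x_k)
    A_tilde = A @ D   # constraints in the transformed space
    c_tilde = D @ c   # linear objective, valid because min c^T x = 0 and e^T D y > 0
    return A_tilde, c_tilde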

  14. Karmarkar’s Algorithm (recap of the steps from slide 5; the next slides detail step 2.2, the orthogonal projection and movement)

  15. Orthogonal Projection • We solve: Minimize $(Dc)^T y$ s.t. $ADy = 0$, $e^T y = 1$, $y \ge 0$ • Projecting $Dc$ onto the null space of $B = \begin{bmatrix} AD \\ e^T \end{bmatrix}$ • projection: $c_p = \left(I - B^T(BB^T)^{-1}B\right)Dc$ • normalization: $\hat{c}_p = \dfrac{c_p}{\lVert c_p\rVert}$

  16. Orthogonal Projection • Given a plane and a point outside the plane, which direction should we move from the point to guarantee the fastest descent towards the plane? • Answer: the perpendicular direction • Consider a plane spanned by two vectors $a_1, a_2$ and a vector $b$ outside the plane

  17. Orthogonal Projection • Let $a_1$ and $a_2$ be the basis of the plane ($A = [a_1 \; a_2]$) • The plane is spanned by the column space of $A$ • The projection of $b$ onto the plane is $p = A\hat{x}$, so the error is $e = b - A\hat{x}$ • $e$ is perpendicular to $a_1$ and $a_2$: $A^T(b - A\hat{x}) = 0$

  18. Orthogonal Projection • Solving $A^T(b - A\hat{x}) = 0$ gives $\hat{x} = (A^TA)^{-1}A^Tb$, so $P = A(A^TA)^{-1}A^T$ is the projection matrix with respect to the column space of $A$ • We need to consider the vector $e = b - Pb$ • Remember $e$ is perpendicular to the column space of $A$ • $I - A(A^TA)^{-1}A^T$ is the projection matrix with respect to the orthogonal complement (the null space of $A^T$)
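A small numerical check of the projection matrix derived on slides 16-18 (illustrative values only): $P = A(A^TA)^{-1}A^T$ projects onto the column space of $A$, and the error $b - Pb$ is orthogonal to it.

import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])        # columns a1, a2 span a plane in R^3
b = np.array([1.0, 2.0, 5.0])     # a vector outside that plane

P = A @ np.linalg.inv(A.T @ A) @ A.T
p = P @ b                         # projection of b onto the plane
e = b - p                         # error component

print(np.allclose(A.T @ e, 0.0))  # True: e is perpendicular to a1 and a2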

  19. Orthogonal Projection • What is the direction of movement? • Steepest descent $= -\hat{c}_p$, the negative of the normalized projection of the transformed objective • We got the direction of the movement by projecting $Dc$ onto the null space of $B = \begin{bmatrix} AD \\ e^T \end{bmatrix}$ • Projection: $c_p = \left(I - B^T(BB^T)^{-1}B\right)Dc$
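A sketch of this projection step in NumPy, assuming $AD$ has full row rank so that $BB^T$ is invertible; rather than forming the inverse, it solves a linear system.

import numpy as np

def project_objective(A_tilde, c_tilde):
    # c_p = (I - B^T (B B^T)^{-1} B) c_tilde, with B = [A_tilde; e^T] and A_tilde = A D.
    n = A_tilde.shape[1]
    B = np.vstack([A_tilde, np.ones((1, n))])
    z = np.linalg.solve(B @ B.T, B @ c_tilde)
    return c_tilde - B.T @ z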

  20. Calculate step size • $r$ = radius of the sphere inscribed in the simplex $= \dfrac{1}{\sqrt{n(n-1)}}$ • Move from the center: $y^{(k+1)} = \dfrac{e}{n} - \alpha\, r\, \dfrac{c_p}{\lVert c_p\rVert}$, with step parameter $0 < \alpha < 1$ (for the problem: Minimize $(Dc)^T y$ s.t. $ADy = 0$, $e^T y = 1$, $y \ge 0$)

  21. Movement and inverse transformation • In the transformed space, move to $y^{(k+1)} = \dfrac{e}{n} - \alpha\, r\, \dfrac{c_p}{\lVert c_p\rVert}$ • Inverse transform back to the original space: $x^{(k+1)} = T_{x^{(k)}}^{-1}(y^{(k+1)}) = \dfrac{Dy^{(k+1)}}{e^T Dy^{(k+1)}}$
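A sketch combining the step of slide 20 with the inverse transformation of slide 21; the step parameter alpha = 0.25 is a common textbook choice, not necessarily the value used on the slides.

import numpy as np

def next_iterate(x_k, c_p, alpha=0.25):
    # Move from the simplex center against the projected objective, then map back.
    n = x_k.size
    r = 1.0 / np.sqrt(n * (n - 1))                 # radius of the inscribed sphere
    y_new = np.ones(n) / n - alpha * r * c_p / np.linalg.norm(c_p)
    x_new = x_k * y_new                            # apply D = diag(x_k)
    return x_new / x_new.sum()                     # normalize so that e^T x = 1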

  22. Big Picture

  23. Matlab Demo
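The original demo was in Matlab; here is a comparable, self-contained Python sketch of the whole iteration (steps 2.1-2.4) on a tiny hand-made canonical-form problem. The problem data, tolerance, and alpha = 0.25 are illustrative assumptions.

import numpy as np

def karmarkar(A, c, eps=1e-6, alpha=0.25, max_iter=1000):
    # Canonical form assumed: min c^T x  s.t.  Ax = 0, e^T x = 1, x >= 0,
    # with optimal value 0 and x0 = e/n feasible.
    m, n = A.shape
    x = np.ones(n) / n                            # Step 1: start at the center
    r = 1.0 / np.sqrt(n * (n - 1))
    for _ in range(max_iter):
        if c @ x <= eps:                          # Step 2: stop when the objective is ~0
            break
        D = np.diag(x)                            # 2.1: transformation data
        B = np.vstack([A @ D, np.ones((1, n))])
        c_tilde = D @ c
        z = np.linalg.solve(B @ B.T, B @ c_tilde)
        c_p = c_tilde - B.T @ z                   # 2.2: projected objective
        y = np.ones(n) / n - alpha * r * c_p / np.linalg.norm(c_p)
        x = D @ y                                 # 2.3: inverse transform
        x = x / x.sum()                           # 2.4: next iterate
    return x

# Feasible set: x3 = 1/3, x1 + x2 = 2/3; the optimum of c^T x = x1 is 0 at (0, 2/3, 1/3).
A = np.array([[1.0, 1.0, -2.0]])
c = np.array([1.0, 0.0, 0.0])
x_star = karmarkar(A, c)
print(x_star, c @ x_star)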

  24. Contents • Overview • Projective transformation • Orthogonal projection • Complexity analysis • Transformation to Karmarkar’s canonical form

  25. Running Time • Total complexity of an iterative algorithm = (# of iterations) x (operations in each iteration) • We will prove that the # of iterations = $O(nL)$ • Operations in each iteration = $O(n^{2.5})$ • Therefore the running time of Karmarkar’s algorithm = $O(n^{3.5}L)$

  26. # of iterations • Let us calculate the change in the objective function value at the end of each iteration • The objective function changes upon transformation • Therefore use Karmarkar’s potential function $f(x) = \sum_{j=1}^{n} \ln\dfrac{c^T x}{x_j}$ • The potential function is invariant under the transformation (up to an additive constant)
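A small sketch of the potential function (names illustrative); note that it tends to minus infinity as the iterates approach an optimum with objective value 0, which is what drives the convergence argument.

import numpy as np

def karmarkar_potential(c, x):
    # f(x) = sum_j ln(c^T x / x_j), defined for strictly positive x with c^T x > 0.
    return np.sum(np.log(c @ x) - np.log(x))

c = np.array([1.0, 0.0, 0.0])
print(karmarkar_potential(c, np.array([1/3, 1/3, 1/3])))     # at the simplex center
print(karmarkar_potential(c, np.array([0.05, 0.65, 0.30])))  # closer to the optimum: smaller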

  27. # of iterations • The improvement in the potential function in each iteration is at least a constant $\delta$: $f(x^{(k+1)}) \le f(x^{(k)}) - \delta$ • Rearranging, $f(x^{(k)}) \le f(x^{(0)}) - k\delta$

  28. # of iterations • Since the potential function is invariant under the transformation, we can work with $f(y) = \sum_j \ln\dfrac{(Dc)^T y}{y_j}$, where $y^{(k)} = e/n$ and $y^{(k+1)}$ are points in the transformed space • At each step in the transformed space, we bound the decrease of the potential in moving from the center $e/n$ to $y^{(k+1)}$

  29. # of iterations • Simplifying, we get an expression for the decrease of the potential in terms of the step length $\alpha r$ plus logarithmic correction terms • For small $x$, $\ln(1 + x) \approx x$ • So, supposing both correction terms are small, the logarithms can be approximated linearly

  30. # of iterations • It can be shown that the per-iteration decrease of the potential is bounded below • The worst case would be when the projected direction makes the least progress on the objective • Even in this case the decrease remains at least a constant $\delta$

  31. # of iterations • Thus we have proved that $f(x^{(k+1)}) \le f(x^{(k)}) - \delta$ • Thus $f(x^{(k)}) \le f(x^{(0)}) - k\delta$, where $k$ is the number of iterations to converge • Plugging in the definition of the potential function,

  32. # of iterations • Rearranging, we get $\ln\left(c^T x^{(k)}\right) \le \ln\left(c^T x^{(0)}\right) + \ln n - \dfrac{k\delta}{n}$, i.e. $c^T x^{(k)} \le n\, c^T x^{(0)}\, e^{-k\delta/n}$

  33. # of iterations • Therefore there exists some constant $c_0$ such that after $k = c_0 nL$ iterations $c^T x^{(k)} \le 2^{-O(L)}$, which, $L$ being the number of bits in the input, is sufficiently small to cause the algorithm to converge • Thus the number of iterations is in $O(nL)$

  34. Rank-one modification • The computational effort is dominated by computing the projection, i.e. by $(BB^T)^{-1}$ with $B = \begin{bmatrix} AD \\ e^T \end{bmatrix}$ • Substituting $B$ in, the dominant block is $(AD^2A^T)^{-1}$ • The only quantity that changes from step to step is $D$ • Intuition: • Let $D$ and $D'$ be diagonal matrices • If $D$ and $D'$ are close, the calculation of $(AD'^2A^T)^{-1}$ given $(AD^2A^T)^{-1}$ should be cheap!

  35. Method • Define a diagonal matrix $D'$, a “working approximation” to $D$ in step $k$, such that the ratios $d'_i/d_i$ stay within a constant factor for $i = 1, \ldots, n$ • We update $D'$ by the following strategy (we will explain this scaling factor in later slides): • For $i = 1$ to $n$ do: if $d'_i/d_i$ falls outside the allowed range, then let $d'_i \leftarrow d_i$ and update $(AD'^2A^T)^{-1}$ by a rank-one modification

  36. Rank-one modification (cont.) • We have the Sherman-Morrison formula $(M + uv^T)^{-1} = M^{-1} - \dfrac{M^{-1}uv^TM^{-1}}{1 + v^TM^{-1}u}$ • Given $M^{-1}$, computation of $(M + uv^T)^{-1}$ can be done in $O(n^2)$ steps • If $D'$ and $D$ differ only in entry $i$, then $AD'^2A^T - AD^2A^T = (d_i'^2 - d_i^2)\,a_i a_i^T$, where $a_i$ is the $i$-th column of $A$ => a rank-one modification • If $D'$ and $D$ differ in $m$ entries, then the complexity is $O(mn^2)$
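A sketch of the rank-one update itself using the Sherman-Morrison formula; the random test data are illustrative only.

import numpy as np

def sherman_morrison_update(M_inv, u, v):
    # Given M^{-1}, return (M + u v^T)^{-1} in O(n^2) operations.
    Mu = M_inv @ u
    vM = v @ M_inv
    return M_inv - np.outer(Mu, vM) / (1.0 + v @ Mu)

# If D' and D differ only in the i-th entry, A D'^2 A^T = A D^2 A^T + (d_i'^2 - d_i^2) a_i a_i^T.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))
d = rng.uniform(0.5, 1.5, size=5)
d_new = d.copy()
i = 2
d_new[i] = 2.0 * d[i]

M_inv = np.linalg.inv(A @ np.diag(d**2) @ A.T)
a_i = A[:, i]
M_inv_new = sherman_morrison_update(M_inv, (d_new[i]**2 - d[i]**2) * a_i, a_i)
print(np.allclose(M_inv_new, np.linalg.inv(A @ np.diag(d_new**2) @ A.T)))  # True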

  37. Performance Analysis • In each step, the work is dominated by the rank-one updates of $(AD'^2A^T)^{-1}$ • Substituting the definitions of $D$, $D'$, and the update rule, the cost of a step is proportional to the number of entries of $D'$ that are reset in that step • So we bound the total number of updating operations over all steps

  38. Performance Analysis - 2 • First we scale $D'$ by a common factor • Then, for any entry $i$ such that $d'_i/d_i$ falls outside the allowed range, we reset $d'_i \leftarrow d_i$ • Define the discrepancy between $d'_i$ and $d_i$ as the deviation of their ratio from 1 • Then, just after an updating operation for index $i$, the discrepancy is 0

  39. Performance Analysis - 3 • Just before an updating operation for index $i$, the discrepancy exceeds a constant threshold • Between two successive updates, say at steps $k_1$ and $k_2$, the discrepancy accumulates from the per-step relative changes of $x_i$ • Assume that no updating operation was performed for index $i$ between $k_1$ and $k_2$

  40. Performance Analysis - 4 • Let $m_i$ be the number of updating operations corresponding to index $i$ in the first $N$ steps, and let $M = \sum_i m_i$ be the total number of updating operations in those steps • Because each update requires the accumulated relative change of $x_i$ to exceed a constant, and the per-step movement of the iterate is bounded, $M$ can be bounded

  41. Performance Analysis - 5 • Hence the total number of rank-one updates over the $O(nL)$ iterations is $O(n^{1.5}L)$ • So the amortized number of updates per iteration is $O(\sqrt{n})$, each costing $O(n^2)$ • Finally, the average work per iteration is $O(n^{2.5})$, giving the overall running time $O(n^{3.5}L)$

  42. Contents • Overview • Transformation to Karmarkar’s canonical form • Projective transformation • Orthogonal projection • Complexity analysis

  43. Transform to canonical form • General LP: Minimize $c^T x$ subject to $Ax \ge b$, $x \ge 0$ • Karmarkar’s Canonical Form: Minimize $c^T x$ subject to $Ax = 0$, $e^T x = 1$, $x \ge 0$

  44. Step 1: Convert the LP to a feasibility problem • Combine the primal and dual problems • Primal: Minimize $c^T x$ subject to $Ax \ge b$, $x \ge 0$ • Dual: Maximize $b^T y$ subject to $A^T y \le c$, $y \ge 0$ • Combined: find $x, y \ge 0$ with $Ax \ge b$, $A^T y \le c$, $c^T x = b^T y$ • By strong duality, the LP becomes a feasibility problem

  45. Step 2: Convert inequalities to equalities • Introduce surplus variables $s \ge 0$ and slack variables $t \ge 0$: $Ax - s = b$, $A^T y + t = c$, $c^T x - b^T y = 0$

  46. Step 3: Convert the feasibility problem to an LP • Introduce an artificial variable $\lambda$ to create an interior starting point • Let $(x^0, y^0, s^0, t^0)$ be strictly interior points in the positive orthant • Minimize $\lambda$ • Subject to $Ax - s + \lambda(b - Ax^0 + s^0) = b$, $A^Ty + t + \lambda(c - A^Ty^0 - t^0) = c$, $c^Tx - b^Ty + \lambda(b^Ty^0 - c^Tx^0) = 0$, with $x, y, s, t, \lambda \ge 0$

  47. Step 3: Convert the feasibility problem to an LP (continued) • Change of notation: collect all variables into a single vector $x$ and all equality constraints into one system • Minimize $c^T x$ • Subject to $Ax = b$, $x \ge 0$, where the new $c$ picks out the artificial variable $\lambda$ • The point $(x^0, y^0, s^0, t^0, \lambda = 1)$ is a strictly interior feasible point, and the minimum objective value is 0
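A sketch of how Steps 1-3 can be assembled in code. The block layout, the all-ones starting guesses, and the variable ordering z = [x, y, s, t, lambda] are assumptions made for illustration; the slides' exact notation may differ.

import numpy as np

def combined_feasibility_lp(A, b, c):
    # Build: minimize cost^T z  s.t.  G z = h, z >= 0, where driving the last
    # component (the artificial variable) to 0 solves primal and dual simultaneously.
    m, n = A.shape
    x0, y0 = np.ones(n), np.ones(m)          # strictly interior starting guesses
    s0, t0 = np.ones(m), np.ones(n)
    r1 = b - (A @ x0 - s0)                   # residuals of the starting point become
    r2 = c - (A.T @ y0 + t0)                 # the artificial-variable column, so that
    r3 = np.array([b @ y0 - c @ x0])         # z0 = (x0, y0, s0, t0, 1) is feasible
    G = np.block([
        [A,                np.zeros((m, m)), -np.eye(m),       np.zeros((m, n)), r1[:, None]],
        [np.zeros((n, n)), A.T,              np.zeros((n, m)), np.eye(n),        r2[:, None]],
        [c[None, :],       -b[None, :],      np.zeros((1, m)), np.zeros((1, n)), r3[:, None]],
    ])
    h = np.concatenate([b, c, [0.0]])
    cost = np.zeros(2 * n + 2 * m + 1)
    cost[-1] = 1.0                           # minimize the artificial variable
    return G, h, cost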

  48. Step 4: Introduce the simplex constraint • Let $a = [a_1, \ldots, a_n]$ be a strictly feasible interior point of the original LP • Define the transformation $x'_j = \dfrac{x_j/a_j}{1 + \sum_{i=1}^n x_i/a_i}$ for $j = 1, \ldots, n$ and $x'_{n+1} = \dfrac{1}{1 + \sum_{i=1}^n x_i/a_i}$ • Define the inverse transformation $x_j = \dfrac{a_j x'_j}{x'_{n+1}}$ for $j = 1, \ldots, n$

  49. Step 4: Introduce the simplex constraint (continued) • Let $P_j$ denote the $j$-th column of $A$ • If $Ax = b$, then $\sum_{j=1}^n P_j x_j = b$ • Substituting $x_j = a_j x'_j / x'_{n+1}$ gives $\sum_{j=1}^n (a_j P_j)\, x'_j - b\, x'_{n+1} = 0$ • We define the columns of a matrix $A'$ by these equations: $P'_j = a_j P_j$ for $j = 1, \ldots, n$ and $P'_{n+1} = -b$ • Then $A'x' = 0$, $e^T x' = 1$, $x' \ge 0$

  50. Step 5: Get the minimum value of 0 (canonical form) • Substituting $x_j = a_j x'_j / x'_{n+1}$ into the objective, we get $c^T x = \dfrac{\sum_j c_j a_j x'_j}{x'_{n+1}}$ • Define $c'$ by $c'_j = c_j a_j$ for $j = 1, \ldots, n$ and $c'_{n+1} = 0$ • Then $c^T x = 0$ implies $c'^T x' = 0$, so the transformed problem Minimize $c'^T x'$ s.t. $A'x' = 0$, $e^T x' = 1$, $x' \ge 0$ is in Karmarkar’s canonical form with minimum value 0
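A sketch of Steps 4-5 (the projective embedding into the simplex); the function names are illustrative, and the input LP is assumed to have optimal value 0 as constructed in Step 3.

import numpy as np

def to_canonical_form(A, b, c, a):
    # Build A' = [a_1 P_1, ..., a_n P_n, -b] and c' = [c_1 a_1, ..., c_n a_n, 0].
    A_prime = np.hstack([A * a, -b[:, None]])
    c_prime = np.concatenate([c * a, [0.0]])
    return A_prime, c_prime

def embed(x, a):
    # Forward transformation: x'_j = (x_j/a_j)/(1 + sum_i x_i/a_i), x'_{n+1} = 1/(1 + sum).
    u = x / a
    return np.concatenate([u, [1.0]]) / (1.0 + u.sum())

a = np.array([1.0, 2.0, 0.5])
print(embed(a, a))   # the interior point a maps to the simplex center, Karmarkar's start point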
