Least Square

Presentation Transcript


  1. Least Square

  2. Motivation • Given data points, fit a function y = f(x) that is “close” to the points • Local surface fitting to 3D points [Figure: data points P_i = (x_i, y_i) in the xy-plane with a fitted curve y = f(x)]

  3. Line Fitting • y-offsets minimization [Figure: data points P_i = (x_i, y_i) and a candidate line; the vertical (y-offset) distances from the points to the line are what we minimize]

  4. Line Fitting • Find a line y = ax + b that minimizes E(a,b) = \sum_{i=1}^{n} [y_i - (a x_i + b)]^2 • E(a,b) is quadratic in the unknown parameters a, b • Another option would be, for example, the sum of absolute deviations \sum_{i=1}^{n} |y_i - (a x_i + b)| • But it is not differentiable, hence harder to minimize…

  5. Line Fitting – LS minimization • To find the optimal a, b we differentiate E(a,b): \partial E/\partial a = \sum_i (-2 x_i) [y_i - (a x_i + b)] = 0 and \partial E/\partial b = \sum_i (-2) [y_i - (a x_i + b)] = 0

  6. Line Fitting – LS minimization • We get two linear equations for a, b: \sum_i (-2 x_i) [y_i - (a x_i + b)] = 0 and \sum_i (-2) [y_i - (a x_i + b)] = 0, i.e. \sum_i [x_i y_i - a x_i^2 - b x_i] = 0 and \sum_i [y_i - a x_i - b] = 0

  7. Line Fitting – LS minimization • We get two linear equations for a, b: (\sum_i x_i^2) a + (\sum_i x_i) b = \sum_i x_i y_i and (\sum_i x_i) a + (\sum_i 1) b = \sum_i y_i, where \sum_i 1 = n

  8. Line Fitting – LS minimization • Solve for a, b using e.g. Gauss elimination • Question: why is this solution the minimum of the error function E(a,b) = \sum_i [y_i - (a x_i + b)]^2? (Hint: E is a convex quadratic, so its stationary point is the global minimum)
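
A minimal sketch of this solve in Python/NumPy, assembling the 2×2 system from slide 7 and solving it directly; the sample points are made up for illustration:

```python
import numpy as np

# Made-up sample points lying roughly along y = x.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.1, 0.9, 2.2, 2.8, 4.1])

# Normal equations from slide 7:
# (sum x_i^2) a + (sum x_i) b = sum x_i y_i
# (sum x_i)   a + (n)       b = sum y_i
M = np.array([[np.sum(x**2), np.sum(x)],
              [np.sum(x),    len(x)]])
rhs = np.array([np.sum(x * y), np.sum(y)])
a, b = np.linalg.solve(M, rhs)
print(f"y = {a:.3f} x + {b:.3f}")
```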

  9. Fitting Polynomials

  10. Fitting Polynomials • Decide on the degree of the polynomial, k • Want to fit f(x) = a_k x^k + a_{k-1} x^{k-1} + … + a_1 x + a_0 • Minimize E(a_0, a_1, …, a_k) = \sum_i [y_i - (a_k x_i^k + a_{k-1} x_i^{k-1} + … + a_1 x_i + a_0)]^2 • Setting each partial derivative to zero: \partial E/\partial a_m = \sum_i (-2 x_i^m) [y_i - (a_k x_i^k + a_{k-1} x_i^{k-1} + … + a_0)] = 0

  11. Fitting Polynomials • We get a linear system of k+1 equations in k+1 variables
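
A sketch of the same normal-equations approach for a polynomial, assuming NumPy; the data, the degree k = 2, and the use of np.vander for the design matrix are illustrative choices:

```python
import numpy as np

# Made-up data with a roughly quadratic trend.
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
y = np.array([1.0, 1.3, 2.1, 3.2, 5.1, 7.4, 10.2])
k = 2

# Design matrix: column m holds x_i^m, so V @ a evaluates the polynomial.
V = np.vander(x, k + 1, increasing=True)      # shape (n, k+1)

# The (k+1)x(k+1) normal equations (V^T V) a = V^T y.
coeffs = np.linalg.solve(V.T @ V, V.T @ y)
print("a_0 .. a_k:", coeffs)
```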

  12. General Parametric Fitting • We can use this approach to fit any function f(x) • Specified by parameters a, b, c, … • As long as the expression f(x) depends linearly on the parameters a, b, c, …

  13. General Parametric Fitting • Want to fit a function f_{abc…}(x) to data points (x_i, y_i) • Define E(a, b, c, …) = \sum_i [y_i - f_{abc…}(x_i)]^2 • Solve the linear system \partial E/\partial a = 0, \partial E/\partial b = 0, \partial E/\partial c = 0, …

  14. General Parametric Fitting • It can even be some crazy function, e.g. a combination of sines, exponentials, and logarithms • Or in general: f(x) = \alpha_1 f_1(x) + \alpha_2 f_2(x) + … + \alpha_k f_k(x), where the f_j are fixed basis functions
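
A sketch of this general case with a hypothetical basis of sin, exp, and a constant; any fixed basis that keeps f(x) linear in the parameters works the same way:

```python
import numpy as np

# Hypothetical basis functions f_1, f_2, f_3.
basis = [np.sin, np.exp, lambda x: np.ones_like(x)]

# Synthetic data from 2*sin(x) + 0.5*exp(x) + 1 plus a little noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 50)
y = 2.0 * np.sin(x) + 0.5 * np.exp(x) + 1.0 + 0.05 * rng.standard_normal(50)

# Design matrix: one column per basis function, evaluated at all x_i.
A = np.column_stack([f(x) for f in basis])
params, *_ = np.linalg.lstsq(A, y, rcond=None)
print("alpha_1..alpha_3:", params)            # close to (2.0, 0.5, 1.0)
```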

  15. Solving Linear Systems in LS Sense • Let's look at the problem a little differently: • We have data points (x_i, y_i) • We want the function f(x) to go through the points: \forall i = 1, …, n: y_i = f(x_i) • Strict interpolation is in general not possible • For polynomials: n+1 points define a unique interpolation polynomial of degree n • So, if we have 1000 points and want a cubic polynomial, we probably won't find one…

  16. Solving Linear Systems in LS Sense • We have an over-determined linear system, n > k:
f(x_1) = \alpha_1 f_1(x_1) + \alpha_2 f_2(x_1) + … + \alpha_k f_k(x_1) = y_1
f(x_2) = \alpha_1 f_1(x_2) + \alpha_2 f_2(x_2) + … + \alpha_k f_k(x_2) = y_2
…
f(x_n) = \alpha_1 f_1(x_n) + \alpha_2 f_2(x_n) + … + \alpha_k f_k(x_n) = y_n

  17. Solving Linear Systems in LS Sense • In matrix form:
\begin{pmatrix} f_1(x_1) & f_2(x_1) & \cdots & f_k(x_1) \\ f_1(x_2) & f_2(x_2) & \cdots & f_k(x_2) \\ \vdots & & & \vdots \\ f_1(x_n) & f_2(x_n) & \cdots & f_k(x_n) \end{pmatrix} \begin{pmatrix} \alpha_1 \\ \alpha_2 \\ \vdots \\ \alpha_k \end{pmatrix} = \begin{pmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{pmatrix}

  18. Solving Linear Systems in LS Sense • In short: Av = y, where A is the n×k matrix above, v = (\alpha_1, …, \alpha_k)^T, and y = (y_1, …, y_n)^T

  19. Solving Linear Systems in LS Sense • More constraints than variables – no exact solution generally exists • We want to find an “approximate solution”: the v that minimizes \|Av - y\|^2

  20. Finding the LS Solution • v \in R^k, Av \in R^n • As we vary v, Av varies over the linear subspace of R^n spanned by the columns of A: Av = \alpha_1 A^{(1)} + \alpha_2 A^{(2)} + … + \alpha_k A^{(k)}, where A^{(j)} is the j-th column of A

  21. Finding the LS Solution • We want to find the Av closest to y [Figure: y \in R^n and its orthogonal projection onto the subspace spanned by the columns of A; that projection is the Av closest to y]

  22. Finding the LS Solution • The vector Av closest to y satisfies (Av - y) \perp {subspace spanned by A's columns} • \forall column A^{(i)}: \langle A^{(i)}, Av - y \rangle = 0 • \forall i: (A^{(i)})^T (Av - y) = 0 • Hence A^T (Av - y) = 0, i.e. (A^T A) v = A^T y • These are called the normal equations

  23. Finding the LS Solution • We got a square, symmetric (k×k) system (A^T A) v = A^T y • If A has full rank (the columns of A are linearly independent) then A^T A is invertible
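
A sketch contrasting the normal equations with NumPy's built-in LS solver on a random full-rank system; the dimensions are arbitrary illustrative choices, and the two solutions should agree:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((100, 4))   # n = 100 equations, k = 4 unknowns
y = rng.standard_normal(100)

# Normal equations (A^T A) v = A^T y; requires linearly independent columns.
v_normal = np.linalg.solve(A.T @ A, A.T @ y)

# Library solver minimizing ||Av - y||^2 directly.
v_lstsq, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.allclose(v_normal, v_lstsq))         # True
```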

  24. Laplacian Editing

  25. Handle-Based Editing • Handles are parts of the model that the user is free to move • The model deforms when the handles are placed at their target positions

  26. General Idea • A visually pleasing deformed mesh should maintain • local parameterization (triangle shape) • local geometry information (local shape) • We decompose the global geometry into • coefficients of the Laplace operator • Laplacian coordinates (LCs)

  27. Laplacian Coordinates • LCs represent the local geometry • The coefficients of the Laplacian operator represent the local parameterization • For a vertex x_i with neighbours x_j: \delta_i = x_i - \sum_{j \in N(i)} w_{ij} x_j, e.g. with uniform weights w_{ij} = 1/|N(i)|

  28. Laplacian Editing • We solve for the new vertex positions under handle constraints • In general there is no exact solution • We use the LS method to find the optimal solution • Solve for x: L x = \delta together with the handle constraints x_i = c_i, stacked into one over-determined system
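
A minimal sketch of this LS solve on a closed 2D polyline rather than a mesh, assuming uniform Laplacian weights; the handle indices, targets, and constraint weight w are made-up illustrative choices:

```python
import numpy as np

# Original vertex positions: n points on a circle.
n = 20
t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
X = np.column_stack([np.cos(t), np.sin(t)])

# Uniform Laplacian of a closed polyline: delta_i = x_i - (x_{i-1}+x_{i+1})/2.
L = np.eye(n)
for i in range(n):
    L[i, (i - 1) % n] = -0.5
    L[i, (i + 1) % n] = -0.5
delta = L @ X                       # Laplacian coordinates of every vertex

# Handle constraints x_i = c_i, added as extra weighted rows.
handles = {0: np.array([1.5, 0.0]), 10: np.array([-1.0, 0.5])}
w = 10.0
C = np.zeros((len(handles), n))
d = np.zeros((len(handles), 2))
for row, (i, c) in enumerate(handles.items()):
    C[row, i] = w
    d[row] = w * c

# Over-determined system [L; C] x = [delta; d], solved in the LS sense.
X_new, *_ = np.linalg.lstsq(np.vstack([L, C]), np.vstack([delta, d]),
                            rcond=None)
```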

  29. Shape Matching

  30. Shape Matching • We have two objects in correspondence • Want to find the rigid transformation that aligns them

  31. Shape Matching • When the objects are aligned, the lengths of the connecting lines are small.

  32. Shape Matching – Formalization • Align two point sets P = {p_1, …, p_n} and Q = {q_1, …, q_n} • Find a translation vector t and rotation matrix R that minimize \sum_{i=1}^{n} \|p_i - (R q_i + t)\|^2

  33. Shape Matching – Solution • It turns out we can solve for the translation and the rotation separately • Theorem: if (R, t) is the optimal transformation, then the point sets {p_i} and {R q_i + t} have the same centers of mass

  34. Finding the Rotation R • To find the optimal R, we bring the centroids of both point sets to the origin: p_i' = p_i - \bar{p} and q_i' = q_i - \bar{q}, where \bar{p} = (1/n) \sum_i p_i and \bar{q} = (1/n) \sum_i q_i • We want to find the R that minimizes \sum_i \|p_i' - R q_i'\|^2

  35. Finding the Rotation R • First we compute the optimal linear transformation A (not yet restricted to a rotation) that minimizes \sum_i \|p_i' - A q_i'\|^2 • We solve it in the LS sense: build the 3×n matrices Q = [q_1' … q_n'] and P = [p_1' … p_n'] and solve Q^T A^T = P^T for the 3×3 matrix A

  36. Finding the Rotation R • We can solve for the columns of A^T = [a_1, a_2, a_3] separately • Using the LS method: Q Q^T A^T = Q P^T, so A^T = (Q Q^T)^{-1} (Q P^T) • Each 3×1 column a_i solves Q^T a_i = (i-th column of P^T) in the LS sense

  37. Finding the Rotation R • Second we extract the pure rotation part R from A • We use the singular value decomposition (SVD): A = U W V^T, where U, V are orthogonal matrices and W is diagonal • Write A = R S with R = U V^T and S = V W V^T • Since U and V are orthogonal, so is R • S is a positive-semidefinite symmetric matrix • Finally we compute the translation t = \bar{p} - R \bar{q}
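
A sketch of the whole alignment pipeline using the standard SVD-based solution (the Kabsch algorithm); the reflection guard D and the synthetic test at the bottom are additions not spelled out on the slides:

```python
import numpy as np

def align(P, Q):
    """Return (R, t) minimizing sum_i ||p_i - (R q_i + t)||^2 (points as rows)."""
    p_bar, q_bar = P.mean(axis=0), Q.mean(axis=0)
    Pc, Qc = P - p_bar, Q - q_bar           # bring centroids to the origin
    H = Qc.T @ Pc                           # 3x3 cross-covariance sum q_i' p_i'^T
    U, W, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) sneaking in as the optimum.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = p_bar - R @ q_bar                   # translation from the centroids
    return R, t

# Synthetic test: rotate and translate a random point set, then recover R, t.
rng = np.random.default_rng(2)
Q = rng.standard_normal((30, 3))
c, s = np.cos(0.7), np.sin(0.7)
R_true = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
P = Q @ R_true.T + np.array([1.0, -2.0, 0.5])
R, t = align(P, Q)
print(np.allclose(R, R_true), np.allclose(t, [1.0, -2.0, 0.5]))  # True True
```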

  38. References • SVD: Numerical Recipes in C (section 2.6), http://www.nrbook.com/a/bookcpdf.php • A Tutorial on Principal Component Analysis: http://www.snl.salk.edu/~shlens/pub/notes/pca.pdf • Laplacian Editing: Demo Program, http://www.cse.ust.hk/~oscarau/comp290/290project_demo_code.zip • Differential Coordinates for Interactive Mesh Editing [PDF]
