Presentation Transcript


  1. Please put away electronic devices. Take out clickers.

  2. Math Tools

  3. A representational view of matrices [Diagram: each row of the operation matrix is an operation; each column of the input matrix is a data vector; applying the operations to the inputs yields the output.]

  4. A note on notation
  • m = number of rows, n = number of columns
  • “Number”: number of vectors
  • “Size”: dimensionality of each vector (number of elements)
  • [Operation matrix] [Input matrix]
  • [number of operations x operation size] [size of input x number of inputs]
  • [m1 x n1] [m2 x n2] = [m1 x n2]
  • [2 x 3] [3 x 2] = [2 x 2]
  • n1 and m2 (where the matrices touch) have to match: the size of the operations has to match the size of the inputs.
  • The resulting matrix has dimensionality [number of operations x number of inputs] (see the sketch below).
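A minimal MATLAB sketch of the dimension rule above; the matrix values are illustrative, not from the slides:

```matlab
A = [1 2 3; 4 5 6];      % [2 x 3]: 2 operations, each of size 3
B = [1 2; 3 4; 5 6];     % [3 x 2]: 2 inputs, each of size 3
C = A * B;               % inner dimensions (3 and 3) match
size(C)                  % [2 2]: number of operations x number of inputs
```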

  5. Inner product = dot product. Dot product = row vector * column vector = scalar: [1 x n] [m x 1] = [1 x 1], with m = n. Example: [1 2 3 4] * [5; 6; 7; 8] = 1*5 + 2*6 + 3*7 + 4*8 = 70.
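The slide's worked example, checked in MATLAB:

```matlab
r = [1 2 3 4];           % [1 x 4] row vector
c = [5; 6; 7; 8];        % [4 x 1] column vector
r * c                    % 1*5 + 2*6 + 3*7 + 4*8 = 70, a scalar
```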

  6. Matrix multiplication. Example: [1 2; 3 4] * [5 6; 7 8] = [(1*5 + 2*7) (1*6 + 2*8); (3*5 + 4*7) (3*6 + 4*8)] = [19 22; 43 50].
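The same multiplication in MATLAB:

```matlab
A = [1 2; 3 4];
B = [5 6; 7 8];
A * B                    % [19 22; 43 50]
```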

  7. The outer product • What if we do matrix multiplication where the two matrices are a single column vector and a single row vector? • The output is a *matrix*, not a scalar: [m x 1] [1 x n] = [m x n]. Example: [1; 2] * [3 4] = [(1*3) (1*4); (2*3) (2*4)] = [3 4; 6 8].
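The outer product example in MATLAB:

```matlab
c = [1; 2];              % [2 x 1] column vector
r = [3 4];               % [1 x 2] row vector
c * r                    % [3 4; 6 8], a [2 x 2] matrix
```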

  8. The transpose = flipping the matrix across its main diagonal. Example: [1 2 3; 4 5 6; 7 8 9] transposed is [1 4 7; 2 5 8; 3 6 9]. NOT just a rotation!
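In MATLAB (for real matrices, the .' and ' operators both give the transpose):

```matlab
A = [1 2 3; 4 5 6; 7 8 9];
A.'                      % [1 4 7; 2 5 8; 3 6 9]
```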

  9. Singular Value Decomposition (SVD) • If you really want to understand what eigenvalues and eigenvectors are… • The culminating end point of a linear algebra course. • What we will use in this class (and beyond) to harness the power of linear algebra to do things we care about.

  10. Remember this?

  11. Singular Value Decomposition (SVD): M = U S VT, where U and VT are orthogonal (“rotate”) and S is diagonal (“stretch”).
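A minimal MATLAB check that svd returns this decomposition; the matrix here is illustrative:

```matlab
M = [3 1; 1 3];
[U, S, V] = svd(M);      % U and V orthogonal, S diagonal
norm(M - U * S * V')     % ~ 0: M is recovered as U*S*V'
```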

  12. SVD [Diagram: VT = input, S = scaling, U = output.]

  13. A simple case [Diagram: operation O = U S VT applied to an input x, with vectors v1, v2 and u1, u2; when only one singular value is nonzero, the whole operation reduces to a single scaled outer product. Outer product!]

  14. SVD can be interpreted as • A sum of outer products! • Decomposing the matrix into a sum of scaled outer products. • Key insight: the operations on the respective dimensions stay separate from each other all the way through V, S and U. • They are grouped, each operating on a different piece of the input.
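A MATLAB sketch of this interpretation: rebuilding M as the sum of scaled outer products s1*u1*v1' + s2*u2*v2' + …; the matrix is chosen arbitrarily:

```matlab
M = [2 1; 1 3];
[U, S, V] = svd(M);
Mhat = zeros(size(M));
for k = 1:min(size(M))
    % add the k-th scaled outer product
    Mhat = Mhat + S(k, k) * U(:, k) * V(:, k)';
end
norm(M - Mhat)           % ~ 0: the sum of outer products recovers M
```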

  15. Why does this perspective matter? [Diagram: the input x is projected onto v1 and v2, scaled to s1 v1 and s2 v2, and routed to the output along u1 and u2, giving the contributions u1 s1 v1 and u2 s2 v2.]

  16. Is the SVD unique? • Is there only one way to decompose a matrix M into U, S and VT? • Is there another set of orthogonal matrices combined with a diagonal one that does the same thing? • The SVD is *not* unique. • But for somewhat silly reasons. • Ways in which it might not be unique?

  17. Sign flipping! [Diagram: flipping the signs of a column of U and of the corresponding row of VT leaves the product U S VT unchanged.]

  18. Permutations! [Diagram: reordering the diagonal entries of S, together with the corresponding columns of U and rows of VT, leaves the product U S VT unchanged.]

  19. Given that the SVD is *not* unique, how can we make sure programs will all arrive at the exact *same* SVD? • Conventions! • MATLAB picks the values in the main diagonal of S to be positive – this also sets the signs in the other two matrices. • MATLAB orders them in decreasing magnitude, with the largest one in the upper left. • If it runs its algorithm and arrives at the wrong order, it can always sort them (and does this under the hood).
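A quick MATLAB illustration of these conventions; the example matrix is arbitrary:

```matlab
[U, S, V] = svd(magic(4));
diag(S)                  % singular values: nonnegative, sorted largest first
```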

  20. What happens if one of the entries in the main diagonal of S is zero? [Diagram: U S VT with a zero on the diagonal of S.]

  21. The nullspace • The set of input vectors that are mapped to zero. • Information loss. • Always comes about via multiplication by a zero in the main diagonal of S. • A one-dimensional black hole. • A hydraulic press. • “What happens to the inputs?” • The part of the input space (V) that is lost; it is a “subspace” of the input space.
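A minimal MATLAB sketch of reading a nullspace direction off the SVD; the rank-1 matrix here is illustrative:

```matlab
M = [1 2; 2 4];                    % rank 1: one singular value is 0
[U, S, V] = svd(M);
nullvec = V(:, diag(S) < 1e-10);   % columns of V with zero singular values
M * nullvec                        % ~ [0; 0]: this input is mapped to zero
```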

  22. What if S is not square? [Diagram: S = [1 0 0; 0 1 0], a wide 2 x 3 S: three input dimensions are mapped into two output dimensions.]

  23. What if the null space is multidimensional? [Diagram: S = [1 0 0; 0 0 0]: two of the three input dimensions are mapped to zero.]

  24. Nullspaces matter in cognition • Categorization • Attention • Depth perception

  25. The range space • Conceptually, the complement of the null space. • “What can we reach in the output space?” • “Where can we get to?” • Some things might be unachievable. • It is the part of the output space (U) we can actually get to.

  26. How might we not reach a part of the output space? [Diagram: U S VT.]

  27. The inverse question • If I observe the outputs of a linear system, can I figure out what the inputs were? • Related problem: if you start with 2 things in the input space, run them through the system, and compare the outputs, can you still distinguish them as different? • So when is the linear system invertible? • How might it not be?

  28. Inversion of matrices • Matrix M can be inverted by undoing the operations of the SVD in reverse order: peel them off one by one, last one applied first. • M = U S VT • Inverting M: M-1 = V S-1 UT, since (V S-1 UT)(U S VT) = I and likewise (U S VT)(V S-1 UT) = I.
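Inverting via the SVD in MATLAB; the invertible matrix is illustrative:

```matlab
M = [2 1; 1 3];
[U, S, V] = svd(M);
Minv = V * diag(1 ./ diag(S)) * U';   % V * S^-1 * U'
norm(Minv * M - eye(2))               % ~ 0: we recover the identity
```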

  29. The pseudoinverse • “The best we can do.” • We recover the information we can. • Invert only the non-zero entries in the main diagonal of S; the dimensions with zeros stay lost.
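A sketch of the pseudoinverse built from the SVD, compared against MATLAB's pinv; the singular matrix is illustrative:

```matlab
M = [1 2; 2 4];                       % singular: no true inverse exists
[U, S, V] = svd(M);
s = diag(S);
sinv = zeros(size(s));
sinv(s > 1e-10) = 1 ./ s(s > 1e-10);  % invert only the non-zero entries
Mpinv = V * diag(sinv) * U';
norm(Mpinv - pinv(M))                 % ~ 0: matches MATLAB's pseudoinverse
```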

  30. Next stop • Eigenvectors • Eigenvalues • They will pop out as a special case of the SVD.

  31. How else might we not reach a part of the output space? [Diagram: S = [1 0; 0 1; 0 0], a tall 3 x 2 S: the output space is three-dimensional, but only two of its dimensions can be reached.]
