
Lecture 16 Cramer’s Rule, Eigenvalue and Eigenvector


Presentation Transcript


  1. Lecture 16: Cramer’s Rule, Eigenvalue and Eigenvector (Shang-Hua Teng)

  2. Determinants and Linear Systems: Cramer’s Rule

  3. Cramer’s Rule • If det A is not zero, then Ax = b has the unique solution x_i = det(B_i) / det A, where B_i is A with its i-th column replaced by b.
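The slide’s worked formula was not captured in the transcript; the following is a minimal Python sketch of Cramer’s rule on a small, made-up 3×3 system, checked against numpy.linalg.solve.

```python
import numpy as np

def cramer_solve(A, b):
    """Solve Ax = b via Cramer's rule: x_i = det(B_i) / det(A),
    where B_i is A with column i replaced by b."""
    det_A = np.linalg.det(A)
    if np.isclose(det_A, 0.0):
        raise ValueError("det(A) = 0: Cramer's rule does not apply")
    n = A.shape[0]
    x = np.empty(n)
    for i in range(n):
        B_i = A.copy()
        B_i[:, i] = b                 # replace column i with b
        x[i] = np.linalg.det(B_i) / det_A
    return x

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([3.0, 5.0, 3.0])

print(cramer_solve(A, b))             # Cramer's rule
print(np.linalg.solve(A, b))          # should agree
```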

  4. Cramer’s Rule for Inverse • A⁻¹ = adj(A) / det A, where adj(A) is the transpose of the cofactor matrix • Proof: column j of A⁻¹ solves Ax = e_j; apply Cramer’s rule entry by entry (sketched below).
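A short LaTeX restatement of the proof idea, assuming det A ≠ 0 (the slide’s own equations did not survive extraction):

```latex
% Column j of A^{-1} solves A x = e_j. By Cramer's rule its i-th entry is
% det(B_i)/det(A), where B_i is A with column i replaced by e_j; expanding
% det(B_i) along that column leaves exactly the cofactor C_{ji}.
\[
  (A^{-1})_{ij} \;=\; \frac{\det(B_i)}{\det A} \;=\; \frac{C_{ji}}{\det A},
  \qquad\text{so}\qquad
  A^{-1} \;=\; \frac{1}{\det A}\,\operatorname{adj}(A)
         \;=\; \frac{1}{\det A}\, C^{\mathsf T}.
\]
```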

  5. Where Do Matrices Come From?

  6. Computer Science • Graphs: G = (V,E)

  7. Internet Graph

  8. View Internet Graph on Spheres

  9. Graphs in Scientific Computing

  10. Resource Allocation Graph

  11. Road Map

  12. Matrix Representation of Graphs: Adjacency Matrix

  13. Adjacency Matrix (example graph on vertices 1–5)

  14. Matrix of Graphs • Adjacency Matrix: A(i, j) = 1 if edge (i, j) exists, else A(i, j) = 0 (example graph on vertices 1–4)
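As a concrete illustration (not from the deck), a minimal Python sketch that builds a 0/1 adjacency matrix from a hypothetical edge list on four vertices:

```python
import numpy as np

# Hypothetical undirected graph on vertices 0..3 (illustration only)
edges = [(0, 1), (1, 2), (2, 3), (0, 3), (1, 3)]

n = 4
A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = 1      # edge exists
    A[j, i] = 1      # undirected graph: matrix is symmetric

print(A)
```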

  15. Laplacian of Graphs • L = D − A, where D is the diagonal degree matrix and A is the adjacency matrix (example graph on vertices 1–5)
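Continuing that sketch, the Laplacian under the standard definition L = D − A, again on a made-up graph:

```python
import numpy as np

# Adjacency matrix of a hypothetical undirected graph (illustration only)
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [1, 1, 1, 0]])

D = np.diag(A.sum(axis=1))   # degree matrix: row sums on the diagonal
L = D - A                    # graph Laplacian

print(L)
print(L.sum(axis=1))         # each row of L sums to 0
```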

  16. Matrix of Weighted Graphs • Weighted Matrix: A(i, j) = w(i, j) if edge (i, j) exists, else A(i, j) = ∞ (example graph on vertices 1–4)
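A hedged sketch of the weighted variant, with illustrative weights; missing edges get float('inf'), matching the slide’s convention A(i, j) = ∞ when there is no edge:

```python
import numpy as np

n = 4
W = np.full((n, n), np.inf)        # no edge -> infinite weight

# Hypothetical weighted edges (i, j, w(i, j)) -- illustration only
for i, j, w in [(0, 1, 2.0), (1, 2, 1.5), (2, 3, 4.0), (0, 3, 3.0)]:
    W[i, j] = w
    W[j, i] = w                    # undirected

print(W)
```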

  17. Random Walks • How long does it take to get completely lost?

  18. Random Walks: Transition Matrix (example graph on vertices 1–6)

  19. Markov Matrix • Every entry is non-negative • Every column adds to 1 • A Markov matrix defines a Markov chain
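To make the transition-matrix slides concrete, a small Python sketch (an illustration, not the lecture’s example graph) of a column-stochastic Markov matrix for a random walk that moves to a uniformly random neighbor; iterating the walk shows the distribution approaching stationarity, i.e. the walker “getting completely lost”:

```python
import numpy as np

# Adjacency matrix of a hypothetical connected graph (illustration only)
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)

# Transition (Markov) matrix: divide column j by the degree of vertex j,
# so every entry is non-negative and every column sums to 1.
P = A / A.sum(axis=0)

p = np.array([1.0, 0.0, 0.0, 0.0])   # walk starts at vertex 0
for _ in range(50):
    p = P @ p                         # one step of the random walk

print(P.sum(axis=0))                  # columns sum to 1
print(p)                              # close to the stationary distribution
```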

  20. Other Matrices • Projections • Rotations • Permutations • Reflections
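For concreteness, a short sketch (not from the slides) of two of these matrix families, a 2-D rotation and a permutation:

```python
import numpy as np

# 2-D rotation by angle theta (counter-clockwise)
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(R @ np.array([1.0, 0.0]))    # rotates (1, 0) to (cos t, sin t)

# Permutation matrix sending (x0, x1, x2) to (x2, x0, x1)
P = np.array([[0, 0, 1],
              [1, 0, 0],
              [0, 1, 0]])
print(P @ np.array([10, 20, 30]))  # -> [30 10 20]
```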

  21. Term-Document Matrix • Index each document (by human or by computer) • f_ij: counts, frequencies, weights, etc. • Each document can be regarded as a point in m dimensions

  22. Document-Term Matrix • Index each document (by human or by computer) • f_ij: counts, frequencies, weights, etc. • Each document can be regarded as a point in n dimensions

  23. Term Occurrence Matrix

  24. (rows = terms, columns = documents c1–c5 and m1–m4)

               c1  c2  c3  c4  c5  m1  m2  m3  m4
    human       1   0   0   1   0   0   0   0   0
    interface   1   0   1   0   0   0   0   0   0
    computer    1   1   0   0   0   0   0   0   0
    user        0   1   1   0   1   0   0   0   0
    system      0   1   1   2   0   0   0   0   0
    response    0   1   0   0   1   0   0   0   0
    time        0   1   0   0   1   0   0   0   0
    EPS         0   0   1   1   0   0   0   0   0
    survey      0   1   0   0   0   0   0   0   1
    trees       0   0   0   0   0   1   1   1   0
    graph       0   0   0   0   0   0   1   1   1
    minors      0   0   0   0   0   0   0   1   1
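A hedged Python sketch of how such a term-occurrence matrix can be held in code; only the first five rows of the table above are reproduced, and each document is read off as a column vector:

```python
import numpy as np

terms = ["human", "interface", "computer", "user", "system"]
docs  = ["c1", "c2", "c3", "c4", "c5", "m1", "m2", "m3", "m4"]

# First five rows of the term-occurrence matrix on the slide above
F = np.array([[1, 0, 0, 1, 0, 0, 0, 0, 0],   # human
              [1, 0, 1, 0, 0, 0, 0, 0, 0],   # interface
              [1, 1, 0, 0, 0, 0, 0, 0, 0],   # computer
              [0, 1, 1, 0, 1, 0, 0, 0, 0],   # user
              [0, 1, 1, 2, 0, 0, 0, 0, 0]])  # system

# Each document is a column: a point in m dimensions (m = number of terms)
c1 = F[:, docs.index("c1")]
print(dict(zip(terms, c1)))
```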

  25. Matrix in Image Processing

  26. Random Walks • How long does it take to get completely lost?

  27. Random Walks: Transition Matrix (example graph on vertices 1–6)
