
Matrix Multiplication


Presentation Transcript


  1. Matrix Multiplication Hyun Lee, Eun Kim, Jedd Hakimi

  2. This section describes matrix algebra, especially multiplication.

  3. 2.1 Matrix Operations • Key Idea: Matrix multiplication corresponds to composition of linear transformations. The definition of AB is critical for the development of both theory and applications. So what is the definition of AB?

  4. What is AB? • The subscripts tell the location of an entry: if A is an m x n matrix, then m is the number of rows and n is the number of columns. In the product AB, left-multiplication by A acts on the columns of B, while right-multiplication by B acts on the rows of A. In other words…

  5. Definition of AB, continued… Column j of AB = A x (column j of B). Also, the following is true: (row i of AB) = (row i of A) x B.

  6. How do we add matrices? Let me show you an example. Suppose A = [2 3; 7 8] and B = [4 7; 6 5]. Then what is A + B? A + B = [2+4 3+7; 7+6 8+5] = [6 10; 13 13].

  7. Example of a scalar multiple rX • Let X = [1 1 1; 3 2 5]. Then what is 4X? 4X = [4 4 4; 12 8 20].
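
As a quick sanity check of the last two slides, here is a small sketch in Python with NumPy (NumPy is not part of the slides; it is just a convenient way to redo the arithmetic) that recomputes A + B from slide 6 and 4X from slide 7.

    import numpy as np

    # Matrices from slide 6
    A = np.array([[2, 3],
                  [7, 8]])
    B = np.array([[4, 7],
                  [6, 5]])
    print(A + B)        # [[ 6 10]
                        #  [13 13]]

    # Matrix from slide 7, scaled by 4
    X = np.array([[1, 1, 1],
                  [3, 2, 5]])
    print(4 * X)        # [[ 4  4  4]
                        #  [12  8 20]]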

  8. Matrix Multiplication • How do we multiply matrices? When a vector x is multiplied by B, we transform x into the vector Bx. If we then multiply Bx by A, we have created A(Bx). This is the key concept for the next steps!

  9. We can always represent this composite mapping as multiplication by a single matrix: A(Bx) = (AB)x, where AB is a single matrix.

  10. Example • If A is a 4 x 5 matrix and B is a 5 x 3 matrix, what are the sizes of AB and BA, if they are defined?

  11. Remember! You can't multiply just any two matrices; there are conditions! The number of columns of the first matrix has to be the same as the number of rows of the second matrix.
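
By this rule, the answer to slide 10 is that AB is defined and is 4 x 3, while BA is not defined, because B has 3 columns but A has 4 rows. A minimal NumPy sketch (random entries, purely illustrative) that checks the shapes:

    import numpy as np

    A = np.random.rand(4, 5)   # 4 x 5, as in slide 10
    B = np.random.rand(5, 3)   # 5 x 3

    print((A @ B).shape)       # (4, 3): AB is defined, since A has 5 columns and B has 5 rows

    try:
        B @ A                  # BA would need B's 3 columns to match A's 4 rows
    except ValueError as err:
        print("BA is not defined:", err)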

  12. Theorem 1 • Let A, B, and C be matrices of the same size, and let r and s be scalars. • A + B = B + A (1.1) • (A + B) + C = A + (B + C) (1.2) • A + 0 = A (1.3) • r(A + B) = rA + rB (1.4) • (r + s)A = rA + sA (1.5) • r(sA) = (rs)A (1.6)

  13. Let's prove it! Note that the columns of A, B, and C all have the same size. Let the 2nd columns of A and B be A2 and B2, respectively. Then, for example, 3(A2 + B2) = 3A2 + 3B2, so the 2nd column of 3(A + B) equals the 2nd column of 3A + 3B. The matrix on the left has the same size as the matrix on the right, and their corresponding columns are equal, so 1.4 holds; 1.1 through 1.6 can all be proved using the same logic I used for 1.4.
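
The rules in Theorem 1 are also easy to spot-check numerically. The sketch below is not from the slides; it uses random 2 x 2 matrices and np.allclose (to allow for floating-point rounding) to verify 1.1 through 1.6.

    import numpy as np

    rng = np.random.default_rng(0)
    A, B, C = (rng.random((2, 2)) for _ in range(3))
    r, s = 3.0, 5.0
    Z = np.zeros((2, 2))                                # the zero matrix

    assert np.allclose(A + B, B + A)                    # 1.1
    assert np.allclose((A + B) + C, A + (B + C))        # 1.2
    assert np.allclose(A + Z, A)                        # 1.3
    assert np.allclose(r * (A + B), r * A + r * B)      # 1.4
    assert np.allclose((r + s) * A, r * A + s * A)      # 1.5
    assert np.allclose(r * (s * A), (r * s) * A)        # 1.6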

  14. Let's multiply two different matrices now! Let A = [3 0; 1 1; 5 2] (a 3 x 2 matrix) and B = [4 7; 6 8] (a 2 x 2 matrix). Then A x B = [3(4)+0(6) 3(7)+0(8); 1(4)+1(6) 1(7)+1(8); 5(4)+2(6) 5(7)+2(8)] = [12 21; 10 15; 32 51].
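
The same product in NumPy (again, the library is just for checking; the slide's hand computation is the point) uses the @ operator:

    import numpy as np

    A = np.array([[3, 0],
                  [1, 1],
                  [5, 2]])     # 3 x 2
    B = np.array([[4, 7],
                  [6, 8]])     # 2 x 2

    print(A @ B)
    # [[12 21]
    #  [10 15]
    #  [32 51]]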

  15. Row-Column Rule If we think of the ith row of A and the jth column of B as vectors, then the entry in the ith row and jth column of C = AB is the scalar product of the ith row of A and the jth column of B: (AB)ij = ai1b1j + ai2b2j + … + ainbnj. Also, rowi(AB) = rowi(A) x B.
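
The row-column rule is exactly what a naive triple loop computes. Here is an illustrative sketch that builds AB entry by entry from the rule and compares the result with NumPy's built-in product (the function name is ours, not a standard one):

    import numpy as np

    def matmul_row_column(A, B):
        """Compute AB entrywise: (AB)_ij = a_i1*b_1j + ... + a_in*b_nj."""
        m, n = A.shape
        n2, p = B.shape
        assert n == n2, "columns of A must match rows of B"
        C = np.zeros((m, p))
        for i in range(m):
            for j in range(p):
                C[i, j] = sum(A[i, k] * B[k, j] for k in range(n))
        return C

    A = np.array([[3, 0], [1, 1], [5, 2]])
    B = np.array([[4, 7], [6, 8]])
    assert np.allclose(matmul_row_column(A, B), A @ B)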

  16. Theorem 2 Let A, B, and C be matrices with sizes for which the indicated sums and products are defined, and let r be a scalar. A(BC) = (AB)C (2.1) A(B + C) = AB + AC (2.2) (B + C)A = BA + CA (2.3) r(AB) = (rA)B = A(rB) (2.4) ImA = A = AIn (2.5)

  17. Let's prove it: A(BC) = (AB)C ------ the Associative Law. First observe that A(BC) and (AB)C have the same size. Let cj denote column j of C. Then column j of BC is Bcj, so column j of A(BC) is A(Bcj). Furthermore, column j of (AB)C is (AB)cj, which also equals A(Bcj). It follows that the corresponding columns of A(BC) and (AB)C are equal, so the two matrices are equal.

  18. Example of the "Associative Law" • Let A = (1, 2), B = [3 4; 2 1], and C = [3 0 2; 5 1 0]. Then AB = (1, 2)[3 4; 2 1] = (7, 6), and (AB)C = (7, 6)[3 0 2; 5 1 0] = (51, 6, 14). On the other hand, BC = [29 4 6; 11 1 4], so A(BC) = (1, 2)[29 4 6; 11 1 4] = (51, 6, 14) as well.
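
The same example in NumPy (A is written as a 1 x 2 row matrix so that the products are defined):

    import numpy as np

    A = np.array([[1, 2]])                 # 1 x 2 row
    B = np.array([[3, 4], [2, 1]])
    C = np.array([[3, 0, 2], [5, 1, 0]])

    print(A @ B)                           # [[7 6]]
    print((A @ B) @ C)                     # [[51  6 14]]
    print(A @ (B @ C))                     # [[51  6 14]]  -- equal, as the associative law promises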

  19. Proof of the "Distributive Law" Both (A + B)C and AC + BC have the same size, so we compare the corresponding columns of each matrix. Let cj denote the jth column of C; then Acj and Bcj are the jth columns of AC and BC, respectively. But the jth column of (A + B)C is (A + B)cj = Acj + Bcj, and by the properties of matrix addition the jth column of AC + BC is equal to the same sum. The same reasoning applies to the other distributive law, too.

  20. Example of the "Distributive Law" Let A = (1, 2), B = [3 4; 0 5], and C = [4 2; 1 7]. Then B + C = [7 6; 1 12] and A(B + C) = (9, 30). On the other side, AB = (3, 14) and AC = (6, 16), so AB + AC = (9, 30) as well.
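
And the same check in NumPy:

    import numpy as np

    A = np.array([[1, 2]])                 # 1 x 2 row
    B = np.array([[3, 4], [0, 5]])
    C = np.array([[4, 2], [1, 7]])

    print(A @ (B + C))                     # [[ 9 30]]
    print(A @ B + A @ C)                   # [[ 9 30]]  -- equal, as the distributive law promises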

  21. AB is not equal to BA • Let A = [1; 2; 0; 1] (a 4 x 1 column) and B = (3, 4, 1, 5) (a 1 x 4 row). Then AB = [3 4 1 5; 6 8 2 10; 0 0 0 0; 3 4 1 5], a 4 x 4 matrix, while BA = (3, 4, 1, 5)[1; 2; 0; 1] = (3 + 8 + 0 + 5) = 16, a 1 x 1 matrix.
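
This is a column vector times a row vector versus a row vector times a column vector, the starkest case where AB and BA differ (they do not even have the same size). A small NumPy sketch of the slide's numbers:

    import numpy as np

    A = np.array([[1], [2], [0], [1]])     # 4 x 1 column
    B = np.array([[3, 4, 1, 5]])           # 1 x 4 row

    print(A @ B)                           # a 4 x 4 matrix (each row is a multiple of B)
    print(B @ A)                           # [[16]], a 1 x 1 matrix: 3 + 8 + 0 + 5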

  22. The Transpose of a Matrix • Sometimes it is of interest to interchange the rows and columns of a matrix. • The transpose of a matrix A = [aij] is the matrix formed from A by interchanging rows and columns, so that row i of A becomes column i of the transpose. The transpose is denoted by At, and (At)ij = aji when A = [aij].

  23. Example of the "Transpose" • If A = [1 3; 2 5], then At = [1 2; 3 5]. • If A = [1 3 4; 0 1 0], then At = [1 0; 3 1; 4 0]. It will be observed that if A is m x n, then At is n x m.
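
In NumPy the transpose is the .T attribute; this reproduces the second example:

    import numpy as np

    A = np.array([[1, 3, 4],
                  [0, 1, 0]])              # 2 x 3
    print(A.T)                             # 3 x 2:
    # [[1 0]
    #  [3 1]
    #  [4 0]]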

  24. Matrix Powers If a matrix A is multiplied by itself, the product is denoted A2. In general, Ak = A x A x … x A (k factors). By convention, A0 = I, the identity matrix (it's the convention… right?). Therefore A0x = x (x itself). Also, we can apply the rule (Ap)(Aq) = A(p+q).
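
A sketch of these conventions with np.linalg.matrix_power (the 2 x 2 matrix here is an arbitrary example, not from the slides):

    import numpy as np

    A = np.array([[1, 2],
                  [3, 4]])

    A0 = np.linalg.matrix_power(A, 0)      # the identity matrix I, by convention
    A2 = np.linalg.matrix_power(A, 2)
    A3 = np.linalg.matrix_power(A, 3)
    A5 = np.linalg.matrix_power(A, 5)

    print(A0)                              # [[1 0]
                                           #  [0 1]]
    assert np.array_equal(A2, A @ A)       # A^2 = A x A
    assert np.array_equal(A2 @ A3, A5)     # A^p A^q = A^(p+q)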

  25. Theorem 3 Suppose A and B represent matrices of appropriate sizes for the following sums and products. (a) (At)t = A (b) (A + B)t = At + Bt (c) (rA)t = r(At), for any scalar r (d) (AB)t = BtAt

  26. Theorems (a)-(c) are obvious, so the proofs are not required. For theorem (d), note that the (i, j)-entry of (AB)t is the (j, i)-entry of AB, which is (row j of A) x (column i of B). The (i, j)-entry of BtAt is (row i of Bt) x (column j of At) = (column i of B) x (row j of A), which is the same sum. Hence (AB)t = BtAt.
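
A quick numerical check of part (d), assuming it is the usual transpose-of-a-product rule (the slide's own statement and proof were shown as images and are not in this transcript):

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.random((3, 4))
    B = rng.random((4, 2))

    # (AB)^T equals B^T A^T: transposing a product reverses the order of the factors.
    assert np.allclose((A @ B).T, B.T @ A.T)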

  27. 2.4 Partitioned Matrices I see you've chosen the Red Pill… Welcome to Partitioned Matrices. Not as exciting as an action movie, but it might still make you say "Whoa."

  28. By now you're teeming with anticipation, eager to expand your mind. Let's start by clearing a few things up and understanding what we are working with. Why don't I ask your first question for you (because you might feel silly talking to a computer screen):

  29. Q. What is a Partitioned Matrix, and what does it have to do with me? A. Ah, good question. Well, a Partitioned Matrix is a matrix that has been broken down into several smaller matrices. But why tell you when I can show you a picture? Let's say I have a 5x4 matrix called "G".

  30. And now a partitioned version (with the partition lines in red):

  31. And now we name the individual parts (AKA: Blocks or Submatrices):

  32. Now we can rewrite "G" as a 3x2 arrangement of blocks: Now doesn't that look a lot nicer than our original "G"? Of course it does. Now, to address the second part of your question: partitioning can help us by speeding up computations on a supercomputer, or something like that. Isn't that exciting? Don't answer that. On to more questions…
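
The slides showed G, its partition lines, and the named blocks as pictures. The sketch below uses a hypothetical 5 x 4 matrix and a hypothetical choice of partition lines (rows split 2/2/1, columns split 3/1), since the original figure is not in the transcript; it just shows how blocks are cut out with slicing and glued back together with np.block.

    import numpy as np

    G = np.arange(20).reshape(5, 4)        # hypothetical 5 x 4 matrix; the slide's entries are not shown

    # A 3 x 2 arrangement of blocks (submatrices):
    G11, G12 = G[:2, :3], G[:2, 3:]
    G21, G22 = G[2:4, :3], G[2:4, 3:]
    G31, G32 = G[4:, :3], G[4:, 3:]

    # Reassembling the blocks gives back the original matrix.
    assert np.array_equal(np.block([[G11, G12],
                                    [G21, G22],
                                    [G31, G32]]), G)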

  33. Q. Can I add Partitioned Matrices to each other? A. Sure, as long as the matrices being added are identical in the way they are partitioned. Each block can be added to its corresponding block. Too easy. How about another question?

  34. Q. Can I multiply a Partitioned Matrix by a scalar? A. Sure, as long as you multiply by the scalar one block at a time. Come on, give me a harder one…

  35. Q. How can I multiply Partitioned Matrices by each other? A. Okay, now you've asked a tough one. The best way to explain this is through an example. In the following example, uppercase letters represent blocks. The first matrix, "J", is partitioned like so: J = [A B; C D].

  36. And matrix "K" is partitioned like so: K = [E; F].


  38. First of all, Partitioned Matrices can only be multiplied when the number of vertical partitions (block columns) of the first matrix in the equation equals the number of horizontal partitions (block rows) of the second matrix. That holds here, so the product is the block matrix JK = [AE + BF; CE + DF], i.e. row 1 is AE + BF and row 2 is CE + DF.

  39. Now we expand each one of the Blocks.
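
The expansion on slide 39 was shown as a picture. Here is a sketch with hypothetical sizes for J and K (random entries, splits chosen only so that J's column split matches K's row split) confirming that the block formula gives the same answer as multiplying the full matrices:

    import numpy as np

    rng = np.random.default_rng(2)
    J = rng.random((4, 5))                 # hypothetical: split after row 2 and column 3
    K = rng.random((5, 2))                 # hypothetical: split after row 3

    A, B = J[:2, :3], J[:2, 3:]
    C, D = J[2:, :3], J[2:, 3:]
    E, F = K[:3, :], K[3:, :]

    block_product = np.block([[A @ E + B @ F],
                              [C @ E + D @ F]])
    assert np.allclose(block_product, J @ K)       # same as the ordinary product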

  40. Q. What theorems can we get from this method of multiplying Partitioned Matrices? A. Well, you're a curious one, aren't you? In fact, there is a theorem. It's called the Column-Row Expansion Theorem, and it basically says that, because blocks work so well for multiplication, the columns of the first matrix in a product pair up with the rows of the second matrix, giving an easier way to compute the product: AB = col1(A) row1(B) + col2(A) row2(B) + … + coln(A) rown(B) (1). Here's the proof:

  41. For each row index i and column index j, the (i, j)-entry of colk(A) rowk(B) is the product of aik from colk(A) and bkj from rowk(B). Hence the (i, j)-entry in the sum shown in (1) is ai1b1j + ai2b2j + … + ainbnj. This sum is also the (i, j)-entry of AB, by the row-column rule. Or you can just take my word for it.
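
Or take NumPy's word for it: this sketch (random matrices, purely illustrative) sums the outer products colk(A) rowk(B) and compares the result with AB.

    import numpy as np

    rng = np.random.default_rng(3)
    A = rng.random((3, 4))
    B = rng.random((4, 2))

    # Column-row expansion: AB = sum over k of (column k of A)(row k of B).
    expansion = sum(np.outer(A[:, k], B[k, :]) for k in range(A.shape[1]))
    assert np.allclose(expansion, A @ B)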

  42. The End Produced by NYU Math Masters
