
Introduction to Matrices and Matrix Approach to Simple Linear Regression


Presentation Transcript


  1. Introduction to Matrices and Matrix Approach to Simple Linear Regression

  2. Matrices • Definition: A matrix is a rectangular array of numbers or symbolic elements • In many applications, the rows of a matrix represent individual cases (people, items, plants, animals, ...) and the columns represent attributes or characteristics • The dimension of a matrix is its number of rows and columns, often denoted r x c (r rows by c columns) • A matrix can be written in full form (every element displayed) or in abbreviated form, e.g. A = [aij], i = 1, ..., r; j = 1, ..., c
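A minimal NumPy sketch of these definitions, using hypothetical values (illustrative, not taken from the slides):

```python
import numpy as np

# Rows are cases, columns are attributes; the dimension is r x c.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])   # a 2 x 3 matrix: 2 cases, 3 attributes

r, c = A.shape
print(r, c)        # 2 3
print(A[0, 2])     # the element a_13 = 3.0 (NumPy indexes from 0)
```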

  3. Special Types of Matrices

  4. Regression Examples - Carpet Data

  5. Matrix Addition and Subtraction

  6. Matrix Multiplication
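A short NumPy sketch of matrix addition, subtraction, and multiplication, with hypothetical matrices (illustrative, not the slides' worked examples):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# Addition and subtraction are element-wise and require equal dimensions.
print(A + B)
print(A - B)

# Multiplication: AB is defined only when the number of columns of A equals
# the number of rows of B; entry (i, j) of AB is the sum over k of a_ik * b_kj.
print(A @ B)    # matrix product
print(B @ A)    # generally differs: matrix multiplication is not commutative
```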

  7. Matrix Multiplication Examples - I

  8. Matrix Multiplication Examples - II

  9. Special Matrix Types

  10. Linear Dependence and Rank of a Matrix • Linear Dependence: when some nontrivial linear combination of the columns (rows) of a matrix produces the zero vector, i.e. one or more columns (rows) can be written as a linear function of the other columns (rows) • Rank of a matrix: the number of linearly independent columns (rows) of the matrix; the rank cannot exceed the smaller of the number of rows and columns: rank(A) ≤ min(rA, cA) • A matrix is of full rank if rank(A) = min(rA, cA)
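A small sketch of linear dependence and rank in NumPy; the matrix below is a hypothetical example whose third column equals the sum of the first two:

```python
import numpy as np

# Third column = column 1 + column 2, so the columns are linearly dependent.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 9.0],
              [7.0, 8.0, 15.0]])

print(np.linalg.matrix_rank(A))   # 2, not 3: A is not of full rank
```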

  11. Geometry of Vectors • A vector of order n is a point in n-dimensional space • The line running through the origin and the point represented by the vector defines a 1-dimensional subspace of the n-dimensional space • Any p linearly independent vectors of order n, p < n, define a p-dimensional subspace of the n-dimensional space • Any p+1 vectors in a p-dimensional subspace must have a linear dependency • Two vectors u and v are orthogonal if u’v = v’u = 0; orthogonal vectors form a 90° angle at the origin • Two vectors u and v are linearly dependent if they form a 0° or 180° angle at the origin

  12. Geometry of Vectors - II • If two vectors u and v each have mean 0 among their elements, then cos(α), where α is the angle the vectors form at the origin, is the product-moment correlation between the two vectors
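A quick numerical check of this fact, using hypothetical mean-zero vectors:

```python
import numpy as np

# Hypothetical vectors; each has mean 0 among its elements.
u = np.array([-1.0, 0.0, 1.0])
v = np.array([-2.0, 1.0, 1.0])

cos_a = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
r = np.corrcoef(u, v)[0, 1]      # product-moment (Pearson) correlation

print(np.isclose(cos_a, r))     # True: cos(angle) equals the correlation
```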

  13. Matrix Inverse • Note: For scalars (except 0), multiplying a number by its reciprocal gives 1: 2(1/2) = 1, x(1/x) = x(x⁻¹) = 1 • In matrix form, if A is a square matrix of full rank (all rows and columns are linearly independent), then A has an inverse A⁻¹ such that: A⁻¹A = AA⁻¹ = I

  14. Computing an Inverse of 2x2 Matrix
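A sketch of the standard 2x2 inverse formula, A⁻¹ = 1/(ad − bc) [[d, −b], [−c, a]], with hypothetical entries:

```python
import numpy as np

# For A = [[a, b], [c, d]] with determinant ad - bc != 0:
#   A^{-1} = 1/(ad - bc) * [[d, -b], [-c, a]]
A = np.array([[2.0, 1.0],
              [5.0, 3.0]])

det = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]            # 2*3 - 1*5 = 1
A_inv = (1.0 / det) * np.array([[ A[1, 1], -A[0, 1]],
                                [-A[1, 0],  A[0, 0]]])

print(A_inv)        # [[ 3. -1.] [-5.  2.]]
print(A_inv @ A)    # the identity matrix, confirming A_inv = A^{-1}
```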

  15. Use of Inverse Matrix – Solving Simultaneous Equations
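A sketch of solving a hypothetical 2x2 system A s = b via s = A⁻¹b:

```python
import numpy as np

# Solve the system  2x + y = 5,  5x + 3y = 13,  i.e. A s = b with s = (x, y)'.
A = np.array([[2.0, 1.0],
              [5.0, 3.0]])
b = np.array([5.0, 13.0])

s = np.linalg.inv(A) @ b        # s = A^{-1} b
print(s)                        # [2. 1.]  ->  x = 2, y = 1

# In practice np.linalg.solve(A, b) is preferred; it avoids forming the inverse.
print(np.linalg.solve(A, b))
```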

  16. Useful Matrix Results

  17. Random Vectors and Matrices

  18. Linear Regression Example (n=3)

  19. Mean and Variance of Linear Functions of Y
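The standard rules are E(AY) = A·E(Y) and Var(AY) = A·Var(Y)·A′ for a random vector Y; a numerical sketch with a hypothetical mean vector and covariance matrix:

```python
import numpy as np

# For W = A Y with E(Y) = mu and Var(Y) = Sigma:
#   E(W) = A mu   and   Var(W) = A Sigma A'
mu = np.array([1.0, 2.0, 3.0])
Sigma = 4.0 * np.eye(3)          # e.g., independent Y_i, each with variance 4

A = np.array([[1.0, -1.0, 0.0],  # W_1 = Y_1 - Y_2
              [1.0,  1.0, 1.0]]) # W_2 = Y_1 + Y_2 + Y_3

E_W = A @ mu                     # [-1.  6.]
Var_W = A @ Sigma @ A.T          # [[ 8.  0.] [ 0. 12.]]
print(E_W)
print(Var_W)
```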

  20. Multivariate Normal Distribution
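A sketch of sampling from a multivariate normal distribution and recovering its mean vector and covariance matrix, with hypothetical parameters:

```python
import numpy as np

mu = np.array([0.0, 1.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])

rng = np.random.default_rng(0)
Y = rng.multivariate_normal(mu, Sigma, size=10000)

print(Y.mean(axis=0))    # approximately mu
print(np.cov(Y.T))       # approximately Sigma
```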

  21. Simple Linear Regression in Matrix Form

  22. Estimating Parameters by Least Squares
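In matrix form the least squares estimate solves the normal equations (X′X)b = X′Y, giving b = (X′X)⁻¹X′Y; a sketch with hypothetical data:

```python
import numpy as np

# Hypothetical data for Y = X beta + eps with an intercept and one predictor.
x = np.array([1.0, 2.0, 3.0, 4.0])
Y = np.array([2.1, 3.9, 6.2, 7.8])

X = np.column_stack([np.ones_like(x), x])    # design matrix [1, x]

# Normal equations (X'X) b = X'Y  =>  b = (X'X)^{-1} X'Y
b = np.linalg.inv(X.T @ X) @ (X.T @ Y)
print(b)            # [b0, b1], the least squares estimates
```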

  23. Fitted Values and Residuals
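Fitted values and residuals follow from the hat matrix H = X(X′X)⁻¹X′: Ŷ = HY and e = (I − H)Y; a sketch on the same hypothetical data:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
Y = np.array([2.1, 3.9, 6.2, 7.8])
X = np.column_stack([np.ones_like(x), x])

# Hat matrix H = X (X'X)^{-1} X' maps Y to the fitted values.
H = X @ np.linalg.inv(X.T @ X) @ X.T

Y_hat = H @ Y                    # fitted values  Y_hat = H Y
e = Y - Y_hat                    # residuals      e = (I - H) Y
print(Y_hat)
print(e)
print(np.allclose(X.T @ e, 0))  # True: residuals are orthogonal to the columns of X
```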

  24. Analysis of Variance
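The sums of squares can be written as quadratic forms in Y; a sketch verifying SSTO = SSR + SSE on the same hypothetical data:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
Y = np.array([2.1, 3.9, 6.2, 7.8])
X = np.column_stack([np.ones_like(x), x])

H = X @ np.linalg.inv(X.T @ X) @ X.T
n = len(Y)
J = np.ones((n, n)) / n                 # (1/n) J, the averaging matrix

SSTO = Y @ (np.eye(n) - J) @ Y          # total sum of squares
SSE  = Y @ (np.eye(n) - H) @ Y          # error sum of squares, e'e
SSR  = Y @ (H - J) @ Y                  # regression sum of squares

print(np.isclose(SSTO, SSR + SSE))      # True: SSTO = SSR + SSE
```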

  25. Inferences in Linear Regression
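The estimated variance-covariance matrix of b is s²{b} = MSE·(X′X)⁻¹, from which standard errors and t statistics follow; a sketch on the same hypothetical data:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
Y = np.array([2.1, 3.9, 6.2, 7.8])
X = np.column_stack([np.ones_like(x), x])
n, p = X.shape

XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ Y
e = Y - X @ b

MSE = (e @ e) / (n - p)          # estimate of sigma^2, with n - p error df
s2_b = MSE * XtX_inv             # estimated variance-covariance matrix of b
se_b = np.sqrt(np.diag(s2_b))    # standard errors of b0 and b1

t = b / se_b                     # t statistics for H0: beta_j = 0
print(b, se_b, t)
```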
