
Prediction variance in Linear Regression



Presentation Transcript


  1. Prediction variance in Linear Regression • Assumptions on the noise in linear regression allow us to estimate the prediction variance due to the noise at any point. • Prediction variance is usually large when you are far from a data point. • We distinguish between interpolation, when we are inside the convex hull of the data points, and extrapolation, when we are outside it. • Extrapolation is associated with larger errors, and in high dimensions it usually cannot be avoided.

  2. Linear Regression • The surrogate is a linear combination of given shape functions: $\hat{y}(x)=\sum_i b_i\,\xi_i(x)$ • For a linear approximation in one variable, $\hat{y}=b_1+b_2x$ • The difference (error) between the data and the surrogate at the data points is $e=y-Xb$, where $X_{ij}=\xi_j(x_i)$ • Minimize the square error $e^Te=(y-Xb)^T(y-Xb)$ • Differentiate with respect to $b$ to obtain the normal equations $X^TXb=X^Ty$, so $b=(X^TX)^{-1}X^Ty$
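The normal-equation solution above can be sketched in a few lines of numpy (the data here is made up for illustration, not from the slides):

```python
import numpy as np

# Made-up noise-free data on the line y = 2 + 3x, for illustration only
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 + 3.0 * x

# Design matrix X for the linear surrogate y_hat = b1 + b2*x
X = np.column_stack([np.ones_like(x), x])

# Normal equations: (X^T X) b = X^T y
b = np.linalg.solve(X.T @ X, X.T @ y)
print(b)  # recovers [2. 3.]
```

`np.linalg.solve` avoids forming the inverse explicitly; it is the same $b=(X^TX)^{-1}X^Ty$ written more stably.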

  3. Model based error for linear regression • The common assumptions for linear regression • The true function is described by the functional form of the surrogate. • The data is contaminated with normally distributed noise with the same standard deviation $\sigma$ at every point. • The errors at different points are not correlated. • Under these assumptions, the noise standard deviation (called the standard error) is estimated from the residuals as $\hat{\sigma}=\sqrt{e^Te/(n-n_\beta)}$, where $n$ is the number of data points and $n_\beta$ the number of coefficients. • $\hat{\sigma}$ is used as an estimate of the prediction error.
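The estimate $\hat{\sigma}$ can be checked numerically; this sketch uses my own synthetic data (not from the slides) with a known noise standard deviation of 0.1 and recovers an estimate close to it:

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_beta = 20, 2                    # 20 data points, 2 coefficients (b1 + b2*x)
x = np.linspace(0.0, 1.0, n)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.1, n)   # true sigma = 0.1

X = np.column_stack([np.ones(n), x])
b = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ b                        # residuals at the data points
sigma_hat = np.sqrt(e @ e / (n - n_beta))
print(sigma_hat)                     # should be near 0.1
```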

  4. Prediction variance • Linear regression model $\hat{y}(x)=\sum_i b_i\,\xi_i(x)$ • Define the vector of shape-function values $x_m^T=\left(\xi_1(x),\xi_2(x),\ldots\right)$; then $\hat{y}=x_m^Tb$ • With some algebra, $\mathrm{Var}\!\left[\hat{y}(x)\right]=\sigma^2\,x_m^T(X^TX)^{-1}x_m$ • Standard error $s_{\hat{y}}=\hat{\sigma}\sqrt{x_m^T(X^TX)^{-1}x_m}$
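The standard-error formula translates directly into code. A minimal sketch (the function name is mine) for a one-variable linear fit, showing how the error grows away from the data:

```python
import numpy as np

def prediction_stderr(X, x_m, sigma):
    """Standard error of the prediction at x_m:
    sigma * sqrt(x_m^T (X^T X)^{-1} x_m)."""
    XtX_inv = np.linalg.inv(X.T @ X)
    return sigma * np.sqrt(x_m @ XtX_inv @ x_m)

# One-variable linear fit with data at x = -1, 0, 1
X = np.array([[1.0, -1.0],
              [1.0,  0.0],
              [1.0,  1.0]])
print(prediction_stderr(X, np.array([1.0, 0.0]), 1.0))  # center: sqrt(1/3) ≈ 0.577
print(prediction_stderr(X, np.array([1.0, 2.0]), 1.0))  # extrapolating to x=2: sqrt(7/3) ≈ 1.528
```

The prediction variance at x = 2, outside the data, is several times larger than at the center of the data.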

  5. Interpolation, extrapolation and regression • Interpolation is often contrasted with regression (least-squares fit) • Equally important is the contrast between interpolation and extrapolation • Extrapolation occurs when we are outside the convex hull of the data points • In high-dimensional spaces we must have extrapolation, because the data points cannot fill the domain!

  6. 2D example of convex hull • Generating 20 points at random in the unit square leaves a substantial region near the origin where we will need to use extrapolation • Using the data in the notes, give a couple of alternative sets of weights $\alpha_i$ that express the point (0.4, 0.4) approximately as a convex combination of the data points
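Whether a point requires extrapolation can be tested by checking membership in the convex hull of the data. A sketch using scipy (assuming scipy is available; the random points here stand in for the data in the notes):

```python
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(1)
pts = rng.random((20, 2))            # 20 random points in the unit square

tri = Delaunay(pts)                  # triangulates the interior of the convex hull
in_hull_origin = tri.find_simplex(np.array([0.0, 0.0])) >= 0
in_hull_centroid = tri.find_simplex(pts.mean(axis=0)) >= 0
print(in_hull_origin, in_hull_centroid)
```

The corner (0, 0) can never lie in the hull of points with strictly positive coordinates, so predicting there is always extrapolation, while the centroid of the points is always inside the hull.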

  7. Example of prediction variance • For a linear polynomial response surface $y=b_1+b_2x_1+b_3x_2$, find the prediction variance in the region $-1\le x_1,x_2\le 1$ • (a) For data at three vertices (omitting (1,1))

  8. Interpolation vs. Extrapolation • At the origin $s_{\hat{y}}=\sigma/\sqrt{2}\approx 0.707\sigma$. At the 3 data vertices $s_{\hat{y}}=\sigma$. At the omitted corner (1,1), where we extrapolate, $s_{\hat{y}}=\sqrt{3}\,\sigma\approx 1.73\sigma$
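These numbers follow from $(X^TX)^{-1}$ for the three-vertex design; a short numpy sketch reproducing them (expressed relative to the noise standard deviation $\sigma$):

```python
import numpy as np

# Data at three vertices of [-1,1]^2, omitting (1,1); columns of X are 1, x1, x2
X = np.array([[1.0, -1.0, -1.0],
              [1.0, -1.0,  1.0],
              [1.0,  1.0, -1.0]])
XtX_inv = np.linalg.inv(X.T @ X)

def rel_stderr(x1, x2):
    """Prediction standard error divided by the noise sigma."""
    xm = np.array([1.0, x1, x2])
    return np.sqrt(xm @ XtX_inv @ xm)

print(rel_stderr(0.0, 0.0))    # origin: 1/sqrt(2) ≈ 0.707
print(rel_stderr(-1.0, -1.0))  # a data vertex: 1
print(rel_stderr(1.0, 1.0))    # the omitted corner: sqrt(3) ≈ 1.732
```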

  9. Standard error contours • The minimum error is obtained by setting to zero the derivative of the prediction variance with respect to the prediction point $x$. • What is special about this point? • Contours of the prediction variance provide more detail.

  10. Data at four vertices • Now $X^TX=4I$ • And $(X^TX)^{-1}=I/4$ • Error at the vertices: $s_{\hat{y}}=\frac{\sqrt{3}}{2}\sigma\approx 0.866\sigma$ • At the origin the minimum is $s_{\hat{y}}=\sigma/2$ • How can we reduce the error without adding points?
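The four-vertex design can be checked the same way; this sketch verifies $X^TX=4I$ and the two standard-error values, again relative to the noise $\sigma$:

```python
import numpy as np

# Data at all four vertices (±1, ±1); columns of X are 1, x1, x2
X = np.array([[1.0, -1.0, -1.0],
              [1.0, -1.0,  1.0],
              [1.0,  1.0, -1.0],
              [1.0,  1.0,  1.0]])
XtX = X.T @ X                        # equals 4 * identity
XtX_inv = np.linalg.inv(XtX)         # identity / 4

xm_vertex = np.array([1.0, 1.0, 1.0])
xm_origin = np.array([1.0, 0.0, 0.0])
s_vertex = np.sqrt(xm_vertex @ XtX_inv @ xm_vertex)  # sqrt(3)/2 ≈ 0.866
s_origin = np.sqrt(xm_origin @ XtX_inv @ xm_origin)  # 1/2
print(s_vertex, s_origin)
```

Compared with the three-point design, the fourth point lowers the error everywhere in the square.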

  11. Graphical Comparison of Standard Errors • [Figure: standard error contours for the three-point and four-point designs]

  12. Homework • Redo the four-point example when the data points are not at the corners but inside the domain, at ±0.8. What does the difference in the results tell you? • For a grid of 3×3 data points, compare the standard errors for linear and quadratic fits.
