
Regression



Presentation Transcript


  1. Regression 10-601 Machine Learning

  2. Outline • Regression vs Classification • Linear regression – another discriminative learning method • As optimization → gradient descent • As matrix inversion (Ordinary Least Squares) • Overfitting and bias-variance • Bias-variance decomposition for classification

  3. What is regression?

  4. Where we are • Classifier: inputs → predict a category (√ covered) • Probability density estimator: inputs → probability (√ covered) • Regressor: inputs → predict a real number (today)

  5. Regression examples

  6. Prediction of menu prices (Chahuneau, Gimpel, … and Smith, EMNLP 2012)

  7. A decision tree: classification – each leaf predicts a category (Play / Don’t Play)

  8. A regression tree – each leaf predicts a real number, the mean of the training values that reach it: a leaf with values {0m, 0m} predicts Play ≈ 0, {0m, 0m, 15m} predicts Play ≈ 5, {20m, 30m, 45m} predicts Play ≈ 32, and {30m, 45m} predicts Play ≈ 37 minutes

  9. Theme for the week: learning as optimization

  10. Types of learners • Two types of learners: 1. Generative: make assumptions about how the data is generated (given the class) – e.g., naïve Bayes 2. Discriminative: directly estimate a decision rule/boundary – e.g., logistic regression • Today: another discriminative learner, but for regression tasks

  11. Least Mean Squares (LMS) Regression – LMS as optimization (toy problem #2)

  12. Linear regression • Given an input x we would like to compute an output y • For example: - Predict height from age - Predict Google’s stock price from Yahoo’s stock price - Predict distance from a wall from sensor readings

  13. Linear regression • Given an input x we would like to compute an output y • In linear regression we assume that y and x are related by the equation y = wx + ε, where w is a parameter and ε represents measurement or other noise • (Figure: the observed values of y – what we are trying to predict – plotted against x)

  14. Linear regression • Our goal is to estimate w from training data of <xi, yi> pairs • Optimization goal: minimize the squared error (least squares): w* = argminw Σi (yi − w xi)² • Why least squares? - minimizes the squared distance between measurements and the predicted line - has a nice probabilistic interpretation (see HW) - the math is pretty

  15. Solving linear regression • To optimize w* = argminw Σi (yi − w xi)², we just take the derivative w.r.t. w: d/dw Σi (yi − w xi)² = −2 Σi xi (yi − w xi), where w xi is the prediction for example i • Compare to logistic regression: there too the gradient is a sum of inputs weighted by (observed − predicted) residuals

  16. Solving linear regression • To optimize – closed form: • We take the derivative w.r.t. w and set it to 0: −2 Σi xi (yi − w xi) = 0, which gives w = Σi xi yi / Σi xi² • This equals covar(X,Y)/var(X) if mean(X) = mean(Y) = 0

  17. Regression example • Generated: w=2 • Recovered: w=2.03 • Noise: std=1

  18. Regression example • Generated: w=2 • Recovered: w=2.05 • Noise: std=2

  19. Regression example • Generated: w=2 • Recovered: w=2.08 • Noise: std=4
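
A minimal sketch of the experiment in slides 17–19 (Python with NumPy; the sample size, input range, and random seed are illustrative assumptions, not the values used to make the slides):

    import numpy as np

    rng = np.random.default_rng(0)
    n, w_true = 100, 2.0

    for noise_std in (1.0, 2.0, 4.0):
        x = rng.uniform(-3, 3, size=n)                        # inputs
        y = w_true * x + rng.normal(0, noise_std, size=n)     # y = wx + noise
        w_hat = np.sum(x * y) / np.sum(x * x)                 # closed-form least-squares slope
        print(f"noise std={noise_std}: recovered w={w_hat:.2f}")

As the noise grows, the recovered slope becomes a noisier estimate of the true w = 2, which is the pattern the slides illustrate.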

  20. Bias term • So far we assumed that the line passes through the origin • What if the line does not? • No problem, simply change the model to y = w0 + w1x + ε • Can use least squares to determine w0, w1 • (Figure: w0 is the intercept where the line crosses the y-axis)

  21. Multivariate regression • What if we have several inputs? - Stock prices for Yahoo, Microsoft and Ebay for the Google prediction task • This becomes a multivariate regression problem • Again, it’s easy to model: y = w0 + w1x1 + … + wkxk + ε • (Figure: Google’s stock price as a function of Yahoo’s and Microsoft’s stock prices)


  23. Not all functions can be approximated by a line/hyperplane… • For example: y = 10 + 3x1² − 2x2² + ε • In some cases we would like to use polynomial or other terms based on the input data – are these still linear regression problems? • Yes. As long as the equation is linear in the coefficients (the w’s), it is still a linear regression problem!
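
As an illustration, a rough sketch of fitting the quadratic example above with ordinary linear regression once the squared terms are added as features (Python/NumPy; the data-generating settings are made up for the example):

    import numpy as np

    rng = np.random.default_rng(1)
    n = 200
    x1, x2 = rng.uniform(-2, 2, n), rng.uniform(-2, 2, n)
    y = 10 + 3 * x1**2 - 2 * x2**2 + rng.normal(0, 0.5, n)

    # Design matrix: intercept, x1^2, x2^2 -- the model stays linear in the weights
    Phi = np.column_stack([np.ones(n), x1**2, x2**2])
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    print(w)   # approximately [10, 3, -2]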

  24. Non-linear basis functions • So far we only used the observed values x1, x2, … • However, linear regression can be applied in the same way to functions of these values • E.g., to add a term w·x1x2, add a new variable z = x1x2 so each example becomes: x1, x2, …, z • As long as these functions can be directly computed from the observed values, the model is still linear in the parameters and the problem remains a multivariate linear regression problem

  25. Non-Linear basis function • How can we use this to add an intercept term? Add a new “variable” z=1 and weight w0

  26. Non-linear basis functions • What type of functions can we use? • A few common examples: - Polynomial: φj(x) = x^j for j = 0 … n - Gaussian: φj(x) = exp(−(x − μj)² / (2σj²)) - Sigmoid: φj(x) = 1 / (1 + exp(−(x − μj)/s)) - Logs: φj(x) = log(x + 1) • Any function of the input values can be used; the solution for the parameters of the regression remains the same.
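
A small sketch of what such basis functions can look like in code (Python/NumPy; the centers, width, and degree are arbitrary illustrative choices):

    import numpy as np

    def polynomial_features(x, degree):
        """phi_j(x) = x**j for j = 0..degree (j = 0 gives the intercept column)."""
        return np.column_stack([x**j for j in range(degree + 1)])

    def gaussian_features(x, centers, sigma=1.0):
        """phi_j(x) = exp(-(x - mu_j)^2 / (2 sigma^2)) for each center mu_j."""
        return np.exp(-(x[:, None] - np.asarray(centers)[None, :])**2 / (2 * sigma**2))

    x = np.linspace(-3, 3, 7)
    print(polynomial_features(x, degree=3).shape)          # (7, 4)
    print(gaussian_features(x, centers=[-2, 0, 2]).shape)  # (7, 3)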

  27. General linear regression problem • Using our new notation for the basis functions, linear regression can be written as y = Σj=0..k wj φj(x) + ε • Here φj(x) can be either xj (the j-th input, for plain multivariate regression) or one of the non-linear basis functions we defined • … and φ0(x) = 1 for the intercept term

  28. Learning/Optimizing Multivariate Least Squares Approach 1: Gradient Descent

  29. Gradient descent

  30. Gradient Descent for Linear Regression • Goal: minimize the following loss function: J(w) = Σi=1..n (yi − Σj=0..k wj φj(xi))² – a sum over the n examples, each term involving a sum over the k+1 basis functions

  31. Gradient Descent for Linear Regression • Goal: minimize the loss J(w) = Σi (yi − Σj wj φj(xi))² • Its partial derivative with respect to weight wj is ∂J/∂wj = −2 Σi φj(xi) (yi − Σj′ wj′ φj′(xi))

  32. Gradient Descent for Linear Regression • Learning algorithm: • Initialize weights w = 0 • For t = 1, … until convergence: • Predict for each example xi using w: ŷi = Σj wj φj(xi) • Compute the gradient of the loss: gj = −2 Σi φj(xi) (yi − ŷi) • This is a vector g • Update: w = w − λg • λ is the learning rate.
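
A minimal batch gradient-descent sketch of this algorithm (Python/NumPy; the learning rate, iteration limit, and convergence test are assumptions for illustration):

    import numpy as np

    def gradient_descent(Phi, y, lr=0.01, max_iters=5000, tol=1e-8):
        """Minimize sum_i (y_i - Phi_i . w)^2 by batch gradient descent.

        Phi: (n, k+1) matrix of basis-function values, y: (n,) vector of targets.
        """
        w = np.zeros(Phi.shape[1])               # initialize weights to 0
        for _ in range(max_iters):
            y_hat = Phi @ w                      # predictions for every example
            g = -2 * Phi.T @ (y - y_hat)         # gradient of the squared-error loss
            w_new = w - lr * g                   # step against the gradient
            if np.linalg.norm(w_new - w) < tol:  # simple convergence check
                return w_new
            w = w_new
        return w

In practice the learning rate usually needs to be scaled to the data (or the inputs normalized) for the updates to stay stable.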

  33. Gradient Descent for Linear Regression • We can use any of the tricks we used for logistic regression: • stochastic gradient descent (if the data is too big to put in memory) • regularization • …
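
For example, a stochastic-gradient variant under the same assumptions, updating on one example at a time (the learning rate and epoch count are again illustrative):

    import numpy as np

    def sgd(Phi, y, lr=0.001, epochs=20, seed=0):
        """Stochastic gradient descent: one (example, label) pair per update."""
        rng = np.random.default_rng(seed)
        w = np.zeros(Phi.shape[1])
        for _ in range(epochs):
            for i in rng.permutation(len(y)):    # shuffle the examples each epoch
                err = y[i] - Phi[i] @ w          # residual for example i
                w += lr * 2 * err * Phi[i]       # single-example gradient step
            # (an L2 penalty could be subtracted here to add regularization)
        return w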

  34. Linear regression is a convex optimization problem, so (with a suitable learning rate) gradient descent will again reach a global optimum • Proof sketch: differentiate again to get the second derivative – ∂²J/∂wj² = 2 Σi φj(xi)², which is never negative

  35. Multivariate Least Squares Approach 2: Matrix Inversion

  36. OLS (Ordinary Least Squares) Solution • Goal: minimize the following loss function: J(w) = Σi=1..n (yi − Σj=0..k wj φj(xi))²

  37. Goal: minimize the same loss function, written in matrix form • Notation: stack the n examples into an n × (k+1) design matrix Φ with entries Φij = φj(xi), the labels into an n-vector y, and the weights into a (k+1)-vector w • Then J(w) = (y − Φw)ᵀ(y − Φw)

  38.–44. (Figures: step-by-step build-up of the n × (k+1) matrix Φ, the n-vector y, and the (k+1)-vector w.) Differentiating J(w) = (y − Φw)ᵀ(y − Φw) with respect to w gives ∇wJ = −2Φᵀ(y − Φw); setting this to zero yields the normal equations ΦᵀΦ w = Φᵀ y.

  45. Recap: Solving linear regression • To optimize – closed form: we take the derivative w.r.t. w and set it to 0, giving w = Σi xi yi / Σi xi², i.e. covar(X,Y)/var(X) if mean(X) = mean(Y) = 0


  47. LMS for the general linear regression problem • Solving the normal equations for w we get: w = (ΦᵀΦ)⁻¹Φᵀy, where y is an n-entry vector, w is a (k+1)-entry vector, and Φ is an n × (k+1) matrix • The matrix (ΦᵀΦ)⁻¹Φᵀ is also known as the ‘pseudo-inverse’ of Φ • Another reason to start with an objective function: you can see when two learning methods are the same!
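
A compact sketch of this closed-form solution (Python/NumPy); it solves the normal equations ΦᵀΦw = Φᵀy directly rather than forming an explicit inverse, which is the usual numerically safer choice:

    import numpy as np

    def ols(Phi, y):
        """Ordinary least squares: solve Phi^T Phi w = Phi^T y for w."""
        return np.linalg.solve(Phi.T @ Phi, Phi.T @ y)

    # Example: recover w0 = 1, w1 = 2 from noisy data with an intercept column.
    rng = np.random.default_rng(0)
    x = rng.uniform(-3, 3, 50)
    y = 1 + 2 * x + rng.normal(0, 0.5, 50)
    Phi = np.column_stack([np.ones_like(x), x])   # phi_0(x) = 1, phi_1(x) = x
    print(ols(Phi, y))                            # approximately [1, 2]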

  48. LMS versus gradient descent • LMS (closed-form) solution: • + very simple in Matlab or something similar • − requires a matrix inverse, which is expensive for a large matrix • Gradient descent: • + fast for large matrices • + stochastic GD is very memory efficient • + easily extended to other cases • − parameters to tweak (how to decide convergence? what is the learning rate? …)

  49. Regression and Overfitting

  50. An example: polynomial basis vectors on a small dataset • From Bishop Ch 1
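
A rough sketch of that kind of experiment (Python/NumPy; the noisy-sine data, sample size, and polynomial degrees are assumptions in the spirit of Bishop's Chapter 1 example, not the book's exact numbers):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10
    x = np.sort(rng.uniform(0, 1, n))
    y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, n)        # small noisy training set

    x_test = rng.uniform(0, 1, 100)
    y_test = np.sin(2 * np.pi * x_test) + rng.normal(0, 0.2, 100)

    for degree in (1, 3, 9):
        Phi = np.column_stack([x**j for j in range(degree + 1)])
        w, *_ = np.linalg.lstsq(Phi, y, rcond=None)           # fit polynomial of this degree
        Phi_test = np.column_stack([x_test**j for j in range(degree + 1)])
        train_err = np.mean((y - Phi @ w) ** 2)
        test_err = np.mean((y_test - Phi_test @ w) ** 2)
        print(f"degree {degree}: train MSE {train_err:.3f}, test MSE {test_err:.3f}")

The high-degree fit drives the training error toward zero while the test error grows, which is the overfitting behavior the following slides discuss.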
