Gaussian Process Regression for Dummies

Presentation Transcript


  1. Gaussian Process Regression for Dummies Greg Cox Richard Shiffrin

  2. Continuous response measures

  3. The problem What do we do if we do not know the functional form? Rasmussen & Williams, Gaussian Processes for Machine Learning http://www.gaussianprocesses.org/

  4. Linear regression

  5. Bayesian linear regression

  6. Gaussian processes A Gaussian process is a collection of random variables, any finite subset of which is jointly normally distributed. Normal regression: assume a functional form → mean and covariance among data. Gaussian process regression: assume the form of the mean and covariance among data → functional form.
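As a minimal illustration of this definition (not part of the original slides; the squared-exponential kernel and NumPy usage are assumptions for the sketch), evaluating a GP at any finite grid of inputs reduces to a single multivariate normal, so "sampling a function" is just sampling one Gaussian vector:

```python
import numpy as np

def sq_exp_kernel(x1, x2, length_scale=1.0, variance=1.0):
    """Squared-exponential covariance between two 1-D input arrays."""
    d = x1[:, None] - x2[None, :]              # pairwise differences
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

# A GP restricted to a finite grid is just a multivariate normal:
x = np.linspace(0.0, 10.0, 100)
mean = np.zeros_like(x)                        # zero-mean prior
K = sq_exp_kernel(x, x)

rng = np.random.default_rng(0)
# Small diagonal jitter keeps the covariance numerically positive definite.
samples = rng.multivariate_normal(mean, K + 1e-8 * np.eye(len(x)), size=3)
print(samples.shape)                           # (3, 100): three sampled "functions"
```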

  7. Covariance kernel How much does knowledge of one point tell us about another point?
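To make that intuition concrete, here is a small sketch (reusing the illustrative squared-exponential kernel from above): the kernel value between two inputs is exactly "how much one point tells us about another," decaying from full information at zero distance to essentially none far away:

```python
import numpy as np

def sq_exp(x1, x2, length_scale=1.0, variance=1.0):
    """Covariance between two scalar inputs under a squared-exponential kernel."""
    return variance * np.exp(-0.5 * ((x1 - x2) / length_scale) ** 2)

# How much does knowing f(0) tell us about f(x) at increasing distances?
for dx in [0.0, 0.5, 1.0, 2.0, 5.0]:
    print(f"distance {dx:.1f}: covariance {sq_exp(0.0, dx):.4f}")
# distance 0.0 -> 1.0000 (a point fully informs itself)
# distance 5.0 -> 0.0000 (essentially independent)
```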

  8. Returning to linear regression Mean = Function of parameters Covariance = Uncertainty about parameters + Observation noise
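A hedged sketch of this correspondence (the prior variances below are illustrative assumptions): Bayesian linear regression f(x) = b + w·x with Gaussian priors on b and w is itself a Gaussian process. Pushing the parameter uncertainty into data space yields the kernel var_b + var_w·x·x', and observation noise enters on the diagonal:

```python
import numpy as np

def linear_kernel(x1, x2, var_b=1.0, var_w=1.0):
    """Covariance implied by f(x) = b + w*x with b ~ N(0, var_b), w ~ N(0, var_w)."""
    return var_b + var_w * np.outer(x1, x2)

x = np.linspace(-2.0, 2.0, 5)
noise_var = 0.1

mean = np.zeros_like(x)                               # mean: function of (prior mean) parameters
K = linear_kernel(x, x) + noise_var * np.eye(len(x))  # parameter uncertainty + observation noise

rng = np.random.default_rng(1)
draws = rng.multivariate_normal(mean, K, size=3)
print(draws)  # each row is a noisy straight line: the GP view of linear regression
```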

  9. Takeaways from linear regression • Rather than work in “parameter space”, we can bypass it by just working in “data space” • This allows us to worry only about how different data points relate to one another without needing to specify the parameters of the data generating process • The posterior predictive distribution encapsulates our uncertainty about the data generating process • The choice of covariance kernel—which says how different observations inform one another—implies certain properties of the data generating process
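As an illustration of the last takeaway (both kernels below are assumptions chosen for contrast, not taken from the slides), swapping the kernel alone changes the character of the sampled functions, with no parametric form ever specified:

```python
import numpy as np

def sq_exp(x1, x2, ell=1.0):
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def periodic(x1, x2, ell=1.0, period=2.0):
    d = np.abs(x1[:, None] - x2[None, :])
    return np.exp(-2.0 * np.sin(np.pi * d / period) ** 2 / ell ** 2)

x = np.linspace(0.0, 10.0, 200)
rng = np.random.default_rng(2)
jitter = 1e-8 * np.eye(len(x))   # numerical stabilizer for the covariance

smooth = rng.multivariate_normal(np.zeros(len(x)), sq_exp(x, x) + jitter)
cyclic = rng.multivariate_normal(np.zeros(len(x)), periodic(x, x) + jitter)
# `smooth` drifts freely; `cyclic` repeats every `period` units.
# The kernel choice alone determined these properties of the sampled functions.
```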

  10. Posterior predictive distribution So far, we have computed the posterior predictive via the parameters (e.g., β) of the data generating process. But a Gaussian process may have an infinite number of parameters (θ). How can we compute the posterior predictive in this case? The covariance kernel to the rescue! Suppose we do not know the data generating process, but we assume that all observations are drawn from the same Gaussian process (i.e., are jointly multivariate normal) and that we have an idea of how observations mutually inform one another: the covariance kernel k(x, x'). Then the new data values f*(x*) and the observed data f(x) are all multivariate normal, so the predictive distribution is just Gaussian conditioning: f* | f ~ N(K*ᵀ K⁻¹ f, K** − K*ᵀ K⁻¹ K*), where K = k(x, x), K* = k(x, x*), and K** = k(x*, x*).
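Below is a minimal sketch of that conditioning step (the squared-exponential kernel, noise level, and toy data are illustrative assumptions), implemented stably via a Cholesky factorization rather than an explicit matrix inverse:

```python
import numpy as np

def sq_exp_kernel(x1, x2, ell=1.0, var=1.0):
    d = x1[:, None] - x2[None, :]
    return var * np.exp(-0.5 * (d / ell) ** 2)

def gp_posterior(x_train, y_train, x_star, noise_var=1e-2):
    """Predictive mean/covariance of f(x_star) given noisy observations y_train."""
    K = sq_exp_kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
    K_s = sq_exp_kernel(x_train, x_star)     # K*  = k(x, x*)
    K_ss = sq_exp_kernel(x_star, x_star)     # K** = k(x*, x*)

    L = np.linalg.cholesky(K)                # stable stand-in for inverting K
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))  # K^{-1} y
    v = np.linalg.solve(L, K_s)              # L^{-1} K*

    mean = K_s.T @ alpha                     # K*^T K^{-1} y
    cov = K_ss - v.T @ v                     # K** - K*^T K^{-1} K*
    return mean, cov

x_train = np.array([-2.0, -1.0, 0.5, 2.0])
y_train = np.sin(x_train)
x_star = np.linspace(-3.0, 3.0, 7)

mean, cov = gp_posterior(x_train, y_train, x_star)
print(mean)                    # predictive mean at the new inputs
print(np.sqrt(np.diag(cov)))   # predictive sd: small near data, large far from it
```

Note that no parameter vector ever appears: all inference runs through the kernel matrices, which is exactly the "data space" shortcut described in the takeaways above.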

  11. Building a function

  12. A hierarchical Bayesian approach

  13. Spivey, Grosjean, & Knoblich, 2005

  14. The GP model

  15. Model structure

  16. The GP model

  17. Results

  18. Results Inflection points can indicate important changes in cognitive processing

  19. Summary • Gaussian process models offer a useful and extensible way of dealing with behavioral trajectories • Able to model entire spectrum of dynamics • Can be embedded in a generative model to infer attractors and inflection points • Allow for deeper inferences about underlying cognitive processes
