
Probabilistic models



Presentation Transcript


  1. Probabilistic models Jouni Tuomisto THL

  2. Outline • Deterministic models with probabilistic parameters • Hierarchical Bayesian models • Bayesian belief nets

  3. Deterministic models with probabilistic parameters • Inputs are uncertain, but causal relations are assumed certain. • Works well in established situations, especially if there is a physical foundation. • Exposure = ∑ᵢ (cᵢ tᵢ) / ∑ᵢ tᵢ • i = microenvironment • c = concentration • t = time
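
A minimal sketch of the exposure formula above. The microenvironment names, concentrations, and times are made-up illustrative values, not data from the presentation:

    # Time-weighted exposure over microenvironments:
    #   Exposure = sum_i(c_i * t_i) / sum_i(t_i)
    # Names and numbers below are illustrative only.
    microenvironments = {
        "home":    {"c": 12.0, "t": 14.0},  # c: concentration (ug/m3), t: time (h/day)
        "work":    {"c": 25.0, "t": 8.0},
        "traffic": {"c": 60.0, "t": 2.0},
    }

    total_ct = sum(m["c"] * m["t"] for m in microenvironments.values())
    total_t = sum(m["t"] for m in microenvironments.values())
    print(f"Exposure: {total_ct / total_t:.1f} ug/m3")  # time-weighted average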

  4. Functional vs. probabilistic dependency • Functional: Va1 = 2.54*Ch1^2 • Probabilistic: Va2 = Normal(2.54*Ch1^2, 2)
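
The Va1/Va2 expressions look like Analytica-style model syntax; below is a rough NumPy equivalent. For illustration it assumes the input Ch1 is itself uncertain and sampled uniformly, which is my assumption, not the slide's:

    import numpy as np

    rng = np.random.default_rng(0)
    ch1 = rng.uniform(0.0, 3.0, size=10_000)  # assumed distribution for Ch1

    va1 = 2.54 * ch1**2                             # functional: fully determined by Ch1
    va2 = rng.normal(loc=2.54 * ch1**2, scale=2.0)  # probabilistic: Normal(mean, sd=2)

    print(np.std(va2 - va1))  # ~ 2: the extra scatter of the probabilistic link

Va1 lies exactly on the curve 2.54·Ch1², while Va2 scatters around the same curve with standard deviation 2; that scatter is what distinguishes a probabilistic dependency from a functional one.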

  5. Hierarchical Bayesian models • Relations are probabilistic • Gibbs sampler • An MCMC (Markov chain Monte Carlo) method • Updates a single parameter at a time • Samples from its conditional distribution while the other parameters are held fixed

  6. Gibbs sampling • To introduce the Gibbs sampler, consider a bivariate random variable (x, y), and suppose we wish to compute one or both marginals, p(x) and p(y). • The idea behind the sampler is that it is far easier to consider a sequence of conditional distributions, p(x | y) and p(y | x), than it is to obtain the marginal by integration of the joint density p(x, y), e.g., • p(x) = ∫ p(x, y) dy.

  7. Gibbs sampling in practice • The sampler starts with some initial value y0 for y and obtains x0 by generating a random variable from the conditional distribution p(x | y = y0). • The sampler then uses x0 to generate a new value y1, drawing from the conditional distribution based on the value x0, p(y | x = x0). The sampler proceeds as follows (∼ means "drawn from"): • xi ∼ p(x | y = yi-1) • yi ∼ p(y | x = xi) • Repeating this process k times generates a Gibbs sequence of length k, where a subset of points (xj, yj) for 1 ≤ j ≤ m < k are taken as our simulated draws from the full joint distribution.
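
The sequence described in slides 6-7 can be made concrete with a small self-contained sampler. A standard bivariate normal with correlation rho is used here because both conditionals are known in closed form; the target distribution, burn-in length, and all parameter values are my choices for illustration, not from the slides:

    import numpy as np

    def gibbs_bivariate_normal(rho, k=5000, burn_in=500, seed=0):
        """Gibbs sampler for a standard bivariate normal with correlation rho.

        Full conditionals are known in closed form:
            x | y ~ N(rho * y, 1 - rho^2)
            y | x ~ N(rho * x, 1 - rho^2)
        """
        rng = np.random.default_rng(seed)
        x, y = 0.0, 0.0                 # arbitrary starting point (the y0 of the text)
        scale = np.sqrt(1.0 - rho**2)
        draws = []
        for i in range(k):
            x = rng.normal(rho * y, scale)  # xi ~ p(x | y = yi-1)
            y = rng.normal(rho * x, scale)  # yi ~ p(y | x = xi)
            if i >= burn_in:                # keep a subset as the simulated draws
                draws.append((x, y))
        return np.array(draws)

    samples = gibbs_bivariate_normal(rho=0.8)
    print(samples.mean(axis=0))          # both means ~ 0
    print(np.corrcoef(samples.T)[0, 1])  # correlation ~ 0.8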

  8. Hierarchical model with parameters and hyperparameters • A useful graphical tool for representing hierarchical Bayes models is the directed acyclic graph, or DAG. In this diagram, the likelihood function is represented as the root of the graph; each prior is represented as a separate node pointing to the node that depends on it.
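
As one concrete reading of such a DAG, the sketch below forward-simulates a simple normal-normal hierarchy: hyperparameters at the top, one parameter theta_j per group, and observations at the likelihood level. The model form and all numeric values are assumptions for illustration, not the authors':

    import numpy as np

    rng = np.random.default_rng(1)

    # Hyperparameters (top of the DAG): mean and spread of the group effects.
    mu_hyper, tau_hyper = 0.0, 2.0   # theta_j ~ Normal(mu_hyper, tau_hyper)
    sigma_obs = 1.0                  # observation noise within each group

    # Parameters: one theta per group, drawn from the hyperprior.
    n_groups, n_obs = 5, 20
    theta = rng.normal(mu_hyper, tau_hyper, size=n_groups)

    # Likelihood level (root of the graph): data depend only on their group's theta.
    y = rng.normal(theta[:, None], sigma_obs, size=(n_groups, n_obs))
    print(y.mean(axis=1))  # group means scatter around the sampled thetas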

  9. Bayesian belief nets • Relations are described either with conditional probabilities P(x|y), P(y), or with marginal probabilities P(x), P(y) and a rank correlation between them. • The conditional probabilities must be obtained from somewhere. • Unlike hierarchical Bayes models, belief nets are not designed for updating when new data become available. • The model is used to make inferences.

  10. Bayesian belief nets • P(rain) • P(sprinkler | rain) • P(grass wet | sprinkler, rain)
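
These three tables define the classic rain/sprinkler/wet-grass network, whose joint distribution factorizes as P(W, S, R) = P(W | S, R) P(S | R) P(R). The sketch below fills in the commonly used textbook probabilities (assumed numbers, not values from the presentation) and answers a diagnostic query by enumeration:

    # Rain/sprinkler/wet-grass network: P(W, S, R) = P(W | S, R) P(S | R) P(R).
    # All probabilities are the common textbook numbers, not from the slides.
    P_rain = {True: 0.2, False: 0.8}                        # P(R)
    P_sprinkler = {True: {True: 0.01, False: 0.99},         # P(S | R=True)
                   False: {True: 0.40, False: 0.60}}        # P(S | R=False)
    P_wet_true = {(True, True): 0.99, (True, False): 0.90,  # P(W=True | S, R),
                  (False, True): 0.80, (False, False): 0.0}  # keyed by (S, R)

    # Inference by enumeration: P(R = True | W = True).
    num = den = 0.0
    for r in (True, False):
        for s in (True, False):
            joint = P_rain[r] * P_sprinkler[r][s] * P_wet_true[(s, r)]
            den += joint
            if r:
                num += joint
    print(f"P(rain | grass wet) = {num / den:.3f}")  # ~ 0.358 with these numbers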

  11. Uninet: diagram view • V1: rain (mm/day) • V2: sprinkler on (h/day) • V3: “wetness” of grass (range 0-1)

  12. Uninet: variable definition view

  13. Bayes belief network: unconditional situation

  14. Conditioning on input variables

  15. Conditioning on outcome
