
Bayesian Probabilistic Reasoning and Learning



  1. Bayesian Probabilistic Reasoning and Learning Tang Ying, State Key Lab of CAD&CG, Zhejiang University 03/10/04

  2. Outline • Probability axioms • The meaning of probability • Forward probabilities and inverse probabilities • Facial modeling example

  3. Probability Axioms • Marginal probability – sum the joint probability over the other variables: P(x) = Σ_y P(x, y) • Conditional probability: P(x | y) = P(x, y) / P(y), for P(y) > 0

  4. Product Rule (chain rule) • Obtained from the definition of conditional probability: P(x, y) = P(x | y) P(y) • Sum Rule • A rewriting of the marginal probability definition: P(x) = Σ_y P(x | y) P(y)
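A minimal Python sketch of these rules on a toy 2×2 joint table; the rain/wet variables and their numbers are invented for illustration:

```python
# A toy joint distribution P(rain, wet); all numbers are invented.
P = {
    ("rain", "wet"): 0.28, ("rain", "dry"): 0.02,
    ("no_rain", "wet"): 0.07, ("no_rain", "dry"): 0.63,
}

def marginal(axis, value):
    """Marginal probability: sum the joint over the other variable."""
    return sum(p for k, p in P.items() if k[axis] == value)

def conditional(wet_value, rain_value):
    """Conditional probability: P(wet | rain) = P(rain, wet) / P(rain)."""
    return P[(rain_value, wet_value)] / marginal(0, rain_value)

# Product rule: P(rain, wet) = P(wet | rain) * P(rain).
assert abs(P[("rain", "wet")]
           - conditional("wet", "rain") * marginal(0, "rain")) < 1e-12

# Sum rule: P(wet) = sum over r of P(wet | r) P(r) -- recovers the marginal.
p_wet = sum(conditional("wet", r) * marginal(0, r) for r in ("rain", "no_rain"))
assert abs(p_wet - marginal(1, "wet")) < 1e-12
```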

  5. Bayes’ Theorem • Obtained from the product rule: P(y | x) = P(x | y) P(y) / P(x) • Independence: x and y are independent iff P(x, y) = P(x) P(y)

  6. Example
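A minimal sketch of Bayes’ theorem in action, assuming a hypothetical diagnostic-test setup (all numbers are invented):

```python
# Hypothetical diagnostic test; all numbers are invented for illustration.
p_disease = 0.01            # prior P(B)
p_pos_given_disease = 0.95  # likelihood P(A | B)
p_pos_given_healthy = 0.05  # P(A | not B)

# Evidence P(A) via the sum rule.
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Posterior P(B | A) via Bayes' theorem.
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(f"P(disease | positive) = {p_disease_given_pos:.3f}")  # ~0.161
```

Note how the small prior keeps the posterior well below the test's 0.95 accuracy: the prior matters as much as the likelihood.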

  7. The Meaning of Probability (1) • Describes frequencies of outcomes in random experiments • Probability = the fraction of events in the limit of infinitely many trials • Related to random variables • Used in medicine, biology, etc., where we can repeat random experiments

  8. The Meaning of Probability (2) • Describes degrees of belief in propositions that do not involve random variables • “the probability that Mr. S. was the murderer of Mrs. S., given the evidence” • “the probability that Shakespeare’s plays were written by Francis Bacon” • “the probability that a particular signature on a particular cheque is genuine”

  9. Belief • Let B(X) = “belief in X”, • B(¬X) = “belief in not X” • An ordering of beliefs exists • B(X) = f(B(¬X)) • B(X) = g(B(X|Y),B(Y))

  10. Cox Axioms • Any consistent measure of belief satisfying the desiderata on slide 9 (an ordering of beliefs, and composition rules for negation and conditioning) can be mapped onto probabilities obeying the standard axioms • R.T. Cox, “Probability, frequency, and reasonable expectation,” American J. Physics, 14(1):1–13, 1946

  11. Bayesian Viewpoint • You cannot do inference without making assumptions • Real world is uncertain • Don’t have perfect information • Don’t really know the model • Model is non-deterministic

  12. Forward Probabilities • Forward probability problems involve a generative model that describes a process that is assumed to give rise to some data; the task is to compute the probability distribution or expectation of some quantity that depends on the data
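A minimal forward-probability sketch, assuming a coin with known heads-probability f as the generative model (the setup anticipates the example on slide 15):

```python
import random

# Forward problem: the generative model is a coin with KNOWN
# heads-probability f; the task is to compute the distribution or
# expectation of a quantity derived from the data (here, the head count).
f, N = 0.3, 10

expected_heads = N * f          # analytic expectation: E[k] = N * f

# Monte Carlo check by simulating the generative process many times.
trials = 100_000
mean_heads = sum(
    sum(random.random() < f for _ in range(N)) for _ in range(trials)
) / trials
print(expected_heads, round(mean_heads, 2))   # both close to 3.0
```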

  13. Inverse Probabilities • Like forward probability problems, inverse probability problems involve a generative model of a process; but instead of computing the probability distribution of some quantity produced by the process, we compute the conditional probability of one or more of the unobserved variables in the process, given the observed variables • This invariably requires the use of Bayes’ theorem

  14. Terminology of inverse probability • P(B | A) = P(A | B) P(B) / P(A) • P(B): the prior probability of B • P(A | B): the likelihood of B • P(B | A): the posterior probability of B given A • P(A): the evidence

  15. An example • Bill tosses a coin N times, obtaining a sequence of heads and tails; suppose k heads have occurred in the N tosses. We assume the coin has a probability f of coming up heads, but we do not know f. What is the probability distribution of f?

  16. Assume we have a uniform (subjective) prior, P(f) = 1 on [0, 1]. The posterior is then P(f | k, N) ∝ f^k (1 − f)^(N − k), and the Maximum a Posteriori (MAP) estimate is the value that maximizes it: f = k / N.
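A minimal numeric sketch of this posterior, assuming the uniform prior above and evaluating on a grid:

```python
import numpy as np

# Posterior over the coin's heads-probability f after observing
# k heads in N tosses, under the uniform prior P(f) = 1.
k, N = 7, 10
f = np.linspace(1e-6, 1 - 1e-6, 1001)

posterior = f**k * (1 - f)**(N - k)   # likelihood * flat prior (unnormalized)
posterior /= np.trapz(posterior, f)   # normalize over the grid

f_map = f[np.argmax(posterior)]
print(f_map)                          # ~= k / N = 0.7, the MAP estimate
```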

  17. Learning • Maximum a Posteriori (MAP) • Maximize the posterior: θ_MAP = argmax_θ P(θ | data) • Maximum Likelihood (ML): θ_ML = argmax_θ P(data | θ) • ML is the MAP estimate under uniform priors
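A small numeric illustration of ML versus MAP, continuing the coin example above; the Beta(3, 3) prior is an arbitrary assumption:

```python
import numpy as np

k, N = 7, 10
f = np.linspace(1e-6, 1 - 1e-6, 1001)
likelihood = f**k * (1 - f)**(N - k)

f_ml = f[np.argmax(likelihood)]          # ML estimate: k / N = 0.7

# With a non-uniform Beta(3, 3) prior, MAP is pulled toward 0.5;
# with a uniform prior the two estimates coincide.
prior = f**2 * (1 - f)**2
f_map = f[np.argmax(likelihood * prior)]  # (k + 2) / (N + 4) ~= 0.64
print(f_ml, f_map)
```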

  18. Learning in a nutshell • Create a mathematical model • Get data • Solve for unknowns

  19. Face modeling • Blanz, Vetter, “A Morphable Model for the Synthesis of 3D Faces,” SIGGRAPH 99

  20. Generative model • Faces come from a Gaussian: x ~ N(μ, Σ) • Learning: estimate μ and Σ from example faces
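A hedged sketch of the generative step: draw a face vector from N(μ, Σ). The dimensionality and covariance here are placeholders, not the morphable model’s data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder "face space": each face is a d-dimensional vector of
# coefficients (standing in for the morphable model's shape/texture data).
d = 5
mu = np.zeros(d)                     # mean face (placeholder)
A = rng.standard_normal((d, d))
Sigma = A @ A.T + np.eye(d)          # a positive-definite covariance

face = rng.multivariate_normal(mu, Sigma)   # generative model: x ~ N(mu, Sigma)
print(face)
```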

  21. Bayes’ Rule • P(model | data) = P(data | model) P(model) / P(data) • Often the evidence P(data) does not depend on the model parameters and can be dropped when maximizing

  22. Learning a Gaussian • The maximum-likelihood estimates are the sample mean and covariance: μ = (1/n) Σ_i x_i, Σ = (1/n) Σ_i (x_i − μ)(x_i − μ)^T
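A minimal sketch of the learning step, assuming placeholder data in place of real face vectors:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))    # placeholder: 200 "faces", 5 dimensions

# Maximum-likelihood fit of the Gaussian: sample mean and covariance.
mu_hat = X.mean(axis=0)                              # mu = (1/n) sum_i x_i
Sigma_hat = (X - mu_hat).T @ (X - mu_hat) / len(X)   # ML covariance estimate
```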

  23. Maximization trick • Maximizing P is equivalent to minimizing −log P • The negative log turns products into sums and Gaussian densities into quadratic penalties

  24. Fitting a face to an image • Generative model: the observed image is assumed to be a rendering of the face determined by the model parameters, plus Gaussian noise

  25. Fitting a face to an image • Maximize the posterior of the parameters given the image • Equivalently, minimize the negative log posterior: an image-matching (data) term plus a prior term on the parameters
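A schematic sketch of this maximize/minimize step, with a toy linear map standing in for the actual face renderer; every quantity below is a placeholder assumption:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

# Toy stand-in for the renderer: a linear map R from model parameters to
# "image" pixels (the real system renders a 3D face; R is an assumption).
d, m = 5, 20
R = rng.standard_normal((m, d))
alpha_true = rng.standard_normal(d)
image = R @ alpha_true + 0.1 * rng.standard_normal(m)

sigma2 = 0.01          # assumed image-noise variance
Sigma_inv = np.eye(d)  # placeholder prior precision on the parameters

def energy(alpha):
    """Negative log posterior: image-matching term + Gaussian prior term."""
    residual = image - R @ alpha
    return residual @ residual / (2 * sigma2) + alpha @ Sigma_inv @ alpha / 2

alpha_map = minimize(energy, np.zeros(d)).x   # MAP estimate of the parameters
```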

  26. General features • Models uncertainty • Applies to any generative model • Merges multiple sources of information • Learns all the parameters

  27. Caveats • Still need to understand the model • Not necessarily tractable • Potentially more involved than ad hoc methods

  28. My current work on texture compression • Find the most frequently reused parts of the texture image • Use those parts as samples: for each block of the original image, find the most similar block among the samples • If no similar block is found within the given threshold, cut the current block and paste it into our codebook • For each block of the original image, record its position in the codebook (a rough sketch follows below)
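A rough Python sketch of the scheme above; the block size, the SSD distance, and the threshold value are all assumptions:

```python
import numpy as np

def build_codebook(image, block=8, threshold=100.0):
    """Greedy codebook construction for a grayscale image (2D array).
    Reuse a stored block when a similar one exists within the threshold;
    otherwise cut the current block into the codebook. Returns the
    codebook and, for each block, its index into the codebook."""
    codebook, indices = [], []
    rows, cols = image.shape[0] // block, image.shape[1] // block
    for i in range(rows):
        for j in range(cols):
            b = image[i*block:(i+1)*block, j*block:(j+1)*block].astype(float)
            best, best_d = -1, threshold
            for k, c in enumerate(codebook):      # most similar entry (SSD)
                d = ((b - c) ** 2).sum()
                if d < best_d:
                    best, best_d = k, d
            if best < 0:                          # no match within threshold
                codebook.append(b)
                best = len(codebook) - 1
            indices.append(best)                  # position in the codebook
    return codebook, indices
```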

  29. Thank you!
