
Using Bayes' rule to formulate your problem

Presentation Transcript


  1. Using Bayes’ rule to formulate your problem

  2. Bayes' Rule • P(A | B) = P(B | A) P(A) / P(B), where P(B) = P(B | A) P(A) + P(B | ¬A) P(¬A)

  3. Example: • A : Event that the person tested has cancer • B : Event that the test is positive • We are interested in P(A = has cancer | B = positive). • Suppose the test is 95% accurate: P(B | A) = 0.95 and P(¬B | ¬A) = 0.95. • Suppose we also know the prior probability P(A) that a person has cancer. • According to Bayes' Rule: P(A | B) = P(B | A) P(A) / [P(B | A) P(A) + P(B | ¬A) P(¬A)]
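The example above can be checked numerically. The slide's prior value was in an image and is not recoverable, so the 1% prior below is a hypothetical stand-in, not a figure from the slides:

```python
def posterior(prior_a, p_b_given_a, p_b_given_not_a):
    """P(A | B) via Bayes' rule, expanding P(B) by total probability."""
    p_b = p_b_given_a * prior_a + p_b_given_not_a * (1.0 - prior_a)
    return p_b_given_a * prior_a / p_b

# "95% accurate": P(B | A) = 0.95 and P(not B | not A) = 0.95,
# so the false-positive rate P(B | not A) is 0.05.
# prior_a = 0.01 is an assumed illustrative prior.
print(posterior(prior_a=0.01, p_b_given_a=0.95, p_b_given_not_a=0.05))
```

Note that even with a 95% accurate test, a small prior keeps the posterior well below 1, which is exactly why the prior term in Bayes' rule matters.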

  4. Advantages of Bayesian Model • Allows us to learn and/or model causal relationships between hypothesis model parameters and observations. • Can encode prior knowledge in the prior probability. • Can handle incomplete or noisy observations by incorporating proper priors into the model. • The Bayesian method is a probabilistic method that finds the most likely solution for decision boundaries or model parameters.

  5. Bayesian and the MAP problem • If we are interested in finding a hypothesis model (or its parameters), we can maximize the posterior or joint probability given the current observations: θ* = argmax P(θ | D) = argmax P(D | θ) P(θ). • If we do not have any prior probability (i.e., the prior is uniform), the MAP problem reduces to the Maximum Likelihood (ML) problem: θ* = argmax P(D | θ).

  6. Bayesian method in Computer Vision (Image Denoising Example) • Given a noisy image IN, we want to estimate a clean image I, assuming each pixel is corrupted by noise that is independent and identically distributed (i.i.d.). • Further assume the noise follows a Gaussian distribution. • Based on the i.i.d. and Gaussian assumptions, we can define the likelihood probability: P(IN | I) ∝ ∏p exp(−(IN(p) − I(p))² / 2σ²)

  7. Bayesian method in Computer Vision (Image Denoising Example) • Without a prior, the trivial solution to the image denoising problem is I = IN. • A common prior for the image denoising problem is a neighborhood smoothness prior. • We use a Gaussian distribution again to model the prior probability: P(I) ∝ ∏(p,q) exp(−(I(p) − I(q))² / 2σs²) over neighboring pixel pairs (p, q). • This is a standard MRF approach.
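The likelihood-plus-smoothness-prior model above can be sketched as gradient descent on the negative log-posterior, a simpler solver than the MRF inference methods discussed later. The weight `lam`, the step size, and the wrap-around boundary handling via `np.roll` are all illustrative assumptions, not details from the slides:

```python
import numpy as np

def denoise_map(noisy, lam=0.5, step=0.1, iters=100):
    """Gradient descent on the MAP denoising energy:
       E(I) = sum_p (I(p) - IN(p))^2 + lam * sum over 4-neighbours (I(p) - I(q))^2.
       A minimal sketch; lam, step, iters are illustrative choices."""
    I = noisy.astype(float).copy()
    for _ in range(iters):
        # data term gradient: pulls I toward the noisy observation
        grad = 2.0 * (I - noisy)
        # smoothness term gradient via the discrete 4-neighbour Laplacian
        lap = (np.roll(I, 1, 0) + np.roll(I, -1, 0) +
               np.roll(I, 1, 1) + np.roll(I, -1, 1) - 4.0 * I)
        grad -= 2.0 * lam * lap
        I -= step * grad
    return I
```

The result trades off fidelity to IN against neighborhood smoothness; with `lam = 0` it reproduces the trivial solution I = IN.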

  8. Bayesian method in Computer Vision (Image Denoising Example) • Results comparison (figure): Input, Gaussian filter, Median filter, MRF, Ground truth.

  9. Optimization Methods • Linear Regression • Alternating Optimization (EM Algorithm) • Belief Propagation

  10. Linear Regression • Used when the model parameters are linear. • Basic assumptions: estimation errors follow a Gaussian distribution; observations are independent and identically distributed. • The globally optimal solution corresponds to the least-squares solution of Ax = b. • Closed-form solution obtained by solving an SVD system.
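A minimal sketch of the closed-form least-squares solution: `np.linalg.lstsq` solves Ax = b via SVD, matching the slide's "SVD system". The line-fit data below is our own illustration:

```python
import numpy as np

# Fit y = a*x + b by least squares on exactly linear data.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])          # exactly y = 2x + 1
A = np.column_stack([x, np.ones_like(x)])   # design matrix [x | 1]
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)  # SVD-based solver
print(a, b)  # 2.0 and 1.0, up to floating-point error
```

With noisy observations the same call returns the maximum-likelihood fit under the Gaussian-error assumption stated above.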

  11. Alternating Optimization • Used when model parameters are interdependent. • Divide the parameters into disjoint subsets: {A0,…,AN} = {A0,…,Ai} ∪ {Ai+1,…,AN}. • Optimize {A0,…,Ai} and {Ai+1,…,AN} alternately and iteratively. • Convergence to the global optimum is generally not guaranteed; solutions converge to local maxima/minima only.
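The alternating scheme can be sketched on a small problem of our own choosing (rank-1 matrix factorization, not from the slides): fit M ≈ u vᵀ by alternately solving for u with v fixed and for v with u fixed. Each sub-problem is linear least squares, so each step is closed-form:

```python
import numpy as np

rng = np.random.default_rng(0)
u_true = rng.standard_normal(5)
v_true = rng.standard_normal(4)
M = np.outer(u_true, v_true)   # exactly rank-1 target

u = np.ones(5)
v = np.ones(4)
for _ in range(50):
    u = M @ v / (v @ v)        # optimal u for fixed v (least squares)
    v = M.T @ u / (u @ u)      # optimal v for fixed u (least squares)

print(np.linalg.norm(M - np.outer(u, v)))  # residual shrinks toward 0
```

Each alternation can only decrease the residual, which is why the iterates converge; as the slide notes, in general only to a local optimum (here the problem is easy enough that the fit becomes exact).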

  12. AO Image Denoise Example • We also want to estimate the standard deviation σ of the Gaussian noise. • The model parameters become {I, σ}; the two disjoint subsets of parameters are {I} and {σ}. • The update rules: solve for I given σ by MRF inference; solve for σ given I from the residuals I − IN.

  13. AO Image Denoise Example • Effects of σ in image denoising (figure): optimal σ, large σ, small σ.

  14. Expectation Maximization • The EM algorithm is a special case of AO consisting of an E-step and an M-step. • Solves for maximum-likelihood parameters. • Convergence (to a local optimum) is guaranteed. • Basic assumption: observations are i.i.d.

  15. EM-GMM Estimation Example • Observations: an incomplete data set. • Parameters: the probability density function for the 'missing' data (component assignments) and the Gaussian distribution parameters. • E-step: compute the posterior probability (responsibility) of each component for each observation under the current parameters. • M-step: re-estimate the mixture weights, means, and variances from the responsibility-weighted observations.
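The E-step and M-step above can be sketched for a two-component 1-D Gaussian mixture. The component count, initialization, and iteration budget are illustrative assumptions, not details from the slides:

```python
import numpy as np

def em_gmm_1d(x, iters=100):
    """Minimal EM sketch for a two-component 1-D Gaussian mixture."""
    mu = np.array([x.min(), x.max()])     # crude but serviceable init
    var = np.array([x.var(), x.var()])
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibilities r[i, k] = P(component k | x_i)
        d = x[:, None] - mu[None, :]
        lik = w * np.exp(-0.5 * d**2 / var) / np.sqrt(2.0 * np.pi * var)
        r = lik / lik.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted maximum-likelihood updates
        n = r.sum(axis=0)
        w = n / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n
        var = (r * (x[:, None] - mu)**2).sum(axis=0) / n
    return w, mu, var
```

Each iteration increases the data likelihood, consistent with the guaranteed convergence noted on the previous slide.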

  16. Belief Propagation • Solves the discrete labeling problem in a pairwise Markov Random Field (MRF). • Basic formulation: minimize the energy E(x) = Σp Dp(xp) + Σ(p,q) V(xp, xq), with data terms Dp and pairwise smoothness terms V.

  17. Belief Propagation • An iterative inference algorithm that propagates messages in the network. • Standard solvers are available.
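On a chain-structured MRF, min-sum message passing is exact and short enough to sketch (grid MRFs, as in the denoising example, need the loopy variant). The labeling problem below is our own toy illustration:

```python
import numpy as np

def chain_map_bp(unary, pairwise):
    """Min-sum belief propagation (MAP labeling) on a chain MRF.
    unary: (n, L) per-node label costs; pairwise: (L, L) shared edge costs.
    Exact on chains (equivalent to the Viterbi algorithm)."""
    n, L = unary.shape
    msg = np.zeros((n, L))      # msg[i] = message passed from node i-1 into node i
    for i in range(1, n):
        cost = (unary[i - 1] + msg[i - 1])[:, None] + pairwise
        msg[i] = cost.min(axis=0)          # minimize over the sender's label
    # backward pass: read off the MAP labels
    labels = np.zeros(n, dtype=int)
    labels[-1] = int((unary[-1] + msg[-1]).argmin())
    for i in range(n - 2, -1, -1):
        cost = unary[i] + msg[i] + pairwise[:, labels[i + 1]]
        labels[i] = int(cost.argmin())
    return labels
```

For example, with binary labels, unary costs favoring labels 0, 0, 1, 1 and a Potts-style pairwise cost, the solver returns that smooth labeling.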

  18. Bayesian Model Summary • The definitions of the likelihood and prior probabilities are unique to each problem. • To apply a Bayesian model in your research, you need to: • Identify the problem and define your goal. • Understand what observations you can obtain from the problem and what assumptions you can make about it. • Define the variables. • Write down the Bayesian equation based on the causal relationships between variables. • Define the likelihood probabilities from the observations and the prior probabilities from the assumptions. • Understand the nature of your objective function and choose a proper method to solve the problem. • Evaluate the results. If the results differ from your expectations, you may need to reformulate the problem or double-check the implementation details.

  19. Tips in Bayes': Make your life easier • Express everything you know as probabilities. • Use Gaussians everywhere, maybe a mixture of them. • Learn from examples when you have them. • Hack a noise model when you don't know one. • Leave terms constant when desperate. • Use Ax = b when the equations are quadratic. • Use alternating optimization when the parameters are mixed. • Use an MRF for "segmentation"-style problems. • Work in the log domain, where everything is additive. • Find the maximum.
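The log-domain tip deserves a tiny demonstration (our own example, with NumPy for convenience): multiplying many small probabilities underflows to zero in the linear domain, while their logs simply add:

```python
import numpy as np

p = np.full(1000, 0.01)     # 1000 independent probabilities of 0.01
print(p.prod())             # 0.0: the product 1e-2000 underflows a float64
print(np.log(p).sum())      # about -4605.17: exact in the log domain
```

This is why likelihoods, priors, and BP messages are routinely handled as log-probabilities, where "finding the maximum" means maximizing a sum.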
