
Markov random field: A brief introduction








  1. Markov random field: A brief introduction Tzu-Cheng Jen Institute of Electronics, NCTU 2007-03-28

  2. Outline • Neighborhood system and cliques • Markov random field • Optimization-based vision problem • Solver for the optimization problem

  3. Neighborhood system and cliques

  4. Prior knowledge • To explain the concept of the MRF, we first introduce the following definitions: 1. i: Site (pixel) 2. Ni: The set of sites neighboring i 3. S: Set of sites (image) 4. fi: The value at site i (intensity) A 3x3 image

  5. Neighborhood system • The sites in S are related to one another via a neighborhood system, defined as N = {Ni | i ∈ S}, where Ni is the set of sites neighboring i. • The neighboring relationship has the following properties: (1) A site is not a neighbor of itself: i ∉ Ni (2) The neighboring relationship is mutual: i ∈ Ni′ if and only if i′ ∈ Ni

  6. Neighborhood system: Example First order neighborhood system Second order neighborhood system Nth order neighborhood system
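
The first- and second-order systems above can be sketched in code. This is a minimal illustration, not from the slides (the function name `neighbors` is ours), treating order 1 as 4-connectivity and order 2 as 8-connectivity on an H x W grid:

```python
def neighbors(i, j, H, W, order=1):
    """Return the neighboring sites of (i, j) for the given order."""
    if order == 1:      # first order: up, down, left, right
        offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    else:               # second order: adds the four diagonals
        offsets = [(di, dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)
                   if (di, dj) != (0, 0)]
    return [(i + di, j + dj) for di, dj in offsets
            if 0 <= i + di < H and 0 <= j + dj < W]

# Interior site of a 3x3 image: 4 first-order, 8 second-order neighbors.
print(len(neighbors(1, 1, 3, 3, order=1)))  # 4
print(len(neighbors(1, 1, 3, 3, order=2)))  # 8
# Corner site: boundary sites have fewer neighbors.
print(len(neighbors(0, 0, 3, 3, order=1)))  # 2
```

Note that both required properties hold by construction: the offset (0, 0) is excluded, and the offset sets are symmetric, so the relationship is mutual.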

  7. Neighborhood system: Example The neighboring sites of site i are m, n, and f. The neighboring sites of site j are r and x.

  8. Clique • A clique C is a subset of sites in S in which every pair of distinct sites are neighbors; single-site subsets also count as cliques. Some examples follow

  9. Clique: Example • Take the first-order and second-order neighborhood systems for example: Neighborhood system Clique types
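
For the first-order system, the clique types are single sites and horizontally or vertically adjacent pairs. A small sketch (ours, not from the slides) counting them on a 3x3 image:

```python
import itertools

# Enumerate the cliques of a first-order (4-connected) neighborhood
# system on a 3x3 grid: single sites, plus pairs of sites at Manhattan
# distance 1. Larger clique types (triples, 2x2 blocks) arise only for
# higher-order neighborhood systems.
H = W = 3
sites = [(i, j) for i in range(H) for j in range(W)]
pair_cliques = [(s, t) for s, t in itertools.combinations(sites, 2)
                if abs(s[0] - t[0]) + abs(s[1] - t[1]) == 1]

print(len(sites))         # 9 single-site cliques
print(len(pair_cliques))  # 12 pair cliques (6 horizontal + 6 vertical)
```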

  10. Markov random field

  11. Markov random field (MRF) • View the 2D image f as a collection of random variables (a random field) • A random field is said to be a Markov random field if it satisfies the following properties: (1) Positivity: P(f) > 0 for every configuration f (2) Markovianity: P(fi | fS−{i}) = P(fi | fNi), i.e., the value at a site depends on the rest of the image only through its neighbors Image configuration f

  12. Gibbs random field (GRF) and Gibbs distribution • A random field is said to be a Gibbs random field if and only if its configuration f obeys a Gibbs distribution, that is: P(f) = Z^−1 × exp(−U(f)/T), where Z = Σf exp(−U(f)/T) is the normalizing constant and U(f) = Σc∈C Vc(f) sums the clique potentials • Design U for different applications U(f): Energy function; T: Temperature; Vc(f): Clique potential Image configuration f
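
As a concrete toy instance of the distribution above (our choice, not from the slides), take an Ising-style pairwise potential, Vc(f) = 1 when the two sites of a clique disagree and 0 otherwise, on a short 1D chain of binary sites:

```python
import itertools
import math

def U(f):
    """Energy: sum of pairwise clique potentials along the chain."""
    return sum(1.0 for a, b in zip(f, f[1:]) if a != b)

def gibbs(T=1.0, n=4):
    """Gibbs distribution P(f) = exp(-U(f)/T) / Z over binary chains."""
    configs = list(itertools.product([0, 1], repeat=n))
    weights = [math.exp(-U(f) / T) for f in configs]
    Z = sum(weights)                      # partition function
    return {f: w / Z for f, w in zip(configs, weights)}

P = gibbs()
# Smooth (all-equal) configurations get the highest probability,
# which is exactly what a smoothness energy U is designed to do.
print(P[(0, 0, 0, 0)] > P[(0, 1, 0, 1)])  # True
```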

  13. Markov-Gibbs equivalence • Hammersley-Clifford theorem: A random field F is an MRF if and only if F is a GRF. Proof (⇐): Let P(f) be a Gibbs distribution on S with the neighborhood system N. A 3x3 image

  14. Markov-Gibbs equivalence • Divide the set of cliques C into two sets A and B, where A consists of the cliques containing i and B of the cliques not containing i. A 3x3 image

  15. Optimization-based vision problem

  16. Denoising Noisy signal d → denoised signal f

  17. MAP formulation for denoising problem • The signal denoising problem can be modeled as a MAP estimation problem, that is: f* = arg maxf P(f | d) = arg maxf P(d | f)P(f) (Observation model: P(d | f)) (Prior model: P(f))

  18. MAP formulation for denoising problem • Assume the observation is the true signal plus independent Gaussian noise, that is: di = fi + ni, where ni ~ N(0, σ²) • Under this assumption, the observation model can be expressed as P(d | f) ∝ exp(−U(d | f)), where U(d | f) = Σi (fi − di)²/(2σ²) U(d|f): Likelihood energy
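
The likelihood energy is just a scaled sum of squared residuals; a minimal sketch (the function name is ours):

```python
def likelihood_energy(f, d, sigma=1.0):
    """U(d|f) = sum_i (f_i - d_i)^2 / (2 * sigma^2): the data-fit term."""
    return sum((fi - di) ** 2 for fi, di in zip(f, d)) / (2.0 * sigma ** 2)

print(likelihood_energy([1.0, 2.0], [1.0, 2.0]))  # 0.0 for a perfect fit
print(likelihood_energy([0.0], [2.0]))            # 2.0
```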

  19. MAP formulation for denoising problem • Assume the unknown data f is an MRF; the prior model is then P(f) = Z^−1 × exp(−U(f)/T) • Based on the above information, the posterior probability becomes P(f | d) ∝ exp(−(U(d | f) + U(f)))

  20. MAP formulation for denoising problem • The MAP estimator for the problem is: f* = arg minf (U(d | f) + U(f))

  21. MAP formulation for denoising problem • Define the smoothness prior U(f) = λ Σi (fi+1 − fi)² • Substituting the above information into the MAP estimator, we get: f* = arg minf [ Σi (fi − di)²/(2σ²) + λ Σi (fi+1 − fi)² ] Observation model (similarity measure) Prior model (reconstruction constraint)
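
Putting the two terms together, the MAP estimate can be found by gradient descent on E(f) = Σ(fi − di)² + λ Σ(fi+1 − fi)². A sketch with σ² absorbed into λ; the step size, λ, and the test signal are our choices:

```python
def denoise(d, lam=2.0, step=0.05, iters=500):
    """Minimize sum (f_i - d_i)^2 + lam * sum (f_{i+1} - f_i)^2."""
    f = list(d)                                   # start at the noisy signal
    n = len(f)
    for _ in range(iters):
        g = [2.0 * (f[i] - d[i]) for i in range(n)]   # data-term gradient
        for i in range(n - 1):                        # smoothness gradient
            diff = 2.0 * lam * (f[i + 1] - f[i])
            g[i] -= diff
            g[i + 1] += diff
        f = [fi - step * gi for fi, gi in zip(f, g)]
    return f

noisy = [0.0, 1.2, -0.3, 0.9, 0.1]
smooth = denoise(noisy)
tv = lambda x: sum(abs(b - a) for a, b in zip(x, x[1:]))
print(tv(smooth) < tv(noisy))  # True: the prior suppresses oscillation
```

Larger λ trusts the smoothness prior more; λ → 0 returns the noisy observation unchanged.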

  22. Super-resolution • Super-Resolution (SR): A method to reconstruct high-resolution images/videos from low-resolution images/videos

  23. Super-resolution • Illustration for super-resolution: the low-resolution frames d(1), d(2), d(3), d(4) are used to reconstruct the high-resolution frame f(1)

  24. MAP formulation for super-resolution problem • The super-resolution problem can be modeled as a MAP estimation problem, that is: f* = arg maxf P(f | d(1), …, d(K)) = arg maxf P(d(1), …, d(K) | f)P(f) (Observation model) (Prior model)

  25. MAP formulation for super-resolution problem • The conditional PDF can be modeled as a Gaussian distribution if the noise source is Gaussian noise • We also assume the prior model is a joint Gaussian distribution

  26. MAP formulation for super-resolution problem • Substituting the above relations into the MAP estimator, we get the following expression: (Observation model) (Prior model)
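
With a Gaussian likelihood and a Gaussian prior, the MAP estimator becomes a linear least-squares problem: minimize ||d − Wf||² + λ||Lf||², solved by the normal equations (WᵀW + λLᵀL)f = Wᵀd. A pure-Python toy sketch; W, L, d, and λ below are our illustrative choices, not from the slides:

```python
def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def solve(A, b):
    """Gaussian elimination with partial pivoting (A square, b vector)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            m = M[r][k] / M[k][k]
            M[r] = [x - m * y for x, y in zip(M[r], M[k])]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][j] * x[j]
                              for j in range(k + 1, n))) / M[k][k]
    return x

W = [[0.5, 0.5, 0.0, 0.0],          # 2x downsampling by averaging
     [0.0, 0.0, 0.5, 0.5]]
L = [[-1.0, 1.0, 0.0, 0.0],         # first differences (smoothness prior)
     [0.0, -1.0, 1.0, 0.0],
     [0.0, 0.0, -1.0, 1.0]]
d = [1.0, 3.0]                      # low-resolution observation
lam = 0.1

Wt, Lt = transpose(W), transpose(L)
A = [[wa + lam * la for wa, la in zip(rw, rl)]
     for rw, rl in zip(matmul(Wt, W), matmul(Lt, L))]
b = [sum(w * di for w, di in zip(col, d)) for col in zip(*W)]
f = solve(A, b)                     # the MAP estimate
print([round(v, 2) for v in f])     # monotone, averaging back to ~d
```

In real super-resolution, W would encode the warping, blurring, and downsampling that map the high-resolution frame to each low-resolution frame.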

  27. Solver for the optimization problem

  28. The solver of the optimization problem • In this section, we introduce different approaches to solving the optimization problem: 1. Brute-force search (global extremum) 2. Gradient descent search (usually a local extremum) 3. Genetic algorithm (global extremum) 4. Simulated annealing algorithm (global extremum)

  29. Gradient descent algorithm (1)

  30. Gradient descent algorithm (2)
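
The two slides above were figures; the generic update f ← f − step·∇E(f) can be sketched as follows (ours, with a central-difference numerical gradient so that any energy E can be plugged in):

```python
def grad_descent(E, f0, step=0.1, iters=200, h=1e-6):
    """Minimize E by f <- f - step * grad E(f); gradient by central difference."""
    f = list(f0)
    for _ in range(iters):
        g = []
        for i in range(len(f)):
            fp, fm = list(f), list(f)
            fp[i] += h
            fm[i] -= h
            g.append((E(fp) - E(fm)) / (2.0 * h))   # central difference
        f = [fi - step * gi for fi, gi in zip(f, g)]
    return f

# Quadratic bowl with its minimum at (1, -2): a convex case where
# gradient descent does reach the global extremum.
E = lambda f: (f[0] - 1.0) ** 2 + (f[1] + 2.0) ** 2
f = grad_descent(E, [0.0, 0.0])
print([round(v, 3) for v in f])  # ≈ [1.0, -2.0]
```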

  31. Simulation: SR by gradient descent algorithm Use 6 low-resolution frames (a)~(f) to reconstruct the high-resolution frame (g)

  32. Simulation: SR by gradient descent algorithm

  33. The problem of the gradient descent algorithm • Gradient descent may be trapped in a local extremum instead of reaching the global extremum

  34. Genetic algorithm (GA) • The GA includes the following steps:
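
The slide's step list was an image; the standard GA loop is selection, crossover, and mutation. A hedged toy sketch (all names and parameters are ours) on the "one-max" problem, where fitness is the number of 1 bits:

```python
import random

def ga(fitness, n_bits=10, pop_size=20, gens=60, p_mut=0.05, seed=0):
    """Toy GA: selection (keep fitter half), one-point crossover, mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)   # selection: fitter half survive
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_bits)    # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (rng.random() < p_mut) for bit in child]  # mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = ga(fitness=sum)   # one-max: fitness = number of 1 bits
print(sum(best))
```

Because the fittest individuals always survive (elitism), the best fitness never decreases across generations, while mutation keeps exploring new configurations.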

  35. Simulated annealing (SA) • The SA includes the following steps:
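
The slide's step list was also an image; the standard SA loop proposes a random move, always accepts improvements, accepts worse moves with probability exp(−ΔE/T), and cools T gradually. A toy 1D sketch (energy function and parameters are ours):

```python
import math
import random

def anneal(E, x0, step=0.5, T0=2.0, cooling=0.995, iters=3000, seed=0):
    """Simulated annealing on a 1D energy E."""
    rng = random.Random(seed)
    x, T = x0, T0
    best = x
    for _ in range(iters):
        x_new = x + rng.uniform(-step, step)       # random proposal
        dE = E(x_new) - E(x)
        # Always accept improvements; accept worse moves with
        # probability exp(-dE / T) so the search can escape local minima.
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            x = x_new
            if E(x) < E(best):
                best = x
        T *= cooling                               # geometric cooling
    return best

# Multimodal toy energy: plain gradient descent started at x = 4 would
# stall in a nearby local minimum; SA can hop between basins.
E = lambda x: x * x + 4.0 * math.sin(3.0 * x)
best = anneal(E, x0=4.0)
print(round(E(best), 2))
```

With a slow enough cooling schedule, SA converges to the global extremum in probability, which is why the slide classifies it (like the GA and brute-force search) as a global method.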
