
From Learning Models of Natural Image Patches to Whole Image Restoration


Presentation Transcript


  1. From Learning Models of Natural Image Patches to Whole Image Restoration Daniel Zoran, Interdisciplinary Center for Neural Computation, Hebrew University of Jerusalem; Yair Weiss, School of Computer Science and Engineering, Hebrew University of Jerusalem. Presented by Eric Wang, Duke University, 6/18/2012

  2. Introduction • Patch-based learning on images has significant computational advantages over learning dictionaries over the entire image. • A primary concern of this paper is the effect that the choice of dictionary prior has on the performance of the model. • This paper addresses three main questions: • (1) Do priors that give high likelihoods yield better patch restoration performance? • (2) Do priors that give high likelihoods yield better whole-image restoration performance? • (3) Can we learn better priors?

  3. Motivation and Patch Restoration • Answer to question (1): priors with higher likelihoods yield improved per-patch denoising performance, as shown with several popular priors. Figure: PSNR of restoration on patches vs. dictionary (prior) likelihood; trained on 50,000 8x8 patches of natural images of faces and tested on unseen patches.

  4. From Patches to Whole Image Restoration • Good patch-based image restoration does not guarantee high-quality whole-image restoration; many reconstruction methods can generate significant artifacts.

  5. Expected Patch Log-Likelihood • Restoring randomly chosen patches can help minimize artifacting, but has the issue that most random patches will have low likelihood under a given dictionary. • This paper presents an optimization algorithm that maximizes the expected patch log-likelihood (EPLL) with the constraint that the reconstructed image be close to the corrupted observation. • The EPLL of image x under prior p is defined as EPLL_p(x) = Σ_i log p(P_i x), where P_i is a matrix (mask) that extracts the i-th patch from image x.
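As a rough illustration (not the paper's code), the EPLL can be computed by summing the log-likelihood of every overlapping patch P_i x. The function names below (`extract_patches`, `epll`, `gaussian_log_p`) are hypothetical, and a toy zero-mean isotropic Gaussian stands in for a learned prior:

```python
import numpy as np

def extract_patches(x, p):
    """Return all overlapping p x p patches of image x, one per row (each row is P_i x)."""
    H, W = x.shape
    return np.array([x[i:i+p, j:j+p].ravel()
                     for i in range(H - p + 1)
                     for j in range(W - p + 1)])

def epll(x, log_p, p=8):
    """EPLL_p(x) = sum_i log p(P_i x), summed over all overlapping patches."""
    return sum(log_p(z) for z in extract_patches(x, p))

def gaussian_log_p(z, sigma=1.0):
    """Toy stand-in prior: log density of N(0, sigma^2 I)."""
    d = z.size
    return -0.5 * (z @ z) / sigma**2 - 0.5 * d * np.log(2 * np.pi * sigma**2)
```

For a real model the toy `gaussian_log_p` would be replaced by the learned prior's log-density (e.g. a GMM log-likelihood, as in the later slides).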

  6. Cost Function • Let y = Ax + n be the corruption model on the image, where x is the vectorized image, A defines the corruption model, and y is the vectorized noisy observation. • The cost function to minimize is then f_p(x|y) = (λ/2)‖Ax − y‖² − EPLL_p(x). • Direct optimization of this cost function is intractable, so an alternative method called half-quadratic splitting is used: c_p(x, {z_i}|y) = (λ/2)‖Ax − y‖² + Σ_i [(β/2)‖P_i x − z_i‖² − log p(z_i)], where {z_i} are a set of per-patch auxiliary variables.

  7. The Corruption Model • The choice of the matrix A is determined by the application. • For denoising, A is an identity matrix, and λ is the noise precision. • For deblurring, A is a convolution matrix with a known kernel. • For inpainting, A is a diagonal matrix with zeros for the missing elements.
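The three choices of A can be sketched for a tiny vectorized image. These matrices are purely illustrative assumptions; in particular, the blur example uses a 1-D circular moving-average kernel rather than a real 2-D blur:

```python
import numpy as np

n = 4  # tiny vectorized "image" for illustration

# Denoising: A is the identity matrix
A_denoise = np.eye(n)

# Inpainting: diagonal matrix with zeros at the missing pixels
missing = np.array([1, 3])
A_inpaint = np.eye(n)
A_inpaint[missing, missing] = 0.0

# Deblurring: A applies a known convolution kernel
# (1-D moving average with circular boundary, for illustration only)
kernel = np.array([0.25, 0.5, 0.25])
A_blur = np.zeros((n, n))
for i in range(n):
    for offset, w in zip((-1, 0, 1), kernel):
        A_blur[i, (i + offset) % n] = w
```

In each case the observation is modeled as y = Ax plus noise, and only A changes between applications.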

  8. Optimization • The EPLL optimization alternates between two steps: (1) solving for x given {z_i}, which is a quadratic problem with a closed-form solution, and (2) solving for {z_i} given x, which is dependent on the dictionary and involves solving a MAP estimate of the most likely patch under the prior for each P_i x. • β is either set by hand or set to 1/σ², where σ is the estimated noise standard deviation.
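A minimal sketch of the two alternating steps for the denoising case (A = I), assuming a toy zero-mean Gaussian patch prior N(0, σ_p² I) so that the z-step has a closed-form shrinkage; the function name and defaults are hypothetical, not the paper's implementation:

```python
import numpy as np

def denoise_hqs(y, lam, beta, sigma_p=1.0, p=2, iters=5):
    """Half-quadratic splitting for denoising (A = I) under a toy Gaussian patch prior."""
    H, W = y.shape
    x = y.copy()
    for _ in range(iters):
        # Step (2): z-step. MAP estimate of each patch under the prior;
        # for N(0, sigma_p^2 I) this is closed-form shrinkage of P_i x toward zero.
        shrink = beta * sigma_p**2 / (beta * sigma_p**2 + 1.0)
        z_sum = np.zeros_like(x)    # accumulates sum_i P_i^T z_i
        counts = np.zeros_like(x)   # accumulates the diagonal of sum_i P_i^T P_i
        for i in range(H - p + 1):
            for j in range(W - p + 1):
                z = shrink * x[i:i+p, j:j+p]
                z_sum[i:i+p, j:j+p] += z
                counts[i:i+p, j:j+p] += 1.0
        # Step (1): x-step. The cost is quadratic in x given {z_i}, so the
        # minimizer is closed-form and, with A = I, decouples per pixel:
        # x = (lam*y + beta*sum_i P_i^T z_i) / (lam + beta*counts)
        x = (lam * y + beta * z_sum) / (lam + beta * counts)
    return x
```

With a GMM prior the z-step is no longer a simple shrinkage; it becomes a MAP estimate under the mixture (in practice, Wiener filtering with the covariance of the most likely component), but the alternation itself is unchanged.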

  9. Image Restoration • Answer to question (2): it is shown that priors with higher patch likelihoods yield improved whole-image restoration.

  10. Building a better dictionary via the GMM • Zero-mean Gaussian data can usually be well represented by the top-m eigenvectors of its covariance matrix. • This paper proposes clustering the patches via a GMM; patches sharing a mixture component also share a dictionary (the eigenvectors of that component's covariance). Figure: sample dictionaries for six mixture components.
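The eigenvector observation can be checked numerically with synthetic data. This is a stand-alone illustration, not a trained GMM: one "cluster" of zero-mean Gaussian patches is generated near an m-dimensional subspace, and its dictionary is taken as the top-m eigenvectors of the sample covariance:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic zero-mean Gaussian "patches" with low effective rank
d, n, m = 16, 2000, 4
basis = rng.standard_normal((d, m))
data = rng.standard_normal((n, m)) @ basis.T    # lies near an m-dim subspace
data += 0.01 * rng.standard_normal((n, d))      # small isotropic noise

# Eigen-decomposition of the sample covariance
cov = data.T @ data / n
eigvals, eigvecs = np.linalg.eigh(cov)           # ascending eigenvalues
D = eigvecs[:, -m:]                              # top-m eigenvectors = "dictionary"

# Projecting onto the top-m eigenvectors captures almost all of the energy
proj = data @ D @ D.T
err = np.linalg.norm(data - proj) / np.linalg.norm(data)
```

In the paper's setting each of the 200 mixture components gets its own covariance, so each cluster of patches gets its own such eigenvector dictionary.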

  11. Building a better dictionary via the GMM • Answer to question (3): The GMM prior outperforms other priors in both patch and whole-image restoration. PSNR values, 200 GMM components

  12. Building a better dictionary via the GMM • The GMM prior is also shown to outperform the ICA prior in reconstruction with EPLL at a given noise level.

  13. Image Denoising • 68 images from the Berkeley dataset, 8x8 patches; comparison is in PSNR.

  14. Image Deblurring • 68 images from the Berkeley dataset with known blur kernels and 1% white Gaussian noise (comparison is in PSNR)
