
4054 Machine Vision Modelling Textures


Presentation Transcript


  1. 4054 Machine Vision: Modelling Textures. Dr. Simon Prince, Dept. Computer Science, University College London. http://www.cs.ucl.ac.uk/s.prince/4054.htm

  2. Introduction
  • Individual pixels: measurement at single pixels; inference of labels; labels not connected
  • Markov Random Fields: measurement at single pixels; labels connected to 4 neighbours
  • Textures: measurement at single pixels with labels connected over a larger region, or measurement of image patches with (patch) labels connected to 4 neighbours

  3. Modelling Textures
  • Goals of texture modelling
  • Texture synthesis
  • Inference: non-parametric approach
  • Inference: parametric models
  • Epitomes and Jigsaws

  4. Modelling Textures 1. Goals of texture modelling

  5. What is a texture?
  • Hard to define, but for our purposes:
  • describes the local appearance of a larger object
  • image properties determined by the object's material
  • joint statistics of pixels in a small image region
  • usually stochastic
  • usually spatially homogeneous/stationary
  Note: not used in the same sense as "texture mapping" in computer graphics!

  6. Why model textures?
  • Rationale: a step towards modelling objects – properties are more stochastic and harder to describe than single pixels, but still fairly regular.
  • What can we achieve?
  • Texture synthesis
  • Super-resolution
  • Image denoising
  • Image inpainting
  • Image segmentation

  7. Texture Synthesis
  [Figure: a finite input image is drawn from the true (infinite) texture; SYNTHESIS produces a generated image]
  GOAL: Given a finite sample of some texture, synthesize other samples from that same texture.
  METHOD: Learn a generative model for this texture and sample from it.

  8. Image Inpainting
  • GOAL: Given an image with missing regions, fill them in with plausible values
  • METHOD:
  • learn a model of natural textures
  • generate from the model, with the constraint that the result matches the non-damaged image pixels at the edges of the missing region

  9. Super-resolution
  [Figure: original image; bicubic interpolation; super-resolution]
  • GOAL: Given a low-resolution image, produce a higher-resolution version
  • METHOD:
  • describe the generation of the low-res. image from a latent high-res. image
  • infer the high-res. image from the low-res. image (inverse problem)
  • use a model of natural texture as a prior
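As a sketch of this inverse-problem formulation (the notation here is illustrative, not from the slides): writing the observed low-resolution image as y and the latent high-resolution image as x, the MAP estimate is x* = argmax_x p(y | x) p(x), where p(y | x) describes how the low-resolution image is generated from the high-resolution one and p(x) is the natural-texture prior. The denoising task on the next slide has exactly the same structure, with y the noisy image and x the clean one.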

  10. Image Denoising
  [Figure: original (noisy) image; restored image]
  • GOAL: Given a noisy image, produce a clean version
  • METHOD:
  • describe the generation of the noisy image from a latent clean image
  • infer the clean image from the noisy image (inverse problem)
  • use a model of natural texture as a prior

  11. Image Segmentation
  • GOAL: Segment scenes into semantically meaningful regions (tree, car, sky, etc.)
  • METHOD:
  • build a model of each constituent texture
  • put it in an MRF framework (tree pixels tend to be next to tree pixels, etc.)
  • find the MAP segmentation

  12. Modelling Textures 2. Texture synthesis

  13. Texture Synthesis
  [Figure: a finite input image is drawn from the true (infinite) texture; SYNTHESIS produces a generated image]
  GOAL: Given a finite sample of some texture, synthesize other samples from that same texture.
  METHOD: Learn a generative model for this texture and sample from it.

  14. Texture Synthesis Milestones
  Plan: discuss 4 key papers in texture synthesis:
  • Pyramid-based synthesis (Heeger and Bergen, 1995)
  • Non-parametric sampling (Efros and Leung, 1999)
  • Image quilting (Efros and Freeman, 2001)
  • Graphcut textures (Kwatra et al., 2003)
  Then... discuss the relationship to Markov random fields, and then... consider the analysis of texture.

  15. Key Paper #1: Pyramid-based texture synthesis (Heeger & Bergen, 1995)
  Key idea: the new image should have the same statistics as the sample.
  Start with a random image and manipulate it until its statistics are similar.
  [Figure: sample; new image (start); new image (finish)]

  16. What statistics should we match?
  • Just matching RGB pixel statistics is insufficient
  • Filter the image with a filter bank
  • Match the marginal statistics of the filter responses
  Particular choice of filters: steerable pyramids. The filter bank turns one image into many, each with a different scale and orientation.
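To illustrate how a filter bank turns one image into many responses at different scales and orientations, here is a minimal Python sketch using oriented Gaussian-derivative filters. This is only a stand-in for the steerable pyramid used in the paper, and the function name is illustrative.

```python
# Minimal sketch (not a true steerable pyramid): oriented first-derivative-of-
# Gaussian filters at several scales, so one image becomes many response images.
import numpy as np
from scipy import ndimage

def filter_bank_responses(image, scales=(1.0, 2.0, 4.0), n_orientations=4):
    """Return one response image per (scale, orientation) pair."""
    image = image.astype(float)
    responses = []
    for sigma in scales:
        # Derivatives of a Gaussian along the column (x) and row (y) axes.
        gx = ndimage.gaussian_filter(image, sigma, order=(0, 1))
        gy = ndimage.gaussian_filter(image, sigma, order=(1, 0))
        for k in range(n_orientations):
            theta = k * np.pi / n_orientations
            # First derivatives are steerable: combine to get orientation theta.
            responses.append(np.cos(theta) * gx + np.sin(theta) * gy)
    return responses
```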

  17. Algorithm
  • Generate a random noise image
  • Match the R, G, and B histograms to the sample
  • Analyse the new image by applying the filter bank
  • Match the filter-bank histograms
  • Synthesize a new image that agrees with the matched marginal histograms
  • Go to stage 1 and iterate
  Q. How do we match histograms?
  A. Stretch and squeeze the x axis of one histogram until the density under each is similar.
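The "stretch and squeeze" answer is standard histogram matching: send each value through its own quantile and then through the inverse of the target distribution. A minimal sketch, assuming grayscale float arrays (the function name is illustrative):

```python
import numpy as np

def match_histogram(source, target):
    """Remap `source` so its value distribution matches `target`'s
    (the 'stretch and squeeze the x axis' of the slide)."""
    src = source.ravel().astype(float)
    tgt_sorted = np.sort(target.ravel().astype(float))
    # Quantile (rank) of every source pixel within the source image.
    ranks = np.argsort(np.argsort(src)) / max(src.size - 1, 1)
    # Read off the target value at the same quantile.
    matched = np.interp(ranks, np.linspace(0, 1, tgt_sorted.size), tgt_sorted)
    return matched.reshape(source.shape)
```

In the full algorithm the same remapping is applied both to the colour channels and to each filter-bank response.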

  18. Results (successful)

  19. Results (failures)

  20. Conclusions
  • Works okay for very stochastic types of texture
  • Not so good when there is more structure
  • Why? There are dependencies between scales and orientations: e.g. at an edge, oriented filters at several scales respond together
  • Improved by Portilla and Simoncelli (1999) – modelled joint filter responses to capture some of these dependencies

  21. Key Paper #2: Non-parametric sampling (Efros and Leung, 1999)
  • Abandoned the idea of matching statistics
  • Took inspiration from natural language synthesis
  • [Shannon, '48] proposed a way to generate English-looking text using N-grams:
  • assume a generalized Markov model
  • use a large text to compute the probability distribution of each letter given the N-1 previous letters
  • starting from a seed, repeatedly sample this Markov chain to generate new letters

  22. Markov Text – Letter Level
  Order-1: Runs ch g Sprif ighaifay fe; ie llathis, fur Gos ngithigh f Lom sist aminth uces yom Je Movin th we hof I juthe peathor ly dis
  Order-2: For unto yousay law to do retway hein: thein ther on, Who dopento the he but wit forethered Jesin: ach minto at of the livence,
  Order-3: For righbour in from her own Sion, There not, which confidentillined; For thereignation and thes ves: things is gospel
  Order-4: Therein the law to Gomorrha. Owe not atten for it was lieth believed. The gosperously report? For Israel. ln, not in business shalt
  Order-5: What their lusts there in your father, because also maketh not, it is among ought of his he liveth, if it fulfil things which cause
  KEY IDEA: Use the same principle, but with pixels.
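A minimal sketch of this letter-level N-gram sampler (function names are illustrative; assumes order >= 2 and a seed of length order-1):

```python
import random
from collections import defaultdict

def train_ngram(text, order):
    """For each (order-1)-letter context, collect the letters that follow it."""
    followers = defaultdict(list)
    for i in range(len(text) - order + 1):
        context = text[i:i + order - 1]
        followers[context].append(text[i + order - 1])
    return followers

def generate(followers, seed, length):
    """Starting from a seed of length order-1, repeatedly sample the chain."""
    out = list(seed)
    k = len(seed)                       # context length = order - 1
    for _ in range(length):
        context = ''.join(out[-k:])
        out.append(random.choice(followers.get(context, [' '])))
    return ''.join(out)

# e.g. followers = train_ngram(open('corpus.txt').read(), order=3)
#      print(generate(followers, seed='th', length=200))
```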

  23. Synthesizing One Pixel
  • Assuming the Markov property, what is the conditional probability distribution of a pixel p, given its neighbourhood?
  • Instead of constructing a model, directly search the input image for all such neighbourhoods, to produce a histogram for p
  • Draw a sample from this histogram: non-parametric sampling
  [Figure: input image; synthesizing a pixel p]

  24. Synthesizing One Pixel
  [Figure: non-parametric sampling; input image; synthesizing a pixel p]
  • However, since the sample image is finite, an exact neighbourhood match might not be present
  • So find the best match using SSD error (weighted by a Gaussian to emphasize local structure), and take all samples within some distance of that match
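A minimal sketch of that matching step, assuming grayscale float images and a square window; the Gaussian width, tolerance, and function names are illustrative rather than the authors' exact implementation:

```python
import numpy as np

def synthesize_pixel(sample, neighbourhood, filled, tol=0.1, sigma=None):
    """Pick a value for the centre pixel of `neighbourhood`.

    sample        : the input texture (2-D float array)
    neighbourhood : (w, w) window around the pixel being synthesized
    filled        : (w, w) bool mask, True where neighbours already have values
    """
    sample = sample.astype(float)
    neighbourhood = neighbourhood.astype(float)
    w = neighbourhood.shape[0]
    sigma = sigma if sigma is not None else w / 6.0   # illustrative width

    # Gaussian weights emphasize the neighbours closest to the centre.
    ax = np.arange(w) - w // 2
    gauss = np.exp(-(ax[:, None] ** 2 + ax[None, :] ** 2) / (2.0 * sigma ** 2))
    weights = gauss * filled
    weights /= weights.sum()

    errors, centres = [], []
    H, W = sample.shape
    for r in range(H - w + 1):
        for c in range(W - w + 1):
            patch = sample[r:r + w, c:c + w]
            errors.append(np.sum(weights * (patch - neighbourhood) ** 2))
            centres.append(patch[w // 2, w // 2])
    errors = np.asarray(errors)

    # Keep everything within (1 + tol) of the best match; sample one at random.
    candidates = np.flatnonzero(errors <= errors.min() * (1 + tol))
    return centres[int(np.random.choice(candidates))]
```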

  25. Details
  • Use a random 3x3 patch from the input image as a seed
  • Starting from the seed, "grow" the texture one pixel at a time
  • Growing is in "onion skin" order
  • Within each "layer", pixels with the most neighbours are synthesized first
  • If no close match can be found, the pixel is not synthesized until the end
  • The size of the neighbourhood window is a parameter that specifies how stochastic the user believes this texture to be
  • Using Gaussian-weighted SSD is very important to make sure the new pixel agrees with its closest neighbours
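A minimal sketch of that "onion skin" ordering, assuming a boolean map of which output pixels are already filled (a simplified stand-in for the authors' bookkeeping):

```python
import numpy as np
from scipy import ndimage

def pixels_to_fill(filled):
    """Unfilled pixels that touch the filled region, most filled neighbours
    first (a simplified version of the 'onion skin' ordering)."""
    # Number of filled pixels in each 3x3 window; the unfilled centre adds 0.
    counts = ndimage.uniform_filter(filled.astype(float), size=3) * 9.0
    rows, cols = np.nonzero(~filled & (counts > 0.5))
    order = np.argsort(-counts[rows, cols])
    return list(zip(rows[order], cols[order]))
```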

  26. Effect of window size

  27. More Synthesis Results (increasing window size)

  28. More Results: wood, granite

  29. More Results: white bread, brick wall

  30. Constrained Synthesis

  31. Failure Cases: growing garbage; verbatim copying

  32. Homage to Shannon

  33. Space time textures (Wei and Levoy, 2000)

  34. Conclusions
  • A definite improvement over trying to explicitly model statistics
  • Uses the original image directly as the model for the new image
  • Still some failure cases
  • Slow, since it has to synthesize one pixel at a time

  35. Key Paper #3: Image Quilting (Efros and Freeman, 2001)
  • MAIN IDEA: Synthesizing one pixel at a time is very inefficient – use chunks of the original input texture all at once.
  • Once again, inspired by Shannon's N-grams
  • Rather than generate text by producing letters with Markov properties, why not use words as the unit?

  36. Markov Text – Word Level
  Order-1: Then said unto all thine arrows of Joseph of Saul, that enter into stubble. Darts are abomination to his servants. And it shall stink;
  Order-2: And thou shalt die the common people. Nevertheless the centurion saw what was I ever wont to haunt.
  Order-3: The wicked are overthrown, and are not: but the publicans and the harlots believed him: and ye, when ye shall come into the house...
  Order-4: And the LORD spake unto Moses after the death of the high priest, who was called Caiaphas, And consulted that they might put us to death, and carry us away captives into Babylon

  37. Algorithm
  • Choose an n x n square image patch (block) size
  • Synthesize blocks in raster order
  • Search the input texture for a block that satisfies the overlap constraints (above and left); easy to optimize using NN search [Liang et al., '01]
  • Paste the new block into the resulting texture
  • Blend at the boundary so the blocks join together smoothly
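A minimal sketch of the block-search step: scan the input texture for candidate blocks, score each by SSD over the already-synthesized overlap region, and pick randomly among near-best candidates. This brute-force scan is what the NN search mentioned above accelerates; the names and tolerance are illustrative.

```python
import numpy as np

def pick_block(texture, target, mask, n, tol=0.1):
    """Return the top-left corner of an n x n block of `texture` whose overlap
    region (where `mask` is True) matches `target`, the block-sized array of
    pixels already placed in the output; choose randomly among near-best fits."""
    texture = texture.astype(float)
    target = target.astype(float)
    H, W = texture.shape
    errs, corners = [], []
    for r in range(H - n + 1):
        for c in range(W - n + 1):
            patch = texture[r:r + n, c:c + n]
            errs.append(np.sum(mask * (patch - target) ** 2))
            corners.append((r, c))
    errs = np.asarray(errs)
    good = np.flatnonzero(errs <= errs.min() * (1 + tol))
    return corners[int(np.random.choice(good))]
```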

  38. [Figure: three ways of placing blocks B1 and B2 from the input texture: random placement of blocks; neighbouring blocks constrained by overlap; minimal error boundary cut]

  39. Minimal error boundary: the squared difference between the two overlapping blocks gives an overlap error surface, and the cut is the minimum-error path through it. [Figure: overlapping blocks; vertical boundary; overlap error; min. error boundary]
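The minimal-error boundary is a shortest path through the overlap-error surface and can be found with dynamic programming. A sketch for a vertical overlap, assuming a (rows x overlap_width) array of squared differences:

```python
import numpy as np

def min_error_vertical_cut(overlap_err):
    """Minimum-cost top-to-bottom path through the error surface: for each
    row, return the column the cut passes through. Pixels left of the cut
    keep the old block, pixels right of it take the new block."""
    rows, cols = overlap_err.shape
    cost = overlap_err.astype(float).copy()
    # Accumulate the cheapest cost of reaching each cell from the top row.
    for r in range(1, rows):
        for c in range(cols):
            lo, hi = max(c - 1, 0), min(c + 2, cols)
            cost[r, c] += cost[r - 1, lo:hi].min()
    # Trace the cheapest path back up from the bottom row.
    cut = np.empty(rows, dtype=int)
    cut[-1] = int(np.argmin(cost[-1]))
    for r in range(rows - 2, -1, -1):
        c = cut[r + 1]
        lo, hi = max(c - 1, 0), min(c + 2, cols)
        cut[r] = lo + int(np.argmin(cost[r, lo:hi]))
    return cut
```

A horizontal overlap is handled the same way on the transposed error surface.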

  40. Failures

  41. [Figure: comparison on the input image: Portilla & Simoncelli; Xu, Guo & Shum; Wei & Levoy; Image Quilting]

  42. Homage to Shannon! [Figure: comparison on the input image: Portilla & Simoncelli; Xu, Guo & Shum; Wei & Levoy; Image Quilting]

  43. Texture Transfer
  GOAL: Take the texture from one object and "paint" it onto another object.
  • This requires separating texture and shape
  • That's HARD, but we can cheat
  • Assume we can capture shape by its boundary and rough shading
  KEY IDEA: add another constraint when sampling: similarity to the underlying image at that spot.
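A hedged sketch of that extra constraint: the quilting error for a candidate block becomes a weighted sum of the usual overlap term and a correspondence term measuring similarity to the underlying target image (the weight alpha and names here are illustrative, not the paper's exact schedule):

```python
import numpy as np

def transfer_error(patch, target_overlap, overlap_mask, target_patch, alpha=0.5):
    """Score a candidate block for texture transfer: agreement with the
    already-synthesized overlap region plus agreement with the underlying
    target image at that spot. `alpha` trades texture coherence against
    fidelity to the target."""
    overlap_term = np.sum(overlap_mask * (patch - target_overlap) ** 2)
    correspondence_term = np.sum((patch - target_patch) ** 2)
    return alpha * overlap_term + (1.0 - alpha) * correspondence_term
```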

  44. [Figure: texture transfer examples using parmesan and rice textures]

  45. Object Removal by Exemplar-Based Inpainting (Criminisi et al., 2003)
  GOAL: Take an image and remove an unwanted object.
