
CSE 473/573 Computer Vision and Image Processing (CVIP)


Presentation Transcript


  1. CSE 473/573 Computer Vision and Image Processing (CVIP) Ifeoma Nwogu inwogu@buffalo.edu Lecture 13 + 14 – Texture

  2. Schedule • Last class • We finished local features • Today • Textures • Readings for today: Forsyth and Ponce, Chapter 6.1 – 6.4

  3. Texture From F & P, used with permission

  4. What is texture? • Patterns of structure from • changes in surface albedo (e.g., printed cloth) • changes in surface shape (e.g., bark) • many small surface patches (e.g., leaves on a bush) • Hard to define, but texture tells us • what a surface is like • (sometimes) object identity • (sometimes) surface shape

  5. Texture and Material http://www-cvr.ai.uiuc.edu/ponce_grp/data/texture_database/samples/

  6. Texture and Orientation http://www-cvr.ai.uiuc.edu/ponce_grp/data/texture_database/samples/

  7. Texture and Scale http://www-cvr.ai.uiuc.edu/ponce_grp/data/texture_database/samples/

  8. Texture: Core Problems • Represent complex surface textures to recognize • objects • materials • textures • Synthesize texture from examples • to create big textures for computer graphics • to fill in holes in images caused by editing

  9. Texture representation • Core idea: Textures consist of • a set of elements • repeated in some way • Representations • identify the elements • summarize the repetition

  10. Notice how the change in pattern elements and repetitions is the main difference between different textured surfaces (the plants, the ground, etc.)

  11. Different materials tend to have different textures

  12. Filter based texture representations • Choose a set of filters, each representing a pattern element • typically a spot and some oriented bars • Apply all the filters to the image at a variety of scales • Rectify the filtered images • typically half-wave, to avoid averaging out contrast reversals • e.g. a dark spot on a light background and a light spot on a dark background should not average to zero • Compute summaries of the rectified filtered images • e.g. a smoothed average • at a variety of scales, to capture both nearby pattern elements and the general picture of pattern elements • Now describe each pixel by a vector of summaries • which could be very long
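A minimal sketch of this pipeline in Python (NumPy/SciPy), assuming a grayscale image and a list of filter kernels (spots and oriented bars) already in hand, for example from the bank sketched under slide 14. The summary scales and the function name are illustrative choices, not the book's exact ones.

```python
import numpy as np
from scipy.ndimage import convolve, gaussian_filter

def texture_descriptors(image, filter_bank, summary_sigmas=(2.0, 4.0)):
    """Per-pixel texture descriptor from a bank of filters.

    image          : 2-D float array (grayscale).
    filter_bank    : list of 2-D kernels (spots and oriented bars).
    summary_sigmas : scales of the smoothed averages used as summaries.
    Returns an H x W x (len(filter_bank) * 2 * len(summary_sigmas)) array.
    """
    channels = []
    for kernel in filter_bank:
        response = convolve(image, kernel, mode="reflect")
        # Half-wave rectification: keep the positive and negative parts
        # in separate channels so contrast reversals do not cancel out.
        for rectified in (np.maximum(response, 0), np.maximum(-response, 0)):
            # Summarize nearby rectified responses at several scales.
            for sigma in summary_sigmas:
                channels.append(gaussian_filter(rectified, sigma))
    # Each pixel is now described by a (possibly very long) vector.
    return np.stack(channels, axis=-1)
```

Keeping the positive and negative parts as separate channels is what prevents dark-on-light and light-on-dark elements from averaging to zero in the smoothed summaries.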

  13. How can we represent texture? • Compute responses of blobs and edges at various orientations and scales

  14. Overcomplete representation: filter banks (the LM filter bank) • Code for filter banks: www.robots.ox.ac.uk/~vgg/research/texclass/filters.html
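The linked page provides MATLAB code for the full Leung–Malik (LM) bank; the sketch below is only a rough Python approximation of the same idea (elongated derivative-of-Gaussian "edge" and "bar" filters at several orientations and scales, plus spot filters). Sizes, scales, and the number of orientations are chosen for illustration.

```python
import numpy as np

def oriented_bar(size=25, sigma=3.0, elongation=3.0, theta=0.0, order=1):
    """First- or second-derivative-of-Gaussian filter oriented at angle theta."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates so the derivative is taken across the bar.
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    g = np.exp(-0.5 * ((xr / sigma) ** 2 + (yr / (elongation * sigma)) ** 2))
    if order == 1:                       # edge-like filter
        f = -xr / sigma ** 2 * g
    else:                                # bar-like filter
        f = (xr ** 2 / sigma ** 4 - 1 / sigma ** 2) * g
    return f - f.mean()                  # zero mean: flat regions give no response

def spot(size=25, sigma=3.0):
    """Laplacian-of-Gaussian 'spot' filter."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    r2 = x ** 2 + y ** 2
    f = (r2 / sigma ** 4 - 2 / sigma ** 2) * np.exp(-r2 / (2 * sigma ** 2))
    return f - f.mean()

# A small bank: 6 orientations x 2 scales x {edge, bar} + 2 spots.
filter_bank = [oriented_bar(sigma=s, theta=t, order=o)
               for s in (1.5, 3.0)
               for t in np.linspace(0, np.pi, 6, endpoint=False)
               for o in (1, 2)] + [spot(sigma=s) for s in (1.5, 3.0)]
```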

  15. Filter banks • Process image with each filter and keep responses (or squared/abs responses)

  16. Representing texture • Take the vectors of filter responses at each pixel and cluster them, then take histograms

  17. Texture representation • Textures are made up of repeated local patterns, so: • Find the patterns • Use filters that look like patterns (spots, bars, raw patches…) • Consider magnitude of response • Describe their statistics within each local window • Mean, standard deviation • Histogram • Histogram of “prototypical” feature occurrences

  18.–21. Texture representation: example (figure panels, built up over four slides: original image; derivative filter responses, squared; statistics to summarize patterns in small windows)

  22. Texture representation: example • the window statistics plotted in a 2-D feature space: dimension 1 = mean d/dx value, dimension 2 = mean d/dy value

  23. Texture representation: example • in this feature space, windows with primarily vertical edges have a large mean d/dx value, windows with primarily horizontal edges have a large mean d/dy value, windows with small gradients in both directions lie near the origin, and windows with both kinds of edges score high on both dimensions

  24. Texture representation: example (figure panels: original image; derivative filter responses, squared; visualization of the assignment to texture “types”)

  25. Texture representation: example • in this 2-D feature space (dimension 1 = mean d/dx value, dimension 2 = mean d/dy value), windows that lie close together have similar textures and windows that lie far apart have dissimilar textures
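A toy Python version of the descriptor used in this example: the mean absolute x- and y-derivative over a window around each pixel, so that distances in the resulting 2-D space can be compared. The window size and the use of Sobel derivatives are illustrative choices.

```python
import numpy as np
from scipy.ndimage import sobel, uniform_filter

def window_stats(image, window=15):
    """Mean absolute x- and y-derivative in a window around every pixel."""
    dx = sobel(image, axis=1)    # responds to vertical structure
    dy = sobel(image, axis=0)    # responds to horizontal structure
    mean_dx = uniform_filter(np.abs(dx), size=window)
    mean_dy = uniform_filter(np.abs(dy), size=window)
    return np.stack([mean_dx, mean_dy], axis=-1)   # H x W x 2 descriptor

def descriptor_distance(d1, d2):
    """Windows with small distance look similar; large distance, dissimilar."""
    return np.linalg.norm(d1 - d2)
```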

  26. Example-based texture representations • Q: how does one choose the filters? • Responses to over-complete filter banks • build histograms over filter responses • describe textures using clusters of responses • Alternative • build a vocabulary of pattern elements from pictures • describe textures using this vocabulary

  27. Building a vocabulary • There are two steps to building a pooled texture representation for texture in an image domain: • Build a dictionary representing the range of possible pattern elements, using a large number of texture patches • Vector-quantize the patches in images using the learned clusters

  28. Building a dictionary • Collect many training example textures • Construct the vectors x for the relevant pixels; these could be a reshaping of a patch around the pixel or a vector of filter outputs computed at the pixel • Obtain k cluster centers c for these examples
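A minimal sketch of this dictionary-building step, assuming the per-pixel vectors x are descriptor images like those produced by the texture_descriptors sketch above (or reshaped patches around each pixel), and using scikit-learn's KMeans. The value of k and the per-image sampling rate are arbitrary illustrative choices.

```python
import numpy as np
from sklearn.cluster import KMeans

def build_dictionary(descriptor_images, k=32, samples_per_image=10_000,
                     rng=np.random.default_rng(0)):
    """Cluster per-pixel descriptor vectors from many training textures.

    descriptor_images : list of H x W x D arrays (e.g. from texture_descriptors).
    Returns the k x D array of cluster centers c (the 'dictionary').
    """
    samples = []
    for desc in descriptor_images:
        vectors = desc.reshape(-1, desc.shape[-1])
        # Subsample pixels so the clustering stays tractable.
        idx = rng.choice(len(vectors),
                         size=min(samples_per_image, len(vectors)),
                         replace=False)
        samples.append(vectors[idx])
    samples = np.concatenate(samples, axis=0)
    return KMeans(n_clusters=k, n_init=10, random_state=0).fit(samples).cluster_centers_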

  29. Representing an image domain • For each relevant pixel i in the image • Compute the vector representation xi for that pixel • Obtain j, the index of the cluster center cj closest to that vector • Insert j into a histogram for that domain
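Under the same assumptions, a sketch of this assignment-and-histogram step, where centers is the k x D dictionary from the previous sketch and desc is an H x W x D descriptor image for the domain being represented:

```python
import numpy as np

def texture_histogram(desc, centers):
    """Histogram of nearest-cluster-center indices over an image domain."""
    vectors = desc.reshape(-1, desc.shape[-1])               # one vector x_i per pixel
    # Squared Euclidean distance from every pixel vector to every center c_j.
    d2 = ((vectors[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    nearest = d2.argmin(axis=1)                              # index j per pixel
    hist = np.bincount(nearest, minlength=len(centers)).astype(float)
    return hist / hist.sum()                                 # normalized histogram
```

The normalized histogram is the pooled representation for the domain; two domains can then be compared with any histogram distance.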

  30. Clustering the examples • When using k-means, we can represent patches with • an intensity vector • a vector of filter responses over the pixel/patch

  31. Representing a region • Vector quantization • Represent a high-dimensional data item with a single number • Find the number of the nearest cluster center in the dictionary • Use that number to represent the data item • Summarize the pattern of patches • Cut the region into patches • Vector-quantize them - vector-quantized image patches are sometimes called visual words • Build a histogram of the resulting numbers
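A sketch of this region-level "visual words" summary, assuming a dictionary whose rows are flattened patches (built, for instance, by running the k-means sketch above on raw patch vectors instead of filter-response vectors). The patch size and the use of non-overlapping patches are illustrative choices.

```python
import numpy as np

def region_visual_word_histogram(region, patch_centers, patch=8):
    """Cut a region into patches, vector-quantize each, histogram the words.

    region        : 2-D grayscale array.
    patch_centers : k x (patch*patch) dictionary of flattened patches.
    """
    h, w = region.shape
    words = []
    for r in range(0, h - patch + 1, patch):
        for c in range(0, w - patch + 1, patch):
            v = region[r:r + patch, c:c + patch].reshape(-1)   # flatten the patch
            d2 = ((patch_centers - v) ** 2).sum(axis=1)        # distance to each word
            words.append(d2.argmin())                          # nearest visual word
    hist = np.bincount(words, minlength=len(patch_centers)).astype(float)
    return hist / hist.sum()
```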

  32. Texture representations • A vector summarizing the trends in pattern elements • either the overall trend in filter responses • or a histogram of vector-quantized patches • At a pixel • compute representations for domains centered on the pixel • For a region • compute representations for the whole region

  33. Texture synthesis • Problem: • Take a small example image of pure texture • Use this to produce a large domain of “similar” texture • Why: • Computer graphics demands lots of realistic texture, which is hard to find • Fill in holes in images created by removing objects • Simple case: • Assume we must synthesize a single pixel in a large image • Approach: • Match the window around that pixel to other windows in the image • Choose a value from the matching windows • typically uniformly at random

  34. Just like copying… • but not just repetition (figure labels: Photo; Pattern repeated)

  35. Approaches Generate new examples of a texture. • Original approach: Use the same representation for analysis and synthesis • This can produce good results for random textures, but fails to account for some regularities • Recent approach: Use an image of the texture as the source of a probability model • This draws samples directly from the actual texture, so it can account for more types of structure • Very simple to implement • However, it depends on choosing a correct distance parameter

  36. Other more recent approaches • Matching methods • Find another patch that looks a lot like the region around the hole boundary • Place that patch over the hole and blend the patch and the hole region together (using segmentation) • Boundary hole filling • Coherence methods (applying a cost function + constraints) • Variational methods (for non-texture holes)

  37. Efros and Leung method • For each new pixel p (select p on boundary of texture): • Match a window around p to sample texture, and select several closest matches • Matching minimizes sum of squared differences of each pixel in the window (Gaussian weighted) • Give zero weight to empty pixels in the window • Select one of the closest matches at random and use its center value for p • Size and shape of the neighborhood matter
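A sketch of this matching step for one pixel, in the spirit of Efros and Leung: a Gaussian-weighted sum of squared differences computed only over the already-filled pixels of the window, followed by a uniform random choice among all sample windows within a small tolerance of the best match. The window size, tolerance, and the sigma = window/6.4 weighting follow the usual description of the method, but treat the details as illustrative rather than the authors' exact implementation.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def synthesize_pixel(sample, window, mask, tol=0.1, rng=np.random.default_rng()):
    """Pick a value for one pixel by matching its neighborhood to the sample.

    sample : 2-D array, the example texture.
    window : (w, w) array of values around the pixel being synthesized.
    mask   : (w, w) boolean array, True where `window` is already filled.
    """
    w = window.shape[0]
    half = w // 2
    # Gaussian weights, zeroed on empty pixels, so only known context counts.
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    weights = np.exp(-(x ** 2 + y ** 2) / (2 * (w / 6.4) ** 2)) * mask
    weights /= weights.sum()

    # All w x w windows of the sample texture.
    patches = sliding_window_view(sample, (w, w))          # (H-w+1, W-w+1, w, w)
    ssd = ((patches - window) ** 2 * weights).sum(axis=(-2, -1))

    # Keep every match within (1 + tol) of the best, choose one at random.
    candidates = np.argwhere(ssd <= ssd.min() * (1 + tol))
    i, j = candidates[rng.integers(len(candidates))]
    return patches[i, j, half, half]                       # its center value
```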

  38. Initial conditions for E & L • If no initial conditions are specified, just pick a patch from the texture at random • To fill in an empty region within an existing texture: • Grow away from pixels that are on the boundary of the existing texture
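A sketch of the fill-in loop this slide describes, growing inward from the boundary of the existing texture. It reuses the hypothetical synthesize_pixel from the sketch under slide 37 and, for simplicity, skips pixels whose window would fall outside the image.

```python
import numpy as np
from scipy.ndimage import binary_dilation

def fill_region(image, known, sample, w=11):
    """Grow texture inward from the boundary of the known region.

    image  : 2-D array; values where `known` is False are ignored.
    known  : boolean mask, True for pixels that already have texture.
    sample : the example texture passed on to synthesize_pixel.
    """
    image, known = image.copy(), known.copy()
    half = w // 2
    h_img, w_img = image.shape
    while not known.all():
        # Unfilled pixels that touch filled ones: the current frontier.
        frontier = np.argwhere(binary_dilation(known) & ~known)
        if len(frontier) == 0:
            break
        for r, c in frontier:
            if not (half <= r < h_img - half and half <= c < w_img - half):
                known[r, c] = True       # this sketch leaves border pixels alone
                continue
            win = image[r - half:r + half + 1, c - half:c + half + 1]
            msk = known[r - half:r + half + 1, c - half:c + half + 1]
            image[r, c] = synthesize_pixel(sample, win, msk)
            known[r, c] = True
    return image
```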

  39. Usefulness of boundary edges

  40. Window size parameter

  41. More results – window size

  42. Image extrapolation

  43. Image extension results

  44. Failures

  45. Fill in holes by looking for example patches in the image. If needed, rectify faces (lower images).

  46. State of the art in image fill-in is very good. This uses texture synthesis and other methods.
