
Dr. R. J. RAMTEKE Department of Computer Science North Maharashtra University, Jalgaon.








  1. Dr. R. J. RAMTEKE • Department of Computer Science • North Maharashtra University, Jalgaon. Image Processing : An Overview

  2. Digital Imaging • Vision plays an important role in human perception. • Digital imaging has moved ahead considerably . . .

  3. Applications are not limited to • Remote Sensing • Medical Imaging • Forensic Studies • Military • Film Industry • Document Processing • Graphic Arts • Printing Industry • Pattern Recognition

  4. Image • A 2D light-intensity function f(x,y), where x and y are spatial co-ordinates and f is the grey level of the image at that point. • Digital image • The image f(x,y) is discretized both in the spatial co-ordinates and in brightness. • Represented by an M x N matrix of intensity values (pixels). • Digital image processing (DIP) • The process of using computer algorithms to perform image processing tasks on digital images.
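The M x N matrix view of a digital image can be sketched in a few lines of NumPy (the particular values below are my own illustration, not from the slides):

```python
import numpy as np

# A digital image f(x, y): an M x N matrix of intensity values (pixels).
# Here a small 4 x 4 8-bit grayscale image, grey levels in [0, 255].
f = np.array([
    [  0,  64, 128, 255],
    [ 32,  96, 160, 224],
    [ 16,  80, 144, 208],
    [  8,  72, 136, 200],
], dtype=np.uint8)

rows, cols = f.shape     # spatial dimensions M and N
pixel = int(f[1, 2])     # grey level at row 1, column 2
```

Indexing the array at (row, column) is the discrete counterpart of sampling f at a spatial co-ordinate.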

  5. There are three basic types of image processing: 1. Optical image processing: uses an arrangement of optical elements to carry out an operation. Eyeglasses are a form of optical image processing. When a process is applied to an image in the form of transmitted or reflected light, we refer to it as an optical process. 2. Analog image processing: uses analog electrical circuits to carry out the operation. When the process is applied to an image in the form of an analog signal, we refer to it as an analog process. 3. Digital image processing: uses digital circuits, computer processors and software to carry out the operation. In the digital domain an image is represented by discrete points of numerically defined brightness; by manipulating this brightness, a digital computer implements image processing.

  6. VISUAL x DIGITAL IMAGE PROCESSING

  7. BINARY IMAGES • Binary images are images that have been quantized to two values, usually denoted 0 and 1.
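A common way to obtain a binary image is thresholding a grayscale image; this small sketch (my own illustration, with an assumed threshold of 128) quantizes to the two values 0 and 1:

```python
import numpy as np

def binarize(image, threshold):
    """Quantize a grayscale image to two values, 0 and 1."""
    return (image >= threshold).astype(np.uint8)

gray = np.array([[10, 200],
                 [90, 130]], dtype=np.uint8)
binary = binarize(gray, threshold=128)   # pixels >= 128 become 1
```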

  8. GRAYSCALE AND COLOUR IMAGES • Grayscale images are distinct from black-and-white images, which in the context of computer imaging are images with only two colors, black and white. • A color image is a digital image that includes color information for each pixel; it is necessary, and usually sufficient, to provide three color channels (red, green and blue) for each pixel.

  9. Fundamentals of Digital Image Processing (block diagram): Image Acquisition → Image Enhancement → Image Restoration → Color Image Processing → Wavelet & Multiresolution Processing → Image Compression → Morphological Processing → Segmentation → Representation & Description → Object Recognition, all supported by a common Knowledge Base.

  10. Types of projections • There are 2 types of projections: perspective and parallel. • The key factor is the center of projection: if the distance to the center of projection is finite, the projection is perspective; if it is infinite, the projection is parallel. (Figure: parallel vs. perspective projection.)

  11. Image Acquisition • Images are generated by the combination of an illumination source and the reflection or absorption of energy from that source by the elements of the scene being imaged. • The illumination may come from a visible light source illuminating an everyday 3-D scene, or from another source of electromagnetic energy such as radar, infrared, X-ray or ultrasound. • Depending on the nature of the source, the illumination energy is reflected from or transmitted through the object, e.g. light reflected from a planar surface, or X-rays passing through a patient's body.

  12. Image Enhancement • The idea behind enhancement techniques is to bring out detail, i.e. to highlight certain features of interest in an image. • It means emphasizing certain image features for analysis, display or feature extraction, e.g. contrast and edge enhancement, pseudo-coloring, noise filtering, sharpening and magnifying. • This process does not increase the inherent information content of the data. • Enhancement is based on human subjective preferences regarding what constitutes a good enhancement result.

  13. Image Restoration • It refers to the removal or minimization of known degradations in an image. This includes deblurring of images degraded by the limitations of a sensor or its environment, noise filtering, and correction of geometric distortion or non-linearity due to sensors. • Image restoration is an area that deals with improving the appearance of an image. • Restoration techniques tend to be based on mathematical or probabilistic models of image degradation. • Image restoration is objective, whereas image enhancement is subjective.

  14. Color Image Processing • Two major areas: • Full-color processing – color TV, camera or color scanner. • Pseudo-color (false-color) processing – consists of assigning colors to gray values based on a specified criterion. • The principal use of pseudo-color is for human visualization and interpretation of gray-scale events in an image.

  15. Wavelet and Multiresolution Processing • Wavelets are the foundation for representing images at various degrees of resolution. • The wavelet transform, based on small waves (wavelets) of varying frequency, is used to compress, transmit and analyze images; the Fourier transform, based on sinusoids, is also used for the same purpose. • Multiresolution theory is concerned with the representation and analysis of signals (or images) at more than one resolution.

  16. Image Compression • It is a technique for reducing the storage required to save an image, or the bandwidth required to transmit it, without appreciable loss of information. • The amount of data associated with visual information is so large that its storage would require huge storage capacity. • Image data compression is concerned with minimizing the number of bits required to represent an image.

  17. Morphological Image Processing • It deals with tools for extracting image components that are useful in the representation and description of shape, such as boundaries, edges, skeletons, etc. • Morphological techniques are used for pre- or post-processing, such as morphological filtering, thinning, etc. • It uses mathematical morphology as a tool; the language of mathematical morphology is set theory, which offers a unified and powerful approach to numerous image processing problems. • Sets in mathematical morphology represent objects in an image, e.g. the set of all black pixels in a binary image is a complete morphological description of the image.

  18. Segmentation (Figure: vertical projection of a numeral string.) • This process partitions an image into its constituent parts or objects. • It refers to the decomposition of a scene into its components and is a key step in image analysis; for example, a document reader would first segment the various characters before proceeding to identify them. • Segmentation subdivides an image into its constituent regions or objects. The level to which the subdivision is carried depends on the problem being solved: segmentation should stop when the objects of interest in an application have been isolated.
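The slide's figure refers to vertical projection, a classic way to segment a numeral string: sum the ink pixels in each column and cut where the projection drops to zero. A minimal sketch (the helper name and toy image are my own, assuming 1 = ink):

```python
import numpy as np

def vertical_projection_segments(binary):
    """Split a binary image into character column-spans via its vertical projection."""
    projection = binary.sum(axis=0)          # ink pixels per column
    segments, start = [], None
    for x, count in enumerate(projection):
        if count > 0 and start is None:
            start = x                        # a character begins
        elif count == 0 and start is not None:
            segments.append((start, x))      # a blank column ends it
            start = None
    if start is not None:
        segments.append((start, len(projection)))
    return segments

# Two tiny "characters" separated by one blank column.
img = np.array([[1, 1, 0, 1],
                [1, 0, 0, 1]], dtype=np.uint8)
spans = vertical_projection_segments(img)    # half-open [start, end) spans
```

This works when characters do not touch; touching numerals need more elaborate segmentation.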

  19. Representation and Description • It is necessary to convert the data to a form suitable for computer processing; the first decision that must be made is whether the data should be represented as a boundary or as a complete region. • Boundary representation is appropriate when the focus is on external shape characteristics, such as corners. • Regional representation is appropriate when the focus is on internal properties, such as texture or skeletal shape. • A method must also be specified for describing the data so that features of interest are highlighted. Description, also called feature selection, deals with extracting attributes that yield some quantitative information of interest or are basic for differentiating one class of objects from another.

  20. Object Recognition • It is a process that assigns a label to an object based on its descriptors. • It is also called pattern recognition. • Pattern recognition is broadly divided into two areas: • Statistical – quantitative descriptors • Structural – qualitative descriptors

  21. Image Representation • I = f(x, y) • Where, • I: Intensity • f: Amplitude • (x, y): Coordinates

  22. MATHEMATICAL REPRESENTATION OF IMAGE • An M x N digital image is made up of image elements (pixels). • Bits to store the image: b = M x N x k • Number of gray levels: L = 2^k
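The two formulas above translate directly to code; the 1024 x 1024, 8-bit example is my own choice of numbers:

```python
def storage_bits(M, N, k):
    """Bits needed to store an M x N image with k bits per pixel: b = M*N*k."""
    return M * N * k

def gray_levels(k):
    """Number of distinct gray levels with k bits per pixel: L = 2**k."""
    return 2 ** k

bits = storage_bits(1024, 1024, 8)   # a 1024 x 1024, 8-bit image
levels = gray_levels(8)              # 8 bits per pixel
```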

  23. PIXEL NOTATION

  24. REPRESENTING DIGITAL IMAGES

  25. Digital Image Acquisition Process

  26. Image Sampling and Quantization Image sampling: discretize an image in the spatial domain Spatial resolution / image resolution: pixel size or number of pixels

  27. Sampling • Sampling represents the image by measurements at regularly spaced sample intervals. The samples are usually treated as small, often rectangular or square, cells or pixels. • Two important criteria: • Sampling interval – the distance between sample points or pixels. • Tessellation – the pattern of sampling points. • The number of pixels in the image is called the resolution of the image. If the number of pixels is too small, individual pixels can be seen and other undesired effects may be evident.
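Increasing the sampling interval (keeping every n-th pixel) is the simplest way to see resolution drop; a small sketch of my own, not from the slides:

```python
import numpy as np

def subsample(image, interval):
    """Keep every `interval`-th pixel in each direction (coarser sampling)."""
    return image[::interval, ::interval]

img = np.arange(16, dtype=np.uint8).reshape(4, 4)
half = subsample(img, 2)             # 4x4 -> 2x2: spatial resolution halves
```

Note this naive subsampling drops pixels outright; practical resampling first low-pass filters to avoid aliasing.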

  28. How to choose the spatial resolution (Figure: sampling locations overlaid on the original image vs. the sampled image — with under-sampling, we lose some image details.)

  29. Effect of Spatial Resolution (Figure: the same image at 256x256, 128x128, 64x64 and 32x32 pixels.)

  30. Effect of Spatial Resolution

  31. Quantisation • Quantisation uses an ADC (analogue-to-digital converter) to transform brightness values into a range of integer numbers, 0 to M, where M is limited by the ADC and the computer: M = 2^m - 1, where m is the number of bits used to represent the value of each pixel. This determines the number of grey levels. • Too few bits results in the steps between grey levels being apparent.
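Requantizing an 8-bit image to fewer levels makes the grey-level steps visible; this sketch (my own, using uniform steps) snaps each pixel to the bottom of its quantization step:

```python
import numpy as np

def requantize(image, bits):
    """Reduce an 8-bit image to 2**bits uniform gray levels."""
    step = 256 // (2 ** bits)        # width of each quantization step
    return (image // step) * step    # snap each pixel to its step

img = np.array([[0, 100],
                [200, 255]], dtype=np.uint8)
two_bit = requantize(img, 2)         # only 4 levels remain: 0, 64, 128, 192
```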

  32. Effect of Quantization Levels (Figure: the same image at 256, 128, 64 and 32 gray levels.)

  33. Effect of Quantization Levels (cont.) (Figure: the same image at 16, 8, 4 and 2 gray levels. In this image it is easy to see false contouring.)

  34. Bit Plane Slicing • Higher-order bits contain the majority of the visually significant data, while the lower-order bits contain subtle details in the image. • Bit planes can be used in image compression: we can transmit only the higher-order bits and discard the lower-order bits.
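Extracting a bit plane is a one-line shift-and-mask; a sketch of my own illustrating the slide's point:

```python
import numpy as np

def bit_plane(image, plane):
    """Extract bit plane `plane` (0 = least significant) as a 0/1 image."""
    return (image >> plane) & 1

img = np.array([[129, 1],
                [128, 0]], dtype=np.uint8)
msb = bit_plane(img, 7)              # highest-order bit: gross structure
lsb = bit_plane(img, 0)              # lowest-order bit: subtle detail / noise
```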

  35. How to select the suitable size and pixel depth of images • The word "suitable" is subjective: it depends on the "subject". (Figure: low-detail, medium-detail and high-detail images — the Lena and Cameraman images.) • To satisfy the human eye: 1. For images of the same size, a low-detail image may need more pixel depth. 2. As the image size increases, fewer gray levels may be needed.

  36. Basic Relationship of Pixels • Conventional indexing method: the origin (0,0) is at the top-left, with x increasing to the right and y increasing downward. • The 3 x 3 neighborhood of (x,y): (x-1,y-1) (x,y-1) (x+1,y-1) / (x-1,y) (x,y) (x+1,y) / (x-1,y+1) (x,y+1) (x+1,y+1).

  37. Neighbors of a Pixel • The neighborhood relation is used to tell which pixels are adjacent; it is useful for analyzing regions. • 4-neighbors of p: N4(p) = { (x-1,y), (x+1,y), (x,y-1), (x,y+1) }. • The 4-neighborhood relation considers only vertical and horizontal neighbors. • Note: q ∈ N4(p) implies p ∈ N4(q).

  38. Neighbors of a Pixel (cont.) • 8-neighbors of p: N8(p) = { (x-1,y-1), (x,y-1), (x+1,y-1), (x-1,y), (x+1,y), (x-1,y+1), (x,y+1), (x+1,y+1) }. • The 8-neighborhood relation considers all neighboring pixels.

  39. Neighbors of a Pixel (cont.) • Diagonal neighbors of p: ND(p) = { (x-1,y-1), (x+1,y-1), (x-1,y+1), (x+1,y+1) }. • The diagonal-neighborhood relation considers only diagonal neighbors.

  40. Connectivity • Connectivity is adapted from the neighborhood relation. Two pixels are connected if they are in the same class (i.e. the same color or the same range of intensity) and they are neighbors of one another. • For p and q from the same class: • 4-connectivity: p and q are 4-connected if q ∈ N4(p). • 8-connectivity: p and q are 8-connected if q ∈ N8(p). • Mixed connectivity (m-connectivity): p and q are m-connected if q ∈ N4(p), or if q ∈ ND(p) and N4(p) ∩ N4(q) = ∅ (no common 4-neighbor of the same class).
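The neighborhood sets N4, ND and N8 from the slides above can be written as small Python helpers (a sketch of my own; pixels are (x, y) tuples and image bounds are ignored):

```python
def n4(p):
    """4-neighbors of pixel p = (x, y): horizontal and vertical only."""
    x, y = p
    return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}

def nd(p):
    """Diagonal neighbors of p."""
    x, y = p
    return {(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)}

def n8(p):
    """8-neighbors of p: the union of N4(p) and ND(p)."""
    return n4(p) | nd(p)

p, q = (2, 2), (2, 3)
four_connected = q in n4(p)          # q is a vertical neighbor of p
```

For same-class pixels, the connectivity tests then reduce to membership checks in these sets.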

  41. Image Enhancement in the Spatial Domain • Image enhancement is a purely subjective process: the desired result varies from person to person. • It is a cosmetic procedure, i.e. it does not add any extra information to the original. • Enhancement can be done in two domains: • Spatial domain • Frequency domain • The term spatial domain means working in the given space, in this case the image; it implies working directly with pixel values. The modified image can be expressed as g(x,y) = T[f(x,y)], where f(x,y) is the original image, T is the transformation applied to it, and g(x,y) is the modified image.
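One of the simplest spatial-domain transformations g(x,y) = T[f(x,y)] is the image negative, T(r) = (L - 1) - r; this sketch (my own illustration, assuming L = 256) applies it pixel-wise:

```python
import numpy as np

def negative(f):
    """Image negative: g(x, y) = (L - 1) - f(x, y), with L = 256 gray levels."""
    return 255 - f

f = np.array([[0, 100],
              [200, 255]], dtype=np.uint8)
g = negative(f)                      # dark pixels become bright and vice versa
```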

  42. Histogram Distribution • Dark image: the components of the histogram are concentrated on the low side of the gray scale. • Bright image: the components of the histogram are concentrated on the high side of the gray scale.

  43. Histogram • The histogram of an image represents the relative frequency of occurrence of the various gray levels in the image. • The histogram of a digital image with gray levels in the range [0, L-1] is a discrete function h(rk) = nk, where rk is the kth gray level and nk is the number of pixels in the image having gray level rk. • It is a graph showing the number of pixels in an image at each intensity value found in that image. • Histogram manipulation can be used effectively for image enhancement, and it provides useful image statistics. • Linear stretching and histogram equalization are two histogram-based techniques.

  44. Histogram • The (intensity or brightness) histogram shows how many times a particular grey level (intensity) appears in an image; for example, with 0 = black and 255 = white. (Figure: an image and its histogram.)
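Computing h(rk) = nk is a direct count over the pixels; a sketch of my own with a tiny example image:

```python
import numpy as np

def histogram(image, levels=256):
    """h(r_k) = n_k: the count of pixels at each gray level r_k."""
    h = np.zeros(levels, dtype=int)
    for value in image.ravel():      # visit every pixel once
        h[value] += 1
    return h

img = np.array([[0, 0, 255],
                [0, 128, 255]], dtype=np.uint8)
h = histogram(img)
```

The counts necessarily sum to the total number of pixels in the image.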

  45. Histogram equalization

  46. Example (Histogram equalization) • L = 8 (number of gray levels); pixel counts nk per gray level rk: r0: 790, r1: 1023, r2: 850, r3: 656, r4: 329, r5: 245, r6: 122, r7: 81.

  47. Example (Contd…) n=4096

  48. Example (Contd…) • After equalization the counts map onto the new levels: s = 1: 790, s = 3: 1023, s = 5: 850, s = 6: 985, s = 7: 448.
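The equalization mapping behind this example can be sketched as s_k = round((L - 1) · cdf(r_k)); the code below is my own implementation of that standard formula, using the counts from the example (n = 4096):

```python
import numpy as np

def equalize_map(counts, L):
    """Histogram-equalization mapping: s_k = round((L - 1) * cdf(r_k))."""
    cdf = np.cumsum(counts) / np.sum(counts)   # cumulative distribution
    return np.round((L - 1) * cdf).astype(int)

# Gray-level counts n_k for levels r_0..r_7 of the 64 x 64 example (n = 4096).
counts = [790, 1023, 850, 656, 329, 245, 122, 81]
mapping = equalize_map(counts, L=8)  # new level s_k for each old level r_k
```

Several old levels map to the same new level (r3 and r4 both go to 6; r5, r6 and r7 all go to 7), which is why the equalized histogram has merged counts like 985 and 448.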
