Image Compression (Chapter 8) CSC 446 Lecturer: Nada ALZaben
Outline: • Introduction • Image Compression Model • Compression Types • Data Redundancy • Redundancy Types • Coding Redundancy • Lossless Compression
Introduction • Most data today are stored and transmitted digitally; because storage space and communication bandwidth are limited, methods of compressing data prior to storage and/or transmission have become an active field of study. • Image compression addresses the problem of reducing the amount of data required to represent a digital image. • An image is compressed prior to storage and/or transmission and later decompressed to reconstruct the original image.
Image Compression Model • Source encoder: removes input redundancy. • Channel encoder: increases the noise immunity of the source encoder's output. • Channel: if the channel is noise-free, the channel encoder and channel decoder are omitted.
Compression Types Lossy image compression: useful in applications such as television broadcasting and video conferencing, in which a certain amount of error is an acceptable trade-off for increased compression performance. Lossless image compression: useful in image archiving, e.g. medical records, where the image must be compressed and decompressed without losing any information.
Data Redundancy [1] Data redundancy is a central issue in digital image compression. If n1 and n2 denote the number of information-carrying units in two data sets that represent the same information, the relative data redundancy R_D of the first data set is defined as: R_D = 1 − 1/C_R, where the compression ratio C_R = n1/n2, n1 is the total size in bits of the original image, and n2 is the total size in bits of the compressed image.
Data Redundancy [2] If C_R = 1 (n1 = n2), then R_D = 0, indicating that the first representation of the information contains no redundant data. If C_R → ∞, then R_D → 1, indicating significant compression and highly redundant data. If C_R → 0, then R_D → −∞, indicating that the compressed data contain much more data than the original representation.
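The two definitions above can be sketched as a pair of small helper functions; the function names and the example bit counts are illustrative, not from the slides.

```python
def compression_ratio(n1_bits, n2_bits):
    """C_R = n1 / n2: original size over compressed size."""
    return n1_bits / n2_bits

def relative_redundancy(n1_bits, n2_bits):
    """R_D = 1 - 1/C_R: fraction of the original data that is redundant."""
    return 1.0 - 1.0 / compression_ratio(n1_bits, n2_bits)

# Example: compressing 1,000,000 bits down to 100,000 bits.
# C_R = 10, so 90% of the original representation was redundant.
print(compression_ratio(1_000_000, 100_000))    # 10.0
print(relative_redundancy(1_000_000, 100_000))  # 0.9
```

Note how C_R = 1 gives R_D = 0 and larger ratios push R_D toward 1, matching the three cases above.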
Redundancy Types • Coding redundancy. • Interpixel redundancy. • Psychovisual redundancy. Data compression is achieved when one or more of these redundancy types is reduced or eliminated.
Coding Redundancy Assume the gray levels of an image are represented by a discrete random variable r_k in the interval [0, 1], and that each r_k occurs with probability p_r(r_k) = n_k/n, k = 0, 1, ..., L−1, where L is the number of gray levels, n_k is the number of times gray level r_k appears in the image, and n is the total number of pixels in the image. If each r_k is represented by l(r_k) bits, then the average number of bits required to represent each pixel is L_avg = Σ l(r_k) p_r(r_k), summed over k = 0 to L−1. The total number of bits required to code an M × N image is M × N × L_avg.
Coding Redundancy [example] An 8-level image has the gray-level distribution with probabilities 0.19, 0.25, 0.21, 0.16, 0.08, 0.06, 0.03 and 0.02. If a natural 3-bit binary code (code 1) is used to represent the 8 possible gray levels, then L_avg = 3 bits, because l(r_k) = 3 bits for every level.
Coding Redundancy [example] For code 2, however, the average number of bits required to code the image is reduced to: L_avg = 2(0.19) + 2(0.25) + 2(0.21) + 3(0.16) + 4(0.08) + 5(0.06) + 6(0.03) + 6(0.02) = 2.7 bits. The resulting compression ratio is C_R = 3/2.7 ≈ 1.11, so R_D = 1 − 1/1.11 ≈ 0.099: approximately 10% of the data resulting from the use of code 1 is redundant.
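The arithmetic in this example can be checked with a short script; the probabilities and the code-2 word lengths are taken directly from the slide, while the variable names are illustrative.

```python
# Gray-level probabilities p_r(r_k) from the slide's 8-level example.
probs   = [0.19, 0.25, 0.21, 0.16, 0.08, 0.06, 0.03, 0.02]
# Code-2 word lengths l(r_k) for the same levels.
lengths = [2, 2, 2, 3, 4, 5, 6, 6]

# L_avg = sum of l(r_k) * p_r(r_k) over all gray levels.
l_avg = sum(l * p for l, p in zip(lengths, probs))
print(round(l_avg, 2))   # 2.7 bits/pixel

# Compare against the natural 3-bit code (code 1).
c_r = 3.0 / l_avg        # compression ratio
r_d = 1.0 - 1.0 / c_r    # relative redundancy: ~10% of code-1 data is redundant
print(round(c_r, 2), round(r_d, 2))
```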
Lossless Compression • Assigning fewer bits to the more probable gray levels than to the less probable ones achieves data compression; this is called variable-length coding, a form of lossless compression. • The Huffman code is one kind of variable-length coding.
Lossless Compression • Huffman code example: • The letters A, B, C, D and E are to be encoded, with relative probabilities of occurrence: • p(A) = 0.16, p(B) = 0.51, p(C) = 0.09, p(D) = 0.13, p(E) = 0.11. • The two characters with the lowest probabilities are combined first into a binary tree that has the characters as leaves: p(CE) = 0.09 + 0.11 = 0.20. • Each right branch is labeled 1 and each left branch 0.
Huffman code example • [Figure: the completed Huffman tree] • [Table: the resulting Huffman code words] • Reading the code lengths from the tree (B: 1 bit; A, C, D, E: 3 bits each) gives L_avg = 1(0.51) + 3(0.16) + 3(0.13) + 3(0.11) + 3(0.09) = 1.98 bits.
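The merging procedure described above can be sketched with a priority queue. This is a minimal illustration, not the slides' implementation: it only tracks code lengths (tree depths), not the 0/1 labels on the branches.

```python
import heapq
from itertools import count

def huffman_code_lengths(probs):
    """Return {symbol: code length} by repeatedly merging
    the two least probable nodes, as in Huffman's algorithm."""
    tick = count()  # tie-breaker so heapq never compares symbol lists
    heap = [(p, next(tick), [s]) for s, p in probs.items()]
    heapq.heapify(heap)
    depth = {s: 0 for s in probs}
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)
        p2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:   # every symbol under the merged node gains one bit
            depth[s] += 1
        heapq.heappush(heap, (p1 + p2, next(tick), syms1 + syms2))
    return depth

# Probabilities from the slide's example.
probs = {'A': 0.16, 'B': 0.51, 'C': 0.09, 'D': 0.13, 'E': 0.11}
lengths = huffman_code_lengths(probs)
l_avg = sum(lengths[s] * probs[s] for s in probs)
print(lengths)          # B gets 1 bit; A, C, D, E get 3 bits each
print(round(l_avg, 2))  # 1.98
```

The first merge combines C (0.09) and E (0.11) into p(CE) = 0.20, exactly as on the slide.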
Lossless Compression • Run-length encoding (RLE) is another lossless compression method. • The idea of run-length encoding is to replace long sequences (runs) of identical samples with a code that gives the value to be repeated and the number of repetitions. • Example: 1110010000 → (3,1)(2,0)(1,1)(4,0)
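The RLE idea can be sketched as an encoder/decoder pair; the function names are illustrative, and the pair format (run length, value) follows the slide's example.

```python
def rle_encode(samples):
    """Collapse runs of identical samples into (run_length, value) pairs."""
    runs = []
    for s in samples:
        if runs and runs[-1][1] == s:
            runs[-1][0] += 1          # extend the current run
        else:
            runs.append([1, s])       # start a new run
    return [(n, v) for n, v in runs]

def rle_decode(runs):
    """Inverse of rle_encode: expand each pair back into a run."""
    return ''.join(v * n for n, v in runs)

encoded = rle_encode('1110010000')
print(encoded)   # [(3, '1'), (2, '0'), (1, '1'), (4, '0')]
assert rle_decode(encoded) == '1110010000'   # lossless round trip
```

Decoding restores the input exactly, which is why RLE is a lossless method.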