
Information Theory




Presentation Transcript


  1. Information Theory Prepared by: Amit Degada Teaching Assistant, ECED, NIT Surat

  2. Goal of Today’s Lecture • Information Theory: Some Introduction • Information Measure • Function Determination for Information • Average Information per Symbol • Information Rate • Coding • Shannon-Fano Coding

  3. Information Theory • It is the study of communication engineering plus mathematics. • A communication engineer has to contend with: • Limited power • Inevitable background noise • Limited bandwidth

  4. Information Theory deals with • The measure of source information • The information capacity of the channel • Coding. If the rate of information from a source does not exceed the capacity of the channel, then there exists a coding scheme such that information can be transmitted over the communication channel with an arbitrarily small amount of error, despite the presence of noise.

  5. Information Measure • This is used to determine the information rate of discrete sources. Consider two messages: “A dog bites a man” (high probability, less information) and “A man bites a dog” (low probability, high information). So we can say that Information ∝ 1/(Probability of occurrence).

  6. Information Measure • We can also state three rules from intuition. Rule 1: The information I(mk) approaches 0 as Pk approaches 1. Mathematically, I(mk) → 0 as Pk → 1. E.g., “The sun rises in the east.”

  7. Information Measure Rule 2: The information content I(mk) must be a non-negative quantity; it may be zero. Mathematically, I(mk) ≥ 0 for 0 ≤ Pk ≤ 1. E.g., “The sun rises in the west.”

  8. Information Measure Rule 3: The information content of a message with higher probability is less than that of a message with lower probability. Mathematically, I(mk) > I(mj) if Pk < Pj.

  9. Information Measure We can also state that the information content of two combined messages equals the sum of the information contents of the individual messages, provided their occurrences are mutually independent. E.g., “There will be sunny weather today” and “There will be cloudy weather tomorrow.” Mathematically, I(mk and mj) = I(mk, mj) = I(mk) + I(mj).

  10. Information Measure • So the question is: which function can we use to measure information? Information = F(1/Probability). Requirements the function must satisfy: • Its output must be a non-negative quantity. • Its minimum value is 0. • It should turn products into summations. Hence the information is I(mk) = logb(1/Pk). Here b may be 2, e, or 10. If b = 2 the unit is bits; if b = e, nats; if b = 10, decits.
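As a quick illustration of this definition, here is a minimal Python sketch (not from the slides; the function name and the example probability 1/4 are illustrative) that evaluates I(mk) = logb(1/Pk) in each of the three units:

```python
import math

def information(p, base=2):
    """Information content I = log_base(1/p) for an event of probability p."""
    if not 0 < p <= 1:
        raise ValueError("probability must lie in (0, 1]")
    return math.log(1 / p, base)

# A symbol occurring with probability 1/4:
print(information(0.25, base=2))        # 2.0 bits
print(information(0.25, base=math.e))   # about 1.386 nats
print(information(0.25, base=10))       # about 0.602 decits
```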

  11. Conversion Between Units
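Because logb(1/P) = log2(1/P) / log2(b), the change-of-base rule gives the standard conversions: 1 nat = log2(e) ≈ 1.443 bits, 1 decit = log2(10) ≈ 3.322 bits, and 1 bit = ln(2) ≈ 0.693 nat.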

  12. Example • A source generates one of four symbols during each interval with probabilities P1 = 1/2, P2 = 1/4, P3 = P4 = 1/8. Find the information content of each message.
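A worked solution, assuming the task is to evaluate I(mk) = log2(1/Pk) for each symbol: I1 = log2(2) = 1 bit, I2 = log2(4) = 2 bits, and I3 = I4 = log2(8) = 3 bits.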

  13. Average Information Content • It is necessary to define the information content per symbol, since the communication channel deals with symbols. • Here we make the following assumptions: • The source is stationary, so the probabilities remain constant with time. • Successive symbols are statistically independent and are emitted at an average rate of r symbols per second.

  14. Average Information Content • Suppose a source emits M possible symbols s1, s2, ..., sM having probabilities of occurrence p1, p2, ..., pM. In a long message of N symbols (N >> M), s1 will occur about p1N times, s2 about p2N times, and so on.

  15. Average Information Content • Since s1 occurs p1N times, the information contributed by s1 is p1N log(1/p1). • Similarly, the information contributed by s2 is p2N log(1/p2), and so on. • Hence the total information content is I = Σi piN log(1/pi), and the average information per symbol is H = I/N = Σi pi log(1/pi) bits/symbol (using base-2 logarithms). It means that in a long message we can expect H bits of information per symbol. Another name for H is entropy.
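A minimal Python sketch of this formula (the function name is illustrative), applied to the four-symbol source from slide 12:

```python
import math

def entropy(probs):
    """Average information H = sum(p * log2(1/p)) in bits per symbol."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Probabilities from the slide-12 example:
print(entropy([1/2, 1/4, 1/8, 1/8]))   # 1.75 bits/symbol
```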

  16. Information Rate • Information rate = total information / time taken. • If n symbols are transmitted at r symbols per second, the time taken is n/r seconds and the total information is nH bits. • Hence the information rate is R = nH / (n/r) = rH bits/sec.
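For example, if the slide-12 source (H = 1.75 bits/symbol) emitted r = 1000 symbols per second (an assumed rate, not from the slides), the information rate would be R = rH = 1750 bits/sec.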

  17. Some Maths • H satisfies 0 ≤ H ≤ log2(M). The maximum H occurs when all the messages are equally probable. Hence H also measures the uncertainty about which symbol will occur: as H approaches its maximum value, we cannot determine which message will occur. Consider a system that transmits only 2 messages, each with probability of occurrence 0.5; then H = 1 bit, and at every instant we cannot say which of the two messages will occur. So what happens for a source with more than two symbols?
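As a quick check of the bound: a four-symbol source has H ≤ log2(4) = 2 bits/symbol, with equality only when each symbol has probability 1/4; the skewed slide-12 source reaches only 1.75 bits/symbol.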

  18. Variation of H vs. p • Let’s consider a binary source, i.e. M = 2. Let the two symbols occur with probabilities p and 1 − p respectively, where 0 < p < 1. So the entropy is H(p) = p log2(1/p) + (1 − p) log2(1/(1 − p)), often called the horse-shoe function because of the shape of its curve.
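A small Python sketch (function name illustrative) that evaluates H(p) at a few points and shows the horse-shoe shape: H rises from 0, peaks at 1 bit when p = 0.5, and falls back to 0 symmetrically:

```python
import math

def binary_entropy(p):
    """H(p) = p*log2(1/p) + (1-p)*log2(1/(1-p)) for a binary source."""
    if p in (0.0, 1.0):
        return 0.0
    return p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p))

for p in (0.01, 0.1, 0.25, 0.5, 0.75, 0.9, 0.99):
    print(f"p = {p:<4}  H = {binary_entropy(p):.3f} bits")
```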

  19. Variation of H vs. p Now we want to obtain the shape of the curve: differentiate H(p) with respect to p, set the derivative to zero to find the extremum, and verify that it is a maximum using the second derivative.
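Sketch of the calculation (standard calculus steps, not reproduced on the slide): writing H(p) = −p log2(p) − (1 − p) log2(1 − p), the first derivative is dH/dp = log2((1 − p)/p), which vanishes at p = 1/2. The second derivative is d²H/dp² = −1 / (p(1 − p) ln 2) < 0, confirming a maximum, with H(1/2) = 1 bit.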

  20. Example

  21. Maximum Information Rate We know that Hmax = log2(M). Also, R = rH. Hence the maximum information rate is Rmax = r log2(M) bits/sec.
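For instance, a source with M = 4 equally likely symbols transmitting at r = 1000 symbols per second (an assumed rate) would have Rmax = 1000 × log2(4) = 2000 bits/sec.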

  22. Coding for a Discrete Memoryless Source • Here “discrete” means the source emits symbols from a fixed, finite set. • “Memoryless” means the occurrence of the present symbol is independent of the previous symbols. • The average code length is Σi pi Ni, where Ni is the code length of symbol i in binary digits (binits).

  23. Coding for a Discrete Memoryless Source Efficiency: for a binary code, the coding efficiency is the ratio of the source entropy (bits/symbol) to the average code length (binits/symbol), η = H / (Σi pi Ni).
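A worked illustration with assumed numbers (probabilities 1/2, 1/4, 1/8, 1/8 and codeword lengths 1, 2, 3, 3, as in a prefix code like Code IV on the next slide): average code length = 0.5·1 + 0.25·2 + 0.125·3 + 0.125·3 = 1.75 binits/symbol, H = 1.75 bits/symbol, so η = 1.75 / 1.75 = 100%.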

  24. Coding for a Discrete Memoryless Source Kraft’s inequality: Σi 2^(−Ni) ≤ 1. Only if this is satisfied can the coding be uniquely decipherable (separable).

  25. Example: Find the efficiency and check Kraft’s inequality.

mi   pi    Code I   Code II   Code III   Code IV
A    1/2   00       0         0          0
B    1/4   01       1         01         10
C    1/4   10       10        011        110
D    1/4   11       11        0111       111

Code II violates Kraft’s inequality and is not uniquely decipherable.
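A minimal Python sketch (function name illustrative) that computes the Kraft sum for each code in the table above; a sum greater than 1 rules out unique decipherability:

```python
def kraft_sum(lengths):
    """Kraft sum K = sum(2**-Ni); K <= 1 is necessary for unique decipherability."""
    return sum(2 ** -n for n in lengths)

codes = {
    "Code I":   ["00", "01", "10", "11"],
    "Code II":  ["0", "1", "10", "11"],
    "Code III": ["0", "01", "011", "0111"],
    "Code IV":  ["0", "10", "110", "111"],
}
for name, words in codes.items():
    k = kraft_sum(len(w) for w in words)
    print(f"{name}: K = {k:.4f}  ({'satisfied' if k <= 1 else 'violated'})")
```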

  26. Shannon-Fano Coding Technique Algorithm (a Python sketch follows below). Step 1: Arrange all messages in descending order of probability. Step 2: Divide the sequence into two groups such that the sums of the probabilities in the two groups are as nearly equal as possible. Step 3: Assign 0 to the upper group and 1 to the lower group. Step 4: Repeat steps 2 and 3 within each group until every group contains a single message.
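A minimal Python sketch of these steps (helper names are illustrative; the split is taken at the point that best balances the two groups). Applied to the probabilities of the slide-27 example, it reproduces the codes shown there:

```python
def shannon_fano(symbols):
    """symbols: list of (name, probability), sorted in descending probability.
    Returns a dict mapping name -> binary code string."""
    codes = {name: "" for name, _ in symbols}

    def split(group):
        if len(group) <= 1:
            return
        total = sum(p for _, p in group)
        running, best_i, best_diff = 0.0, 1, float("inf")
        # Find the split point that makes the two groups' probabilities most nearly equal.
        for i in range(1, len(group)):
            running += group[i - 1][1]
            diff = abs(running - (total - running))
            if diff < best_diff:
                best_diff, best_i = diff, i
        upper, lower = group[:best_i], group[best_i:]
        for name, _ in upper:
            codes[name] += "0"   # 0 to the upper group
        for name, _ in lower:
            codes[name] += "1"   # 1 to the lower group
        split(upper)
        split(lower)

    split(list(symbols))
    return codes

# Probabilities from the slide-27 example:
probs = [("M1", 1/2), ("M2", 1/8), ("M3", 1/8), ("M4", 1/16),
         ("M5", 1/16), ("M6", 1/16), ("M7", 1/32), ("M8", 1/32)]
print(shannon_fano(probs))
# {'M1': '0', 'M2': '100', 'M3': '101', 'M4': '1100', 'M5': '1101',
#  'M6': '1110', 'M7': '11110', 'M8': '11111'}
```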

  27. Example

Messages Mi   Pi     No. of bits   Code
M1            1/2    1             0
M2            1/8    3             100
M3            1/8    3             101
M4            1/16   4             1100
M5            1/16   4             1101
M6            1/16   4             1110
M7            1/32   5             11110
M8            1/32   5             11111

  28. This can be downloaded from www.amitdegada.weebly.com/download After 5:30 Today

  29. Questions

  30. Thank You
