Title: Shannon's Information Theory
Prepared by: Shokouh Riazi, Mina Mohammadi, Faranak Kazemi Majd
Contents
• Biography
• Invention
• Awards & Honors
• What is Information Theory?
• Noise
• Communication Theory
• Definition of Entropy
Biography
Claude Elwood Shannon (April 30, 1916 – February 24, 2001) was an American mathematician and electronic engineer known as "the father of information theory".
Invention
Shannon also invented many devices, including rocket-powered flying discs, a motorized pogo stick, and a flame-throwing trumpet for a science exhibition. One of his more humorous devices was a box kept on his desk called the "Ultimate Machine", based on an idea by Marvin Minsky. Otherwise featureless, the box had a single switch on its side. When the switch was flipped, the lid of the box opened and a mechanical hand reached out, flipped off the switch, then retracted back inside the box. Renewed interest in the "Ultimate Machine" has emerged on YouTube and Thingiverse. He also built a device that could solve the Rubik's Cube puzzle, and he is considered the co-inventor of the first wearable computer, along with Edward O. Thorp; the device was used to improve the odds when playing roulette.
Awards & Honors
Throughout his life, Shannon received many honors, including the Morris Liebmann Memorial Award in 1949, the Ballantine Medal in 1955, and the Mervin J. Kelly Award of the American Institute of Electrical Engineers in 1962. In addition, he was awarded the National Medal of Science in 1966, as well as the Medal of Honor that same year from the Institute of Electrical and Electronics Engineers. He also received the Jacquard Award in 1978, the John Fritz Medal in 1983, and the Kyoto Prize in Basic Sciences in 1985, along with numerous other prizes and over a dozen honorary degrees. He was a member of the American Academy of Arts and Sciences, the National Academy of Sciences, the National Academy of Engineering, the American Philosophical Society, and the Royal Society of London.
What is Information Theory?
Information theory deals with the measurement and transmission of information through a channel. The foundational work in this area is Shannon's information theory, which provides many useful tools based on measuring information in terms of bits or, more generally, in terms of the minimal amount of complexity of the structures needed to encode a given piece of information.
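To make "measuring information in bits" concrete, here is a minimal sketch (an illustration added here, not from the original slides): an event with probability p carries −log₂(p) bits of information, so rarer events carry more.

```python
import math

def self_information(p: float) -> float:
    """Information carried by an event of probability p, in bits: -log2(p)."""
    return -math.log2(p)

print(self_information(0.5))     # a fair coin flip carries 1.0 bit
print(self_information(1 / 26))  # one letter of a uniform 26-letter alphabet: ~4.70 bits
```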
In defining information, Shannon identified the critical relationships among the elements of a communication system:
• the power at the source of a signal;
• the bandwidth or frequency range of an information channel through which the signal travels;
• the noise of the channel, such as unpredictable static on a radio, which will alter the signal by the time it reaches the last element of the system;
• the receiver, which must decode the signal.
NOISE Noise can be considered data without meaning; that is, data that is not being used to transmit a signal, but is simply produced as an unwanted by-product of other activities. Noise is still considered information, in the sense of Information Theory.
ENTROPY
In thermodynamics, entropy is a quantitative measure of the disorder of a system, inversely related to the amount of energy available to do work in an isolated system. The more dispersed the energy has become, the less work it can perform and the greater the entropy.
Entropy
Shannon showed that if the entropy rate, the amount of information you wish to transmit, exceeds the channel capacity, then there will be unavoidable and uncorrectable errors in the transmission. This is intuitive enough. What was truly surprising, though, is that he also showed that if the sender's entropy rate is below the channel capacity, there is a way to encode the information so that it can be received without errors. This is true even if the channel distorts the message during transmission. Shannon also adapted his theory to analyze ordinary human (written) language. He showed that it is quite redundant, using more symbols and words than necessary to convey messages. Presumably, this redundancy is used by us to improve our ability to recognize messages reliably and to communicate different types of information.
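As a hedged illustration of this trade-off (a simple repetition code over a simulated binary symmetric channel, not Shannon's construction): adding redundancy lowers the transmission rate but drives the error rate down.

```python
import random

def bsc(bits, p):
    """Binary symmetric channel: flip each bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def encode_repeat(bits, n):
    """Rate-1/n repetition code: send each bit n times."""
    return [b for b in bits for _ in range(n)]

def decode_repeat(bits, n):
    """Decode by majority vote over each block of n received bits."""
    return [int(sum(bits[i:i + n]) > n // 2) for i in range(0, len(bits), n)]

random.seed(0)
message = [random.randint(0, 1) for _ in range(10_000)]
p = 0.1  # channel crossover probability

for n in (1, 3, 9):
    received = decode_repeat(bsc(encode_repeat(message, n), n), n)
    errors = sum(a != b for a, b in zip(message, received))
    print(f"repetition n={n}: bit error rate = {errors / len(message):.4f}")
```

Repetition coding pushes the rate toward zero as n grows; Shannon's surprising result is that cleverer codes can achieve any rate below the channel capacity (C = 1 − H(p) for this channel, with H the binary entropy function) with an error probability as small as desired.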
Entropy
One of the most important features of Shannon's theory was the concept of entropy, which he demonstrated to be equivalent to a shortage in the information content of a message. According to the second law of thermodynamics, formulated in the 19th century, entropy, the degree of randomness in any closed system, always increases. Because language is redundant, many sentences can be significantly shortened without losing their meaning. Shannon proved that over a noisy line a signal could always be sent without distortion: if the message is encoded in such a way that it is self-checking, signals will be received with the same accuracy as if there were no interference on the line. A language, for example, has a built-in error-correcting code, which is why a noisy party conversation remains partly intelligible: roughly half of the language is redundant. Shannon's methods were soon seen to have applications not only to computer design but to virtually every subject in which language is important, such as linguistics, psychology, cryptography, and phonetics.
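A rough sketch of that redundancy claim (the sample text below is an arbitrary stand-in, and single-character frequencies ignore the dependencies Shannon actually exploited): compare the empirical per-character entropy of some English text with the maximum, log₂(alphabet size).

```python
import math
from collections import Counter

# Arbitrary English sample; a longer text would give a steadier estimate.
text = "information theory deals with the measurement and transmission of information through a channel"

counts = Counter(text)
total = sum(counts.values())
# Empirical per-character entropy of the sample, in bits.
h = -sum((c / total) * math.log2(c / total) for c in counts.values())
h_max = math.log2(len(counts))  # entropy if all observed symbols were equally likely

print(f"entropy: {h:.2f} bits/char, maximum: {h_max:.2f} bits/char")
print(f"first-order redundancy estimate: {1 - h / h_max:.0%}")
```

Shannon's own estimates, which modeled letter and word dependencies, put the redundancy of English at roughly half; a first-order count like this one understates it.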
Information theory and entropy
• Information theory tries to solve the problem of communicating as much data as possible over a noisy channel.
• The measure of data is entropy (defined in the sketch below).
• Claude Shannon first demonstrated that reliable communication over a noisy channel is possible, which jump-started the digital age.
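The slides never state the definition explicitly; for a source emitting symbols with probabilities p₁, …, pₙ, Shannon's entropy is H = −Σ pᵢ log₂ pᵢ bits per symbol. A minimal sketch:

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)) in bits per symbol; terms with p = 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.47 bits (more predictable, less information)
```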
Shannon's information theory provides us with the basis for the field of information theory.
• It identifies the problems we have in our communication systems.
• We have to find ways to reach his goal of an effective communication system.