This piece delves into the essential elements of information theory, arguing for its study within philosophy. It covers vital components such as data compression, error correction, and encryption, using practical examples like Morse code and secret key systems. The text presents methods of encoding data securely and discusses the concept of entropy as a measure of uncertainty. It also addresses broader topics in the field, including learning and recognizing patterns, highlighting the interdisciplinary nature of communication and data representation.
A Bit about Bits
Shouldn’t “information theory” be studied in the philosophy department?
Secret Sauce
Ingredients:
• Data compression
• Error correction
• Encryption
Error Correction
Start with four bits of information: abcd. Choose parity bits e, f, and g so that each circle sums to an even number. The new sequence is abcdefg.
[Figure: three overlapping circles containing the bits a, b, c, d]
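The circle construction on this slide is the Hamming(7,4) code. A minimal sketch in Python, assuming one plausible circle assignment (the exact membership of a, b, c, d in each circle depends on the diagram, which is not reproduced here): each parity bit makes its circle sum to an even number, and the pattern of failed checks pinpoints a single flipped bit.

```python
def encode(a, b, c, d):
    # Parity bits chosen so each "circle" sums to an even number.
    # Assumed membership: circle 1 holds a, b, d; circle 2 holds
    # a, c, d; circle 3 holds b, c, d (d sits in the center overlap).
    e = a ^ b ^ d
    f = a ^ c ^ d
    g = b ^ c ^ d
    return [a, b, c, d, e, f, g]

def correct(word):
    # Recompute each circle's parity; a nonzero syndrome tells us
    # which single bit to flip (each bit sits in a unique set of circles).
    a, b, c, d, e, f, g = word
    syndrome = (a ^ b ^ d ^ e, a ^ c ^ d ^ f, b ^ c ^ d ^ g)
    position = {(1, 1, 0): 0, (1, 0, 1): 1, (0, 1, 1): 2, (1, 1, 1): 3,
                (1, 0, 0): 4, (0, 1, 0): 5, (0, 0, 1): 6}
    if syndrome != (0, 0, 0):
        word = word.copy()
        word[position[syndrome]] ^= 1
    return word

codeword = encode(1, 0, 1, 1)
corrupted = codeword.copy()
corrupted[2] ^= 1              # flip one bit in transit
restored = correct(corrupted)  # the flipped bit is found and repaired
```

The point of the construction: seven transmitted bits carry four bits of information, yet any single bit flip can be undone.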
Encryption
• Use a “secret key” to hide information.
• Pair each bit of information with a bit of secret key.
• If the secret-key bit is ‘1,’ flip the information bit. Otherwise leave it alone.

Example:
Message:    010110
Secret key: 110011
Result:     100101
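The flip-where-the-key-is-1 rule is a bitwise XOR, i.e. a one-time pad. A minimal sketch using the slide's own example (the bit strings are from the slide; the function name is just illustrative):

```python
def xor_bits(message, key):
    # Flip each message bit wherever the matching key bit is '1'.
    assert len(message) == len(key)
    return ''.join('1' if m != k else '0' for m, k in zip(message, key))

ciphertext = xor_bits('010110', '110011')  # -> '100101', as on the slide
plaintext = xor_bits(ciphertext, '110011') # applying the key again
                                           # recovers '010110'
```

Because XOR is its own inverse, the same key both encrypts and decrypts; anyone without the key sees only a string that looks random.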
Information Theory
• How far can we push these magic tricks?
• Entropy, entropy, entropy
• Other topics:
  • Learning
  • Sensing
  • Recognizing patterns
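Entropy, the measure of uncertainty the slides keep returning to, has a one-line formula: H = -Σ p·log₂(p), in bits per symbol. A minimal sketch (the function name and sample strings are illustrative, not from the slides):

```python
from collections import Counter
from math import log2

def entropy(symbols):
    # Shannon entropy: -sum of p * log2(p) over symbol frequencies.
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * log2(c / n) for c in counts.values())

entropy('aaaa')  # 0.0 bits: a certain outcome carries no information
entropy('abab')  # 1.0 bit per symbol: a fair coin's worth of surprise
```

Entropy is the hard limit on the "magic tricks" above: no compressor can squeeze a source below its entropy, on average, without losing information.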