Tuesday, January 8, 2008

Silence

Information proliferates by incorporating a greater diversity of cultural codes and worldly sources, and it generates still greater variety by internal means; the sheer volume of information increases as words, sounds, and images become freighted with multiple, shifting allusions and meanings. In this way, information takes on multiple personalities, and its very nature becomes less "natural." However, through the redundancies trafficked by mass culture, much information pushes so far past this threshold that it becomes "naturalized": effectively muted, devalued, and reduced to interference, which is contrary to the communication inherent in the very word "information."

-Paraphrased from Douglas Kahn's Noise, Water, Meat

Douglas Kahn made a brilliant statement concerning the evolution of sound within mass culture, which I have adapted above to suit the wider spectrum of information theory. I draw my understanding from Claude Shannon's theory of information and his concept of information entropy.
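
To make entropy concrete, here is a minimal Python sketch (my own illustration, not drawn from Kahn or Shannon) that computes the Shannon entropy of a short message, i.e. the average information per symbol in bits:

    import math
    from collections import Counter

    def shannon_entropy(message):
        # Average information per symbol in bits:
        # H = sum over symbols of p * log2(1/p)
        counts = Counter(message)
        total = len(message)
        return sum((n / total) * math.log2(total / n) for n in counts.values())

    # A perfectly redundant message carries no information per symbol;
    # a maximally varied one carries the most.
    print(shannon_entropy("aaaaaaaa"))  # 0.0 bits
    print(shannon_entropy("abcdefgh"))  # 3.0 bits

The redundancy Kahn gestures at is exactly the gap between a message's actual entropy and the maximum its alphabet allows.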




"Information is a degree of order, or non-randomness, which can be measured and treated mathematically much as mass or energy or other physical quantities are. A mathematical characterization of the generalized communication system yields a number of important quantities, including

1. the rate at which information is produced at the source.
2. the capacity of the channel for handling information.
3. the average amount of information in a message of any particular type.

To a large extent the techniques used in information theory are drawn from the mathematical science of probability. Estimates of the accuracy of a given transmission of information under known conditions of noise interference, for example, are probabilistic, as are the numerous approaches to encoding and decoding that have been developed to reduce uncertainty or error to minimal levels."
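
The second quantity in that list, channel capacity, has a famous closed form for the simplest noisy channel. As a sketch (my own example, not part of the quoted passage), the capacity of a binary symmetric channel that flips each bit with probability p is C = 1 - H(p), where H is the binary entropy function:

    import math

    def binary_entropy(p):
        # H(p) = -p*log2(p) - (1-p)*log2(1-p): uncertainty of a biased coin flip
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def bsc_capacity(p):
        # Bits of information that survive each use of a channel
        # flipping each bit with probability p
        return 1.0 - binary_entropy(p)

    print(bsc_capacity(0.0))   # 1.0  -- a noiseless channel carries a full bit
    print(bsc_capacity(0.11))  # ~0.5 -- half of each bit must go to redundancy
    print(bsc_capacity(0.5))   # 0.0  -- pure noise carries nothing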

"According to the second law of thermodynamics, as in the 19th century, entropy is the degree of randomness in any system always increased. Thus many sentences could be significantly shortened without losing their meaning. Shannon proved that in a noisy conversation, signal could always be sent without distortion. If the message is encoded in such a way that it is self-checking, signals will be received with the same accuracy as if there were no interference on the line. A language, for example, has a built in error-correcting code. Therefore, a noisy party conversation is only partly clear because half the language is redundant."



