
Information Theory


Several decades back, Claude Shannon founded information theory. Here’s a short description that partakes of some of the hype that has always surrounded it.

Back in the 1950s and 1960s, Shannon’s “A Mathematical Theory of Communication” served the purpose that “quantum” does today. It was a way for a group of people to show their perspicacity in understanding pretty much everything, and probably a genuine contribution to making sense of a developing field. The transistor had been invented in the late 1940s, and World War II had demonstrated the usefulness of high-speed computation.

There was much trying-to-be-erudite discussion of information theory and just plain mystification, as with the quantum everything we keep hearing about today. “Entropy” has always been useful for that kind of mystification. That fad for information theory has largely died down, which is why I was surprised to hear that it has intrigued Elon Musk and others in Silicon Valley.

At the time, I was a youngster learning about chemical entropy, so naturally I tried to connect the two. In chemistry, temperature is something like information, and entropy is a measure of the disorder that temperature brings about, although it also includes things like one molecule reacting to produce two.

It has always seemed to me that there is a fundamental difference between chemical thermodynamics and information theory. Chemical thermodynamics deals with measurable quantities and whether or not chemical reactions can take place under particular conditions of pressure and temperature. Information, in one of the senses of that word, depends on human minds. Without a human mind to make the connection between a string of 1’s and 0’s and, say, a photograph, information means nothing. There is nothing inherent in that string of 1’s and 0’s that implies a photograph.
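To make that concrete, here is a minimal Python sketch (the byte values are made up purely for illustration): the same raw bits read once as text and once as grayscale pixel values. Nothing in the bytes themselves says which reading is the right one; that connection is supplied by whoever chose the encoding.

```python
# The same raw bits, with no inherent meaning of their own.
raw = bytes([72, 105, 33, 10])  # arbitrary example values

# One reader decides they are ASCII text...
as_text = raw.decode("ascii")   # -> "Hi!\n"

# ...another decides they are grayscale pixel intensities (0-255).
as_pixels = list(raw)           # -> [72, 105, 33, 10]

print(repr(as_text))
print(as_pixels)
```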

Chemical entropy has occasionally been used as a mystification device, but statistical mechanics has put a firm theoretical foundation under it. The physicists’ understanding of thermodynamics seems to exist in a different world from the chemists’, so maybe their entropy is more like Shannon’s entropy.
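For what it’s worth, the formal resemblance is easy to write down; here is a sketch using the standard textbook definitions (nothing here is specific to this post):

```latex
% Boltzmann's statistical-mechanical entropy: k_B times the log of the
% number of microstates W consistent with the macroscopic state.
S = k_B \ln W

% Shannon's entropy of a source with symbol probabilities p_i,
% measured in bits when the logarithm is base 2.
H = -\sum_i p_i \log_2 p_i
```

Up to the constant k_B and the base of the logarithm, the Gibbs form of the statistical entropy, S = -k_B Σ p_i ln p_i, is the same expression as Shannon’s, which may be why the physicists’ entropy feels closer to his than the chemists’ calorimetric quantity does.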

Information theory has been essential in the development of computation. But maybe “information” in that phrase is much more specialized than the way we commonly understand that word.
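In that specialized sense, “information” is just a count of bits computed from probabilities, with no reference to meaning. A minimal sketch, using a biased coin chosen purely as an example:

```python
import math

def shannon_entropy(probs):
    """Average information per symbol, in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per flip; a heavily biased coin carries less,
# no matter what the outcomes "mean" to anyone.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # about 0.47
```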

Cross-posted to Nuclear Diner
