The science of information

"Information: The New Language of Science" by Hans Christian von Baeyer.

I recently wrote about Claude Shannon and Warren Weaver’s book “The Mathematical Theory of Communication” and its insights into the idea of measuring information. I had originally planned that piece as an introduction to a review of a more recent, more easily digestible book, Information: The New Language of Science by Hans Christian von Baeyer, but decided to write separate entries on the two books.

['Information' cover]

I enjoyed von Baeyer’s book a great deal, and recommend it to anyone interested in where the science of information has been (for example, Shannon and Weaver) and where it’s leading. The author, a physicist at Virginia’s College of William and Mary, has written several popular science books on physics-related topics. This background leads him to draw an extended analogy throughout the book between the evolution of scientific approaches to dealing with information and historical approaches to dealing with energy:

The gradual crystallization of the concept of information during the last hundred years contrasts sharply with the birth of the equally abstract quantity called energy in the middle of the nineteenth century. Then, in the brief span of twenty years, energy was invented, defined, and established as a cornerstone, first of physics, then of all science. We don’t know what energy is, any more than we know what information is, but as a now robust scientific concept we can describe it in precise mathematical terms, and as a commodity we can measure, market, regulate and tax it.

He certainly knows the history of approaches to energy, and from Ludwig Boltzmann’s work to the present he describes many issues common to the quantification of both energy and information, such as entropy, randomness, probability, noise, and the relationship between logarithmic measurement and human perception. Toward the end of the book he shows how more recent subatomic issues in physics have more direct implications for information theory than the history of energy does, as he reviews qubits and Schrödinger’s ideas about how we can know what’s what inside of an atom.
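
To give a quick, concrete sense of that logarithmic measurement (my own illustration, not an example from the book): Shannon’s entropy counts the average number of bits per symbol needed to encode a source, and a few lines of Python show how a predictable source carries less information than an unpredictable one. The probabilities below are made-up examples.

```python
# Illustrative only: Shannon entropy, the logarithmic measure of information.
from math import log2

def shannon_entropy(probabilities):
    """Average information content, in bits per symbol, for the given probabilities."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin carries 1 bit per toss; a heavily biased coin carries much less,
# because its outcomes are more predictable (less "surprising").
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # about 0.47
```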

The concept of linking, which I think of as an expression of resource relationships, has always been close to my heart. After von Baeyer quotes Henri PoincarĂ© saying “The aim of science is not things in themselves, as the dogmatists in their simplicity imagine, but the relations between things; outside those relations there is no reality knowable”, I found it particularly interesting when von Baeyer describes information as “the communication of relationships”.

Claude Shannon is one of the book’s heroes (von Baeyer writes that Shannon’s “A Mathematical Theory of Communication” has been “likened to the Magna Carta, Newton’s laws of motion, and the explosion of a bomb”) and von Baeyer makes an interesting prediction about Shannon and Weaver’s distinction between semantic information and information as a set of symbols whose successful transmission can be accurately measured:

…the word ‘information’ has two different senses. The colloquial usage, as in ‘personal information’ and ‘directory information’, refers to the meaning of a message of some sort. The technical sense, on the other hand, emphasizes the symbols used to transmit a message… Eventually the two definitions of information should converge, but that hasn’t happened yet. When it does, we will finally know what information is; until then we have to make do with compromises.

His prediction of this convergence is pretty exciting. Overall, “Information” is a fascinating book, never too technical, and will especially appeal to geeks interested in the future of publishing applications and of any other applications that manipulate content or data with semantic value.