The Information, by James Gleick. Pantheon Books. Hardback. 2011.
Book Review 0.1, from a previous blog post.
Book Review 0.2, Progress Update, pg. 17, from my Goodreads
“I’m sure someone will review, ‘Ho hum, I knew all this stuff on information already.’ Meh; good for you. Guess what? Me, too. But I still like these books and, as I read, I bear in mind whether others might enjoy the book. Not only do I read the book for me, but also so I can argue whether the book is valuable to others. Gleick likes writing about science and has found a niche.”
Book Review 0.3, Progress Update, pg. 124, from my Goodreads
“Charles Babbage & Ada Lovelace, partners in imagination who were each, and together, well ahead of their time. Imagine their frustration at seeing what could be accomplished, yet having neither the technology to build the Analytical Engine nor the language to even discuss the principles involved. How do we discuss technology and science when language lags?”
Book Review 0.4, Progress Update, pg. 178, from my Goodreads
“Claude Shannon is not a name most would recognize. He epitomizes education, however. As a kid, he built a telegraph from barbed wire. While studying engineering, he pursued mathematics. While working on math, he developed a theory of algebra for the study of genetics. That work was never published, and was only discovered forty years later. Shannon was well ahead of his time, or on time yet not recognized.”
Book Review 0.5, Progress Update, pg. 204, from my Goodreads
“Kurt Gödel makes a substantive appearance, proving that some truths cannot be proved, and that no sufficiently powerful formal system can be both complete and consistent.”
Book Review 0.6, Progress Update, pg. 293, from my Goodreads
“This book is completely unexpected. The Information covers many facets of information, not merely computer information, but DNA, RNA, cryptography, language, and other forms of information. Also, sort of a biography of Claude Shannon.”
I stitched in my Goodreads Progress Updates for a couple reasons. First, you get a sense of how my perspective on the book changed over a few days. Second, you get a sense of the book’s scope, from Charles Babbage and Ada Lovelace, to Claude Shannon and genetics, some commentary about Kurt Gödel, and eventually we arrive back at Claude Shannon.
I very much enjoyed the opening chapters. These chapters dealt primarily with the realization, across time and across space, of the importance of Information. The cliche is, “from the Dawn of Humankind people have sought <insert quest>.” The importance of Information cannot be traced back to the Dawn of Humankind. We can trace the History of Information back to the Library of Alexandria, perhaps. We might be able to go somewhat further back if we consider maps, murals, and other glyphs as information, with rocks, mud, and hides as the canvas or storage media, and inks rendered from natural materials as the data record.
Cultures in Africa developed a means to convey messages over long distances. The drums we see used primarily for music today are either replicas or derivatives of those used to send messages. The tones generated did not coalesce into words in the listener’s brain, though. The tones connoted a theme, “visitors coming your way,” for example. While not mentioned in the book, I imagine the Aboriginal didgeridoo had the same utility.
These early chapters dealt primarily with how to code a message. Not with making a message “secret,” but with reducing the size and complexity of a message while still maintaining fidelity. The economics of the 1800s made sending a telegram somewhat expensive, so senders tried to avoid writing grammatically correct sentences. In fact, senders gave up writing sentences for the most part. Depending on the sender, an individual or a person sending on behalf of a company, an elaborate coding scheme may have been developed in advance of any telegram. These code books would then be used to decode the telegram on arrival. Today, we do this with text (SMS) messaging. We replace “laugh out loud” with “LOL,” and “In Case You Missed It” with “ICYMI,” for example. To advance to a true coding scheme, though, questions arise such as, “How do we reduce a written language to a set of easily coded symbols that are also easy to restore to their original state at the destination?” Charles Babbage figured prominently in this work, as did Samuel Morse, obviously; we have a code named for him.
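The code-book idea above is easy to sketch in code. The snippet below is a minimal, illustrative Python sketch of codebook-style compression, in the spirit of those telegraph code books; the codebook entries are modern SMS abbreviations plus one made-up commercial-style code (“ATM1”), not a historical telegraph code.

```python
# A minimal sketch of codebook compression: replace agreed-upon phrases
# with short codes, and invert the codebook at the destination to
# restore the original message with full fidelity.

CODEBOOK = {
    "laugh out loud": "LOL",
    "in case you missed it": "ICYMI",
    "arriving tomorrow morning": "ATM1",  # hypothetical commercial-style code
}

def encode(message: str, book: dict) -> str:
    """Replace each known phrase with its shorter code word."""
    out = message.lower()
    for phrase, code in book.items():
        out = out.replace(phrase, code)
    return out

def decode(message: str, book: dict) -> str:
    """Invert the codebook to restore the original phrasing."""
    for phrase, code in book.items():
        message = message.replace(code, phrase)
    return message

msg = "in case you missed it, we are arriving tomorrow morning"
short = encode(msg, CODEBOOK)
assert decode(short, CODEBOOK) == msg   # fidelity maintained
assert len(short) < len(msg)            # message shortened
```

The two assertions capture the trade-off the chapter describes: the coded message costs less to send, yet the receiver, holding the same code book, recovers the original exactly.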
Later chapters weave together some convoluted themes. Early 20th-century scientists, Einstein, Szilard, Turing, Wheeler, Gödel, all seemed to chime in on what they felt about Information. Makes sense; these were many of the same people trying to break wide open the mysteries of the atom and the Universe. What materials best store a message? What materials best allow a message to be sent across a distance? How fast can a message be sent? How do I ensure the message which arrives is not broken or corrupted? How can I store a message and retrieve that information at some later time? How fast can a message be processed? Some of these answers involved psychology: understanding how the human brain processes information, and how messages which conform to a structure can still be misinterpreted.
Very early in the 20th century, and again, back to Babbage in the 19th century, people began to realize these questions and their answers might lead to some fascinating discoveries. Could a machine be built to handle numbers? If a machine could be built to handle numbers, could a machine then think? If a machine could think, how would one realize she was having a conversation with a machine? And thus we delve into the world of Alan Turing, albeit briefly.
Alan Turing developed a thought experiment which some think can discriminate between a real person and a machine intelligence. If you’d like to see this in action, just watch “Blade Runner” (or, better, read “Do Androids Dream of Electric Sheep?”). Deckard uses a Turing-style test, the Voight-Kampff test, to determine who is a Replicant and who is human. The question remains, though: is it possible for a machine to believe itself to be “human,” and thus be able to pass a Turing Test?
Two sciences, genetics and psychology, began to realize their interconnectedness. RNA and DNA were discovered, genes and chromosomes were revealed, and scientists began to realize the depths at which information existed. With genes and DNA, we now realized information existed at least at the molecular level. The information for reconstructing an entire organism exists within a single cell. Psychology became involved when corruption in the genetic coding led to personality issues, or preferential relationships, or bias concerning physical traits. Psychologists could begin to examine how genetic coding influenced, or was influenced by, environmental factors. As more research was conducted, some, like Richard Dawkins, would state that our physical bodies are nothing more than vehicles working at the whim of our genes. (pg. 323)
If I seem like I’m wandering, I am, and the book does pretty much the same. I’m leaning toward the idea the book may have bitten off more than was possible to chew in one volume. In a sense, The Information tries to be a biography of Information. That is a huge undertaking. Gleick covers not only the telegraph and the transistor, but RNA, DNA, genes, chromosomes, logic, and taxonomy. In addition to topical coverage, Gleick also provides biographical details on Charles Babbage and Claude Shannon. The Information tries, and succeeds to varying degrees, to also be a biography of Babbage and a biography of Shannon.
Claude Shannon figures prominently throughout The Information, as well he should. He literally wrote the book on information theory. Or, more accurately, wrote his master’s thesis, “A Symbolic Analysis of Relay and Switching Circuits,” showing that relay and switching circuits could implement Boolean algebra, and thus carry out logical and arithmetic operations. His MS thesis has been called “the most important, and the most famous, master’s thesis of the 20th century.” (Wikipedia) Shannon had many interests in many fields, but everything distilled down to information. He performed work in genetics analysis forty years ahead of similar efforts by those in the genetics field. If only those folks had known about Shannon’s research into an algebra for theoretical genetics. We must remember that, even though the world was connected in the 1940s and 1950s, people often had no idea what was happening in their own discipline within their own country, or in Europe, or in Russia, for that matter.
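Shannon’s thesis insight is simple enough to sketch: switches wired in series behave like Boolean AND, switches wired in parallel behave like OR, and a normally-closed relay acts as NOT. The Python sketch below is my own illustration of that correspondence, not Shannon’s notation; the function names are invented for clarity.

```python
# Shannon's correspondence between relay circuits and Boolean algebra:
# series wiring = AND, parallel wiring = OR, normally-closed relay = NOT.

def series(a: bool, b: bool) -> bool:
    """Two switches in series conduct only if both are closed (AND)."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Two switches in parallel conduct if either is closed (OR)."""
    return a or b

def relay_not(a: bool) -> bool:
    """A normally-closed relay conducts when its coil is NOT energized."""
    return not a

# Example circuit: a two-way hallway light switch (XOR) built from the
# primitives above -- the lamp is lit when exactly one switch is flipped.
def two_way_light(a: bool, b: bool) -> bool:
    return parallel(series(a, relay_not(b)), series(relay_not(a), b))

assert two_way_light(True, False) and two_way_light(False, True)
assert not two_way_light(True, True) and not two_way_light(False, False)
```

Once circuits can be written as Boolean expressions, designing and simplifying them becomes algebra rather than trial-and-error wiring, which is why the thesis mattered so much to the digital computer.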
Prior to reading The Information I had no recollection of having learned about Claude Shannon in any previous reading. It could be previous authors simply footnoted Shannon and I just didn’t make a note. Shannon, for lack of a better title, could be called the Father of Information Theory. Shannon is also credited with helping develop the digital computer and is one of the earliest researchers of cryptography. He and his efforts need to be discussed in introductory computer science courses. I’ve had several computer science courses and I simply have no recollection of hearing about him. Maybe times have changed. A good portion of the book invokes Shannon’s life and his efforts which led to the evolution of the field of Information Theory. One aspect of Shannon’s life omitted from the text is his use of his intellect to obtain a certain amount of wealth from gambling, work he pursued with Edward Thorp. Techniques descended from that work drove the real-life exploits of college students in Las Vegas detailed in the book “Bringing Down the House” by Ben Mezrich (Atria, 2003) and dramatized in the movie “21” (IMDB), based on the book and starring Kevin Spacey. Neither the book nor the movie is about Claude Shannon, per se. Shannon’s impact on information theory and science infuses the text, though. His spirit is clearly present.
The scope of material conveyed by Gleick can feel overwhelming at times. From telegraphy, telephony, and cryptography, to DNA and genetics, with a dash of biogeography and the taxonomy of birds tossed into the fray, a fray which also contains some mathematical proofs of what is “knowable.” The book is not a linear telling, nor does it settle on a single personality or a single theme for too long. But then, Information does not limit itself to a single niche, either.
The scope of Humankind’s interest in Information obviously dates back millennia; however, our realization of the gamut of data available to collect is really a very recent manifestation. That our universe may be an unimaginably large quantum computing environment, full of Information, scaling from the largest structures in the Universe down to the state of a single qubit, was literally not even possible to imagine 100 years ago. Perhaps as late as 50 years ago. A few people, John Archibald Wheeler, Roger Penrose, Stephen Hawking, may have ventured down those dark recesses of scientific philosophy, but the vast majority of the scientific community was very much locked into the Newtonian view of the Universe.
I’m calling out my recommendation from my other reviews. The Information is so atypical of other books I’ve read lately that I would not recommend this book to just anyone. I’m not going to say, “Rush right out and read this; your life will be much improved.” My guess is some readers would simply give up after a few chapters. Gleick presents story after story, anecdote after anecdote, separated by geography and time. Some readers might balk at the non-linear delivery of these stories and anecdotes. The book might be recommended for anyone working or studying in a STEM discipline, or perhaps a historian. I’m not sure just anyone would read this book for pleasure, like I did.