Here’s a wonderful little post by James Gleick about the meaning of the word “information,” according to the OED. A palace indeed. This reminds me of one of my favorite books, Jeremy Campbell’s Grammatical Man: Information, Entropy, Language, and Life — probably the first book I read that suggested serious connections among my own work (the interpretation of texts), cognitive science, and computers. It was this book that told me who Claude Shannon is and why he matters so very, very much to the world we now inhabit.
Grammatical Man is almost thirty years old now and much of the science is clearly outdated, but it’s still fascinating, and I wish someone brilliant would tell the same story again, in light of present knowledge. Maybe a job for Steven Johnson?
7 Comments
The book sounds interesting – but perhaps you could do a post sometime on what you found interesting about it and how it helped you in your practice as a literary critic? I'd be very interested, for sure.
One of my main disagreements with my adviser was over the role of information theory in doing Artificial Intelligence. Usually the problem in AI is the matter of going from syntax to semantics. There's data (in the form of bits), computer programs are really good at searching for syntactic patterns, and usually, in specific cases, there is some relationship of these patterns to meaning (although no one has yet found any general-level relationship between syntax and semantics, and I doubt there is one).
Now one — but not the only — way of finding syntactical patterns is using the concept of entropy, and perhaps because entropy is called a measure of "information" as per Information Theory, everyone seems to think that this is THE unit of syntax that has THE relationship to semantics. I happen to think that as a way of finding and evaluating patterns, entropy is no better or worse than any other quantity we can come up with. In fact, this whole notion that entropy is a measure of "information" keeps practitioners from looking at other ways of mapping syntax that can work better in specific cases.
(Which is not to say that entropy is just another quantity in communication theory, which it clearly revolutionized. But again, communication theory needs to be understood not as communication involving meaning but just as a way of transmitting data accurately.)
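For a concrete sense of what "entropy as a measure of information" means here, a minimal sketch in Python that computes the Shannon entropy of a string's character distribution (the shannon_entropy helper is just illustrative, not anything from the discussion above):

```python
import math
from collections import Counter

def shannon_entropy(text):
    """Shannon entropy of a string's character distribution, in bits per symbol."""
    counts = Counter(text)
    total = len(text)
    # H = sum over symbols of p * log2(1/p)
    return sum((c / total) * math.log2(total / c) for c in counts.values())

print(shannon_entropy("aaaa"))         # 0.0 bits -- a single repeated symbol carries no surprise
print(shannon_entropy("abab"))         # 1.0 bits -- two equally likely symbols
print(shannon_entropy("information"))  # ~2.9 bits -- a more varied symbol distribution
```

The point of the comment stands either way: this number says something about the statistics of the symbols, not about what the text means.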
Will someone who is smart and who loves words please write about what they're doing to "disruption" and "emergent"?!?
The OED should've drilled deeper. A reference to physicist John Wheeler's "It from Bit" [*] would've added some flair.
[*] http://suif.stanford.edu/~jeffop/WWW/wheeler.txt
Vince, that's awesome, thanks.
Scritic, I will try to write more later. About information and AI, I've long been interested in what Douglas Hofstadter says about the importance of analogy in intelligent response to information. Not something I know much about, but fruitful with . . . analogies to what I do.
Tony, what are they doing to "disruption" and "emergent"? "Emergent" has a semi-technical meaning, related to swarm behavior and collective intelligence (e.g., in ant colonies) — is that what you mean?
What are they doing? The same goddam thing they did to "evangelist" and "innovation"!
Tony, you should become an evangelist for more precise English.
I only learned about this in the last week, so I'm not sure if I'm doing this right, but here it goes:
I see what you did there.