Published 1982 · Simon & Schuster · 319 pages · ISBN 0671440616
Subjects: Information theory, systems theory, cybernetics, linguistics
Grammatical Man: Information, Entropy, Language, and Life is a 1982 book by Jeremy Campbell, then the Evening Standard's Washington correspondent. The book touches on probability, information theory, cybernetics, genetics, and linguistics. It frames and examines existence, from the Big Bang to DNA to human communication to artificial intelligence, in terms of information processes. The text consists of a foreword, twenty-one chapters, and an afterword, divided into four parts: Establishing the Theory of Information; Nature as an Information Process; Coding Language, Coding Life; and How the Brain Puts It All Together.
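The entropy of the book's subtitle is Shannon's measure of the information carried by an uncertain outcome. As an illustration of that idea (this is a minimal sketch, not code from the book), the entropy of a probability distribution can be computed directly from Shannon's formula H = -Σ p·log₂(p):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)).
    Outcomes with zero probability contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: each toss carries one full bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so each toss carries less information.
print(shannon_entropy([0.9, 0.1]))
```

The second value is below one bit, reflecting the book's recurring theme that information is a measure of surprise, not of meaning.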
Contents
Part 1: Establishing the Theory of Information
Part 2: Nature as an Information Process
For Laplace's "intelligence," as for the God of Plato, Galileo and Einstein, the past and future coexist on equal terms, like the two rays into which an arbitrarily chosen point divides a straight line. If the theories I have presented are correct, however, not even the ultimate computer -- the universe itself -- ever contains enough information to specify completely its own future states. The present moment always contains an element of genuine novelty and the future is never wholly predictable. Because biological processes also generate information and because consciousness enables us to experience those processes directly, the intuitive perception of the world as unfolding in time captures one of the most deep-seated properties of the universe.
To understand complex systems, such as a large computer or a living organism, we cannot use ordinary, formal logic, which deals with events that definitely will happen or definitely will not happen. A probabilistic logic is needed, one that makes statements about how likely or unlikely it is that various events will happen.
Campbell also discusses John von Neumann's work in relating information theory, evolution, and linguistics to machines. The chapter closes with an examination of emergent systems and their relation to Gödel's incompleteness theorems.