Foreword

The story is told that young King Solomon was given the choice between wealth and wisdom. When he chose wisdom, God was so pleased that he gave Solomon not only wisdom but wealth also. So it is with science.

—Arthur Holly Compton

A half century ago, in 1948, Claude Shannon published his epic paper “A Mathematical Theory of Communication.” The paper was republished in book form in 1949 by the University of Illinois Press, together with an expository introduction by Warren Weaver. The slight change in the title between 1948 and 1949 is both trivial and profound. The paper and, by extension, the book have had an immense impact on technological progress, and so on life as we now know it. Despite its specialized character, the book has sold more than 40,000 copies and continues to sell 700 or more copies per year. It is required reading in many courses on information theory. We are delighted to have a small role in the publication of this golden anniversary edition, which is identical to the original except for the correction of a number of minor typographical errors that have persisted through four hardcover and sixteen paperback printings.

In this important work, Shannon chose to use a light touch and a gentle delivery, more common in technical papers of that time than in our own. He saw deeply into the essence of the communication problem and chose to deliver that wisdom in his own way, with a minimum of mathematical proof. Perhaps this underlies the timeless quality of the work. The proofs came elsewhere and later, but the insight shines through. One measure of the greatness of the book is that Shannon's major precept, that all communication is essentially digital, is now commonplace among the modern digitalia, even to the point where many wonder why Shannon needed to state such an obvious axiom. Yet his audience fifty years ago took a somewhat skeptical and aloof view of his work.

Shannon had the foresight to overlay the subject of communication with a distinct partitioning into sources, source encoders, channel encoders, channels, and associated channel and source decoders. Although his formalization seems quite obvious in our time, it was not so obvious back then. Shannon further saw that channels and sources could and should be described using the notions of entropy and conditional entropy. He argued persuasively for the use of these notions, both through their characterization by intuitive axioms and by presentation of precise coding theorems. Moreover, he indicated how very explicit, operationally significant concepts such as the information content of a source or the information capacity of a channel can be identified using entropy and maximization of functions involving entropy.

Shannon's revolutionary work brought forth this new subject of information theory fully formed but waiting for the maturity that fifty years of aging would bring. It is hard to imagine how the subject could have been created in an evolutionary way, though after the conception its evolution proceeded in the hands of hundreds of authors to produce the subject in its current state of maturity.

The exposition by Warren Weaver that introduces the book is one of his many and diverse contributions toward promoting the understanding of science and mathematics among a broad audience. It illustrates how Shannon's ideas have implications that were (at least fifty years ago) well beyond the immediate goals of communication engineers and of Shannon himself. These include insights for linguists and for social scientists addressing broad communication issues.

The impact of Shannon's theory of information on the development of telecommunication has been immense. This is evident to those working at the edge of advancing developments, though perhaps not quite so visible to those involved in routine design. The notion that a channel has a specific information capacity, which can be measured in bits per second, has had a profound influence. On the one hand, this notion offers the promise, at least in theory, of communication systems over a given channel with error rates as small as desired at any data rate less than the channel capacity. Moreover, Shannon's associated existence proof provided tantalizing insight into how ideal communication systems might someday fulfill the promise. On the other hand, this notion also clearly establishes a limit on the communication rate that can be achieved over a channel, offering communication engineers the ultimate benchmark with which to calibrate progress toward construction of the ultimate communication system for a given channel.

Reaching this specific capacity, which no data transmission system can exceed, has been the holy grail of modem design for the last fifty years. Without the guidance of Shannon's capacity formula, modem designers would have stumbled more often and proceeded more slowly. Communication systems ranging from deep-space satellite links to storage devices such as magnetic tapes and ubiquitous compact discs, and from high-speed internetworks to broadcast high-definition television, came sooner and in better form because of his work. Aside from this wealth of consequences, the wisdom of Claude Shannon's insights may in the end be his greatest legacy.
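For the additive white Gaussian noise channel, the capacity formula referred to above is the Shannon-Hartley theorem, C = B log2(1 + S/N). As a rough illustration of why it served modem designers as a benchmark (the 3 kHz bandwidth and 30 dB signal-to-noise ratio below are assumed, ballpark figures for a voiceband telephone line, not numbers from this foreword), a short computation sketches the limit:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity of an AWGN channel, in bits per second."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Assumed, illustrative figures for a voiceband telephone channel:
# roughly 3 kHz of usable bandwidth and a 30 dB signal-to-noise ratio.
snr_db = 30.0
snr_linear = 10.0 ** (snr_db / 10.0)  # 30 dB corresponds to a factor of 1000
capacity = shannon_capacity(3000.0, snr_linear)
print(f"{capacity:.0f} bit/s")  # roughly 30,000 bit/s under these assumptions
```

Under these assumed figures the limit falls near 30 kbit/s; with somewhat more bandwidth or a better signal-to-noise ratio the same formula admits rates in the mid-30s of kbit/s, which is roughly where voiceband modems plateaued.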

Richard E. Blahut
Bruce Hajek