In 1936, graduate student Claude Shannon arrived at the Massachusetts Institute of Technology.

In the best tradition of graduate students, Shannon was short of money, and happy to be recruited by his professor, Vannevar Bush, to tend the unwieldy entrails of Bush's mechanical computing device - the Differential Analyser.

The Differential Analyser, while a marvel of scientific engineering for its time, was hard work to maintain. Essentially an assembly of shafts and gears, its gears had to be manually configured to specific ratios before any problem could be 'fed' to the machine - a boring, laborious (and extremely messy) business.

("I had to kind of, you know, fix [it] from time to time to keep it going".)

Encouraged by Bush to base his master's thesis on the logical operation of the Differential Analyser, Shannon inevitably considered ways of improving it, perhaps by using electrical circuits instead of the present cumbersome collection of mechanical parts.

Not long afterwards, it dawned on Shannon that the Boolean algebra he'd learned as an undergraduate mapped neatly onto the behaviour of electric circuits. The next obvious step was to lay out circuitry according to Boolean principles, allowing circuits to test true-or-false propositions as well as calculate problems.

Shannon incorporated his musings into his 1937 master's thesis. The paper, and its author, were hailed as brilliant, and his ideas were almost immediately put into practice in the design of telephone systems. Later, of course, Shannon's thesis came to be seen as a focal point in the development of modern computers.

A half-century later, Shannon laid it all at the feet of Lady Luck:

"It just happened that no one else was familiar with both fields at the same time."

Shannon's later work, "A Mathematical Theory of Communication" (1948), outlining what we now know as Information Theory, described the measurement of information by binary digits representing yes-no alternatives - the fundamental basis of today's telecommunications.

"A Mathematical Theory of Communication" was, luckily, written while Shannon was employed by Bell Labs - luckily, because Shannon hadn't planned on publishing his work, and did so in the end only at the urging of fellow employees.

The paper set out a mathematical definition of information and, probably drawing on his wartime work in cryptography, described how to measure the information in a system by its quantity of disorder - its entropy.

(Information, in this sense, includes messages occurring in any communications medium - television, radio, telephone, data-processing devices such as computers and servomechanisms, even neural networks.)
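Shannon's measure can be stated in a few lines of code. The sketch below (an illustration, not anything from the paper's own text) computes the entropy H = -Σ p·log2(p) of a probability distribution, giving the average number of yes-no alternatives - bits - needed per message:

```python
import math

def shannon_entropy(probs):
    """Entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly one bit per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so each toss carries less information.
print(shannon_entropy([0.9, 0.1]))   # ≈ 0.469

# A certain outcome carries no information at all.
print(shannon_entropy([1.0]))        # 0.0
```

The more disordered (less predictable) the source, the higher the entropy and the more bits each message is worth - the link between disorder and information that the paper formalised.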

The impact of this work was immediate and far-reaching. Lauded as "the Magna Carta of the information age", it was taken up by disciplines as diverse as computer science, genetic engineering and neuroanatomy, which used Shannon's discoveries to solve puzzles as different as error-correcting codes and biological entropy.

Shannon retired at the age of 50, although he published papers sporadically over the next ten years. These days, apparently, his formidable intellect is bent upon more important things - inventing motorised pogo-sticks and generally enjoying life.

Dr. Shannon passed away 24 Feb 2001.

A Note on the Edition

Claude Shannon's "A mathematical theory of communication" was first
published in two parts in the July and October 1948 editions of the
*Bell System Technical Journal* [1].
The paper has appeared in a number of republications since:

- The original 1948 version was reproduced in the collection *Key Papers in the Development of Information Theory* [2]. The paper also appears in *Claude Elwood Shannon: Collected Papers* [3]. The text of the latter is a reproduction from the *Bell Telephone System Technical Publications*, a series of monographs by engineers and scientists of the Bell System published in the *BSTJ* and elsewhere. This version has correct section numbering (the *BSTJ* version has two sections numbered 21), and as far as we can tell, this is the only difference from the *BSTJ* version.
- Prefaced by Warren Weaver's introduction, "Recent contributions to the mathematical theory of communication," the paper was included in *The Mathematical Theory of Communication*, published by the University of Illinois Press in 1949 [4]. The text in this book differs from the original mainly in the following points:
  - the title is changed to "*The* mathematical theory of communication" and some sections have new headings,
  - Appendix 4 is rewritten,
  - the references to unpublished material have been updated to refer to the published material.

The text we present here is based on the *BSTJ* version with a number
of corrections. (The version on this site before May 18th 1998 was based
on the University of Illinois Press version.)

Here you can find a PostScript (460 Kbytes), gzipped PostScript (146 Kbytes) and PDF (358 Kbytes) version of Shannon's paper. PDF files can be viewed with Adobe's Acrobat Reader. A tarred and gzipped archive of the directory (63 Kbytes) containing the LaTeX source for the paper is also available.

[1] C. E. Shannon, "A mathematical theory of communication," *Bell System Technical Journal*, vol. 27, pp. 379-423 and 623-656, July and October 1948.

[2] D. Slepian, editor, *Key Papers in the Development of Information Theory*, New York: IEEE Press, 1974.

[3] N. J. A. Sloane and A. D. Wyner, editors, *Claude Elwood Shannon: Collected Papers*, New York: IEEE Press, 1993.

[4] W. Weaver and C. E. Shannon, *The Mathematical Theory of Communication*, Urbana, Illinois: University of Illinois Press, 1949; republished in paperback 1963.

And just in case something goes awry, here is a PDF version of *A Mathematical Theory of Communication*.