In the best tradition of graduate students, Shannon was short of money, and happy to be recruited by his professor, Vannevar Bush, to tend the unwieldy entrails of Bush's mechanical computing device - the Differential Analyser.
The Differential Analyser, while a marvel of scientific engineering for its time, was hard work to maintain. Essentially an assembly of shafts and gears, it required the gears to be manually configured to specific ratios before any problem could be ‘fed' to the machine - a boring, laborious (and extremely messy) business.
("I had to kind of, you know, fix [it] from time to time to keep it going".)
Encouraged by Bush to base his master's thesis on the logical operation of the Differential Analyser, Shannon inevitably considered ways of improving it, perhaps by replacing its cumbersome collection of mechanical parts with electrical circuits.
Not long afterwards, it dawned on Shannon that the Boolean algebra he'd learned as an undergraduate mapped neatly onto electric circuits: a switch is either open or closed, just as a Boolean proposition is either false or true. The next obvious step was to lay out circuitry according to Boolean principles, allowing circuits to test the truth of propositions as well as calculate problems.
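A minimal sketch of the correspondence (in Python, for illustration - not Shannon's own notation): treating a closed switch as True, switches wired in series behave like AND, and switches wired in parallel behave like OR.

```python
# Illustrative sketch: Boolean algebra as switching circuits.
# A switch is True (closed, conducts) or False (open).

def series(a: bool, b: bool) -> bool:
    """Switches in series conduct only if both are closed: AND."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Switches in parallel conduct if either is closed: OR."""
    return a or b

# Any Boolean proposition becomes a circuit; e.g. (a AND b) OR c:
for a in (False, True):
    for b in (False, True):
        for c in (False, True):
            print(a, b, c, "->", parallel(series(a, b), c))
```

Composing just these two wiring patterns (plus a normally-closed switch for NOT) is enough to realise any Boolean function - which is exactly what made the idea so useful to circuit designers.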
Shannon incorporated his musings into his 1937 master's thesis. The paper, and its author, were hailed as brilliant, and his ideas were almost immediately put into practice in the design of telephone systems. Later, of course, Shannon's thesis came to be seen as a focal point in the development of modern computers.
A half-century later, Shannon laid it all at the feet of Lady Luck:
"It just happened that no one else was familiar with both fields at the same time."
Shannon's later work, "A Mathematical Theory of Communication" (1948), outlined what we now know as Information Theory. It described the measurement of information in binary digits, each representing a yes-no alternative - the fundamental basis of today's telecommunications.
"A Mathematical Theory of Communication" was luckily written while Shannon was employed by Bell Labs - Luckily, because Shannon wasn't planning on publishing his work, and only did so in the end at the urging of fellow employees.
The paper set out a mathematical definition of information and, probably drawing on his wartime work in cryptography, described ways to measure it via the quantity of disorder in a given system, introducing the concept of entropy (see the formula sketched below).
(Information, in this sense, includes messages occurring in any communications medium - television, radio, telephone, data-processing devices such as computers and servomechanisms, even neural networks.)
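Shannon's entropy is worth stating explicitly. For a source emitting symbols with probabilities p_1, ..., p_n, the 1948 paper measures the information per symbol as (given here in LaTeX form; taking logarithms to base 2 gives the answer in binary digits, or bits):

```latex
% Entropy of a discrete source with symbol probabilities
% p_1, ..., p_n; base-2 logarithms give the result in bits.
H = -\sum_{i=1}^{n} p_i \log_2 p_i
```

A fair coin toss, with p_1 = p_2 = 1/2, gives H = 1 bit: exactly one yes-no alternative.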
The impact of this work was immediate and far-reaching. Lauded as "the Magna Carta of the information age", it was taken up by disciplines as diverse as computer science, genetic engineering and neuroanatomy, which used Shannon's discoveries to solve puzzles as different as computer error-correction coding and biological entropy.
Shannon retired at the age of 50, although he published papers sporadically over the next ten years. In retirement, apparently, his formidable intellect was bent upon more important things - inventing motorised pogo-sticks and generally enjoying life.
Dr. Shannon passed away on 24 February 2001.
A Note on the Edition
Claude Shannon's "A Mathematical Theory of Communication" was first published in two parts in the July and October 1948 issues of the Bell System Technical Journal [1]. The paper has been republished a number of times since.
The text we present here is based on the BSTJ version with a number of corrections. (The version on this site before May 18th 1998 was based on the University of Illinois Press version.)
Here you can find PostScript (460 Kbytes), gzipped PostScript (146 Kbytes) and PDF (358 Kbytes) versions of Shannon's paper. PDF files can be viewed with Adobe's Acrobat Reader. A tarred and gzipped archive of the directory (63 Kbytes), containing the LaTeX source for the paper, is also available.
And just in case something goes awry, here is a PDF version of "A Mathematical Theory of Communication".