Claude Shannon was the American mathematician and computer scientist who conceived and laid the foundations of information theory. His theories laid the groundwork for the electronic communications networks that now lace the earth.
Claude Elwood Shannon was born on April 30, 1916 in Petoskey, Michigan. After attending primary and secondary school in his nearby hometown of Gaylord, he earned bachelor's degrees in both electrical engineering and mathematics from the University of Michigan. After graduation, Shannon moved to the Massachusetts Institute of Technology (MIT) to pursue his graduate studies. While at M.I.T., he worked with Dr. Vannevar Bush on one of the early calculating machines, the "differential analyzer," which used a precisely honed system of shafts, gears, wheels and disks to solve equations in calculus. Though analog computers like this turned out to be little more than footnotes in the history of the computer, Dr. Shannon quickly made his mark with digital electronics, a considerably more influential idea. In a prize-winning master's thesis completed in the Department of Mathematics, Shannon proposed a method for applying a mathematical form of logic called Boolean algebra to the design of relay switching circuits. This innovation, credited as the advance that transformed circuit design "from an art to a science," remains the basis for circuit and chip design to this day. Shannon received both a master's degree in electrical engineering and his Ph.D. in mathematics from M.I.T. in 1940.
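The core observation of that thesis can be sketched in a few lines of modern Python (an illustrative sketch, not Shannon's own relay notation): if a switch is modeled as true (closed) or false (open), then switches wired in series behave like a logical AND and switches wired in parallel behave like a logical OR, so any such circuit can be analyzed with Boolean algebra.

```python
# Illustrative sketch of the idea in Shannon's master's thesis:
# relay switching circuits obey Boolean algebra. A switch is modeled
# as True (closed) or False (open).

def series(*switches):
    # Current flows through a series chain only if every switch is closed: AND.
    return all(switches)

def parallel(*switches):
    # Current flows through parallel branches if any switch is closed: OR.
    return any(switches)

# Example circuit: a lamp lights when switch a is closed and either b or c
# is closed, i.e. the Boolean expression a AND (b OR c).
for a in (False, True):
    for b in (False, True):
        for c in (False, True):
            lamp = series(a, parallel(b, c))
            print(f"a={a!s:5} b={b!s:5} c={c!s:5} -> lamp={lamp}")
```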
In 1941, Shannon took a position at Bell Labs, where he had spent several prior summers. His wartime work on secret communication systems was used to build the system over which Roosevelt and Churchill communicated during the war. When his results were finally declassified and published in 1949, they revolutionized the field of cryptography. Understanding, before almost anyone, the power that springs from encoding information in a simple language of 1's and 0's, Dr. Shannon, as a young scientist at Bell Laboratories, wrote two papers that remain monuments in the fields of computer science and information theory. "Shannon was the person who saw that the binary digit was the fundamental element in all of communication," said Dr. Robert G. Gallager, a professor of electrical engineering who worked with Dr. Shannon at the Massachusetts Institute of Technology. "That was really his discovery, and from it the whole communications revolution has sprung."
Shannon's most important paper, "A Mathematical Theory of Communication," was published in 1948. This fundamental treatise both defined a mathematical notion by which information could be quantified and demonstrated that information could be delivered reliably over imperfect communication channels like phone lines or wireless connections. These groundbreaking innovations provided the tools that ushered in the information age. As noted by Ioan James, Shannon biographer for the Royal Society, "So wide were its repercussions that the theory was described as one of humanity's proudest and rarest creations, a general scientific theory that could profoundly and rapidly alter humanity's view of the world." Shannon went on to develop many other important ideas whose impact expanded well beyond the field of "information theory" spawned by his 1948 paper.
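The quantity Shannon defined in that paper, now called entropy, measures information in bits. As a rough sketch in modern terms (the formula is Shannon's; the short program below merely illustrates it), a source emitting symbols with probabilities p_i carries H = -sum(p_i * log2(p_i)) bits per symbol, so a fair coin flip carries exactly one bit while a heavily biased coin carries much less.

```python
import math

def entropy(probabilities):
    # Shannon entropy in bits: H = -sum(p * log2(p)) over the nonzero probabilities.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))    # fair coin: 1.0 bit
print(entropy([0.9, 0.1]))    # biased coin: about 0.47 bits
print(entropy([0.25] * 4))    # one of four equally likely symbols: 2.0 bits
```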
Shannon approached research with a sense of curiosity, humor, and fun. An accomplished unicyclist, he was famous for cycling the halls of Bell Labs at night, juggling as he went. His later work on chess-playing machines and an electronic mouse that could run a maze helped create the field of artificial intelligence, the effort to make machines that think. And his ability to combine abstract thinking with a practical approach (he had a penchant for building machines) inspired a generation of computer scientists. Dr. Marvin Minsky of M.I.T., who as a young theorist worked closely with Dr. Shannon, was struck by his enthusiasm and enterprise. "Whatever came up, he engaged it with joy, and he attacked it with some surprising resource, which might be some new kind of technical concept or a hammer and saw with some scraps of wood," Dr. Minsky said. "For him, the harder a problem might seem, the better the chance to find something new."
While Shannon worked in a field for which no Nobel prize is offered, his work was richly rewarded by honors including the National Medal of Science (1966) and honorary degrees from Yale (1954), Michigan (1961), Princeton (1962), Edinburgh (1964), Pittsburgh (1964), Northwestern (1970), Oxford (1978), East Anglia (1982), Carnegie-Mellon (1984), Tufts (1987), and the University of Pennsylvania (1991). He was also the first recipient of the Harvey Prize (1972), the Kyoto Prize (1985), and the Shannon Award (1973). The last of these awards, named in his honor, is given by the Information Theory Society of the Institute of Electrical and Electronics Engineers (IEEE) and remains the highest possible honor in the community of researchers dedicated to the field that he invented. His Collected Papers, published in 1993, contains 127 publications on topics ranging from communications to computing, and juggling to "mind-reading" machines.
Shannon died on Saturday, February 24, 2001 in Medford, Mass., after a long fight with Alzheimer's disease. He was 84.
For more information about Shannon and his impact, see the article by Michelle Effros and H. Vincent Poor, "Claude Shannon: His Work and Its Legacy," published with the permission of the EMS Newsletter and reprinted from N°103 (March 2017), pp. 29-34. Slides of the corresponding talk are also available.