Sunday, October 4, 2009

Claude Shannon, the Father of Telecommunications!



Shannon, Claude Elwood (1916-2001)

American mathematician and father of information theory. Claude Elwood Shannon was born in Gaylord, Michigan on April 30, 1916 to Claude Elwood and Mabel Wolf Shannon. Shannon's father, Claude, was a judge in Gaylord, a small town with a population of about three thousand, and his mother, Mabel, was the principal of the local high school. Although neither parent worked in the field of mathematics, Shannon proved to be mathematically precocious. There was not much scientific influence from Shannon's father, but he received scientific encouragement from his grandfather, an inventor and farmer whose inventions included a washing machine and farming machinery.

From an early age, Shannon showed an affinity for both engineering and mathematics, and he graduated from the University of Michigan with degrees in both disciplines. For his advanced degrees, he chose to attend the Massachusetts Institute of Technology. At the time, MIT was one of a number of prestigious institutions conducting research that would eventually form the basis for what is now known as the information sciences. Its faculty included mathematician Norbert Wiener, who would later coin the term cybernetics to describe the work in information theories that he, Shannon, and other leading American mathematicians were conducting. It also included Vannevar Bush, MIT's dean of engineering, who in the early 1930s had built an analog computer called the Differential Analyzer. The Differential Analyzer was developed to calculate complex equations that tabulators and calculators of the day were unable to address. It was a mechanical computer, using a series of gears and shafts to engage cogs until the equation was solved. Once it completed its cycle, the answer to the equation was obtained by measuring the changes in position of its various machine parts. Its only electrical parts were the motors used to drive the gears. With its crude rods, gears, and axles, the analyzer looked like a child's erector set. Setting it up to work one equation could take two to three days; solving the same equation could take just as long, if not longer. In order to work a new problem, the entire machine, which took up several hundred feet of floor space, had to be torn apart and reset to a new mechanical configuration.

While at MIT, Shannon studied with both Wiener and Bush. Noted as a 'tinkerer,' he was ideally suited to working on the Differential Analyzer, and would set it up to run equations for other scientists. At Bush's suggestion, Shannon also studied the operation of the analyzer's relay circuits for his master's thesis. This analysis formed the basis for Shannon's influential 1938 paper "A Symbolic Analysis of Relay and Switching Circuits," in which he put forth his developing theories on the relationship of symbolic logic to relay circuits. This paper, and the theories it contained, would have a seminal impact on the development of information processing machines and systems in the years to come.

Shannon is known as the father of information theory. He combined mathematical theories with engineering principles to set the stage for the development of the digital computer and the modern digital communication revolution. His seminal 1948 paper opened up a whole new world and fundamentally changed the way in which the communication of information is viewed.

Shannon graduated from MIT in 1940 with both a master's degree and a doctorate in mathematics. After graduation, he spent a year as a National Research Fellow at the Institute for Advanced Study in Princeton, New Jersey, where he worked with mathematician and physicist Hermann Weyl. In 1941, Shannon joined Bell Telephone Laboratories, where he became a member of a group of scientists charged with developing more efficient methods of transmitting information and improving the reliability of long-distance telephone and telegraph lines.

One of the most important features of Shannon's theory was the concept of information entropy, which he demonstrated to be a measure of the uncertainty, or missing information, in a message. According to the second law of thermodynamics, the entropy, the degree of randomness, in any closed system always increases. Because natural language is highly redundant, many sentences can be significantly shortened without losing their meaning. Shannon proved that over a noisy channel a signal could always be sent without distortion: if the message is encoded in such a way that it is self-checking, signals will be received with the same accuracy as if there were no interference on the line. A language, for example, has a built-in error-correcting code; a noisy party conversation remains partly intelligible precisely because roughly half the language is redundant. Shannon's methods were soon seen to have applications not only to computer design but to virtually every subject in which language was important, such as linguistics, psychology, cryptography, and phonetics.
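To make the entropy idea concrete, here is a minimal Python sketch (not from Shannon's paper; the sample sentence is an arbitrary illustration) that computes the entropy of a message's character distribution:

```python
from collections import Counter
from math import log2

def entropy(message: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# English text is redundant: its per-character entropy falls well below
# the ~4.7 bits/symbol a uniform 26-letter alphabet would give.
print(entropy("the quick brown fox jumps over the lazy dog"))
```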

Shannon also showed that if enough extra bits were added to a message to help correct for errors, it could tunnel through the noisiest channel, arriving unscathed at the other end. This insight has been developed over the decades into sophisticated error-correcting codes that ensure the integrity of the data on which society depends. While working at Bell Labs, Shannon and Richard Hamming developed the concept of the error-correcting code.
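Hamming's actual codes are more elaborate, but the principle that redundant bits let a receiver correct errors can be sketched with the simplest possible scheme, a three-fold repetition code. The example below illustrates the idea only; it is not Shannon's or Hamming's construction:

```python
def encode(bits):
    """Triple each bit so a single flipped copy can be outvoted."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(received):
    """Majority vote over each group of three copies."""
    return [int(sum(received[i:i+3]) >= 2) for i in range(0, len(received), 3)]

message = [1, 0, 1, 1]
sent = encode(message)
sent[4] ^= 1                     # noise flips one transmitted bit
assert decode(sent) == message   # the error is corrected by majority vote
```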

Shannon's paper provided a glimpse into the future of information processing. While studying the relay switches on the Differential Analyzer as they worked through an equation, Shannon noted that the switches were always either open or closed, on or off. This led him to think about a mathematical way to describe the open and closed states, and he recalled the logical theories of mathematician George Boole, who in the mid-1800s advanced what he called the logic of thought, in which all logical statements were reduced to a binary system of zeros and ones.

Boole's theory, which formed the basis for Boolean algebra, stated that a statement of logic carries a one if true and a zero if false. Shannon theorized that a switch in the on position would equate to a Boolean one; in the off position, it was a zero. By reducing information to a series of ones and zeros, Shannon wrote, information could be processed by using on-off switches. He also suggested that these switches could be connected in such a way as to allow them to perform more complex operations, going beyond simple 'yes' and 'no' statements to 'and', 'or', and 'not' operations. Understanding, before almost anyone, the power that springs from encoding information in a simple language of 1s and 0s, Shannon as a young man wrote two papers that remain monuments in the fields of computer science and information theory.
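A small sketch of the correspondence Shannon described, with closed and open switches modeled as 1 and 0 (the function names here are ours, for illustration): switches wired in series behave like AND, switches wired in parallel behave like OR:

```python
def AND(a, b):
    # Two switches in series: current flows only if both are closed.
    return a & b

def OR(a, b):
    # Two switches in parallel: current flows if either is closed.
    return a | b

def NOT(a):
    # An inverting relay: the output is the opposite of the input.
    return a ^ 1

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", AND(a, b), OR(a, b), NOT(a))
```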

Shannon believed that information was no different from any other quantity and therefore could be manipulated by a machine. He applied his earlier research to the problem at hand, again using Boolean logic to develop a model that reduced information to its simplest form: a binary system of yes/no choices, which could be represented by a 1/0 binary code. By applying set codes to information as it was transmitted, the noise it picked up during transmission could be minimized, thereby improving the quality of information transmission.

In the late 1940s, Shannon's research was presented in The Mathematical Theory of Communication, which he co-authored with mathematician Warren Weaver. It was in this work that Shannon first introduced the word 'bit,' formed from the first two letters and the last letter of 'binary digit' and coined by his colleague John W. Tukey, to describe the yes-no decision that lay at the core of his theories.

Shannon's most important scientific contribution was his work on communication. In 1941 he began a serious study of communication problems, partly motivated by the demands of the war effort. This research resulted in the classic 1948 paper "A Mathematical Theory of Communication." This pioneering paper begins by observing that 'the fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point.' The results were so breathtakingly original that it took some time for the mathematical and engineering community to realize their significance. With the publication of that single paper a brand-new science, information theory, had been created, and the framework and terminology Shannon established remain standard even today.

In the 1950s, Shannon turned his efforts to developing what was then called "intelligent machines"--mechanisms that emulated the operations of the human mind to solve problems. Of his inventions during that time, the best known was a maze-solving mouse called Theseus, which used magnetic relays to learn how to maneuver through a metal maze.
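Theseus found its way with relay memory and trial and error; purely as a loose modern analogue, the sketch below solves a small grid maze in software with a breadth-first search. The maze layout is invented for illustration and the approach is not Shannon's mechanism:

```python
from collections import deque

MAZE = ["S.#.",
        ".#..",
        "...#",
        "#..G"]  # hypothetical maze: S start, G goal, # wall

def solve(maze):
    rows, cols = len(maze), len(maze[0])
    start = next((r, c) for r in range(rows) for c in range(cols)
                 if maze[r][c] == "S")
    queue, seen = deque([(start, [start])]), {start}
    while queue:
        (r, c), path = queue.popleft()
        if maze[r][c] == "G":
            return path  # shortest sequence of cells from S to G
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and maze[nr][nc] != "#" and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), path + [(nr, nc)]))

print(solve(MAZE))
```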

During a visit to the United States during World War II, Alan Turing, a leading British mathematician, spent a few months working with Shannon. Both scientists were interested in the possibility of building a machine that could imitate the human brain. They also worked together to build an encrypted voice phone that would allow Roosevelt to hold a secure transatlantic conversation with Churchill.

Shannon's information theories eventually saw application in a number of disciplines in which language is a factor, including linguistics, phonetics, psychology, and cryptography, which was an early love of Shannon's. His theories also became a cornerstone of the developing field of artificial intelligence, and in 1956 he was instrumental in convening a conference at Dartmouth College that was the first major effort in organizing artificial intelligence research. He wrote a paper entitled "Programming a Computer for Playing Chess" in 1950, and developed a chess-playing computer. Many years later, in 1965, he met the world chess champion Mikhail Botvinnik (also an electrical engineer) and played a match with him, but lost after 42 moves.
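Shannon's chess paper proposed scoring positions and searching the game tree for the best move. The sketch below shows that minimax idea on a hypothetical two-ply tree of plain scores rather than real chess positions:

```python
def minimax(node, maximizing):
    """Evaluate a game tree: leaves are scores, inner nodes are lists of children."""
    if isinstance(node, int):   # leaf: a static evaluation score
        return node
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# The maximizing player picks the branch whose worst-case
# (opponent-minimized) outcome is best.
tree = [[3, 5], [2, 9], [0, 7]]
print(minimax(tree, True))  # -> 3
```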

Shannon's interests did not stop there. He was an expert juggler who was often seen juggling three balls while riding a unicycle. He was an accomplished clarinet player, and he was even known to pen light verse.

"Shannon was the person who saw that the binary digit was the fundamental element in all of communication," said Robert G. Gallager, a professor of electrical engineering who worked with Shannon at the Massachusetts Institute of Technology. "That was really his discovery, and from it the whole communications revolution has sprung." Marvin Minsky of M.I.T., who as a young theorist worked closely with Shannon, was struck by his enthusiasm and enterprise. "Whatever came up, he engaged it with joy, and he attacked it with some surprising resource--which might be some new kind of technical concept or a hammer and saw with some scraps of wood," Minsky said. "For him, the harder a problem might seem, the better the chance to find something new."

Shannon was the recipient of numerous honorary degrees and awards. His published and unpublished documents (127 of them) cover an unbelievably wide spectrum of areas, and many of them have been a priceless source of research ideas for others. One could say that there would be no Internet without Shannon's theory of information; every modem, every compressed file, every error-correcting code owes something to Shannon.

Shannon died at age 84 on February 27, 2001, in Medford, Massachusetts, after a long fight with Alzheimer's disease. He was survived by his three children, Robert James, Andrew Moore, and Margarita Catherine, and by his wife, Mary Elizabeth Moore, whom he had married on March 27, 1949.

This biography was taken from: http://scienceworld.wolfram.com/biography/Shannon.html
