
1 Claude Elwood Shannon Information Theory Differential Analyzer
Claude E. Shannon was born in Petoskey, Michigan, on April 30, 1916, to Claude Shannon Sr. and Mabel Wolf Shannon. His early years greatly influenced where his life would take him: his deep interest in science came from his grandfather, an established tinkerer and inventor who held a patent on a washing machine. In 1932 Shannon left his hometown of Gaylord, Michigan, to study at the University of Michigan, where he earned bachelor's degrees in both mathematics and electrical engineering. He then attended MIT as a graduate student, working in its Department of Electrical Engineering with Vannevar Bush and his Differential Analyzer, and he later taught at MIT. Shannon is best known for his ingenuity concerning information and how it can be represented, transferred, encrypted, and decrypted. Yet for all he is credited with, many of his other fascinating inventions go unnoticed; among the better-known are an off-center unicycle, the "Ultimate Machine" (a box with a large switch on its side that, when switched on, revealed an arm that reached out and flipped the switch back off), and Theseus, Shannon's maze-solving mouse. Growing up, Shannon openly admired Thomas Edison, and he later discovered that he and Edison were distantly related through John Ogden, a colonial leader; no wonder he was a talented inventor. After an undoubtedly fulfilling life, Shannon died of Alzheimer's disease in February 2001, at the age of 84.

Information Theory
In simplest terms, information theory is communication by selection, and it is only possible through language. Any language allows us to take an object, or a thought or mental image, and break it into conceptual chunks. It was Shannon who discovered that information, no matter its form, could be represented using a fundamental unit: the bit. This solved many problems, such as sending and receiving data over a "noisy" channel, as well as encryption and decryption challenges. Another very important aspect of information theory is "channel capacity." An obituary in MIT News states, "All communication lines today are measured in bits per second, the notion that Professor Shannon made precise with 'channel capacity.'" Using his concept of entropy, he was also able to calculate how much of a message could be lost in transmission without the message being distorted. Shannon pioneered the ideas and real-world applications of information theory that led to CDs, deep-space communication, and the bit-based storage of pictures, voice streams, and other data in computers.

Differential Analyzer
The Differential Analyzer was conceived by Vannevar Bush and his students in the 1920s (completed in 1931) and was built from a complicated system of gears, pulleys, and rods. Naturally, Shannon's interest was piqued by the complexity and motion of this machine. Unlike modern computers, it did not represent mathematical variables with 1s and 0s but with the continuous physical motion of the rods. As Shannon spent time with the machine (he maintained it and programmed it for other scientists to use), he discovered that the relays it used closely resembled symbolic logic: each physical switch was either open or closed, a concept exactly matching a binary choice in logic.
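To make that switch-and-logic correspondence concrete, here is a minimal Python sketch (my illustration, not part of the presentation): treating each relay contact as a Boolean variable, series wiring behaves like AND and parallel wiring like OR, the mapping Shannon worked out in his 1937 master's thesis.

def series(a: bool, b: bool) -> bool:
    # Current flows through two contacts in series only if both are closed: AND.
    return a and b

def parallel(a: bool, b: bool) -> bool:
    # Current flows through two contacts in parallel if either is closed: OR.
    return a or b

# A lamp wired through switch x in series with (y in parallel with z)
# computes x AND (y OR z); the loop prints the circuit's full truth table.
for x in (False, True):
    for y in (False, True):
        for z in (False, True):
            lamp = series(x, parallel(y, z))
            print(f"x={x:d} y={y:d} z={z:d} -> lamp={lamp:d}")

Any switching network built from series and parallel contacts can be analyzed this way, which is what made Boolean algebra such a natural fit for relay circuits.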
This association between open-or-closed switches and Boolean logic led to the spark behind much of what we now take for granted about information. Shannon confirmed, "…I realized that Boolean algebra was just the thing to take care of relay circuits and switching circuits." By developing these concepts further, Shannon theorized and later proved the concepts of "channel capacity," the "bit," and entropy. Channel capacity is straightforward: it is the maximum amount of information a given transmission method can transfer at one time, measured in bits per second (my house supposedly gets 7.5 megabits per second). The bit, represented as 1s and 0s in a computer, corresponds to a "yes or no," as in Boolean algebra. Entropy is the name Shannon gave to the concept he developed to calculate precisely how much information could be lost from a message without distorting it; it can be thought of as a sort of "information scale" (a small worked example appears at the end of this slide).

Tid-Bits
Shannon co-authored the "Proposal for the Dartmouth Summer Research Project on Artificial Intelligence," which bolstered interest in AI and marked the first appearance of the term. A man named Henry Quastler calculated that the information quantity (H) contained in a human being is approximately 2 × 10^28 bits.
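As a concrete illustration of the entropy "information scale" described above, here is a short Python sketch (my addition; shannon_entropy is an illustrative helper name, not from the presentation) that computes H = -Σ p·log2(p) over a message's symbol frequencies, i.e., the average number of bits per symbol needed to encode the message without loss.

import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    # H = -sum over symbols of p * log2(p), where p is each symbol's frequency.
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Four equally likely symbols need log2(4) = 2 bits per symbol:
print(shannon_entropy("abcdabcd"))  # 2.0
# A highly repetitive message carries far less information per symbol:
print(shannon_entropy("aaaaaaab"))  # about 0.544

The repetitive message compresses well precisely because its entropy is low; a uniformly random message cannot be compressed below its entropy.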

2 Sources
http://en.wikipedia.org/wiki/Claude_Shannon
"What is Information Theory"

