Claude Shannon


Information Theory
Shannon–Fano coding
Shannon–Hartley law
Nyquist–Shannon sampling theorem
Noisy channel coding theorem
Shannon switching game
Shannon number
Shannon index
Shannon's source coding theorem
Shannon's expansion
Shannon–Weaver model of communication

Claude Elwood Shannon (April 30, 1916 – February 24, 2001) was an American mathematician, electronic engineer, and cryptographer known as "the father of information theory".[1]

Shannon is famous for having founded information theory with a single landmark paper published in 1948. He is also credited with founding both digital computer and digital circuit design theory in 1937, when, as a 21-year-old master's student at MIT, he wrote a thesis demonstrating that electrical applications of Boolean algebra could construct and resolve any logical, numerical relationship. It has been claimed that this was the most important master's thesis of all time.[2] Shannon also contributed to the field of cryptanalysis during World War II and afterwards, including basic work on codebreaking.
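The core idea of the 1937 thesis can be illustrated with a small sketch (not from the source; the function names and bit encoding here are illustrative): because switching circuits realize Boolean algebra, numerical relationships such as addition can be built entirely from logic gates. Below, a one-bit full adder is composed from AND, OR, and XOR, then chained into a ripple-carry adder over little-endian bit lists.

```python
# Illustrative sketch: arithmetic from pure Boolean operations,
# in the spirit of Shannon's switching-circuit thesis.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    """Return (sum_bit, carry_out) for one binary digit, using only gates."""
    partial = XOR(a, b)
    sum_bit = XOR(partial, carry_in)
    carry_out = OR(AND(a, b), AND(partial, carry_in))
    return sum_bit, carry_out

def add_bits(x_bits, y_bits):
    """Ripple-carry addition of two equal-length little-endian bit lists."""
    carry = 0
    out = []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    out.append(carry)  # final carry becomes the high-order bit
    return out

# 3 (= [1,1,0] little-endian) + 5 (= [1,0,1]) = 8 (= [0,0,0,1])
print(add_bits([1, 1, 0], [1, 0, 1]))
```

The same composition argument extends to any Boolean function, which is why the thesis is read as founding digital circuit design.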


related documents
Turing Award
Dave Winer
ArXiv
John Vincent Atanasoff
RSS
Request for Comments
Electronic mailing list
Slashdot
Bjørn Lomborg
Tim Berners-Lee
Quotation
Alexander Grothendieck
Writer
Kristen Nygaard
FAQ
The Inquirer
New Scientist
Douglas Engelbart
Encyclopædia Britannica Eleventh Edition
Kathy Acker
John Cawte Beaglehole
Donald Knuth
Zine