Information theory

Information theory is a branch of applied mathematics and electrical engineering concerned with the quantification of information. It was developed by Claude E. Shannon to find fundamental limits on signal processing operations such as compressing data and reliably storing and communicating data. Since its inception it has broadened to find applications in many other areas, including statistical inference, natural language processing, cryptography, networks other than communication networks (as in neurobiology),[1] the evolution[2] and function[3] of molecular codes, model selection[4] in ecology, thermal physics,[5] quantum computing, plagiarism detection,[6] and other forms of data analysis.[7]

A key measure of information is known as entropy, which is usually expressed as the average number of bits needed for storage or communication. Entropy quantifies the uncertainty involved in predicting the value of a random variable. For example, specifying the outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy) than specifying the outcome of a roll of a fair die (six equally likely outcomes).
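
To make the coin and die comparison concrete, the short Python sketch below (written for this excerpt, not taken from the article) computes the Shannon entropy H = -Σ p·log2(p) of each distribution, in bits.

import math

def entropy(probabilities):
    # Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

fair_coin = [0.5, 0.5]       # two equally likely outcomes
fair_die = [1.0 / 6] * 6     # six equally likely outcomes

print(entropy(fair_coin))    # 1.0 bit per flip
print(entropy(fair_die))     # about 2.585 bits per roll

The die's larger entropy reflects the greater uncertainty of its outcome: more bits are needed, on average, to record which of its six faces came up.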

Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s), and channel coding (e.g. for DSL lines). The field lies at the intersection of mathematics, statistics, computer science, physics, neurobiology, and electrical engineering. Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones, the development of the Internet, the study of linguistics and of human perception, the understanding of black holes, and numerous other fields. Important sub-fields of information theory include source coding, channel coding, algorithmic complexity theory, algorithmic information theory, information-theoretic security, and measures of information.
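
As a rough illustration of lossless data compression, the sketch below uses Python's standard zlib module (a DEFLATE implementation chosen here for convenience; the article does not name any particular tool) to show that a low-entropy, highly repetitive byte string compresses far more than random data, in line with entropy acting as a limit on achievable compression.

import os
import zlib

repetitive = b"ab" * 50_000           # low-entropy input: a repeating two-byte pattern
random_data = os.urandom(100_000)     # high-entropy input: effectively incompressible bytes

print(len(zlib.compress(repetitive)))    # a few hundred bytes out of 100,000
print(len(zlib.compress(random_data)))   # close to 100,000 bytes; random data barely compresses

The exact compressed sizes depend on the compressor; the point is only the qualitative gap that entropy predicts between redundant and random sources.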

related documents
Self (programming language)
Pascal (programming language)
Tar (file format)
Structured programming
Reference counting
Object-oriented programming
SQL
Exception handling
Fuzzy control system
Lightweight Directory Access Protocol
Aspect-oriented programming
Dylan (programming language)
MUMPS
Abstraction (computer science)
Control flow
JavaScript
Busy beaver
Preprocessor
Hamming code
Communication complexity
White noise
MATLAB
Wikipedia:Searching
Programming language
PL/SQL
Icon (programming language)
SHA hash functions
Operator
Dirac delta function
Metric space