Shannon–Hartley theorem

In information theory, the Shannon–Hartley theorem (also known as Shannon's law) is an application of the noisy channel coding theorem to the archetypal case of a continuous-time analog communications channel subject to Gaussian noise. The theorem establishes Shannon's channel capacity for such a communication link: a bound on the maximum amount of error-free digital data (that is, information) that can be transmitted with a specified bandwidth in the presence of noise interference, assuming (a) the signal power is bounded, and (b) the Gaussian noise process is characterized by a known power or power spectral density. The law is named after Claude Shannon and Ralph Hartley.

Statement of the theorem

Considering all possible multi-level and multi-phase encoding techniques, the Shannon–Hartley theorem states that the channel capacity C, meaning the theoretical tightest upper bound on the information rate of clean (or arbitrarily low bit-error-rate) data, excluding error-correcting codes, that can be sent with a given average signal power S through an analog communication channel subject to additive white Gaussian noise of power N, is

    C = B log2(1 + S/N)

where

C is the channel capacity in bits per second;
B is the bandwidth of the channel in hertz;
S is the average received signal power over the bandwidth, in watts;
N is the average noise power over the bandwidth, in watts; and
S/N is the signal-to-noise ratio (SNR), expressed as a linear power ratio (not in logarithmic decibels).
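As a quick numerical illustration of the formula, here is a minimal Python sketch; the function name shannon_capacity and the telephone-grade figures (3 kHz bandwidth, 30 dB SNR) are illustrative assumptions, not taken from the text above.

import math

def shannon_capacity(bandwidth_hz, snr_linear):
    # Shannon-Hartley capacity of an AWGN channel: C = B * log2(1 + S/N).
    # snr_linear must be a plain power ratio, not a value in decibels.
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative example: a roughly telephone-grade channel with
# 3 kHz of bandwidth and a 30 dB signal-to-noise ratio.
snr_db = 30.0
snr_linear = 10 ** (snr_db / 10)              # 30 dB -> power ratio of 1000
print(shannon_capacity(3000.0, snr_linear))   # about 29,900 bit/s

Note the asymmetry the formula implies: doubling the bandwidth B doubles the capacity, while doubling an already-large SNR adds only about B bits per second, since capacity grows only logarithmically with signal power.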
