

In cryptography, key size or key length is the size, measured in bits,[1] of the key used in a cryptographic algorithm (such as a cipher). An algorithm's key length is distinct from its cryptographic security, which is a logarithmic measure of the fastest known computational attack on the algorithm, also measured in bits. The security of an algorithm cannot exceed its key length (since any algorithm can be cracked by brute force), but it can be smaller. For example, Triple DES has a key size of 168 bits but provides at most 112 bits of security, since an attack of complexity 2^112 is known. This property of Triple DES is not a weakness provided 112 bits of security is sufficient for an application. Most symmetric-key algorithms in common use are designed to have security equal to their key length. No asymmetric-key algorithms with this property are known; elliptic curve cryptography comes the closest, with an effective security of roughly half its key length.
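The relationship between key length and security can be made concrete: security in bits is the base-2 logarithm of the cost of the fastest known attack, and brute force over an n-bit key costs at most 2^n operations. A minimal sketch in Python, using Triple DES's figures from above:

```python
import math

def security_bits(attack_operations: int) -> float:
    """Security in bits = log2 of the fastest known attack's cost."""
    return math.log2(attack_operations)

# Brute force over an n-bit key costs at most 2**n operations,
# so an algorithm's security can never exceed its key length.
key_length = 168                    # Triple DES key size in bits
brute_force_cost = 2 ** key_length
best_known_attack = 2 ** 112        # cost of the known attack on Triple DES

print(security_bits(brute_force_cost))   # 168.0 (upper bound from key length)
print(security_bits(best_known_attack))  # 112.0 (effective security)
```

This shows why Triple DES's 168-bit key yields only 112 bits of security: the logarithmic measure tracks the cheapest attack, not the key length.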
Significance
Keys are used to control the operation of a cipher so that only the correct key can convert encrypted text (ciphertext) to plaintext. Many ciphers are based on publicly known algorithms or are open source, and so it is only the difficulty of obtaining the key that determines the security of the system, provided that there is no analytic attack (i.e., a 'structural weakness' in the algorithms or protocols used), and assuming that the key is not otherwise available (such as via theft, extortion, or compromise of computer systems). The widely accepted notion that the security of the system should depend on the key alone was explicitly formulated by Auguste Kerckhoffs (in the 1880s) and Claude Shannon (in the 1940s); the statements are known as Kerckhoffs' principle and Shannon's Maxim respectively.
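Kerckhoffs' principle can be illustrated with a toy cipher whose algorithm is fully public, so that the key is the attacker's only obstacle. The sketch below (a hypothetical repeating-key XOR cipher, deliberately insecure) shows how a 16-bit key space falls to brute force in at most 2^16 trials:

```python
from itertools import product

def toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Publicly known "algorithm": repeating-key XOR (a toy, NOT secure).
    # XOR is its own inverse, so the same function also decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

# With the algorithm public, only the key protects the data.
# A 2-byte (16-bit) key falls to brute force in at most 2**16 trials.
secret_key = b"\x3a\x7f"
known_plaintext = b"attack at dawn"
ciphertext = toy_encrypt(known_plaintext, secret_key)

for candidate in product(range(256), repeat=2):
    key = bytes(candidate)
    if toy_encrypt(ciphertext, key) == known_plaintext:
        print(f"recovered key: {key.hex()}")  # recovered key: 3a7f
        break
```

Scaling the key to 128 bits makes the same exhaustive search require 2^128 trials, which is computationally infeasible; this is exactly the margin a sufficiently large key is meant to provide.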
A key should therefore be large enough that a brute force attack (possible against any encryption algorithm) is infeasible – i.e., would take too long to execute. Shannon's work on information theory showed that to achieve so-called perfect secrecy, the key must be at least as long as the message to be transmitted and used only once (this algorithm is called the one-time pad). In light of this, and the practical difficulty of managing such long keys, modern cryptographic practice has discarded the notion of perfect secrecy as a requirement for encryption, and instead focuses on computational security, under which the computational requirements of breaking an encrypted text must be infeasible for an attacker.
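The one-time pad mentioned above is simple to express in code: a truly random key exactly as long as the message, used once, combined with the message by XOR. A minimal sketch (the helper name `otp_xor` is illustrative):

```python
import secrets

def otp_xor(data: bytes, key: bytes) -> bytes:
    # One-time pad: the key must be truly random, as long as the message,
    # and never reused. Encryption and decryption are the same XOR.
    assert len(key) == len(data), "key must be exactly as long as the message"
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet me at noon"
key = secrets.token_bytes(len(message))  # fresh random key for this message only

ciphertext = otp_xor(message, key)
assert otp_xor(ciphertext, key) == message  # XOR with the same key decrypts
```

The impracticality is visible directly: every message needs a new key of equal length, securely shared in advance, which is exactly the key-management burden that pushed practice toward computational security instead.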