Security through obscurity

Security through (or by) obscurity is a pejorative term for a principle in security engineering that attempts to provide security through secrecy of design, implementation, and the like. A system relying on security through obscurity may have theoretical or actual security vulnerabilities, but its owners or designers believe that the flaws are not known and that attackers are unlikely to find them. A system may also use security through obscurity as a defense-in-depth measure: even when all known vulnerabilities are mitigated through other means, public disclosure of the products and versions in use makes a system an early target for newly discovered vulnerabilities in those products and versions. An attacker's first step is usually information gathering, and security through obscurity delays that step. The technique stands in contrast with security by design, although many real-world projects include elements of both strategies.
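The contrast between the two approaches can be sketched in code. The following is a minimal, hypothetical illustration (all names and paths are invented for this sketch, not taken from any real system): one route is "protected" only by an unguessable path, while another requires a verifiable credential regardless of whether its location is known.

```python
import hashlib
import hmac

# Security through obscurity: the secrecy of the path itself is the
# only defense; anyone who learns or guesses it gets full access.
OBSCURE_PATH = "/admin-x9f2k"

# Security by design: access depends on a checkable secret, so the
# route can be publicly known without weakening the system.
API_KEY_HASH = hashlib.sha256(b"correct-key").hexdigest()

def handle_request(path: str, api_key: str = "") -> str:
    if path == OBSCURE_PATH:
        # No credential check at all; discovery of the path defeats it.
        return "admin panel (obscurity only)"
    if path == "/admin":
        supplied = hashlib.sha256(api_key.encode()).hexdigest()
        # Constant-time comparison avoids leaking information via timing.
        if hmac.compare_digest(supplied, API_KEY_HASH):
            return "admin panel (authenticated)"
        return "403 Forbidden"
    return "404 Not Found"
```

Note that the two are not mutually exclusive: keeping `/admin` at a non-obvious path while still requiring the key would be obscurity used as one layer of defense in depth, which is the combination the paragraph above describes.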



There is scant formal literature on the issue of security through obscurity. Books on security engineering will cite Kerckhoffs' doctrine from 1883, if they cite anything at all; one example is the discussion of secrecy and openness in Nuclear Command and Control.[1]

In the field of legal academia, Peter Swire has written about the trade-off between the notion that "security through obscurity is an illusion" and the military notion that "loose lips sink ships"[3] as well as how competition affects the incentives to disclose.[4]

The principle of security through obscurity was more generally accepted in cryptographic work in the days when essentially all well-informed cryptographers were employed by national intelligence agencies such as the National Security Agency. Now that cryptographers often work at universities, where researchers publish many or even all of their results and publicly test others' designs, or in private industry, where results are more often controlled by patents and copyrights than by secrecy, the argument has lost some of its former popularity. An example is PGP, released as source code and generally regarded (when properly used) as a military-grade cryptosystem. The wide availability of high-quality cryptography was disturbing to the US government, which seems to have relied on a security-through-obscurity analysis to support its opposition to such work. Indeed, such reasoning is very often used by lawyers and administrators to justify policies designed to restrict high-quality cryptography to those authorized to use it.[citation needed]
