Logic programming

Logic programming is, in its broadest sense, the use of mathematical logic for computer programming. In this view of logic programming, which can be traced at least as far back as John McCarthy's [1958] advice-taker proposal, logic is used as a purely declarative representation language, and a theorem-prover or model-generator is used as the problem-solver. The problem-solving task is split between the programmer, who is responsible only for ensuring the truth of programs expressed in logical form, and the theorem-prover or model-generator, which is responsible for solving problems efficiently.

However, logic programming, in the narrower sense in which it is more commonly understood, is the use of logic as both a declarative and a procedural representation language. It is based upon the fact that a backwards-reasoning theorem-prover applied to declarative sentences in the form of implications:

    B1 and … and Bn implies H

treats the implications as goal-reduction procedures:

    to show/solve H, show/solve B1 and … and Bn.

For example, it treats the implication:

    X is a grandparent of Z if X is a parent of Y and Y is a parent of Z

as the procedure:

    to show that X is a grandparent of Z, show that X is a parent of Y and show that Y is a parent of Z.

Note that this is consistent with the BHK interpretation of constructive logic, where an implication is interpreted as a solution of the problem H given solutions of the problems B1, …, Bn. However, the defining feature of logic programming is that sets of formulas can be regarded as programs and that proof search can be given a computational meaning. This is achieved by restricting the underlying logic to a "well-behaved" fragment such as Horn clauses or hereditary Harrop formulas (see D. Miller et al., 1991).
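
The goal-reduction reading above can be made concrete with a small interpreter. The following is only a minimal sketch in Python, not a standard implementation: the data layout (clauses as a head paired with a list of body goals) and the names solve, unify, walk and rename are illustrative assumptions. To solve a goal, the interpreter unifies it with the head of a program clause and then solves that clause's body, which is exactly the "to show/solve H, show/solve B1, …, Bn" procedure described above.

from itertools import count

# Terms: a variable is a string starting with an upper-case letter,
# a constant is any other string, and a compound term is a tuple
# ("functor", arg1, ..., argn).

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    # Follow variable bindings until an unbound variable or a non-variable term.
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst):
    # Return an extended substitution making a and b equal, or None.
    # (No occurs check, for brevity.)
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        return {**subst, a: b}
    if is_var(b):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b) and a[0] == b[0]:
        for x, y in zip(a[1:], b[1:]):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

def rename(clause, counter):
    # Give the clause's variables fresh names for each use.
    suffix = "_%d" % next(counter)
    def r(t):
        if is_var(t):
            return t + suffix
        if isinstance(t, tuple):
            return (t[0],) + tuple(r(x) for x in t[1:])
        return t
    head, body = clause
    return r(head), [r(g) for g in body]

def solve(goals, program, subst, counter):
    # Backwards reasoning: to solve the first goal, find a clause whose head
    # unifies with it, then solve the clause's body followed by the remaining goals.
    if not goals:
        yield subst
        return
    goal, rest = goals[0], goals[1:]
    for clause in program:
        head, body = rename(clause, counter)
        s = unify(goal, head, subst)
        if s is not None:
            yield from solve(body + rest, program, s, counter)

# grandparent(X, Z) if parent(X, Y) and parent(Y, Z), plus two parent facts.
program = [
    (("parent", "abe", "homer"), []),
    (("parent", "homer", "bart"), []),
    (("grandparent", "X", "Z"), [("parent", "X", "Y"), ("parent", "Y", "Z")]),
]

counter = count()
for s in solve([("grandparent", "abe", "Who")], program, {}, counter):
    print(walk("Who", s))   # prints: bart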

As in the purely declarative case, the programmer is responsible for ensuring the truth of programs. But since automated proof search is generally infeasible, logic programming as commonly understood also relies on the programmer to ensure that inferences are generated efficiently (see § Problem solving). In many cases, to achieve efficiency, one needs to be aware of, and to exploit, the problem-solving behavior of the theorem-prover. In this respect, logic programming is comparable to conventional imperative programming: both use programs to control the behavior of a program executor. However, unlike conventional imperative programs, which have only a procedural interpretation, logic programs also have a declarative, logical interpretation, which helps to ensure their correctness. Moreover, such programs, being declarative, are at a higher conceptual level than purely imperative programs; and their program executors, being theorem-provers, operate at a higher conceptual level than conventional compilers and interpreters.
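
The programmer's responsibility for efficiency can be illustrated by continuing the Python sketch above: because that interpreter tries clauses in the order listed and reduces body goals from left to right, two definitions with the same declarative meaning can behave very differently procedurally. The ancestor predicate and the two programs below are hypothetical examples under the assumptions of that sketch, not taken from the article.

# Two declaratively equivalent definitions of ancestor.  Under the
# depth-first, backwards-reasoning interpreter above, the first enumerates
# its answers and stops, while the second keeps reducing an ancestor goal
# to another ancestor goal before it ever reaches a parent fact, so it
# never yields an answer (in Python it eventually exceeds the recursion limit).
ancestor_terminating = program + [
    (("ancestor", "X", "Z"), [("parent", "X", "Z")]),
    (("ancestor", "X", "Z"), [("parent", "X", "Y"), ("ancestor", "Y", "Z")]),
]

ancestor_looping = program + [
    (("ancestor", "X", "Z"), [("ancestor", "Y", "Z"), ("parent", "X", "Y")]),
    (("ancestor", "X", "Z"), [("parent", "X", "Z")]),
]

for s in solve([("ancestor", "abe", "Who")], ancestor_terminating, {}, counter):
    print(walk("Who", s))   # prints: homer, then bart

# solve([("ancestor", "abe", "Who")], ancestor_looping, {}, counter) never yields an answer.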
