Optimality theory

Optimality theory (frequently abbreviated OT) is a linguistic model proposing that the observed forms of language arise from the interaction between conflicting constraints. There are three basic components of the theory:

1. GEN takes an input and generates the list of possible outputs, or candidates;
2. CON provides the criteria, in the form of strictly ranked violable constraints, used to decide between candidates; and
3. EVAL chooses the optimal candidate based on the constraint ranking, and this candidate is the output.

Optimality theory assumes that these components are universal. Differences in grammars reflect different rankings of the universal constraint set, CON. Part of language acquisition can then be described as the process of adjusting the ranking of these constraints.
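To make the division of labor concrete, here is a minimal sketch in Python of how GEN, CON and EVAL could fit together. The toy input, candidate forms and constraint definitions (NOCODA, DEP, MAX) are illustrative assumptions, not part of the theory's formal statement; EVAL compares violation profiles lexicographically, reflecting the strict domination of higher-ranked constraints.

```python
# A minimal sketch of the three components. The toy input, candidates and
# constraint definitions below are illustrative assumptions only.

VOWELS = set("aeiou")

def gen(inp):
    """GEN: in the theory an infinite candidate set; a finite stand-in here."""
    return [inp, inp + "a", inp[:-1]]

# CON: violable constraints, each returning a count of violations.
def no_coda(inp, cand):   # toy NOCODA: the candidate should not end in a consonant
    return 0 if cand[-1] in VOWELS else 1

def dep(inp, cand):       # toy DEP: do not insert segments (anti-epenthesis)
    return max(len(cand) - len(inp), 0)

def max_io(inp, cand):    # toy MAX: do not delete segments (anti-deletion)
    return max(len(inp) - len(cand), 0)

def eval_optimal(inp, ranking):
    """EVAL: the lowest violation profile wins; lexicographic comparison
    models strict domination of higher-ranked constraints."""
    return min(gen(inp), key=lambda c: tuple(con(inp, c) for con in ranking))

# Two grammars = two rankings of the same universal constraint set CON.
print(eval_optimal("pat", [no_coda, max_io, dep]))  # "pata" (epenthesis)
print(eval_optimal("pat", [no_coda, dep, max_io]))  # "pa"   (deletion)
```

On this view, reranking the same constraints yields a different winner from the same input, which is how the sketch mirrors the claim that grammars differ only in their rankings.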

Optimality theory was originally proposed by the linguists Alan Prince and Paul Smolensky in 1993, and later expanded by Prince and John J. McCarthy. Although much of the interest in optimality theory has been associated with its use in phonology, the area to which optimality theory was first applied, the theory is also applicable to other subfields of linguistics (e.g. syntax and semantics).

Optimality theory is usually considered a development of generative grammar, with which it shares a focus on the investigation of universal principles, linguistic typology and language acquisition.

Optimality theory is often called a connectionist theory of language, because it has its roots in neural network research, though the relationship is now largely of historical interest. It arose in part as a successor to the theory of Harmonic Grammar, developed in 1990 by Géraldine Legendre, Yoshiro Miyata and Paul Smolensky.

Input and GEN: the candidate set

Optimality theory supposes that there are no language-specific restrictions on the input. This is called richness of the base. Every grammar can handle every possible input. For example, a language without complex clusters must be able to deal with an input such as /flask/. Languages without complex clusters differ in how they resolve this problem: some will epenthesize (e.g. [falasak], or [falasaka] if all codas are banned) and some will delete (e.g. [fas], [fak], [las], [lak]). Given any input, GEN generates an infinite number of candidates, or possible realizations of that input. A language's grammar (its ranking of constraints) determines which of the infinite candidates will be assessed as optimal by EVAL.
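As a rough illustration of how a ranking resolves /flask/ in a language without complex clusters, here is a toy evaluation in Python. The finite candidate list stands in for GEN's infinite output, and the constraint names (*COMPLEX, NOCODA, MAX-IO, DEP-IO) together with their simplified definitions are assumptions made for the sketch, not a full phonological analysis.

```python
# Toy evaluation of the input /flask/ under different constraint rankings.
# The finite candidate list stands in for GEN's infinite output, and the
# constraint definitions are deliberately simplified.

INPUT = "flask"
CANDIDATES = ["flask", "falasak", "falasaka", "fas", "fak", "las", "lak"]
VOWELS = set("aeiou")

def star_complex(cand):
    """Toy *COMPLEX: one violation per pair of adjacent consonants."""
    return sum(1 for x, y in zip(cand, cand[1:])
               if x not in VOWELS and y not in VOWELS)

def no_coda(cand):
    """Toy NOCODA: one violation if the candidate ends in a consonant."""
    return 0 if cand[-1] in VOWELS else 1

def max_io(cand):
    """Toy MAX-IO: penalize deletion, approximated by lost length."""
    return max(len(INPUT) - len(cand), 0)

def dep_io(cand):
    """Toy DEP-IO: penalize epenthesis, approximated by added length."""
    return max(len(cand) - len(INPUT), 0)

def winner(ranking):
    """EVAL: lexicographically smallest violation profile wins."""
    return min(CANDIDATES, key=lambda c: tuple(con(c) for con in ranking))

print(winner([star_complex, max_io, dep_io, no_coda]))  # "falasak"  (epenthesis)
print(winner([star_complex, no_coda, max_io, dep_io]))  # "falasaka" (codas also banned)
print(winner([star_complex, dep_io, max_io, no_coda]))  # "fas" (deletion; ties with fak/las/lak)
```

In the deletion grammar the toy constraints cannot distinguish [fas], [fak], [las] and [lak]; in a fuller analysis, additional faithfulness constraints would decide among them.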
