Conditional probability


Conditional probability is the probability of some event A, given the occurrence of some other event B. It is written P(A|B) and read "the (conditional) probability of A, given B" or "the probability of A under the condition B". When, in a random experiment, the event B is known to have occurred, the possible outcomes of the experiment are reduced to B, and the probability of A changes from the unconditional probability to the conditional probability given B. For P(B) > 0, it is defined as P(A|B) = P(A ∩ B) / P(B).
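The definition can be checked by direct enumeration. The sketch below uses a hypothetical two-dice example (not from the original text): A is "the sum is 8" and B is "the first die is even"; conditioning on B restricts the sample space to the 18 outcomes in B.

```python
from fractions import Fraction

# Sample space: all ordered rolls of two fair dice (36 equally likely outcomes).
outcomes = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]

# Illustrative events (chosen for this sketch, not from the source):
# A: the two dice sum to 8.  B: the first die shows an even number.
A = {o for o in outcomes if o[0] + o[1] == 8}
B = {o for o in outcomes if o[0] % 2 == 0}

def prob(event):
    """Probability of an event under the uniform distribution on outcomes."""
    return Fraction(len(event), len(outcomes))

# P(A|B) = P(A ∩ B) / P(B): the proportion of B's outcomes that also lie in A.
p_a_given_b = prob(A & B) / prob(B)

print(prob(A))       # unconditional P(A) = 5/36
print(p_a_given_b)   # conditional P(A|B) = 1/6
```

Here knowing B has occurred raises the probability of A from 5/36 to 6/36, illustrating how conditioning changes the unconditional probability.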

Joint probability is the probability of two events in conjunction. That is, it is the probability of both events occurring together. The joint probability of A and B is written P(A ∩ B), P(AB), or P(A, B).

Marginal probability is then the unconditional probability P(A) of the event A; that is, the probability of A, regardless of whether event B did or did not occur. If B can be thought of as the event of a random variable X having a given outcome, the marginal probability of A can be obtained by summing (or integrating, more generally) the joint probabilities over all outcomes for X. For example, if there are two possible outcomes for X with corresponding events B and B′, this means that P(A) = P(A ∩ B) + P(A ∩ B′). This is called marginalization.
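Marginalization amounts to summing rows (or columns) of a joint probability table. A minimal sketch, with hypothetical numbers chosen only so that the table sums to 1:

```python
from fractions import Fraction as F

# Joint distribution over (A or not-A) and the two outcomes B, B' of X.
# The values are illustrative, not from the source.
joint = {
    ("A", "B"): F(2, 10), ("A", "B'"): F(1, 10),
    ("not A", "B"): F(3, 10), ("not A", "B'"): F(4, 10),
}

# Marginal P(A): sum the joint probabilities over all outcomes of X.
p_a = sum(p for (a, x), p in joint.items() if a == "A")
print(p_a)  # P(A) = P(A ∩ B) + P(A ∩ B') = 2/10 + 1/10 = 3/10
```

The same computation with an integral in place of the sum handles a continuous random variable X.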

In these definitions, note that there need not be a causal or temporal relation between A and B. A may precede B or vice versa or they may happen at the same time. A may cause B or vice versa or they may have no causal relation at all. Notice, however, that causal and temporal relations are informal notions, not belonging to the probabilistic framework. They may apply in some examples, depending on the interpretation given to events.

Conditioning of probabilities, i.e., updating them to take account of (possibly new) information, may be achieved through Bayes' theorem. In such conditioning, the probability of A given only initial information I, P(A|I), is known as the prior probability. The updated conditional probability of A, given I and the outcome of the event B, is known as the posterior probability, P(A|B,I).
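The prior-to-posterior update can be sketched numerically. All the probabilities below are hypothetical inputs chosen for illustration; the update itself is Bayes' theorem, P(A|B,I) = P(B|A,I) · P(A|I) / P(B|I), with P(B|I) obtained by marginalizing over A and its complement.

```python
# Assumed inputs (illustrative values, not from the source):
prior = 0.01             # P(A|I): prior probability of A
p_b_given_a = 0.90       # P(B|A,I): probability of observing B if A holds
p_b_given_not_a = 0.05   # P(B|not-A,I): probability of B if A does not hold

# P(B|I) by marginalization over A and not-A.
p_b = p_b_given_a * prior + p_b_given_not_a * (1 - prior)

# Bayes' theorem: posterior P(A|B,I).
posterior = p_b_given_a * prior / p_b
print(posterior)
```

With these numbers the posterior is roughly 0.154: observing B raises the probability of A from 1% to about 15%, which shows how strongly a likely-under-A observation can shift a small prior.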


related documents
Binomial distribution
Interpolation search
Cumulative distribution function
Rank (linear algebra)
Measure (mathematics)
Bolzano–Weierstrass theorem
Algorithms for calculating variance
Compactness theorem
Integral domain
Union (set theory)
Pauli matrices
Constructible number
Compact space
Linear search
Elliptic integral
Chain complex
Perfect number
Procedural programming
Closure (topology)
Free variables and bound variables
Riesz representation theorem
Jacobi symbol
Stirling's approximation
Diophantine equation
Sylow theorems
Hyperbolic function