In linear algebra, a family of vectors is linearly independent if none of them can be written as a linear combination of finitely many other vectors in the family. A family of vectors which is not linearly independent is called linearly dependent. For instance, in the three-dimensional real vector space R^3 we have the following example.
Here the first three vectors are linearly independent; but the fourth vector equals 9 times the first plus 5 times the second plus 4 times the third, so the four vectors together are linearly dependent. Linear dependence is a property of the family, not of any particular vector; for example in this case we could just as well write the first vector as a linear combination of the last three.
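The dependence relation described above can be checked by direct arithmetic. The example's concrete vectors are not reproduced in this extract, so the standard basis of R^3 is assumed here purely for illustration; the text only fixes the coefficients 9, 5 and 4.

```python
# Hypothetical concrete vectors: the article's example display is not
# reproduced here, so we take the standard basis of R^3 for illustration.
v1, v2, v3 = (1, 0, 0), (0, 1, 0), (0, 0, 1)

# The fourth vector equals 9 times the first plus 5 times the second
# plus 4 times the third, exactly the relation stated in the text,
# so the four vectors together are linearly dependent.
v4 = tuple(9 * a + 5 * b + 4 * c for a, b, c in zip(v1, v2, v3))
print(v4)  # (9, 5, 4)
```

With any other choice of three independent vectors the same coefficients would produce a different fourth vector, but the dependence of the family would hold just the same.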
In probability theory and statistics there is an unrelated measure of linear dependence between random variables (see correlation).
Definition
A finite subset of n vectors, v_{1}, v_{2}, ..., v_{n}, from the vector space V, is linearly dependent if and only if there exists a set of n scalars, a_{1}, a_{2}, ..., a_{n}, not all zero, such that

a_{1}v_{1} + a_{2}v_{2} + ... + a_{n}v_{n} = 0.
Note that the zero on the right is the zero vector, not the number zero.
If such scalars do not exist, then the vectors are said to be linearly independent.
Alternatively, linear independence can be defined directly: a set of vectors is linearly independent if and only if the only representation of the zero vector as a linear combination of its elements is the trivial one, i.e. whenever a_{1}, a_{2}, ..., a_{n} are scalars such that

a_{1}v_{1} + a_{2}v_{2} + ... + a_{n}v_{n} = 0,

then a_{i} = 0 for i = 1, 2, ..., n.
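This definition is equivalent to a rank test: n vectors are linearly independent exactly when the matrix with those vectors as rows has rank n. A minimal sketch of that test, using Gaussian elimination with exact rational arithmetic to avoid floating-point pitfalls (the function names are this sketch's own, not from any particular library):

```python
from fractions import Fraction

def rank(vectors):
    """Rank of a list of vectors (as rows), via Gaussian elimination.

    Exact Fraction arithmetic keeps the pivot tests reliable.
    """
    rows = [[Fraction(x) for x in v] for v in vectors]
    r = 0  # index of the next pivot row
    cols = len(rows[0]) if rows else 0
    for c in range(cols):
        # find a pivot in column c at or below row r
        pivot = next((i for i in range(r, len(rows)) if rows[i][c] != 0), None)
        if pivot is None:
            continue
        rows[r], rows[pivot] = rows[pivot], rows[r]
        # eliminate column c from every row below the pivot row
        for i in range(r + 1, len(rows)):
            factor = rows[i][c] / rows[r][c]
            rows[i] = [a - factor * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

def linearly_independent(vectors):
    """Independent iff the only combination giving 0 is the trivial one,
    which holds exactly when the rank equals the number of vectors."""
    return rank(vectors) == len(vectors)

print(linearly_independent([(1, 0, 0), (0, 1, 0), (0, 0, 1)]))           # True
print(linearly_independent([(1, 0, 0), (0, 1, 0), (0, 0, 1), (9, 5, 4)]))  # False
```

The second call reproduces the situation from the introduction: four vectors in R^3 can never be independent, since the rank is at most 3.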
