# Linear Independence

Definition. Let V be a vector space over a field F, and let $S \subseteq V$. The set S is linearly independent if whenever $v_1, v_2, \ldots, v_n \in S$, $a_1, a_2, \ldots, a_n \in F$, and
$$a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = \vec{0},$$
it follows that $a_1 = a_2 = \cdots = a_n = 0$.

An equation like the one above is called a linear relationship among the $v_i$; if at least one of the coefficients $a_i$ is nonzero, it is a nontrivial linear relationship. Thus, a set of vectors is independent if there is no nontrivial linear relationship among finitely many of the vectors.

A set of vectors which is not linearly independent is linearly dependent. (I'll usually say "independent" and "dependent" for short.) Thus, a set of vectors S is dependent if there are vectors $v_1, v_2, \ldots, v_n \in S$ and numbers $a_1, a_2, \ldots, a_n \in F$, not all of which are 0, such that
$$a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = \vec{0}.$$

Example. If F is a field, the standard basis vectors are
$$e_1 = (1, 0, 0, \ldots, 0), \quad e_2 = (0, 1, 0, \ldots, 0), \quad \ldots, \quad e_n = (0, 0, 0, \ldots, 1).$$

Show that they form an independent set in $F^n$.

Write
$$a_1 e_1 + a_2 e_2 + \cdots + a_n e_n = \vec{0}.$$

I have to show all the a's are 0. Now
$$a_i e_i = (0, \ldots, 0, a_i, 0, \ldots, 0), \quad\text{with } a_i \text{ in the } i^{\text{th}} \text{ slot}.$$

So
$$a_1 e_1 + a_2 e_2 + \cdots + a_n e_n = (a_1, a_2, \ldots, a_n).$$

Since by assumption $a_1 e_1 + a_2 e_2 + \cdots + a_n e_n = \vec{0}$, I get
$$(a_1, a_2, \ldots, a_n) = (0, 0, \ldots, 0).$$

Hence, $a_1 = a_2 = \cdots = a_n = 0$, and the set is independent.
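As a numerical cross-check of this argument (a sketch using sympy; the dimension n = 4 is an arbitrary choice), note that the standard basis vectors are the columns of the identity matrix:

```python
from sympy import eye

# The standard basis vectors of F^n are the columns of the n x n identity
# matrix.  Here n = 4, chosen arbitrarily for illustration.
n = 4
E = eye(n)

# a_1 e_1 + ... + a_n e_n = 0 is the linear system E * a = 0.  The set is
# independent exactly when this system has only the trivial solution,
# i.e. the nullspace is trivial (equivalently, the rank equals n).
independent = (E.nullspace() == [] and E.rank() == n)
```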

Example. Show that any set containing the zero vector is dependent.

If $\vec{0} \in S$, then $1 \cdot \vec{0} = \vec{0}$ is a nontrivial linear relationship in S, since the coefficient 1 is nonzero.

Example. Show that the vectors and are dependent in .

I have to find numbers a and b, not both 0, such that

In this case, you can probably juggle numbers in your head to see that

This shows that the vectors are dependent. There are infinitely many pairs of numbers a and b that work. In examples to follow, I'll show how to find numbers systematically in cases where the arithmetic isn't so easy.

Example. Suppose u, v, w, and x are vectors in a vector space. Prove that the set $\{u - v, v - w, w - x, x - u\}$ is dependent.

Notice that in the four vectors in the set, each of u, v, w, and x occurs once with a plus sign and once with a minus sign. So
$$(u - v) + (v - w) + (w - x) + (x - u) = \vec{0}.$$

This is a dependence relation, so the set is dependent.

If you can't see an "easy" linear combination of a set of vectors that equals $\vec{0}$, you may have to determine independence or dependence by solving a system of equations.

Example. Consider the following sets of vectors. If the set is independent, prove it. If the set is dependent, find a nontrivial linear combination of the vectors which is equal to $\vec{0}$.

(a) .

(b) .

(a) Write a linear combination of the vectors and set it equal to $\vec{0}$:

I have to determine whether this implies that $a = b = c = 0$.

Note: When you convert vectors given in "parenthesis form" to "matrix form", you turn the vectors into column vectors as above. This is consistent with the way I've set up systems of linear equations.

The vector equation above is equivalent to the matrix equation

Row reduce to solve:

Note: Row operations won't change the last column of zeros, so you don't actually need to write it when you do the row reduction. I'll put it in to avoid confusion.

The last matrix gives the equations $a = 0$, $b = 0$, and $c = 0$.

Therefore, the vectors are independent.

(b) Write

This gives the matrix equation

Row reduce to solve:

This gives the equations

Thus, and . I can get a nontrivial solution by setting c to any nonzero number. I'll use . This gives and . So

This is a linear dependence relation, and the vectors are dependent.
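The procedure above (row reduce, read off the free variable, back-substitute) is easy to carry out with sympy. The vectors below are hypothetical stand-ins, not the ones from this example; `nullspace()` packages the row reduction and back-substitution:

```python
from sympy import Matrix

# Hypothetical dependent vectors in R^3 (stand-ins for the example's vectors):
v1 = Matrix([1, 2, 3])
v2 = Matrix([0, 1, 1])
v3 = Matrix([1, 4, 5])    # v3 = v1 + 2*v2, so the set is dependent

# nullspace() gives a basis for the solutions (a, b, c) of
# a*v1 + b*v2 + c*v3 = 0; any nonzero member is a nontrivial relation.
A = Matrix.hstack(v1, v2, v3)
basis = A.nullspace()
a, b, c = basis[0]
relation = a * v1 + b * v2 + c * v3    # should be the zero vector
```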

The same approach works for vectors in $F^n$, where F is a field other than the real numbers.

Example. Consider the set of vectors

If the set is independent, prove it. If the set is dependent, find a nontrivial linear combination of the vectors which is equal to $\vec{0}$.

Write

This gives the matrix equation

Row reduce to solve the system:

This gives the equations

Thus, and . Set . This gives and . Hence, the set is dependent, and
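Row reduction over $\mathbb{Z}_p$ works the same way, with all arithmetic done mod p and pivots inverted via Fermat's little theorem. Here is a minimal sketch with a hand-rolled elimination; the vectors (in $\mathbb{Z}_3^3$) are hypothetical, not the ones from this example:

```python
# Gaussian elimination mod p: returns the rank of the matrix over Z_p.
def rank_mod_p(rows, p):
    M = [[x % p for x in row] for row in rows]
    rank = 0
    for col in range(len(M[0])):
        # Find a pivot in this column at or below row `rank`.
        pivot = next((r for r in range(rank, len(M)) if M[r][col] != 0), None)
        if pivot is None:
            continue
        M[rank], M[pivot] = M[pivot], M[rank]
        inv = pow(M[rank][col], p - 2, p)      # inverse mod p (p prime)
        M[rank] = [(x * inv) % p for x in M[rank]]
        for r in range(len(M)):
            if r != rank and M[r][col]:
                M[r] = [(a - M[r][col] * b) % p for a, b in zip(M[r], M[rank])]
        rank += 1
    return rank

# Hypothetical vectors in Z_3^3, as the columns of a matrix:
# (2, 1, 0), (1, 2, 0), (0, 0, 1).  Note (2,1,0) + (1,2,0) = (0,0,0) mod 3.
A = [[2, 1, 0],
     [1, 2, 0],
     [0, 0, 1]]
rank = rank_mod_p(A, 3)
dependent = rank < 3      # rank < number of vectors => dependent
```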

Example. Consider the following set of vectors in :

If the set is independent, prove it. If the set is dependent, find a nontrivial linear combination of the vectors equal to $\vec{0}$.

Write

This gives the matrix equation

Row reduce to solve the system:

This gives the equations

Hence, and . Set . Then and . Therefore, the set is dependent, and

To summarize, to determine whether vectors $v_1$, $v_2$, ..., $v_n$ in a vector space V are independent, I try to solve
$$a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = \vec{0}.$$

If the only solution is $a_1 = a_2 = \cdots = a_n = 0$, then the vectors are independent; otherwise, they are dependent.
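This general test is easy to mechanize: the vectors are independent exactly when the matrix having them as columns has a trivial nullspace, i.e. full column rank. A sketch (the sample vectors are illustrative, not from the examples above):

```python
from sympy import Matrix

def is_independent(vectors):
    """True when a_1 v_1 + ... + a_n v_n = 0 forces every a_i = 0."""
    A = Matrix.hstack(*[Matrix(v) for v in vectors])
    # Trivial nullspace <=> rank equals the number of columns (vectors).
    return A.rank() == len(vectors)

r1 = is_independent([[1, 0, 1], [0, 1, 1]])   # independent pair
r2 = is_independent([[1, 2], [2, 4]])         # (2, 4) = 2*(1, 2): dependent
```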

It's important to understand this general setup, and not just memorize the special case of vectors in $F^n$, as shown in the last few examples. Remember that vectors don't have to look like "$(a_1, a_2, \ldots, a_n)$" ("numbers in slots"). Consider the next example, for instance.

Example. The set of polynomials with real coefficients is a vector space over the reals. Show that the set is independent.

Suppose

That is,

Two polynomials are equal if and only if their corresponding coefficients are equal. Hence, all the coefficients in the linear combination are 0. Therefore, the set is independent.
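The coefficient-comparison step can be mechanized: record each polynomial as its vector of coefficients and test those vectors as before. A sketch with an illustrative set of polynomials (an assumption; this is not necessarily the set from the example):

```python
from sympy import Matrix, Poly, symbols

x = symbols('x')

# Illustrative set: {1 + x, 1 - x, x**2}, polynomials of degree at most 2.
polys = [1 + x, 1 - x, x**2]

# Each polynomial becomes the vector of its coefficients of 1, x, x**2.
coeff_vectors = [Matrix([Poly(p, x).coeff_monomial(x**k) for k in range(3)])
                 for p in polys]

# The polynomials are independent iff their coefficient vectors are,
# because two polynomials are equal iff corresponding coefficients agree.
A = Matrix.hstack(*coeff_vectors)
independent = (A.rank() == len(polys))
```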

In some cases, you can tell by inspection that a set is dependent. I noted earlier that a set containing the zero vector must be dependent. Here's another easy case.

Proposition. If $n > m$, a set of n vectors in $F^m$ is dependent.

Proof. Suppose $v_1, v_2, \ldots, v_n$ are n vectors in $F^m$, and $n > m$. Write
$$a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = \vec{0}.$$

This gives the matrix equation
$$\begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix} \begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{bmatrix} = \vec{0}.$$

To solve, I'd row reduce the matrix having $v_1, v_2, \ldots, v_n$ as columns.

Note that this matrix has m rows and n columns, and $n > m$.

The row-reduced echelon form can have at most one leading coefficient in each row, so there are at most m leading coefficients. These correspond to the main variables in the solution. Since there are n variables and $n > m$, there must be some parameter variables. By setting the parameter variables equal to nonzero numbers, I get a nontrivial solution for $a_1$, $a_2$, ..., $a_n$. This implies that $\{v_1, v_2, \ldots, v_n\}$ is dependent.

Example. Is the following set of vectors in $\mathbb{R}^2$ independent or dependent?

Any set of three (or more) vectors in $\mathbb{R}^2$ is dependent, by the proposition.
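A quick sympy illustration of why the proposition forces dependence: three vectors in $\mathbb{R}^2$ give a $2 \times 3$ coefficient matrix, so there is always a free variable. The vectors here are arbitrary illustrative values:

```python
from sympy import Matrix

# Three illustrative vectors in R^2, as the columns of a 2 x 3 matrix.
A = Matrix.hstack(Matrix([1, 2]), Matrix([3, 5]), Matrix([7, 11]))

# The rank is at most 2 (the number of rows), but there are 3 unknowns,
# so the system A * (a, b, c)^T = 0 has a nontrivial solution.
basis = A.nullspace()
a, b, c = basis[0]
relation_is_zero = (a * A[:, 0] + b * A[:, 1] + c * A[:, 2] == Matrix([0, 0]))
```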

Proposition. Let $v_1, v_2, \ldots, v_n$ be vectors in $F^n$, where F is a field. $\{v_1, v_2, \ldots, v_n\}$ is independent if and only if the matrix constructed using the vectors as columns is invertible:
$$A = \begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix}.$$

Proof. Suppose the set is independent. Consider the system
$$\begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix} \begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{bmatrix} = \vec{0}.$$

Multiplying out the left side, this gives
$$a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = \vec{0}.$$

By independence, $a_1 = a_2 = \cdots = a_n = 0$. Thus, the system above has only $\vec{0}$ as a solution. An earlier theorem on invertibility shows that this means the matrix of v's is invertible.

Conversely, suppose the following matrix is invertible:
$$A = \begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix}.$$

Let
$$a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = \vec{0}.$$

Write this as a matrix equation and solve it:
$$A \begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{bmatrix} = \vec{0}, \quad\text{so}\quad \begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{bmatrix} = A^{-1} \vec{0} = \vec{0}.$$

This gives $a_1 = a_2 = \cdots = a_n = 0$. Hence, the v's are independent.

Note that this proposition requires that you have n vectors in $F^n$ --- the number of vectors must match the dimension of the space.

The result can also be stated in contrapositive form: The set of vectors is dependent if and only if the matrix having the vectors as columns is not invertible. I'll use this form in the next example.

Example. Consider the following set of vectors in $\mathbb{R}^3$:

For what values of x is the set dependent?

I have 3 vectors in $\mathbb{R}^3$, so the previous result applies.

Construct the matrix having the vectors as columns:

The set is dependent when A is not invertible, and A is not invertible when its determinant is equal to 0. Now

Setting $\det A = 0$ and solving gives the values of x for which A is not invertible. For those values of x, the original set is dependent.
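Determinant computations with a parameter are convenient to do symbolically. A sketch with an illustrative parametrized matrix (an assumption; these are not the vectors from the example): build A with the vectors as columns, expand det A, and solve det A = 0 for x.

```python
from sympy import Matrix, solve, symbols

x = symbols('x')

# Illustrative vectors in R^3 depending on a parameter x, as columns:
# (1, 0, x), (0, 1, 0), (1, x, 1).
A = Matrix([[1, 0, 1],
            [0, 1, x],
            [x, 0, 1]])

d = A.det()          # here det A = 1 - x
bad = solve(d, x)    # the set of columns is dependent exactly at these x
```

At $x = 1$ the third column is the sum of the first two, which confirms the dependence directly.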

The next proposition says that an independent set can be thought of as a set without "redundancy", in the sense that you can't build any one of the vectors out of the others.

Proposition. Let V be an F-vector space, and let $S \subseteq V$. S is dependent if and only if some $v \in S$ can be expressed as a linear combination of other vectors in S.

("Other" means vectors other than v itself.)

Proof. Suppose $v \in S$ can be written as a linear combination of other vectors in S:
$$v = a_1 v_1 + a_2 v_2 + \cdots + a_n v_n.$$

Here $v_1, v_2, \ldots, v_n \in S$ (where $v_i \ne v$ for all i) and $a_1, a_2, \ldots, a_n \in F$.

Then
$$(-1) \cdot v + a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = \vec{0}.$$

This is a nontrivial linear relation among elements of S, since the coefficient of v is $-1 \ne 0$. Hence, S is dependent.

Conversely, suppose S is dependent. Then there are elements $a_1, a_2, \ldots, a_n \in F$ (not all 0) and $v_1, v_2, \ldots, v_n \in S$ such that
$$a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = \vec{0}.$$

Since not all the a's are 0, at least one is nonzero. There's no harm in assuming that $a_1 \ne 0$. (If another a was nonzero instead, just relabel the a's and v's so $a_1 \ne 0$ and start again.)

Since $a_1 \ne 0$, its inverse $a_1^{-1}$ is defined. So
$$v_1 = -a_1^{-1} a_2 v_2 - a_1^{-1} a_3 v_3 - \cdots - a_1^{-1} a_n v_n.$$

Thus, I've expressed $v_1$ as a linear combination of other vectors in S.

Definition. Let $S = \{f_1, f_2, \ldots, f_n\}$ be a set of functions which are differentiable $n - 1$ times. The Wronskian is
$$W(x) = \det \begin{bmatrix} f_1(x) & f_2(x) & \cdots & f_n(x) \\ f_1'(x) & f_2'(x) & \cdots & f_n'(x) \\ \vdots & \vdots & & \vdots \\ f_1^{(n-1)}(x) & f_2^{(n-1)}(x) & \cdots & f_n^{(n-1)}(x) \end{bmatrix}.$$

Thus, the rows of the determinant consist of successive derivatives of the original functions.

Theorem. Let $S = \{f_1, f_2, \ldots, f_n\}$ be a set of functions which are differentiable $n - 1$ times. If $W(x_0) \ne 0$ at some point $x_0$, then S is independent.

Thus, if you can find some value of x at which the Wronskian is nonzero, the functions are independent.

The converse is false: You can find functions which are independent on an interval in $\mathbb{R}$, but whose Wronskian is identically 0 on the interval. The converse does hold with additional conditions: For example, if the functions in question are solutions to a linear differential equation.

Proof. Let
$$a_1 f_1(x) + a_2 f_2(x) + \cdots + a_n f_n(x) = 0 \quad\text{for all } x.$$

I have to show all the a's are 0.

This equation is an identity in x, so I may differentiate it repeatedly to get n equations:
$$a_1 f_1^{(k)}(x) + a_2 f_2^{(k)}(x) + \cdots + a_n f_n^{(k)}(x) = 0, \quad k = 0, 1, \ldots, n - 1.$$

I can write this in matrix form:
$$\begin{bmatrix} f_1(x) & f_2(x) & \cdots & f_n(x) \\ f_1'(x) & f_2'(x) & \cdots & f_n'(x) \\ \vdots & \vdots & & \vdots \\ f_1^{(n-1)}(x) & f_2^{(n-1)}(x) & \cdots & f_n^{(n-1)}(x) \end{bmatrix} \begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix}.$$

Plug in $x = x_0$:
$$\begin{bmatrix} f_1(x_0) & f_2(x_0) & \cdots & f_n(x_0) \\ f_1'(x_0) & f_2'(x_0) & \cdots & f_n'(x_0) \\ \vdots & \vdots & & \vdots \\ f_1^{(n-1)}(x_0) & f_2^{(n-1)}(x_0) & \cdots & f_n^{(n-1)}(x_0) \end{bmatrix} \begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix}.$$

Let M denote the $n \times n$ matrix on the left. The determinant of this matrix is the Wronskian $W(x_0)$, which by assumption is nonzero. Since the determinant is nonzero, the matrix is invertible. So
$$\begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{bmatrix} = M^{-1} \vec{0} = \vec{0}.$$

Since $a_1 = a_2 = \cdots = a_n = 0$, the functions are independent.

Example. Demonstrate that the set of functions is independent.

Compute the Wronskian:

I can find values of x for which the Wronskian is nonzero. Hence, the set is independent.
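sympy can carry out this kind of computation directly via its built-in `wronskian` function. A sketch with the illustrative set {sin x, cos x} (not necessarily the set from the example), whose Wronskian is constantly -1:

```python
from sympy import cos, simplify, sin, symbols, wronskian

x = symbols('x')

# W(x) = det [[sin x, cos x], [cos x, -sin x]] = -sin^2 x - cos^2 x = -1.
W = simplify(wronskian([sin(x), cos(x)], x))

# W is nonzero (for every x, in fact), so {sin x, cos x} is independent.
nonzero = (W != 0)
```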
