Definition. Let V be a vector space over a field F, and let $S \subseteq V$. The set S is linearly independent if whenever $v_1, v_2, \ldots, v_n \in S$, $a_1, a_2, \ldots, a_n \in F$, and
$$a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = 0,$$
it follows that $a_1 = a_2 = \cdots = a_n = 0$.
A set of vectors which is not linearly independent is linearly dependent. (I'll usually say "independent" and "dependent" for short.) Thus, a set of vectors S is dependent if there are vectors $v_1, v_2, \ldots, v_n \in S$ and numbers $a_1, a_2, \ldots, a_n \in F$, not all of which are 0, such that
$$a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = 0.$$
Note that S could be an infinite set of vectors.
In words, the definition says that if a linear combination of any finite set of vectors in S equals the zero vector, then all the coefficients in the linear combination must be 0. I'll refer to such a linear combination as a trivial linear combination.
On the other hand, a linear combination of vectors is nontrivial if at least one of the coefficients is nonzero. ("At least one" doesn't mean "all" --- a nontrivial linear combination can have some zero coefficients, as long as at least one is nonzero.)
Thus, we can also say that a set of vectors is independent if there is no nontrivial linear combination among finitely many of the vectors which is equal to 0. And a set of vectors is dependent if there is some nontrivial linear combination among finitely many of the vectors which is equal to 0.
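If you like to experiment, here is a minimal sketch using Python's sympy library of what trivial and nontrivial linear combinations look like; the vectors are hypothetical, chosen only for illustration.

    from sympy import Matrix

    # Hypothetical vectors in R^3, chosen only to illustrate the definition.
    u = Matrix([1, 2, 3])
    v = Matrix([2, 4, 6])        # v = 2*u, so {u, v} is dependent
    w = Matrix([0, 0, 1])

    # A nontrivial linear combination of u and v equal to the zero vector
    # (the coefficients 2 and -1 are not all 0):
    print(2*u + (-1)*v)          # Matrix([[0], [0], [0]])

    # For {u, w}, the only combination a*u + b*w equal to 0 is the trivial one a = b = 0.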
Let's see a pictorial example of a dependent set. Consider the following vectors u, v, and w in $\mathbb{R}^2$.
I'll show how to get a nontrivial linear combination of the vectors that is equal to the zero vector. Project w onto the lines of u and v.
The projections are multiples $a u$ of u and $b v$ of v. Since w is the diagonal of the parallelogram whose sides are $a u$ and $b v$, we have
$$w = a u + b v, \quad\text{so}\quad a u + b v + (-1) w = 0.$$
This is a nontrivial linear combination of u, v and w which is equal to the zero vector, so $\{u, v, w\}$ is dependent.
In fact, it's true that any 3 vectors in $\mathbb{R}^2$ are dependent, and this pictorial example should make this reasonable. More generally, if F is a field then any n vectors in $F^m$ are dependent if $n > m$. We'll prove this below.
Example. If F is a field, the standard basis vectors in $F^n$ are
$$e_1 = (1, 0, 0, \ldots, 0), \quad e_2 = (0, 1, 0, \ldots, 0), \quad \ldots, \quad e_n = (0, 0, \ldots, 0, 1).$$
Show that they form an independent set in $F^n$.
Write
$$a_1 e_1 + a_2 e_2 + \cdots + a_n e_n = 0, \quad\text{where } a_1, a_2, \ldots, a_n \in F.$$
I have to show all the a's are 0. Now
$$a_1 e_1 = (a_1, 0, \ldots, 0), \quad a_2 e_2 = (0, a_2, \ldots, 0), \quad \ldots, \quad a_n e_n = (0, 0, \ldots, a_n).$$
So
$$a_1 e_1 + a_2 e_2 + \cdots + a_n e_n = (a_1, a_2, \ldots, a_n).$$
Since by assumption $a_1 e_1 + a_2 e_2 + \cdots + a_n e_n = 0$, I get
$$(a_1, a_2, \ldots, a_n) = (0, 0, \ldots, 0).$$
Hence, $a_1 = a_2 = \cdots = a_n = 0$, and the set is independent.
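As a machine check of this example for a small value of n, here is a sketch using Python's sympy library; the matrix whose columns are the standard basis vectors is the identity matrix.

    from sympy import eye

    n = 4                        # a small illustrative case
    E = eye(n)                   # columns are the standard basis vectors e_1, ..., e_n
    print(E.rref()[1])           # (0, 1, 2, 3): a pivot in every column
    print(E.nullspace())         # []: only the trivial solution, so the set is independent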
Example. Show that any set containing the zero vector is dependent.
If $0 \in S$, then
$$1 \cdot 0 = 0.$$
The left side is a nontrivial (since $1 \ne 0$) linear combination of vectors in S --- actually, of just one vector in S. The linear combination is equal to 0. Hence, S is dependent.
Notice that it doesn't matter what else is in S (if anything).
Example. Show that the vectors and are dependent in .
I have to find numbers a and b, not both 0, such that
In this case, you can probably juggle numbers in your head to see that
This shows that the vectors are dependent. There are infinitely many pairs of numbers a and b that work. In examples to follow, I'll show how to find numbers systematically in cases where the arithmetic isn't so easy.
Example. Suppose u, v, w, and x are vectors in a vector space. Prove that the set is dependent.
Notice that among the four vectors in the set, each of u, v, w, and x occurs once with a plus sign and once with a minus sign. So adding all four vectors, every term cancels, and the sum is the zero vector (a linear combination in which every coefficient is 1).
This is a dependence relation, so the set is dependent.
If you can't see an "easy" linear combination of a set of vectors that equals 0, you may have to determine independence or dependence by solving a system of equations.
Example. Consider the following sets of vectors in . If the set is independent, prove it. If the set is dependent, find a nontrivial linear combination of the vectors which is equal to 0.
(a) .
(b) .
(a) Write a linear combination of the vectors and set it equal to 0:
I have to determine whether this implies that all of the coefficients must be 0.
Note: When I convert vectors given in "parenthesis form" to "matrix form", I'll turn the vectors into column vectors as above. This is consistent with the way I've set up systems of linear equations. Thus,
The vector equation above is equivalent to the matrix equation
Row reduce to solve:
Note: Row operations won't change the last column of zeros, so you don't actually need to write it when you do the row reduction. I'll put it in to avoid confusion.
The last matrix gives the equations
Therefore, the vectors are independent.
(b) Write
This gives the matrix equation
Row reduce to solve:
This gives the equations
Thus, and . I can get a nontrivial solution by setting c to any nonzero number. I'll use . This gives and . So
This is a linear dependence relation, and the vectors are dependent.
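The specific vectors from parts (a) and (b) aren't reproduced here, but the following sketch (Python with sympy, and made-up vectors in $\mathbb{R}^3$) shows the general workflow: build the matrix whose columns are the vectors, row reduce, and read any dependence relation off the nullspace.

    from sympy import Matrix

    # Hypothetical vectors (not the ones from the example above).
    v1 = Matrix([1, 2, 1])
    v2 = Matrix([0, 1, 1])
    v3 = Matrix([1, 4, 3])       # v3 = v1 + 2*v2, so this set is dependent

    A = Matrix.hstack(v1, v2, v3)            # columns are the vectors
    print(A.rref())              # the third column has no pivot, so there's a free variable

    for relation in A.nullspace():
        print(relation.T)        # [-1, -2, 1]: (-1)*v1 + (-2)*v2 + (1)*v3 = 0
        print((relation[0]*v1 + relation[1]*v2 + relation[2]*v3).T)   # the zero vector

If the nullspace had been empty, the only solution would have been the trivial one, and the vectors would be independent.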
The same approach works for vectors in $F^n$, where F is a field other than the real numbers.
Example. Consider the set of vectors
If the set is independent, prove it. If the set is dependent, find a nontrivial linear combination of the vectors which is equal to 0.
Write
This gives the matrix equation
Row reduce to solve the system:
This gives the equations
Thus, and . Set . This gives and . Hence, the set is dependent, and
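sympy's row reduction works over the rationals, so for a finite field it's easiest to carry out the elimination directly mod p. Here is a minimal sketch in Python with made-up vectors over $\mathbb{Z}_5$ (the vectors and the field in the example above may differ).

    def rref_mod_p(rows, p):
        """Row reduce a matrix with entries mod p. A minimal sketch, not a library routine."""
        rows = [[x % p for x in row] for row in rows]
        pivot_row = 0
        for col in range(len(rows[0])):
            # Find a row at or below pivot_row with a nonzero entry in this column.
            pivot = next((r for r in range(pivot_row, len(rows)) if rows[r][col] != 0), None)
            if pivot is None:
                continue
            rows[pivot_row], rows[pivot] = rows[pivot], rows[pivot_row]
            inv = pow(rows[pivot_row][col], -1, p)            # inverse mod p (Python 3.8+)
            rows[pivot_row] = [(x * inv) % p for x in rows[pivot_row]]
            for r in range(len(rows)):
                if r != pivot_row and rows[r][col] != 0:
                    factor = rows[r][col]
                    rows[r] = [(a - factor * b) % p for a, b in zip(rows[r], rows[pivot_row])]
            pivot_row += 1
        return rows

    # Hypothetical vectors over Z_5 as the columns of the coefficient matrix.
    A = [[1, 2, 0],
         [2, 4, 1],
         [3, 1, 4]]
    print(rref_mod_p(A, 5))      # [[1, 2, 0], [0, 0, 1], [0, 0, 0]]

The middle column has no pivot, so its variable is free; taking it to be 1 gives the relation $3 v_1 + 1 v_2 + 0 v_3 = 0$ over $\mathbb{Z}_5$ for these made-up columns.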
Example. Consider the following set of vectors in :
If the set is independent, prove it. If the set is dependent, find a nontrivial linear combination of the vectors equal to 0.
Write
This gives the matrix equation
Row reduce to solve the system:
This gives the equations
Hence, and . Set . Then and . Therefore, the set is dependent, and
To summarize, to determine whether vectors $v_1$, $v_2$, ..., $v_n$ in a vector space V are independent, I try to solve
$$a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = 0.$$
If the only solution is $a_1 = a_2 = \cdots = a_n = 0$, then the vectors are independent; otherwise, they are dependent.
It's important to understand this general setup, and not just memorize the special case of vectors in $F^n$, as shown in the last few examples. Remember that vectors don't have to look like things like "$(a_1, a_2, \ldots, a_n)$" ("numbers in slots"). Consider the next example, for instance.
Example. The polynomials with real coefficients form a vector space over the reals. Show that the given set of polynomials is independent.
Suppose
That is,
Two polynomials are equal if and only if their corresponding coefficients are equal. Hence, all of the coefficients in the linear combination must be 0. Therefore, the set is independent.
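Here's a sketch of the same kind of computation in Python with sympy, using a made-up set of polynomials (which need not be the set from the example): write out the linear combination, collect coefficients, and solve the resulting system.

    from sympy import symbols, Poly, linsolve

    x, a, b, c = symbols('x a b c')

    # Hypothetical set of polynomials (not necessarily the set from the example above).
    p1, p2, p3 = 1 + x, x + x**2, 1 + x**2

    combo = Poly(a*p1 + b*p2 + c*p3, x)
    equations = combo.all_coeffs()          # each coefficient must equal 0
    print(equations)                        # [b + c, a + b, a + c]
    print(linsolve(equations, a, b, c))     # {(0, 0, 0)}: only the trivial solution, so the set is independent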
In some cases, you can tell by inspection that a set is dependent. I noted earlier that a set containing the zero vector must be dependent. Here's another easy case.
Proposition. If $n > m$, a set of n vectors in $F^m$ is dependent.

Proof. Suppose $v_1, v_2, \ldots, v_n$ are n vectors in $F^m$, and $n > m$. Write
$$a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = 0.$$
This gives the matrix equation
$$\begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix} \begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix},$$
where the columns of the matrix on the left are the v's. To solve, I'd row reduce the augmented matrix
$$\left[ \begin{array}{cccc|c} v_1 & v_2 & \cdots & v_n & 0 \end{array} \right].$$
Note that the coefficient matrix has m rows and n columns, and $n > m$.
The row-reduced echelon form can have at most one leading coefficient in each row, so there are at most m leading coefficients. These correspond to the main variables in the solution. Since there are n variables and $n > m$, there must be some parameter variables. By setting any parameter variables equal to nonzero numbers, I get a nontrivial solution for $a_1$, $a_2$, ..., $a_n$. This implies that $\{v_1, v_2, \ldots, v_n\}$ is dependent.
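For a concrete (made-up) instance of the proposition, here is a quick sympy check with three vectors in $\mathbb{R}^2$: there are more columns than rows, so there must be a free variable and hence a nontrivial solution.

    from sympy import Matrix

    # Three hypothetical vectors in R^2, as the columns of A.
    A = Matrix([[1, 3, 2],
                [2, 1, 4]])

    print(A.rref()[1])           # (0, 1): at most one pivot per row, so at most 2 pivots
    print(A.nullspace())         # [Matrix([[-2], [0], [1]])]: a nontrivial dependence relation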
Example. Is the following set of vectors in independent or dependent?
Any set of three (or more) vectors in is dependent.
Notice that we know this by just counting the number of vectors. To answer the given question, we don't actually have to give a nontrivial linear combination of the vectors that's equal to 0.
Proposition. Let $v_1, v_2, \ldots, v_n$ be vectors in $F^n$, where F is a field. $\{v_1, v_2, \ldots, v_n\}$ is independent if and only if the matrix constructed using the vectors as columns is invertible:
$$\begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix}.$$
Proof. Suppose the set is independent. Consider the system
$$\begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix} \begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{bmatrix} = 0.$$
Multiplying out the left side, this gives
$$a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = 0.$$
By independence, $a_1 = a_2 = \cdots = a_n = 0$. Thus, the system above has only the zero vector 0 as a solution. An earlier theorem on invertibility shows that this means the matrix of v's is invertible.
Conversely, suppose the following matrix is invertible:
$$A = \begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix}.$$
Let
$$a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = 0.$$
Write this as a matrix equation and solve it:
$$A \begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{bmatrix} = 0, \quad\text{so}\quad \begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{bmatrix} = A^{-1} \cdot 0 = 0.$$
This gives $a_1 = a_2 = \cdots = a_n = 0$. Hence, the v's are independent.
Note that this proposition requires that you have n vectors in $F^n$ --- the number of vectors must match the dimension of the space.
The result can also be stated in contrapositive form: The set of vectors is dependent if and only if the matrix having the vectors as columns is not invertible. I'll use this form in the next example.
Example. Consider the following set of vectors in :
For what values of x is the set dependent?
I have 3 vectors in a 3-dimensional space, so the previous result applies.
Construct the matrix having the vectors as columns:
The set is dependent when A is not invertible, and A is not invertible when its determinant is equal to 0. Now
Setting $\det A = 0$ and solving gives the values of x for which A is not invertible. For those values of x, the original set is dependent.
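Here is how this kind of determinant computation can be checked in sympy, using a made-up matrix whose entries involve x (not the matrix from the example above).

    from sympy import Matrix, symbols, solve

    x = symbols('x')

    # Hypothetical vectors in R^3, with entries involving x, as the columns of A.
    A = Matrix([[1, 0, 1],
                [2, x, 3],
                [0, 1, x]])

    d = A.det()
    print(d)                     # x**2 - 1
    print(solve(d, x))           # [-1, 1]: the columns are dependent exactly for these values of x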
The next proposition says that an independent set can be thought of as a set without "redundancy", in the sense that you can't build any one of the vectors out of the others.
Proposition. Let V be a vector space over a field F, and let $S \subseteq V$. S is dependent if and only if some $v \in S$ can be expressed as a linear combination of other vectors in S.
Proof. Suppose $v \in S$ can be written as a linear combination of vectors in S other than v:
$$v = a_1 v_1 + a_2 v_2 + \cdots + a_n v_n.$$
Here $v_1, v_2, \ldots, v_n \in S$ (where $v_i \ne v$ for all i) and $a_1, a_2, \ldots, a_n \in F$.
Then
$$(-1) \cdot v + a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = 0.$$
This is a nontrivial linear relation among elements of S, since the coefficient of v is $-1 \ne 0$. Hence, S is dependent.
Conversely, suppose S is dependent. Then there are elements $a_1, a_2, \ldots, a_n \in F$ (not all 0) and $v_1, v_2, \ldots, v_n \in S$ such that
$$a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = 0.$$
Since not all the a's are 0, at least one is nonzero. There's no harm in assuming that $a_1 \ne 0$. (If another a was nonzero instead, just relabel the a's and v's so $a_1 \ne 0$ and start again.)
Since $a_1 \ne 0$, its inverse $a_1^{-1}$ is defined. So
$$v_1 = (-a_1^{-1} a_2) v_2 + (-a_1^{-1} a_3) v_3 + \cdots + (-a_1^{-1} a_n) v_n.$$
Thus, I've expressed $v_1$ as a linear combination of other vectors in S.
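Computationally, exhibiting one vector as a combination of the others amounts to solving a linear system. Here is a sketch using Python's sympy library with made-up vectors.

    from sympy import Matrix, linsolve, symbols

    a, b = symbols('a b')

    # A hypothetical dependent set in R^3.
    v1 = Matrix([1, 0, 2])
    v2 = Matrix([0, 1, 1])
    v3 = Matrix([3, -1, 5])

    # Solve a*v1 + b*v2 = v3 for the coefficients a and b.
    system = list(a*v1 + b*v2 - v3)          # each entry must equal 0
    print(linsolve(system, a, b))            # {(3, -1)}: v3 = 3*v1 - v2, so the set is dependent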
Independence of sets of functions
In some cases, we can use a determinant to tell that a finite set of functions is independent. The determinant is called the Wronskian, and its rows are the successive derivatives of the original functions.
Definition. Let $\{f_1, f_2, \ldots, f_n\}$ be a set of functions which are differentiable $n - 1$ times. The Wronskian is
$$W(x) = \det \begin{bmatrix}
f_1(x) & f_2(x) & \cdots & f_n(x) \\
f_1'(x) & f_2'(x) & \cdots & f_n'(x) \\
\vdots & \vdots & & \vdots \\
f_1^{(n-1)}(x) & f_2^{(n-1)}(x) & \cdots & f_n^{(n-1)}(x)
\end{bmatrix}.$$
Naturally, this requires that the functions be sufficiently differentiable.
Theorem. Let V be the real vector space of functions $f: \mathbb{R} \to \mathbb{R}$ which are differentiable $n - 1$ times. Let $S = \{f_1, f_2, \ldots, f_n\}$ be a subset of V. If $W(x_0) \ne 0$ at some point $x_0$, then S is independent.
Thus, if you can find some value of x for which the Wronskian is nonzero, the functions are independent.
The converse is false, and we'll give a counterexample to the converse below. The converse does hold with additional conditions: For example, if the functions are solutions to a linear differential equation.
Proof. Let
$$a_1 f_1 + a_2 f_2 + \cdots + a_n f_n = 0 \quad (\text{the zero function}).$$
I have to show all the a's are 0.
This equation is an identity in x, so I may differentiate it repeatedly to get n equations:
$$\begin{aligned}
a_1 f_1(x) + a_2 f_2(x) + \cdots + a_n f_n(x) &= 0 \\
a_1 f_1'(x) + a_2 f_2'(x) + \cdots + a_n f_n'(x) &= 0 \\
&\vdots \\
a_1 f_1^{(n-1)}(x) + a_2 f_2^{(n-1)}(x) + \cdots + a_n f_n^{(n-1)}(x) &= 0
\end{aligned}$$
I can write this in matrix form:
$$\begin{bmatrix}
f_1(x) & f_2(x) & \cdots & f_n(x) \\
f_1'(x) & f_2'(x) & \cdots & f_n'(x) \\
\vdots & \vdots & & \vdots \\
f_1^{(n-1)}(x) & f_2^{(n-1)}(x) & \cdots & f_n^{(n-1)}(x)
\end{bmatrix}
\begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{bmatrix}
=
\begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix}.$$
Plug in $x = x_0$, and let A denote the resulting $n \times n$ matrix of numbers. The determinant of this matrix is the Wronskian $W(x_0)$, which by assumption is nonzero. Since the determinant is nonzero, the matrix is invertible. So
$$\begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{bmatrix} = A^{-1} \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix}.$$
Since $a_1 = a_2 = \cdots = a_n = 0$, the functions are independent.
Example. Let V denote the vector space over $\mathbb{R}$ consisting of twice-differentiable functions $f: \mathbb{R} \to \mathbb{R}$. Demonstrate that the set of functions is independent in V.
Compute the Wronskian:
I can find values of x for which the Wronskian is nonzero. Hence, the set is independent.
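sympy has a built-in wronskian function, so computations like this one can be checked by machine. Here is a sketch with a hypothetical set of functions (not necessarily the set from the example above).

    from sympy import symbols, exp, sin, cos, wronskian, simplify

    x = symbols('x')

    # A hypothetical set of (infinitely) differentiable functions.
    W = wronskian([exp(x), sin(x), cos(x)], x)
    print(simplify(W))           # -2*exp(x): nonzero for every x
    print(W.subs(x, 0))          # -2: a single point with W != 0 is enough to conclude independence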
The next example shows that the converse of the last theorem is false: You can have a set of independent functions whose Wronskian is always 0 (so there's no point where the Wronskian is nonzero).
Example. Let V denote the vector space over $\mathbb{R}$ consisting of differentiable functions $f: \mathbb{R} \to \mathbb{R}$. Let f and g be the following two functions:
Show that $\{f, g\}$ is independent in V, but $W(x) = 0$ for all x.
Note: You can check that g is differentiable at 0, and .
For independence, suppose that and for all . Plugging in , I get
Plugging in (and noting that ), I get
Adding and gives , so . Plugging into gives . This proves that is independent.
The Wronskian is
$$W(x) = \det \begin{bmatrix} f(x) & g(x) \\ f'(x) & g'(x) \end{bmatrix} = f(x)\, g'(x) - g(x)\, f'(x).$$
I'll take cases.
Since and , I have .
If , I have and , so
If , I have and , so
This shows that $W(x) = 0$ for all x.
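The classic pair exhibiting this phenomenon is $f(x) = x^2$ and $g(x) = x\,|x|$ (which may or may not be the exact pair used above). Here is a quick numerical sketch in Python checking that their Wronskian vanishes at several sample points.

    # Hypothetical choice: f(x) = x^2 and g(x) = x*|x|, a standard example of this phenomenon.
    def f(x):  return x**2
    def fp(x): return 2*x                    # f'(x)
    def g(x):  return x*abs(x)               # equals x^2 for x >= 0 and -x^2 for x < 0
    def gp(x): return 2*abs(x)               # g'(x); in particular g'(0) = 0

    def wronskian_2x2(x):
        return f(x)*gp(x) - g(x)*fp(x)       # det [[f, g], [f', g']]

    print([wronskian_2x2(t) for t in (-2.0, -0.5, 0.0, 0.5, 2.0)])   # [0.0, 0.0, 0.0, 0.0, 0.0]

A numerical spot check is not a proof, of course; the case analysis above is what actually shows the Wronskian is 0 for every x.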
Copyright 2022 by Bruce Ikenaga