Linear Independence

Definition. Let V be a vector space over a field F, and let $S \subset
   V$ . The set S is linearly independent if, whenever $v_1, \ldots, v_n \in S$ are distinct vectors and $a_1, \ldots, a_n \in F$ ,

$$a_1 v_1 + \cdots + a_n v_n = \vec{0} \quad \hbox{implies} \quad a_1 = \cdots = a_n = 0.$$

An equation like the one above is called a linear relationship among the $v_i$ ; if at least one of the coefficients $a_i$ is nonzero, it is a nontrivial linear relationship. Thus, a set of vectors is independent if there is no nontrivial linear relationship among finitely many of the vectors.

A set of vectors which is not linearly independent is linearly dependent. (I'll usually say "independent" and "dependent" for short.) Thus, a set of vectors S is dependent if there are vectors $v_1, \ldots, v_n \in S$ and numbers $a_1, \ldots, a_n \in F$ , not all of which are 0, such that

$$a_1 v_1 + \cdots + a_n v_n = \vec{0}.$$


Example. If F is a field, the standard basis vectors are

$$\eqalign{ e_1 & = (1, 0, 0, \ldots 0) \cr e_2 & = (0, 1, 0, \ldots 0) \cr & \vdots \cr e_n & = (0, 0, 0, \ldots 1) \cr}$$

Show that they form an independent set in $F^n$ .

Write

$$a_1 e_1 + a_2 e_2 + \cdots + a_n e_n = \vec{0}.$$

I have to show all the a's are 0. Now

$$\eqalign{ a_1 e_1 & = (a_1, 0, 0, \ldots 0) \cr a_2 e_2 & = (0, a_2, 0, \ldots 0) \cr & \vdots \cr a_n e_n & = (0, 0, 0, \ldots a_n) \cr}$$

So

$$a_1 e_1 + a_2 e_2 + \cdots + a_n e_n = (a_1, a_2, \ldots a_n).$$

Since by assumption $a_1
   e_1 + a_2 e_2 + \cdots + a_n e_n = \vec{0}$ , I get

$$(a_1, a_2, \ldots a_n) = (0, 0, \ldots 0).$$

Hence, $a_1 = a_2 = \cdots
   = a_n = 0$ , and the set is independent.
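The computation above is easy to mirror in code. The following Python sketch (the helper `e` is a hypothetical name, not a library function) shows that a combination of the standard basis vectors just reproduces the tuple of coefficients, so it can be $\vec{0}$ only if every coefficient is 0:

```python
# A combination a_1 e_1 + ... + a_n e_n of standard basis vectors is the
# tuple (a_1, ..., a_n).  F is modeled by Python ints; `e` is a hypothetical helper.
def e(i, n):
    """The i-th standard basis vector of F^n (0-indexed)."""
    return tuple(1 if k == i else 0 for k in range(n))

a = [7, -2, 0, 5]
comb = tuple(sum(a[i] * e(i, 4)[k] for i in range(4)) for k in range(4))
print(comb)  # (7, -2, 0, 5): this equals the zero vector only if every a_i is 0
```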


Example. Show that any set containing the zero vector is dependent.

If $\vec{0} \in S$ , then $1 \cdot \vec{0} = \vec{0}$ is a nontrivial linear relationship in S.


Example. Show that the vectors $(2, -1, 4)$ and $(-6,
   3, -12)$ are dependent in $\real^3$ .

I have to find numbers a and b, not both 0, such that

$$a \cdot (2, -1, 4) + b \cdot (-6, 3, -12) = (0, 0, 0).$$

In this case, you can probably juggle numbers in your head to see that

$$3 \cdot (2, -1, 4) + 1 \cdot (-6, 3, -12) = (0, 0, 0).$$

This shows that the vectors are dependent. There are infinitely many pairs of numbers a and b that work. In examples to follow, I'll show how to find numbers systematically in cases where the arithmetic isn't so easy.
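A relation like this is easy to check by machine. Here is a short Python sketch (the helper `lin_comb` is a hypothetical name, not a library function):

```python
# Check the dependence relation 3*(2,-1,4) + 1*(-6,3,-12) = (0,0,0).
def lin_comb(coeffs, vectors):
    """Return the linear combination sum_i coeffs[i]*vectors[i], componentwise."""
    n = len(vectors[0])
    return tuple(sum(c * v[k] for c, v in zip(coeffs, vectors)) for k in range(n))

print(lin_comb([3, 1], [(2, -1, 4), (-6, 3, -12)]))  # (0, 0, 0)
```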


Example. Suppose u, v, w, and x are vectors in a vector space. Prove that the set $\{u - v, v - w, w - x, x -
   u\}$ is dependent.

Notice that in the four vectors in $\{u - v, v - w, w - x, x -
   u\}$ , each of u, v, w, and x occurs once with a plus sign and once with a minus sign. So

$$(u - v) + (v - w) + (w - x) + (x - u) = \vec{0}.$$

This is a dependence relation, so the set is dependent.


If you can't see an "easy" linear combination of a set of vectors that equals $\vec{0}$ , you may have to determine independence or dependence by solving a system of equations.

Example. Consider the following sets of vectors in $\real^3$ . If the set is independent, prove it. If the set is dependent, find a nontrivial linear combination of the vectors which is equal to $\vec{0}$ .

(a) $\{(2, 0, -3), (1, 1,
   1), (1, 7, 2)\}$ .

(b) $\{(1, 2, -1), (4, 1,
   3), (-10, 1, -11)\}$ .

(a) Write a linear combination of the vectors and set it equal to $\vec{0}$ :

$$a \cdot \left[\matrix{2 \cr 0 \cr -3 \cr}\right] + b \cdot \left[\matrix{1 \cr 1 \cr 1 \cr}\right] + c \cdot \left[\matrix{1 \cr 7 \cr 2 \cr}\right] = \left[\matrix{0 \cr 0 \cr 0 \cr}\right].$$

I have to determine whether this implies that $a = b = c = 0$ .

Note: When you convert vectors given in "parenthesis form" to "matrix form", you turn the vectors into column vectors as above. This is consistent with the way I've set up systems of linear equations. Thus,

$$(2, 0, -3) \quad\hbox{became}\quad \left[\matrix{2 \cr 0 \cr -3 \cr}\right].$$

The vector equation above is equivalent to the matrix equation

$$\left[\matrix{ 2 & 1 & 1 \cr 0 & 1 & 7 \cr -3 & 1 & 2 \cr}\right] \left[\matrix{a \cr b \cr c \cr}\right] = \left[\matrix{0 \cr 0 \cr 0 \cr}\right].$$

Row reduce to solve:

$$\left[\matrix{ 2 & 1 & 1 & 0 \cr 0 & 1 & 7 & 0 \cr -3 & 1 & 2 & 0 \cr}\right] \quad \to \quad \left[\matrix{ 1 & 0 & 0 & 0 \cr 0 & 1 & 0 & 0 \cr 0 & 0 & 1 & 0 \cr}\right]$$

Note: Row operations won't change the last column of zeros, so you don't actually need to write it when you do the row reduction. I'll put it in to avoid confusion.

The last matrix gives the equations

$$a = 0, \quad b = 0, \quad c = 0.$$

Therefore, the vectors are independent.
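The row reduction can be automated over the rationals using exact arithmetic. This is a minimal sketch (the `rref` helper is hypothetical, not a library routine); a pivot in every column means the homogeneous system has only the trivial solution:

```python
from fractions import Fraction

def rref(rows):
    """Row reduce a matrix (list of rows) over the rationals.
    Returns the reduced matrix and the list of pivot columns."""
    R = [[Fraction(x) for x in row] for row in rows]
    pivots, r = [], 0
    for c in range(len(R[0])):
        piv = next((i for i in range(r, len(R)) if R[i][c] != 0), None)
        if piv is None:
            continue
        R[r], R[piv] = R[piv], R[r]
        pv = R[r][c]
        R[r] = [x / pv for x in R[r]]
        for i in range(len(R)):
            if i != r and R[i][c] != 0:
                f = R[i][c]
                R[i] = [a - f * b for a, b in zip(R[i], R[r])]
        pivots.append(c)
        r += 1
        if r == len(R):
            break
    return R, pivots

# Columns of A are the vectors (2, 0, -3), (1, 1, 1), (1, 7, 2).
A = [[2, 1, 1], [0, 1, 7], [-3, 1, 2]]
R, pivots = rref(A)
print(pivots)  # [0, 1, 2]: a pivot in every column, so a = b = c = 0 is forced
```

`Fraction` avoids the floating-point roundoff that could make a zero pivot look nonzero.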

(b) Write

$$a \cdot \left[\matrix{1 \cr 2 \cr -1 \cr}\right] + b \cdot \left[\matrix{4 \cr 1 \cr 3 \cr}\right] + c \cdot \left[\matrix{-10 \cr 1 \cr -11 \cr}\right] = \left[\matrix{0 \cr 0 \cr 0 \cr}\right].$$

This gives the matrix equation

$$\left[\matrix{ 1 & 4 & -10 \cr 2 & 1 & 1 \cr -1 & 3 & -11 \cr}\right] \left[\matrix{a \cr b \cr c \cr}\right] = \left[\matrix{0 \cr 0 \cr 0 \cr}\right].$$

Row reduce to solve:

$$\left[\matrix{ 1 & 4 & -10 & 0 \cr 2 & 1 & 1 & 0 \cr -1 & 3 & -11 & 0 \cr}\right] \quad \to \quad \left[\matrix{ 1 & 0 & 2 & 0 \cr 0 & 1 & -3 & 0 \cr 0 & 0 & 0 & 0 \cr}\right]$$

This gives the equations

$$a + 2 c = 0, \quad b - 3 c = 0.$$

Thus, $a = - 2 c$ and $b = 3 c$ . I can get a nontrivial solution by setting c to any nonzero number. I'll use $c = 1$ . This gives $a = -2$ and $b = 3$ . So

$$(-2) \cdot \left[\matrix{1 \cr 2 \cr -1 \cr}\right] + 3 \cdot \left[\matrix{4 \cr 1 \cr 3 \cr}\right] + 1 \cdot \left[\matrix{-10 \cr 1 \cr -11 \cr}\right] = \left[\matrix{0 \cr 0 \cr 0 \cr}\right].$$

This is a linear dependence relation, and the vectors are dependent.
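As a quick numerical check of the relation just found (illustrative code, not part of the original text):

```python
# Verify (-2)*(1,2,-1) + 3*(4,1,3) + 1*(-10,1,-11) = (0,0,0).
coeffs = [-2, 3, 1]
vectors = [(1, 2, -1), (4, 1, 3), (-10, 1, -11)]
result = tuple(sum(c * v[k] for c, v in zip(coeffs, vectors)) for k in range(3))
print(result)  # (0, 0, 0)
```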


The same approach works for vectors in $F^n$ where F is a field other than the real numbers.

Example. Consider the set of vectors

$$\left\{(4, 1, 2), (3, 3, 0), (0, 1, 1)\right\} \quad\hbox{in}\quad \integer_5^3.$$

If the set is independent, prove it. If the set is dependent, find a nontrivial linear combination of the vectors which is equal to $\vec{0}$ .

Write

$$a \cdot \left[\matrix{4 \cr 1 \cr 2 \cr}\right] + b \cdot \left[\matrix{3 \cr 3 \cr 0 \cr}\right] + c \cdot \left[\matrix{0 \cr 1 \cr 1 \cr}\right] = \left[\matrix{0 \cr 0 \cr 0 \cr}\right].$$

This gives the matrix equation

$$\left[\matrix{ 4 & 3 & 0 \cr 1 & 3 & 1 \cr 2 & 0 & 1 \cr}\right] \left[\matrix{a \cr b \cr c \cr}\right] = \left[\matrix{0 \cr 0 \cr 0 \cr}\right].$$

Row reduce to solve the system:

$$\left[\matrix{ 4 & 3 & 0 & 0 \cr 1 & 3 & 1 & 0 \cr 2 & 0 & 1 & 0 \cr}\right] \matrix{\to \cr r_{1} \to 4 r_{1} \cr} \left[\matrix{ 1 & 2 & 0 & 0 \cr 1 & 3 & 1 & 0 \cr 2 & 0 & 1 & 0 \cr}\right] \matrix{\to \cr r_{3} \to r_{3} + 3 r_{1} \cr}$$

$$\left[\matrix{ 1 & 2 & 0 & 0 \cr 1 & 3 & 1 & 0 \cr 0 & 1 & 1 & 0 \cr}\right] \matrix{\to \cr r_{2} \to r_{2} + 4 r_{1} \cr} \left[\matrix{ 1 & 2 & 0 & 0 \cr 0 & 1 & 1 & 0 \cr 0 & 1 & 1 & 0 \cr}\right] \matrix{\to \cr r_{1} \to r_{1} + 3 r_{2} \cr}$$

$$\left[\matrix{ 1 & 0 & 3 & 0 \cr 0 & 1 & 1 & 0 \cr 0 & 1 & 1 & 0 \cr}\right] \matrix{\to \cr r_{3} \to r_{3} + 4 r_{2} \cr} \left[\matrix{ 1 & 0 & 3 & 0 \cr 0 & 1 & 1 & 0 \cr 0 & 0 & 0 & 0 \cr}\right]$$

This gives the equations

$$a + 3 c = 0, \quad b + c = 0.$$

Thus, $a = 2 c$ and $b = 4 c$ . Set $c = 1$ . This gives $a = 2$ and $b = 4$ . Hence, the set is dependent, and

$$2 \cdot \left[\matrix{4 \cr 1 \cr 2 \cr}\right] + 4 \cdot \left[\matrix{3 \cr 3 \cr 0 \cr}\right] + 1 \cdot \left[\matrix{0 \cr 1 \cr 1 \cr}\right] = \left[\matrix{0 \cr 0 \cr 0 \cr}\right].\quad\halmos$$
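The arithmetic in $\integer_5$ can be double-checked by reducing each component mod 5 (illustrative code, not part of the original text):

```python
# Verify 2*(4,1,2) + 4*(3,3,0) + 1*(0,1,1) = (0,0,0) in Z_5^3.
coeffs = [2, 4, 1]
vectors = [(4, 1, 2), (3, 3, 0), (0, 1, 1)]
result = tuple(sum(c * v[k] for c, v in zip(coeffs, vectors)) % 5 for k in range(3))
print(result)  # (0, 0, 0)
```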


Example. Consider the following set of vectors in $\integer_3^4$ :

$$\left\{(1, 0, 1, 2), (1, 2, 2, 1), (0, 1, 2, 1)\right\}$$

If the set is independent, prove it. If the set is dependent, find a nontrivial linear combination of the vectors equal to $\vec{0}$ .

Write

$$a \cdot \left[\matrix{1 \cr 0 \cr 1 \cr 2 \cr}\right] + b \cdot \left[\matrix{1 \cr 2 \cr 2 \cr 1 \cr}\right] + c \cdot \left[\matrix{0 \cr 1 \cr 2 \cr 1 \cr}\right] = \left[\matrix{0 \cr 0 \cr 0 \cr 0 \cr}\right].$$

This gives the matrix equation

$$\left[\matrix{ 1 & 1 & 0 \cr 0 & 2 & 1 \cr 1 & 2 & 2 \cr 2 & 1 & 1 \cr}\right] \left[\matrix{a \cr b \cr c \cr}\right] = \left[\matrix{0 \cr 0 \cr 0 \cr 0 \cr}\right].$$

Row reduce to solve the system:

$$\left[\matrix{ 1 & 1 & 0 & 0 \cr 0 & 2 & 1 & 0 \cr 1 & 2 & 2 & 0 \cr 2 & 1 & 1 & 0 \cr}\right] \quad \to \quad \left[\matrix{ 1 & 0 & 1 & 0 \cr 0 & 1 & 2 & 0 \cr 0 & 0 & 0 & 0 \cr 0 & 0 & 0 & 0 \cr}\right]$$

This gives the equations

$$a + c = 0, \quad b + 2 c = 0.$$

Hence, $a = 2 c$ and $b = c$ . Set $c = 1$ . Then $a =
   2$ and $b = 1$ . Therefore, the set is dependent, and

$$2 \cdot \left[\matrix{1 \cr 0 \cr 1 \cr 2 \cr}\right] + 1 \cdot \left[\matrix{1 \cr 2 \cr 2 \cr 1 \cr}\right] + 1 \cdot \left[\matrix{0 \cr 1 \cr 2 \cr 1 \cr}\right] = \left[\matrix{0 \cr 0 \cr 0 \cr 0 \cr}\right].\quad\halmos$$
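Row reduction over $\integer_p$ can also be automated. The sketch below (the `rref_mod` helper is hypothetical; the three-argument `pow` computes a modular inverse, available since Python 3.8) reduces the coefficient matrix over $\integer_3$ and recovers the same two equations:

```python
def rref_mod(rows, p):
    """Row reduce a matrix over the field Z_p (p prime).
    Returns the reduced matrix and the list of pivot columns."""
    R = [[x % p for x in row] for row in rows]
    pivots, r = [], 0
    for c in range(len(R[0])):
        piv = next((i for i in range(r, len(R)) if R[i][c] != 0), None)
        if piv is None:
            continue
        R[r], R[piv] = R[piv], R[r]
        inv = pow(R[r][c], -1, p)        # modular inverse (Python 3.8+)
        R[r] = [x * inv % p for x in R[r]]
        for i in range(len(R)):
            if i != r and R[i][c] != 0:
                f = R[i][c]
                R[i] = [(a - f * b) % p for a, b in zip(R[i], R[r])]
        pivots.append(c)
        r += 1
        if r == len(R):
            break
    return R, pivots

# Rows of the coefficient matrix; its columns are the three vectors, entries in Z_3.
A = [[1, 1, 0], [0, 2, 1], [1, 2, 2], [2, 1, 1]]
R, pivots = rref_mod(A, 3)
print(R[0], R[1])  # [1, 0, 1] [0, 1, 2], i.e. a + c = 0 and b + 2c = 0
```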


To summarize, to determine whether vectors $v_1$ , $v_2$ , ..., $v_m$ in a vector space V are independent, I try to solve

$$a_1 v_1 + a_2 v_2 + \cdots + a_m v_m = 0.$$

If the only solution is $a_1 = a_2 = \cdots = a_m = 0$ , then the vectors are independent; otherwise, they are dependent.

It's important to understand this general setup, and not just memorize the special case of vectors in $F^n$ , as in the last few examples. Remember that vectors don't have to be things like "$(-3, 5, 7, 0)$ " ("numbers in slots"). Consider the next example, for instance.

Example. $\real[x]$ is a vector space over the reals. Show that the set $\{1, x, x^2, \ldots \}$ is independent.

Suppose

$$a_0 + a_1 x + a_2 x^2 + \cdots + a_n x^n = 0.$$

That is,

$$a_0 + a_1 x + a_2 x^2 + \cdots + a_n x^n = 0 + 0 \cdot x + 0 \cdot x^2 + \cdots + 0 \cdot x^n.$$

Two polynomials are equal if and only if their corresponding coefficients are equal. Hence, $a_0 = a_1 = \cdots = a_n = 0$ . Therefore, $\{1, x, x^2, \ldots \}$ is independent.


In some cases, you can tell by inspection that a set is dependent. I noted earlier that a set containing the zero vector must be dependent. Here's another easy case.

Proposition. If $n > m$ , a set of n vectors in $F^m$ is dependent.

Proof. Suppose $v_1, v_2, \ldots v_n$ are n vectors in $F^m$ , and $n >
   m$ . Write

$$a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = \vec{0}.$$

This gives the matrix equation

$$\left[\matrix{ \uparrow & \uparrow & & \uparrow \cr v_1 & v_2 & & v_n \cr \downarrow & \downarrow & & \downarrow \cr}\right] \left[\matrix{a_1 \cr a_2 \cr \vdots \cr a_n \cr}\right] = \left[\matrix{0 \cr 0 \cr \vdots \cr 0 \cr}\right].$$

To solve, I'd row reduce the matrix

$$\left[\matrix{ \uparrow & \uparrow & & \uparrow & 0 \cr v_1 & v_2 & & v_n & \vdots \cr \downarrow & \downarrow & & \downarrow & 0 \cr}\right]$$

Note that this matrix has m rows and $n + 1$ columns, and $n > m$ .

The row-reduced echelon form can have at most one leading coefficient in each row, so there are at most m leading coefficients. These correspond to the main variables in the solution. Since there are n variables and $n > m$ , there must be some parameter variables. By setting the parameter variables equal to nonzero numbers, I get a nontrivial solution for $a_1$ , $a_2$ , ... $a_n$ . This implies that $\{v_1, v_2, \ldots v_n\}$ is dependent.

Example. Is the following set of vectors in $\real^2$ independent or dependent?

$$\left\{(1, -3), (5, -\pi), (\sqrt{42}, 7)\right\}$$

Any set of three (or more) vectors in $\real^2$ is dependent.


Proposition. Let $\{v_1, v_2, \ldots v_n\}$ be vectors in $F^n$ , where F is a field. $\{v_1, v_2, \ldots v_n\}$ is independent if and only if the matrix constructed using the vectors as columns is invertible:

$$\left[\matrix{ \uparrow & \uparrow & & \uparrow \cr v_1 & v_2 & & v_n \cr \downarrow & \downarrow & & \downarrow \cr}\right].$$

Proof. Suppose the set is independent. Consider the system

$$\left[\matrix{ \uparrow & \uparrow & & \uparrow \cr v_1 & v_2 & & v_n \cr \downarrow & \downarrow & & \downarrow \cr}\right] \left[\matrix{a_1 \cr a_2 \cr \vdots \cr a_n \cr}\right] = \left[\matrix{0 \cr 0 \cr \vdots \cr 0 \cr}\right].$$

Multiplying out the left side, this gives

$$a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = \vec{0}.$$

By independence, $a_1 =
   a_2 = \cdots = a_n = 0$ . Thus, the system above has only $\vec{0}$ as a solution. An earlier theorem on invertibility shows that this means the matrix of v's is invertible.

Conversely, suppose the following matrix is invertible:

$$A = \left[\matrix{ \uparrow & \uparrow & & \uparrow \cr v_1 & v_2 & & v_n \cr \downarrow & \downarrow & & \downarrow \cr}\right].$$

Let

$$a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = \vec{0}.$$

Write this as a matrix equation and solve it:

$$\eqalign{ \left[\matrix{ \uparrow & \uparrow & & \uparrow \cr v_1 & v_2 & & v_n \cr \downarrow & \downarrow & & \downarrow \cr}\right] \left[\matrix{a_1 \cr a_2 \cr \vdots \cr a_n \cr}\right] & = \left[\matrix{0 \cr 0 \cr \vdots \cr 0 \cr}\right] \cr A \cdot \left[\matrix{a_1 \cr a_2 \cr \vdots \cr a_n \cr}\right] & = \left[\matrix{0 \cr 0 \cr \vdots \cr 0 \cr}\right] \cr A^{-1} A \cdot \left[\matrix{a_1 \cr a_2 \cr \vdots \cr a_n \cr}\right] & = A^{-1} \cdot \left[\matrix{0 \cr 0 \cr \vdots \cr 0 \cr}\right] \cr \left[\matrix{a_1 \cr a_2 \cr \vdots \cr a_n \cr}\right] & = \left[\matrix{0 \cr 0 \cr \vdots \cr 0 \cr}\right] \cr}$$

This gives $a_1 = a_2 =
   \cdots = a_n = 0$ . Hence, the v's are independent.

Note that this proposition requires that you have n vectors in $F^n$ --- the number of vectors must match the dimension of the space.

The result can also be stated in contrapositive form: The set of vectors is dependent if and only if the matrix having the vectors as columns is not invertible. I'll use this form in the next example.

Example. Consider the following set of vectors in $\real^3$ :

$$\left\{ \left[\matrix{x - 8 \cr 0 \cr 0 \cr}\right], \left[\matrix{-7 \cr x - 1 \cr 0 \cr}\right], \left[\matrix{13 \cr 5 \cr 30 \cr}\right]\right\}.$$

For what values of x is the set dependent?

I have 3 vectors in $\real^3$ , so the previous result applies.

Construct the matrix having the vectors as columns:

$$A = \left[\matrix{ x - 8 & -7 & 13 \cr 0 & x - 1 & 5 \cr 0 & 0 & 30 \cr}\right].$$

The set is dependent when A is not invertible, and A is not invertible when its determinant is equal to 0. Now

$$\det A = 30(x - 1)(x - 8).$$

Thus, $\det A = 0$ for $x = 1$ and $x =
   8$ . For those values of x, the original set is dependent.
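Since A is upper triangular, its determinant is the product of the diagonal entries, which gives $30(x - 1)(x - 8)$ directly. As a sanity check, this Python sketch (the `det3` helper is hypothetical) compares a cofactor expansion against that formula at several points; both sides are quadratic in x, so agreement at more than two points proves they are equal:

```python
def det3(M):
    """3x3 determinant by cofactor expansion along the first row."""
    a, b, c = M[0]
    d, e, f = M[1]
    g, h, i = M[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# Compare det A with 30(x - 1)(x - 8) at sample values of x.
for x in [0, 1, 5, 8, 10]:
    A = [[x - 8, -7, 13], [0, x - 1, 5], [0, 0, 30]]
    assert det3(A) == 30 * (x - 1) * (x - 8)
print("det A = 30(x - 1)(x - 8) at all sample points")
```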


The next proposition says that an independent set can be thought of as a set without "redundancy", in the sense that you can't build any one of the vectors out of the others.

Proposition. Let V be an F-vector space, and let $S \subset V$ . S is dependent if and only if some $v \in S$ can be expressed as a linear combination of other vectors in S.

("Other" means vectors other than v itself.)

Proof. Suppose $v \in S$ can be written as a linear combination of other vectors in S:

$$v = a_1 v_1 + \cdots + a_n v_n.$$

Here $v_1, \ldots, v_n \in
   S$ (where $v_i \ne v$ for all i) and $a_1, \ldots, a_n \in F$ .

Then

$$\vec{0} = a_1 v_1 + \cdots + a_n v_n - 1 \cdot v.$$

This is a nontrivial linear relation among elements of S. Hence, S is dependent.

Conversely, suppose S is dependent. Then there are elements $a_1, a_2, \ldots a_n \in F$ (not all 0) and $v_1, v_2, \ldots v_n \in S$ such that

$$a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = 0.$$

Since not all the a's are 0, at least one is nonzero. There's no harm in assuming that $a_1 \ne 0$ . (If another a were nonzero instead, just relabel the a's and v's so $a_1 \ne 0$ and start again.)

Since $a_1 \ne 0$ , its inverse $a_1^{-1}$ is defined. So

$$\eqalign{ a_1 v_1 + a_2 v_2 + \cdots + a_n v_n & = \vec{0} \cr a_1 v_1 & = -a_2 v_2 - \cdots - a_n v_n \cr a_1^{-1} a_1 v_1 & = a_1^{-1} (-a_2 v_2 - \cdots - a_n v_n) \cr v_1 & = -a_1^{-1} a_2 v_2 - \cdots - a_1^{-1} a_n v_n \cr}$$

Thus, I've expressed $v_1$ as a linear combination of other vectors in S.
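To make the last step concrete, here is a Python sketch that rebuilds $v_1$ from the other vectors using the $\integer_5$ relation $2 v_1 + 4 v_2 + 1 v_3 = \vec{0}$ found in an earlier example (the three-argument `pow` computes the inverse mod p, Python 3.8+):

```python
# From 2 v1 + 4 v2 + 1 v3 = 0 in Z_5, solve for v1 = -2^{-1}(4 v2 + 1 v3).
p = 5
v2, v3 = (3, 3, 0), (0, 1, 1)
inv = pow(2, -1, p)                      # a_1^{-1} = 2^{-1} = 3 in Z_5
v1 = tuple((-inv * (4 * y + 1 * z)) % p for y, z in zip(v2, v3))
print(v1)  # (4, 1, 2): the first vector, rebuilt from the other two
```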

Definition. Let $\{f_1, f_2, \ldots f_n\}$ be a set of functions $\real \to \real$ which are differentiable $n - 1$ times. The Wronskian is

$$W(f_1, f_2, \ldots f_n) = \left|\matrix{ f_1 & f_2 & \cdots & f_n \cr f_1' & f_2' & \cdots & f_n' \cr f_1'' & f_2'' & \cdots & f_n'' \cr \vdots & \vdots & & \vdots \cr f_1^{(n - 1)} & f_2^{(n - 1)} & \cdots & f_n^{(n - 1)} \cr}\right|.$$

Thus, the rows of the determinant consist of successive derivatives of the original functions.

Theorem. Let $S = \{f_1, f_2, \ldots f_n\}$ be a set of functions $\real \to \real$ which are differentiable $n - 1$ times. If $W(f_1, f_2, \ldots f_n) \ne 0$ at some point $x = c$ , then S is independent.

Thus, if you can find some value of x at which the Wronskian is nonzero, the functions are independent.

The converse is false: You can find functions which are independent on an interval in $\real$ , but whose Wronskian is identically 0 on the interval. The converse does hold with additional hypotheses: for example, if the functions are solutions to a linear differential equation.

Proof. Let

$$a_1 f_1(x) + a_2 f_2(x) + \cdots + a_n f_n(x) = 0.$$

I have to show all the a's are 0.

This equation is an identity in x, so I may differentiate it repeatedly to get n equations:

$$\eqalign{ a_1 f_1(x) + a_2 f_2(x) + \cdots + a_n f_n(x) & = 0 \cr a_1 f_1'(x) + a_2 f_2'(x) + \cdots + a_n f_n'(x) & = 0 \cr a_1 f_1''(x) + a_2 f_2''(x) + \cdots + a_n f_n''(x) & = 0 \cr & \vdots \cr a_1 f_1^{(n - 1)}(x) + a_2 f_2^{(n - 1)}(x) + \cdots + a_n f_n^{(n - 1)}(x) & = 0 \cr}$$

I can write this in matrix form:

$$\left[\matrix{ f_1(x) & f_2(x) & \cdots & f_n(x) \cr f_1'(x) & f_2'(x) & \cdots & f_n'(x) \cr f_1''(x) & f_2''(x) & \cdots & f_n''(x) \cr \vdots & \vdots & & \vdots \cr f_1^{(n - 1)}(x) & f_2^{(n - 1)}(x) & \cdots & f_n^{(n - 1)}(x) \cr}\right] \left[\matrix{a_1 \cr a_2 \cr \vdots \cr a_n \cr}\right] = \left[\matrix{0 \cr 0 \cr \vdots \cr 0 \cr}\right].$$

Plug in $x = c$ :

$$\left[\matrix{ f_1(c) & f_2(c) & \cdots & f_n(c) \cr f_1'(c) & f_2'(c) & \cdots & f_n'(c) \cr f_1''(c) & f_2''(c) & \cdots & f_n''(c) \cr \vdots & \vdots & & \vdots \cr f_1^{(n - 1)}(c) & f_2^{(n - 1)}(c) & \cdots & f_n^{(n - 1)}(c) \cr}\right] \left[\matrix{a_1 \cr a_2 \cr \vdots \cr a_n \cr}\right] = \left[\matrix{0 \cr 0 \cr \vdots \cr 0 \cr}\right].$$

Let

$$A = \left[\matrix{ f_1(c) & f_2(c) & \cdots & f_n(c) \cr f_1'(c) & f_2'(c) & \cdots & f_n'(c) \cr f_1''(c) & f_2''(c) & \cdots & f_n''(c) \cr \vdots & \vdots & & \vdots \cr f_1^{(n - 1)}(c) & f_2^{(n - 1)}(c) & \cdots & f_n^{(n - 1)}(c) \cr}\right].$$

The determinant of this matrix is the Wronskian $W(f_1, f_2, \ldots f_n)(c)$ , which by assumption is nonzero. Since the determinant is nonzero, the matrix is invertible. So

$$\eqalign{ A \cdot \left[\matrix{a_1 \cr a_2 \cr \vdots \cr a_n \cr}\right] & = \left[\matrix{0 \cr 0 \cr \vdots \cr 0 \cr}\right] \cr A^{-1} A \cdot \left[\matrix{a_1 \cr a_2 \cr \vdots \cr a_n \cr}\right] & = A^{-1} \cdot \left[\matrix{0 \cr 0 \cr \vdots \cr 0 \cr}\right] \cr \left[\matrix{a_1 \cr a_2 \cr \vdots \cr a_n \cr}\right] & = \left[\matrix{0 \cr 0 \cr \vdots \cr 0 \cr}\right] \cr}$$

Since $a_1 = a_2 = \cdots
   = a_n = 0$ , the functions are independent.

Example. Demonstrate that the set of functions $\{x, x^3, x^5\}$ is independent.

Compute the Wronskian:

$$W(x, x^3, x^5) = \left|\matrix{ x & x^3 & x^5 \cr 1 & 3 x^2 & 5 x^4 \cr 0 & 6 x & 20 x^3 \cr}\right| = 16 x^6.$$

I can find values of x for which the Wronskian is nonzero: for example, if $x = 1$ , then $W(x, x^3, x^5) = 16 \ne 0$ . Hence, $\{x, x^3, x^5\}$ is independent.
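The hand computation can be double-checked numerically. This sketch hard-codes the three functions and their derivatives and evaluates the $3 \times 3$ determinant (the `wronskian_at` helper is hypothetical, not a library function):

```python
# Evaluate W(x, x^3, x^5) from the hand-computed derivative rows.
def wronskian_at(x):
    rows = [[x, x**3,   x**5],
            [1, 3*x**2, 5*x**4],
            [0, 6*x,    20*x**3]]
    a, b, c = rows[0]
    d, e, f = rows[1]
    g, h, i = rows[2]
    # Cofactor expansion along the first row.
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

print(wronskian_at(1))  # 16, which is 16 * 1**6
print(wronskian_at(2))  # 1024, which is 16 * 2**6
```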




Copyright 2017 by Bruce Ikenaga