A set is *independent* if, roughly speaking,
there is no redundancy in the set: You can't "build" any
vector in the set as a linear combination of the others. A set *spans* if you can "build everything" in
the vector space as linear combinations of vectors in the set.
Putting these two ideas together, a *basis* is
an independent spanning set: A set with no redundancy out of which
you can "build everything".

*Definition.* Let V be an F-vector space. A
subset of V is a *basis* if it is
linearly independent and spans V.

The number of elements in a basis for V is called the *dimension* of V, and is denoted dim V.

You might object at this point that I don't yet know that two bases for V must have the same number of elements --- if they could have different numbers of elements, the dimension of V wouldn't make sense. I'll show later that this can't happen.

*Example.* Consider the standard basis for F^n:

{(1, 0, 0, ..., 0), (0, 1, 0, ..., 0), ..., (0, 0, 0, ..., 1)}

As the name implies, this set *is* a basis.

I observed earlier that this set is independent. Moreover,

(a_1, a_2, ..., a_n) = a_1 (1, 0, ..., 0) + a_2 (0, 1, ..., 0) + ... + a_n (0, 0, ..., 1)

Therefore, every vector in F^n can be written as a linear combination of the vectors, so the set also spans. Hence, dim F^n = n.
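As a concrete numeric sketch of this (hypothetical Python, taking n = 3 so the space is R^3; the vector chosen is just an illustration):

```python
# The standard basis of R^3 (a hypothetical concrete case of the example).
e1, e2, e3 = [1, 0, 0], [0, 1, 0], [0, 0, 1]

def lin_comb(coeffs, vectors):
    """Componentwise linear combination sum_i coeffs[i] * vectors[i]."""
    return [sum(c * v[k] for c, v in zip(coeffs, vectors))
            for k in range(len(vectors[0]))]

# Any vector (a, b, c) equals a*e1 + b*e2 + c*e3:
v = [5, -2, 7]
assert lin_comb(v, [e1, e2, e3]) == v
```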

*Example.* The following set is a basis for
R^3 over R:

Make a matrix with the vectors as *columns* and row reduce:

Since A row reduces to the identity, it is invertible, and a number of conditions are equivalent to A being invertible.

First, since A is invertible, the following system has a unique solution for every right-hand side:

But this matrix equation can be written as

In other words, any vector can be written as a linear combination of the given vectors: The given vectors span the space.

Second, since A is invertible, the following system has only the trivial solution:

This matrix equation can be written as

Since the only solution is the trivial one, the vectors are independent.

Hence, the given set of vectors is a basis for .

The following result is clear from the last example.

*Proposition.* Let F be a field. A set of n vectors {v_1, ..., v_n} in F^n is a basis for F^n if and only if the matrix A with the vectors as its columns is invertible.

Note that by earlier results on invertibility, this is equivalent to the following conditions (among others):

1. A row reduces to the identity.

2. det A is nonzero.

Thus, if you have n vectors in F^n, this gives you several ways of checking whether or not the set is a basis.
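For instance, here is a hedged Python sketch of the row-reduction check, using exact rational arithmetic from the standard library's `fractions` module; the sample matrices are hypothetical, not taken from the text:

```python
from fractions import Fraction

def row_reduces_to_identity(rows):
    """Gaussian elimination with exact arithmetic: returns True exactly
    when the square matrix row reduces to the identity (is invertible)."""
    A = [[Fraction(x) for x in row] for row in rows]
    n = len(A)
    for col in range(n):
        # Find a pivot in this column at or below the diagonal.
        piv = next((r for r in range(col, n) if A[r][col] != 0), None)
        if piv is None:
            return False                 # no pivot: not invertible
        A[col], A[piv] = A[piv], A[col]
        p = A[col][col]
        A[col] = [x / p for x in A[col]]                 # scale pivot row
        for r in range(n):
            if r != col and A[r][col] != 0:              # clear the column
                f = A[r][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    return True

# Hypothetical test vectors, written as the rows/columns of a 3x3 matrix:
assert row_reduces_to_identity([[1, 0, 1], [0, 1, 1], [1, 1, 0]])      # a basis
assert not row_reduces_to_identity([[1, 2, 0], [2, 4, 1], [3, 6, 1]])  # dependent
```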

A basis for V is a spanning set for V, so every vector in V can be written as a linear combination of basis elements. The next result says that such a linear combination is unique.

*Lemma.* Let {v_1, v_2, ..., v_n}
be a basis for a vector space V. Every v in V can be written in exactly
one way as

v = a_1 v_1 + a_2 v_2 + ... + a_n v_n

*Proof.* Let v be in V. Since {v_1, ..., v_n} spans V, there are scalars a_1,
a_2, ..., a_n such that

v = a_1 v_1 + a_2 v_2 + ... + a_n v_n

Suppose that there is another way to do this:

v = b_1 v_1 + b_2 v_2 + ... + b_n v_n

Then

a_1 v_1 + a_2 v_2 + ... + a_n v_n = b_1 v_1 + b_2 v_2 + ... + b_n v_n

Hence,

(a_1 - b_1) v_1 + (a_2 - b_2) v_2 + ... + (a_n - b_n) v_n = 0

Since {v_1, ..., v_n} is independent,

a_1 - b_1 = 0,  a_2 - b_2 = 0,  ...,  a_n - b_n = 0

Therefore,

a_1 = b_1,  a_2 = b_2,  ...,  a_n = b_n

That is, the two linear combinations were actually the same. This proves that there's only one way to write v as a linear combination of the v_i's.
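For a concrete (hypothetical) instance of this uniqueness, the coordinates of a vector in a basis of R^2 can be computed directly, e.g. by Cramer's rule; the basis and vector below are illustrative choices, not from the text:

```python
from fractions import Fraction

def coords_2d(v, b1, b2):
    """The unique coordinates (c1, c2) with v = c1*b1 + c2*b2, computed
    by Cramer's rule; assumes {b1, b2} is an independent pair in R^2."""
    det = Fraction(b1[0] * b2[1] - b1[1] * b2[0])   # nonzero for a basis
    c1 = Fraction(v[0] * b2[1] - v[1] * b2[0]) / det
    c2 = Fraction(b1[0] * v[1] - b1[1] * v[0]) / det
    return c1, c2

# Hypothetical basis {(1, 1), (1, -1)} of R^2 and vector v = (3, 1):
c1, c2 = coords_2d([3, 1], [1, 1], [1, -1])
assert (c1, c2) == (2, 1)
# Check the representation: 2*(1, 1) + 1*(1, -1) = (3, 1).
assert [c1 * 1 + c2 * 1, c1 * 1 - c2 * 1] == [3, 1]
```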

I want to show that two bases for a vector space must have the same number of elements. I need some preliminary results, which are important in their own right.

*Lemma.* If A is an m × n matrix with
m < n, the system Ax = 0 has nontrivial
solutions.

*Proof.* Write out the system Ax = 0 in terms of the entries a_{ij} of A:

a_{11} x_1 + a_{12} x_2 + ... + a_{1n} x_n = 0
a_{21} x_1 + a_{22} x_2 + ... + a_{2n} x_n = 0
...
a_{m1} x_1 + a_{m2} x_2 + ... + a_{mn} x_n = 0

The condition m < n means that this system has more variables than equations.

If A row reduces to a row reduced echelon matrix R, then R can have at most m leading coefficients. Therefore, some of the variables x_1, x_2, ..., x_n will be free variables (parameters); if I assign nonzero values to the free variables (e.g. by setting all of them equal to 1), the resulting solution will be nontrivial.
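A (hypothetical) numeric illustration of the lemma, with m = 2 equations and n = 3 variables; the matrix is an arbitrary example, and the nontrivial solution was found by row reducing it by hand:

```python
# A is 2x3, so m = 2 < 3 = n: the lemma promises a nontrivial solution
# of A x = 0.  Row reducing A gives x1 = x3, x2 = -2*x3, with x3 free.
A = [[1, 2, 3],
     [4, 5, 6]]

def apply(A, x):
    """Matrix-vector product with exact integer arithmetic."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

x = [1, -2, 1]            # the free variable x3 set to 1
assert x != [0, 0, 0]     # the solution is nontrivial
assert apply(A, x) == [0, 0]
```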

*Theorem.* Let {v_1, v_2, ..., v_n}
be a basis for a vector space V.

(a) Any subset of V with *more* than n elements is dependent.

(b) Any subset of V with *fewer* than n elements cannot span.

*Proof.* (a) Suppose {w_1, w_2, ..., w_m}
is a subset of V, and that m > n. I want to show that {w_1, ..., w_m} is dependent.

Write each w as a linear combination of the v's:

w_j = a_{1j} v_1 + a_{2j} v_2 + ... + a_{nj} v_n,  for j = 1, 2, ..., m

This can be represented as the following matrix equation:

[w_1  w_2  ...  w_m] = [v_1  v_2  ...  v_n] A,  where A = (a_{ij}) is the n × m matrix of coefficients

Since m > n, the matrix of a's has more columns than rows. Therefore, the following system has a nontrivial solution b_1, b_2, ..., b_m:

A b = 0,  where b = [b_1  b_2  ...  b_m]^T

That is, not all the b's are 0, but A b = 0. But then

[w_1  w_2  ...  w_m] b = [v_1  v_2  ...  v_n] A b

Therefore,

[w_1  w_2  ...  w_m] b = [v_1  v_2  ...  v_n] 0 = 0

In equation form,

b_1 w_1 + b_2 w_2 + ... + b_m w_m = 0

This is a nontrivial linear combination of the w's which adds up to the zero vector, so the w's are dependent.

(b) Suppose that {w_1, w_2, ..., w_m} is a set of vectors in V and m < n. I want to show that {w_1, ..., w_m} does not span V.

Suppose on the contrary that the w's span V. Then each v can be written as a linear combination of the w's:

v_j = a_{1j} w_1 + a_{2j} w_2 + ... + a_{mj} w_m,  for j = 1, 2, ..., n

In matrix form, this is

[v_1  v_2  ...  v_n] = [w_1  w_2  ...  w_m] A,  where A = (a_{ij}) is the m × n matrix of coefficients

Since m < n, the coefficient matrix has more columns than rows. Hence, the following system has a nontrivial solution b_1, b_2, ..., b_n:

A b = 0,  where b = [b_1  b_2  ...  b_n]^T

Thus, not all the b's are 0, but A b = 0.

Multiplying the v and w equation on the right by the b-vector gives

[v_1  v_2  ...  v_n] b = [w_1  w_2  ...  w_m] A b = [w_1  w_2  ...  w_m] 0 = 0

In equation form, this is

b_1 v_1 + b_2 v_2 + ... + b_n v_n = 0

Since not all the b's are 0, this is a nontrivial linear combination of the v's which adds up to the zero vector --- contradicting the independence of the v's.

This contradiction means that the w's *can't* span after
all.

*Example.* The standard basis for
R^3 contains 3 vectors.

Hence, the following set of *four* vectors can't be
independent:

Likewise, the following set of *two* vectors can't span
R^3:
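For instance (a hypothetical choice of vectors, since the original examples aren't shown here), with four vectors in R^3 the theorem guarantees a dependence, and one can exhibit it explicitly:

```python
# Four vectors in the 3-dimensional space R^3 must be dependent.
w1, w2, w3, w4 = [1, 0, 0], [0, 1, 0], [0, 0, 1], [2, 3, 5]
# Here w4 = 2*w1 + 3*w2 + 5*w3, so a nontrivial combination gives zero:
combo = [2 * a + 3 * b + 5 * c - d for a, b, c, d in zip(w1, w2, w3, w4)]
assert combo == [0, 0, 0]   # coefficients (2, 3, 5, -1), not all zero
```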

*Corollary.* If {v_1, v_2, ..., v_n} is a
basis for a vector space V, then every basis for V has n elements.

*Proof.* If {w_1, w_2, ..., w_m} is
another basis for V, then m can't be less than n, since otherwise {w_1, ..., w_m} couldn't span. Likewise, m can't be greater than n, since otherwise
{w_1, ..., w_m} couldn't be independent. Therefore,
m = n.

The Corollary shows that the dimension of a finite-dimensional vector
space is well-defined --- that is, *in a finite-dimensional vector
space, any two bases have the same number of elements*. This is
true in general; I'll state the relevant results without proof.

(a) Every vector space has a basis. The proof requires a
set-theoretic result called *Zorn's Lemma*.

(b) Two bases for *any* vector space have the same number of
elements. Specifically, if B and C are bases for a
vector space V, there is a bijective function from B to C.

I've already given one example of an infinite basis:

{1, x, x^2, x^3, ...}
This set is a basis for the vector space of polynomials with real coefficients over the field of real numbers.
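A quick sketch of this representation (hedged: a hypothetical Python encoding, where a polynomial's coordinates with respect to {1, x, x^2, ...} are stored as a list whose index k holds the coefficient of x^k):

```python
# A polynomial with real coefficients, written in the basis {1, x, x^2, ...},
# is determined by its finite list of coordinates: index k holds the
# coefficient of the basis element x^k.
def evaluate(coeffs, t):
    """Evaluate sum over k of coeffs[k] * t**k, using Horner's rule."""
    result = 0.0
    for c in reversed(coeffs):
        result = result * t + c
    return result

# p(x) = 2 + 0*x + 3*x^2, i.e. coordinates (2, 0, 3) in the basis {1, x, x^2}:
p = [2.0, 0.0, 3.0]
assert evaluate(p, 2.0) == 2 + 3 * 4   # p(2) = 14
```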

The next result shows that, in principle, you can construct a basis by:

(a) Starting with an independent set and adding vectors.

(b) Starting with a spanning set and removing vectors.

*Theorem.* Let V be a vector space.

(a) Any set of independent vectors is a subset of a basis for V.

(b) Any spanning set for V contains a subset which is a basis.

Part (a) means that if S is an independent set, then there is a basis T which contains S. (If S was a basis to begin with, then T = S.) Part (b) means that if S is a spanning set, then there is a basis T which is contained in S.

*Proof.* I'll only give the proof in the case
where V has finite dimension n, though the result is true for any vector
space.

(a) Let {v_1, ..., v_m} be independent. If this set spans V, it's a basis, and I'm done. Otherwise, there is a vector v in V which is not in the span of {v_1, ..., v_m}.

I claim that {v_1, ..., v_m, v} is independent. Suppose

a_1 v_1 + a_2 v_2 + ... + a_m v_m + b v = 0

Suppose b is nonzero. Then I can write

v = -(a_1/b) v_1 - (a_2/b) v_2 - ... - (a_m/b) v_m

Since v has been expressed as a linear combination of the v_i's, it's in the span of the v_i's, contrary to assumption. Therefore, this case is ruled out.

The only other possibility is b = 0. Then the relation reads a_1 v_1 + ... + a_m v_m = 0, so independence of the v_i's implies a_1 = a_2 = ... = a_m = 0. All the coefficients are 0, so {v_1, ..., v_m, v} is independent.

I can continue adding vectors in this way until I get a set which is independent and spans --- a basis. The process must terminate, since no independent set in V can have more than n elements.

(b) Suppose {v_1, ..., v_m} spans V. I want to show that some subset of {v_1, ..., v_m} is a basis.

If {v_1, ..., v_m} is independent, it's a basis, and I'm done. Otherwise, there is a nontrivial linear combination

a_1 v_1 + a_2 v_2 + ... + a_m v_m = 0

Assume without loss of generality that a_1 is nonzero. Then

v_1 = -(a_2/a_1) v_2 - (a_3/a_1) v_3 - ... - (a_m/a_1) v_m

Since v_1 is a linear combination of the other v's, I can remove it and still have a set which spans V, namely {v_2, v_3, ..., v_m}.

I continue throwing out vectors in this way until I reach a set which spans and is independent --- a basis. The process must terminate, because no set containing fewer than n vectors can span V.

It's possible to carry out the "adding vectors" and
"removing vectors" procedures in some specific cases. The
algorithms are related to those for finding bases for the *row space* and *column space*
of a matrix, which I'll discuss later.
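In the finite-dimensional setting, both procedures can be sketched concretely. The following hedged Python sketch assumes vectors in R^n with rational entries; the function names and the sample inputs are illustrative choices, not from the text:

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix over the rationals, by exact Gaussian elimination."""
    A = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(A[0]) if A else 0):
        piv = next((i for i in range(r, len(A)) if A[i][col] != 0), None)
        if piv is None:
            continue                           # no pivot in this column
        A[r], A[piv] = A[piv], A[r]
        p = A[r][col]
        A[r] = [x / p for x in A[r]]
        for i in range(len(A)):
            if i != r and A[i][col] != 0:
                f = A[i][col]
                A[i] = [a - f * b for a, b in zip(A[i], A[r])]
        r += 1
    return r

def extend_to_basis(independent, n):
    """(a) Add vectors: extend an independent list in R^n to a basis by
    trying the standard basis vectors and keeping those not in the span."""
    basis = list(independent)
    for k in range(n):
        e = [1 if i == k else 0 for i in range(n)]
        if rank(basis + [e]) > rank(basis):    # e lies outside the span
            basis.append(e)
    return basis

def prune_to_basis(spanning):
    """(b) Remove vectors: drop any vector whose removal leaves the
    rank --- hence the span --- unchanged."""
    basis = list(spanning)
    i = 0
    while i < len(basis):
        rest = basis[:i] + basis[i + 1:]
        if rank(rest) == rank(basis):          # basis[i] was redundant
            basis = rest
        else:
            i += 1
    return basis

# Hypothetical examples in R^3:
B1 = extend_to_basis([[1, 1, 0]], 3)           # grow an independent set
assert len(B1) == 3 and rank(B1) == 3
B2 = prune_to_basis([[1, 0, 0], [0, 1, 0], [1, 1, 0], [0, 0, 1]])
assert len(B2) == 3 and rank(B2) == 3          # shrink a spanning set
```

Both helpers terminate for the reasons given in the proof: an independent set in R^n has at most n elements, and no set of fewer than n vectors can span.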

Suppose you know a basis should have n elements, and you have a set S
with n elements ("the right number"). To show S is a basis,
you only need to check *either* that it is independent
*or* that it spans --- not both. I'll justify this statement,
then show by example how you can use it. I need a preliminary result.

*Proposition.* Let V be a finite dimensional
vector space over a field F, and let W be a subspace of V. If dim W = dim V, then W = V.
*Proof.* Suppose dim W = dim V = n, but
W is not all of V. I'll show that this leads to a contradiction.

Let {x_1, x_2, ..., x_n} be a basis for W. Suppose this is not a basis for V. Since it's an independent set, the previous result shows that I can add vectors x_{n+1}, x_{n+2}, ... to get a basis for V:

{x_1, ..., x_n, x_{n+1}, x_{n+2}, ...}

But this is a basis for V with more than n elements, which is impossible, since dim V = n.

Therefore, {x_1, ..., x_n} is also a basis for V. Let x be an element of V. Since {x_1, ..., x_n} spans V, I can write x as a linear combination of the elements of {x_1, ..., x_n}:

x = a_1 x_1 + a_2 x_2 + ... + a_n x_n

But since x_1, x_2, ..., x_n are in W and W is a subspace, any linear combination of x_1, x_2, ..., x_n must be in W. Thus, x is in W.

Since x was an arbitrary element of V, I get that V is contained in W, so W = V. This contradicts the assumption that W is not all of V.

You might be thinking that this result is obvious: W and V have the
same dimension, and you can't have one thing inside another, with
both things having the "same size", unless the things are
equal. This is intuitively what is going on here, but this kind of
intuitive reasoning doesn't always work. For example, the even
integers are a subset of the integers and, as infinite sets, both are
of the same "order of infinity" (*cardinality*). But the even integers aren't all of the integers:
There are odd integers as well.

*Corollary.* Let S be a set of n vectors in an
n-dimensional vector space V.

(a) If S is independent, then S is a basis for V.

(b) If S spans V, then S is a basis for V.

*Proof.* (a) Suppose S is independent.
Consider W, the span of S. Then S is independent and spans W, so S is
a basis for W. Since S has n elements, dim W = n. But dim V = n and W is a subspace of V. By the preceding result, W = V.

Hence, S spans V, and S is a basis for V.

(b) Suppose S spans V. Suppose S is not independent. By an earlier result, I can remove some elements of S to get a set T which is a basis for V. But now I have a basis T for V with fewer than n elements (since I removed elements from S, which had n elements).

This is a contradiction, and hence S must be independent.

*Example.* Show that the following set is a
basis for R^3:

Since I have 3 vectors in R^3 (which has dimension 3), I only need to show that the set is independent. So suppose a linear combination of the vectors is equal to the zero vector:

This gives the matrix equation

Solve the system by row-reduction:

The solution is x = 0, y = 0, and z = 0. Hence, the set is independent, and the preceding result shows that it's a basis.

Note: You could alternatively show that the set spans. To do this, you'd show that for an arbitrary vector (a, b, c) in R^3, you can solve the following system for x, y, and z in terms of a, b, and c:

You can see it's almost the same thing, but this will be a little messier than checking independence.

Send comments about this page to: Bruce.Ikenaga@millersville.edu.

Copyright 2014 by Bruce Ikenaga