Bases for Vector Spaces

A set is independent if, roughly speaking, there is no redundancy in the set: You can't "build" any vector in the set as a linear combination of the others. A set spans if you can "build everything" in the vector space as linear combinations of vectors in the set. Putting these two ideas together, a basis is an independent spanning set: A set with no redundancy out of which you can "build everything".

Definition. Let V be an F-vector space. A subset ${\cal B}$ of V is a basis if it is linearly independent and spans V.

The number of elements in a basis for V is called the dimension of V, and is denoted $\dim V$ .

Someone could object at this point that two bases for V might have different numbers of elements --- in which case the dimension of V wouldn't make sense. I'll show later that this can't happen.


Example. Consider the standard basis for $\real^3$ :

$$\left\{\left[\matrix{1 \cr 0 \cr 0 \cr}\right], \left[\matrix{0 \cr 1 \cr 0 \cr}\right], \left[\matrix{0 \cr 0 \cr 1 \cr}\right]\right\}$$

As the name implies, this set is a basis.

I observed earlier that this set is independent. Moreover,

$$\left[\matrix{a \cr b \cr c \cr}\right] = a \cdot \left[\matrix{1 \cr 0 \cr 0 \cr}\right] + b \cdot \left[\matrix{0 \cr 1 \cr 0 \cr}\right] + c \cdot \left[\matrix{0 \cr 0 \cr 1 \cr}\right].$$

Therefore, every vector in $\real^3$ can be written as a linear combination of the vectors, so the set also spans. Hence, it is a basis.


Example. The following set is a basis for $\real^3$ over $\real$ :

$$\left\{\left[\matrix{1 \cr 1 \cr 0 \cr}\right], \left[\matrix{0 \cr 1 \cr 1 \cr}\right], \left[\matrix{1 \cr 0 \cr 1 \cr}\right]\right\}$$

Make a matrix with the vectors as columns and row reduce:

$$A = \left[\matrix{1 & 0 & 1 \cr 1 & 1 & 0 \cr 0 & 1 & 1 \cr}\right] \to \left[\matrix{1 & 0 & 0 \cr 0 & 1 & 0 \cr 0 & 0 & 1 \cr}\right]$$

Since A row reduces to the identity, it is invertible. There are a number of conditions equivalent to A being invertible; I'll use two of them here.

First, since A is invertible the following system has a unique solution for every $(a, b, c)$ :

$$\left[\matrix{1 & 0 & 1 \cr 1 & 1 & 0 \cr 0 & 1 & 1 \cr}\right] \left[\matrix{x \cr y \cr z \cr}\right] = \left[\matrix{a \cr b \cr c \cr}\right].$$

But this matrix equation can be written as

$$x \cdot \left[\matrix{1 \cr 1 \cr 0 \cr}\right] + y \cdot \left[\matrix{0 \cr 1 \cr 1 \cr}\right] + z \cdot \left[\matrix{1 \cr 0 \cr 1 \cr}\right] = \left[\matrix{a \cr b \cr c \cr}\right].$$

In other words, any vector $(a, b, c) \in
   \real^3$ can be written as a linear combination of the given vectors: The given vectors span $\real^3$ .

Second, since A is invertible, the following system has only $x = 0$ , $y = 0$ , $z = 0$ as a solution:

$$\left[\matrix{1 & 0 & 1 \cr 1 & 1 & 0 \cr 0 & 1 & 1 \cr}\right] \left[\matrix{x \cr y \cr z \cr}\right] = \left[\matrix{0 \cr 0 \cr 0 \cr}\right].$$

This matrix equation can be written as

$$x \cdot \left[\matrix{1 \cr 1 \cr 0 \cr}\right] + y \cdot \left[\matrix{0 \cr 1 \cr 1 \cr}\right] + z \cdot \left[\matrix{1 \cr 0 \cr 1 \cr}\right] = \left[\matrix{0 \cr 0 \cr 0 \cr}\right].$$

Since the only solution is $x = 0$ , $y =
   0$ , $z = 0$ , the vectors are independent.

Hence, the given set of vectors is a basis for $\real^3$ .


The following result is clear from the last example.

Proposition. Let F be a field. $\{v_1, v_2, \ldots, v_n\}$ is a basis for $F^n$ if and only if

$$A = \left[\matrix{\uparrow & \uparrow & & \uparrow \cr v_1 & v_2 & \cdots & v_n \cr \downarrow & \downarrow & & \downarrow \cr}\right] \quad\hbox{is invertible}.\quad\halmos$$

Note that by earlier results on invertibility, this is equivalent to the following conditions (among others):

1. A row reduces to the identity.

2. $\det A \ne 0$ .

Thus, if you have n vectors in $F^n$ , this gives you several ways of checking whether or not the set is a basis.
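
Here's a quick way to check conditions like these by machine --- a minimal sketch using NumPy (assuming it's available), applied to the vectors from the last example:

    import numpy as np

    # The candidate basis vectors for R^3 from the example, as columns.
    A = np.column_stack([[1, 1, 0], [0, 1, 1], [1, 0, 1]])

    # The set is a basis exactly when A is invertible: det A != 0,
    # or equivalently rank A = 3.
    print(np.linalg.det(A))          # 2.0 (up to roundoff), so it's a basis
    print(np.linalg.matrix_rank(A))  # 3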

A basis for V is a spanning set for V, so every vector in V can be written as a linear combination of basis elements. The next result says that such a linear combination is unique.

Lemma. Let $\{v_1, v_2, \ldots, v_n\}$ be a basis for a vector space V. Every $v
   \in V$ can be written in exactly one way as

$$v = a_1 v_1 + a_2 v_2 + \cdots + a_n v_n, \quad a_i \in F.$$

Proof. Let $v
   \in V$ . Since $\{v_1, v_2, \ldots, v_n\}$ spans V, there are scalars $a_1$ , $a_2$ , ..., $a_n$ such that

$$v = a_1 v_1 + a_2 v_2 + \cdots + a_n v_n.$$

Suppose that there is another way to do this:

$$v = b_1 v_1 + b_2 v_2 + \cdots + b_n v_n.$$

Then

$$a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = b_1 v_1 + b_2 v_2 + \cdots + b_n v_n.$$

Hence,

$$(a_1 - b_1)v_1 + (a_2 - b_2)v_2 + \cdots + (a_n - b_n)v_n = 0.$$

Since $\{v_1, v_2, \ldots, v_n\}$ is independent,

$$a_1 - b_1 = 0, \quad a_2 - b_2 = 0, \ldots, a_n - b_n = 0.$$

Therefore,

$$a_1 = b_1, \quad a_2 = b_2, \ldots, a_n = b_n.$$

That is, the two linear combinations were actually the same. This proves that there's only one way to write v as a linear combination of the $v_i$ 's.
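
In $F^n$ you can compute the coefficients $a_i$ explicitly: If A is the invertible matrix whose columns are the basis vectors, then the unique solution of $A x = v$ lists the coefficients. Here's a minimal NumPy sketch using the basis from the earlier example; the vector v is just a sample:

    import numpy as np

    # The basis of R^3 from the earlier example, as the columns of A.
    A = np.column_stack([[1, 1, 0], [0, 1, 1], [1, 0, 1]]).astype(float)
    v = np.array([3.0, 2.0, 1.0])   # a sample vector to expand

    # A is invertible, so A x = v has exactly one solution: the
    # coefficients of v relative to this basis.
    coeffs = np.linalg.solve(A, v)
    print(coeffs)       # [2. 0. 1.]
    print(A @ coeffs)   # [3. 2. 1.] --- reconstructs v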

I want to show that two bases for a vector space must have the same number of elements. I need some preliminary results, which are important in their own right.

Lemma. If A is an $m \times n$ matrix with $m < n$ , the system $A x =
   \vec{0}$ has nontrivial solutions.

Proof. Write

$$A = \left[\matrix{ a_{1 1} & a_{1 2} & \cdots & a_{1 n} \cr a_{2 1} & a_{2 2} & \cdots & a_{2 n} \cr \vdots & \vdots & & \vdots \cr a_{m 1} & a_{m 2} & \cdots & a_{m n} \cr}\right].$$

The condition $m < n$ means that the following system has more variables than equations:

$$\left[\matrix{ a_{1 1} & a_{1 2} & \cdots & a_{1 n} \cr a_{2 1} & a_{2 2} & \cdots & a_{2 n} \cr \vdots & \vdots & & \vdots \cr a_{m 1} & a_{m 2} & \cdots & a_{m n} \cr}\right] \left[\matrix{x_1 \cr x_2 \cr \vdots \cr x_n \cr}\right] = \left[\matrix{0 \cr 0 \cr \vdots \cr 0 \cr}\right].$$

If A row reduces to a row reduced echelon matrix R, then R can have at most m leading coefficients. Therefore, some of the variables $x_1$ , $x_2$ , ..., $x_n$ will be free variables (parameters); if I assign nonzero values to the free variables (e.g. by setting all of them equal to 1), the resulting solution will be nontrivial.
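
Here's a minimal SymPy sketch of the lemma in action; the $2 \times 3$ matrix is just a sample with more variables than equations:

    import sympy as sp

    # A sample homogeneous system with m = 2 equations, n = 3 variables.
    A = sp.Matrix([[1, 2, 3],
                   [4, 5, 6]])

    # With m < n there must be free variables, so the null space is
    # nonzero; each null space vector is a nontrivial solution of A x = 0.
    for b in A.nullspace():
        print(b.T)        # Matrix([[1, -2, 1]])
        print((A * b).T)  # Matrix([[0, 0]])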

Theorem. Let $\{v_1, v_2, \ldots, v_n\}$ be a basis for a vector space V.

(a) Any subset of V with more than n elements is dependent.

(b) Any subset of V with fewer than n elements cannot span.

Proof. (a) Suppose $\{w_1, w_2, \ldots, w_m\}$ is a subset of V, and that $m > n$ . I want to show that $\{w_1, w_2, \ldots, w_m\}$ is dependent.

Write each w as a linear combination of the v's:

$$\eqalign{ w_1 &= a_{1 1} v_1 + a_{1 2} v_2 + \cdots + a_{1 n} v_n \cr w_2 &= a_{2 1} v_1 + a_{2 2} v_2 + \cdots + a_{2 n} v_n \cr & \vdots \cr w_m &= a_{m 1} v_1 + a_{m 2} v_2 + \cdots + a_{m n} v_n \cr}$$

This can be represented as the following matrix equation:

$$\left[\matrix{ \uparrow & \uparrow & & \uparrow \cr w_1 & w_2 & \cdots & w_m \cr \downarrow & \downarrow & & \downarrow \cr}\right] = \left[\matrix{ \uparrow & \uparrow & & \uparrow \cr v_1 & v_2 & \cdots & v_n \cr \downarrow & \downarrow & & \downarrow \cr}\right] \left[\matrix{ a_{1 1} & a_{2 1} & \cdots & a_{m 1} \cr a_{1 2} & a_{2 2} & \cdots & a_{m 2} \cr \vdots & \vdots & & \vdots \cr a_{1 n} & a_{2 n} & \cdots & a_{m n} \cr}\right].$$

Since $m > n$ , the matrix of a's has more columns than rows. Therefore, the following system has a nontrivial solution $x_1 = b_1$ , $x_2 = b_2$ , ... $x_m =
   b_m$ :

$$\left[\matrix{ a_{1 1} & a_{2 1} & \cdots & a_{m 1} \cr a_{1 2} & a_{2 2} & \cdots & a_{m 2} \cr \vdots & \vdots & & \vdots \cr a_{1 n} & a_{2 n} & \cdots & a_{m n} \cr}\right] \left[\matrix{x_1 \cr x_2 \cr \vdots \cr x_m \cr}\right] = \left[\matrix{0 \cr 0 \cr \vdots \cr 0 \cr}\right].$$

That is, not all the b's are 0, but

$$\left[\matrix{ a_{1 1} & a_{2 1} & \cdots & a_{m 1} \cr a_{1 2} & a_{2 2} & \cdots & a_{m 2} \cr \vdots & \vdots & & \vdots \cr a_{1 n} & a_{2 n} & \cdots & a_{m n} \cr}\right] \left[\matrix{b_1 \cr b_2 \cr \vdots \cr b_m \cr}\right] = \left[\matrix{0 \cr 0 \cr \vdots \cr 0 \cr}\right].$$

But then

$$\left[\matrix{ \uparrow & \uparrow & & \uparrow \cr w_1 & w_2 & \cdots & w_m \cr \downarrow & \downarrow & & \downarrow \cr}\right] \left[\matrix{b_1 \cr b_2 \cr \vdots \cr b_m \cr}\right] = \left[\matrix{ \uparrow & \uparrow & & \uparrow \cr v_1 & v_2 & \cdots & v_n \cr \downarrow & \downarrow & & \downarrow \cr}\right] \left[\matrix{ a_{1 1} & a_{2 1} & \cdots & a_{m 1} \cr a_{1 2} & a_{2 2} & \cdots & a_{m 2} \cr \vdots & \vdots & & \vdots \cr a_{1 n} & a_{2 n} & \cdots & a_{m n} \cr}\right] \left[\matrix{b_1 \cr b_2 \cr \vdots \cr b_m \cr}\right].$$

Therefore,

$$\left[\matrix{ \uparrow & \uparrow & & \uparrow \cr w_1 & w_2 & \cdots & w_m \cr \downarrow & \downarrow & & \downarrow \cr}\right] \left[\matrix{b_1 \cr b_2 \cr \vdots \cr b_m \cr}\right] = \left[\matrix{0 \cr 0 \cr \vdots \cr 0 \cr}\right].$$

In equation form,

$$b_1 w_1 + b_2 w_2 + \cdots + b_m w_m = \vec{0}.$$

This is a nontrivial linear combination of the w's which adds up to $\vec{0}$ , so the w's are dependent.

(b) Suppose that $\{w_1, w_2, \ldots,
   w_m\}$ is a set of vectors in V and $m < n$ . I want to show that $\{w_1, w_2, \ldots, w_m\}$ does not span V.

Suppose on the contrary that the w's span V. Then each v can be written as a linear combination of the w's:

$$\eqalign{ v_1 &= a_{1 1} w_1 + a_{1 2} w_2 + \cdots + a_{1 m} w_m \cr v_2 &= a_{2 1} w_1 + a_{2 2} w_2 + \cdots + a_{2 m} w_m \cr & \vdots \cr v_n &= a_{n 1} w_1 + a_{n 2} w_2 + \cdots + a_{n m} w_m \cr}$$

In matrix form, this is

$$\left[\matrix{ \uparrow & \uparrow & & \uparrow \cr v_1 & v_2 & \cdots & v_n \cr \downarrow & \downarrow & & \downarrow \cr}\right] = \left[\matrix{ \uparrow & \uparrow & & \uparrow \cr w_1 & w_2 & \cdots & w_m \cr \downarrow & \downarrow & & \downarrow \cr}\right] \left[\matrix{ a_{1 1} & a_{2 1} & \cdots & a_{n 1} \cr a_{1 2} & a_{2 2} & \cdots & a_{n 2} \cr \vdots & \vdots & & \vdots \cr a_{1 m} & a_{2 m} & \cdots & a_{n m} \cr}\right].$$

Since $n > m$ , the coefficient matrix has more columns than rows. Hence, the following system has a nontrivial solution $x_1 = b_1$ , $x_2 = b_2$ , ... $x_n =
   b_n$ :

$$\left[\matrix{ a_{1 1} & a_{2 1} & \cdots & a_{n 1} \cr a_{1 2} & a_{2 2} & \cdots & a_{n 2} \cr \vdots & \vdots & & \vdots \cr a_{1 m} & a_{2 m} & \cdots & a_{n m} \cr}\right] \left[\matrix{x_1 \cr x_2 \cr \vdots \cr x_n \cr}\right] = \left[\matrix{0 \cr 0 \cr \vdots \cr 0 \cr}\right].$$

Thus,

$$\left[\matrix{ a_{1 1} & a_{2 1} & \cdots & a_{n 1} \cr a_{1 2} & a_{2 2} & \cdots & a_{n 2} \cr \vdots & \vdots & & \vdots \cr a_{1 m} & a_{2 m} & \cdots & a_{n m} \cr}\right] \left[\matrix{b_1 \cr b_2 \cr \vdots \cr b_n \cr}\right] = \left[\matrix{0 \cr 0 \cr \vdots \cr 0 \cr}\right].$$

Multiplying the v and w equation on the right by the b-vector gives

$$\left[\matrix{ \uparrow & \uparrow & & \uparrow \cr v_1 & v_2 & \cdots & v_n \cr \downarrow & \downarrow & & \downarrow \cr}\right] \left[\matrix{b_1 \cr b_2 \cr \vdots \cr b_n \cr}\right] = \left[\matrix{ \uparrow & \uparrow & & \uparrow \cr w_1 & w_2 & \cdots & w_m \cr \downarrow & \downarrow & & \downarrow \cr}\right] \left[\matrix{ a_{1 1} & a_{2 1} & \cdots & a_{n 1} \cr a_{1 2} & a_{2 2} & \cdots & a_{n 2} \cr \vdots & \vdots & & \vdots \cr a_{1 m} & a_{2 m} & \cdots & a_{n m} \cr}\right] \left[\matrix{b_1 \cr b_2 \cr \vdots \cr b_n \cr}\right].$$

Hence,

$$\left[\matrix{ \uparrow & \uparrow & & \uparrow \cr v_1 & v_2 & \cdots & v_n \cr \downarrow & \downarrow & & \downarrow \cr}\right] \left[\matrix{b_1 \cr b_2 \cr \vdots \cr b_n \cr}\right] = \left[\matrix{0 \cr 0 \cr \vdots \cr 0 \cr}\right].$$

In equation form, this is

$$b_1 v_1 + b_2 v_2 + \cdots + b_n v_n = \vec{0}.$$

Since not all the b's are 0, this is a nontrivial linear combination of the v's which adds up to $\vec{0}$ --- contradicting the independence of the v's.

This contradiction means that the w's can't span after all.


Example. The standard basis for $\real^3$ contains 3 vectors.

Hence, the following set of four vectors can't be independent:

$$\left\{\left[\matrix{1 \cr 0 \cr -1 \cr}\right], \left[\matrix{2 \cr -3 \cr 10 \cr}\right], \left[\matrix{1 \cr 1 \cr 1 \cr}\right], \left[\matrix{0 \cr 11 \cr -7 \cr}\right]\right\}$$

Likewise, the following set of two vectors can't span $\real^3$ :

$$\left\{\left[\matrix{-2 \cr 1 \cr 2 \cr}\right], \left[\matrix{3 \cr 1 \cr 5 \cr}\right]\right\}\quad\halmos$$
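
In fact, a null space computation produces an explicit dependence, just as in the proof of part (a). Here's a minimal SymPy sketch using the four vectors above as columns:

    import sympy as sp

    # The four vectors above, as the columns of a 3 x 4 matrix.
    A = sp.Matrix([[ 1,  2, 1,  0],
                   [ 0, -3, 1, 11],
                   [-1, 10, 1, -7]])

    # More columns than rows, so the null space is nonzero; any null
    # space vector gives a nontrivial dependence among the columns.
    b = A.nullspace()[0]
    print(b.T)        # the coefficients b_1, ..., b_4 of a dependence
    print((A * b).T)  # Matrix([[0, 0, 0]])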


Corollary. If $\{v_1, \ldots, v_n\}$ is a basis for a vector space V, then every basis for V has n elements.

Proof. If $\{w_1, \ldots, w_m\}$ is another basis for V, then m can't be less than n or $\{w_1, \ldots, w_m\}$ couldn't span. Likewise, m can't be greater than n or $\{w_1, \ldots, w_m\}$ couldn't be independent. Therefore, $m = n$ .

The Corollary shows that the dimension of a finite-dimensional vector space is well-defined --- that is, in a finite-dimensional vector space, any two bases have the same number of elements. This is true in general; I'll state the relevant results without proof.

(a) Every vector space has a basis. The proof requires a set-theoretic result called Zorn's Lemma.

(b) Two bases for any vector space have the same number of elements. Specifically, if ${\cal B}$ and ${\cal C}$ are bases for a vector space V, there is a bijective function $f: {\cal B} \to {\cal C}$ .

I've already given one example of an infinite basis:

$$\{1, x, x^2, x^3, \ldots\}.$$

This set is a basis for the vector space $\real[x]$ of polynomials with real coefficients over the field of real numbers.

The next result shows that, in principle, you can construct a basis by:

(a) Starting with an independent set and adding vectors.

(b) Starting with a spanning set and removing vectors.

Theorem. Let V be a vector space.

(a) Any set of independent vectors is a subset of a basis for V.

(b) Any spanning set for V contains a subset which is a basis.

Part (a) means that if S is an independent set, then there is a basis T such that $S \subset T$ . (If S was a basis to begin with, then $S = T$ .) Part (b) means that if S is a spanning set, then there is a basis T such that $T \subset
   S$ .

Proof. I'll only give the proof in the case where V has finite dimension n, though it is true for any vector space.

(a) Let $\{v_1, \ldots, v_m\}$ be independent. If this set spans V, it's a basis, and I'm done. Otherwise, there is a vector $v \in V$ which is not in the span of $\{v_1,
   \ldots, v_m\}$ .

I claim that $\{v, v_1, \ldots, v_m\}$ is independent. Suppose

$$a v + a_1 v_1 + \cdots + a_m v_m = 0.$$

Suppose $a \ne 0$ . Then I can write

$$v = -\dfrac{1}{a}\left(a_1 v_1 + \cdots + a_m v_m\right).$$

Since v has been expressed as a linear combination of the $v_k$ 's, it's in the span of the $v_k$ 's, contrary to assumption. Therefore, this case is ruled out.

The only other possibility is $a = 0$ . Then $a_1 v_1 + \cdots + a_m v_m = 0$ , so independence of the $v_k$ 's implies $a_1 = \cdots = a_m = 0$ . Therefore, $\{v, v_1,
   \ldots, v_m\}$ is independent.

I can continue adding vectors in this way until I get a set which is independent and spans --- a basis. The process must terminate, since no independent set in V can have more than n elements.

(b) Suppose $\{v_1, \ldots, v_m\}$ spans V. I want to show that some subset of $\{v_1, \ldots, v_m\}$ is a basis.

If $\{v_1, \ldots, v_m\}$ is independent, it's a basis, and I'm done. Otherwise, there is a nontrivial linear combination

$$a_1 v_1 + \cdots + a_m v_m = \vec{0}.$$

Assume without loss of generality that $a_1 \ne 0$ . Then

$$v_1 = -\dfrac{1}{a_1}\left(a_2 v_2 + \cdots + a_m v_m\right).$$

Since $v_1$ is a linear combination of the other v's, I can remove it, and the remaining set $\{v_2, \ldots, v_m\}$ still spans V.

I continue throwing out vectors in this way until I reach a set which spans and is independent --- a basis. The process must terminate, because no set containing fewer than n vectors can span V.

It's possible to carry out the "adding vectors" and "removing vectors" procedures in some specific cases. The algorithms are related to those for finding bases for the row space and column space of a matrix, which I'll discuss later.
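
That said, in $\real^n$ both procedures can be previewed with a single row reduction: The pivot columns of the row reduced echelon form of a matrix locate a subset of its columns which is a basis for their span. Here's a minimal SymPy sketch; the particular vectors are just samples:

    import sympy as sp

    # (b) Removing vectors: a sample redundant spanning set, as columns.
    S = sp.Matrix([[1, 0, 1, 2],
                   [1, 1, 2, 3],
                   [0, 1, 1, 1]])
    _, pivots = S.rref()
    print(pivots)                       # (0, 1): the first two columns suffice
    span_basis = [S.col(j) for j in pivots]

    # (a) Adding vectors: extend the independent set {v1} to a basis of
    # R^3 by adjoining the standard basis and keeping the pivot columns.
    v1 = sp.Matrix([1, 1, 0])
    M = sp.Matrix.hstack(v1, sp.eye(3))
    _, pivots = M.rref()
    print(pivots)                       # (0, 1, 3): v1, e1, e3 form a basis
    extended_basis = [M.col(j) for j in pivots]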

Suppose you know a basis should have n elements, and you have a set S with n elements ("the right number"). To show S is a basis, you only need to check either that it is independent or that it spans --- not both. I'll justify this statement, then show by example how you can use it. I need a preliminary result.

Proposition. Let V be a finite dimensional vector space over a field F, and let W be a subspace of V. If $\dim W = \dim V$ , then $V = W$ .

Proof. Suppose $\dim W = \dim V = n$ , but $V \ne W$ . I'll show that this leads to a contradiction.

Let $\{x_1, x_2, \ldots, x_n\}$ be a basis for W. Suppose this is not a basis for V. Since it's an independent set, the previous result shows that I can add vectors $y_1$ , $y_2$ , ... $y_m$ to make a basis for V:

$$\{x_1, x_2, \ldots, x_n, y_1, y_2, \ldots y_m\}.$$

But this is a basis for V with more than n elements, which is impossible.

Therefore, $\{x_1, x_2, \ldots, x_n\}$ is also a basis for V. Let $x \in V$ . Since $\{x_1, x_2, \ldots, x_n\}$ spans V, I can write x as a linear combination of the elements of $\{x_1, x_2, \ldots, x_n\}$ :

$$x = a_1 x_1 + a_2 x_2 + \cdots + a_n x_n, \quad a_i \in F.$$

But since $x_1$ , $x_2$ , ... $x_n$ are in W and W is a subspace, any linear combination of $x_1$ , $x_2$ , ... $x_n$ must be in W. Thus, $x \in W$ .

Since x was an arbitrary element of V, I get $V \subset W$ , so $W = V$ .

You might be thinking that this result is obvious: W and V have the same dimension, and you can't have one thing inside another, with both things having the "same size", unless the things are equal. This is intuitively what is going on here, but this kind of intuitive reasoning doesn't always work. For example, the even integers are a subset of the integers and, as infinite sets, both have the same "order of infinity" (cardinality). But the even integers aren't all of the integers: There are odd integers as well.

Corollary. Let S be a set of n vectors in an n-dimensional vector space V.

(a) If S is independent, then S is a basis for V.

(b) If S spans V, then S is a basis for V.

Proof. (a) Suppose S is independent. Consider W, the span of S. Then S is independent and spans W, so S is a basis for W. Since S has n elements, $\dim W = n$ . But $W \subset V$ and $\dim V = n$ . By the preceding result, $V = W$ .

Hence, S spans V, and S is a basis for V.

(b) Suppose S spans V. Suppose S is not independent. By an earlier result, I can remove some elements of S to get a set T which is a basis for V. But now I have a basis T for V with fewer than n elements (since I removed elements from S, which had n elements).

This is a contradiction, and hence S must be independent.


Example. Show that the following set is a basis for $\real^3$ :

$$\left\{\left[\matrix{1 \cr 1 \cr 0 \cr}\right], \left[\matrix{-1 \cr 0 \cr 1 \cr}\right], \left[\matrix{2 \cr 1 \cr 1 \cr}\right]\right\}.$$

Since I have 3 vectors in $\real^3$ (which has dimension 3), I only need to show that the set is independent. So suppose

$$x \cdot \left[\matrix{1 \cr 1 \cr 0 \cr}\right] + y \cdot \left[\matrix{-1 \cr 0 \cr 1 \cr}\right] + z \cdot \left[\matrix{2 \cr 1 \cr 1 \cr}\right] = \left[\matrix{0 \cr 0 \cr 0 \cr}\right].$$

This gives the matrix equation

$$\left[\matrix{ 1 & -1 & 2 \cr 1 & 0 & 1 \cr 0 & 1 & 1 \cr}\right] \left[\matrix{x \cr y \cr z \cr}\right] = \left[\matrix{0 \cr 0 \cr 0 \cr}\right].$$

Solve the system by row-reduction:

$$\left[\matrix{ 1 & -1 & 2 & 0 \cr 1 & 0 & 1 & 0 \cr 0 & 1 & 1 & 0 \cr}\right] \quad\to\quad \left[\matrix{ 1 & 0 & 0 & 0 \cr 0 & 1 & 0 & 0 \cr 0 & 0 & 1 & 0 \cr}\right]$$

The solution is $x = 0$ , $y = 0$ , and $z = 0$ . Hence, the set is independent, and the preceding result shows that it's a basis.

Note: You could alternatively show that the set spans. To do this, you'd show that for an arbitrary $(a, b,
   c) \in \real^3$ , you can solve the following system for x, y, and z in terms of a, b, and c:

$$x \cdot \left[\matrix{1 \cr 1 \cr 0 \cr}\right] + y \cdot \left[\matrix{-1 \cr 0 \cr 1 \cr}\right] + z \cdot \left[\matrix{2 \cr 1 \cr 1 \cr}\right] = \left[\matrix{a \cr b \cr c \cr}\right].$$

You can see it's almost the same thing, but this will be a little messier than checking independence.
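
Here's a minimal NumPy sketch of the independence check: A rank of 3 (equivalently, a nonzero determinant) means the homogeneous system has only the trivial solution.

    import numpy as np

    # The three vectors from this example, as columns.
    A = np.column_stack([[1, 1, 0], [-1, 0, 1], [2, 1, 1]])

    # Three vectors in R^3: by the corollary, independence alone
    # makes them a basis.
    print(np.linalg.matrix_rank(A))        # 3
    print(np.linalg.det(A.astype(float)))  # 2.0 (up to roundoff)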


