# Coordinates and Change of Basis

Let $V$ be a vector space and let $\mathcal{B}$ be a basis for $V$. Every vector $v \in V$ can be uniquely expressed as a linear combination of elements of $\mathcal{B}$:

$$v = a_1 v_1 + a_2 v_2 + \cdots + a_n v_n, \quad v_1, \ldots, v_n \in \mathcal{B}.$$

(Let me remind you of why this is true. Since a basis spans, every $v$ can be written in this way. On the other hand, if $v = a_1 v_1 + \cdots + a_n v_n$ and $v = b_1 v_1 + \cdots + b_n v_n$ are two ways of writing a given vector, then $(a_1 - b_1) v_1 + \cdots + (a_n - b_n) v_n = 0$, and by independence $a_1 - b_1 = 0$, ..., $a_n - b_n = 0$ --- that is, $a_1 = b_1$, ..., $a_n = b_n$. So the representation of a vector in this way is unique.)

Consider the situation where $\mathcal{B} = \{v_1, v_2, \ldots, v_n\}$ is a finite ordered basis --- that is, fix a numbering of the elements of $\mathcal{B}$. If $v = a_1 v_1 + a_2 v_2 + \cdots + a_n v_n$, the ordered list of coefficients $(a_1, a_2, \ldots, a_n)$ is uniquely associated with $v$. The $a_i$ are the components of $v$ with respect to the (ordered) basis $\mathcal{B}$; I will use the notation

$$[v]_{\mathcal{B}} = \begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{bmatrix}.$$

It is easy to confuse a vector with the representation of the vector in terms of its components relative to a basis. This confusion arises because the representation of vectors which is most familiar is that of a vector as an ordinary $n$-tuple in $\mathbb{R}^n$:

$$v = (a_1, a_2, \ldots, a_n).$$

This amounts to identifying the elements of $\mathbb{R}^n$ with their representations relative to the standard basis

$$e_1 = (1, 0, \ldots, 0), \quad e_2 = (0, 1, \ldots, 0), \quad \ldots, \quad e_n = (0, 0, \ldots, 1).$$

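To make this identification concrete, here is a small sketch in Python with NumPy; the vector $v$ is an arbitrary sample value chosen for illustration:

```python
import numpy as np

e = np.eye(3)                    # rows e[0], e[1], e[2]: the standard basis of R^3
v = np.array([4.0, -1.0, 5.0])   # a sample vector in R^3

# A tuple in R^n already lists its components relative to the standard basis:
reconstructed = v[0] * e[0] + v[1] * e[1] + v[2] * e[2]
print(np.array_equal(reconstructed, v))  # True
```

The entries of the tuple are literally the coefficients in the standard-basis expansion.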
Example. (a) Show that

$$\mathcal{B} = \{(1, 1, 0), (0, 1, 1), (1, 0, 1)\}$$

is a basis for $\mathbb{R}^3$.

These are three vectors in $\mathbb{R}^3$, which has dimension 3. Hence, it suffices to check that they're independent. Form the matrix with the elements of $\mathcal{B}$ as its rows and row reduce:

$$\begin{bmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \\ 1 & 0 & 1 \end{bmatrix} \to \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$$

The matrix row reduces to the identity, so the vectors are independent. Three independent vectors in $\mathbb{R}^3$ must form a basis.
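The independence check can also be done numerically. This is a minimal sketch in Python with NumPy, using the sample vectors $(1,1,0)$, $(0,1,1)$, $(1,0,1)$:

```python
import numpy as np

# Rows are the candidate basis vectors (sample values).
A = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 0, 1]])

# Three vectors in R^3 form a basis exactly when this matrix has full rank.
print(np.linalg.matrix_rank(A))  # 3: the rows are independent
```

`matrix_rank` uses a singular value decomposition internally, which plays the same role as row reduction here.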

(b) Find the components of $v = (2, 3, 5)$ relative to $\mathcal{B}$.

I must find numbers $a$, $b$, and $c$ such that

$$a (1, 1, 0) + b (0, 1, 1) + c (1, 0, 1) = (2, 3, 5).$$

This is equivalent to the matrix equation

$$\begin{bmatrix} 1 & 0 & 1 \\ 1 & 1 & 0 \\ 0 & 1 & 1 \end{bmatrix} \begin{bmatrix} a \\ b \\ c \end{bmatrix} = \begin{bmatrix} 2 \\ 3 \\ 5 \end{bmatrix}.$$

Set up the augmented matrix for the system and row reduce to solve:

$$\left[\begin{array}{ccc|c} 1 & 0 & 1 & 2 \\ 1 & 1 & 0 & 3 \\ 0 & 1 & 1 & 5 \end{array}\right] \to \left[\begin{array}{ccc|c} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 3 \\ 0 & 0 & 1 & 2 \end{array}\right]$$

This says $a = 0$, $b = 3$, and $c = 2$. Therefore,

$$[v]_{\mathcal{B}} = \begin{bmatrix} 0 \\ 3 \\ 2 \end{bmatrix}.$$
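Row reduction by hand can be checked by solving the same linear system with NumPy; this sketch assumes sample basis vectors $(1,1,0)$, $(0,1,1)$, $(1,0,1)$ as the columns of $M$ and the sample vector $v = (2, 3, 5)$:

```python
import numpy as np

# Columns of M are the basis vectors; v is given in standard coordinates.
# (Sample values used for illustration.)
M = np.array([[1, 0, 1],
              [1, 1, 0],
              [0, 1, 1]])
v = np.array([2, 3, 5])

# Solving M [a, b, c]^T = v gives the components of v relative to the basis.
coords = np.linalg.solve(M, v)
print(coords)  # [0. 3. 2.]
```

`np.linalg.solve` does Gaussian elimination (an LU factorization) under the hood, so this is the same computation as the row reduction above.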

(c) Write the vector $w$ with $[w]_{\mathcal{B}} = (1, 2, -1)$ in terms of the standard basis.

I'll write $[v]_{\mathcal{B}}$ for $v$ relative to $\mathcal{B}$ and $[v]_{\text{std}}$ for $v$ relative to the standard basis. The matrix equation in (b)

$$\begin{bmatrix} 1 & 0 & 1 \\ 1 & 1 & 0 \\ 0 & 1 & 1 \end{bmatrix} \begin{bmatrix} a \\ b \\ c \end{bmatrix} = \begin{bmatrix} 2 \\ 3 \\ 5 \end{bmatrix}$$

says

$$M [v]_{\mathcal{B}} = [v]_{\text{std}},$$

where $M$ is the matrix whose columns are the vectors of $\mathcal{B}$.

In (b), I knew $[v]_{\text{std}}$ and I wanted $[v]_{\mathcal{B}}$; this time it's the other way around. So I simply put $[w]_{\mathcal{B}}$ into the $[v]_{\mathcal{B}}$ spot and multiply:

$$\begin{bmatrix} 1 & 0 & 1 \\ 1 & 1 & 0 \\ 0 & 1 & 1 \end{bmatrix} \begin{bmatrix} 1 \\ 2 \\ -1 \end{bmatrix} = \begin{bmatrix} 0 \\ 3 \\ 1 \end{bmatrix}$$

Therefore $w = (0, 3, 1)$ in standard coordinates.

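Going this direction requires no equation solving, just a matrix-vector product; a sketch with the same sample matrix of basis columns and sample components $(1, 2, -1)$:

```python
import numpy as np

M = np.array([[1, 0, 1],      # columns are the basis vectors (sample values)
              [1, 1, 0],
              [0, 1, 1]])
coords = np.array([1, 2, -1]) # components relative to the basis

# Multiplying by M converts basis components to standard coordinates.
v_std = M @ coords
print(v_std)  # [0 3 1]
```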
Let me generalize the observation I made in (c).

• If $\mathcal{B}$ is a basis for $\mathbb{R}^n$ --- whose elements are written in terms of the standard basis, of course --- and $M$ is the matrix whose columns are the vectors in $\mathcal{B}$, then left multiplication by $M$ translates vectors written in terms of $\mathcal{B}$ to vectors written in terms of the standard basis.

I'll write $[\mathcal{B} \to \text{std}]$ for $M$, and call it a translation matrix. Again, $[\mathcal{B} \to \text{std}]$ translates vectors written in terms of $\mathcal{B}$ to vectors written in terms of the standard basis.

The inverse of a square matrix $M$ is a matrix $M^{-1}$ such that $M M^{-1} = M^{-1} M = I$, where $I$ is the identity matrix. If I multiply the equation $M [v]_{\mathcal{B}} = [v]_{\text{std}}$ on the left by $M^{-1}$, I get

$$[v]_{\mathcal{B}} = M^{-1} [v]_{\text{std}}.$$

In words, this means:

• Left multiplication by $M^{-1}$ translates vectors from the standard basis to $\mathcal{B}$.

This means that $M^{-1} = [\text{std} \to \mathcal{B}]$. Dispensing with $M$, I can say that

$$[\mathcal{B} \to \text{std}]^{-1} = [\text{std} \to \mathcal{B}].$$

In the example above, left multiplication by the following matrix translates vectors from $\mathcal{B}$ to the standard basis:

$$[\mathcal{B} \to \text{std}] = \begin{bmatrix} 1 & 0 & 1 \\ 1 & 1 & 0 \\ 0 & 1 & 1 \end{bmatrix}$$

The inverse of $[\mathcal{B} \to \text{std}]$ is

$$[\text{std} \to \mathcal{B}] = \frac{1}{2} \begin{bmatrix} 1 & 1 & -1 \\ -1 & 1 & 1 \\ 1 & -1 & 1 \end{bmatrix}$$

Left multiplication by this matrix translates vectors from the standard basis to $\mathcal{B}$.
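Both translation directions can be checked numerically with `np.linalg.inv`; this sketch again uses sample basis vectors $(1,1,0)$, $(0,1,1)$, $(1,0,1)$ as the columns of $M$:

```python
import numpy as np

M = np.array([[1, 0, 1],     # columns are the basis vectors (sample values)
              [1, 1, 0],
              [0, 1, 1]])
M_inv = np.linalg.inv(M)     # translates standard coordinates to the basis

v_std = np.array([2, 3, 5])  # a vector in standard coordinates
coords = M_inv @ v_std       # its components relative to the basis

print(np.allclose(coords, [0, 3, 2]))   # True
print(np.allclose(M @ coords, v_std))   # True: translating back recovers v
```

Note that for a single vector, `np.linalg.solve(M, v_std)` is preferable to forming the inverse explicitly; the inverse is worth computing when many vectors must be translated.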

Example. (Translating vectors from one basis to another) The translation analogy is a useful one, since it makes it easy to see how to set up arbitrary changes of basis.

For example, suppose

$$\mathcal{C} = \{(1, 0, 0), (1, 1, 0), (1, 1, 1)\}$$

is another basis for $\mathbb{R}^3$.

Here's how to translate vectors from $\mathcal{B}$ to $\mathcal{C}$:

$$[\mathcal{B} \to \mathcal{C}] = [\text{std} \to \mathcal{C}] \, [\mathcal{B} \to \text{std}].$$

Remember that the product is read from right to left! Thus, the composite operation translates a $\mathcal{B}$ vector to a standard vector, and then translates the resulting standard vector to a $\mathcal{C}$ vector. Moreover, I have matrices which perform each of the right-hand operations.

This matrix translates vectors from $\mathcal{B}$ to the standard basis:

$$[\mathcal{B} \to \text{std}] = \begin{bmatrix} 1 & 0 & 1 \\ 1 & 1 & 0 \\ 0 & 1 & 1 \end{bmatrix}$$

This matrix translates vectors from the standard basis to $\mathcal{C}$:

$$[\text{std} \to \mathcal{C}] = \begin{bmatrix} 1 & 1 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 1 \end{bmatrix}^{-1} = \begin{bmatrix} 1 & -1 & 0 \\ 0 & 1 & -1 \\ 0 & 0 & 1 \end{bmatrix}$$

Therefore, multiplication by the following matrix will translate vectors from $\mathcal{B}$ to $\mathcal{C}$:

$$[\mathcal{B} \to \mathcal{C}] = \begin{bmatrix} 1 & -1 & 0 \\ 0 & 1 & -1 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 0 & 1 \\ 1 & 1 & 0 \\ 0 & 1 & 1 \end{bmatrix} = \begin{bmatrix} 0 & -1 & 1 \\ 1 & 0 & -1 \\ 0 & 1 & 1 \end{bmatrix}$$

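The whole two-step translation can be carried out in NumPy. This sketch assumes sample bases: the columns of `M` and `N` below are illustrative choices for the $\mathcal{B}$ and $\mathcal{C}$ vectors, written in standard coordinates:

```python
import numpy as np

M = np.array([[1, 0, 1],   # columns: a sample basis B in standard coordinates
              [1, 1, 0],
              [0, 1, 1]])
N = np.array([[1, 1, 1],   # columns: a sample basis C in standard coordinates
              [0, 1, 1],
              [0, 0, 1]])

# Read right to left: first B -> standard (multiply by M),
# then standard -> C (multiply by inv(N)).
B_to_C = np.linalg.solve(N, M)   # same as inv(N) @ M, but numerically nicer

# Check on one vector: translating B-components directly to C-components
# agrees with going through standard coordinates.
coords_B = np.array([1, 2, -1])
print(np.allclose(B_to_C @ coords_B, np.linalg.solve(N, M @ coords_B)))  # True
```

Writing `solve(N, M)` instead of `inv(N) @ M` solves the system $N X = M$ column by column, which avoids forming an explicit inverse.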