Math 322

These problems are intended to help you study. The presence or absence of a problem or topic from this handout does not imply anything about what will or will not be on the final.

1. Write down the parametrized solution to the following system over :

2. Let R be a commutative ring with identity. Prove using the definitions of transpose, matrix addition, multiplication of a matrix by a scalar, and matrix equality that if and A and B are matrices with entries in R, then

3. Describe the possible row reduced echelon forms for real matrices.

4. Row reduce the following matrix to row-reduced echelon form over :

5. The real matrix A has the row reduced echelon form R:

Find:

(a) A basis for the row space of A.

(b) A basis for the column space of A.

(c) The rank of A.

(d) A basis for the null space of A.

(e) The nullity of A.

6. Suppose F is a field and . Suppose A and B are *similar* (so for some invertible matrix ).

(a) Prove that and are similar.

(b) Prove that if , then and are similar.

7. Let A be an matrix and let B be an matrix. Show that the row space of is contained in the row space of B.

8. Write the following matrix as a product where and B is a real matrix.

9. Let

Suppose is the linear transformation whose matrix is

(a) Find , where .

(b) Find , where .

(c) Find .

10. If A is an matrix of rank n, the *pseudoinverse* of A is defined to be

(a) Prove that if A is invertible, then .

(b) Prove that .

(c) Show that is symmetric.

11. Find the inverse of the following matrix over :

12. (a) Find the (classical) adjoint of

(b) Use the adjoint formula to compute .

13. Let . An inner product is defined on by

(a) Compute .

(b) Find the length of relative to this inner product.

(c) Find the cosine of the angle between and relative to this inner product.

14. The first two vectors in the following set are orthogonal:

Find an orthonormal set which spans the same subspace of .

15. Let

(a) Find a basis for the subspace of spanned by .

(b) Find a subset of which is a basis for the subspace of spanned by .

16. Compute the determinant of the real matrix and simplify:

17. Use Cramer's Rule to solve the following system of linear equations over :

18. Write as a product of elementary matrices.

19. Let V be the subset of which consists of all matrices A satisfying . Prove or disprove: V is a subspace of .

20. Find the eigenvalues and a complete independent set of eigenvectors for the following matrix. Find a diagonalizing matrix P and find the diagonal matrix D.

21. Prove that if and , then 0 is an eigenvalue of A.

22. Let F be a field and let . Prove, or disprove by specific counterexample: If p is an eigenvalue of A with eigenvector v and q is an eigenvalue of A with eigenvector w, then is an eigenvalue of A with eigenvector .

23. Consider the function given by

Check each axiom for a linear transformation. If the axiom holds, prove it. If the axiom does not hold, give a specific counterexample.

24. Consider the function given by

Check each axiom for a linear transformation. If the axiom holds, prove it. If the axiom does not hold, give a specific counterexample.

25. Suppose u, v, and w are vectors in a real inner product space, and

(a) Compute .

(b) Compute .

26. (a) Compute .

(b) Compute .

27. Explain why the complex dot product on is not defined by

28. A parallelogram has vertices , , , and , listed counterclockwise around the parallelogram. Find an affine transformation which takes the unit square , onto the parallelogram, so that the point is mapped to A.

29. Let V be a real inner product space, and let .

(a) Define

Prove that is a subspace of V.

(b) Define

Prove that W is *not* a subspace of V.

30. Let A be an real matrix. Show that every vector in the null space of A is orthogonal to every vector in the row space of A.

31. Solve the linear system

32. (a) Solve the linear system

(b) As , the solution curves approach a line. What is the line?

33. Solve the linear system

34. Compute , where

35. If is orthogonal, then . Prove by counterexample that the converse is false.

36. M is a real symmetric matrix with eigenvalues 2, -2, and 1.

is an eigenvector for 2.

is an eigenvector for -2.

(a) Find an eigenvector for 1.

(b) Find a diagonalizing matrix P for M. Find , and the corresponding diagonal matrix D. Find M.

37. Let

Find an orthogonal matrix O which diagonalizes A, and write down the corresponding diagonal matrix.

38. Let

Find an orthogonal matrix P which diagonalizes A, and write down the corresponding diagonal matrix.

39. (a) Suppose that . Prove that is Hermitian.

(b) Suppose that is Hermitian. Prove that is real.

40. Every Hermitian matrix H can be written as , where A is symmetric and B is skew-symmetric. Show how this works with the Hermitian matrix

41. is a vector space over . Is the set S of unitary matrices in a subspace of ?

42. Consider the following set of vectors in :

(a) Show that the first two vectors are orthogonal.

(b) Apply Gram-Schmidt to find a vector so that the vectors in the following set are mutually perpendicular and span the same subspace as the original set.

- Simplifying the answer is pretty tedious, so if you think you understand how to do this, you might want to stop at the point where you've computed all the dot products, but before you've combined the results into a single vector.

43. Find a unitary matrix U that diagonalizes

Write the corresponding diagonal matrix.

44. Find a unitary matrix U that diagonalizes

Write the corresponding diagonal matrix.

45. Find the Fourier expansion on the interval of

1. Write down the parametrized solution to the following system over :

Write down the augmented matrix and row reduce:

The equations corresponding to the row reduced echelon matrix are

Thus, and . Set and . The solution is
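Since the system itself is omitted from this copy, here is a sketch of the same row-reduction procedure on a made-up augmented matrix. The `rref` helper and the sample system are both hypothetical; `Fraction` keeps the arithmetic exact, as in hand computation over the rationals.

```python
from fractions import Fraction

def rref(M):
    """Reduce a matrix (a list of rows) to row reduced echelon form."""
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(M), len(M[0])
    r = 0
    for c in range(cols):
        # find a pivot in column c at or below row r
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        M[r] = [x / M[r][c] for x in M[r]]          # scale to get a leading 1
        for i in range(rows):
            if i != r and M[i][c] != 0:             # clear the rest of the column
                M[i] = [a - M[i][c] * b for a, b in zip(M[i], M[r])]
        r += 1
        if r == rows:
            break
    return M

# augmented matrix for the (made-up) system x + 2y + z = 3, 2x + 4y + 3z = 7
R = rref([[1, 2, 1, 3], [2, 4, 3, 7]])
# R is [[1, 2, 0, 2], [0, 0, 1, 1]]: x + 2y = 2 and z = 1, so setting y = t
# gives the parametrized solution x = 2 - 2t, y = t, z = 1.
assert R == [[1, 2, 0, 2], [0, 0, 1, 1]]
```

Reading the parametrized solution off the reduced matrix works exactly as in the solution above: the non-pivot columns supply the parameters.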

2. Let R be a commutative ring with identity. Prove using the definitions of transpose, matrix addition, multiplication of a matrix by a scalar, and matrix equality that if and A and B are matrices with entries in R, then

Since and are equal element-by-element, they are equal by definition of matrix equality.

3. Describe the possible row reduced echelon forms for real matrices.

A row reduced echelon matrix has either 0, 1, or 2 leading coefficients.

If there are no leading coefficients, the only possibility is the zero matrix:

If there is one leading coefficient, it is in the first row, and there are three possibilities:

( stands for any number.)

If there are two leading coefficients, there are three possibilities:

4. Row reduce the following matrix to row-reduced echelon form over :

5. The real matrix A has the row reduced echelon form R:

Find:

(a) A basis for the row space of A.

(b) A basis for the column space of A.

(c) The rank of A.

(d) A basis for the null space of A.

(e) The nullity of A.

(a) The nonzero rows of the row reduced echelon matrix form a basis for the row space:

(b) The leading coefficients occur in columns 1, 3, and 5. Therefore, the first, third, and fifth columns of A form a basis for the column space of A:

(c) The rank is the dimension of the row space or the column space, so the rank is 3.

(d) A vector is in the null space of A if and only if it's in the null space of R. In this case, it must satisfy

This produces the following equations:

Therefore, , , and , so

Therefore, a basis for the null space is

(e) The nullity of A is the dimension of the null space of A, so the nullity is 3.

6. Suppose F is a field and . Suppose A and B are *similar* (so for some invertible matrix ).

(a) Prove that and are similar.

(b) Prove that if , then and are similar.

(a)

(To get the last equation, I used the fact that .) Hence, and are similar.

(b)

The last equality follows from the fact that all the intermediate products cancel. (If you want to prove this rigorously, you can give an induction proof.) Hence, and are similar.
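The telescoping cancellation in part (b) is easy to confirm numerically. This is only an illustration with a hypothetical pair of similar matrices, not the problem's data:

```python
import numpy as np

A = np.array([[2.0, 1.0], [0.0, 3.0]])
P = np.array([[1.0, 1.0], [1.0, 2.0]])      # invertible (det = 1)
Pinv = np.linalg.inv(P)
B = P @ A @ Pinv                            # so A and B are similar

k = 5
# (P A P^{-1})^k telescopes to P A^k P^{-1}, since each interior P^{-1} P cancels
assert np.allclose(np.linalg.matrix_power(B, k),
                   P @ np.linalg.matrix_power(A, k) @ Pinv)
```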

7. Let A be an matrix and let B be an matrix. Show that the row space of is contained in the row space of B.

Write

Then

But each is a linear combination of the rows of B. For example, suppose and

Then

Since the rows of are linear combinations of the rows of B, the rows of are contained in the row space of B. Therefore, the same is true for *linear combinations* of the rows of --- and hence, the row space of is contained in the row space of B.
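To see the row-by-row statement concretely, the check below multiplies a pair of made-up matrices and verifies that each row of the product is the indicated combination of the rows of B:

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0], [3.0, -1.0, 2.0]])
B = np.array([[1.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 1.0, 1.0],
              [2.0, 0.0, 0.0, 1.0]])
AB = A @ B

# row i of AB equals sum over j of A[i][j] times (row j of B)
for i in range(A.shape[0]):
    combo = sum(A[i, j] * B[j] for j in range(B.shape[0]))
    assert np.allclose(AB[i], combo)
```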

8. Write the following matrix as a product where and B is a real matrix.

9. Let

Suppose is the linear transformation whose matrix is

(a) Find , where .

(b) Find , where .

(c) Find .

(a)

(b) From (a),

So

You could also do this by multiplying by --- it's essentially the same computation.

(c)

10. If A is an matrix of rank n, the *pseudoinverse* of A is defined to be

(a) Prove that if A is invertible, then .

(b) Prove that .

(c) Show that is symmetric.

(a) Suppose A is invertible. Then

(b)

(c)

Note: The intent is not that you should memorize the definition of the pseudoinverse. This problem is about whether you can take a new definition and work with it together with things you already know.
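In that spirit, here is a sketch of the definition in code, applied to a hypothetical 3-by-2 matrix of full column rank; the asserts check parts (b) and (c) numerically.

```python
import numpy as np

def pseudoinverse(A):
    """(A^T A)^{-1} A^T, defined when A has full column rank."""
    return np.linalg.inv(A.T @ A) @ A.T

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])                  # rank 2
Ap = pseudoinverse(A)

assert np.allclose(Ap @ A, np.eye(2))       # part (b): the pseudoinverse is a left inverse
S = A @ Ap
assert np.allclose(S, S.T)                  # part (c): the product the other way is symmetric
```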

11. Find the inverse of the following matrix over :

12. (a) Find the (classical) adjoint of

(b) Use the adjoint formula to compute .

First, I'll compute the cofactors. The cofactor is listed in the position.

The adjoint is the transpose of the matrix of cofactors:

(b) First, expanding by cofactors of the first row gives

Now in , so

13. Let . An inner product is defined on by

(a) Compute .

(b) Find the length of relative to this inner product.

(c) Find the cosine of the angle between and relative to this inner product.

(a) Compute .

(b)

(c) First,

Then

14. The first two vectors in the following set are orthogonal:

Find an orthonormal set which spans the same subspace of .

Apply Gram-Schmidt:

The following set is an orthogonal set which spans the same subspace:

The following set is an orthonormal set which spans the same subspace:
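Gram-Schmidt is mechanical enough to sketch in code. The input vectors below are hypothetical (the problem's vectors are omitted from this copy); the helper subtracts the projection onto each earlier vector and then normalizes.

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal list spanning the same subspace as `vectors`."""
    basis = []
    for v in vectors:
        w = v.astype(float)
        for u in basis:
            w -= (w @ u) * u               # subtract the projection onto u
        norm = np.linalg.norm(w)
        if norm > 1e-12:                   # skip vectors dependent on earlier ones
            basis.append(w / norm)
    return basis

# made-up input whose first two vectors are already orthogonal
vecs = [np.array([1, 1, 0]), np.array([1, -1, 0]), np.array([1, 0, 1])]
Q = gram_schmidt(vecs)

# the result is pairwise orthonormal
G = np.array([[u @ v for v in Q] for u in Q])
assert np.allclose(G, np.eye(len(Q)))
```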

15. Let

(a) Find a basis for the subspace of spanned by .

(b) Find a subset of which is a basis for the subspace of spanned by .

(a) Construct a matrix with the elements of as the rows and row reduce:

The nonzero rows of the row reduced echelon matrix form a basis for the row space, which in turn is the same as the span of . Therefore, a basis for the subspace of spanned by is

(b) Construct a matrix with the elements of as the columns and row reduce:

The leading columns occur in the first and second columns. Therefore, the first and second columns of the original matrix are independent. So a subset of which is a basis for the subspace spanned by is given by

16. Compute the determinant of the real matrix and simplify:

The idea is to simplify the determinant by performing row and column operations before expanding.

17. Use Cramer's Rule to solve the following system of linear equations over :

Hence,
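Cramer's rule for a 2-by-2 system is easy to mechanize. Since the system is omitted in this copy, the sketch below uses a made-up system and exact rational arithmetic; the helper and its data are hypothetical.

```python
from fractions import Fraction

def cramer_2x2(a, b, c, d, e, f):
    """Solve ax + by = e, cx + dy = f by Cramer's rule (coefficient det must be nonzero)."""
    det = a * d - b * c
    x = Fraction(e * d - b * f, det)       # replace the x-column with the constants
    y = Fraction(a * f - e * c, det)       # replace the y-column with the constants
    return x, y

# hypothetical system: 2x + y = 5, x - y = 1
x, y = cramer_2x2(2, 1, 1, -1, 5, 1)
assert (x, y) == (2, 1)
```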

18. Write as a product of elementary matrices.

Row reduce the matrix to the identity:

The elementary matrices which correspond to the row operations are

Then

Hence,

19. Let V be the subset of which consists of all matrices A satisfying . Prove or disprove: V is a subspace of .

It's always good to check first whether the supposed subspace contains the zero vector. , so no conclusion can be drawn.

V is not a subspace, since it's not closed under sums. For example,

However,

It does not square to itself:

You can also show that V is not closed under scalar multiplication. As noted earlier, . Consider

Then

Therefore, .

20. Find the eigenvalues and a complete independent set of eigenvectors for the following matrix. Find a diagonalizing matrix P and find the diagonal matrix D.

The eigenvalues are 1 and 2.

For ,

If denotes an eigenvector, the last matrix gives the equations

Hence, and . Thus,

is an eigenvector for .

For ,

If denotes an eigenvector, the last matrix gives the equation

Thus,

and are independent eigenvectors for .

A diagonalizing matrix is given by
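Numerically, a diagonalizing matrix can be found with `numpy.linalg.eig`, whose output columns are eigenvectors. The matrix below is a hypothetical stand-in, since the problem's matrix is omitted here:

```python
import numpy as np

A = np.array([[2.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

vals, P = np.linalg.eig(A)                  # columns of P are eigenvectors
D = np.diag(vals)
assert np.allclose(np.linalg.inv(P) @ A @ P, D)    # P^{-1} A P = D
```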

21. Prove that if and , then 0 is an eigenvalue of A.

If A is the zero matrix, then for any nonzero vector v, I have . Hence, 0 is an eigenvalue of A.

If A is not the zero matrix, then some column of A is nonzero --- say it is the column . Now means that if

In particular, I must have . Then , which says that 0 is an eigenvalue of A.

22. Let F be a field and let . Prove, or disprove by specific counterexample: If p is an eigenvalue of A with eigenvector v and q is an eigenvalue of A with eigenvector w, then is an eigenvalue of A with eigenvector .

The statement is false.

Consider the following matrix in :

is an eigenvalue of A with eigenvector , because

is an eigenvalue of A with eigenvector , because

However, is not an eigenvalue with eigenvector , because

23. Consider the function given by

Check each axiom for a linear transformation. If the axiom holds, prove it. If the axiom does not hold, give a specific counterexample.

Since , the sum axiom does not hold.

Since , the scalar multiplication axiom does not hold.

24. Consider the function given by

Let . Then

Hence, the sum axiom holds.

Let . Then

Hence, the scalar multiplication axiom holds.

25. Suppose u, v, and w are vectors in a real inner product space, and

(a) Compute .

(b) Compute .

(a)

(b)

Hence, .

26. (a) Compute .

(b) Compute .

(a)

(b)

27. Explain why the complex dot product on is not defined by

The correct definition is

This *does* give a nonnegative real number when a vector is multiplied by itself.

28. A parallelogram has vertices , , , and , listed counterclockwise around the parallelogram. Find an affine transformation which takes the unit square , onto the parallelogram, so that the point is mapped to A.

The vectors for the sides which start at A are and . Hence, I can use the transformation

(It's okay to switch the columns of the matrix, since this will also give a transformation meeting the requirements of the problem.)
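The construction can be sketched in code with made-up data (the problem's vertices are omitted in this copy): put the two side vectors in the columns of a matrix and translate by A.

```python
import numpy as np

# hypothetical parallelogram with corner A0 = (1, 2) and side vectors u, v
A0 = np.array([1.0, 2.0])
u = np.array([2.0, 0.0])
v = np.array([1.0, 3.0])

M = np.column_stack([u, v])                 # side vectors as columns

def T(p):
    """Affine map T(p) = M p + A0 taking the unit square onto the parallelogram."""
    return M @ p + A0

corners = [np.array(c, dtype=float) for c in [(0, 0), (1, 0), (0, 1), (1, 1)]]
images = [T(c) for c in corners]
assert np.allclose(images[0], A0)           # (0, 0) is mapped to the corner A0
assert np.allclose(images[3], A0 + u + v)   # (1, 1) goes to the opposite corner
```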

29. Let V be a real inner product space, and let .

(a) Define

Prove that is a subspace of V.

(b) Define

Prove that W is *not* a subspace of V.

(a) Let . I want to show . Now implies , and implies . Therefore

Hence, .

Let , and let . I want to show that . Now implies , so

Hence, .

Therefore, is a subspace of V.

(b) Suppose , so . Then

Hence, . Since W is not closed under scalar multiplication, W is not a subspace.

30. Let A be an real matrix. Show that every vector in the null space of A is orthogonal to every vector in the row space of A.

Suppose x is in the null space of A, so . Denoting the rows of A by , , ..., , this means that

But this means that

An element of the row space of A is a linear combination of the rows of A, say

Then

Hence, x is orthogonal to every element of the row space of A.
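A quick numerical illustration, using a made-up matrix and a vector checked to lie in its null space:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0]])
x = np.array([-1.0, -1.0, 1.0])
assert np.allclose(A @ x, 0)                # x is in the null space of A

# every row-space element c1*row1 + c2*row2 is orthogonal to x
for c1, c2 in [(1, 0), (0, 1), (2, -3)]:
    v = c1 * A[0] + c2 * A[1]
    assert abs(v @ x) < 1e-12
```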

31. Solve the linear system

The eigenvalues are and .

For ,

By inspection, is an eigenvector for .

For ,

By inspection, is an eigenvector for .

The solution is

32. (a) Solve the linear system

(b) As , the solution curves approach a line. What is the line?

(a) Let

Then

The eigenvalues are and .

For , I have

By inspection, is an eigenvector.

For , I have

By inspection, is an eigenvector.

The solution is

(b) As , I have . The second term in the solution goes to 0, so as ,

This means that and , so

So the curves approach the line , or .

33. Solve the linear system

Then

The roots are .

For , I have

(I can eliminate the first row, because it must be a multiple of the second. If it were not, the rows would form an independent set, the matrix would row-reduce to the identity, and the only solution to the corresponding homogeneous system would be . Since I know there are eigenvectors, and since an eigenvector must be nonzero, the homogeneous system must have a nonzero solution.)

Using a and b as the variables, the corresponding homogeneous system for the last matrix is

Since all I want is *some* nonzero solution, I can take and . Thus, an eigenvector is .

If the last shortcut is confusing, you can also do this by solving for a (say):

Then

Taking gives , as before.

Using the eigenvalue , I have the solution

Taking the real and imaginary parts of this solution gives two independent real solutions, which I use to get the general solution:

34. Compute , where

The eigenvalues are and .

First,

Next, , and

So

35. If is orthogonal, then . Prove by counterexample that the converse is false.

Consider the matrix

It has determinant 1, but it is not orthogonal: the columns aren't mutually perpendicular, and they don't have length 1.

36. M is a real symmetric matrix with eigenvalues 2, -2, and 1.

is an eigenvector for 2.

is an eigenvector for -2.

(a) Find an eigenvector for 1.

(b) Find a diagonalizing matrix P for M. Find , and the corresponding diagonal matrix D. Find M.

(a) Since M is symmetric, an eigenvector for 1 must be perpendicular to the eigenvectors for 2 and -2. So

This gives the system

Row reduce:

The parametrized solution is

Taking gives .

You could also do this by taking the cross product of the two eigenvectors, since they're in .

(b)

Since ,

37. Let

Find an orthogonal matrix O which diagonalizes A, and write down the corresponding diagonal matrix.

Let

The characteristic polynomial of A is

The eigenvalues are and .

For , partial row reduction gives

is an eigenvector for . It has length .

For , partial row reduction gives

is an eigenvector for . It has length .

The normalized vectors give an orthonormal set of eigenvectors for A:

Construct O by using the orthonormal vectors as the columns:

Moreover,
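For a symmetric matrix, `numpy.linalg.eigh` produces the orthonormal eigenvectors directly. A sketch on a hypothetical symmetric matrix (not the problem's, which is omitted here):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])      # made-up symmetric matrix

vals, O = np.linalg.eigh(A)                 # eigh: orthonormal eigenvectors as columns
D = np.diag(vals)
assert np.allclose(O.T @ O, np.eye(2))      # O is orthogonal
assert np.allclose(O.T @ A @ O, D)          # O^T A O is the diagonal matrix
```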

38. Let

Find an orthogonal matrix P which diagonalizes A, and write down the corresponding diagonal matrix.

The characteristic polynomial is

The eigenvalues are and .

For ,

With as a solution vector, the corresponding homogeneous system is , or . Thus,

Taking and , then and , I get the eigenvectors , . Observe that these vectors are already orthogonal. Dividing each by its length gives the orthonormal set

For ,

With as a solution vector, the corresponding homogeneous system is , , or . Thus,

Taking , I get the eigenvector .

Eigenvectors for different eigenvalues of a symmetric matrix must be orthogonal, and you can verify that this eigenvector is perpendicular to the two I already found. Dividing the vector by its length, I obtain

Thus,

Therefore, the diagonal matrix is

39. (a) Suppose that . Prove that is Hermitian.

(b) Suppose that is Hermitian. Prove that is real.

(a)

Therefore, is Hermitian.

(b) Since H is Hermitian, all the entries on the main diagonal are real, so the same is true for all the entries on the main diagonal of . So I only have to show that the off-diagonal entries of are real.

Suppose , and consider the element of . This is

However, since H is Hermitian,

That is, if , then . Hence,

That is, the element of is a real number. Hence, is a real matrix.

To see how this works for yourself, consider the Hermitian matrix

Then

Notice what happens when the off-diagonal entries are added.

40. Every Hermitian matrix H can be written as , where A is symmetric and B is skew-symmetric. Show how this works with the Hermitian matrix

For A, use the real parts of H; for B, use the imaginary parts of H. Thus,

41. is a vector space over . Is the set S of unitary matrices in a subspace of ?

The set of unitary matrices can't be a subspace of , because it does not contain the zero matrix. The zero matrix is not unitary, because its rows don't form an orthonormal set --- the zero vector does not have length 1.

42. Consider the following set of vectors in :

(a) Show that the first two vectors are orthogonal.

(b) Apply Gram-Schmidt to find a vector so that the vectors in the following set are mutually perpendicular and span the same subspace as the original set.

(a)

(b)

43. Find a unitary matrix U that diagonalizes

Write the corresponding diagonal matrix.

The eigenvalues are and .

For ,

By inspection, is an eigenvector. Note that .

For ,

By inspection, is an eigenvector. Note that .

Thus, a unitary matrix which diagonalizes A is given by

The diagonal matrix is
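Unitary diagonalization can also be checked numerically: `numpy.linalg.eigh` accepts Hermitian input and returns orthonormal eigenvector columns. The matrix below is a made-up Hermitian example, not the problem's:

```python
import numpy as np

H = np.array([[2.0, 1j],
              [-1j, 2.0]])                  # hypothetical Hermitian matrix

vals, U = np.linalg.eigh(H)
assert np.allclose(U.conj().T @ U, np.eye(2))          # U is unitary
assert np.allclose(U.conj().T @ H @ U, np.diag(vals))  # U* H U is diagonal (and real)
```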

44. Find a unitary matrix U that diagonalizes

Write the corresponding diagonal matrix.

The eigenvalues are -3 and 6.

For ,

Using as the solution vector, the last matrix gives the equation

Thus, .

Then

Hence, taking and , I find that is an eigenvector. To get a "nice" second eigenvector, take and (to clear the denominator); this gives as a second eigenvector.

For ,

Using as the solution vector, the last matrix gives the equations

Thus, is an eigenvector.

To get a unitary diagonalizing matrix, divide each eigenvector by its length. You get

The diagonal matrix is

45. Find the Fourier expansion on the interval of

For , I have

Hence, the Fourier expansion is
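Fourier coefficients on an interval can be approximated by numerical integration. The function in the problem is omitted from this copy, so the sketch below uses f(x) = x on the interval from -pi to pi, whose known expansion has coefficients b_n = 2(-1)^(n+1)/n; the `fourier_coeffs` helper is hypothetical and uses a simple midpoint rule.

```python
import math

def fourier_coeffs(f, n_max, samples=4000):
    """Midpoint-rule approximation of the Fourier coefficients of f on [-pi, pi]."""
    h = 2 * math.pi / samples
    xs = [-math.pi + (k + 0.5) * h for k in range(samples)]
    a0 = sum(f(x) for x in xs) * h / (2 * math.pi)
    a = [sum(f(x) * math.cos(n * x) for x in xs) * h / math.pi
         for n in range(1, n_max + 1)]
    b = [sum(f(x) * math.sin(n * x) for x in xs) * h / math.pi
         for n in range(1, n_max + 1)]
    return a0, a, b

a0, a, b = fourier_coeffs(lambda x: x, 3)
# f(x) = x is odd, so a0 and the cosine coefficients vanish
assert abs(a0) < 1e-6 and all(abs(an) < 1e-6 for an in a)
# the sine coefficients match 2(-1)^(n+1)/n
assert all(abs(bn - 2 * (-1) ** (n + 1) / n) < 1e-3 for n, bn in enumerate(b, start=1))
```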

*The best thing for being sad is to learn something.* - Merlyn, in T. H. White's *The Once and Future King*

Copyright 2020 by Bruce Ikenaga