Review Problems for the Final

Math 322

These problems are intended to help you study. The presence or absence of a problem or topic from this handout does not imply anything about what will or will not be on the final.

1. Write down the parametrized solution to the following system over $\integer_3$ :

$$w + 2 x + y + 2 z = 1, \quad y + z = 2.$$

2. Let R be a commutative ring with identity. Prove using the definitions of transpose, matrix addition, multiplication of a matrix by a scalar, and matrix equality that if $k \in R$ and A and B are matrices with entries in R, then

$$(k A + B)^T = k A^T + B^T.$$

3. Describe the possible row reduced echelon forms for $2 \times 3$ real matrices.

4. Row reduce the following matrix to row-reduced echelon form over $\integer_5$ :

$$\left[\matrix{ 1 & 4 & 0 & 2 & 4 \cr 3 & 1 & 1 & 1 & 0 \cr 1 & 2 & 0 & 4 & 3 \cr}\right]$$

5. The real matrix A has the row reduced echelon form R:

$$A = \left[\matrix{ 4 & -4 & 0 & 0 & -3 & -13 \cr 0 & 0 & 2 & 4 & 0 & 2 \cr -1 & 1 & 2 & 4 & 1 & 6 \cr 0 & 0 & 1 & 2 & 0 & 1 \cr}\right] \to \left[\matrix{ 1 & -1 & 0 & 0 & 0 & -1 \cr 0 & 0 & 1 & 2 & 0 & 1 \cr 0 & 0 & 0 & 0 & 1 & 3 \cr 0 & 0 & 0 & 0 & 0 & 0 \cr}\right] = R.$$

Find:

(a) A basis for the row space of A.

(b) A basis for the column space of A.

(c) The rank of A.

(d) A basis for the null space of A.

(e) The nullity of A.

6. Suppose F is a field and $A, B
   \in M(n, F)$ . Suppose A and B are similar (so $P A P^{-1} = B$ for some invertible matrix $P \in M(n, F)$ ).

(a) Prove that $A^T$ and $B^T$ are similar.

(b) Prove that if $n \ge 1$ , then $A^n$ and $B^n$ are similar.

7. Let A be an $m \times n$ matrix and let B be an $n \times p$ matrix. Show that the row space of $AB$ is contained in the row space of B.

8. Write the following matrix as a product $A B$ where $A = (x, y,
   z)^T$ and B is a $3 \times 3$ real matrix.

$$\left[\matrix{x + 4 y + 7 z & 2 x + 5 y + 8 z & 3 x + 6 y + 9 z \cr}\right].$$

9. Let

$${\cal B} = \left\{\left[\matrix{2 \cr 1 \cr}\right], \left[\matrix{3 \cr 1 \cr}\right]\right\}, \quad {\cal C} = \left\{\left[\matrix{1 \cr 0 \cr 1 \cr}\right], \left[\matrix{0 \cr 1 \cr 1 \cr}\right], \left[\matrix{1 \cr 1 \cr 0 \cr}\right]\right\}.$$

Suppose $T: \real^2 \to \real^3$ is the linear transformation whose matrix is

$$[T]_{{\cal B},{\cal C}} = \left[\matrix{1 & -1 \cr 0 & 2 \cr 3 & -1 \cr}\right].$$

(a) Find $T(\vec{v})_{\cal C}$ , where $\vec{v} = (1, 4)_{\cal B}$ .

(b) Find $T(\vec{v})_{\rm std}$ , where $\vec{v} = (1, 4)_{\cal B}$ .

(c) Find $[T]_{{\rm std},{\rm
   std}}$ .

10. If A is an $m \times n$ matrix of rank n, the pseudoinverse of A is defined to be

$$A^+ = (A^T A)^{-1} A^T.$$

(a) Prove that if A is invertible, then $A^+ = A^{-1}$ .

(b) Prove that $(A A^+)^2 = A
   A^+$ .

(c) Show that $A A^+$ is symmetric.

11. Find the inverse of the following matrix over $\integer_5$ :

$$\left[\matrix{ 1 & 4 & 2 \cr 0 & 3 & 0 \cr 1 & 1 & 1 \cr}\right].$$

12. (a) Find the (classical) adjoint of

$$A = \left[\matrix{ 1 & 0 & 3 \cr 2 & 2 & 1 \cr 4 & 1 & 1 \cr}\right] \in M(3, \integer_5).$$

(b) Use the adjoint formula to compute $A^{-1}$ .

13. Let $x, y \in \real^2$ . An inner product is defined on $\real^2$ by

$$\innp{x}{y} = \left[\matrix{x_1 & x_2 \cr}\right] \left[\matrix{ 5 & -8 \cr -8 & 13 \cr}\right] \left[\matrix{y_1 \cr y_2 \cr}\right].$$

(a) Compute $\innp{(3, -1)}{(1,
   2)}$ .

(b) Find the length of $(2, 1)$ relative to this inner product.

(c) Find the cosine of the angle between $(4, 0)$ and $(1, 1)$ relative to this inner product.

14. The first two vectors in the following set are orthogonal:

$$\{(2, -3, 1, 2), (1, 1, -1, 1), (2, 13, -1, 0)\}.$$

Find an orthonormal set which spans the same subspace of $\real^4$ .

15. Let

$${\cal B} = \left\{\left[\matrix{1 \cr 1 \cr -5 \cr 2 \cr}\right], \left[\matrix{-2 \cr -1 \cr 2 \cr -1 \cr}\right], \left[\matrix{8 \cr 5 \cr -16 \cr 7 \cr}\right]\right\}.$$

(a) Find a basis for the subspace of $\real^4$ spanned by ${\cal B}$ .

(b) Find a subset of ${\cal B}$ which is a basis for the subspace of $\real^4$ spanned by ${\cal
   B}$ .

16. Compute the determinant of the real matrix and simplify:

$$\left|\matrix{ -4 - 2 k & k & 4 - 2 k \cr -4 - k & k & 4 - k \cr -1 - k & 0 & 1 - k \cr}\right|.$$

17. Use Cramer's Rule to solve the following system of linear equations over $\real$ :

$$\eqalign{ x + 3 y & = 8 \cr 2 x + y & = -9 \cr}$$

18. Write $\left[\matrix{1 & -3
   \cr 2 & 2 \cr}\right] \in M(2,\real)$ as a product of elementary matrices.

19. Let V be the subset of $M(2,\real)$ which consists of all matrices A satisfying $A^2
   = A$ . Prove or disprove: V is a subspace of $M(2,\real)$ .

20. Find the eigenvalues and a complete independent set of eigenvectors for the following matrix. Find a diagonalizing matrix P and find the diagonal matrix D.

$$A = \left[\matrix{ 0 & 0 & 2 \cr -3 & 2 & 3 \cr -1 & 0 & 3 \cr}\right].$$

21. Prove that if $A \in M(n,
   \real)$ and $A^2 = 0$ , then 0 is an eigenvalue of A.

22. Let F be a field and let $A
   \in M(n, F)$ . Prove, or disprove by specific counterexample: If p is an eigenvalue of A with eigenvector v and q is an eigenvalue of A with eigenvector w, then $p + q$ is an eigenvalue of A with eigenvector $v + w$ .

23. Consider the function $f:
   \real^2 \to \real^2$ given by

$$f(x,y) = (x, x y).$$

Check each axiom for a linear transformation. If the axiom holds, prove it. If the axiom does not hold, give a specific counterexample.

24. Consider the function $f: M(2,
   \real) \to \real^2$ given by

$$f\left(\left[\matrix{a & b \cr c & d \cr}\right]\right) = \left[\matrix{a - d \cr b - c \cr}\right].$$

Check each axiom for a linear transformation. If the axiom holds, prove it. If the axiom does not hold, give a specific counterexample.

25. Suppose u, v, and w are vectors in a real inner product space, and

$$\|u\| = 5, \quad \innp{u}{v} = 8, \quad \|v\| = 3.$$

(a) Compute $\innp{u + 2 v}{u - 3
   v}$ .

(b) Compute $\|3 u + v\|$ .

26. (a) Compute $(1 + 2 i, 2
   i)\cdot (2 - i, 1 + i)$ .

(b) Compute $\|(2 + 3 i, 1 -
   i)\|$ .

27. Explain why the complex dot product on $\complex^n$ is not defined by

$$\hbox{``}(a_1, a_2, \ldots, a_n) \cdot (b_1, b_2, \ldots, b_n) = a_1 b_1 + a_2 b_2 + \cdots + a_n b_n \hbox{''}.$$

28. A parallelogram has vertices $A(2, 4)$ , $B(4, 1)$ , $C(5, 3)$ , and $D(3, 6)$ , listed counterclockwise around the parallelogram. Find an affine transformation which takes the unit square $0 \le u \le 1$ , $0 \le v \le 1$ onto the parallelogram, so that the point $(0, 0)$ is mapped to A.

29. Let V be a real inner product space, and let $v \in V$ .

(a) Define

$$v^\perp = \{x \in V \mid \innp{v}{x} = 0\}.$$

Prove that $v^\perp$ is a subspace of V.

(b) Define

$$W = \{x \in V \mid \innp{v}{x} = 1\}.$$

Prove that W is not a subspace of V.

30. Let A be an $m \times n$ real matrix. Show that every vector in the null space of A is orthogonal to every vector in the row space of A.

31. Solve the linear system

$$\left[\matrix{x' \cr y' \cr}\right] = \left[\matrix{2 & 3 \cr 3 & 2 \cr}\right] \left[\matrix{x \cr y \cr}\right].$$

32. (a) Solve the linear system

$$\left[\matrix{x' \cr y' \cr}\right] = \left[\matrix{ 1 & 6 \cr 1 & 2 \cr}\right] \left[\matrix{x \cr y \cr}\right].$$

(b) As $t \to \infty$ , the solution curves approach a line. What is the line?

33. Solve the linear system

$$\left[\matrix{x' \cr y' \cr}\right] = \left[\matrix{ 1 & 5 \cr -1 & 3 \cr}\right] \left[\matrix{x \cr y \cr}\right].$$

34. Compute $e^{A t}$ , where

$$A = \left[\matrix{ 2 & 4 \cr 4 & -4 \cr}\right].$$

35. If $A \in M(n, \real)$ is orthogonal, then $\det A = \pm 1$ . Prove by counterexample that the converse is false.

36. M is a real symmetric matrix with eigenvalues 2, -2, and 1.

$(1, 1, 0)$ is an eigenvector for 2.

$(1, -1, 1)$ is an eigenvector for -2.

(a) Find an eigenvector for 1.

(b) Find a diagonalizing matrix P for M. Find $P^{-1}$ , and the corresponding diagonal matrix D. Find M.

37. Let

$$A = \left[\matrix{3 & \sqrt{5} \cr \sqrt{5} & -1 \cr}\right].$$

Find an orthogonal matrix O which diagonalizes A, and write down the corresponding diagonal matrix.

38. Let

$$A = \left[\matrix{ -2 & 0 & 0 \cr 0 & 2 & 2 \cr 0 & 2 & -1 \cr}\right] \in M(3, \real).$$

Find an orthogonal matrix P which diagonalizes A, and write down the corresponding diagonal matrix.

39. (a) Suppose that $A \in M(n,
   \complex)$ . Prove that $A + A^*$ is Hermitian.

(b) Suppose that $H \in M(n,
   \complex)$ is Hermitian. Prove that $H + H^T$ is real.

40. Every Hermitian matrix H can be written as $H = A + i \cdot B$ , where A is symmetric and B is skew-symmetric. Show how this works with the Hermitian matrix

$$\left[\matrix{ 2 & 4 + 2 i & 3 - i \cr 4 - 2 i & 0 & 17 i \cr 3 + i & -17 i & -5 \cr}\right].$$

41. $M(n, \complex)$ is a vector space over $\complex$ . Is the set S of unitary matrices in $M(n, \complex)$ a subspace of $M(n, \complex)$ ?

42. Consider the following set of vectors in $\complex^4$ :

$$\left\{(1 + i, 2, 3, -1), (1, 3 - i, i, 7), (1, 1, 1, 1)\right\}.$$

(a) Show that the first two vectors are orthogonal.

(b) Apply Gram-Schmidt to find a vector $v \in \complex^4$ so that the vectors in the following set are mutually perpendicular and span the same subspace as the original set.

$$\left\{(1 + i, 2, 3, -1), (1, 3 - i, i, 7), v\right\}.$$

43. Find a unitary matrix U that diagonalizes

$$A = \left[\matrix{5 & 2 - 2 i \cr 2 + 2 i & -2 \cr}\right].$$

Write the corresponding diagonal matrix.

44. Find a unitary matrix U that diagonalizes

$$A = \left[\matrix{ -3 & 0 & 0 \cr 0 & 2 & 2 + 4 i \cr 0 & 2 - 4 i & 1 \cr}\right].$$

Write the corresponding diagonal matrix.

45. Find the Fourier expansion on the interval $-1 \le x \le 1$ of

$$f(x) = \cases{0 & if $-1 \le x \le 0$ \cr 1 & if $0 < x \le 1$ \cr}.$$


Solutions to the Review Problems for the Final

1. Write down the parametrized solution to the following system over $\integer_3$ :

$$w + 2 x + y + 2 z = 1, \quad y + z = 2.$$

Write down the augmented matrix and row reduce:

$$\left[\matrix{ 1 & 2 & 1 & 2 & 1 \cr 0 & 0 & 1 & 1 & 2 \cr}\right] \matrix{\to \cr r_1 \to r_1 + 2r_2 \cr} \left[\matrix{ 1 & 2 & 0 & 1 & 2 \cr 0 & 0 & 1 & 1 & 2 \cr}\right]$$

The equations corresponding to the row reduced echelon matrix are

$$w + 2 x + z = 2, \quad y + z = 2.$$

Thus, $w = x + 2 z + 2$ and $y = 2 z + 2$ . Set $x = s$ and $z = t$ . The solution is

$$w = s + 2 t + 2, \quad x = s, \quad y = 2 t + 2, \quad z = t.\quad\halmos$$
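You can also confirm the parametrized solution by brute force, letting s and t run over $\integer_3$ and plugging back into the original equations. A short Python sketch:

    # Check every (s, t) in Z_3 x Z_3 against the original system mod 3.
    for s in range(3):
        for t in range(3):
            w, x, y, z = (s + 2*t + 2) % 3, s, (2*t + 2) % 3, t
            assert (w + 2*x + y + 2*z) % 3 == 1 and (y + z) % 3 == 2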


2. Let R be a commutative ring with identity. Prove using the definitions of transpose, matrix addition, multiplication of a matrix by a scalar, and matrix equality that if $k \in R$ and A and B are matrices with entries in R, then

$$(k A + B)^T = k A^T + B^T.$$

$$\matrix{ [(k A + B)^T]_{ij} & = & (k A + B)_{ji} & \hbox{(Definition of transpose)} \cr & = & (k A)_{ji} + B_{ji} & \hbox{(Definition of matrix addition)} \cr & = & k(A)_{ji} + B_{ji} & \hbox{(Definition of scalar multiplication)} \cr & = & k(A^T)_{ij} + (B^T)_{ij} & \hbox{(Definition of transpose)} \cr & = & (k A^T)_{ij} + (B^T)_{ij} & \hbox{(Definition of scalar multiplication)} \cr & = & (k A^T + B^T)_{ij} & \hbox{(Definition of matrix addition)} \cr}$$

Since $(k A + B)^T$ and $k
   A^T + B^T$ are equal element-by-element, they are equal by definition of matrix equality.


3. Describe the possible row reduced echelon forms for $2 \times 3$ real matrices.

A $2 \times 3$ row reduced echelon matrix has either 0, 1, or 2 leading coefficients.

If there are no leading coefficients, the only possibility is the zero matrix:

$$\left[\matrix{0 & 0 & 0 \cr 0 & 0 & 0 \cr}\right].$$

If there is one leading coefficient, it is in the first row, and there are three possibilities:

$$\left[\matrix{1 & \star & \star \cr 0 & 0 & 0 \cr}\right], \quad \left[\matrix{0 & 1 & \star \cr 0 & 0 & 0 \cr}\right], \quad \left[\matrix{0 & 0 & 1 \cr 0 & 0 & 0 \cr}\right].$$

($\star$ stands for any number.)

If there are two leading coefficients, there are three possibilities:

$$\left[\matrix{1 & 0 & \star \cr 0 & 1 & \star \cr}\right], \quad \left[\matrix{1 & \star & 0 \cr 0 & 0 & 1 \cr}\right], \quad \left[\matrix{0 & 1 & 0 \cr 0 & 0 & 1 \cr}\right].\quad\halmos$$


4. Row reduce the following matrix to row-reduced echelon form over $\integer_5$ :

$$\left[\matrix{ 1 & 4 & 0 & 2 & 4 \cr 3 & 1 & 1 & 1 & 0 \cr 1 & 2 & 0 & 4 & 3 \cr}\right]$$

$$\left[\matrix{ 1 & 4 & 0 & 2 & 4 \cr 3 & 1 & 1 & 1 & 0 \cr 1 & 2 & 0 & 4 & 3 \cr}\right] \matrix{\to \cr r_{3} \to r_{3} + 4 r_{1} \cr} \left[\matrix{ 1 & 4 & 0 & 2 & 4 \cr 3 & 1 & 1 & 1 & 0 \cr 0 & 3 & 0 & 2 & 4 \cr}\right] \matrix{\to \cr r_{2} \to r_{2} + 2 r_{1} \cr}$$

$$\left[\matrix{ 1 & 4 & 0 & 2 & 4 \cr 0 & 4 & 1 & 0 & 3 \cr 0 & 3 & 0 & 2 & 4 \cr}\right] \matrix{\to \cr r_{2} \to 4 r_{2} \cr} \left[\matrix{ 1 & 4 & 0 & 2 & 4 \cr 0 & 1 & 4 & 0 & 2 \cr 0 & 3 & 0 & 2 & 4 \cr}\right] \matrix{\to \cr r_{1} \to r_{1} + r_{2} \cr}$$

$$\left[\matrix{ 1 & 0 & 4 & 2 & 1 \cr 0 & 1 & 4 & 0 & 2 \cr 0 & 3 & 0 & 2 & 4 \cr}\right] \matrix{\to \cr r_{3} \to r_{3} + 2 r_{2} \cr} \left[\matrix{ 1 & 0 & 4 & 2 & 1 \cr 0 & 1 & 4 & 0 & 2 \cr 0 & 0 & 3 & 2 & 3 \cr}\right] \matrix{\to \cr r_{3} \to 2 r_{3} \cr}$$

$$\left[\matrix{ 1 & 0 & 4 & 2 & 1 \cr 0 & 1 & 4 & 0 & 2 \cr 0 & 0 & 1 & 4 & 1 \cr}\right] \matrix{\to \cr r_{1} \to r_{1} + r_{3} \cr} \left[\matrix{ 1 & 0 & 0 & 1 & 2 \cr 0 & 1 & 4 & 0 & 2 \cr 0 & 0 & 1 & 4 & 1 \cr}\right] \matrix{\to \cr r_{2} \to r_{2} + r_{3} \cr}$$

$$\left[\matrix{ 1 & 0 & 0 & 1 & 2 \cr 0 & 1 & 0 & 4 & 3 \cr 0 & 0 & 1 & 4 & 1 \cr}\right]\quad\halmos$$
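If you want to double-check a reduction like this, here is a rough Python sketch of row reduction mod p (it assumes Python 3.8 or later, since pow(a, -1, p) is used for the inverse mod p):

    # Row reduce a matrix of integers mod the prime p.
    def rref_mod(M, p):
        M = [row[:] for row in M]
        rows, cols, r = len(M), len(M[0]), 0
        for c in range(cols):
            piv = next((i for i in range(r, rows) if M[i][c] % p), None)
            if piv is None:
                continue                      # no pivot in this column
            M[r], M[piv] = M[piv], M[r]       # move the pivot row up
            inv = pow(M[r][c], -1, p)         # inverse of the pivot mod p
            M[r] = [(inv * a) % p for a in M[r]]
            for i in range(rows):
                if i != r and M[i][c] % p:
                    M[i] = [(a - M[i][c] * b) % p for a, b in zip(M[i], M[r])]
            r += 1
        return M

    A = [[1, 4, 0, 2, 4], [3, 1, 1, 1, 0], [1, 2, 0, 4, 3]]
    print(rref_mod(A, 5))   # [[1, 0, 0, 1, 2], [0, 1, 0, 4, 3], [0, 0, 1, 4, 1]]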


5. The real matrix A has the row reduced echelon form R:

$$A = \left[\matrix{ 4 & -4 & 0 & 0 & -3 & -13 \cr 0 & 0 & 2 & 4 & 0 & 2 \cr -1 & 1 & 2 & 4 & 1 & 6 \cr 0 & 0 & 1 & 2 & 0 & 1 \cr}\right] \to \left[\matrix{ 1 & -1 & 0 & 0 & 0 & -1 \cr 0 & 0 & 1 & 2 & 0 & 1 \cr 0 & 0 & 0 & 0 & 1 & 3 \cr 0 & 0 & 0 & 0 & 0 & 0 \cr}\right] = R.$$

Find:

(a) A basis for the row space of A.

(b) A basis for the column space of A.

(c) The rank of A.

(d) A basis for the null space of A.

(e) The nullity of A.

(a) The nonzero rows of the row reduced echelon matrix form a basis for the row space:

$$\{(1, -1, 0, 0, 0, -1), (0, 0, 1, 2, 0, 1), (0, 0, 0, 0, 1, 3)\}.\quad\halmos$$

(b) The leading coefficients occur in columns 1, 3, and 5. Therefore, the first, third, and fifth columns of A form a basis for the column space of A:

$$\{(4, 0, -1, 0), (0, 2, 2, 1), (-3, 0, 1, 0)\}.\quad\halmos$$

(c) The rank is the dimension of the row space or the column space, so the rank is 3.

(d) A vector $(a, b, c, d, e,
   f)$ is in the null space of A if and only if it's in the null space of R. In this case, it must satisfy

$$\left[\matrix{1 & -1 & 0 & 0 & 0 & -1 \cr 0 & 0 & 1 & 2 & 0 & 1 \cr 0 & 0 & 0 & 0 & 1 & 3 \cr 0 & 0 & 0 & 0 & 0 & 0 \cr}\right] \left[\matrix{a \cr b \cr c \cr d \cr e \cr f \cr}\right] = \left[\matrix{0 \cr 0 \cr 0 \cr 0 \cr}\right].$$

This produces the following equations:

$$a - b - f = 0, \quad c + 2 d + f = 0, \quad e + 3 f = 0.$$

Therefore, $a = b + f$ , $c =
   -2d - f$ , and $e = -3f$ , so

$$\left[\matrix{a \cr b \cr c \cr d \cr e \cr f \cr}\right] = \left[\matrix{b + f \cr b \cr -2d - f \cr d \cr -3f \cr f \cr}\right] = b \cdot \left[\matrix{1 \cr 1 \cr 0 \cr 0 \cr 0 \cr 0 \cr}\right] + d \cdot \left[\matrix{0 \cr 0 \cr -2 \cr 1 \cr 0 \cr 0 \cr}\right] + f \cdot \left[\matrix{1 \cr 0 \cr -1 \cr 0 \cr -3 \cr 1 \cr}\right].$$

Therefore, a basis for the null space is

$$\{(1, 1, 0, 0, 0, 0), (0, 0, -2, 1, 0, 0), (1, 0, -1, 0, -3, 1)\} \quad\halmos$$

(e) The nullity of A is the dimension of the null space of A, so the nullity is 3.
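A quick NumPy check (assuming NumPy is installed) confirms the rank and the null space basis:

    import numpy as np

    A = np.array([[ 4, -4, 0, 0, -3, -13],
                  [ 0,  0, 2, 4,  0,   2],
                  [-1,  1, 2, 4,  1,   6],
                  [ 0,  0, 1, 2,  0,   1]])
    # The columns of N are the claimed basis vectors for the null space.
    N = np.array([[1, 1,  0, 0,  0, 0],
                  [0, 0, -2, 1,  0, 0],
                  [1, 0, -1, 0, -3, 1]]).T
    print(np.linalg.matrix_rank(A))   # 3
    print(A @ N)                      # the 4 x 3 zero matrix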


6. Suppose F is a field and $A, B
   \in M(n, F)$ . Suppose A and B are similar (so $P A P^{-1} = B$ for some invertible matrix $P \in M(n,
   F)$ ).

(a) Prove that $A^T$ and $B^T$ are similar.

(b) Prove that if $n \ge 1$ , then $A^n$ and $B^n$ are similar.

(a)

$$\eqalign{ P A P^{-1} & = B \cr (P A P^{-1})^T & = B^T \cr (P^{-1})^T A^T P^T & = B^T \cr (P^T)^{-1} A^T P^T & = B^T \cr}$$

(To get the last equation, I used the fact that $(M^{-1})^T = (M^T)^{-1}$ .) Hence, $A^T$ and $B^T$ are similar.

(b)

$$B^n = (P A P^{-1})^n = \overbrace{(P A P^{-1})(P A P^{-1})(P A P^{-1}) \cdots (P A P^{-1})}^{n\rm\;times} = P A^n P^{-1}.$$

The last equality follows from the fact that all the intermediate $P^{-1} P$ products cancel. (If you want to prove this rigorously, you can give an induction proof.) Hence, $A^n$ and $B^n$ are similar.


7. Let A be an $m \times n$ matrix and let B be an $n \times p$ matrix. Show that the row space of $A B$ is contained in the row space of B.

Write

$$A = \left[\matrix{\leftarrow & \bvec{r_1} & \rightarrow \cr \leftarrow & \bvec{r_2} & \rightarrow \cr & \vdots & \cr \leftarrow & \bvec{r_m} & \rightarrow \cr}\right].$$

Then

$$A B = \left[\matrix{\leftarrow & \bvec{r_1} & \rightarrow \cr \leftarrow & \bvec{r_2} & \rightarrow \cr & \vdots & \cr \leftarrow & \bvec{r_m} & \rightarrow \cr}\right]B = \left[\matrix{\leftarrow & \bvec{r_1} B & \rightarrow \cr \leftarrow & \bvec{r_2} B & \rightarrow \cr & \vdots & \cr \leftarrow & \bvec{r_m} B & \rightarrow \cr}\right].$$

But each $r_k B$ is a linear combination of the rows of B. For example, suppose $\bvec{r_1}
   = (a_{11}, a_{12}, \ldots a_{1n})$ and

$$B = \left[\matrix{\leftarrow & \bvec{s_1} & \rightarrow \cr \leftarrow & \bvec{s_2} & \rightarrow \cr & \vdots & \cr \leftarrow & \bvec{s_n} & \rightarrow \cr}\right].$$

Then

$$\bvec{r_1} B = \left[\matrix{a_{11} & a_{12} & \cdots & a_{1n} \cr}\right] \left[\matrix{\leftarrow & \bvec{s_1} & \rightarrow \cr \leftarrow & \bvec{s_2} & \rightarrow \cr & \vdots & \cr \leftarrow & \bvec{s_n} & \rightarrow \cr}\right] = a_{11}\bvec{s_1} + a_{12}\bvec{s_2} + \cdots + a_{1n}\bvec{s_n}.$$

Since the rows of $A B$ are linear combinations of the rows of B, the rows of $A B$ are contained in the row space of B. Therefore, the same is true for linear combinations of the rows of $A B$ --- and hence, the row space of $A B$ is contained in the row space of B.


8. Write the following matrix as a product $A B$ where $A = (x, y,
   z)^T$ and B is a $3 \times 3$ real matrix.

$$\left[\matrix{x + 4 y + 7 z & 2 x + 5 y + 8 z & 3 x + 6 y + 9 z \cr}\right].$$

$$\left[\matrix{ x + 4 y + 7 z & 2 x + 5 y + 8 z & 3 x + 6 y + 9 z \cr}\right] = \left[\matrix{x & y & z \cr}\right] \left[\matrix{ 1 & 2 & 3 \cr 4 & 5 & 6 \cr 7 & 8 & 9 \cr}\right]. \quad\halmos$$


9. Let

$${\cal B} = \left\{\left[\matrix{2 \cr 1 \cr}\right], \left[\matrix{3 \cr 1 \cr}\right]\right\}, \quad {\cal C} = \left\{\left[\matrix{1 \cr 0 \cr 1 \cr}\right], \left[\matrix{0 \cr 1 \cr 1 \cr}\right], \left[\matrix{1 \cr 1 \cr 0 \cr}\right]\right\}.$$

Suppose $T: \real^2 \to \real^3$ is the linear transformation whose matrix is

$$[T]_{{\cal B},{\cal C}} = \left[\matrix{1 & -1 \cr 0 & 2 \cr 3 & -1 \cr}\right].$$

(a) Find $T(\vec{v})_{\cal C}$ , where $\vec{v} = (1, 4)_{\cal B}$ .

(b) Find $T(\vec{v})_{\rm std}$ , where $\vec{v} = (1, 4)_{\cal B}$ .

(c) Find $[T]_{{\rm std},{\rm
   std}}$ .

(a)

$$T(\vec{v})_{\cal C} = \left[\matrix{1 & -1 \cr 0 & 2 \cr 3 & -1 \cr}\right]\left[\matrix{1 \cr 4 \cr}\right] = \left[\matrix{-3 \cr 8 \cr -1 \cr}\right]_{\cal C}.\quad\halmos$$

(b) From (a),

$$T(\vec{v})_{\cal C} = \left[\matrix{-3 \cr 8 \cr -1 \cr}\right]_{\cal C}.$$

So

$$T(\vec{v})_{\rm std} = (-3)\cdot \left[\matrix{1 \cr 0 \cr 1 \cr}\right] + 8\cdot \left[\matrix{0 \cr 1 \cr 1 \cr}\right] + (-1)\cdot \left[\matrix{1 \cr 1 \cr 0 \cr}\right] = \left[\matrix{-4 \cr 7 \cr 5 \cr}\right].$$

You could also do this by multiplying $(-3, 8, -1)_{\cal C}$ by $[{\cal C} \to {\rm std}]$ --- it's essentially the same computation.

(c)

$$[T]_{{\rm std},{\rm std}} = [{\cal C} \to {\rm std}] [T]_{{\cal B},{\cal C}}[{\rm std} \to {\cal B}] = \left[\matrix{1 & 0 & 1 \cr 0 & 1 & 1 \cr 1 & 1 & 0 \cr}\right] \left[\matrix{1 & -1 \cr 0 & 2 \cr 3 & -1 \cr}\right] \left[\matrix{2 & 3 \cr 1 & 1 \cr}\right]^{-1} = \left[\matrix{-6 & 16 \cr -2 & 7 \cr 0 & 1 \cr}\right].\quad\halmos$$
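Here is a short NumPy check of the last computation (a sketch, assuming NumPy is installed):

    import numpy as np

    C = np.array([[1, 0, 1], [0, 1, 1], [1, 1, 0]])   # columns are the C-basis vectors
    B = np.array([[2, 3], [1, 1]])                    # columns are the B-basis vectors
    T_BC = np.array([[1, -1], [0, 2], [3, -1]])
    print(C @ T_BC @ np.linalg.inv(B))   # [[-6, 16], [-2, 7], [0, 1]]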


10. If A is an $m \times n$ matrix of rank n, the pseudoinverse of A is defined to be

$$A^+ = (A^T A)^{-1} A^T.$$

(a) Prove that if A is invertible, then $A^+ = A^{-1}$ .

(b) Prove that $(A A^+)^2 = A
   A^+$ .

(c) Show that $A A^+$ is symmetric.

(a) Suppose A is invertible. Then

$$A^+ = (A^T A)^{-1} A^T = A^{-1} (A^T)^{-1} A^T = A^{-1} I = A^{-1}.\quad\halmos$$

(b)

$$(A A^+)^2 = \left[A (A^T A)^{-1} A^T\right]^2 = A (A^T A)^{-1} A^T A (A^T A)^{-1} A^T = A (A^T A)^{-1} \left[(A^T A) (A^T A)^{-1}\right] A^T =$$

$$A(A^T A)^{-1}\left[I\right] A^T = A (A^T A)^{-1} A^T = A A^+.\quad\halmos$$

(c)

$$(A A^+)^T = \left[A (A^T A)^{-1} A^T\right]^T = A \left[(A^T A)^{-1}\right]^T A^T = A \left[(A^T A)^T\right]^{-1} A^T = A (A^T A)^{-1} A^T = A A^+.\quad\halmos$$

Note: The intent is not that you should memorize the definition of the pseudoinverse. This problem is about whether you can take a new definition and work with it together with things you already know.


11. Find the inverse of the following matrix over $\integer_5$ :

$$\left[\matrix{ 1 & 4 & 2 \cr 0 & 3 & 0 \cr 1 & 1 & 1 \cr}\right].$$

$$\left[\matrix{ 1 & 4 & 2 & 1 & 0 & 0 \cr 0 & 3 & 0 & 0 & 1 & 0 \cr 1 & 1 & 1 & 0 & 0 & 1 \cr}\right] \matrix{\to \cr r_3 \to r_3 + 4r_1 \cr} \left[\matrix{ 1 & 4 & 2 & 1 & 0 & 0 \cr 0 & 3 & 0 & 0 & 1 & 0 \cr 0 & 2 & 4 & 4 & 0 & 1 \cr}\right] \matrix{\to \cr r_3 \to r_3 + r_2 \cr}$$

$$\left[\matrix{ 1 & 4 & 2 & 1 & 0 & 0 \cr 0 & 3 & 0 & 0 & 1 & 0 \cr 0 & 0 & 4 & 4 & 1 & 1 \cr}\right] \matrix{\to \cr r_2 \to 2r_2 \cr} \left[\matrix{ 1 & 4 & 2 & 1 & 0 & 0 \cr 0 & 1 & 0 & 0 & 2 & 0 \cr 0 & 0 & 4 & 4 & 1 & 1 \cr}\right] \matrix{\to \cr r_1 \to r_1 + r_2 \cr}$$

$$\left[\matrix{ 1 & 0 & 2 & 1 & 2 & 0 \cr 0 & 1 & 0 & 0 & 2 & 0 \cr 0 & 0 & 4 & 4 & 1 & 1 \cr}\right] \matrix{\to \cr r_3 \to 4r_3 \cr} \left[\matrix{ 1 & 0 & 2 & 1 & 2 & 0 \cr 0 & 1 & 0 & 0 & 2 & 0 \cr 0 & 0 & 1 & 1 & 4 & 4 \cr}\right] \matrix{\to \cr r_1 \to r_1 + 3r_3 \cr}$$

$$\left[\matrix{ 1 & 0 & 0 & 4 & 4 & 2 \cr 0 & 1 & 0 & 0 & 2 & 0 \cr 0 & 0 & 1 & 1 & 4 & 4 \cr}\right]$$

$$\left[\matrix{ 1 & 4 & 2 \cr 0 & 3 & 0 \cr 1 & 1 & 1 \cr}\right]^{-1} = \left[\matrix{ 4 & 4 & 2 \cr 0 & 2 & 0 \cr 1 & 4 & 4 \cr}\right].\quad\halmos$$
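You can verify the answer by multiplying and reducing mod 5 --- for instance, with NumPy (if it's installed):

    import numpy as np

    A    = np.array([[1, 4, 2], [0, 3, 0], [1, 1, 1]])
    Ainv = np.array([[4, 4, 2], [0, 2, 0], [1, 4, 4]])
    print((A @ Ainv) % 5)   # the 3 x 3 identity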


12. (a) Find the (classical) adjoint of

$$A = \left[\matrix{ 1 & 0 & 3 \cr 2 & 2 & 1 \cr 4 & 1 & 1 \cr}\right] \in M(3, \integer_5).$$

(b) Use the adjoint formula to compute $A^{-1}$ .

First, I'll compute the cofactors. The $(i, j)^{\rm th}$ cofactor is listed in the $(i, j)^{\rm
   th}$ position.

$$\matrix{ \left|\matrix{2 & 1 \cr 1 & 1 \cr}\right| = 1 & -\left|\matrix{2 & 1 \cr 4 & 1 \cr}\right| = 2 & \left|\matrix{2 & 2 \cr 4 & 1 \cr}\right| = -6 = 4 \cr -\left|\matrix{0 & 3 \cr 1 & 1 \cr}\right| = 3 & \left|\matrix{1 & 3 \cr 4 & 1 \cr}\right| = -11 = 4 & -\left|\matrix{1 & 0 \cr 4 & 1 \cr}\right| = -1 = 4 \cr \left|\matrix{0 & 3 \cr 2 & 1 \cr}\right| = -6 = 4 & -\left|\matrix{1 & 3 \cr 2 & 1 \cr}\right| = 5 = 0 & \left|\matrix{1 & 0 \cr 2 & 2 \cr}\right| = 2 \cr}$$

The adjoint is the transpose of the matrix of cofactors:

$$\adj(A) = \left[\matrix{ 1 & 3 & 4 \cr 2 & 4 & 0 \cr 4 & 4 & 2 \cr}\right].\quad\halmos$$

(b) First, expanding by cofactors of the first row gives

$$|A| = \left|\matrix{ 1 & 0 & 3 \cr 2 & 2 & 1 \cr 4 & 1 & 1 \cr}\right| = 1 \cdot \left|\matrix{2 & 1 \cr 1 & 1 \cr}\right| - 0 \cdot \left|\matrix{2 & 1 \cr 4 & 1 \cr}\right| + 3 \cdot \left|\matrix{2 & 2 \cr 4 & 1 \cr}\right| = 3.$$

Now $3^{-1} = 2$ in $\integer_5$ , so

$$A^{-1} = |A|^{-1} \adj(A) = 2 \cdot \left[\matrix{ 1 & 3 & 4 \cr 2 & 4 & 0 \cr 4 & 4 & 2 \cr}\right] = \left[\matrix{ 2 & 1 & 3 \cr 4 & 3 & 0 \cr 3 & 3 & 4 \cr}\right].\quad\halmos$$
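As a check, $\adj(A) \cdot A$ should equal $|A| \cdot I$ in $\integer_5$ . A short NumPy sketch (assuming NumPy is installed):

    import numpy as np

    A   = np.array([[1, 0, 3], [2, 2, 1], [4, 1, 1]])
    adj = np.array([[1, 3, 4], [2, 4, 0], [4, 4, 2]])
    print((adj @ A) % 5)   # 3 times the identity, since |A| = 3 in Z_5
    print((2 * adj) % 5)   # the inverse of A, since 3^(-1) = 2 in Z_5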


13. Let $x, y \in \real^2$ . An inner product is defined on $\real^2$ by

$$\innp{x}{y} = \left[\matrix{x_1 & x_2 \cr}\right] \left[\matrix{ 5 & -8 \cr -8 & 13 \cr}\right] \left[\matrix{y_1 \cr y_2 \cr}\right].$$

(a) Compute $\innp{(3, -1)}{(1,
   2)}$ .

(b) Find the length of $(2, 1)$ relative to this inner product.

(c) Find the cosine of the angle between $(4, 0)$ and $(1, 1)$ relative to this inner product.

(a) Compute $\innp{(3, -1)}{(1,
   2)}$ .

$$\innp{(3, -1)}{(1, 2)} = \left[\matrix{3 & -1 \cr}\right] \left[\matrix{ 5 & -8 \cr -8 & 13 \cr}\right] \left[\matrix{1 \cr 2 \cr}\right] = -51.\quad\halmos$$

(b)

$$\|(2, 1)\| = \sqrt{\innp{(2, 1)}{(2, 1)}} = \sqrt{\left[\matrix{2 & 1 \cr}\right] \left[\matrix{ 5 & -8 \cr -8 & 13 \cr}\right] \left[\matrix{2 \cr 1 \cr}\right]} = \sqrt{1} = 1.\quad\halmos$$

(c) First,

$$\|(4, 0)\| = \sqrt{\innp{(4, 0)}{(4, 0)}} = \sqrt{\left[\matrix{4 & 0 \cr}\right] \left[\matrix{ 5 & -8 \cr -8 & 13 \cr}\right] \left[\matrix{4 \cr 0 \cr}\right]} = \sqrt{80},$$

$$\|(1, 1)\| = \sqrt{\innp{(1, 1)}{(1, 1)}} = \sqrt{\left[\matrix{1 & 1 \cr}\right] \left[\matrix{ 5 & -8 \cr -8 & 13 \cr}\right] \left[\matrix{1 \cr 1 \cr}\right]} = \sqrt{2},$$

$$\innp{(4, 0)}{(1, 1)} = \left[\matrix{4 & 0 \cr}\right] \left[\matrix{ 5 & -8 \cr -8 & 13 \cr}\right] \left[\matrix{1 \cr 1 \cr}\right] = -12.$$

Then

$$\cos \theta = \dfrac{\innp{(4, 0)}{(1, 1)}}{\|(4, 0)\| \|(1, 1)\|} = \dfrac{-12}{\sqrt{80}\sqrt{2}}.\quad\halmos$$


14. The first two vectors in the following set are orthogonal:

$$\{(2, -3, 1, 2), (1, 1, -1, 1), (2, 13, -1, 0)\}.$$

Find an orthonormal set which spans the same subspace of $\real^4$ .

Apply Gram-Schmidt:

$$(2, 13, -1, 0) - \dfrac{(2, 13, -1, 0) \cdot (2, -3, 1, 2)} {(2, -3, 1, 2) \cdot (2, -3, 1, 2)} (2, -3, 1, 2) - \dfrac{(2, 13, -1, 0) \cdot (1, 1, -1, 1)} {(1, 1, -1, 1) \cdot (1, 1, -1, 1)} (1, 1, -1, 1) =$$

$$(2, 13, -1, 0) + 2 \cdot (2, -3, 1, 2) - 4 \cdot (1, 1, -1, 1) = (2, 3, 5, 0).$$

The following set is an orthogonal set which spans the same subspace:

$$\{(2, -3, 1, 2), (1, 1, -1, 1), (2, 3, 5, 0)\}.$$

The following set is an orthonormal set which spans the same subspace:

$$\left\{\dfrac{1}{\sqrt{18}}(2, -3, 1, 2), \dfrac{1}{2}(1, 1, -1, 1), \dfrac{1}{\sqrt{38}}(2, 3, 5, 0)\right\}.\quad\halmos$$
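A quick dot-product check (with NumPy, if you have it) confirms that the three vectors are mutually orthogonal and gives the squared lengths used above:

    import numpy as np

    u = np.array([2, -3, 1, 2])
    v = np.array([1, 1, -1, 1])
    w = np.array([2, 3, 5, 0])
    print(u @ v, u @ w, v @ w)   # 0 0 0
    print(u @ u, v @ v, w @ w)   # 18 4 38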


15. Let

$${\cal B} = \left\{\left[\matrix{1 \cr 1 \cr -5 \cr 2 \cr}\right], \left[\matrix{-2 \cr -1 \cr 2 \cr -1 \cr}\right], \left[\matrix{8 \cr 5 \cr -16 \cr 7 \cr}\right]\right\}.$$

(a) Find a basis for the subspace of $\real^4$ spanned by ${\cal B}$ .

(b) Find a subset of ${\cal B}$ which is a basis for the subspace of $\real^4$ spanned by ${\cal
   B}$ .

(a) Construct a matrix with the elements of ${\cal B}$ as the rows and row reduce:

$$\left[\matrix{1 & 1 & -5 & 2 \cr -2 & -1 & 2 & -1 \cr 8 & 5 & -16 & 7 \cr}\right] \to \left[\matrix{1 & 0 & 3 & -1 \cr 0 & 1 & -8 & 3 \cr 0 & 0 & 0 & 0 \cr}\right].$$

The nonzero rows of the row reduced echelon matrix form a basis for the row space, which in turn is the same as the span of ${\cal B}$ . Therefore, a basis for the subspace of $\real^4$ spanned by ${\cal
   B}$ is

$$\{(1, 0, 3, -1), (0, 1, -8, 3)\}.\quad\halmos$$

(b) Construct a matrix with the elements of ${\cal B}$ as the columns and row reduce:

$$\left[\matrix{1 & -2 & 8 \cr 1 & -1 & 5 \cr -5 & 2 & -16 \cr 2 & -1 & 7 \cr}\right] \to \left[\matrix{1 & 0 & 2 \cr 0 & 1 & -3 \cr 0 & 0 & 0 \cr 0 & 0 & 0 \cr}\right].$$

The leading columns occur in the first and second columns. Therefore, the first and second columns of the original matrix are independent. So a subset of ${\cal B}$ which is a basis for the subspace spanned by ${\cal B}$ is given by

$$\left\{\left[\matrix{1 \cr 1 \cr -5 \cr 2 \cr}\right], \left[\matrix{-2 \cr -1 \cr 2 \cr -1 \cr}\right]\right\}.\quad\halmos$$


16. Compute the determinant of the real matrix and simplify:

$$\left|\matrix{ -4 - 2 k & k & 4 - 2 k \cr -4 - k & k & 4 - k \cr -1 - k & 0 & 1 - k \cr}\right|.$$

The idea is to simplify the determinant by performing row and column operations before expanding.

$$\left|\matrix{ -4 - 2 k & k & 4 - 2 k \cr -4 - k & k & 4 - k \cr -1 - k & 0 & 1 - k \cr}\right| \matrix{\to \cr r_1 \to r_1 - r_2 \cr} \left|\matrix{ -k & 0 & -k \cr -4 - k & k & 4 - k \cr -1 - k & 0 & 1 - k \cr}\right| \matrix{\to \cr c_1 \to c_1 - c_3 \cr}$$

$$\left|\matrix{ 0 & 0 & -k \cr -8 & k & 4 - k \cr -2 & 0 & 1 - k \cr}\right| = (-k) \left|\matrix{-8 & k \cr -2 & 0 \cr}\right| = (-k)(2 k) = -2 k^2.\quad\halmos$$
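If you'd like to confirm the answer symbolically, here is a SymPy sketch (assuming SymPy is installed):

    from sympy import symbols, Matrix, simplify

    k = symbols('k')
    M = Matrix([[-4 - 2*k, k, 4 - 2*k],
                [-4 - k,   k, 4 - k],
                [-1 - k,   0, 1 - k]])
    print(simplify(M.det()))   # -2*k**2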


17. Use Cramer's Rule to solve the following system of linear equations over $\real$ :

$$\eqalign{ x + 3 y & = 8 \cr 2 x + y & = -9 \cr}$$

$$\left|\matrix{1 & 3 \cr 2 & 1 \cr}\right| = -5, \quad \left|\matrix{8 & 3 \cr -9 & 1 \cr}\right| = 35, \quad \left|\matrix{1 & 8 \cr 2 & -9 \cr}\right| = -25.$$

Hence,

$$x = \dfrac{35}{-5} = -7 \quad\hbox{and}\quad y = \dfrac{-25}{-5} = 5.\quad\halmos$$


18. Write $\left[\matrix{1 & -3
   \cr 2 & 2 \cr}\right] \in M(2,\real)$ as a product of elementary matrices.

Row reduce the matrix to the identity:

$$\left[\matrix{1 & -3 \cr 2 & 2 \cr}\right] \matrix{\to \cr r_2 \to r_2-2r_1 \cr} \left[\matrix{1 & -3 \cr 0 & 8 \cr}\right] \matrix{\to \cr r_2 \to r_2/8 \cr} \left[\matrix{1 & -3 \cr 0 & 1 \cr}\right] \matrix{\to \cr r_1 \to r_1+3r_2 \cr} \left[\matrix{1 & 0 \cr 0 & 1 \cr}\right].$$

The elementary matrices which correspond to the row operations are

$$(r_2 \to r_2-2r_1): \quad E_1 = \left[\matrix{1 & 0 \cr -2 & 1 \cr}\right]$$

$$(r_2 \to r_2/8): \quad E_2 = \left[\matrix{1 & 0 \cr \noalign{\vskip2pt} 0 & \dfrac{1}{8} \cr}\right]$$

$$(r_1 \to r_1+3r_2): \quad E_3 = \left[\matrix{1 & 3 \cr 0 & 1 \cr}\right]$$

Then

$$E_3E_2E_1\left[\matrix{1 & -3 \cr 2 & 2 \cr}\right] = \left[\matrix{1 & 0 \cr 0 & 1 \cr}\right].$$

Hence,

$$\left[\matrix{1 & -3 \cr 2 & 2 \cr}\right] = E_1^{-1}E_2^{-1}E_3^{-1}\left[\matrix{1 & 0 \cr 0 & 1 \cr}\right] = E_1^{-1}E_2^{-1}E_3^{-1} = \left[\matrix{1 & 0 \cr 2 & 1 \cr}\right] \left[\matrix{1 & 0 \cr 0 & 8 \cr}\right] \left[\matrix{1 & -3 \cr 0 & 1 \cr}\right].\quad\halmos$$


19. Let V be the subset of $M(2,\real)$ which consists of all matrices A satisfying $A^2
   = A$ . Prove or disprove: V is a subspace of $M(2,\real)$ .

It's always good to check first whether the supposed subspace contains the zero vector. $\left[\matrix{0 & 0 \cr 0 & 0 \cr}\right] \in V$ , so no conclusion can be drawn.

V is not a subspace, since it's not closed under sums. For example,

$$\left[\matrix{1 & 0 \cr 0 & 1 \cr}\right]^2 = \left[\matrix{1 & 0 \cr 0 & 1 \cr}\right], \quad\hbox{so}\quad \left[\matrix{1 & 0 \cr 0 & 1 \cr}\right] \in V.$$

However,

$$\left[\matrix{1 & 0 \cr 0 & 1 \cr}\right] + \left[\matrix{1 & 0 \cr 0 & 1 \cr}\right] = \left[\matrix{2 & 0 \cr 0 & 2 \cr}\right] \notin V.$$

It does not square to itself:

$$\left[\matrix{2 & 0 \cr 0 & 2 \cr}\right]^2 = \left[\matrix{4 & 0 \cr 0 & 4 \cr}\right] \ne \left[\matrix{2 & 0 \cr 0 & 2 \cr}\right].$$

You can also show that V is not closed under scalar multiplication. As noted earlier, $\left[\matrix{1 & 0 \cr 0 & 1 \cr}\right] \in V$ . Consider

$$2 \cdot \left[\matrix{1 & 0 \cr 0 & 1 \cr}\right] = \left[\matrix{2 & 0 \cr 0 & 2 \cr}\right].$$

Then

$$\left[\matrix{2 & 0 \cr 0 & 2 \cr}\right]^2 = \left[\matrix{4 & 0 \cr 0 & 4 \cr}\right] \ne \left[\matrix{2 & 0 \cr 0 & 2 \cr}\right].$$

Therefore, $2 \cdot
   \left[\matrix{1 & 0 \cr 0 & 1 \cr}\right] \notin V$ .


20. Find the eigenvalues and a complete independent set of eigenvectors for the following matrix. Find a diagonalizing matrix P and find the diagonal matrix D.

$$A = \left[\matrix{ 0 & 0 & 2 \cr -3 & 2 & 3 \cr -1 & 0 & 3 \cr}\right].$$

Find the characteristic polynomial:

$$|A - xI| = \left|\matrix{ -x & 0 & 2 \cr -3 & 2 - x & 3 \cr -1 & 0 & 3 - x \cr}\right| = (2 - x) \left|\matrix{-x & 2 \cr -1 & 3 - x \cr}\right| = (2 - x)(x^2 - 3 x + 2) = -(x - 2)^2(x - 1).$$

The eigenvalues are 1 and 2.

For $\lambda = 1$ ,

$$A - I = \left[\matrix{ -1 & 0 & 2 \cr -3 & 1 & 3 \cr -1 & 0 & 2 \cr}\right] \to \left[\matrix{ 1 & 0 & -2 \cr 0 & 1 & -3 \cr 0 & 0 & 0 \cr}\right]$$

If $(a, b, c)$ denotes an eigenvector, the last matrix gives the equations

$$a - 2c = 0 \quad\hbox{and}\quad b - 3c = 0.$$

Hence, $a = 2c$ and $b
   = 3c$ . Thus,

$$\left[\matrix{a \cr b \cr c \cr}\right] = c \cdot \left[\matrix{2 \cr 3 \cr 1 \cr}\right].$$

$(2, 3, 1)$ is an eigenvector for $\lambda = 1$ .

For $\lambda = 2$ ,

$$A - 2 I = \left[\matrix{ -2 & 0 & 2 \cr -3 & 0 & 3 \cr -1 & 0 & 1 \cr}\right] \to \left[\matrix{ 1 & 0 & -1 \cr 0 & 0 & 0 \cr 0 & 0 & 0 \cr}\right]$$

If $(a, b, c)$ denotes an eigenvector, the last matrix gives the equation

$$a - c = 0, \quad\hbox{so}\quad a = c.$$

Thus,

$$\left[\matrix{a \cr b \cr c \cr}\right] = b \cdot \left[\matrix{0 \cr 1 \cr 0 \cr}\right] + c \cdot \left[\matrix{1 \cr 0 \cr 1 \cr}\right].$$

$(0, 1, 0)$ and $(1, 0, 1)$ are independent eigenvectors for $\lambda = 2$ .

A diagonalizing matrix is given by

$$P = \left[\matrix{ 2 & 0 & 1 \cr 3 & 1 & 0 \cr 1 & 0 & 1 \cr}\right], \quad\hbox{and}\quad D = \left[\matrix{ 1 & 0 & 0 \cr 0 & 2 & 0 \cr 0 & 0 & 2 \cr}\right].\quad\halmos$$
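A quick NumPy check (assuming NumPy is installed) confirms that $P^{-1} A P = D$ :

    import numpy as np

    A = np.array([[0, 0, 2], [-3, 2, 3], [-1, 0, 3]])
    P = np.array([[2, 0, 1], [3, 1, 0], [1, 0, 1]])
    print(np.linalg.inv(P) @ A @ P)   # diag(1, 2, 2), up to rounding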


21. Prove that if $A \in M(n,
   \real)$ and $A^2 = 0$ , then 0 is an eigenvalue of A.

If A is the zero matrix, then for any nonzero vector v, I have $A v = 0 \cdot v$ . Hence, 0 is an eigenvalue of A.

If A is not the zero matrix, then some column of A is nonzero --- say it is the $k^{\rm th}$ column $c_k$ . Now $A^2 = 0$ means that if

$$A = \left[\matrix{ \uparrow & \uparrow & & \uparrow & & \uparrow \cr c_1 & c_2 & \cdots & c_k & \cdots & c_n \cr \downarrow & \downarrow & & \downarrow & & \downarrow \cr}\right], \quad\hbox{then}\quad A \cdot \left[\matrix{ \uparrow & \uparrow & & \uparrow & & \uparrow \cr c_1 & c_2 & \cdots & c_k & \cdots & c_n \cr \downarrow & \downarrow & & \downarrow & & \downarrow \cr}\right] = 0.$$

In particular, I must have $A c_k
   = \vec{0}$ . Then $A c_k = 0 \cdot c_k$ , which says that 0 is an eigenvalue of A.


22. Let F be a field and let $A
   \in M(n, F)$ . Prove, or disprove by specific counterexample: If p is an eigenvalue of A with eigenvector v and q is an eigenvalue of A with eigenvector w, then $p + q$ is an eigenvalue of A with eigenvector $v + w$ .

The statement is false.

Consider the following matrix in $M(2, \real)$ :

$$A = \left[\matrix{1 & 0 \cr 0 & 2 \cr}\right].$$

$p = 1$ is an eigenvalue of A with eigenvector $(1, 0)$ , because

$$\left[\matrix{1 & 0 \cr 0 & 2 \cr}\right] \left[\matrix{1 \cr 0 \cr}\right] = 1 \cdot \left[\matrix{1 \cr 0 \cr}\right].$$

$q = 2$ is an eigenvalue of A with eigenvector $(0, 1)$ , because

$$\left[\matrix{1 & 0 \cr 0 & 2 \cr}\right] \left[\matrix{0 \cr 1 \cr}\right] = 2 \cdot \left[\matrix{0 \cr 1 \cr}\right].$$

However, $p + q = 3$ is not an eigenvalue with eigenvector $(1, 0) + (0, 1) = (1, 1)$ , because

$$\left[\matrix{1 & 0 \cr 0 & 2 \cr}\right] \left[\matrix{1 \cr 1 \cr}\right] = \left[\matrix{1 \cr 2 \cr}\right] \ne 3 \cdot \left[\matrix{1 \cr 1 \cr}\right].\quad\halmos$$


23. Consider the function $f:
   \real^2 \to \real^2$ given by

$$f(x,y) = (x, x y).$$

Check each axiom for a linear transformation. If the axiom holds, prove it. If the axiom does not hold, give a specific counterexample.

$$f(1, 2) + f(3, 4) = (1, 2) + (3, 12) = (4, 14), \quad\hbox{but}\quad f[(1, 2) + (3, 4)] = f(4, 6) = (4, 24).$$

Since $f(1, 2) + f(3, 4) \ne
   f[(1, 2) + (3, 4)]$ , the sum axiom does not hold.

$$f[3 \cdot (1, 2)] = f(3, 6) = (3, 18), \quad\hbox{but}\quad 3 \cdot f(1, 2) = 3 \cdot (1, 2) = (3, 6).$$

Since $f[3 \cdot (1, 2)] \ne 3
   \cdot f(1, 2)$ , the scalar multiplication axiom does not hold.


24. Consider the function $f:
   M(2, \real) \to \real^2$ given by

$$f\left(\left[\matrix{a & b \cr c & d \cr}\right]\right) = \left[\matrix{a - d \cr b - c \cr}\right].$$

Check each axiom for a linear transformation. If the axiom holds, prove it. If the axiom does not hold, give a specific counterexample.

Let $a, b, c, d, a', b', c', d'
   \in \real$ . Then

$$f\left(\left[\matrix{a & b \cr c & d \cr}\right] + \left[\matrix{a' & b' \cr c' & d' \cr}\right]\right) = f\left(\left[\matrix{ a + a' & b + b' \cr c + c' & d + d' \cr}\right]\right) =$$

$$\left[\matrix{ (a + a') - (d + d') \cr (b + b') - (c + c') \cr}\right] = \left[\matrix{ (a - d) + (a' - d') \cr (b - c) + (b' - c') \cr}\right] = \left[\matrix{ a - d \cr b - c \cr}\right] + \left[\matrix{ a' - d' \cr b' - c' \cr}\right] =$$

$$f\left(\left[\matrix{a & b \cr c & d \cr}\right]\right) + f\left(\left[\matrix{a' & b' \cr c' & d' \cr}\right]\right).$$

Hence, the sum axiom holds.

Let $k, a, b, c, d \in \real$ . Then

$$f\left(k \cdot \left[\matrix{a & b \cr c & d \cr}\right]\right) = f\left(\left[\matrix{k a & k b \cr k c & k d \cr}\right]\right) = \left[\matrix{ k a - k d \cr k b - k c \cr}\right] =$$

$$k \cdot \left[\matrix{ a - d \cr b - c \cr}\right] = k \cdot f\left(\left[\matrix{a & b \cr c & d \cr}\right]\right).$$

Hence, the scalar multiplication axiom holds.


25. Suppose u, v, and w are vectors in a real inner product space, and

$$\|u\| = 5, \quad \innp{u}{v} = 8, \quad \|v\| = 3.$$

(a) Compute $\innp{u + 2 v}{u - 3
   v}$ .

(b) Compute $\|3 u + v\|$ .

(a)

$$\innp{u + 2 v}{u - 3 v} = \innp{u}{u} - 3 \innp{u}{v} + 2 \innp{v}{u} - 6 \innp{v}{v} = \innp{u}{u} - \innp{u}{v} - 6 \innp{v}{v} = 5^2 - 8 - 6(3^2) = -37.\quad\halmos$$

(b)

$$\|3 u + v\|^2 = \innp{3 u + v}{3 u + v} = 9 \innp{u}{u} + 3 \innp{u}{v} + 3 \innp{v}{u} + \innp{v}{v} = 9 \innp{u}{u} + 6 \innp{u}{v} + \innp{v}{v} =$$

$$9(5^2) + 6(8) + 3^2 = 282.$$

Hence, $\|3 u + v\| =
   \sqrt{282}$ .


26. (a) Compute $(1 + 2 i, 2
   i)\cdot (2 - i, 1 + i)$ .

(b) Compute $\|(2 + 3 i, 1 -
   i)\|$ .

(a)

$$(1 + 2 i, 2 i)\cdot (2 - i, 1 + i) = (1 + 2 i)(2 + i) + (2 i)(1 - i) = 5 i + (2 + 2 i) = 2 + 7 i.\quad\halmos$$

(b)

$$\|(2 + 3 i, 1 - i)\| = \sqrt{(2 + 3 i)(2 - 3 i) + (1 - i)(1 + i)} = \sqrt{15}.\quad\halmos$$


27. Explain why the complex dot product on $\complex^n$ is not defined by

$$\hbox{``}(a_1, a_2, \ldots, a_n) \cdot (b_1, b_2, \ldots, b_n) = a_1 b_1 + a_2 b_2 + \cdots + a_n b_n \hbox{''}.$$

Consider, for instance, $(1 + i, 2 + i) \in \complex^2$. The inner product of a vector with itself should be a {\it nonnegative real number}. But if I use the definition above, I get

$$\hbox{``} (1 + i, 2 + i) \cdot (1 + i, 2 + i) = (1 + i)(1 + i) + (2 + i)(2 + i) = 3 + 6 i \hbox{''}.$$

The correct definition is

$$(a_1, a_2, \ldots, a_n) \cdot (b_1, b_2, \ldots, b_n) = a_1 \conj{b_1} + a_2 \conj{b_2} + \cdots + a_n \conj{b_n}.$$

This does give a nonnegative real number when a vector is multiplied by itself.


28. A parallelogram has vertices $A(2, 4)$ , $B(4, 1)$ , $C(5, 3)$ , and $D(3, 6)$ , listed counterclockwise around the parallelogram. Find an affine transformation which takes the unit square $0 \le u \le 1$ , $0 \le v \le
   1$ onto the parallelogram, so that the point $(0, 0)$ is mapped to A.

The vectors for the sides which start at A are $\bvec{A B} = (2, -3)$ and $\bvec{A D} = (1, 2)$ . Hence, I can use the transformation

$$f\left(\left[\matrix{u \cr v \cr}\right]\right) = \left[\matrix{ 2 & 1 \cr -3 & 2 \cr}\right] \left[\matrix{u \cr v \cr}\right] + \left[\matrix{2 \cr 4\cr}\right].$$

(It's okay to switch the columns of the matrix, since this will also give a transformation meeting the requirements of the problem.)


29. Let V be a real inner product space, and let $v \in V$ .

(a) Define

$$v^\perp = \{x \in V \mid \innp{v}{x} = 0\}.$$

Prove that $v^\perp$ is a subspace of V.

(b) Define

$$W = \{x \in V \mid \innp{v}{x} = 1\}.$$

Prove that W is not a subspace of V.

(a) Let $x, y \in v^\perp$ . I want to show $x + y \in v^\perp$ . Now $x \in v^\perp$ implies $\innp{v}{x} = 0$ , and $y \in
   v^\perp$ implies $\innp{v}{y} = 0$ . Therefore

$$0 = \innp{v}{x} + \innp{v}{y} = \innp{v}{x + y}.$$

Hence, $x + y \in v^\perp$ .

Let $x \in v^\perp$ , and let $k \in \real$ . I want to show that $k x \in v^\perp$ . Now $x \in v^\perp$ implies $\innp{v}{x} = 0$ , so

$$0 = k \innp{v}{x} = \innp{v}{k x}.$$

Hence, $k x \in v^\perp$ .

Therefore, $v^\perp$ is a subspace of V.

(b) Suppose $x \in W$ , so $\innp{v}{x} = 1$ . Then

$$\innp{v}{2 x} = 2 \innp{v}{x} = 2 \cdot 1 = 2.$$

Hence, $2 x \notin W$ . Since W is not closed under scalar multiplication, W is not a subspace.


30. Let A be an $m \times n$ real matrix. Show that every vector in the null space of A is orthogonal to every vector in the row space of A.

Suppose x is in the null space of A, so $A x = \vec{0}$ . Denoting the rows of A by $r_1$ , $r_2$ , ..., $r_m$ , this means that

$$\left[\matrix{ \leftarrow & r_1 & \rightarrow \cr \leftarrow & r_2 & \rightarrow \cr & \vdots & \cr \leftarrow & r_m & \rightarrow \cr}\right] \left[\matrix{\uparrow \cr x \cr \downarrow \cr}\right] = \left[\matrix{0 \cr 0 \cr \vdots \cr 0 \cr}\right].$$

But this means that

$$r_1 \cdot x = 0, \quad r_2 \cdot x = 0, \ldots, r_m \cdot x = 0.$$

An element of the row space of A is a linear combination of the rows of A, say

$$r = a_1 r_1 + a_2 r_2 + \cdots + a_m r_m.$$

Then

$$r \cdot x = a_1 r_1 \cdot x + a_2 r_2 \cdot x + \cdots + a_m r_m \cdot x = 0 + 0 + \cdots + 0 = 0.$$

Hence, x is orthogonal to every element of the row space of A.


31. Solve the linear system

$$\left[\matrix{x' \cr y' \cr}\right] = \left[\matrix{2 & 3 \cr 3 & 2 \cr}\right] \left[\matrix{x \cr y \cr}\right].$$

$$\det \left[\matrix{2 - x & 3 \cr 3 & 2 - x \cr}\right] = x^2 - 4 x - 5 = (x - 5)(x + 1).$$

The eigenvalues are $x = 5$ and $x = -1$ .

For $x = 5$ ,

$$A - 5 I = \left[\matrix{-3 & 3 \cr 3 & -3 \cr}\right] \to \left[\matrix{1 & -1 \cr 0 & 0 \cr}\right].$$

By inspection, $(1, 1)$ is an eigenvector for $x = 5$ .

For $x = -1$ ,

$$A + I = \left[\matrix{3 & 3 \cr 3 & 3 \cr}\right] \to \left[\matrix{1 & 1 \cr 0 & 0 \cr}\right].$$

By inspection, $(1, -1)$ is an eigenvector for $x = -1$ .

The solution is

$$\left[\matrix{x \cr y \cr}\right] = c_1e^{5 t}\left[\matrix{1 \cr 1 \cr}\right] + c_2e^{-t}\left[\matrix{1 \cr -1 \cr}\right].\quad\halmos$$


32. (a) Solve the linear system

$$\left[\matrix{x' \cr y' \cr}\right] = \left[\matrix{ 1 & 6 \cr 1 & 2 \cr}\right] \left[\matrix{x \cr y \cr}\right].$$

(b) As $t \to \infty$ , the solution curves approach a line. What is the line?

(a) Let

$$A = \left[\matrix{ 1 & 6 \cr 1 & 2 \cr}\right].$$

Then

$$\det (A - xI) = \left|\matrix{ 1 - x & 6 \cr 1 & 2 - x \cr}\right| = (x - 1)(x - 2) - 6 = x^2 - 3 x - 4 = (x - 4)(x + 1).$$

The eigenvalues are $x = 4$ and $x = -1$ .

For $x = 4$ , I have

$$A - 4 I = \left[\matrix{ -3 & 6 \cr 1 & -2 \cr}\right].$$

By inspection, $(2, 1)$ is an eigenvector.

For $x = -1$ , I have

$$A + I = \left[\matrix{ 2 & 6 \cr 1 & 3 \cr}\right].$$

By inspection, $(3, -1)$ is an eigenvector.

The solution is

$$\left[\matrix{x \cr y \cr}\right] = c_1 e^{4 t} \left[\matrix{2 \cr 1 \cr}\right] + c_2 e^{-t} \left[\matrix{3 \cr -1 \cr}\right].\quad\halmos$$

(b) As $t \to \infty$ , I have $e^{-t} \to 0$ . The second term in the solution goes to 0, so as $t \to \infty$ ,

$$\left[\matrix{x \cr y \cr}\right] \approx c_1 e^{4 t} \left[\matrix{2 \cr 1 \cr}\right].$$

This means that $x \approx 2 c_1 e^{4 t}$ and $y \approx c_1 e^{4 t}$ , so

$$\dfrac{y}{x} \approx \dfrac{c_1 e^{4 t}}{2c_1 e^{4 t}} = \dfrac{1}{2}.$$

So the curves approach the line $\dfrac{y}{x} = \dfrac{1}{2}$ , or $y =
   \dfrac{1}{2} x$ .


33. Solve the linear system

$$\left[\matrix{x' \cr y' \cr}\right] = \left[\matrix{ 1 & 5 \cr -1 & 3 \cr}\right] \left[\matrix{x \cr y \cr}\right].$$

Let

$$A = \left[\matrix{ 1 & 5 \cr -1 & 3 \cr}\right].$$

Then

$$\det (A - xI) = \left|\matrix{ 1 - x & 5 \cr -1 & 3 - x \cr}\right| = (1 - x)(3 - x) - (5)(-1) = x^2 - 4 x + 8.$$

The roots are $2 \pm 2 i$ .

For $2 + 2 i$ , I have

$$A - (2 + 2 i)I = \left[\matrix{ -1 - 2 i & 5 \cr -1 & 1 - 2 i \cr}\right] \quad \to \quad \left[\matrix{ -1 & 1 - 2 i \cr 0 & 0 \cr}\right]$$

(I can eliminate the first row, because it must be a multiple of the second. If it were not, the rows would form an independent set, the matrix would row-reduce to the identity, and the only solution to the corresponding homogeneous system would be $(0, 0)$ . Since I know there are eigenvectors, and since eigenvectors must be nonzero, the homogeneous system must have a nonzero solution.)

Using a and b as the variables, the corresponding homogeneous system for the last matrix is

$$(-1)a + (1 - 2 i)b = 0.$$

Since all I want is some nonzero solution, I can take $a = 1 - 2 i$ and $b = 1$ . Thus, an eigenvector is $(1 - 2 i, 1)$ .

If the last shortcut is confusing, you can also do this by solving for a (say):

$$a = (1 - 2 i) b.$$

Then

$$\left[\matrix{a \cr b \cr}\right] = \left[\matrix{(1 - 2 i)b \cr b \cr}\right] = b \cdot \left[\matrix{1 - 2 i \cr 1 \cr}\right].$$

Taking $b = 1$ gives $(1 - 2 i, 1)$ , as before.

Using the eigenvalue $x = 2 + 2
   i$ , I have the solution

$$e^{(2 + 2 i)t} \left[\matrix{1 - 2 i \cr 1 \cr}\right] = e^{2 t} e^{2 t i} \left[\matrix{1 - 2 i \cr 1 \cr}\right] = e^{2 t} (\cos 2 t + i \sin 2 t) \left[\matrix{1 - 2 i \cr 1 \cr}\right] =$$

$$e^{2 t} \left[\matrix{ (\cos 2 t + 2 \sin 2 t) + i(-2 \cos 2 t + \sin 2 t) \cr \cos 2 t + i \sin 2 t \cr}\right].$$

Taking the real and imaginary parts of this solution gives two independent real solutions, which I use to get the general solution:

$$\left[\matrix{x \cr y \cr}\right] = c_1 e^{2 t} \left[\matrix{ \cos 2 t + 2 \sin 2 t \cr \cos 2 t \cr}\right] + c_2 e^{2 t} \left[\matrix{ -2 \cos 2 t + \sin 2 t \cr \sin 2 t \cr}\right].\quad\halmos$$


34. Compute $e^{A t}$ , where

$$A = \left[\matrix{ 2 & 4 \cr 4 & -4 \cr}\right].$$

$$|A - x I| = \left|\matrix{ 2 - x & 4 \cr 4 & -4 - x \cr}\right| = (x + 4)(x - 2) - (4)(4) = x^2 + 2 x - 24 = (x + 6)(x - 4).$$

The eigenvalues are $x = -6$ and $x = 4$ .

First,

$$B_1 = \left[\matrix{ 1 & 0 \cr 0 & 1 \cr}\right] \quad\hbox{and}\quad B_2 = A + 6 I = \left[\matrix{ 8 & 4 \cr 4 & 2 \cr}\right].$$

Next, $a_1(t) = e^{-6 t}$ , and

$$a_2(t) = e^{4 t} \int_0^t e^{-4 u} a_1(u)\,du = e^{4 t} \int_0^t e^{-4 u} e^{-6 u}\,du = e^{4 t} \int_0^t e^{-10 u}\,du =$$

$$e^{4 t} \left[-\dfrac{1}{10} e^{-10 u}\right]_0^t = -\dfrac{1}{10} e^{4 t} \left(e^{-10 t} - 1\right) = -\dfrac{1}{10} \left(e^{-6 t} - e^{4 t}\right).$$

So

$$e^{A t} = e^{-6 t} \left[\matrix{ 1 & 0 \cr 0 & 1 \cr}\right] - \dfrac{1}{10} \left(e^{-6 t} - e^{4 t}\right) \left[\matrix{ 8 & 4 \cr 4 & 2 \cr}\right] = \left[\matrix{ \dfrac{1}{5} e^{-6 t} + \dfrac{4}{5} e^{4 t} & -\dfrac{2}{5} e^{-6 t} + \dfrac{2}{5} e^{4 t} \cr \noalign{\vskip2pt} -\dfrac{2}{5} e^{-6 t} + \dfrac{2}{5} e^{4 t} & \dfrac{4}{5} e^{-6 t} + \dfrac{1}{5} e^{4 t} \cr}\right]. \quad\halmos$$
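You can spot-check the formula against SciPy's matrix exponential at a particular value of t. This sketch assumes NumPy and SciPy are installed:

    import numpy as np
    from scipy.linalg import expm

    A, t = np.array([[2.0, 4.0], [4.0, -4.0]]), 0.5
    formula = (np.exp(-6*t) * np.eye(2)
               - (np.exp(-6*t) - np.exp(4*t)) / 10 * np.array([[8.0, 4.0], [4.0, 2.0]]))
    print(np.allclose(expm(A * t), formula))   # True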


35. If $A \in M(n, \real)$ is orthogonal, then $\det A = \pm 1$ . Prove by counterexample that the converse is false.

Consider the matrix

$$\left[\matrix{ 5 & 2 \cr 2 & 1 \cr}\right] \in M(2, \real).$$

It has determinant 1, but it is not orthogonal: The columns aren't mutually perpendicular, and they don't have length 1.


36. M is a real symmetric matrix with eigenvalues 2, -2, and 1.

$(1, 1, 0)$ is an eigenvector for 2.

$(1, -1, 1)$ is an eigenvector for -2.

(a) Find an eigenvector for 1.

(b) Find a diagonalizing matrix P for M. Find $P^{-1}$ , and the corresponding diagonal matrix D. Find M.

(a) Since M is symmetric, an eigenvector $(a, b, c)$ for 1 must be perpendicular to the eigenvectors for 2 and -2. So

$$(1, 1, 0) \cdot (a, b, c) = 0 \quad\hbox{and}\quad (1, -1, 1) \cdot (a, b, c) = 0.$$

This gives the system

$$\left[\matrix{ 1 & 1 & 0 \cr 1 & -1 & 1 \cr}\right] \left[\matrix{a \cr b \cr c \cr}\right] = \left[\matrix{0 \cr 0 \cr}\right]$$

Row reduce:

$$\left[\matrix{ 1 & 1 & 0 & 0 \cr 1 & -1 & 1 & 0 \cr}\right] \quad\to\quad \left[\matrix{ 1 & 0 & \dfrac{1}{2} & 0 \cr \noalign{\vskip2pt} 0 & 1 & -\dfrac{1}{2} & 0 \cr}\right]$$

The parametrized solution is

$$a = -\dfrac{1}{2} t, \quad b = \dfrac{1}{2} t, \quad c = t.$$

Taking $t = 2$ gives $(a, b, c) = (-1, 1, 2)$ .

You could also do this by taking the cross product of the two eigenvectors, since they're in $\real^3$ .

(b)

The columns of P are orthogonal but not unit vectors, so $P^{-1}$ is not simply $P^T$ ; instead, divide each row of $P^T$ by the squared length of the corresponding eigenvector:

$$P = \left[\matrix{ 1 & 1 & -1 \cr 1 & -1 & 1 \cr 0 & 1 & 2 \cr}\right], \quad P^{-1} = \left[\matrix{ \dfrac{1}{2} & \dfrac{1}{2} & 0 \cr \noalign{\vskip2pt} \dfrac{1}{3} & -\dfrac{1}{3} & \dfrac{1}{3} \cr \noalign{\vskip2pt} -\dfrac{1}{6} & \dfrac{1}{6} & \dfrac{1}{3} \cr}\right], \quad D = \left[\matrix{ 2 & 0 & 0 \cr 0 & -2 & 0 \cr 0 & 0 & 1 \cr}\right].$$

Since $P^{-1} M P = D$ ,

$$M = P D P^{-1} = \dfrac{1}{2} \left[\matrix{ 1 & 3 & -2 \cr 3 & 1 & 2 \cr -2 & 2 & 0 \cr}\right].\quad\halmos$$
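A quick NumPy check (assuming NumPy is installed) confirms that M does the right thing to the three eigenvectors:

    import numpy as np

    M = 0.5 * np.array([[1, 3, -2], [3, 1, 2], [-2, 2, 0]])
    print(M @ np.array([1, 1, 0]))    #  2 * (1, 1, 0)
    print(M @ np.array([1, -1, 1]))   # -2 * (1, -1, 1)
    print(M @ np.array([-1, 1, 2]))   #  1 * (-1, 1, 2)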


37. Let

$$A = \left[\matrix{3 & \sqrt{5} \cr \sqrt{5} & -1 \cr}\right].$$

Find an orthogonal matrix O which diagonalizes A, and write down the corresponding diagonal matrix.

Let

$$A = \left[\matrix{3 & \sqrt{5} \cr \sqrt{5} & -1 \cr}\right].$$

The characteristic polynomial of A is

$$\det \left[\matrix{3 - \lambda & \sqrt{5} \cr \sqrt{5} & -1 - \lambda \cr}\right] = \lambda^2 - 2\lambda - 8 = (\lambda - 4)(\lambda + 2).$$

The eigenvalues are $\lambda =
   4$ and $\lambda = -2$ .

For $\lambda = 4$ , partial row reduction gives

$$A - 4 I = \left[\matrix{-1 & \sqrt{5} \cr \sqrt{5} & -5 \cr}\right] \to \left[\matrix{-1 & \sqrt{5} \cr 0 & 0 \cr}\right].$$

$(\sqrt{5}, 1)$ is an eigenvector for $\lambda = 4$ . It has length $\|(\sqrt{5}, 1)\| = \sqrt{6}$ .

For $\lambda = -2$ , partial row reduction gives

$$A + 2 I = \left[\matrix{5 & \sqrt{5} \cr \sqrt{5} & 1 \cr}\right] \to \left[\matrix{0 & 0 \cr \sqrt{5} & 1 \cr}\right].$$

$(-1,\sqrt{5})$ is an eigenvector for $\lambda = -2$ . It has length $\|(-1,\sqrt{5})\| = \sqrt{6}$ .

The normalized vectors give an orthonormal set of eigenvectors for A:

$$\dfrac{1}{\sqrt{6}}(\sqrt{5}, 1), \quad \dfrac{1}{\sqrt{6}}(-1,\sqrt{5})$$

Construct O by using the orthonormal vectors as the columns:

$$O = \dfrac{1}{\sqrt{6}} \left[\matrix{ \sqrt{5} & -1 \cr 1 & \sqrt{5} \cr}\right].$$

Moreover,

$$D = \left[\matrix{ 4 & 0 \cr 0 & -2 \cr}\right].\quad\halmos$$


38. Let

$$A = \left[\matrix{ -2 & 0 & 0 \cr 0 & 2 & 2 \cr 0 & 2 & -1 \cr}\right] \in M(3, \real).$$

Find an orthogonal matrix P which diagonalizes A, and write down the corresponding diagonal matrix.

The characteristic polynomial is

$$\det \left[\matrix{-2 - x & 0 & 0 \cr 0 & 2 - x & 2 \cr 0 & 2 & -1 - x \cr}\right] = -(x + 2)^2(x - 3).$$

The eigenvalues are $\lambda =
   -2$ and $\lambda = 3$ .

For $\lambda = -2$ ,

$$A + 2 I = \left[\matrix{0 & 0 & 0 \cr 0 & 4 & 2 \cr 0 & 2 & 1 \cr}\right] \to \left[\matrix{0 & 2 & 1 \cr 0 & 0 & 0 \cr 0 & 0 & 0 \cr}\right]$$

With $(a, b, c)$ as a solution vector, the corresponding homogeneous system is $2b + c =
   0$ , or $c = -2b$ . Thus,

$$(a, b, c) = (a, b, -2b) = a(1, 0, 0) + b(0, 1, -2).$$

Taking $a = 0$ and $b
   = 1$ , then $a = 1$ and $b = 0$ , I get the eigenvectors $(1, 0, 0)$ , $(0, 1, -2)$ . Observe that these vectors are already orthogonal. Dividing each by its length gives the orthonormal set

$$(1, 0, 0), \quad \dfrac{1}{\sqrt{5}}(0, 1, -2).$$

For $\lambda = 3$ ,

$$A - 3 I = \left[\matrix{-5 & 0 & 0 \cr 0 & -1 & 2 \cr 0 & 2 & -4 \cr}\right] \to \left[\matrix{1 & 0 & 0 \cr 0 & 1 & -2 \cr 0 & 0 & 0 \cr}\right]$$

With $(a, b, c)$ as a solution vector, the corresponding homogeneous system is $a = 0$ , $b - 2c = 0$ , or $b = 2c$ . Thus,

$$(a, b, c) = (0, 2c, c) = c(0, 2, 1).$$

Taking $c = 1$ , I get the eigenvector $(0, 2, 1)$ .

Eigenvectors for different eigenvalues of a symmetric matrix must be orthogonal, and you can verify that this eigenvector is perpendicular to the two I already found. Dividing the vector by its length, I obtain

$$\dfrac{1}{\sqrt{5}} (0, 2, 1).$$

Thus,

$$P = \left[\matrix{1 & 0 & 0 \cr \noalign{\vskip2pt} 0 & \dfrac{1}{\sqrt{5}} & \dfrac{2}{\sqrt{5}} \cr \noalign{\vskip2pt} 0 & -\dfrac{2}{\sqrt{5}} & \dfrac{1}{\sqrt{5}} \cr}\right].$$

Therefore, the diagonal matrix is

$$D = \left[\matrix{-2 & 0 & 0 \cr 0 & -2 & 0 \cr 0 & 0 & 3 \cr}\right]. \quad\halmos$$


39. (a) Suppose that $A \in M(n,
   \complex)$ . Prove that $A + A^*$ is Hermitian.

(b) Suppose that $H \in M(n,
   \complex)$ is Hermitian. Prove that $H + H^T$ is real.

(a)

$$(A + A^*)^* = A^* + (A^*)^* = A^* + A = A + A^*.$$

Therefore, $A + A^*$ is Hermitian.

(b) Since H is Hermitian, all the entries on the main diagonal are real, so the same is true for all the entries on the main diagonal of $H + H^T$ . So I only have to show that the off-diagonal entries of $H + H^T$ are real.

Suppose $j \ne k$ , and consider the $(j, k)^{\rm th}$ element of $H +
   H^T$ . This is

$$H_{j k} + (H^T)_{j k} = H_{j k} + H_{k j}.$$

However, since H is Hermitian,

$$H_{k j} = \conj{H_{j k}}.$$

That is, if $H_{j k} = a + b i$ , then $H_{k j} = a - b i$ . Hence,

$$H_{j k} + (H^T)_{j k} = H_{j k} + H_{k j} = (a + b i) + (a - b i) = 2 a.$$

That is, the $(j, k)^{\rm th}$ element of $H + H^T$ is a real number. Hence, $H + H^T$ is a real matrix.

To see how this works for yourself, consider the Hermitian matrix

$$H = \left[\matrix{2 & 2 + 3 i \cr 2 - 3 i & -7 \cr}\right].$$

Then

$$H + H^T = \left[\matrix{2 & 2 + 3 i \cr 2 - 3 i & -7 \cr}\right] + \left[\matrix{2 & 2 - 3 i \cr 2 + 3 i & -7 \cr}\right] = \left[\matrix{ 2 + 2 & (2 + 3 i) + (2 - 3 i) \cr (2 - 3 i) + (2 + 3 i) & -7 + (-7) \cr}\right] = \left[\matrix{4 & 4 \cr 4 & -14 \cr}\right].$$

Notice what happens when the off-diagonal entries are added.


40. Every Hermitian matrix H can be written as $H = A + i \cdot B$ , where A is symmetric and B is skew-symmetric. Show how this works with the Hermitian matrix

$$\left[\matrix{ 2 & 4 + 2 i & 3 - i \cr 4 - 2 i & 0 & 17 i \cr 3 + i & -17 i & -5 \cr}\right].$$

For A, use the real parts of H; for B, use the imaginary parts of H. Thus,

$$\left[\matrix{ 2 & 4 + 2 i & 3 - i \cr 4 - 2 i & 0 & 17 i \cr 3 + i & -17 i & -5 \cr}\right] = \left[\matrix{ 2 & 4 & 3 \cr 4 & 0 & 0 \cr 3 & 0 & 5 \cr}\right] + i \cdot \left[\matrix{ 0 & 2 & -1 \cr -2 & 0 & 17 \cr 1 & -17 & 0 \cr}\right].\quad\halmos$$


41. $M(n, \complex)$ is a vector space over $\complex$ . Is the set S of unitary matrices in $M(n, \complex)$ a subspace of $M(n, \complex)$ ?

The set of unitary matrices can't be a subspace of $M(n, \complex)$ , because it does not contain the zero matrix. The zero matrix is not unitary, because its rows don't form an orthonormal set --- the zero vector does not have length 1.


42. Consider the following set of vectors in $\complex^4$ :

$$\left\{(1 + i, 2, 3, -1), (1, 3 - i, i, 7), (1, 1, 1, 1)\right\}.$$

(a) Show that the first two vectors are orthogonal.

(b) Apply Gram-Schmidt to find a vector $v \in \complex^4$ so that the vectors in the following set are mutually perpendicular and span the same subspace as the original set.

$$\left\{(1 + i, 2, 3, -1), (1, 3 - i, i, 7), v\right\}.$$

(a)

$$(1 + i, 2, 3, -1) \cdot (1, 3 - i, i, 7) = (1 + i)(1) + (2)(3 + i) + (3)(-i) + (-1)(7) = 1 + i + 6 + 2 i - 3 i - 7 = 0.\quad\halmos$$

(b)

$$(1, 1, 1, 1) - \dfrac{(1, 1, 1, 1) \cdot (1 + i, 2, 3, -1)} {(1 + i, 2, 3, -1) \cdot (1 + i, 2, 3, -1)}(1 + i, 2, 3, -1) - \dfrac{(1, 1, 1, 1) \cdot (1, 3 - i, i, 7)} {(1, 3 - i, i, 7) \cdot (1, 3 - i, i, 7)}(1, 3 - i, i, 7) =$$

$$(1, 1, 1, 1) - \dfrac{5 - i}{16}(1 + i, 2, 3, -1) - \dfrac{11}{61} (1, 3 - i, i, 7) =$$

$$\dfrac{1}{976} (434 - 244 i, -162 + 298 i, 61 + 7 i, 49 - 61 i). \quad\halmos$$
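A quick NumPy check (assuming NumPy is installed) confirms that this vector is orthogonal to the first two under the complex dot product:

    import numpy as np

    u1 = np.array([1 + 1j, 2, 3, -1])
    u2 = np.array([1, 3 - 1j, 1j, 7])
    v  = np.array([434 - 244j, -162 + 298j, 61 + 7j, 49 - 61j]) / 976
    # np.vdot conjugates its first argument, matching the complex dot product.
    print(np.vdot(u1, v), np.vdot(u2, v))   # both 0, up to rounding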


43. Find a unitary matrix U that diagonalizes

$$A = \left[\matrix{5 & 2 - 2 i \cr 2 + 2 i & -2 \cr}\right].$$

Write the corresponding diagonal matrix.

$$\det \left[\matrix{5 - x & 2 - 2 i \cr 2 + 2 i & -2 - x \cr}\right] = (x + 2)(x - 5) - (2 - 2 i)(2 + 2 i) = x^2 - 3 x - 10 - 8 = x^2 - 3 x - 18 = (x - 6)(x + 3).$$

The eigenvalues are $\lambda =
   6$ and $\lambda = -3$ .

For $\lambda = 6$ ,

$$A - 6 I = \left[\matrix{-1 & 2 - 2 i \cr 2 + 2 i & -8 \cr}\right] \to \left[\matrix{-1 & 2 - 2 i \cr 0 & 0 \cr}\right].$$

By inspection, $(2 - 2 i, 1)$ is an eigenvector. Note that $\|(2 - 2 i, 1)\| = 3$ .

For $\lambda = -3$ ,

$$A + 3 I = \left[\matrix{8 & 2 - 2 i \cr 2 + 2 i & 1 \cr}\right] \to \left[\matrix{2 + 2 i & 1 \cr 0 & 0 \cr}\right].$$

By inspection, $(-1, 2 + 2 i)$ is an eigenvector. Note that $\|(-1, 2 + 2 i)\| = 3$ .

Thus, a unitary matrix which diagonalizes A is given by

$$U = \left[\matrix{ \dfrac{1}{3}(2 - 2 i) & -\dfrac{1}{3} \cr \noalign{\vskip2pt} \dfrac{1}{3} & \dfrac{1}{3}(2 + 2 i) \cr}\right].$$

The diagonal matrix is

$$D = \left[\matrix{ 6 & 0 \cr 0 & -3 \cr}\right].\quad\halmos$$
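A quick NumPy check (assuming NumPy is installed) confirms that the columns of U are orthonormal and that $U^* A U = D$ :

    import numpy as np

    A = np.array([[5, 2 - 2j], [2 + 2j, -2]])
    U = np.array([[(2 - 2j) / 3, -1 / 3], [1 / 3, (2 + 2j) / 3]])
    print(np.round(U.conj().T @ U, 10))       # the identity
    print(np.round(U.conj().T @ A @ U, 10))   # diag(6, -3)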


44. Find a unitary matrix U that diagonalizes

$$A = \left[\matrix{ -3 & 0 & 0 \cr 0 & 2 & 2 + 4 i \cr 0 & 2 - 4 i & 1 \cr}\right].$$

Write the corresponding diagonal matrix.

$$|A - xI| = \left|\matrix{ -3 - x & 0 & 0 \cr 0 & 2 - x & 2 + 4 i \cr 0 & 2 - 4 i & 1 - x \cr}\right| = (-x - 3)\left((2 - x)(1 - x) - (2 + 4 i)(2 - 4 i)\right) =$$

$$-(x + 3)(x^2 - 3 x + 2 - 20) = -(x + 3)(x^2 - 3 x - 18) = -(x + 3)^2(x - 6).$$

The eigenvalues are -3 and 6.

For $\lambda = -3$ ,

$$A + 3 I = \left[\matrix{ 0 & 0 & 0 \cr 0 & 5 & 2 + 4 i \cr 0 & 2 - 4 i & 4 \cr}\right] \to \left[\matrix{ 0 & 5 & 2 + 4 i \cr 0 & 0 & 0 \cr 0 & 0 & 0 \cr}\right].$$

Using $(a, b, c)$ as the solution vector, the last matrix gives the equation

$$5 b + (2 + 4 i) c = 0.$$

Thus, $b = -\dfrac{2 + 4 i}{5}
   c$ .

Then

$$\left[\matrix{a \cr b \cr c \cr}\right] = a \cdot \left[\matrix{1 \cr 0 \cr 0 \cr}\right] + c \cdot \left[\matrix{ 0 \cr \noalign{\vskip2pt} -\dfrac{2 + 4 i}{5} \cr \noalign{\vskip2pt} 1 \cr}\right].$$

Hence, taking $a = 1$ and $c
   = 0$ , I find that $(1, 0, 0)$ is an eigenvector. To get a "nice" second eigenvector, take $a = 0$ and $c = -5$ (to clear the denominator); this gives $(0, 2 +
   4 i, -5)$ as a second eigenvector.

For $\lambda = 6$ ,

$$A - 6 I = \left[\matrix{ -9 & 0 & 0 \cr 0 & -4 & 2 + 4 i \cr 0 & 2 - 4 i & -5 \cr}\right] \to \left[\matrix{ 1 & 0 & 0 \cr 0 & 2 - 4 i & -5 \cr 0 & 0 & 0 \cr}\right].$$

Using $(a, b, c)$ as the solution vector, the last matrix gives the equations

$$a = 0 \quad\hbox{and}\quad (2 - 4 i) b + (-5) c = 0 .$$

Thus, $(0, 5, 2 - 4 i)$ is an eigenvector.

To get a unitary diagonalizing matrix, divide each eigenvector by its length. You get

$$U = \dfrac{1}{\sqrt{45}} \left[\matrix{ \sqrt{45} & 0 & 0 \cr \noalign{\vskip2pt} 0 & 2 + 4 i & 5 \cr \noalign{\vskip2pt} 0 & -5 & 2 - 4 i \cr}\right].$$

The diagonal matrix is

$$D = \left[\matrix{ -3 & 0 & 0 \cr 0 & -3 & 0 \cr 0 & 0 & 6 \cr}\right].\quad\halmos$$


45. Find the Fourier expansion on the interval $-1 \le x \le 1$ of

$$f(x) = \cases{ 0 & if $-1 \le x \le 0$ \cr 1 & if $0 < x \le 1$ \cr}.$$

Since the integral of $f(x)$ on $-1 \le x \le 1$ is just the area under the curve, I have

$$a_0 = \int_{-1}^1 f(x)\,dx = 1.$$

For $n \ge 1$ , I have

$$a_n = \int_{-1}^1 f(x) \cos \pi n x\,dx = \int_0^1 \cos \pi n x\,dx = \left[\dfrac{1}{\pi n}\sin \pi n x\right]_0^1 = 0,$$

$$b_n = \int_{-1}^1 f(x) \sin \pi n x\,dx = \int_0^1 \sin \pi n x\,dx = \left[-\dfrac{1}{\pi n}\cos \pi n x\right]_0^1 = \dfrac{1}{\pi n}[1 - (-1)^n].$$

Hence, the Fourier expansion is

$$f(x) \sim \dfrac{1}{2} + \sum_{n=1}^\infty \dfrac{1}{\pi n}[1 - (-1)^n]\sin \pi n x.\quad\halmos$$
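You can watch the convergence numerically by evaluating a partial sum of the series away from the jump at $x = 0$ (a NumPy sketch, assuming NumPy is installed):

    import numpy as np

    def partial_sum(x, N=200):
        # Sum the first N terms of the Fourier series at the points in x.
        n = np.arange(1, N + 1)
        return 0.5 + np.sum((1 - (-1.0)**n) / (np.pi * n)
                            * np.sin(np.pi * np.outer(x, n)), axis=1)

    print(partial_sum(np.array([-0.5, 0.5])))   # approximately [0, 1]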


The best thing for being sad is to learn something. - Merlyn, in T. H. White's The Once and Future King



Copyright 2020 by Bruce Ikenaga