Matrix Groups

Many groups have matrices as their elements. The operation is usually either matrix addition or matrix multiplication.


Example. Let G denote the set of all $2\times 3$ matrices with real entries. (Remember that "$2\times 3$ " means the matrices have 2 rows and 3 columns.) Here are some elements of G:

$$\left[\matrix{1 & 2 & 3 \cr 4 & 5 & 6 \cr}\right], \quad \left[\matrix{0 & 0 & 0 \cr 0 & 0 & 0 \cr}\right], \quad \left[\matrix{1.17 & -2.46 & \pi\sqrt{3} \cr \noalign{\vskip2pt} 147.2 & \dfrac{22}{7} & 0 \cr}\right].$$

Show that G is a group under matrix addition.

If you add two $2\times 3$ matrices with real entries, you obtain another $2\times 3$ matrix with real entries:

$$\left[\matrix{a & b & c \cr d & e & f \cr}\right] + \left[\matrix{u & v & w \cr x & y & z \cr}\right] = \left[\matrix{ a + u & b + v & c + w \cr d + x & e + y & f + z \cr}\right].$$

That is, addition yields a binary operation on the set.

You should know from linear algebra that matrix addition is associative.

The identity element is the $2\times 3$ zero matrix:

$$\left[\matrix{0 & 0 & 0 \cr 0 & 0 & 0 \cr}\right] + \left[\matrix{a & b & c \cr d & e & f \cr}\right] = \left[\matrix{a & b & c \cr d & e & f \cr}\right], \quad \left[\matrix{a & b & c \cr d & e & f \cr}\right] + \left[\matrix{0 & 0 & 0 \cr 0 & 0 & 0 \cr}\right] = \left[\matrix{a & b & c \cr d & e & f \cr}\right].$$

The inverse of a $2\times 3$ matrix under this operation is the matrix obtained by negating the entries of the original matrix:

$$\left[\matrix{a & b & c \cr d & e & f \cr}\right] + \left[\matrix{-a & -b & -c \cr -d & -e & -f \cr}\right] = \left[\matrix{0 & 0 & 0 \cr 0 & 0 & 0 \cr}\right], \quad \left[\matrix{-a & -b & -c \cr -d & -e & -f \cr}\right] + \left[\matrix{a & b & c \cr d & e & f \cr}\right] = \left[\matrix{0 & 0 & 0 \cr 0 & 0 & 0 \cr}\right].$$
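
If you'd like to see the axioms checked numerically, here is a minimal sketch, assuming NumPy is available; the sample matrices are the ones displayed above.

```python
import numpy as np

# Two sample 2x3 matrices with real entries (from the examples above).
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
B = np.array([[1.17, -2.46, np.pi * np.sqrt(3)],
              [147.2, 22 / 7, 0.0]])

# Closure: the sum is again a 2x3 matrix with real entries.
assert (A + B).shape == (2, 3)

# Identity: adding the 2x3 zero matrix changes nothing.
Z = np.zeros((2, 3))
assert np.allclose(A + Z, A) and np.allclose(Z + A, A)

# Inverses: negating the entries gives the additive inverse.
assert np.allclose(A + (-A), Z)
```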

Notice that I don't get a group if I try to apply matrix addition to the set of all matrices with real entries. This does not define a binary operation on the set, because matrices of different dimensions can't be added.

In general, the set of $m \times n$ matrices with real entries --- or with entries in $\integer$ , $\rational$ , $\complex$ , or $\integer_k$ for $k \ge 2$ --- forms a group under matrix addition.

As a special case, the set of $n \times n$ matrices with real entries forms a group under matrix addition. This group is denoted $M(n, \real)$ . As you might guess, $M(n, \rational)$ denotes the group of $n \times n$ matrices with rational entries (and so on).


Example. Let G be the group of $2 \times 3$ matrices with entries in $\integer_3$ under matrix addition.

(a) What is the order of G?

(b) Find the inverse of $\displaystyle \left[\matrix{1 & 1 & 2 \cr 0 & 2 & 1 \cr}\right]$ in G.

(a) A $2 \times 3$ matrix has $2 \cdot 3 = 6$ entries. Each entry can be any one of the 3 elements of $\integer_3$ . Therefore, there are $3^6 = 729$ elements.

(b) The inverse is obtained by negating each entry mod 3:

$$\left[\matrix{1 & 1 & 2 \cr 0 & 2 & 1 \cr}\right] + \left[\matrix{2 & 2 & 1 \cr 0 & 1 & 2 \cr}\right] = \left[\matrix{0 & 0 & 0 \cr 0 & 0 & 0 \cr}\right].$$

Hence, the inverse is $\displaystyle \left[\matrix{2 & 2 & 1 \cr 0 & 1 & 2 \cr}\right]$ .
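
Both parts can be verified by brute force in a few lines of Python (standard library only); part (a) is cheap to enumerate since there are only $3^6 = 729$ matrices:

```python
from itertools import product

# (a) Count all 2x3 matrices over Z_3: 3 choices for each of the 6 entries.
entries = 2 * 3
order = 3 ** entries
assert order == len(list(product(range(3), repeat=entries))) == 729

# (b) Verify the additive inverse of [[1, 1, 2], [0, 2, 1]] in Z_3:
#     negate each entry mod 3 and check the sum is the zero matrix.
M = [[1, 1, 2], [0, 2, 1]]
M_inv = [[(-a) % 3 for a in row] for row in M]
assert M_inv == [[2, 2, 1], [0, 1, 2]]
assert all((a + b) % 3 == 0 for row, rinv in zip(M, M_inv)
           for a, b in zip(row, rinv))
```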


Example. Let

$$G = \left\{\left[\matrix{0 & x \cr 0 & y \cr}\right] \Bigm| x, y \in \real\right\}.$$

In words, G is the set of $2\times 2$ matrices with real entries having zeros in the first column.

Show that G is a group under matrix addition.

First,

$$\left[\matrix{0 & x_1 \cr 0 & y_1 \cr}\right] + \left[\matrix{0 & x_2 \cr 0 & y_2 \cr}\right] = \left[\matrix{0 & x_1 + x_2 \cr 0 & y_1 + y_2 \cr}\right] \in G.$$

That is, if you add two elements of G, you get another element of G. Hence, matrix addition gives a binary operation on the set G.

From linear algebra, you know that matrix addition is associative.

The zero matrix $\displaystyle \left[\matrix{0 & 0 \cr 0 & 0 \cr}\right]$ is the identity under matrix addition; it's an element of G, since its first column is all-zero.

Finally, the additive inverse of an element $\displaystyle \left[\matrix{0 & x \cr 0 & y \cr}\right] \in G$ is $\displaystyle \left[\matrix{0 & -x \cr 0 & -y \cr}\right]$ , which is also an element of G. Thus, every element of G has an inverse.

All the axioms for a group have been verified, so G is a group under matrix addition.
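
Here is a quick numerical sanity check of the same three axioms, assuming NumPy; the membership test `in_G` and the sample matrices are just illustrative.

```python
import numpy as np

def in_G(M):
    """Membership test: a 2x2 real matrix whose first column is zero."""
    return M.shape == (2, 2) and np.all(M[:, 0] == 0)

A = np.array([[0.0, 1.5], [0.0, -2.0]])
B = np.array([[0.0, 4.0], [0.0, 7.0]])

assert in_G(A + B)              # closure: the sum stays in G
assert in_G(np.zeros((2, 2)))   # the identity (zero matrix) lies in G
assert in_G(-A)                 # the additive inverse stays in G
```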


Example. Consider the set of matrices

$$G = \left\{\left[\matrix{1 & x \cr 0 & 1 \cr}\right] \Bigm| x \in \real, \quad x \ge 0\right\}.$$

(Notice that x must be nonnegative). Is G a group under matrix multiplication?

First, suppose that $x, y \in \real$ , $x, y \ge 0$ . Then

$$\left[\matrix{1 & x \cr 0 & 1 \cr}\right] \left[\matrix{1 & y \cr 0 & 1 \cr}\right] = \left[\matrix{1 & x + y \cr 0 & 1 \cr}\right].$$

Now $x + y \ge 0$ , so $\displaystyle \left[\matrix{1 & x + y \cr 0 & 1 \cr}\right] \in G$ . Therefore, matrix multiplication gives a binary operation on G.

I'll take for granted the fact that matrix multiplication is associative.

The identity for multiplication is $\displaystyle \left[\matrix{1 & 0 \cr 0 & 1 \cr}\right]$ , and this is an element of G.

However, not all elements of G have inverses. To give a specific counterexample, suppose that for some $x \ge 0$ ,

$$\left[\matrix{1 & x \cr 0 & 1 \cr}\right] \left[\matrix{1 & 2 \cr 0 & 1 \cr}\right] = \left[\matrix{1 & 0 \cr 0 & 1 \cr}\right].$$

Then

$$\left[\matrix{1 & x + 2 \cr 0 & 1 \cr}\right] = \left[\matrix{1 & 0 \cr 0 & 1 \cr}\right].$$

Hence, $x + 2 = 0$ , and $x = -2$ . This contradicts $x \ge 0$ . So the element $\displaystyle \left[\matrix{1 & 2 \cr 0 & 1 \cr}\right]$ of G does not have an inverse.

Therefore, G is not a group under matrix multiplication.
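
You can see the failure computationally as well: $\displaystyle \left[\matrix{1 & 2 \cr 0 & 1 \cr}\right]$ is invertible in $GL(2, \real)$ , but its unique matrix inverse falls outside G. A sketch assuming NumPy, with an illustrative membership test `in_G`:

```python
import numpy as np

def in_G(M):
    """Membership in G: upper triangular with 1's on the diagonal
    and a nonnegative upper-right entry."""
    return (np.allclose(np.diag(M), [1, 1]) and np.isclose(M[1, 0], 0)
            and M[0, 1] >= 0)

M = np.array([[1.0, 2.0], [0.0, 1.0]])
assert in_G(M)

# The only matrix inverse of M is [[1, -2], [0, 1]] -- not in G,
# since its upper-right entry is negative.
M_inv = np.linalg.inv(M)
assert np.allclose(M_inv, [[1.0, -2.0], [0.0, 1.0]])
assert not in_G(M_inv)
```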


Example. $GL(n,\real)$ denotes the set of invertible $n \times n$ matrices with real entries, the general linear group. Show that $GL(n,\real)$ is a group under matrix multiplication.

First, if $A, B \in GL(n, \real)$ , I know from linear algebra that $\det A \ne 0$ and $\det B \ne 0$ . Then

$$\det (AB) = (\det A)\cdot (\det B) \ne 0.$$

Hence, $A B$ is invertible, so $A B \in GL(n, \real)$ . This proves that $GL(n, \real)$ is closed under matrix multiplication.

I will take it as known from linear algebra that matrix multiplication is associative.

The identity matrix is the $n \times n$ matrix

$$I = \left[\matrix{1 & 0 & \cdots & 0 \cr 0 & 1 & \cdots & 0 \cr \vdots & \vdots & \ddots & \vdots \cr 0 & 0 & \cdots & 1 \cr}\right].$$

It is the identity for matrix multiplication: $A I = A = I A$ for all $A \in GL(n, \real)$ .

Finally, since $GL(n,\real)$ is the set of invertible $n \times n$ matrices, every element of $GL(n, \real)$ has an inverse under matrix multiplication; and the inverse of an invertible matrix is itself invertible, so $A^{-1} \in GL(n, \real)$ . Therefore, $GL(n, \real)$ is a group under matrix multiplication.
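
A quick numerical illustration of the closure argument, assuming NumPy; the two sample matrices are arbitrary invertible matrices.

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 1.0]])   # det A = 1
B = np.array([[0.0, 1.0], [-1.0, 3.0]])  # det B = 1

# Closure: det(AB) = det(A) * det(B) != 0, so AB is invertible.
assert not np.isclose(np.linalg.det(A @ B), 0)
assert np.isclose(np.linalg.det(A @ B),
                  np.linalg.det(A) * np.linalg.det(B))

# Identity and inverses behave as expected.
I = np.eye(2)
assert np.allclose(A @ I, A) and np.allclose(I @ A, A)
assert np.allclose(A @ np.linalg.inv(A), I)
```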


Example. $GL(2, \integer_3)$ denotes the set of $2 \times 2$ invertible matrices with entries in $\integer_3$ . The operation is matrix multiplication --- but note that all the arithmetic is performed in $\integer_3$ .

For example,

$$\left[\matrix{2 & 1 \cr 1 & 2 \cr}\right] \left[\matrix{1 & 1 \cr 2 & 1 \cr}\right] = \left[\matrix{1 & 0 \cr 2 & 0 \cr}\right].$$

The proof that $GL(2, \integer_3)$ is a group under matrix multiplication follows the proof in the last example. (In fact, the same thing works with any commutative ring in place of $\real$ or $\integer_3$ ; commutative rings will be discussed later.)

(a) What is the order of $\displaystyle \left[\matrix{1 & 1 \cr 0 & 1 \cr}\right]$ ?

(b) Find the inverse of $\displaystyle \left[\matrix{2 & 1 \cr 2 & 2 \cr}\right]$ .

(a) Notice that

$$\left[\matrix{1 & 1 \cr 0 & 1 \cr}\right]^2 = \left[\matrix{1 & 2 \cr 0 & 1 \cr}\right] \quad\hbox{and}\quad \left[\matrix{1 & 1 \cr 0 & 1 \cr}\right]^3 = \left[\matrix{1 & 0 \cr 0 & 1 \cr}\right].$$

Therefore, $\displaystyle \left[\matrix{1 & 1 \cr 0 & 1 \cr}\right]$ has order 3 in $GL(2, \integer_3)$ .
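
You can confirm the order by repeated multiplication mod 3. A minimal sketch in pure Python; `matmul_mod` is a helper written just for this check:

```python
def matmul_mod(A, B, n=3):
    """Multiply two 2x2 matrices with all arithmetic done mod n."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) % n
             for j in range(2)] for i in range(2)]

I = [[1, 0], [0, 1]]
M = [[1, 1], [0, 1]]

# Multiply by M until the identity appears; count the steps.
P, order = M, 1
while P != I:
    P = matmul_mod(P, M)
    order += 1
assert order == 3
```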

(b) Recall the formula for the inverse of a $2 \times 2$ matrix:

$$\left[\matrix{a & b \cr c & d \cr}\right]^{-1} = \dfrac{1}{a d - b c}\left[\matrix{d & -b \cr -c & a \cr}\right].$$

The formula works in this situation, but you have to interpret the fraction as a multiplicative inverse:

$$\left[\matrix{a & b \cr c & d \cr}\right]^{-1} = (a d - b c)^{-1}\left[\matrix{d & -b \cr -c & a \cr}\right].$$

Thus, since $a d - b c = 2 \cdot 2 - 1 \cdot 2 = 2$ and $2^{-1} = 2$ in $\integer_3$ ,

$$\left[\matrix{2 & 1 \cr 2 & 2 \cr}\right]^{-1} = (2^{-1}) \left[\matrix{2 & 2 \cr 1 & 2 \cr}\right] = 2 \cdot \left[\matrix{2 & 2 \cr 1 & 2 \cr}\right] = \left[\matrix{1 & 1 \cr 2 & 1 \cr}\right].$$

On the other hand, the matrix $\displaystyle \left[\matrix{2 & 1 \cr 1 & 2 \cr}\right]$ is not an element of $GL(2, \integer_3)$ . It has determinant $2 \cdot 2 - 1\cdot 1 = 0$ , so it's not invertible.
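
Both computations can be checked mechanically. Here is a sketch of the adjugate formula over $\integer_n$ in Python; it relies on `pow(det, -1, n)` (available in Python 3.8+) for the multiplicative inverse of the determinant:

```python
def inverse_mod(M, n=3):
    """Invert a 2x2 matrix over Z_n via the adjugate formula,
    or return None if the determinant is not invertible mod n."""
    a, b = M[0]
    c, d = M[1]
    det = (a * d - b * c) % n
    try:
        det_inv = pow(det, -1, n)   # multiplicative inverse of det mod n
    except ValueError:              # raised when det has no inverse mod n
        return None
    return [[(det_inv * d) % n, (det_inv * -b) % n],
            [(det_inv * -c) % n, (det_inv * a) % n]]

assert inverse_mod([[2, 1], [2, 2]]) == [[1, 1], [2, 1]]
assert inverse_mod([[2, 1], [1, 2]]) is None   # det = 0 mod 3
```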


Example. Show that the following set is a subgroup of $GL(2, \real)$ :

$$SL(2, \real) = \left\{A \in GL(2, \real) \Bigm| \det A = 1\right\}.$$

Suppose $A, B \in SL(2, \real)$ . Then

$$\det (A B) = (\det A)(\det B) = 1 \cdot 1 = 1.$$

Hence, $A B \in SL(2, \real)$ .

Since $\det I = 1$ , the identity matrix is in $SL(2, \real)$ .

Finally, if $A \in SL(2, \real)$ , then $A A^{-1} = I$ implies that

$$(\det A)(\det A^{-1}) = \det I = 1.$$

But $\det A = 1$ , so $\det A^{-1} = 1$ , and hence $A^{-1} \in SL(2, \real)$ .

Therefore, $SL(2, \real)$ is a subgroup of $GL(2, \real)$ .
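
A quick numerical spot check of the determinant bookkeeping, assuming NumPy; the two sample matrices were chosen to have determinant 1:

```python
import numpy as np

# Two sample matrices of determinant 1.
A = np.array([[1.0, 3.0], [0.0, 1.0]])
B = np.array([[2.0, 1.0], [1.0, 1.0]])
assert np.isclose(np.linalg.det(A), 1) and np.isclose(np.linalg.det(B), 1)

# Closure: det(AB) = det(A) * det(B) = 1.
assert np.isclose(np.linalg.det(A @ B), 1)

# The identity and inverses stay in SL(2, R).
assert np.isclose(np.linalg.det(np.eye(2)), 1)
assert np.isclose(np.linalg.det(np.linalg.inv(A)), 1)
```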



Copyright 2018 by Bruce Ikenaga