06-The Commutator

We introduce the concept of a commutator of matrices and explore its implications in the study of matrix groups.

The Commutator

In general, matrices do not commute. We can quantify this failure to commute with the commutator:

$$[A,B] = AB - BA,$$

which defines an antisymmetric, bilinear map on a chosen set of matrices, be it $\mathsf{Hom}(\mathbb{C}^{n})$, $\mathsf{Hom}(\mathbb{R}^{n})$, or perhaps a subspace thereof.

A potentially silly example of such a subspace is the set of diagonal matrices, whose commutators all vanish. Evidently the product of two diagonal matrices is again diagonal, so the diagonal matrices form their own subalgebra under matrix multiplication. Because the commutator vanishes for all members of this subalgebra, we call it an abelian algebra.
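As a quick sanity check, here is a minimal NumPy sketch (the helper `comm` and the random test matrices are our own, not from the text) that verifies antisymmetry, bilinearity, and the vanishing commutator of diagonal matrices:

```python
import numpy as np

def comm(A, B):
    """Commutator [A, B] = AB - BA."""
    return A @ B - B @ A

rng = np.random.default_rng(0)
A, B, C = (rng.standard_normal((3, 3)) for _ in range(3))

# Antisymmetry: [A, B] = -[B, A]
assert np.allclose(comm(A, B), -comm(B, A))

# Bilinearity (checked in the first slot): [2A - 0.5B, C] = 2[A, C] - 0.5[B, C]
assert np.allclose(comm(2*A - 0.5*B, C), 2*comm(A, C) - 0.5*comm(B, C))

# Diagonal matrices commute, so their commutator vanishes
D1, D2 = np.diag([1., 2., 3.]), np.diag([4., 5., 6.])
assert np.allclose(comm(D1, D2), 0.0)
```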

The Commutator and the Exponential Map

Let $V$ be a finite-dimensional vector space, like $\mathbb{C}^{n}$, so that $\mathsf{End}(V)$ can be modeled by matrices, say $\mathsf{Hom}(\mathbb{C}^{n})$. In our study of the exponential map, we saw that any element $g_{a}$ of the group $\mathsf{GL}(V)$ can be written as an exponential,

$$g_{a} = e^{M_{a}},\quad M_{a}\in \mathsf{End}(V).$$

Let us consider the matrix product of two such elements $g_{a}g_{b}$.

$$g_{a}g_{b} =e^{M_{a}}e^{M_{b}} = \sum_{m,n = 0}^{\infty} \frac{1}{m!n!}M_{a}^{m}M_{b}^{n}.$$

Because in general $[M_{a},M_{b}]\neq 0$, the ordering matters: in such a product of infinite sums$^{1}$, every term has its $M_{a}$ factors to the left and its $M_{b}$ factors to the right. For example,

$$g_{a}g_{b} = \mathbb{1} + M_{a} + M_{b} +\frac{1}{2}M_{a}^{2}+ M_{a}M_{b} +\frac{1}{2}M_{b}^{2}+ \cdots.$$

By exchanging $a$ and $b$, we can represent the other product,

$$g_{b}g_{a} = \mathbb{1} + M_{a} + M_{b} +\frac{1}{2}M_{a}^{2}+ M_{b}M_{a} +\frac{1}{2}M_{b}^{2}+ \cdots.$$

Thus, the commutator of $g_{a}$ and $g_{b}$ is given by

$$[g_{a},g_{b}] = [M_{a},M_{b}] + \frac{1}{2!} \left([M_{a}^{2},M_{b}] - [M_{b}^{2},M_{a}]\right) + \cdots,$$

so that, in particular, the commutator $[g_{a},g_{b}]$ depends on the commutator $[M_{a},M_{b}]$.
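This leading-order behavior is easy to check numerically. The following sketch, our own illustration using SciPy's matrix exponential, draws two small random matrices and confirms that $[g_{a},g_{b}]$ agrees with $[M_{a},M_{b}]$ up to higher-order corrections:

```python
import numpy as np
from scipy.linalg import expm

def comm(A, B):
    return A @ B - B @ A

rng = np.random.default_rng(1)
eps = 1e-3
Ma = eps * rng.standard_normal((3, 3))
Mb = eps * rng.standard_normal((3, 3))

ga, gb = expm(Ma), expm(Mb)

# [g_a, g_b] matches [M_a, M_b] up to higher-order corrections:
print(np.max(np.abs(comm(Ma, Mb))))                 # leading term, O(eps^2)
print(np.max(np.abs(comm(ga, gb) - comm(Ma, Mb))))  # residual, O(eps^3)
```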

There is a more general statement of this fact, often referred to as the Baker-Campbell-Hausdorff formula, which gives an explicit form for the product

\begin{equation}\label{bch}e^{A}e^{B} = e^{A + B + \frac{1}{2}[A,B] + \frac{1}{12}[A,[A,B]] - \frac{1}{12}[B,[A,B]] + \cdots }.\end{equation}

We shall not prove this fact here, but rather motivate it through the proof of an easier Lemma.
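That said, the truncated formula lends itself to a quick numerical sanity check. In the sketch below (our own construction, not part of the proof to come), the exponent is truncated after the third-order nested commutators, and the residual shrinks accordingly:

```python
import numpy as np
from scipy.linalg import expm

def comm(A, B):
    return A @ B - B @ A

rng = np.random.default_rng(2)
eps = 1e-2
A = eps * rng.standard_normal((3, 3))
B = eps * rng.standard_normal((3, 3))

# BCH exponent truncated after the third-order nested commutators
Z = (A + B + comm(A, B) / 2
     + comm(A, comm(A, B)) / 12
     - comm(B, comm(A, B)) / 12)

# Residual should be O(eps^4), i.e. far below eps
print(np.max(np.abs(expm(A) @ expm(B) - expm(Z))))
```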

Lemma: Let $A$ and $B$ be square matrices of the same finite dimension, and let $t$ be a formal variable such that $\frac{d}{dt}e^{At} = Ae^{At}$. Then we have the power series expansion in $t$

$$e^{At}Be^{-At} = B + [A,B]t + \frac{1}{2!}[A,[A,B]]t^{2} + \frac{1}{3!}[A,[A,[A,B]]]t^{3} + \cdots.$$

Proof. We proceed by induction. Let

$$f(t) = e^{At}Be^{-At},$$

and let

$$f^{(n)}(t) = \frac{d^{n}f(t)}{dt^{n}}.$$

Suppose that

$$f^{(n)}(t) = [A,f^{(n-1)}(t)].$$

Taking the derivative of the right-hand side,

$$\frac{d}{dt}\left(Af^{(n-1)}(t) - f^{(n-1)}(t)A\right) = Af^{(n)}(t) - f^{(n)}(t)A,$$

since $A$ is independent of the formal parameter $t$. Thus

$$f^{(n)}(t) = [A,f^{(n-1)}(t)] \Rightarrow f^{(n+1)}(t) = [A,f^{(n)}(t)].$$

In particular this holds for $t=0$.

Next observe that

$$\frac{d}{dt}(e^{At}Be^{-At}) = Ae^{At}Be^{-At} + e^{At}B(-A)e^{-At},$$

so that at $t=0$,

$$\frac{d}{dt}(e^{At}Be^{-At})\Big|_{t=0} = [A,B].$$

The hypothesis, which states that $f^{(n)}(0)$ is the $n$-fold nested commutator of $A$ with $B$, therefore follows by induction from this base case. Expanding $f(t)$ as the formal Taylor series $f(t) = \sum_{n}\frac{t^{n}}{n!}f^{(n)}(0)$ then yields the claimed expansion. $\square$
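The lemma also admits a direct numerical check; this sketch, again our own, sums the nested-commutator series and compares it with $e^{At}Be^{-At}$:

```python
import math
import numpy as np
from scipy.linalg import expm

def comm(A, B):
    return A @ B - B @ A

rng = np.random.default_rng(3)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
t = 0.1

lhs = expm(A * t) @ B @ expm(-A * t)

# Sum B + [A,B] t + [A,[A,B]] t^2/2! + ... by iterating the nested commutator
rhs, term = B.copy(), B.copy()
for n in range(1, 25):
    term = comm(A, term)                       # n-fold nested commutator
    rhs = rhs + term * t**n / math.factorial(n)

print(np.max(np.abs(lhs - rhs)))  # near machine precision
```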


$^{1}$: Convergence of such sums is not assumed; often they are just written as formal sums. Sufficiently close to the identity, these sums do converge.
