Ahoy! This page uses MathJax to typeset math symbols. If you’re seeing code rather than nice, typeset symbols, you may need to hit “refresh” to render things cleanly.
Applied Representation Theory : 001
The Exponential Map
We study the relationship between vector spaces and groups through the exponential map.
What's inside:
Revisiting elementary groups
The exponential function for matrices
An introduction to algebras
The Trouble with Zero
Last time, we explored the real and complex numbers as simple examples of groups. Under addition, all axioms are obeyed just fine. Under multiplication, however, we swept one issue under the rug: zero.
Neither $\mathbb{R}$ nor $\mathbb{C}$ forms a group under multiplication. There's a glaring inconsistency with the group axioms: zero does not have a multiplicative inverse. There is no number $x$ such that
$$ x\cdot 0 = 1.$$
Instead, we consider the multiplicative groups $\mathbb{R}^{\times}$ and $\mathbb{C}^{\times}$, which consist of all nonzero numbers. Specifically, for example,
$$\mathbb{C}^{\times} = \left\{ z \in \mathbb{C} \;\Big|\; z \neq 0\right\}.$$
It's not hard to check that both of these "punctured" fields form groups.
This structure generalizes in a way we've already seen. Consider the vector space $\mathbb{C}^{n}$. The endomorphisms of $\mathbb{C}^{n}$ are just the $n\times n$ complex matrices,
$$\mathsf{End}(\mathbb{C}^{n}) = \mathsf{Hom}_{n,n}\mathbb{C}.$$
These endomorphisms are not a group, but rather contain the group
$$\mathsf{GL}(\mathbb{C}^{n}) = \left\{ M \in \mathsf{Hom}_{n,n}\mathbb{C} \;\Big|\; \det M \neq 0\right\}.$$
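For a quick numerical illustration of this membership condition, here is a minimal sketch using NumPy (the two matrices are arbitrary examples, not anything specific from the text):

```python
import numpy as np

# An invertible complex matrix and a singular one (arbitrary examples).
m_invertible = np.array([[1 + 1j, 0], [2, 3j]])
m_singular = np.array([[1, 2], [2, 4]], dtype=complex)  # second row = 2 * first row

for m in (m_invertible, m_singular):
    det = np.linalg.det(m)
    # A nonzero determinant means m lies in GL(C^n); a zero determinant means it does not.
    print(det, not np.isclose(det, 0))
```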
The Exponential Map
The Real Numbers
Another way to deal with zero is to simply restrict ourselves to positive numbers. For $\mathbb{R}$, consider the map
$$\exp : t \mapsto e^{t} = \sum_{n=0}^{\infty} \frac{1}{n!}t^{n}.$$
The exponential function maps all of $\mathbb{R}$ to the positive, real line. In this context
$$0\mapsto 1,$$
and the limit
$$\lim_{t\rightarrow -\infty} e^{t} = 0.$$
In this sense, all of the negative, real numbers get mapped into the open interval $(0,1)$. Restricted to the strictly positive, real numbers, multiplication is once again a group operation.
Put another way, the image of the exponential map:
$$\exp[\mathbb{R}] = \left\{ \exp(x) \;\Big|\; x\in \mathbb{R}\right\},$$
is the set of strictly positive, real numbers,
$$\mathbb{R}^{+} = \left\{ x \in \mathbb{R} \;\Big|\; x > 0\right\}.$$
This forms a subgroup of $\mathbb{R}^{\times}$, that is, a subset which is also a group under the same operation$^{1}$.
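As a sketch (using NumPy, with arbitrary sample points), one can check both claims at once: the exponential lands in the strictly positive reals, and it converts addition in $\mathbb{R}$ into multiplication in $\mathbb{R}^{+}$:

```python
import numpy as np

rng = np.random.default_rng(0)
xs = rng.normal(size=5)  # arbitrary real inputs, some of them negative

# The image is strictly positive ...
print(np.all(np.exp(xs) > 0))  # True

# ... and exp turns addition into multiplication: exp(a + b) = exp(a) exp(b).
a, b = xs[0], xs[1]
print(np.isclose(np.exp(a + b), np.exp(a) * np.exp(b)))  # True
```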
We can always obtain that subgroup by taking the quotient,
$$\exp[\mathbb{R}] = \mathbb{R}^{+} \cong \mathbb{R}^{\times}\big/\, \mathbb{Z}_{2} = \left\{ |x| \;\Big|\; x \in \mathbb{R}^{\times} \right\}.$$
That we can take such a quotient of groups demonstrates the relative structure of $\mathbb{R}^{\times}$ and $\exp[\mathbb{R}]$. The former is equipped with a second subgroup,
$$\mathbb{Z}_{2} = \left\{-1,1\right\},$$
such that any member of $\mathbb{R}^{\times}$ can be thought of as a pair of elements, one from $\exp[\mathbb{R}]$ and one from $\mathbb{Z}_{2}$:
\begin{equation}\label{rz2}\mathbb{R}^{\times}\cong \exp[\mathbb{R}]\times \mathbb{Z}_{2}.\end{equation}
While it might seem trivial for the simple case of real numbers, this fact generalizes to representations that act on $\mathbb{R}^{n}$ in a profound way. More on that soon.
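To make \eqref{rz2} concrete, here is a minimal sketch (plain NumPy; the helper `split` and its sample inputs are illustrative choices) that factors a nonzero real number into a $\mathbb{Z}_{2}$ part and an $\exp[\mathbb{R}]$ part:

```python
import numpy as np

def split(x):
    """Decompose a nonzero real x as x = sign * exp(t), with sign in {-1, +1}."""
    if x == 0:
        raise ValueError("zero is not in R^x")
    sign = 1.0 if x > 0 else -1.0
    t = np.log(abs(x))  # the preimage of |x| under exp
    return sign, t

for x in (2.5, -0.3):
    sign, t = split(x)
    print(np.isclose(x, sign * np.exp(t)))  # True for both samples
```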
The Complex Numbers
This same construction carries over to the complex plane, with curious implications. The real axis behaves just as $\mathbb{R}$ did above, but the entire imaginary axis is mapped to the unit circle,
$$ \exp : iy \mapsto e^{iy}.$$
Thus we find, for real parameters $x$ and $y$, a map to polar coordinates
\begin{equation}\label{polar}\exp : x + iy \mapsto e^{x + iy} = e^{x}e^{iy} \rightsquigarrow r e^{i\theta}.\end{equation}
The image of the exponential map, $\exp[\mathbb{C}]$, is all of $\mathbb{C}^{\times}$.
Of course, this is not to say that $\exp[\mathbb{C}]$ is without interesting substructure. As \eqref{polar} makes explicit, we can always quotient $\mathbb{C}^{\times}$ by the subgroup of positive scalars, $\exp[\mathbb{R}]$,
$$\mathbb{C}^{\times} \big/ \exp[\mathbb{R}] = \left\{ \frac{z}{|z|} \;\Big|\; z \in \mathbb{C}^{\times}\right\}$$
to get the unit circle $S^{1}$ - otherwise known as the abelian group $\mathsf{U}(1)$. Hence, we can think of
\begin{equation}\label{cu1}\mathbb{C}^{\times} \cong \exp[\mathbb{R}]\times \mathsf{U}(1).\end{equation}
Note that for both \eqref{rz2} and \eqref{cu1}, the absence of zero is crucial to the construction. As with the reals, this construction generalizes to discussions of $\mathsf{End}(\mathbb{C}^{n})$.
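The complex decomposition \eqref{cu1} can be checked the same way; the following sketch (with an arbitrary nonzero sample $z$) splits $z$ into its $\exp[\mathbb{R}]$ factor and its $\mathsf{U}(1)$ factor:

```python
import numpy as np

z = -1.5 + 2.0j               # an arbitrary nonzero complex number
r = abs(z)                    # the exp[R] factor: a strictly positive modulus
phase = z / abs(z)            # the U(1) factor: a point on the unit circle

print(np.isclose(abs(phase), 1.0))                            # True: phase lies on S^1
print(np.isclose(z, r * phase))                               # True: z = |z| * (z / |z|)
print(np.isclose(z, np.exp(np.log(r) + 1j * np.angle(z))))    # True: z = e^{x + i y}
```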
The Exponential of a Matrix
The definition of the exponential map ports directly to both $\mathsf{End}(\mathbb{R}^{n})$ and $\mathsf{End}(\mathbb{C}^{n})$. For either case, let $M$ be such an $n\times n$ matrix. Then
$$\exp: M \mapsto e^{M} = \sum_{k=0}^{\infty}\frac{1}{k!}M^{k}.$$
Using methods of multivariable calculus, one can show
\begin{equation}\label{jacobi}\det e^{M} = e^{\mathsf{Tr}\,M},\end{equation}
where $\mathsf{Tr}\,M$ represents the trace of the matrix $M$, that is to say, the sum of its diagonal elements. Therefore, given any matrix $M$ with finite entries, we find that the corresponding matrix $e^{M}$ is invertible.
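A numerical sanity check of \eqref{jacobi} is straightforward; the sketch below (an arbitrary $2\times 2$ sample, a truncated power series, and SciPy's `expm` for comparison) verifies the determinant-trace identity and the resulting invertibility:

```python
import math

import numpy as np
from scipy.linalg import expm

M = np.array([[0.0, 1.0],
              [-2.0, 0.5]])   # an arbitrary real 2x2 matrix

# Truncated power series for e^M; thirty terms is plenty for this small matrix.
series = sum(np.linalg.matrix_power(M, k) / math.factorial(k) for k in range(30))

print(np.allclose(series, expm(M)))                               # the series agrees with expm
print(np.isclose(np.linalg.det(expm(M)), np.exp(np.trace(M))))    # det e^M = e^{Tr M}
print(not np.isclose(np.linalg.det(expm(M)), 0.0))                # hence e^M is invertible
```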
To summarize, the exponential map took the fields $\mathbb{R}$ and $\mathbb{C}$ to the corresponding (multiplicative) groups, $\mathbb{R}^{+}$ and $\mathbb{C}^{\times}$. It also took the endomorphisms to the groups,
$$ \exp : \mathsf{Hom}_{n,n}\mathbb{A} \rightarrow \mathsf{GL}(\mathbb{A}^{n}),$$
where $\mathbb{A}$ is either of $\mathbb{R}$ or $\mathbb{C}$.
The common thread here is that the exponential map takes vector spaces to groups by removing zero. More precisely, every member of the image of the exponential map is invertible.
Introduction to Algebras
The point of this section is not to demonize zero. Rather, we are looking to emphasize the difference between vector spaces and groups. Let $V$ be a vector space, perhaps $\mathbb{C}^{n}$. You have already shown in the exercises that $\mathsf{End}(V)$ is also a vector space. While it is not a group, it is certainly true that for two maps $f$ and $g$ in $\mathsf{End}(V)$, the composition of those maps $f\circ g$ is also in $\mathsf{End}(V)$. Explicitly, for $v$ in $V$,
$$(f\circ g)(v) = f(g(v)).$$
It's a fundamental fact from set theory that composition of functions like these is an associative operation,
$$f\circ(g\circ h) = (f\circ g)\circ h = f \circ g \circ h.$$
For a concrete example, notice that the composition of matrices in $\mathsf{End}(\mathbb{A}^{n})$ is just given by matrix multiplication.
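As a quick numerical illustration of that concrete example (a sketch with three arbitrary random matrices standing in for $f$, $g$, and $h$):

```python
import numpy as np

rng = np.random.default_rng(1)
f, g, h = (rng.normal(size=(3, 3)) for _ in range(3))  # three arbitrary endomorphisms of R^3

# Composition is matrix multiplication, and it is associative: (f o g) o h = f o (g o h).
print(np.allclose((f @ g) @ h, f @ (g @ h)))  # True
```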
Thus, the vector space of endomorphisms is closed under the associative operation of composition. It's not a group, because such maps need not be invertible, thanks to the zero map, which takes all vectors in $V$ to $0$:
$$(0\circ f) = (f \circ 0) = 0.$$
But that zero map is crucial. It makes $\mathsf{End}(V)$ into a vector space, as
$$f + (-f) = 0,$$
for all $f$ in $\mathsf{End}(V)$.
We therefore define a slightly less rigid algebraic structure for $\mathsf{End}(V)$.
An algebra is a vector space $V$ together with a bilinear map:
$$\star : V\times V \rightarrow V.$$
A bilinear map is linear in both factors, so that
$$(af + bg)\star h = a(f\star h) + b(g\star h),$$
and
$$f\star (ag + bh) = a(f\star g) + b(f\star h),$$
for all $f,g,h$ in $V$ and scalars $a$ and $b$.
An associative algebra is an algebra whose bilinear map amounts to an associative operation. The algebra of matrices $\mathsf{End}(\mathbb{C}^{n}) = \mathsf{Hom}_{n,n}\mathbb{C}$ is an associative algebra. $\mathsf{Hom}_{n,n}\mathbb{R}$ is a subset of $\mathsf{Hom}_{n,n}\mathbb{C}$, and is also an associative algebra. Such a subset is called a subalgebra.
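As a sketch of the definition in action, one can check numerically that matrix multiplication is bilinear in exactly this sense (the matrices and scalars below are arbitrary samples):

```python
import numpy as np

rng = np.random.default_rng(2)
f, g, h = (rng.normal(size=(2, 2)) for _ in range(3))  # arbitrary elements of Hom_{2,2} R
a, b = 1.7, -0.4                                       # arbitrary scalars

# Linearity in the first factor: (a f + b g) * h = a (f * h) + b (g * h).
print(np.allclose((a * f + b * g) @ h, a * (f @ h) + b * (g @ h)))  # True

# Linearity in the second factor: f * (a g + b h) = a (f * g) + b (f * h).
print(np.allclose(f @ (a * g + b * h), a * (f @ g) + b * (f @ h)))  # True
```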
We shall study algebras in detail next time.
Next Time
Associative algebras of finite dimensional vector spaces
Lie algebras
Further Reading
While popular with physicists, Howard Georgi's book Lie Algebras in Particle Physics is a disorderly mess, and so may be hard to read. Nevertheless, it has many examples familiar from physics. From a mathematical point of view, a reasonable text with a lot of problems is Hungerford's Algebra. Brian C. Hall also wrote a great book, Lie Groups, Lie Algebras, and Representations.
Exercises
The Endomorphism algebra: Let $V$ be a vector space. Prove that $\mathsf{End}(V)$ is an associative algebra under composition of maps. Hence argue that the image $\exp[\mathsf{End}(V)]$ lies inside the group $\mathsf{GL}(V)$.
The cross-product: Show that $\mathbb{R}^{3}$ together with the cross-product familiar from classical vector analysis is an algebra. Is this algebra an associative algebra?
Algebra isomorphisms: Extend the definition of a group homomorphism to an algebra homomorphism. What makes two algebras isomorphic?
Group algebras: Let $G$ be a group and let $\mathbb{A}$ be either of $\mathbb{R}$ or $\mathbb{C}$. The Group algebra over $\mathbb{A}$ - denoted $\mathbb{A}[G]$ - is defined to be the set of formal linear combinations of group members, with scalars drawn from $\mathbb{A}$.
This is a rather abstract construction. For $a$ and $b$ in $\mathbb{A}$, and $g$ and $h$ in $G$, $ag + bh$ is a vector in $\mathbb{A}[G]$. We have
$$\dim \mathbb{A}[G] = |G|,$$
where $|G|$ is the order of the group - the total number of group members - whenever $G$ is finite. Argue that in this case $\mathbb{A}[G]$ is isomorphic as a vector space to $\mathbb{A}^{|G|}$.
If $G$ is abelian, show that
$$\exp(g)\exp(h) = \exp(g+h).$$
Show that this relationship can fail when $G$ is nonabelian (a matrix illustration follows this exercise). What can you deduce about the relationship of $\mathbb{A}[G]$ and its image under the exponential, $\exp[\mathbb{A}[G]]$?
Finally, what are $\mathsf{End}(\mathbb{A}[G])$ and $\mathsf{GL}(\mathbb{A}[G])$?
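The exercise itself concerns $\mathbb{A}[G]$, but as a hint in the same spirit, here is the promised matrix illustration (a sketch; the two matrices are arbitrary noncommuting examples, with SciPy's `expm` computing the exponentials) of how the identity $\exp(g)\exp(h) = \exp(g+h)$ can fail without commutativity:

```python
import numpy as np
from scipy.linalg import expm

# Two noncommuting matrices, standing in for noncommuting algebra elements.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0, 0.0], [1.0, 0.0]])

print(np.allclose(A @ B, B @ A))                    # False: A and B do not commute
print(np.allclose(expm(A) @ expm(B), expm(A + B)))  # False: the exponential identity fails

# For commuting elements the identity does hold, e.g. A with itself:
print(np.allclose(expm(A) @ expm(A), expm(A + A)))  # True
```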
The Exterior algebra: Generalize the cross-product to a bilinear operation on $\mathbb{R}^{n}$, for $n>1$, as follows. Let $v$ and $w$ be vectors in $\mathbb{R}^{n}$. Define the wedge product as the anticommuting juxtaposition
$$w \wedge v = - v\wedge w.$$
Explicitly, let $e_{i}$, with $1\leq i \leq n$, form an orthonormal basis for $\mathbb{R}^{n}$. Argue that $e_{i}\wedge e_{j} = 0$ if and only if $i=j$. Write down all possible nonzero wedge products of these basis elements.
Argue that the linear span of all these possible wedge products of $e_{i}$ form a vector space. Show that the dimension of that vector space is $\frac{1}{2}n(n-1)$. Call this vector space $\bigwedge(\mathbb{R}^{n})$. Argue that $\bigwedge(\mathbb{R}^{n})$ together with the wedge product is an algebra. Is this algebra associative? What is the image of this algebra under the exponential map?
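A small counting sketch (pure Python; the helper name is an illustrative choice) that lists the independent degree-two wedges $e_{i}\wedge e_{j}$ with $i<j$ and compares the count to $\frac{1}{2}n(n-1)$:

```python
from itertools import combinations

def wedge_basis_size(n):
    """Count the independent wedges e_i ^ e_j with i < j in R^n."""
    return len(list(combinations(range(1, n + 1), 2)))

for n in (2, 3, 4, 5):
    # The combinatorial count agrees with n(n-1)/2 in each case.
    print(n, wedge_basis_size(n), n * (n - 1) // 2)
```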
The Cross Product, revisited: Refine the exterior algebra over $\mathbb{R}^{3}$ as follows. Define
$$e_{i}\wedge e_{j} = \epsilon_{ijk}e_{k},$$
where $\epsilon_{ijk}$ is the totally antisymmetric object with three indices, with $\epsilon_{123} = 1$ and an implicit sum over the repeated index $k$. Argue that this is nothing more than the cross-product on $\mathbb{R}^{3}$.
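This identification can be checked numerically; the sketch below builds $\epsilon_{ijk}$ by hand and compares the resulting product to NumPy's cross-product on a pair of arbitrary test vectors:

```python
import numpy as np

# The totally antisymmetric Levi-Civita symbol on three indices, with eps[0,1,2] = +1.
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0
    eps[i, k, j] = -1.0

def wedge(v, w):
    """The product v_i w_j eps_{ijk} e_k, summed over i and j."""
    return np.einsum('i,j,ijk->k', v, w, eps)

v = np.array([1.0, 2.0, 3.0])
w = np.array([-1.0, 0.5, 2.0])
print(np.allclose(wedge(v, w), np.cross(v, w)))  # True: this is the usual cross-product
```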
The Complex numbers, revisited: Define a bilinear product on $\mathbb{R}^{2}$ such that the resulting algebra is $\mathbb{C}$. To that end, it may help to consider the orthonormal basis for $\mathbb{R}^{2}$:
$$\left\{e_{x},e_{y}\right\}.$$
The Quaternions: The quaternions - often denoted $\mathbb{H}$ - are a four-dimensional algebra based on $\mathbb{R}^{4}$. A generic quaternion $q$ in $\mathbb{H}$ can be expressed in terms of the four real numbers, $a,b,c,d$ as:
$$q = a + bi + cj + dk,$$
where $i$, $j$ and $k$ are orthogonal, unit, imaginary quaternions, not unlike the unit $i$ in $\mathbb{C}$. In particular
$$i^{2} = j^{2} = k^{2} = -1.$$
We also have the triple product
$$ijk = -1.$$
Express these rules in terms of a bilinear map on $\mathbb{R}^{4}$ as you did with $\mathbb{C}$. Hence show that the product of two imaginary quaternions has real part equal to minus the dot product of their vector parts and imaginary part equal to their cross-product, so that the imaginary quaternions reproduce $\mathbb{R}^{3}$ with the cross-product. What is the image of the imaginary quaternions under the exponential map? What about all of $\mathbb{H}$?
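As a hint (not a full solution), here is a sketch of the quaternion product as a bilinear map on $\mathbb{R}^{4}$, with components ordered $(a,b,c,d)$ for $a + bi + cj + dk$ (the ordering and the helper `qmul` are illustrative choices); it also checks the relation between imaginary quaternions and the cross-product:

```python
import numpy as np

def qmul(p, q):
    """Quaternion product on R^4, with components ordered (a, b, c, d) = a + b i + c j + d k."""
    a1, b1, c1, d1 = p
    a2, b2, c2, d2 = q
    return np.array([
        a1 * a2 - b1 * b2 - c1 * c2 - d1 * d2,
        a1 * b2 + b1 * a2 + c1 * d2 - d1 * c2,
        a1 * c2 - b1 * d2 + c1 * a2 + d1 * b2,
        a1 * d2 + b1 * c2 - c1 * b2 + d1 * a2,
    ])

i = np.array([0.0, 1.0, 0.0, 0.0])
j = np.array([0.0, 0.0, 1.0, 0.0])
k = np.array([0.0, 0.0, 0.0, 1.0])

print(qmul(i, i), qmul(j, j), qmul(k, k))  # each equals (-1, 0, 0, 0)
print(qmul(qmul(i, j), k))                 # i j k = (-1, 0, 0, 0)

# For purely imaginary p and q, the imaginary part of p q is the cross-product of
# their vector parts, and the real part is minus their dot product.
u, v = np.array([1.0, 2.0, 3.0]), np.array([-1.0, 0.5, 2.0])
p, q = np.r_[0.0, u], np.r_[0.0, v]
print(np.allclose(qmul(p, q)[1:], np.cross(u, v)))  # True
print(np.isclose(qmul(p, q)[0], -np.dot(u, v)))     # True
```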
The Symmetric Algebra: Let $V$ be a vector space and let $v$ and $w$ be vectors. As you did with the wedge product, define the symmetric vector product
$$v \star w = w \star v.$$
What is the algebra induced by this product, analogous to $\bigwedge(V)$? Is it associative?
$^{1}$: It also has the same identity element, and therefore the inverse elements are also the same.
©2021 The Pasayten Institute cc by-sa-4.0