Marco Mackaay recently pointed me at a paper by Mikhail Khovanov, which describes a categorification of the Heisenberg algebra H (or anyway its integral form H_{\mathbb{Z}}) in terms of a diagrammatic calculus.  This is very much in the spirit of the Khovanov-Lauda program of categorifying Lie algebras, quantum groups, and the like.  (There’s also another one by Sabin Cautis and Anthony Licata, following up on it, which I fully intend to read but haven’t done so yet. I may post about it later.)

Now, as alluded to in some of the slides from my recent talks, Jamie Vicary and I have been looking at a slightly different way to answer this question, so before I talk about the Khovanov paper, I’ll say a tiny bit about why I was interested.

Groupoidification

The Weyl algebra (or the Heisenberg algebra – the difference being whether the commutation relations that define it give real or imaginary values) is interesting for physics-related reasons, being the algebra of operators associated to the quantum harmonic oscillator.  The particular approach to categorifying it that I’ve worked with goes back to something that I wrote up here, and as far as I know, was originally suggested by Baez and Dolan here.  This categorification is based on “stuff types” (Jim Dolan’s term, based on “structure types”, a.k.a. Joyal’s “species”).  It’s an example of the groupoidification program, the point of which is to categorify parts of linear algebra using the category Span(Gpd).  This has objects which are groupoids, and morphisms which are spans of groupoids: pairs of maps G_1 \leftarrow X \rightarrow G_2.  Since I’ve already discussed the background here before (e.g. here and to a lesser extent here), and the papers I just mentioned give plenty more detail (as does “Groupoidification Made Easy”, by Baez, Hoffnung and Walker), I’ll just mention that this is actually more naturally a 2-category (maps between spans are maps X \rightarrow X' making everything commute).  It’s got a monoidal structure, is additive in a fairly natural way, has duals for morphisms (by reversing the orientation of spans), and more.  Jamie Vicary and I are both interested in the quantum harmonic oscillator – he did this paper a while ago describing how to construct one in a general symmetric dagger-monoidal category.  We’ve been interested in how the stuff type picture fits into that framework, and also in trying to examine it in more detail using 2-linearization (which I explain here).

Anyway, stuff types provide a possible categorification of the Weyl/Heisenberg algebra in terms of spans and groupoids.  They aren’t the only way to approach the question, though – Khovanov’s paper gives a different (though, unsurprisingly, related) point of view.  There are some nice aspects to the groupoidification approach: for one thing, it gives a nice set of pictures for the morphisms in its categorified algebra (they look like groupoids whose objects are Feynman diagrams).  The Khovanov-Lauda program has two great features of its own: the diagrammatic calculus gives a clean visual representation of the 2-morphisms; and by dealing with generators and relations directly, it describes, in some sense [1], the universal answer to the question “What is a categorification of the algebra with these generators and relations?”  Here’s how it works…

Heisenberg Algebra

One way to represent the Weyl/Heisenberg algebra (the two terms refer to different presentations of isomorphic algebras) uses a polynomial algebra P_n = \mathbb{C}[x_1,\dots,x_n].  In fact, there’s a version of this algebra for each natural number n (the stuff-type references above only treat n=1, though extending it to “n-sorted stuff types” isn’t particularly hard).  Specifically, it’s the algebra of operators on P_n generated by the “raising” operators a_k(p) = x_k \cdot p and the “lowering” operators b_k(p) = \frac{\partial p}{\partial x_k}.  The point is that this algebra is characterized by some commutation relations.  For j \neq k, we have:

[a_j,a_k] = [b_j,b_k] = [a_j,b_k] = 0

but on the other hand

[b_k,a_k] = 1

So the algebra could be seen as just a free thing generated by symbols \{a_j,b_k\} with these relations.  These can be understood to be the “raising and lowering” operators for an n-dimensional harmonic oscillator.  This isn’t the only presentation of this algebra.  There’s another one, in which the relation [q_k,p_k] = i (as in i = \sqrt{-1}) has a slightly different interpretation: the q and p operators are the position and momentum operators for the same system.  Finally, a third one – which is the one that Khovanov actually categorifies – is skewed a bit, in that it replaces the a_j with a different set of \hat{a}_j so that the commutation relation actually looks like

[\hat{a}_j,b_k] = b_{k-1}\hat{a}_{j-1}

It’s not instantly obvious that this produces the same result – but the \hat{a}_j can be rewritten in terms of the a_j, and they generate the same algebra.  (Note that for the one-dimensional version, these are in any case the same, taking a_0 = b_0 = 1.)
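Since this first presentation is just a concrete algebra of operators on polynomials, the relations are easy to sanity-check by computer.  Here’s a minimal sympy sketch for the case n = 2 (purely illustrative, and not taken from any of the papers), verifying that the mixed commutators vanish and that [b_k,a_k] acts as the identity on a sample polynomial:

```python
import sympy as sp

# Sanity check of the Weyl/Heisenberg relations for n = 2:
# a_k(p) = x_k * p ("raising") and b_k(p) = dp/dx_k ("lowering"),
# applied to a sample polynomial.
x1, x2 = sp.symbols('x1 x2')
xs = (x1, x2)
p = 3*x1**2*x2 + x2**3 + 5                              # any polynomial will do

a = [lambda q, x=x: x * q for x in xs]                  # multiplication operators
b = [lambda q, x=x: sp.diff(q, x) for x in xs]          # differentiation operators

for j in range(2):
    for k in range(2):
        assert sp.expand(a[j](a[k](p)) - a[k](a[j](p))) == 0       # [a_j, a_k] = 0
        assert sp.expand(b[j](b[k](p)) - b[k](b[j](p))) == 0       # [b_j, b_k] = 0
        comm = b[j](a[k](p)) - a[k](b[j](p))                       # [b_j, a_k] applied to p
        assert sp.expand(comm - (p if j == k else 0)) == 0         # equals delta_{jk} * p
```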

Diagrammatic Calculus

To categorify this, in Khovanov’s sense (though see note [1] below), means to find a category \mathcal{H} whose isomorphism classes of objects correspond to (integer-) linear combinations of products of the generators of H.  Now, in the Span(Gpd) setup, we can say that the groupoid FinSet_0, or equivalently \mathcal{S} = \coprod_n \mathcal{S}_n, represents Fock space.  Groupoidification turns this into the free vector space on the set of isomorphism classes of objects.  This has some extra structure which we don’t need right now, so it makes the most sense to describe it as \mathbb{C}[[t]], the space of power series (where t^n corresponds to the object [n]).  The algebra itself is an algebra of endomorphisms of this space.  It’s this algebra Khovanov is looking at, so the monoidal category in question could really be considered a bicategory with one object, where the monoidal product comes from composition, and the object stands in formally for the space it acts on.  But this space doesn’t enter into the description, so we’ll just think of \mathcal{H} as a monoidal category.  We’ll build it in two steps: the first is to define a category \mathcal{H}'.

The objects of \mathcal{H}' are defined by two generators, called Q_+ and Q_-, and the fact that it’s monoidal (these objects will be the categorifications of a and b).  Thus, there are objects Q_+ \otimes Q_- \otimes Q_+ and so forth.  In general, if \epsilon is some word on the alphabet \{+,-\}, there’s an object Q_{\epsilon} = Q_{\epsilon_1} \otimes \dots \otimes Q_{\epsilon_m}.
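Concretely, the objects of \mathcal{H}' carry no more information than these words, with the monoidal product given by concatenation.  Here’s a throwaway sketch of that bookkeeping in Python (the names are mine, and of course all the real content of \mathcal{H}' lives in the morphisms, which this doesn’t model at all):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Obj:
    """An object Q_epsilon of H', recorded as a word in the alphabet {+, -}."""
    word: str                                  # e.g. "+-+" stands for Q_+ ⊗ Q_- ⊗ Q_+

    def __post_init__(self):
        assert set(self.word) <= {'+', '-'}

    def tensor(self, other: "Obj") -> "Obj":   # the monoidal product is concatenation
        return Obj(self.word + other.word)

Qplus, Qminus, unit = Obj('+'), Obj('-'), Obj('')
assert Qplus.tensor(Qminus).tensor(Qplus) == Obj('+-+')   # Q_+ ⊗ Q_- ⊗ Q_+
assert unit.tensor(Qplus) == Qplus                        # the empty word is the monoidal unit
```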

As in other categorifications in the Khovanov-Lauda vein, we define the morphisms of \mathcal{H}' to be linear combinations of certain planar diagrams, modulo some local relations.  (This type of formalism comes out of knot theory – see e.g. this intro by Louis Kauffman).  In particular, we draw the objects as sequences of dots labelled + or -, and connect two such sequences by a bunch of oriented strands (embeddings of the interval, or circle, in the plane).  Each + dot is the endpoint of a strand oriented up, and each - dot is the endpoint of a strand oriented down.  The local relations mean that we can take these diagrams up to isotopy (moving the strands around), as well as various other relations that define changes you can make to a diagram and still represent the same morphism.  These relations include things like:

which seems visually obvious (imagine tugging hard on the ends on the left hand side to straighten the strands), and the less-obvious:

and a bunch of others.  The main ingredients are cups, caps, and crossings, with various orientations.  Other diagrams can be made by pasting these together.  The point, then, is that any morphism is some \mathbf{k}-linear combination of these.  (I prefer to assume \mathbf{k} = \mathbb{C} most of the time, since I’m interested in quantum mechanics, but this isn’t strictly necessary.)

The second diagram, by the way, is an important part of categorifying the commutation relations.  This would say that Q_- \otimes Q_+ \cong (Q_+ \otimes Q_-) \oplus 1 (the commutation relation has become a decomposition of a certain tensor product).  The point is that the left hand sides show the composition of the two crossings Q_- \otimes Q_+ \rightarrow Q_+ \otimes Q_- and Q_+ \otimes Q_- \rightarrow Q_- \otimes Q_+ in two different orders.  One can use this, plus isotopy, to show the decomposition.

That diagrams are invariant under isotopy means, among other things, that the yanking rule holds:

(and similar rules for up-oriented strands, and zig-zags on the other side).  These conditions amount to saying that the functors - \otimes Q_+ and - \otimes Q_- are two-sided adjoints of each other.  The two cups and caps (with each possible orientation) give the units and counits for the two adjunctions.  So, for instance, in the zig-zag diagram above, there’s a cup which gives a unit map \mathbf{k} \rightarrow Q_- \otimes Q_+ (reading upward), all tensored on the right by Q_-.  This is followed by a cap giving a counit map Q_+ \otimes Q_- \rightarrow \mathbf{k} (all tensored on the left by Q_-).  So the yanking rule essentially just gives one of the identities required for an adjunction.  There are four of them, so in fact there are two adjunctions: one where Q_+ is the left adjoint, and one where it’s the right adjoint.
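The same zig-zag identity shows up for duals in any monoidal category, and it’s easy to check in a familiar setting: for a finite-dimensional vector space V, the functors - \otimes V and - \otimes V^* are two-sided adjoints, the cup is the coevaluation \sum_i e_i \otimes e^i, and the cap is the evaluation pairing.  A small numpy sketch (my own toy example, not the category \mathcal{H}') showing that the “yanked” composite really is the identity:

```python
import numpy as np

n = 3
# cup (coevaluation) k -> V ⊗ V*: 1 |-> sum_i e_i ⊗ e^i, stored as cup[i, j]
cup = np.eye(n)
# cap (evaluation) V* ⊗ V -> k: e^i ⊗ e_j |-> delta_{ij}, stored as cap[i, j]
cap = np.eye(n)

def snake(v):
    """The composite (id_V ⊗ cap) . (cup ⊗ id_V) : V -> V ⊗ V* ⊗ V -> V."""
    zigzag = np.einsum('ij,k->ijk', cup, v)      # v |-> sum_i e_i ⊗ e^i ⊗ v
    return np.einsum('ijk,jk->i', zigzag, cap)   # contract e^i against the last strand

v = np.random.rand(n)
assert np.allclose(snake(v), v)                  # yanking straightens the strand: snake = id_V
```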

Karoubi Envelope

Now, so far this has explained where a category \mathcal{H}' comes from – the one with the objects Q_{\epsilon} described above.  This isn’t quite enough to get a categorification of H_{\mathbb{Z}}: it would be enough to get the version with just one a and one b element, and their powers, but not all the a_j and b_k.  To get all the elements of the (integral form of the) Heisenberg algebra, and in particular to get generators that satisfy the right commutation relations, we need to introduce some new objects.  There’s a convenient way to do this, though, which is to take the Karoubi envelope of \mathcal{H}'.

The Karoubi envelope of any category \mathcal{C} is a universal way to find a category Kar(\mathcal{C}) that contains \mathcal{C} and for which all idempotents split (i.e. have corresponding subobjects).  Think of vector spaces, for example: a map p \in End(V) such that p^2 = p is a projection.  That projection corresponds to a subspace W \subset V, and W is actually another object in Vect, so that p splits (factors) as V \rightarrow W \hookrightarrow V.  This might not happen in a general \mathcal{C}, but it will in Kar(\mathcal{C}).  This has, for objects, all the pairs (C,p) where p : C \rightarrow C is idempotent (so \mathcal{C} is contained in Kar(\mathcal{C}) as the cases where p=1).  The morphisms f : (C,p) \rightarrow (C',p') are just maps f : C \rightarrow C' with the compatibility condition that p' \circ f = f \circ p = f (essentially, maps between the new subobjects).
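In the Vect example the splitting is easy to exhibit concretely: any rank factorization of an idempotent matrix p gives the two factors V \rightarrow W and W \hookrightarrow V.  A small numpy sketch of this (just an illustration of the general fact, nothing specific to \mathcal{H}'):

```python
import numpy as np

# A (non-orthogonal) idempotent p on V = R^3, i.e. p @ p == p.
p = np.array([[1., 1., 0.],
              [0., 0., 0.],
              [0., 0., 1.]])
assert np.allclose(p @ p, p)

# Split p through its image W: any rank factorization p = incl @ proj works,
# because idempotency then forces proj @ incl to be the identity on W.
U, s, Vt = np.linalg.svd(p)
r = int(np.sum(s > 1e-10))                   # rank of p = dim W
incl = U[:, :r] * s[:r]                      # incl : W -> V
proj = Vt[:r, :]                             # proj : V -> W

assert np.allclose(incl @ proj, p)           # p factors as V -> W -> V
assert np.allclose(proj @ incl, np.eye(r))   # proj . incl = id_W, so the idempotent splits
```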

So which new subobjects are the relevant ones?  They’ll be subobjects of tensor powers of our Q_{\pm}.  First, consider Q_{+^n} = Q_+^{\otimes n}.  Obviously, there’s an action of the symmetric group \mathcal{S}_n on this, so in fact (since we want a \mathbf{k}-linear category), its endomorphisms contain a copy of \mathbf{k}[\mathcal{S}_n], the corresponding group algebra.  This has a number of different projections, but the relevant ones here are the symmetrizer:

e_n = \frac{1}{n!} \sum_{\sigma \in \mathcal{S}_n} \sigma

which wants to be a “projection onto the symmetric subspace” and the antisymmetrizer:

e'_n = \frac{1}{n!} \sum_{\sigma \in \mathcal{S}_n} sign(\sigma) \sigma

which wants to be a “projection onto the antisymmetric subspace” (if it were in a category with the right subobjects). The diagrammatic way to depict this is with horizontal bars: so the new object S^n_+ = (Q_{+^n}, e_n) (the symmetrized subobject of Q_+^{\otimes n}) is a hollow rectangle, labelled by n.  The projection from Q_+^{\otimes n} is drawn with n arrows heading into that box:

The antisymmetrized subobject \Lambda^n_+ = (Q_{+^n},e'_n) is drawn with a black box instead.  There are also S^n_- and \Lambda^n_- defined in the same way (and drawn with downward-pointing arrows).
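It’s easy to check directly that e_n and e'_n really are idempotent (and that they’re orthogonal to each other), say in \mathbb{Q}[\mathcal{S}_3].  Here’s a plain-Python sketch, with group algebra elements stored as dictionaries from permutations to coefficients (again, just an illustration, not the endomorphism algebra of Q_+^{\otimes 3} itself):

```python
from itertools import permutations
from fractions import Fraction

n = 3
Sn = list(permutations(range(n)))                  # the symmetric group S_n, as tuples

def compose(s, t):                                 # (s*t)(i) = s(t(i))
    return tuple(s[t[i]] for i in range(n))

def sign(s):                                       # parity via inversion count
    return (-1) ** sum(1 for i in range(n) for j in range(i + 1, n) if s[i] > s[j])

def mult(x, y):
    """Multiply two elements of Q[S_n], stored as {permutation: coefficient} dicts."""
    out = {}
    for s, a in x.items():
        for t, b in y.items():
            st = compose(s, t)
            out[st] = out.get(st, Fraction(0)) + a * b
    return {s: c for s, c in out.items() if c != 0}

e  = {s: Fraction(1, len(Sn)) for s in Sn}             # the symmetrizer e_n
ep = {s: Fraction(sign(s), len(Sn)) for s in Sn}       # the antisymmetrizer e'_n

assert mult(e, e) == e                                 # e_n is idempotent
assert mult(ep, ep) == ep                              # e'_n is idempotent
assert mult(e, ep) == {}                               # and they kill each other (for n >= 2)
```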

The basic fact – which can be shown by various diagram manipulations – is that S^n_- \otimes \Lambda^m_+ \cong (\Lambda^m_+ \otimes S^n_-) \oplus (\Lambda_+^{m-1} \otimes S^{n-1}_-).  The key thing is that there are maps from the left hand side into each of the terms on the right, and the sum can be shown to be an isomorphism using all the previous relations.  The map into the second term involves a cap that uses up one of the strands from each factor on the left.

There are other idempotents as well – for every partition \lambda of n, there’s a notion of \lambda-symmetric things – but ultimately these boil down to symmetrizing the various parts of the partition.  The main point is that we now have objects in \mathcal{H} = Kar(\mathcal{H}') corresponding to all the elements of H_{\mathbb{Z}}.  The right choice is that the \hat{a}_j (the new generators in this presentation that came from the lowering operators) correspond to the S^j_- (symmetrized products of “lowering” strands), and the b_k correspond to the \Lambda^k_+ (antisymmetrized products of “raising” strands).  Under this correspondence, the isomorphism above decategorifies to exactly the commutation relation [\hat{a}_j,b_k] = b_{k-1}\hat{a}_{j-1} from before.  We also have isomorphisms (i.e. diagrams that are invertible, using the local moves we’re allowed) for all the relations.  This is a categorification of H_{\mathbb{Z}}.

Some Generalities

This diagrammatic calculus is universal enough to be applied to all sorts of settings where there are functors which are two-sided adjoints of one another (by labelling strands with functors, and the regions of the plane with categories they go between).  I like this a lot, since biadjointness of certain functors is essential to the 2-linearization functor \Lambda (see my link above).  In particular, \Lambda uses biadjointness of restriction and induction functors between representation categories of groupoids associated to a groupoid homomorphism (and uses these unit and counit maps to deal with 2-morphisms).  That example comes from the fact that a (finite-dimensional) representation of a finite group(oid) is a functor into Vect, and a group(oid) homomorphism is also just a functor F : H \rightarrow G.  Given such an F, there’s an easy “restriction” F^* : Fun(G,Vect) \rightarrow Fun(H,Vect), that just works by composing with F.  Then in principle there might be two different adjoints Fun(H,Vect) \rightarrow Fun(G,Vect), given by the left and right Kan extension along F.  But these are defined by colimits and limits, which are the same for (finite-dimensional) vector spaces.  So in fact the adjoint is two-sided.
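The most basic piece of that last point is that finite products and coproducts agree in Vect: they’re biproducts.  Here’s a tiny numpy check of the biproduct identities for V \oplus W (my own illustration of the finite case only, which is of course not the whole story about the Kan extensions):

```python
import numpy as np

# In Vect, finite products and coproducts coincide: for V = R^2 and W = R^3,
# the direct sum V ⊕ W = R^5 has inclusions and projections satisfying both
# universal properties at once (the biproduct identities below).
dV, dW = 2, 3
i_V = np.vstack([np.eye(dV), np.zeros((dW, dV))])   # V -> V ⊕ W
i_W = np.vstack([np.zeros((dV, dW)), np.eye(dW)])   # W -> V ⊕ W
p_V = i_V.T                                         # V ⊕ W -> V
p_W = i_W.T                                         # V ⊕ W -> W

assert np.allclose(p_V @ i_V, np.eye(dV))           # p_V . i_V = id_V
assert np.allclose(p_W @ i_W, np.eye(dW))           # p_W . i_W = id_W
assert np.allclose(p_V @ i_W, 0)                    # mixed composites vanish
assert np.allclose(i_V @ p_V + i_W @ p_W, np.eye(dV + dW))   # the biproduct identity
```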

Khovanov’s paper describes and uses exactly this example of biadjointness in a very nice way, albeit in the classical case where we’re just talking about inclusions of finite groups.  That is, given a subgroup H < G, we get a functor Res^G_H : Rep(G) \rightarrow Rep(H), which just considers the obvious action of H on any representation space of G.  It has a biadjoint Ind^G_H : Rep(H) \rightarrow Rep(G), which takes a representation V of H to \mathbf{k}[G] \otimes_{\mathbf{k}[H]} V, a special case of the formula for a Kan extension.  (This formula suggests why it’s also natural to see these as functors between the module categories \mathbf{k}[G]-mod and \mathbf{k}[H]-mod.)  To talk about the Heisenberg algebra in particular, Khovanov considers these functors for all the symmetric group inclusions \mathcal{S}_n < \mathcal{S}_{n+1}.
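At the decategorified level, the adjunction between Ind^G_H and Res^G_H is just Frobenius reciprocity for characters: \langle Ind^G_H \chi, \psi \rangle_G = \langle \chi, Res^G_H \psi \rangle_H.  (Since the characters involved here are real, the same computation also reflects the adjunction taken in the other order.)  Here’s a small self-contained check for the inclusion \mathcal{S}_2 < \mathcal{S}_3, using the usual induced-character formula in plain Python (again, just an illustration):

```python
from itertools import permutations
from fractions import Fraction

# G = S_3 as permutation tuples; H = the copy of S_2 fixing the last point.
G = list(permutations(range(3)))
H = [g for g in G if g[2] == 2]

def compose(s, t):                       # (s*t)(i) = s(t(i))
    return tuple(s[t[i]] for i in range(len(s)))

def inverse(s):
    inv = [0] * len(s)
    for i, si in enumerate(s):
        inv[si] = i
    return tuple(inv)

def sign(s):
    return (-1) ** sum(1 for i in range(len(s)) for j in range(i + 1, len(s)) if s[i] > s[j])

def fixed_points(s):
    return sum(1 for i, si in enumerate(s) if si == i)

# Characters, as functions on group elements.
triv = lambda g: 1
sgn = sign
std = lambda g: fixed_points(g) - 1      # the 2-dimensional irreducible character of S_3

def inner(chi, psi, K):                  # <chi, psi>_K (all characters here are real)
    return Fraction(sum(chi(k) * psi(k) for k in K)) / len(K)

def induce(chi, H, G):
    """Ind_H^G chi, via the usual induced-character formula."""
    def ind_chi(g):
        conj = [compose(compose(inverse(x), g), x) for x in G]
        return Fraction(sum(chi(c) for c in conj if c in H)) / len(H)
    return ind_chi

# Frobenius reciprocity: <Ind chi, psi>_G = <chi, Res psi>_H for all irreducibles.
for chi in (triv, sgn):                  # irreducible characters of H = S_2
    for psi in (triv, sgn, std):         # irreducible characters of G = S_3
        assert inner(induce(chi, H, G), psi, G) == inner(chi, psi, H)
```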

Except for having to break apart the symmetric groupoid as \mathcal{S} = \coprod_n \mathcal{S}_n, this is all you need to categorify the Heisenberg algebra.  In the Span(Gpd) categorification, we pick out the interesting operators as those generated by the - \sqcup \{\star\} map from FinSet_0 to itself, but “really” (i.e. up to equivalence) this is just all the inclusions \mathcal{S}_n < \mathcal{S}_{n+1} taken at once.  However, Khovanov’s approach is nice, because it separates out a lot of what’s going on abstractly and uses a general diagrammatic way to depict all these 2-morphisms (this is explained in the first few pages of Aaron Lauda’s paper on ambidextrous adjoints, too).  The case of restriction and induction is just one example where this calculus applies.

There’s a fair bit more in the paper, but this is probably sufficient to say here.


[1] There are two distinct but related senses of “categorification” of an algebra A here, by the way.  To simplify the point, say we’re talking about a ring R.  The first sense of a categorification of R is a (monoidal, additive) category C with a “valuation” in R that takes \otimes to \times and \oplus to +.  This is described, with plenty of examples, in this paper by Rafael Diaz and Eddy Pariguan.  The other, typical of the Khovanov program, says it is a (monoidal, additive) category C whose Grothendieck ring is K_0(C) = R.  Of course, the second definition implies the first, but not conversely.  The elements of the Grothendieck ring are isomorphism classes of objects in C.  A valuation may identify objects which aren’t isomorphic (or, as in groupoidification, morphisms which aren’t 2-isomorphic).

So a categorification of the first sort could be factored into two steps: first take the Grothendieck ring, then take a quotient to further identify things with the same valuation.  If we’re lucky, there’s a commutative square here: we could first take the category C, find some surjection C \rightarrow C', and then find that K_0(C') = R.  This seems to be the relation between Khovanov’s categorification of H_{\mathbb{Z}} and the one in Span(Gpd). This is the sense in which it seems to be the “universal” answer to the problem.
