This is the second post of my thoughts inspired mainly by reading Fernando Zalamea’s “Synthetic Philosophy of Contemporary Mathematics” (and also a few other sources). The first part is here.

I do have a few issues with the Zalamea book: mainly, as a reader, pinning down what a lot of the sentences really mean can be hard. This might be a combination of perfectly reasonable things: for one, the fact that it's doing philosophy – and not analytic philosophy, which aspires to math-like rigour. (Indeed, one of the many ideas the book throws around is that of "synthetic philosophy", modelled not after formal logic, but sheaf theory, with many local points of view and ways of relating them in areas where they overlap. Intuitively appealing, though it's hard to see how to make it rigorous in the same way.)

Also, many of the ideas are still formative, and the terms used to describe them are sometimes new coinages. Then, too, the combination of philosophical jargon and the fact that the book is translated from Spanish probably contributes. So I give the author the benefit of the doubt on this point and interpret as best I can. Even so, it's still difficult for me to say exactly what some of it is saying. In any case, here I wanted to break down my understanding of some of the themes it points out. There is more there than I have space to deal with here, but these are some major ones.

I had a somewhat similar response to David Corfield's book, "Toward a Philosophy of Real Mathematics" (which Zalamea mentions in a chapter where he acknowledges some authors who have engaged the kind of issues he's interested in). That is, both books do well at raising topics which haven't received much attention, but their main strength lies in identifying areas of actual mathematical activity and describing what they're like (for example, Corfield's chapter on higher category theory, and Zalamea's account of Grothendieck's work). They both feel sort of preliminary, though, in that they're pointing out areas where a lot more people need to study, argue, and generally thrash out various positions on the issues before (at least as far as I can see) one could hope to say the issues raised have actually been dealt with.

Themes

In any case, part of the point is that for a philosophical take on what mathematicians are actually studying, we need to look at some details. In the previous post I outlined the summary (from philosopher Albert Lautman) of the themes of "Elementary" and "Classical" mathematics. Lautman introduced five themes apropos of the "Modern" period – characterizing what was new compared to the "Classical" (say, pre-1900 or so). Zalamea's claim, which seems correct to me, is that all of these themes are still present today, but some new ones have been added.

That is, mathematics is cumulative: all the qualities from previous periods stay important, but as it develops, new aspects of mathematics become visible. Thus, Lautman had five points, which are somewhat detailed, but the stand-out points to my mind include:

The existence of a great many different axiomatic systems and theories, which are not reducible to each other, but are related in various ways. Think of the profusion of different algebraic gadgets, such as groups, rings, quandles, magmas, and so on, each of which has its own particular flavour. Whereas Classical mathematics did a lot with, say, the real number system, the Modern period not only invented a bunch of other number systems and algebras, but also a range of different axiom systems for describing different generalizations. The same could be said in analysis: the work on the Calculus in the Classical period leads into the definition of a metric space and then a topological space in the Modern period, and an increasing profusion of specific classes of them with different properties (think of all the various separation axioms, for example, and the web of implications relating them).

The study of a rich class of examples of different axiomatic systems. Here the standout example to me is the classification of the finite groups, where the "semantics" of the classification is much more complex than the "syntax" of the theory. This reminds me of the game of Go (a.k.a. Wei Chi in China, or Baduk in Korea), which has gained some recent fame because of the AlphaGo victories. The analogy: the rules of the game are very simple, but the actual practice of play is very difficult to master, and the variety of examples of games is huge. This is, essentially, because of a combinatorial explosion, and somewhat the same principle is at work in mathematics: the theory of groups has, essentially, just three axioms on one set with three structures (the unit, the inverse, and the multiplication – a 0-ary, unary, and binary operation respectively), so the theory is quite simple. Yet the classification of all the examples is complicated and full of lots of exceptions (like the sporadic simple groups), to the point that it was only finished in Contemporary times. Similar things could be said about topological spaces.
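To make the contrast between simple syntax and complicated semantics concrete, here is the entire theory of groups in a standard presentation (my own summary, not spelled out in the book):

```latex
% One set G, three operations (0-ary, unary, binary), three axioms.
\begin{align*}
  &\text{operations:}    && e \in G, \qquad (-)^{-1} : G \to G, \qquad \cdot\, : G \times G \to G \\
  &\text{associativity:} && (a \cdot b) \cdot c = a \cdot (b \cdot c) \\
  &\text{identity:}      && e \cdot a = a = a \cdot e \\
  &\text{inverses:}      && a \cdot a^{-1} = e = a^{-1} \cdot a
\end{align*}
```

Everything about finite groups, sporadic simple groups included, is already implicit in these few lines – which is exactly the point about combinatorial explosion.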

A unity of methods beyond apparent variety. An example cited is the connection between the Galois group of a field extension and the group of deck transformations of a certain kind of branched cover of spaces. In either case, the idea is to study a mathematical object by way of its group of automorphisms over some fixed base object – and in particular to classify intermediate objects by way of the subgroups of this big group. Here, the "base object" could refer to either a sub-field (which is a sub-object in the category of fields) or a base space for the cover (which is not – it's a quotient, or more generically the target of a projection morphism). These are conceptually different kinds of things on the face of it, but the mechanism of studying "homomorphisms over" them is similar. In fact, following through the comparison reveals a unification, by considering the fields of functions on the spaces: a covering space then has a function field which is an extension of the function field of the base space, and the two apparently different situations turn out to correspond exactly.
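Schematically, the parallel looks like this (a standard textbook presentation, not quoted from the book):

```latex
\begin{align*}
  \text{field extension } L/K
    &\;\longleftrightarrow\; \text{covering } p : Y \to X \\
  \mathrm{Gal}(L/K) = \{\,\sigma : L \to L \mid \sigma|_K = \mathrm{id}\,\}
    &\;\longleftrightarrow\; \mathrm{Deck}(Y/X) = \{\,f : Y \to Y \mid p \circ f = p\,\} \\
  \text{intermediate fields } K \subseteq M \subseteq L
    &\;\longleftrightarrow\; \text{intermediate covers } Y \to Z \to X
\end{align*}
```

In both columns, subgroups of the big automorphism group classify the intermediate objects – the same mechanism, despite the different categories.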

A “dialectical movement that is a back-and-forth between the One and the Many”. This is one of those jargon-sounding terms (especially the Hegelian-sounding term “dialectic”) and is a bit abstract. The examples given include:

  • The way multiple variants on some idea are thought up, which in turn get unified into a more general theory, which in turn spawns its own variants, and so on. So, as people study different axiom systems for set theory, and their models, this diversity gets unified into the study of the general principles of how such systems all fit together. That is, as “meta-mathematics”, which considers which models satisfy a given theorem, which axioms are required to prove it, etc.
  • The way branches of mathematics (algebra, geometry, analysis, etc.) diverge and develop their own distinct characters, only to be re-unified by mixing them together in new subjects: algebraic geometry, analytic number theory, geometric analysis, etc., until they again seem like parts of a big interrelated whole. Beyond these obvious cases, the supposedly different sub-disciplines develop distinctive ideas, tools, and methods, but then borrow them from each other as fast as they specialize. This back-and-forth between specialization and cross-fertilization is thus an example of "dialectic".

Zalamea suggests that in the Contemporary period, all these themes are still present, but that some new ones have become important as well:

"Structural Impurity of Arithmetic" – this is related to subjects outside my range of experience, like the Weil Conjectures and the Langlands Program, so I don't have much to say about it, except to note that, by way of arithmetic functions like zeta functions, they relate number theory to algebraic curves and geometry, and constitute a huge research program that came into being in the Contemporary period (specifically the late 1960's and early 1970's). (Edward Frenkel described the Langlands program as "a kind of grand unified theory of mathematics" for this among other reasons.)

Geometrization of Mathematics – essentially, the migration of tools and methods originally developed for geometry into other areas, like the way topos theory turns logic into a kind of geometry in which the topology of a space provides the algebra of possible truth values. This feeds into the pervasive use of sheaves in modern mathematics, etc. Or, again, the whole field of noncommutative geometry, in which geometric ideas about a space are re-interpreted in terms of the (necessarily commutative) algebra of functions on that space with pointwise multiplication: differential operators like the Laplacian, for instance, capture metric geometry, while bundles over a space have an interpretation in terms of modules over the algebra. These geometric concepts can then be applied to noncommutative algebras A, thus treating them as if they were spaces.
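The basic "dictionary" here – the standard Gelfand duality and Serre–Swan correspondences, my gloss rather than the book's – pairs spaces with commutative algebras:

```latex
\begin{align*}
  \text{compact Hausdorff space } X
    &\;\longleftrightarrow\; \text{commutative } C^*\text{-algebra } C(X) \\
  \text{continuous map } X \to Y
    &\;\longleftrightarrow\; \text{algebra homomorphism } C(Y) \to C(X) \\
  \text{vector bundle over } X
    &\;\longleftrightarrow\; \text{finitely generated projective } C(X)\text{-module}
\end{align*}
```

Dropping commutativity on the right-hand side is then what lets a noncommutative algebra A play the role of "functions on" a space that doesn't literally exist.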

"Schematization", and becoming detached from foundations: in particular, the idea that what it means to study, for instance, "groups" is best understood in terms of the properties of the category of groups, and that an equivalent category, where the objects have some different construction, is just as good. You see this especially in the world of n-categories: there are many different definitions for the entities being studied, and there's increasingly an attitude that we don't really need to make a specific choice. The "homotopy hypothesis" for \infty-groupoids is an example of this: as long as these form a model of homotopy types, and their collectivity is a "homotopy theory" (with weak equivalences, fibrations, etc.) that's homotopy-equivalent to the corresponding structure you get from another definition, they are in some sense "the same" idea. The subject of Univalent Foundations makes this very explicit.

"Fluxion and Deformation" of the boundaries of some previously fixed subject. "Fluxion" is one of those jargon-sounding words which is suggestive, but I'm not entirely clear if it has a precise meaning. This gets at the whole area of deformation theory, quantization (in all its various guises), and so on. That is, what's happening here is that previously-understood structures which seemed to be discrete come to be understood as points on a continuum. Thus, for instance, we have q-deformation: this starts a bit earlier than the Contemporary period, with the q-integers, which are really polynomials in a variable q, and which just amount to the integers they're deformations of when q has the value 1. It really takes off later with the whole area of q-deformations of algebra – in which such power series take on the role of the base ring. Both of these have been studied by people interested in quantum physics, where the parameter q, or the commutators in such an algebra A, are pegged to the Planck constant \hbar.
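Concretely, the q-integer [n]_q (in the standard notation) is a polynomial that deforms the integer n:

```latex
[n]_q \;=\; \frac{1 - q^n}{1 - q} \;=\; 1 + q + q^2 + \cdots + q^{n-1},
\qquad\text{so that}\qquad [n]_1 = n .
```

For example, [3]_q = 1 + q + q^2, which evaluates to 3 at q = 1: the ordinary integer sits at one point of a continuous family.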

There's also the reflexivity of modern mathematics: theories applied to themselves. This is another one of those cases where it's less clear to me what the term is meant to suggest (though examples given include fixed point theorems and classification theorems).

There's a list near the beginning of the book of notable mathematicians whose work illustrates these themes.

Zalamea synthesizes these into three big trends described with newly coined terms: "eidal", "quiddital", and "archaeal" mathematics. He recognizes these are just convenient rules of thumb for characterizing broad aspects of contemporary research, rather than rigorously definable ideas or subfields. This is a part of the book which I find more opaque than the rest – but essentially the distinction seems to be as follows.

Roughly, eidal mathematics (from the Greek eidos or “idea”) seems to describe the kind that involves moving toward the abstract, and linking apparently unrelated fields of study. One big example referenced here is the Langlands program, which is a bunch of conjectures connecting number theory to geometry. Also under this umbrella he references category theory, especially via Lawvere, which subsumes many different fields into a common framework – each being the study of some particular category such as Top, perhaps by relating it to some other category (such as, in algebraic topology, Grp).

The new term quiddital mathematics (from Latin quidditas, “what exists” or literally “whatness”) appears to refer to the sort which is intimately connected to physics. The way ideas that originate in physics have driven mathematics isn’t totally new: Calculus is a classical example. But more recently, the study of operator algebras was driven by quantum mechanics, index theory which links differential operators and topology was driven by quantum field theory, and there’s a whole plethora of mathematics that has grown out of String Theory, CFT, TQFT, and so forth – which may or may not turn out to be real physics, but were certainly inspired by theorizing about it. And, while it hasn’t had as deep an effect on pure mathematics, as far as I know, I imagine this category would include those areas of study that arose out of other applied studies, such as the theory of networks or the dynamics of large complex systems.

The third new coinage, archaeal mathematics (from arche, or "origin", also giving the word "archetype") is another one whose meaning is harder for me to pin down, because the explanation is quite abstract. In the structure of the explanation, it seems to play a role between the first two: something that mediates between very abstract notions and their concrete realizations. One essential idea here is the finding of "invariants", though what this really seems to mean is more like finding a universal structure of a given type. A simple case might be that between the axioms of groups, and particular examples that show up in practice, we might have various free groups – they're concrete but pure examples of the theory, and other examples come from imposing more relations on them.
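The sense in which free groups are "pure but concrete" can be stated as a universal property – the standard one, offered here as my own gloss on "archetype", not a formulation from the book:

```latex
% F(S) = free group on a set S, with inclusion of generators \iota : S \to F(S).
% Any map of the generators into any group G extends to a unique homomorphism:
\forall\, f : S \to G \quad \exists!\; \tilde{f} : F(S) \to G
\quad \text{such that} \quad \tilde{f} \circ \iota = f .
```

Every group is then a quotient of some free group – "imposing more relations", in the language above – which is what makes the free ones mediate between the bare axioms and the particular examples.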

I'm not entirely sure about these three categories, but I do think there's a good point here: there's a move away from specifics and toward general principles. The huge repertoire of "contemporary" mathematics can be sort of overwhelming, and highly detailed. The five themes listed by Lautman, or Zalamea's additional five, are an attempt to find trends, or deal descriptively with that repertoire. But it's still, in some ways, a taxonomy: a list of features. Reducing the scheme to these three, whether this particular system makes sense to you or not, is more philosophical: rather than giving a taxonomy, it's an effort to find underlying reasons why these themes and not others are generating the mathematics we're doing. So, while I'm not completely convinced about this system as an account of what contemporary mathematics is about, I do find that thinking about this question sheds light on the mass of current math.

Some Thoughts

In particular, a question that I wonder about, which a project like this might help answer, is whether the mathematics we're studying today is inevitable. If, as the historical account suggests, mathematics is a time-bound process, we might well ask whether it could have gone differently. Would we expect, say, extraterrestrials, or artificial intelligences, or even human beings in isolated cultures, to discover essentially the same things as ourselves? That is, to what extent is the mathematics we've got culturally dependent, and to what extent inevitable?

In Part I, I made an analogy between mathematics and biology, which was mainly meant to suggest why a philosophy of mathematics that goes beyond foundational questions – like the ontology of what mathematical objects are, and the logic of how proof works – is important. That is to say, mathematical questions themselves are worth studying, to see their structure, what kinds of issues they are asking about (as distinct from issues they raise by their mere existence), and so on. The analogy with biology had another face: namely, that what you discover when you ask about the substance of what mathematics looks at is that it evolves over time – in particular, that it’s cumulative. The division of mathematical practice into periods that Zalamea describes in the book (culminating in “Contemporary” mathematics, the current areas of interest) may be arbitrary, but it conveys this progression.

This biological analogy is not in the book, though I doubt it’s original to me. However, it is one that occurs to me when considering the very historically-grounded argument that is there. It’s reminiscent, to my mind, of the question of whether mathematics is “invented or discovered”. We could equally well ask whether evolution “invents” or “discovers” its products. That is, one way of talking about evolution pictures the forms of living things as pre-existing possibilities in some “fitness landscape”, and the historical process of evolving amounts to a walk across the landscape, finding local optima. Increases in the “height” of the fitness function lead, more or less by definition, to higher rates of reproduction, and therefore get reinforced, and so we end up in particular regions of the landscape.

This is a contentious – or at least very simplified – picture, since some would deny that the space of all possibilities could be specified in advance (for example, Lee Smolin and Stuart Kauffman have argued for this view.) But suppose for the moment it’s the case and let’s complete the analogy: we could imagine mathematics, similarly, as a pre-existing space of possibilities, which is explored over time. What corresponds to the “fitness” function is, presumably, whatever it is about a piece of mathematics that makes it interesting or uninteresting, and ensures that people continue to develop it.

I don't want to get too hung up on this analogy. One of the large-scale features Zalamea finds in contemporary mathematics is precisely one that makes it different from evolution in biology. Namely, while there is a tendency to diversification (the way evolution leads to speciation and an increase in the diversity of species over time), there is also a tendency for ideas in one area of mathematics to find application in another – as if living things had a tendency to observe each other and copy each others' innovations. Evolution doesn't work that way, and the reason why not has to do with specifics of exactly how living things evolve: sexual reproduction, and the fact that most organisms no longer transfer genes horizontally across species, but only vertically across generations. The pattern Zalamea points out suggests that, whatever method mathematicians are using to explore the landscape of possible mathematics, it has some very different features. One of these seems to be that it rewards results or concepts in one sub-discipline for which it's easy to find analogies and applications in many different areas. This tendency works against what might otherwise be a trend toward rampant diversification.

Still, given this historical outlook, one high-level question would be to try to describe what features make a piece of mathematics more rewarding and encourage its development. We would then expect that over time, the theorems and frameworks that get developed will have more of those properties. This would be for reasons that aren’t so much intrinsic to the nature of mathematics as for historical reasons. Then again, if we had a satisfactory high-level account of what current mathematics is about – something like what the three-way division into eidal, quiddital, and archaeal mathematics is aiming at – that would give us a way to ask whether only certain themes, and only certain subjects and theorems, could really succeed.

I’m not sure how much this gains us within mathematics, but it might tell us how we ought to regard the mathematics we have.