Applied category theory in chemistry, computing, and social networks [pdf] (ams.org)
135 points by larve on Nov 1, 2022 | 58 comments



If anyone is interested in Applied Category Theory, definitely check out the Topos Institute in Berkeley [1]. They do weekly seminars that they post on YouTube and a really intriguing blog. I must say that David Spivak is a treasure to hear speak. Seven Sketches in Compositionality [2] was my introduction to Category Theory (written by Spivak and Brendan Fong, another member of Topos), and it really sold the idea of Category Theory as a field that's not just a mathematical meta-language but also a field that can stand on its own. I recommend it over Mac Lane's CWM if you're not a mathematician.

[1] https://topos.site/ [2] https://arxiv.org/abs/1803.05316


The nLab https://ncatlab.org/nlab/show/HomePage is a useful reference for category theory terminology and results.


Agreed on the institute and Spivak. But CWM is, er, for working mathematicians. Leinster's Basic Category Theory or Awodey's Category Theory are more unassuming.


Yes, that's why I recommend starting out with 7 Sketches.


Right, my mistake.


Anyone interested in category theory might be interested in the brand new book The Joy of Abstraction: An Exploration of Math, Category Theory, and Life by Eugenia Cheng.

https://www.amazon.com/Joy-Abstraction-Exploration-Category-...

Then there's of course the classic introduction Conceptual Mathematics by Lawvere and Schanuel.

https://www.amazon.com/Conceptual-Mathematics-First-Introduc...


"A goal of the ACT community is to bridge the gap between theorists using category-theoretic modeling tools and those who want to use the models to say something useful and true about the world"

This is key. Sure, category-theoretic abstractions are useful as a common language, and lots of theorems are applicable in many domains.

The hard thing is understanding all this stuff and mapping it into your domain.


Has anyone actually seen a real world problem being solved via Category Theory? As in, "in the wild", instead of being mentioned on a website dedicated to Category Theory?

To me it seems a lot like the crypto and web 3.0 promises of "soon there will be all of these amazing applications"... and then there aren't any.

Also, all of the examples of CT I've seen are applicable only to pure functional languages like Haskell, which seems to limit their practical utility.


All these concepts actually apply to any programming language. Haskell often lets you express them very concisely and compose them elegantly, but the fact that, say, lists can be viewed as a functor (i.e., you can take a function from int to string, apply it to a list of ints, and get back a list of strings) is independent of the language. Knowing that fact does (for a certain type of programmer at least) make it easier to combine different pieces of code.

I usually program in C++ or PHP, but I do think in monads / functors / applicatives quite often. I then use whatever features of the language I have at my disposal to “encode” my thinking at an appropriate level of abstraction. What level of abstraction that is depends not only on the language, but also on how efficient and “communicable” I want my code to be. I don’t mind expanding the `map` of a functor into an explicit for loop if my junior colleague can more easily deal with it.

In fact, at this point, with functional programming patterns having become so pervasive, most people including beginners have no problem with the concept of functors, and “mapping” functions on things. Monads still have that aura around them, but really, any time you write a line of code that reuses the result of a previous line of code and checks for something, you probably have a monad hiding in there, and “exposing” it might make it easier to factor out your code or find an elegant API.
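The list-as-functor point above can be sketched in plain Python, no Haskell required (function names here are illustrative, not from any library):

```python
def stars(n: int) -> str:
    """An ordinary function from int to str."""
    return "*" * n

def fmap(f, xs):
    """The list functor: lift an int -> str function
    to a list[int] -> list[str] function."""
    return [f(x) for x in xs]

result = fmap(stars, [1, 2, 3])
# result == ["*", "**", "***"]
```

The same structure exists in C++, PHP, or Java; Haskell just has first-class vocabulary for it.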


There was a talk a few years back by Conal Elliott about a concise implementation of reverse mode automatic differentiation through tweaking the definition of derivative to make it composable and then dualizing[0].

There was another talk where IIRC Edward Kmett dualizes divide and conquer algorithms to derive a generic bucket sort that works on general algebraic data types[1].

These are generic concepts. Haskell is just convenient to express them in. In some sense, obeying functor or monad laws just means you'll have a "clean", predictable design, for example. The language doesn't need special support; you just need to check whether your code is composable in a sensible way.

[0] https://youtube.com/watch?v=17gfCTnw6uE

[1] https://youtube.com/watch?v=eXDJ5Jcbgk8


Heh... I just realised I've already seen that presentation!


It is used in some formalizations of quantum field theory. Quantum field theory is used to make testable predictions about physics, and historically, giving scientific theories a rigorous mathematical backing has been useful. So that could be an example.


The classification problem of sorting humans into two groups:

- the group of people who don't see the relevance of mathematics "in the real world", and expect any a-ha moment to have come to them by magic already, and;

- the other group.


Idk, as a mathematician, I still don’t really see the usefulness of category theory “in the real world” (or even in a good amount of math)


Which math is most useful?

- Differential Equations
- Statistics / Stochastic Processes
- Discrete Math / Algorithms / Logic Solvers

You do not need CT for all this stuff.

Grothendieck needed CT; is Algebraic Geometry useful?

I've read the paper. They list many useful applications. CT doesn't seem really necessary for each of them but it looks like CT is (could be) helpful.


To properly talk about compositionality in differential equations you absolutely need category theory. Likewise I firmly believe that PDE and numerical mathematics actually lack a good foundation in category theory, not that they don’t need one.

It is just very hard to create non-trivial general theories in such “applied” fields; Functional Analysis alone is not very exciting. See for example the work of Hairer (stochastic differential equations) or Costello (QFT).


Grothendieck stacks in QFT, did not know about it.

This is wild stuff:

https://people.math.umass.edu/~mirkovic/0.SEMINARS/1.QFT/C.C...

Funny names: a Leonard Cohen writing about a Costello (Kevin Costello, not Elvis Costello).


From the linked article:

> Statebox is blending an ACT approach to Petri nets together with blockchain technology to develop a technology stack based on a visual programming language. In addition, they have built a software engine for compositional game-theoretic modeling, a finite state machine oracle.

Hmm.

More seriously:

> Structured cospans of Petri nets were implemented in the software package Julia to develop an SIR model that is compositional in the sense that various cities can each have their own model that can be connected together to form a composite SIR model

SIR is the most common way of modelling infectious diseases. Having said that though - there are plenty of SIR models that don't require category theory.



I'm a bit of a simpleton. What's the real world problem that ZX-calculus solves?

edit: This question is genuine. There are only abstract concepts included in the video and wiki, as deep as I can see.


this video may help

https://youtu.be/iC-KVdB8pf0


Sorry, but what real world problem does a quantum circuit solve?

My interpretation of "solving a real world problem" means that it can't be infinitely abstract, and must be a relatively good solution compared to other options. I only see, from my dumb perspective, extreme abstraction.

I'm not being combative. I just don't understand.


There's no such thing as "infinitely abstract". All abstractions eventually become real-world given enough time. It's only a question of how far ahead of the curve they are and how long it takes.


> There's no such thing as "infinitely abstract".

In time, that's correct. At the moment, I disagree.


Is this what Grothendieck specialized in? I remember reading an article about how it was applied by the NSA to break encryption. I tried reading then about how that was done, but at most it seems like it's one abstraction layer above the model itself, so you can step around things, maybe.


Which of the results mentioned in the article are proven by applying some result of category theory to another field, and which are just stated using category theory (with the real mathematical insight lying elsewhere)?


If you translate everything to being stated in terms of category theory, it clarifies how it all works and makes it easier for people to go from one area to another. It is a bit like what Grothendieck did with abstract algebra. Imagine if every sub-field of mathematics had their own word for group, and that's something like how category theory fans see the present world.

There's no such thing as a theorem that can only be proven with category theory, because whatever specific thing you're applying category theory to already has all of the properties that would make the theorem true without category theory being involved - if it didn't, you wouldn't be able to treat it as if it was a category. Categories are not the kind of mathematical invention that, for example, ordinary differential equations or matrices are.


You could say the same thing for any abstract mathematical construct, e.g. metric spaces. There's no theorem (that's not itself about metric spaces) that can only be proven using metric spaces. It's always possible to make new abstractions (simply take the antecedent of any theorem), so whether you adopt a new abstraction is a matter of taste and fashion.

In Grothendieck's early work on abelian categories, categories are themselves the objects of mathematical study, and nontrivial results about them are established.


> If you translate everything to being stated in terms of category theory, it …

… makes it incomprehensible to experts in the field and laymen alike!


I agree: "There's no such thing as a theorem that can only be proven with category theory".

But sometimes it feels like Category Theory is rebuilding the world and there are genuine theorems.

Epilogue: Theorems in Category Theory

https://emilyriehl.github.io/files/context.pdf


Baez's blog is basically mathematical crack cocaine


Only if you're an algebraist. There's a lot more to math than algebra.


He moved recently from Twitter to mathstodon.xyz


What are the applications of the category theory being used in computing besides typing?


Is it just me, or does "applied category theory" never use anything from category theory other than boxes being connected by arrows?


I can see why you're saying that, but the use of commutative diagrams communicates the structure between the functions and objects of interest, and it is this perspective that is the core idea of category theory. So I'd argue that the diagrams are a result of communicating a category-theoretic model rather than the end result itself, and therefore have much deeper meaning than just "boxes being connected by arrows".


If you’re talking about string diagrams, then these are actually just 2D notation for very precise category theory. Manipulating the diagrams is equivalent to proving things in the category, provided you’ve shown soundness and completeness.


Arrows and boxes is pretty much what category theory is about. You have objects connected by morphisms, and all further structure and concepts are derived from that.


Yeah, but you can only be said to be applying category theory if you use some of those structures or concepts.


I am asking myself that question regularly, as I'm trying to get more comfortable with "real" category theory. My background is programming, and I've become familiar with a lot of the applied CT (functors, monads, monoids, applicatives, ...).

I think the point of category theory, especially applied category theory, is recognizing what can be further abstracted as an arrow and an object. This turns something that originally is complex and has no apparent structure into a simple dot and arrow, again. Then rinse and repeat.

For example, you can take a for loop that applies a function to a list and then creates a new list. You can't "easily" add another for loop that then applies a second transformation. But if you recognize the functor from the category of types to the category of types, then you can formulate your 2 composed for loops as `list.map(f1 | f2)`, and everything is "trivial" arrows and objects again.

Turn that into a function that returns a list, and you want to concatenate the result. "Annoying" to write, you have to flatten your list. Recognize the monad, back to a simple composition of two morphisms, even though the morphisms encapsulate a lot.

Learning and applying CT I think is recognizing which things can be conceptualized as a morphism, which then magically makes things appear "simple." Yet it is only simple if you have done that recognition work, and more advanced concepts in CT are there to assist you.
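The two refactorings described above (fusing maps via the functor, and flattening via the monad) can be sketched in Python; all names are illustrative:

```python
def compose(f, g):
    """Morphism composition: apply f, then g."""
    return lambda x: g(f(x))

xs = [1, 2, 3]

# Two separate loops...
step1 = [x + 1 for x in xs]
step2 = [str(y) for y in step1]

# ...become one map, by the functor law map(g) . map(f) == map(g . f)
fused = list(map(compose(lambda x: x + 1, str), xs))
assert fused == step2

# A function returning a list: map-then-flatten is the monadic bind (flatMap),
# and the "annoying" flattening step is the monad's join.
def flat_map(f, xs):
    return [y for x in xs for y in f(x)]

doubled = flat_map(lambda x: [x, x], [1, 2])
# doubled == [1, 1, 2, 2]
```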


> For example, you can take a for loop that applies a function to a list and then creates a new list. You can't "easily" add another for loop that then applies a second transformation.

This is quite unconvincing.

Easy solution 1: add a second loop which takes the first loop's output and transforms it with f2.

Easy solution 2: realise you are doing one transform after another and replace out.add(f1(x)) with out.add(f2(f1(x))).


How to derive solutions like solution 2 is what category theory helps me with.

It sounds glaringly obvious in the cases where the abstraction is well understood. Mapping on lists, on trees, on Maybe, on Result is all well known at this point, because most APIs offer it. Less obvious is the fact that this applies to, say, a serial interface, sensor readings, futures.

If you can then express the concept of a functor in the programming language itself, you can start writing code that only relies on the fact that something is a functor. And then pass it a list, a serial interface, futures, futures of serial interfaces, etc…

For example, you could potentially find use in expressing your above code as:

out = makeMapped(out, [f1, f2])

Now I can just pile 18 more functions defined in a config file without having to write it out.

out = makeMapped(out, configFile.createFunctions())

I’m not saying you need category theory for this, the same way you don’t need algebra every time you add two numbers, but it makes it easier to see these patterns.
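A minimal sketch of what a `makeMapped` helper could look like in Python (the name and design are hypothetical, just to make the idea concrete):

```python
def make_mapped(source, functions):
    """Wrap any iterable (a list, sensor readings, lines from a
    serial port, ...) so each item is passed through every
    function in `functions`, in order."""
    def pipeline(x):
        for f in functions:
            x = f(x)
        return x
    return (pipeline(x) for x in source)

out = make_mapped([1, 2, 3], [lambda x: x + 1, str])
# list(out) == ["2", "3", "4"]
```

Because it only relies on "this thing can be mapped over," the same helper works unchanged whether the source is a list or a stream.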

I have trouble finding good “practical” examples, because you can either use functors, which every experienced programmer already knows so intuitively that the concept is ingrained, or monads, which people often react to viscerally.

In practice, I often just take the time to write out monads without even mentioning the word or concept, and refactor code so that it leverages that underlying structure, and that usually makes things become apparent. Monads are a really good structure to cut down on absurd complexity for anything dealing with state and concurrency, and turn it into an elegant problem.sequence(step1, step2, step3, step4) formulation.
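One rough sketch of that sequencing style in Python, using a hypothetical Result type that short-circuits on the first failure (the names and API are illustrative, not from any library):

```python
class Result:
    """Minimal Result monad: carries a value or an error,
    and stops the chain at the first error."""
    def __init__(self, value=None, error=None):
        self.value, self.error = value, error

    def then(self, step):
        """Monadic bind: run the next step only if no error so far."""
        if self.error is not None:
            return self
        return step(self.value)

def parse(s):
    return Result(value=int(s)) if s.isdigit() else Result(error="not a number")

def halve(n):
    return Result(value=n // 2) if n % 2 == 0 else Result(error="odd")

ok = Result(value="42").then(parse).then(halve)
# ok.value == 21

bad = Result(value="41").then(parse).then(halve)
# bad.error == "odd"
```

The error handling lives in `then` once, instead of being repeated between every pair of steps.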


How does that differ from plain graph theory (sorry if a naive question)?


Not a dumb question at all!

The short answer is that a category also satisfies two rules: every object has an arrow to itself (identity), and any two arrows (a->b) and (b->c) compose to an arrow (a->c), with composition being associative.

Other abstractions such as functors (mappings between categories) are built upon this relatively simple foundation. You can also think about categories of categories (they can be considered an object, after all!), and so on and so forth.
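Those rules can be checked directly for the category whose objects are types and whose arrows are ordinary functions (a Python sketch, purely illustrative):

```python
def identity(x):
    return x

def compose(f, g):
    """The arrow a -> c obtained from f: a -> b and g: b -> c."""
    return lambda x: g(f(x))

f = lambda n: n + 1   # int -> int
g = lambda n: str(n)  # int -> str
h = lambda s: s + "!" # str -> str

# identity arrows exist and are neutral for composition
assert compose(identity, f)(3) == f(3) == compose(f, identity)(3)

# composition is associative
assert compose(compose(f, g), h)(3) == compose(f, compose(g, h))(3)
```

A directed graph only has to provide the arrows; it owes you no identities and no composites.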

Some complain about the generality of Category Theory ('It is a theory about everything and nothing!'), but for some, that very generality is what makes it compelling to study.


I think it’s not a naive question at all. To my naive understanding, categories need to have a composition property (if there is a morphism from a to b, and one from b to c, then there is a morphism from a to c created by composing them), which allows one to express “richer” structures than just directed graphs. Now, for the gory details, you better ask a mathematician :]


They are incorrect. While diagrams are indeed (a part of) the language of Category Theory, the theory is certainly not about the language.


>other than boxes being connected by arrows

Well, that's because that's basically it, lol.

But I get your sentiment and, sure, there's a lot of things where this could be applied and that hasn't happened already. It's definitely a field with a prolific future, though. I've personally applied many ideas I got from there into my smallish ventures with good results.


Well, not exactly. Concepts up to functions and functors are part of almost every big enough theory, like set theory or even just the natural numbers, and are used everywhere; if arrows are only used for describing that, I wouldn't say it is an application of category theory.

The point of category theory only starts when you go higher than that and deal with more abstract and generalizable things, e.g. monads (which, again, are used in other theories), bigger infinities, orderings of infinities, etc.


But that's not "basically it," there are lots of high-flying theorems that make it an entire area of mathematics.


Well, then I have no idea why you asked your previous question. ¯\_(ツ)_/¯


Because applied category theory stops at translating everything into boxes and arrows. There's no "applied Yoneda lemma."


I'm not sure that's true. This is a surface / introductory article, but if you go deeper into say, categorical databases, modeling of attacks (https://arxiv.org/pdf/2103.00044.pdf), there sure is some wilder CT at work. I'm leaving out programming as applied category theory because there is no shortage of refined concepts there (type theory, all the haskell libraries, many things I don't even comprehend).

But to me, the value in ACT is actually learning how "simple" some of the other fields could be, because it allows me to communicate with non-software engineers, and actually have them contribute to software.

I did a lot of work with state machines as composable abstractions for concurrency and flow control, and mechanical engineers were able to find some really subtle say, race conditions, just by pointing out "hey, isn't there an arrow from this to this, if the limit switch catches early?"


In computing you can apply Yoneda for optimizations.

"Each of the steps is fairly compelling, except perhaps the second one, which rests on the Yoneda Lemma"

see "Kan Extensions for Program Optimisation"

https://www.cs.ox.ac.uk/ralf.hinze/Kan.pdf
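A classic concrete instance of this flavor of optimization is the difference-list (Cayley) representation, which the Kan-extensions material generalizes; a rough Python sketch:

```python
# Represent a list by the function "prepend me": xs becomes lambda ys: xs + ys.
def rep(xs):
    return lambda ys: xs + ys

def unrep(f):
    """Recover the plain list by applying to the empty list."""
    return f([])

def compose(f, g):
    """Appending two representations is just function composition, O(1)."""
    return lambda ys: f(g(ys))

# Left-nested appends, which are quadratic on plain lists,
# reassociate for free in the function representation.
r = compose(compose(rep([1]), rep([2])), rep([3]))
# unrep(r) == [1, 2, 3]
```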


lol, you picked the one thing category theory actually has examples of not being abstract nonsense for

https://mathoverflow.net/questions/12511/what-is-yonedas-lem...


It's the actual process of translating everything into boxes and arrows that's the core of ACT.


So, if I already model systems via boxes and arrows, and use things like petri nets, then am I good, or do I still need to learn category theory?


If that works for you, that’s of course perfect. I find that the study of CT has very much clarified how a lot of diagrams I see in other domains (business processes, supply chain modelling) directly relate to my understanding of diagrams.

I think it is a property of human cognition that if we draw an arrow from a to b, and one from b to c, that there is some kind of relation between a to c as well.

What this means concretely is that if I can draw my problem, in my domain, as a diagram with boxes and arrows, there is a reasonable chance that you can actually understand it, and actually contribute to it. Similarly, a diagram from your problem domain is often something I can relate to my design patterns and software architecture foundations, and I might be able to ask quite pointed questions when trying to, say, build a mathematical model or build some supply chain software. Questions like “ok, but, is there an arrow from here to here?” and the answer “oh yes, that happens when we actually replenish the inventory by doing X, I forgot about that part”.

Another thing I found people do with CT is that they use it as a toolbox of abstractions that they can try to fit to a problem. Say, “oh, is there maybe a functor from petri nets to databases? ok, there is, but it’s kind of useless. Does this formulation actually work as a monad?” This is useful for finding elegant APIs to problems, say things like optics (lenses, etc…).


I never ran into Petri nets in jobs; I always thought they were still stuck at educational use. Do you get good results using them?



