Would a modern Prolog (those things + automatic backtracking) or Erlang (those things + cheap actors) have been seen as even better for such purposes, if they had existed at the time?
Having looked at the source for SHRDLU (which I found very hard to understand without more context), I think a lot of it was written in "Micro-Planner", a Prolog-like precursor (a subset of Carl Hewitt's Planner, implemented in Lisp) rather than an early version of Prolog itself.
I think it was more about the way the lambda calculus encodes ideas from category theory and so on. GOFAI was all about using logic to achieve AI, so languages like LISP (and, of course, logic programming languages like Prolog) were a natural choice for that sort of thing.
No, Eugenio Moggi realized that you can use monads to represent IO and other effects in languages like Haskell, and that was in the paper "Computational lambda-calculus and monads" (1989). The first person I can think of to make a connection between categories and the lambda calculus is Joachim Lambek, in his paper "From lambda calculus to Cartesian closed categories" (1980).
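For anyone reading along who hasn't seen what Moggi's insight looks like in practice, here's a minimal Haskell sketch, just standard Prelude names, nothing from the paper itself: effectful computations are ordinary values, and monadic bind sequences them.

```haskell
-- Minimal sketch: an effectful computation is a first-class value of
-- type `IO a`, and `>>` / `>>=` sequence such values. This is the
-- programming-language face of the idea Moggi's paper made precise.
main :: IO ()
main =
  putStrLn "What's your name?" >>
  getLine >>= \name ->
  putStrLn ("Hello, " ++ name)
```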
Category theory, the lambda calculus and the principle of compositionality have been employed in linguistics since the 1930s, for instance in categorial grammar. I don't think we're talking about the same thing.
What kind of Category Theory are you talking about? [Category Theory](https://plato.stanford.edu/entries/category-theory/) in the mathematical sense, and in the sense that it can be used to describe Lambda Calculi, appeared in 1945, in the field of Algebraic Topology and Homological Algebra. I'd be really surprised if it was employed in linguistics in the 1930s.
I guess you're right and I'm wrong; the lambda calculus is not really my specialty. I know for sure that it's used in semantic and discourse analysis in computational linguistics [1] (which, again, are not my specialty) but, admittedly, I am a bit fuzzy on the dates.
What I had in mind in particular was the application of concepts from formal language analysis to natural language processing, which wikipedia confirms started in the 1970s with the work of Richard Montague [2]. Somehow I had convinced myself this work hails from much earlier, around the time of Chomsky's Syntactic Structures, but that again is from the late '50s. Anyway, this kind of analysis does make use of the lambda calculus and category theory _today_, but perhaps Montague's early work didn't specifically do so.
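To make the lambda-calculus connection concrete, here's a toy Montague-style fragment in Haskell. The two-entity model and the word list are invented for illustration; this is a sketch of the style of analysis, not anything from Montague's papers.

```haskell
-- Toy Montague-style semantics: word meanings are typed functions and
-- sentence meanings are built by function application, which is where
-- the lambda calculus enters. The model is entirely made up.
data Entity = John | Mary deriving (Eq, Show)

-- An intransitive verb denotes a predicate on entities:
sleeps :: Entity -> Bool
sleeps e = e == John            -- in this tiny model, only John sleeps

-- A quantified NP denotes a function from predicates to truth values:
everyone :: (Entity -> Bool) -> Bool
everyone p = all p [John, Mary]

-- "John sleeps" and "everyone sleeps", composed by application:
main :: IO ()
main = do
  print (sleeps John)           -- True
  print (everyone sleeps)       -- False (Mary doesn't sleep here)
```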
Like I say, the lambda calculus is not my specialty, and I'm always a bit surprised when I realise how far back Church's original work goes, considering how long it took for the relevant concepts to percolate down to the mainstream of programming languages. That should help explain my confusion with the dates.
As a side note, a lot of what I know about semantic analysis is just "lore": things I've picked up here and there while looking at structure, which is usually my main dish. Sometimes I struggle to point out how I know things I'm sure I know :/
Edit: I think my reply to the original question, about why Lisp was used in computational linguistics, is mostly correct, even if I got my dates wrong: using formal logic tools to analyse meaning in natural languages was popular around the time Winograd wrote his thesis, and he would have leaned heavily on that work for SHRDLU. I'm not getting the dates wrong here, either: Alain Colmerauer's work that led to Definite Clause Grammars goes back to at least 1972, and he didn't just come up with the idea out of nowhere; there was a lot of activity in logics and grammars around that time.
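For anyone unfamiliar with DCGs: they compile grammar rules into clauses over difference lists, so a grammar is directly an executable program. Here's a rough Haskell transliteration of that encoding; the toy grammar is invented, and this is a sketch of the idea, not Colmerauer's actual system.

```haskell
-- DCG rules compile into relations over "difference lists": each
-- nonterminal maps an input token list to the possible remainders.
-- Returning a list of remainders gives us backtracking for free.
type Parser = [String] -> [[String]]

terminal :: String -> Parser
terminal w (t:ts) | w == t = [ts]
terminal _ _               = []

-- sentence --> noun_phrase, verb_phrase.
sentence :: Parser
sentence ws = [ws'' | ws' <- nounPhrase ws, ws'' <- verbPhrase ws']

nounPhrase :: Parser
nounPhrase = terminal "john"

verbPhrase :: Parser
verbPhrase = terminal "sleeps"

-- A string is in the language if some derivation consumes all input:
parses :: [String] -> Bool
parses ws = [] `elem` sentence ws

main :: IO ()
main = print (parses (words "john sleeps"))   -- True
```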