The deification of Alan Turing (bellmar.medium.com)
67 points by mbellotti on March 9, 2022 | 36 comments



> In terms of actual influence the much more obvious choice for father of computer science is Charles Babbage, whose work was both influential during his lifetime and who pioneers like von Neumann and Aiken referenced in their development of early working computers.

This is a strange thing to say.

George Dyson in Turing's Cathedral wrote:

> “Von Neumann was well aware of the fundamental importance of Turing’s paper of 1936 ‘On computable numbers …’ which describes in principle the ‘Universal Computer’ of which every modern computer (perhaps not ENIAC as first completed but certainly all later ones) is a realization,” Stanley Frankel explains. “Von Neumann introduced me to that paper and at his urging I studied it with care.… He firmly emphasized to me, and to others I am sure, that the fundamental conception is owing to Turing.”

Re-quoting Randell, On Alan Turing and the Origins of Digital Computers [1].

Dyson also came and gave a talk at Google as part of the book tour [2], and the talk consisted of him telling stories about the book and showing pictures of interesting artifacts. At 18:51 he shows a picture of Turing's paper as stored in the IAS library, and this is what he has to say:

> And so it irritated me how all these historians still are arguing about "What did Von Neumann take from Turing? Why didn't he give Turing credit? Did he read Turing's paper?" People say "oh no he didn't read Turing's paper." So I decided I would go look in Von Neumann's library.

> Turing's paper was published in the Proceedings of the London Mathematical Society. That's the volume it was in. When you go to the Institute [for Advanced Study] library, all the volumes are absolutely untouched, mint, hardly been opened except volume 42. And it's clearly, you know, it's been read so many times it's completely fallen apart. They didn't have Xerox machines so they just... Yeah, so the engineers I spoke with, the few that were left said "Yeah... we all had to read that paper. That's what we were doing was building a Turing machine." So I think that answers the question.

[1] https://eprints.ncl.ac.uk/159999

[2] https://youtu.be/_FibuHyIHnU?t=1131


Thank you. Babbage > Turing is like arguing ghosts > UFOs.

It's ridiculously different.


This article just goes to show that, when people hear about "computer science", they think that it only refers to actual hardware computers. Case in point, the reference about "many [...] innovations around computer architecture".

Computer science is about the mechanized manipulation of symbols. Turing proposed a fundamental model to reason about this. He wasn't trying to glorify tabulating machines into mathematical ones; rather, he was solving the Entscheidungsproblem, like a lot of other people. In the circle of logicians, philosophers and mathematicians that cared for this issue (the proto-computer scientists, in a sense), he has never been "obscure".


To expand on your point here: there are generally two inroads into computer science from other disciplines. You can view it as an extension of electrical engineering and the actual development of computing hardware, or you can view it as a development of an "applied math" curriculum.

From that viewpoint, the article here is mostly complaining that Turing has had little impact on the development of computer science if you only look at it from the first perspective. But the ACM in particular generally hews more towards the second perspective: of the ~50 Turing Awards, at best a dozen aren't heavily rooted in a view from the second perspective.

In other words, this amounts to a complaint that the organization honoring the people who make contributions to the math-y side of computer science names its award after one of the most important math-y contributors to computer science, as opposed to one of the people who actually built computers.


But that first viewpoint is called Computer Engineering, and is distinct from Computer Science.


As distinct as two neighboring disciplines can be. (not very much IMO)


Good point. Of course, one may retort that Turing also contributed to the first perspective. Sure, his work at Bletchley Park was secret for a while, but his involvement in developing the Ferranti Mark 1 [1] was well known.

[1] https://www.turing.org.uk/scrapbook/manmach.html


Turing has definitely increased in visibility over the last few decades. Von Neumann was considered the "father of digital computing", because he set down in detail how a general purpose stored-program digital computer ought to work. One was built, and it worked. A few billion Von Neumann architecture machines later...

Turing's automata theory work was obscure and not very usable. Turing's code breaking work was very specialized. The real theoretician of cryptanalysis was Friedman, who gave the field a theoretical basis, along with breaking the Japanese Purple cypher and founding the National Security Agency.

"Colossus", the electronic codebreaking machine at Bletchley, was not a general purpose computer. It was a key-tester, like a Bitcoin miner. Its predecessors, the electromechanical "bombes", were also key-testers.

Almost forgotten today are Eckert and Mauchly. They were the architects of the ENIAC, which was a semi-general-purpose computer programmed with plugboards and knobs. This was a rush job during WWII, when artillery firing tables were needed in a hurry. It did the job it was supposed to do. After the war, they formed the Eckert-Mauchly Computer Corporation, and produced the BINAC.[1] This was the minimum viable product for a commercial electronic digital computer. All the essential subsystems were there - CPU, memory, magnetic tape drive, printer. Everything was duplicated for checking purposes. That got them acquired by Remington Rand, and their next machine was the famous UNIVAC I, with more memory, better tape drives, a full set of peripherals, and profitable sales. Eckert had a long career with Remington Rand/UNIVAC/Unisys. Mauchly stayed for a few years and then did another startup. More like a good Silicon Valley career.

[1] http://archive.computerhistory.org/resources/text/Eckert_Mau...


Doesn't Tommy Flowers' (https://en.wikipedia.org/wiki/Tommy_Flowers) Colossus predate ENIAC?

Flowers wanted a loan from the Bank of England to start a computer company after the war, but they didn't believe what he wanted to do was possible, even though he'd already done it (Colossus was classified for years after the war).


They also downplayed Konrad Zuse in history because his story wasn't written by the WWII victors.


Schmidhuber on Zuse v Turing:

[Turing oversold](https://people.idsia.ch/~juergen/turing-oversold.html)

[Turing: Keep his work in perspective](https://www.nature.com/articles/483541b)


And this is particularly sad because, unlike other German scientists of the time who were lauded, like Wernher von Braun, Zuse never joined the Nazi party despite the pressure to do so.


Von Neumann-style machines have basically won out, but it's interesting to note that PIC microcontrollers follow a Harvard/Aiken architecture, with different address spaces for code and data. There are therefore millions of non-von-Neumann computers in the wild right now doing real work!


AVRs too. But in both cases they are stored-program computers; you don't have to reconnect the hardware to run a new problem on them. And they are commonly used to run interpreters, the possibility of which was the key insight of Turing's paper.
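(A loose software analogy for the stored-program point, if it helps: the program is just data in memory until something executes it. This Python exec sketch is an editorial illustration, not a claim about how any particular microcontroller works.)

    # Loose analogy only: the "program" is ordinary data (a string in memory)
    # until we choose to execute it, which is the essence of stored-program machines.
    program = "total = sum(range(10))\nprint(total)"   # data...
    exec(program)                                       # ...run as code, prints 45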


Mentioning Eckert and Mauchly requires also mentioning Atanasoff, with whom they lost a priority dispute.


Very slanted article. Turing was hardly unknown to logicians. In fact, even though undecidability in the lambda calculus was proven by Church in 1936 [1], Gödel remained unconvinced that it was a complete model of mechanical computation.

It was Turing's paper, in which, among many other things, he showed that Turing machines were equivalent to the lambda calculus, that convinced Gödel that Turing machines, and hence the lambda calculus, were the right model of mechanical computation.

This was circa 1936, before Turing came to Princeton. So he was hardly unknown to mathematicians even then.

The Turing test and his codebreaking work are, of course, well known. There are other achievements of Turing's as well, including a powerful version of the central limit theorem [2] and "The Chemical Basis of Morphogenesis" [3], which show that he was hardly incapable or obscure. Turing was an original genius, with a wide variety of original views that were later found to be far-reaching.

[1] https://www.jstor.org/stable/2371045?origin=crossref&seq=1#m...

[2] https://www.jstor.org/stable/2974762?seq=1#metadata_info_tab...

[3] https://en.wikipedia.org/wiki/The_Chemical_Basis_of_Morphoge...


I typed in the preface to my copy of Morphogenesis (which describes what we now call cellular automata and reaction-diffusion systems), and I also scanned the drawing by Alan Turing's mother inside the front cover, showing her son watching the daisies grow in the Springtime of 1923.

Hockey or Watching the Daisies Grow:

https://donhopkins.com/home/AlanTuringHockeyOrWatchingTheDai...

Collected Works of A.M. Turing: Morphogenesis: P.T. Saunders, Editor:

https://donhopkins.com/home/archive/Turing/Morphogenesis.txt

>Preface

>It is not in dispute that A.M. Turing was one of the leading figures in twentieth-century science. The fact would have been known to the general public sooner but for the Official Secrets Act, which prevented discussion of his wartime work. At all events it is now widely known that he was, to the extent that any single person can claim to have been so, the inventor of the "computer". Indeed, with the aid of Andrew Hodges's excellent biography, A.M. Turing: the Enigma, even non-mathematicians like myself have some idea of how his idea of a "universal machine" arose - as a sort of byproduct of a paper answering Hilbert's "Entscheidungsproblem". However, his work in pure mathematics and mathematical logic extended considerably further; and the work of his last years, on morphogenesis in plants, is, so one understands, also of the greatest originality and of permanent importance.

[...]

>Preface to This Volume

>It may seem surprising that this collection of Alan Turing's work includes a whole volume devoted to biology, a subject in which he published only one paper. Biology was, however, far more important to Turing than is generally recognized. He had been interested in the subject right from his school days, and he had read, and been much impressed by, the book that has had such a strong influence on many theoretical biologists over the years, D'Arcy Thompson's (1917) classic "On Growth and Form". He was also, like so many who work in biology, attracted by the sheer beauty of organisms. He wrote his (1952) paper not as a mathematical exercise, but because he saw the origin of biological form as one of the fundamental problems in science. And at the time of his death he was still working in biology, applying the theory he had derived to particular examples.

>I found reading the archive material a fascinating experience. For while at first glance Turing's work on biology appears quite different from his other writings, it actually exhibits the features typical of all his work: his ability to identify a crucial problem in a field, his comparative lack of interest in what others were doing, his selection of an appropriate mathematical approach, and the great skill and evident ease with which he handled a wide range of mathematical techniques.

[...]

>Introduction

>Turing's work in biology illustrated just as clearly as his other work his ability to identify a fundamental problem and to approach it in a highly original way, drawing remarkably little from what others had done. He chose to work on the problem of form at a time when the majority of biologists were primarily interested in other questions. There are very few references in these papers, and most of them are for confirmation of details rather than for ideas which he was following up. In biology, as in almost everything else he did within science -- or out of it -- Turing was not content to accept a framework set up by others.

>Even the fact that the mathematics in these papers is different from what he used in his other work is significant. For while it is not uncommon for a newcomer to make an important contribution to a subject, this is usually because he brings to it techniques and ideas which he has been using in his previous field but which are not known in the new one. Now much of Turing's career up to this point had been concerned with computers, from the hypothetical Turing machine to the real life Colossus, and this might have been expected to have led him to see the development of an organism from egg to adult as being programmed in the genes and to set out to study the structure of the programs. This would also have been in the spirit of the times, because the combining of Darwinian natural selection and Mendelian genetics into the synthetic theory of evolution had only been completed about ten years earlier, and it was in the very next year that Crick and Watson discovered the structure of DNA. Alternatively, Turing's experience in computing might have suggested to him something like what are now called cellular automata, models in which the fate of a cell is determined by the states of its neighbours through some simple algorithm, in a way that is very reminiscent of the Turing machine.

>For Turing, however, the fundamental problem of biology had always been to account for pattern and form, and the dramatic progress that was being made at that time in genetics did not alter his view. And because he believed that the solution was to be found in physics and chemistry it was to these subjects and the sort of mathematics that could be applied to them that he turned. In my view, he was right, but even someone who disagrees must be impressed by the way in which he went directly to what he saw as the most important problem and set out to attack it with the tools that he judged appropriate to the task, rather than those which were easiest to hand or which others were already using. What is more, he understood the full significance of the problem in a way that many biologists did not and still do not. We can see this in the joint manuscript with Wardlaw which is included in this volume, but it is clear just from the comment he made to Robin Gandy (Hodges 1983, p. 431) that his new ideas were "intended to defeat the argument from design".

>This single remark sums up one of the most crucial issues in contemporary biology. The argument from design was originally put forward as a scientific proof of the existence of God. The best known statement of it is William Paley's (1802) famous metaphor of a watchmaker. If we see a stone on some waste ground we do not wonder about it. If, on the other hand, we were to find a watch, with all its many parts combining so beautifully to achieve its purpose of keeping accurate time, we would be bound to infer that it had been designed and constructed by an intelligent being. Similarly, so the argument runs, when we look at an organism, and above all at a human being, how can we not believe that there must be an intelligent Creator?

>Turing was not, of course, trying to refute Paley; that had been done almost a century earlier by Charles Darwin. But the argument from design had survived, and was, and indeed remains, still a potent force in biology. For the essence of Darwin's theory is that organisms are created by natural selection out of random variations. Almost any small variation can occur; whether it persists and so features in evolution depends on whether it is selected. Consequently we explain how a certain feature has evolved by saying what advantage it gives to the organism, i.e. what purpose it serves, just as if we were explaining why the Creator has designed the organism in that way. Natural selection thus takes over the role of the Creator, and becomes "The Blind Watchmaker" (Dawkins 1986).

>Not all biologists, however, have accepted this view. One of the strongest dissenters was D'Arcy Thompson (1917), who insisted that biological form is to be explained chiefly in the same way as inorganic form, i.e., as the result of physical and chemical processes. The primary task of the biologist is to discover the set of forms that are likely to appear. Only then is it worth asking which of them will be selected. Turing, who had been very much influenced by D'Arcy Thompson, set out to put the program into practice. Instead of asking why a certain arrangement of leaves is especially advantageous to a plant, he tried to show that it was a natural consequence of the process by which the leaves are produced. He did not in fact achieve his immediate aim, and indeed more than thirty-five years later the problem of phyllotaxis has still not been solved. On the other hand, the reaction-diffusion model has been applied to many other problems of pattern and form and Turing structures (as they are now called) have been observed experimentally (Castets et al. 1990), so Turing's idea has been vindicated.

[...]

Turing, Alan Mathison, 1912-1954. Morphogenesis / edited by P. T. Saunders. p. cm. -- (Collected works of A. M. Turing, Volume 3). Includes bibliographical references and index. ISBN 0 444 88486 6. 1. Plant morphogenesis. 2. Plant morphogenesis -- Mathematical models. 3. Phyllotaxis. 4. Phyllotaxis -- Mathematical models. (C) 1992 Elsevier Science Publishers B. V. All Rights Reserved.

https://books.google.nl/books?id=GX7NCgAAQBAJ&pg=PR8&lpg=PR8

Watching the daisies grow: Turing and Biology:

https://web.archive.org/web/20180901055917/http://tokillamac...

>There’s a sketch drawn by Turing’s mother while he was still a child, showing a hockey game at school. The boys in the background are playing hockey, and Turing in the foreground is not playing, instead he’s leaning over to inspect a daisy growing in the field. The title of the sketch is “Hockey or watching the daisies grow”.

>Turing continued this interest in biology into his adult life. In 1952 Turing wrote what became his most cited paper “The Chemical Basis of Morphogenesis”. This work looked at the question of how structure in nature comes about. How do we start from single cells and end up with complex patterns and shapes? For example, the black and white patterning on a cow, the patterns on a sea shell, the dappling on a fish.

>He came up with the idea of having two interacting chemicals, which he called “morphogens”. These would diffuse through a space, and inhibit or promote each other as they met. He modelled this system with two equations, showing how the amount of chemicals would vary over time across the space. He demonstrated that his model could provide convincingly life-like patterns, and suggested that this might be how nature does it. Not only that but he programmed his computer to help him to calculate the results of the equations for certain cases.

[...]

Turing pattern:

https://en.wikipedia.org/wiki/Turing_pattern

>The Turing pattern is a concept introduced by English mathematician Alan Turing in a 1952 paper titled "The Chemical Basis of Morphogenesis" which describes how patterns in nature, such as stripes and spots, can arise naturally and autonomously from a homogeneous, uniform state. In his classic paper, Turing examined the behaviour of a system in which two diffusible substances interact with each other, and found that such a system is able to generate a spatially periodic pattern even from a random or almost uniform initial condition. Turing hypothesized that the resulting wavelike patterns are the chemical basis of morphogenesis.

>Turing patterning is often found in combination with other patterns: vertebrate limb development is one of the many phenotypes exhibiting Turing patterning overlapped with a complementary pattern (in this case a French flag model).

Reaction–diffusion system:

https://en.wikipedia.org/wiki/Reaction%E2%80%93diffusion_sys...

>Reaction–diffusion systems are mathematical models which correspond to several physical phenomena. The most common is the change in space and time of the concentration of one or more chemical substances: local chemical reactions in which the substances are transformed into each other, and diffusion which causes the substances to spread out over a surface in space.

>Reaction–diffusion systems are naturally applied in chemistry. However, the system can also describe dynamical processes of non-chemical nature. Examples are found in biology, geology and physics (neutron diffusion theory) and ecology. Mathematically, reaction–diffusion systems take the form of semi-linear parabolic partial differential equations. They can be represented in the general form:

>[Impressive Gratuitous Partial Differential Equation [1] goes here]

>where q(x, t) represents the unknown vector function, D is a diagonal matrix of diffusion coefficients, and R accounts for all local reactions. The solutions of reaction–diffusion equations display a wide range of behaviours, including the formation of travelling waves and wave-like phenomena as well as other self-organized patterns like stripes, hexagons or more intricate structure like dissipative solitons. Such patterns have been dubbed "Turing patterns". Each function, for which a reaction diffusion differential equation holds, represents in fact a concentration variable.
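A minimal numerical sketch of the kind of two-morphogen reaction-diffusion system described above may help make this concrete. The excerpt only gives the general form, so this uses the well-known Gray-Scott reaction terms as one concrete choice of R, with illustrative parameters rather than anything from Turing's paper:

    # dq/dt = D * laplacian(q) + R(q), with Gray-Scott terms as one choice of R.
    # Grid size, parameters, and the seed perturbation are illustrative.
    import numpy as np

    n = 128
    U = np.ones((n, n))
    V = np.zeros((n, n))
    U[n//2-5:n//2+5, n//2-5:n//2+5] = 0.50   # a small perturbation seeds the pattern
    V[n//2-5:n//2+5, n//2-5:n//2+5] = 0.25
    rng = np.random.default_rng(0)
    U += 0.02 * rng.random((n, n))           # a little noise to break symmetry

    Du, Dv, feed, kill, dt = 0.16, 0.08, 0.035, 0.060, 1.0

    def laplacian(Z):
        # five-point stencil with periodic (wrap-around) boundaries
        return (np.roll(Z, 1, 0) + np.roll(Z, -1, 0) +
                np.roll(Z, 1, 1) + np.roll(Z, -1, 1) - 4 * Z)

    for _ in range(5000):
        uvv = U * V * V
        U += dt * (Du * laplacian(U) - uvv + feed * (1 - U))
        V += dt * (Dv * laplacian(V) + uvv - (feed + kill) * V)

    # V now holds a spotted/striped "Turing pattern"; view it with e.g.
    # matplotlib's plt.imshow(V).
    print(V.min(), V.max())

The two chemicals diffuse at different rates and react locally, which is all it takes for a uniform field to break into spots or stripes.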

Another important historic but underrated paper:

[1] Ray Tracing JELL-O Brand Gelatin; Paul S. Heckbert, Dessert Foods Division, Pixar; Communications of the ACM, Feb 1, 1988:

https://www.thefreelibrary.com/Ray+tracing+JELL-O+brand+gela...

>JELL-O DYNAMICS: "Previous researchers have observed that, under certain conditions, Jell-O wiggles. We have been able to simulate these unique and complex Jell-O dynamics using spatial deformations and other hairy mathematics. From previous research with rendering systems, we have learned that a good dose of Gratuitous Partial Differential Equations is needed to meet the paper quota for Impressive Formulas."

>[Lots of impressive sounding detailed hairy mathematical technobabble redacted. To eloquently summarize:]

>The "begetted" eightness as the system-limit number of the nuclear uniqueness of self-regenerative symmetrical growth may well account for the fundamental octave of unique interpermutative integer effects identified as plus one, plus two, plus three, plus four, as the interpermuted effects of the integers one, two, three, and four, respectively; and as minus four, minus three, minus two, minus one, characterizing the integers five, six, seven, and eight, respectively.

>In other words, to a first approximation:

>J = 0

[...]


This is... somewhat untethered from reality?

Of course there's a link between mathematics and computing, and it well predates Turing or Berkeley. Starting somewhere with Leibniz' "stepped reckoner", stumbling further along with Babbage, Lovelace, and the many actuarial computers.

And that link was very obvious by the time Hilbert & Ackermann formulated the Entscheidungsproblem. Turing's biggest contribution to computing was threefold:

First, he created a formal (theoretical) machine that had behavior equivalent to first-order logic. Second, he formally proved that equivalence - probably the most important part here. That formally proven equivalence means that all problems decidable by first-order logic are decidable by a machine. Third, he used that to formally prove the Entscheidungsproblem isn't generally solvable, and so proved the limits of first-order logic.

That's the fundamental breakthrough. Proving that a machine can solve an entire class of problems, and that there are limits to what that machine can solve.

It's not that he somehow shaped what computers should look like, but that he had formally proven their capabilities and limitations.
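To make the "limitations" half concrete, here is a small editorial sketch in Python of the standard diagonal argument. The halts function below is hypothetical by construction: the point is that assuming such a total decider exists leads to a contradiction.

    # Sketch of the diagonalization behind the halting problem.
    # `halts` is assumed, not implementable: pretend a perfect decider exists,
    # then build a program that defeats it.

    def halts(program, argument):
        """Assumed oracle: returns True iff program(argument) would terminate."""
        raise NotImplementedError("no such total decider can exist")

    def troublemaker(program):
        # Ask the oracle about a program run on itself, then do the opposite.
        if halts(program, program):
            while True:        # loop forever if the oracle says "halts"
                pass
        else:
            return "halted"    # halt if the oracle says "loops"

    # halts(troublemaker, troublemaker) can be neither True nor False,
    # so the assumed decider cannot exist.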

I'm not surprised by the article given the background of the author - if you value practicality over theory (i.e. you favor software engineering over computer science), Aiken and Berkeley are more relevant. But the ACM has always cared about a theoretical foundation, and so their admiration of Turing makes sense.

The fact that the author doesn't bother to even acknowledge that distinction is a bit surprising, though.


Turing was working on his PhD [0] and went to Princeton, where Church was. Turing recognized that Church's lambda calculus answered the question of what it means to compute (to be effectively calculable). I think it was Phil Wadler who noted that at this point Turing pivoted his thesis to show equivalence to the lambda calculus.

I find it interesting that Gödel was not convinced by Church's assertion that the lambda calculus captured the effectively calculable functions; however, when Gödel saw Turing's explanation he was finally convinced.

Based on this I'd say that computing has 3 fathers, or alternatively, a holy trinity :-)

[0] https://www.dcc.fc.up.pt/~acm/turing-phd.pdf
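For anyone who hasn't seen what "computing with the lambda calculus" looks like in practice, here is a tiny illustrative sketch using Python lambdas as a stand-in for Church's notation. The encoding is the standard Church-numeral one; the Python rendering is mine, not anything from the thesis:

    # Church numerals: the number n is the function that applies f to x n times.
    zero  = lambda f: lambda x: x
    succ  = lambda n: lambda f: lambda x: f(n(f)(x))
    plus  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
    times = lambda m: lambda n: lambda f: m(n(f))

    def to_int(n):
        """Decode a Church numeral by counting applications of +1."""
        return n(lambda k: k + 1)(0)

    one, two = succ(zero), succ(succ(zero))
    print(to_int(plus(one)(two)))    # 3
    print(to_int(times(two)(two)))   # 4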


TIL that Turing used LaTeX ;) OK, fine, somebody probably typeset it, but it was still an odd experience to read it that way.

Either way, thanks for sharing that link!


Agreed.

The real puzzle is that John von Neumann's machines haven't been able to do one damn more thing than a Turing machine, so why do we even know his name! ;)


Strangely negative. The ACM award isn't the only reason people celebrate Turing...


I suppose the link missing from the article is von Neumann: https://en.m.wikipedia.org/wiki/Von_Neumann_architecture

Von Neumann may well have been heavily influenced by Turing's paper, so we could say Turing's ideas helped.


Sidebar: as far as I can tell, Turing was the first to use the term "assertion" in the context of software testing.

> How can one check a large routine in the sense of making sure that it's right? In order that the man who checks may not have too difficult a task, the programmer should make a number of definite assertions which can be checked individually, and from which the correctness of the whole program easily follows.

https://turingarchive.kings.cam.ac.uk/publications-lectures-...
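As a small illustration of the style Turing is describing (definite assertions that a checker can verify individually), here is how it reads in modern Python. The routine and its particular invariants are made up for the example, not taken from Turing's note:

    # A routine annotated with "definite assertions which can be checked
    # individually", in the spirit of the 1949 note. Example is illustrative.

    def factorial(n: int) -> int:
        assert n >= 0, "precondition: n is non-negative"
        result = 1
        for i in range(1, n + 1):
            # before this multiplication, result == (i - 1)!
            result *= i
            assert result > 0, "intermediate assertion: partial product stays positive"
        # a checker can verify this claim without rereading the loop
        assert n < 2 or result % n == 0, "postcondition: n divides n!"
        return result

    assert factorial(0) == 1
    assert factorial(5) == 120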


I suggest people look up Bowden's "Faster than Thought" and put Turing in the context of his peers and of work that was contemporary and current at the time. He writes in it about his work, alongside Wilkes and others of his contemporaries in the field. Yes, it's very UK-specific.

https://archive.org/details/faster-than-thought-b.-v.-bowden


>It was only later, when the young Association for Computing Machinery (ACM) needed to establish computer science as a legitimate field of study that the history got edited to suggest a smooth evolution from theoretical mathematics to computing

This is a bit conspiratorial. I don't think anyone believes the evolution of computing started with us stumbling around in Plato's cave until Turing opened the door. Then boom! The next day we had MacBooks. Of course practical and theoretical computing evolved together. This is hardly unique to computer science. What Turing represents is a person that asked (and answered) the big questions. Take a parallel in physics: Lorentz and Minkowski explored obscure mathematical tools, but Einstein shattered our thinking. Turing did something similar.

Why take it back to Babbage? The abacus was in use in the 11th century BCE. This practical device was a tool to solve a problem. Individuals were not asking "Why does the abacus work?", "What does it mean?", "What can I or can't compute with this thing?", "What other applications are there for calculating machines?". We spent 2,000 years using this tool to count things and not much more. Babbage added some mechanization but he wasn't exactly trying to bridge the abacus with consciousness.

Turing was thinking broadly. When we celebrate Turing we are celebrating the formalization. He gave us a deterministic framework for reasoning about computers. He allowed us to consider their limitations, their philosophical implications, and the opportunities they represent. The huge leaps from the 1960s on would not have been possible without his work.


> It was only later, when the young Association for Computing Machinery (ACM) needed to establish computer science as a legitimate field of study that the history got edited to suggest a smooth evolution from theoretical mathematics to computing. To sell that message ACM needed founding figures and they settled on a deceased British mathematician named Alan Turing.

> Scientists active in ACM — specifically John W. Carr III and Saul Gorn — began connecting Turing’s 1936 paper “On Computable Numbers” to a broader vision of computer science in 1955.

Hm. I wonder what this writer would think of the Church-Turing thesis.


I've always understood "Father of <some industry>" to be fluid, and usually there's more than one possible figure. Turing, or Babbage, or Shannon, etc could all be called that. One organization thinks Turing is the father. So? Who really cares?


Also Konrad Zuse, frequently ignored in the anglosphere. He is one of ~two people with a credible claim to building the first freely programmable (Turing-complete) computer. His first few computers were mechanical, later models electrical.


Going back even further, I'd argue Gottfried Leibniz was one of the first to grasp the fundamental problems of building a computer.



It seems like the author doesn't understand, in a fundamental sense, what a computer is. She has been led astray by surface appearances.

Digital "computers" are called that because they developed as higher-precision, lower-speed versions of "analog computers", which integrated systems of ordinary differential equations in real time (but faster). Examples included Bush's mechanical differential analyzer, the MONIAC hydraulic computer, electronic differential analyzers built out of op-amps, and, earlier, Michelson's harmonic analyzer and various kinds of planimeters. Reconfiguring these devices to solve different "programs" of equations involved reconnecting their parts in new ways.†

The thing that makes digital computers special, fundamentally different from both the analog "computers" they were named after and Shannon's pioneering digital-logic circuits, is that they are universal; instead of reconnecting the pieces physically to carry out a different "program", you can leave them connected according to a "universal program", which runs a stored program made out of data in some kind of data storage medium, such as a loop of movie film with holes punched in it, a player piano roll, a mercury delay line, or a DRAM chip. It can even run a program that interprets programs for a different computer, a so-called "simulator" or "emulator". So all such computers are, in a certain sense, equivalent; one may be faster than another, or happen to be connected to different I/O devices at some time, but there's no feature you can add to one of them that enables it to do computations that another one can't.

That's why we can use the same digital computer not only to numerically integrate systems of differential equations but to play card games, edit text, control lathes, symbolically integrate algebraic expressions, decode radio transmissions, and encrypt and decrypt. And it's why we can run Linux on an 8-bit AVR microcontroller.⁜
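To make the universality point tangible, here is a tiny self-contained sketch of the idea at the heart of the 1936 paper: one fixed simulator loop that never changes, running whatever machine description it is handed as data. The encoding and the unary-increment example are my own illustration, not Turing's notation:

    # A fixed "universal" loop: the simulator never changes; only the machine
    # description (a table of rules, i.e. data) does. Encoding is illustrative.

    def run(rules, tape, state="start", blank="_", steps=10_000):
        """rules: {(state, symbol): (new_state, new_symbol, move)}, move in {-1, 0, +1}."""
        tape = dict(enumerate(tape))
        head = 0
        for _ in range(steps):
            if state == "halt":
                break
            symbol = tape.get(head, blank)
            state, tape[head], move = rules[(state, symbol)]
            head += move
        return "".join(tape[i] for i in sorted(tape)).strip(blank)

    # A "program" stored as data: append one '1' to a unary number.
    increment = {
        ("start", "1"): ("start", "1", +1),   # scan right over the 1s
        ("start", "_"): ("halt",  "1",  0),   # write a 1 at the first blank
    }

    print(run(increment, "111"))   # -> 1111

Swap in a different rule table and the same loop computes something else; that interchangeability of program and data is exactly the property described above.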

Because the designers of ENIAC lacked this insight when the design was frozen in 01943, at first ENIAC was programmed by reconnecting its parts with a plugboard, like an analog computer. It wasn't modified to be programmable with data storage media until 01948, three years after von Neumann's First Draft in 01945, in which he (and his colleagues) proposed keeping programs in RAM.

The Harvard Mark I (built in 01944) and Konrad Zuse's Z3 (designed in 01935, built in 01941) could run stored programs from tape, like Babbage's later designs and unlike pre-01948 ENIAC. But they were not designed around this insight of universality, and neither was well-suited to emulating more complex machines, lacking for instance jumps. The Z3 was proven to be accidentally Turing-complete, but not until 01998, and not in a practical way.

That insight into the protean, infinitely adaptable nature of digital computers was not enunciated by Babbage, by Lovelace, or even by the brilliant Zuse. It was discovered by Turing; it is the central notion of his 01936 paper, from which Dyson tells us von Neumann was working, as Russ Cox points out in https://news.ycombinator.com/item?id=30623248.

And that is why Alan Turing is the creator of the discipline that later became known as computer science: it was he who discovered what we now call, simply, "computers".

______

† "Program" is used to mean "configure by connecting" up to the present day in analog electronics; an LM317 is a "programmable voltage regulator" not because its output voltage is controlled by software but because you can change its output voltage by hooking a resistor up to it.

⁜ Though Linux on an AVR isn't very practical: https://dmitry.gr/index.php?proj=07.+Linux+on+8bit&r=05.Proj...

Turing's concept of computational universality permits an amazing economy of hardware; it is the reason that machines like the LGP-30, the Intel 4004, the PDP-8/S, or the HP 9100A could be so much smaller and simpler than the ENIAC, despite being able to handle enormously more complex problems. The ENIAC contained 18000 vacuum tubes, 1500 relays, and 7200 (non-thermionic) diodes; the LGP-30 had 113 vacuum tubes and 1450 diodes; the 4004 had 2300 transistors (not including RAM); the PDP-8/S had 519 logic gates (not including RAM, which was magnetic cores; https://www.ricomputermuseum.org/collections-gallery/equipme... says the CPU contains 1001 transistors, and I'm guessing about 1500 diodes); the HP 9100A had 2208 bits of read-write core, 29 toroids of read-only core (holding 1856 bits), 32768 bits of ROM fabricated as a printed circuit board with no components, and what looks like a couple hundred transistors from https://www.hpmuseum.org/tech9100.htm, many of which are in the 40 J-K flip-flops mentioned in https://hpmemoryproject.org/news/9100/hp9100_hpj_02.htm or https://worldradiohistory.com/Archive-Company-Publications/H....


Thanks for your comment, I think it's insightful. Serious question: why do you write years with five digits?



To me the real deity of CS is Donald Knuth, not for theoretical contributions (OK, there is TeX), but for assembling a large swath of computer science into a biblical form which educated a great many people. His books were indispensable for me in the late 80s and early 90s, before you could just go Google anything.


[flagged]


Please don't post unsubstantive and/or flamebait comments to HN—we ban accounts that do that, because it's not what this site is for.

And certainly please omit name-calling and personal attacks. Those are particularly not what this site is for!

If you wouldn't mind reviewing https://news.ycombinator.com/newsguidelines.html and sticking to the rules when posting here, we'd be grateful.



