This goes to show how hard winning the Turing Award is. One would have expected someone who made the most useful invention of the 20th century to have won this award a long time ago. Maybe I am just overvaluing the WWW because of the impact it had on people's lives.
I always felt that Tim Berners-Lee was not respected enough in both the computer science and programming communities. I felt it especially after working for over a decade at Google, which literally built its entire business on TBL's architectural concepts.
For example, Google and other search engines would not work without the principle of least power [1], which a lot of people, including Alan Kay [2], somehow don't understand. That is, if the web's language had been a VM rather than HTML, there would be no Google.
It would also not have been possible for the web to make the jump from desktops to cell phones, which are now the #1 client. You know the native handler in iOS and Android that makes <select> boxes usable? That's the principle of least power at work.
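To make that concrete, here's a minimal sketch (illustrative only, nothing like real crawler code) of why a declarative format matters for search: you can statically extract the links and text from HTML without executing anything, whereas an opaque VM program would have to be run to find out what it renders or links to.

    # Minimal sketch: statically extracting links and text from HTML.
    # A crawler can do this without executing any code -- that is the
    # principle of least power at work. (Illustrative; not Googlebot.)
    from html.parser import HTMLParser

    class LinkAndTextExtractor(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links, self.text = [], []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

        def handle_data(self, data):
            if data.strip():
                self.text.append(data.strip())

    extractor = LinkAndTextExtractor()
    extractor.feed('<h1>Hi</h1><a href="http://example.com">a link</a>')
    print(extractor.links)  # ['http://example.com']
    print(extractor.text)   # ['Hi', 'a link']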
I recommend reading his book "Weaving the Web" [3] if you want to learn more about the story behind the web.
I'm very glad that TBL is getting this recognition. He is a genius and also has a very generous personality.
People in the programming community seem to talk about Torvalds or Stallman a lot, perhaps because of their loud styles, but I don't see that much about TBL.
Ditto in the CS community. "HyperText" used to be a big research area but I guess TBL solved it and people don't talk about it anymore.
>For example, Google and other search engines would not work without the principle of least power [1], which a lot of people, including Alan Kay [2], somehow don't understand. That is, if the web's language had been a VM rather than HTML, there would be no Google.
Kay's criticism of the Web is very well justified and (like most of his high-level criticisms) typically misunderstood. He doesn't criticize it as a repository of hyperlinked documents. He criticizes it as a platform for application delivery, which it became. The modern web with all its scripts is a VM -- and a badly designed one at that.
I think people understand the argument, they just don't fully agree with it. Personally, while I agree that we have ended up with a fairly poor universal VM, I find it to be a fascinating example of path dependency. Maybe we would be better off with one repository of hypertext and one well designed universal VM, but what path would we have taken to get there?
The hypertext repository was so compelling that everyone installed software to access it. Then that universally available software was so compelling that people found ways to run increasingly complex applications on it. And that's how we naturally found ourselves where we are today.
I don't see any reason to think that engineering a more perfect solution all at once would have worked better than this natural progression.
Well said. Kay's misunderstanding is that it could have been any different.
He thinks that you can just design something nice from whole cloth and people will use it. That's why his designs aren't deployed.
I've been looking at projects like OOVM and going back in history to Self and Smalltalk, and there's a reason that those things aren't deployed. Don't get me wrong -- they're certainly influential and valuable.
But he's basically confusing research and engineering, as if engineering wasn't even a thing, and you can just come up with stuff and have people use it because it's good. You need a balance of both to "change the world", and TBL has certainly done that.
Another analogy I use is complaining about the human body. Like "who designed this thing where the trachea and esophagus are so close together?!? What an idiot!!!" Or "why are all these people mentally ill and otherwise non-functional members of society? Who designed this crap?"
The point is that it couldn't have been any different. It wasn't designed; it was evolved.
>Another analogy I use is complaining about the human body. Like "who designed this thing where the trachea and esophagus are so close together?!? What an idiot!!!" Or "why are all these people mentally ill and otherwise non-functional members of society? Who designed this crap?"
Okay, so what's wrong with discussing the limitations of the human body and the ways to improve it then?
Yes, the web evolved instead of being designed (however much that distinction makes sense), but arrived at a shi^H^H^H suboptimal result. And it arrived there through deliberate design decisions of people - who unfortunately were designing a different system in the first place.
It's like English. I love English, but it's a bloody mess that we're all stuck with now - except that changing a computer system is easy compared to changing the direction of a language.
It's great to discuss ways to improve things, but that is different than suggesting that the whole thing is rotten to the core and needs a re-work from the ground up. The productive way to do this is to identify specific deficiencies and propose targeted incremental improvements to address them. This is what all the people involved in various standards and implementations on the web have been doing for years. This is working, progress is just slow and difficult, as it is with most things that are worth doing.
>He thinks that you can just design something nice from whole cloth and people will use it.
Because that's exactly what they did at Xerox PARC, many times over.
>That's why his designs aren't deployed.
No comment.
>But he's basically confusing research and engineering, as if engineering wasn't even a thing, and you can just come up with stuff and have people use it because it's good.
Kay has many talks about the difference between invention and innovation (which are much better terms than the ones you're using). In fact, his analysis of this difference is probably the most insightful and thought-provoking technology talk I have ever seen.
Of course, this subject makes a lot of developers highly uncomfortable, hence a lot of shallow, ignorant, knee-jerk dismissals. "Everything is incremental." "Everything is the only way it could be." "This is fine." And so on. Thing is, Kay worked at Xerox and Apple. He read a myriad of books and research papers on computing, which he constantly references in his talks and writings. He worked and continues to work with some of the most forward-thinking people in the field of computing. In the late eighties he foresaw most of the current computing trends - which is verifiable via YouTube. Even without any context his talks display a considerable depth of thought. In short: unlike some people, he actually knows what he is talking about.
>The point is that it couldn't have been any different. It wasn't designed; it was evolved.
And that is why someone who designed it just received a Turing award. Makes perfect sense.
Edit:
Regarding your other comment here.
>If the web is a genius for hypertext, but not for app delivery, then he should have just said so. That is not a very hard sentiment to express. "The Web was done by Amateurs" doesn't capture it.
He has several decades' worth of talks and writing. If you haven't bothered to familiarize yourself with at least some of them to understand what he means, that's your own fault.
If that's true, then it's not the fault of the web's designers. Suppose you are an architect and you build a house for a family of five. Then someone buys the house and turns it into an auditorium. And then they say, "Wow, this is a really shitty place to hold concerts -- the acoustics are terrible. What a bunch of amateurs." Is that your fault?
If the web is a genius for hypertext, but not for app delivery, then he should have just said so. That is not a very hard sentiment to express. "The Web was done by Amateurs" doesn't capture it.
But I don't even think that's true. If the web were really bad as an application delivery platform, someone should have supplanted it by now. Alan Kay or someone else should go design their own awesome VM for application delivery. I guarantee you it will fail, for reasons fundamental to its design, while TBL's platform succeeded for reasons fundamental to its design.
I don't want to detract from TBL's accomplishments, especially on this occasion, but he didn't solve the hypertext problem so much as he decided that the really hard things like provenance and bidirectional linkages weren't important. Google and Facebook did solve those problems, but only for commercial benefit. Moreover, it's clear that TBL regrets that early decision, to the degree that he rails against the walled gardens.
But that's exactly what I disagree with. Design is as much about what to leave out as what to include.
Leaving out those things wasn't an accident or ignorance as Alan Kay claims. There were a lot of very conscious design decisions involved in the web -- again see "Weaving the Web".
He may regret that the web has evolved into walled gardens, but what could he possibly have done about it? There's no way to prevent that decades in advance, at least not without strangling it at birth.
I don't think that you and I are in disagreement about TBL's design competence: I stated that his decision to set aside the problem of provenance that preoccupied other hypertext research was intentional.
But, your argument is perilously close to begging the question that the success of web is a good thing. Now, I happen to think that the web is a net good (no pun intended), because (among other things) it helped break Microsoft’s hegemony and continues to force OS vendors to provide and support a standard universal computing platform (albeit a crippled one). But, it’s also arguable that the success of the web may have set personal computing back by a few decades, while also exacerbating a bunch of other problems like wealth inequality and reduced privacy/sovereignty, because it turned the Internet into a big modem.
I’ll take a look at Weaving the Web; thank you for the recommendation. To you, I commend Jaron Lanier’s Who Owns the Future.
I think the EME thing is a symptom of a worse problem, which is that the internet is very much controlled by a handful of commercial interests. Whoever controls browsers can become a bad actor and you either go along with it or end up with a broken system (this site only works in IE).
The goal of advocacy is to rouse a sympathetic collective that you can leverage at the bargaining table. But, when the time comes, you play whatever hand you have.
The W3C has to negotiate with industry. But, unfortunately for us that care about the open web, its position is weak.
"The Internet was done so well that most people think of it as a natural resource like the Pacific Ocean, rather than something that was man-made. When was the last time a technology with a scale like that was so error-free? The Web, in comparison, is a joke. The Web was done by amateurs."
-- Alan Kay
That quote is pretty unfortunate. I guess nobody's perfect.
He's absolutely right, and also completely wrong. It's like listening to the blind men argue about the elephant. "It's a particle!" "No, it's a wave!" https://news.ycombinator.com/item?id=2119057
With all respect to Tim Berners-Lee, I think the principle of least power is overrated. For example, even binary Horn clauses are Turing complete[0]. Whatever configuration language you can come up with (HTML, CSS, or whatever), it's either Turing-complete or not, and if it's Turing-complete there is no danger of executing infinite loops as long as you have some cutoff for resource requirements. The only thing that matters is whether the code is properly sandboxed before it tries to do any I/O.
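To illustrate the cutoff idea with a toy sketch (hypothetical names, not any real sandbox): once every evaluation step costs fuel, Turing-completeness stops being a termination hazard, and the only remaining question is what I/O the code is allowed to do.

    # Toy fuel-bounded evaluator: even a Turing-complete step function
    # cannot loop forever, because every step consumes fuel.
    # (Hypothetical sketch; real sandboxes also meter memory and I/O.)
    def run_with_fuel(step, state, fuel=10_000):
        for _ in range(fuel):
            state = step(state)
            if state is None:   # convention here: None means "halted"
                return "halted"
        return "cut off"        # budget exhausted; no harm done

    print(run_with_fuel(lambda s: s + 1, 0))                     # cut off
    print(run_with_fuel(lambda s: None if s > 3 else s + 1, 0))  # halted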
Not sure I agree with you overall, but I do agree with the distinction between computational power and I/O (or capabilities). That's a very important design criterion for languages for distributed systems.
I think the problem would be that the computation would get cut off at different points on every machine, leading to an unstable ecosystem. Remember the browser has to run on devices with at least an order of magnitude difference in resources, probably 2 orders now.
Generally, you want to guarantee that your style computations terminate. Now it appears that CSS doesn't actually provide that guarantee, since it's Turing complete :) But I guess it's close enough in practice.
Torvalds didn't add anything to CS. He just wrote a Unix-like kernel. The kernel took off and the project grew to include thousands of devs and became the biggest OSS project ever, but still, it's a kernel, not much different in essence from any old Unix kernel. Git? Again, a popular project, nothing new. It was even started contemporaneously with Mercurial, which is essentially equivalent to it, and neither is a new invention at all; DVCSes had been around since the '90s.
Stallman did Emacs, in whose invention AFAIK he took part, and then he was involved in implementing some other innovative projects, though I don't really know his CS career (I mostly know him as the face of GNU and the FSF).
IDK but having created a popular project should not be equal to a big innovation in the field.
"IDK but having created a popular project should not be equal to a big innovation in the field."
So you seriously think that "new" is better than "well done"?
I don't think it makes sense to compare those 2 things and value one higher than the other.
Of what use are innovations, if you can't use them in a "popular project"?
> So you seriously think that "new" is better than "well done"?
No. But Linux is not more well-made than, say, the kernel of any modern BSD, or that of illumos, etc. Git is not technically superior to Mercurial et al. Torvalds' success is certainly a big, admirable one, but it's a different kind of success than that of Berners-Lee.
Also, while it's the most popular kernel, it's not like we wouldn't have anything we have today if it didn't exist; it's in essence an ordinary kernel that came out at the right time. It's those who made the distros and reverse-engineered the drivers and ported/packaged thousands of programs who made Linux a big thing.
WWW, on the other hand, is an invention. It's something that did not exist, and it transformed the world like nothing else.
They didn't contribute to the science. They made very important contributions to the implementation of the science. Stallman also made very important contributions to the political/legal landscape.
I certainly do not overlook that, but is that the type of contribution that brings a Turing? I mean, scientific equipment companies are of utmost importance to any science lab, but they don't get Nobels, do they? Linux is at the heart of infrastructure today, but it basically is a Unix-like kernel. By that logic the author of cURL should get a Turing Award too.
> This goes to show how hard winning the Turing Award is. One would have expected someone who made the most useful invention of the 20th century to have won this award a long time ago.
That's an interesting point. I interpret the prize's criteria "for contributions of a technical nature made to the computing community" to be less in Shannon's space and more in the applied space. Scrolling down through the list of past winners, I've implemented the work of several of them, and I am using the WWW right now to communicate this.
On the other hand, I think I use Shannon's work in the same sense that I use Kirchhoff's laws every day.
Stephen Cook, Micali, Goldwasser, Rabin, Scott, Karp, Hartmanis, Stearns, Manuel Blum, Yao and Valiant. These are all pure theoreticians; I didn't include data structures people and applied-ish cryptographers like RSA and DH.
A significant fraction of Turing Awards have been won by theoreticians.
I see a number of theory guys on the list. That's the thing that's interesting about Shannon's exclusion: for any given explanation, there's a counterexample on the list. I've never really known anyone involved very seriously in the ACM, those guys probably know why he was excluded.
> Forbes magazine updates a complete global list of known U.S. dollar billionaires every year. John D. Rockefeller became the world's first confirmed U.S. dollar billionaire in 1916
If you want to play, note that I did not say USD anywhere in my comment. Not to mention that Forbes was founded in 1917, and so it clearly has no data collected to "confirm" an earlier billionaire. We can go back to the 14th century if you like (https://www.wikiwand.com/en/Musa_I_of_Mali):
> During his reign Mali may have been the largest producer of gold in the world at a point of exceptional demand. One of the richest people in history, he is known to have been enormously wealthy; reported as being inconceivably rich by contemporaries, "There’s really no way to put an accurate number on his wealth" (Davidson 2015).
But thanks, I actually never knew that about USD billionaires.
No, I think you're spot on. It's easy to see the web as an obvious invention with hindsight, but just imagine a world where TCP/IP is the only protocol available to you. I could easily imagine a world in which someone thought to create an across-the-wire binary specification rather than a really simple text protocol. The genius of HTTP and HTML is that they are very simple. I remember right after the dot-com bubble crashed a bunch of people basically said the web was dead and that we should be passing binaries back and forth. Inventing something simple and useful that the average person can easily pick up is the work of a genius (and yes, the average person can write HTML; my mom doesn't understand how the internet even works, but she writes HTML for her job).
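For anyone who hasn't tried it, that simplicity is easy to demonstrate: a complete HTTP request is a few lines of text you could type by hand. A rough sketch (example.com is just a stand-in host):

    # The whole request is human-readable text on a plain TCP socket.
    import socket

    with socket.create_connection(("example.com", 80)) as sock:
        sock.sendall(b"GET / HTTP/1.1\r\n"
                     b"Host: example.com\r\n"
                     b"Connection: close\r\n\r\n")
        response = b""
        while chunk := sock.recv(4096):
            response += chunk

    print(response.decode("utf-8", errors="replace")[:200])

Compare that with reverse-engineering a binary wire format, and it's easy to see why anyone with a text editor could participate.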
"the Haber–Bosch process, a method used in industry to synthesize ammonia from nitrogen gas and hydrogen gas. This invention is of importance for the large-scale synthesis of fertilizers and explosives. The food production for half the world's current population depends on this method for producing nitrogen fertilizers."
Yes, they are very useful. I was only counting computer-related contributions, since the Turing Award only recognises people who contributed to the computing community.
EDITED: 20th century, not 19th.