
First of all, I would strongly encourage anyone who is interested to check out Kurzweil's 2009 predictions in his 1999 book Age of Spiritual Machines, rather than this Wikipedia synopsis. It puts his predictions in a much more accurate context. You can view much of it here: http://books.google.com/books?id=ldAGcyh0bkUC&pg=PA789&#...

Kurzweil also does a reasonably unbiased job of grading his own predictions here: http://www.kurzweilai.net/images/How-My-Predictions-Are-Fari...

Quite a few of your statements relate to technological adoption vs. technological capability, such as everyday use of speech recognition and ebooks. I clearly stated that Kurzweil is not perfect at predicting what technologies will catch on with consumers and organizations, nor is anyone for that matter. To me, and to most of the people reading this, the most interesting aspect of Kurzweil's predictions is always what technological capabilities will be possible, rather than the rate of technological adoption.

Some of your other statements conflate science fiction with what Kurzweil does: "There have been predictions of self-driving cars for more than half a century. It's in Disney's 'Magic Highway' from 1958, for example." Similarly, most of your other points attempt to make the case that because nascent research projects existed, all of his predictions should have been readily apparent. I'm sorry, but this is pretty much the same hindsight bias displayed by gavanwoolery and Kurzweil's worst critics. Basically, Kurzweil's predictions strike you as absurd, right up until they become blindingly obvious.

You can point to obscure German R&D projects all you want (and who knows how advanced that prototype was, or how controlled the tests were), but I was blown away by the Google self-driving car, as were most of the people on HN, judging by the enthusiasm it received here. I thought it would take at least a decade or so before people took it for granted, but you've set a Wow-to-Meh record in under a year.

Once again, I strongly encourage you to fire up YouTube or dust off an old computer, and really try to remember exactly what the tech environment was like in previous decades for the average consumer: Zip drives, massive boot times, 5.25-inch floppy disks, EGA, 20 MB external hard drives the size of a shoebox, 30-minute downloads for a single MP3 file, $2,000 brick phones, JPEGs loading one line of pixels per second, etc.

To be clear, I'm not a Kurzweil fanboy. He's not some omniscient oracle, bringing down the future on stone tablets from the mount. What he is is a meticulous, thoughtful, and voracious researcher of technological R&D and trends, and a reasonably competent communicator of his findings. I'm very familiar with the track records of others who try and pull off a similar feat, and he's not perfect, but he's far and away the best barometer out there for the macro trends of the tech industry. If his findings were so obvious, why is everyone else so miserable at it? Furthermore, his 1999 book was greeted with the same skepticism and incredulity that all of his later books were.

For some of your other points, I've included links below:

Research has been initiated on reverse engineering the brain - Kurzweil was clearly talking about an undertaking like the Blue Brain project. Henry Markram, the head of the project, is predicting that around 2020 they will have reverse engineered and simulated the human brain down to the molecular level:

http://www.ted.com/talks/henry_markram_supercomputing_the_br...

http://en.wikipedia.org/wiki/Blue_Brain_Project

A $1,000 PC can perform about a trillion calculations per second. -- This happened. It is also a straightforward extension of Moore's law, and so was in some sense predicted a decade earlier. Pretty much all of Kurzweil's predictions boil down to Moore's law, which he would be the first to admit. I'm not sure what you're trying to say.
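If you want to see the arithmetic, here's a rough sketch of that extrapolation. The ~1 billion calculations/sec baseline for a $1,000 PC in 1999 and the roughly one-year price-performance doubling time are my own illustrative assumptions, not figures from the book:

    # Back-of-the-envelope Moore's-law-style extrapolation (illustrative only).
    # Assumptions: a $1,000 PC managed ~1e9 calculations/sec in 1999, and
    # price-performance doubled roughly every 12 months.
    BASELINE_YEAR = 1999
    BASELINE_OPS = 1e9        # assumed calculations/sec per $1,000 in 1999
    DOUBLING_YEARS = 1.0      # assumed doubling interval in years

    def projected_ops(year):
        doublings = (year - BASELINE_YEAR) / DOUBLING_YEARS
        return BASELINE_OPS * 2 ** doublings

    print("%.1e" % projected_ops(2009))  # ~1.0e+12, about a trillion per second

Under those assumptions the curve lands right around a trillion calculations per second by 2009, which is exactly the point: the prediction falls out of the trend line.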

Autonomous nanoengineered machines have been demonstrated and include their own computational controls. - If you read his prediction in context, he's clearly talking about very primitive and experimental efforts in the lab, which we are certainly closing in on:

http://www.kurzweilai.net/automated-drug-design-using-synthe...

http://en.wikipedia.org/wiki/Nadrian_Seeman - If you've been following any of Nadrian Seeman's work on nanobots constructed with DNA, Kurzweil's predictions seem pretty close

http://wyss.harvard.edu/viewpressrelease/101/researchers-cre...

http://www.kurzweilai.net/a-step-toward-creating-a-bio-robot...

http://www.aalto.fi/en/current//news/view/2012-10-18/

Three-dimensional chips are commonly used. - I guess you could quibble over them being a few years late:

http://www.bbc.co.uk/news/technology-17785464

http://www.pcmag.com/article2/0,2817,2384897,00.asp




Okay, I looked at some of the "reasonably unbiased job of grading his own predictions." I'll pick one, since I don't have the interest to go through all of them.

He writes: "'Personal computers are available in a wide range of sizes and shapes, and are commonly embedded in clothing and jewelry.' When I wrote this prediction in the 1990s, portable computers were large heavy devices carried under your arm."

But I gave two specific counter-examples. The JavaRing from 1998 is a personal computer in a ring, and the TRS-80 pocket computer, which included BASIC, is a book-sized personal computer from the 1980s. So the first part, "are available", was already true in 1999, since "are available" can mean anything from a handful of devices to one in everyone's hand.

Kurzweil then redefines or expands what "personal computer" means, so that modern iPods and smart phones are included. Except that with that widened definition, the cell phone and the beeper are two personal computers which many already had in 1999, yet were not "large heavy devices carried under your arm", and which some used as fashion statements. I considered, then rejected, the argument that a cell phone which isn't a smart phone doesn't count as a personal computer: he says that computers in hearing aids and health monitors woven into undergarments are also personal computers, so I see no logic for excluding 1990s-era non-smart phones, which are more powerful and capable than a modern hearing aid.

There were something like 750 million cell phone subscribers in the world in 2000, each corresponding to a "personal computer" by this expanded definition. By the same definition, the 100 million Nintendo Game Boys sold in the 1990s are also personal computers, and the Tamagotchi and other virtual pets of the same era are not only personal computers but were also used as jewelry, much as some use an iPod now.

He can't have it both ways. Either a cell phone (and Game Boy and Tamagotchi) from 1999 is a personal computer or a hearing aid from now is not. And if the cell phone, Game Boy, etc. count as personal computers, then they were already "common" by 1999.

Of course, what does "common" mean? In the 1999 Python conference presentation which included the phrase "batteries included", http://www.cl.cam.ac.uk/~fms27/ipc7/ipc7-slides.pdf , the presenter points out that a "regular human being" carries a cell phone "spontaneously." I bought my own cell phone by 1999, and I was about the middle of the pack. That's common. (Compare that to the Wikipedia article on "Three-dimensional integrated circuit", which comments that "it is not yet widely used." Just what does 'common' mean?)

Ha-ha! And those slides show that I had forgotten about the whole computer "smartwatch" field, including a programmable Z-80 computer in the 1980s and a smartwatch/cellphone "watch phone" by 1999!

I therefore conclude, without a doubt, that the prediction "Personal computers are available in a wide range of sizes and shapes, ..." was true by 2009 because it was already true in 1999.

As regards the "obscure German R&D project", that's not my point. A Nova-watching geek of the 1980s would have seen the episode about the autonomous car project at CMU. And Kurzweil himself says that the prediction was wrong because he predicted 10 years when he should have said 20. But my comment was responding to the enthusiasm of rpm4321, who wrote "Predicting that self-driving cars would occur in ten years in the late 90s is pretty extraordinary, especially if you go to youtube and load up a commercial for Windows 98 and get a flashback of how primitive the tech environment actually was back then."

I don't understand that enthusiasm when 1) the prediction is acknowledged as being wrong, 2) autonomous cars already existed using the 'primitive tech environment' of the 1990s, and 3) the general prediction that it would happen, and was more than science fiction, was widely accepted, at least among those who followed the popular science press.

"I strongly encourage you to fire up Youtube or dust off an old computer, and really try and remember exactly what the tech environment was really like in previous decades for the average consumer"

I started working with computers in 1983. I still have rather vivid memories of using 1200 baud modems, TV screens as monitors, and cassette tapes for data storage. I even wrote code using printer-based teletypes rather than glass ttys. My complaint here is that comments like "primitive" denigrate the excellent research which had already been done by 1999, and the effusive admiration for the predictions of 1999 diminishes the extent to which those predictions were already true when they were made.



