FruityLoops and Reason compete with Garage Band too closely, application denied. Firefox competes with Safari, application delayed and then withdrawn...
Don't forget: GPS, accelerometer, gyroscope, microphone, second microphone, still/video camera, second still/video camera, flash/light, touchscreen, proximity sensor, light sensor, and to top it all off an internal battery with a minimum life of 6 hours.
Apropos the prognostications about circa-2020 brain implants, when I was a twenty-something, back in the mid-1980s, I read Neuromancer.
And I wanted to be the first guy on my block to have a cyberspace jack in his skull.
Now that I am middle-aged, cynical, and slightly chewed up by the dot-com 1.0 startup experience, I want to be the first guy on my block to have, for a brain implant ... a firewall!
(One lesson I've drawn from 20 years of reading comp.risks is that the closer we integrate software into our lives, the deadlier the consequences of security exploits. Also? The exploit may be transient but the fallout from it can potentially last for a lifetime.)
Taking another stride forward, your brain implant in 2020 will, then, be faster than an i7, have 16GB of RAM, 2TB of storage, and be able to re-render the whole of Toy Story 3 in 10 minutes. Somehow, that doesn't even sound surprising to me anymore.
Except that it will not be an implant because individual neurons in the brain have an irritatingly low bandwidth of only a few bits per sec.
Instead it is much cheaper and hassle-free to use the firehose connection that already exists into the brain: the eyes, for which communication protocols are already established. Screens in contact lenses are my bet on where the future of computers is headed. That will, with time, largely remove the need for physical objects that transfer information.
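A rough sense of why the eyes win as a channel. The fiber count and per-fiber rate below are commonly cited order-of-magnitude assumptions, not figures from this thread:

```python
# Back-of-the-envelope: eyes vs. single-neuron taps as an input channel.
# All figures are rough assumptions for illustration.

axons_per_optic_nerve = 1_000_000   # commonly cited order of magnitude
bits_per_axon_per_sec = 10          # assumed effective rate per fiber

eye_bandwidth_bps = axons_per_optic_nerve * bits_per_axon_per_sec
single_neuron_bps = 5               # "a few bits per sec", as above

print(f"optic nerve: ~{eye_bandwidth_bps / 1e6:.0f} Mbit/s")
print(f"ratio vs. one neuron: ~{eye_bandwidth_bps // single_neuron_bps:,}x")
```

Even if those numbers are off by an order of magnitude either way, the gap between one neuron and the optic nerve is enormous.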
The interesting part is how Apple in 2020 will spin the fact that, ten years earlier, they named a display far too bulky to fit into your eye the "Retina display".
It might be possible to connect the brain implant to the brain in a similar way that the eyes are connected to it, thus creating an additional input instead of using an existing one. Brains seem to have an uncanny ability to adapt to new inputs.
We're already using the tongue to replace optical input for people who have been blinded.
There's also the possibility of tapping in at the ears or the spinal cord; hell, it's entirely conceivable you could cut off a hand and attach an implant there. When you think about the bandwidth your mind has between your hand and your brain (seriously, move your hand a bit and think about how much data that is in absolute terms), there's a whole lot of bandwidth in established connections, especially for someone who has been maimed and has the brain pathways for controlling a limb lying dormant.
That's the one piece I pulled out of my ass to conclude with a bit of hyperbole :-) I doubt a 2010 PC could render Toy Story 3 in a year if things are similar now to how they were with Toy Story 1 in 1995. I'd also be curious to see how modern PCs would cope with Toy Story 1 though..
[Update: If it helps rescue me, I found a page about the difference between rendering TS1 and TS3 in their respective times:
It turns out the original Toy Story took an hour per frame to render, whereas Toy Story 3 can render in real-time (or 24fps, at least.) Given that's probably on a server farm, my claim is still hyperbole, unless you expect cloud processing to be mainstream on all devices by then ;-)]
Assuming a single machine (which is obviously not true, but I can't find any details about the size of the render farm), that'd mean the entire film would take something like two hundred and thirty seven years to render.
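For what it's worth, the arithmetic is easy to sketch. The film length, frame rate, and hours-per-frame below are assumptions for illustration, not figures from Pixar; on these assumptions, the 237-year figure works out to roughly 14 hours per frame:

```python
# Sketch of the single-machine render-time arithmetic.
# Film length and hours-per-frame are illustrative assumptions.

def render_years(film_minutes: float, fps: int, hours_per_frame: float) -> float:
    frames = film_minutes * 60 * fps
    return frames * hours_per_frame / (24 * 365)

# Toy Story 1 at the quoted ~1 hour/frame, assuming a ~103-minute film:
print(f"TS1 at 1 h/frame:  ~{render_years(103, 24, 1):.0f} years")
# The 237-year claim implies something like 14 hours per frame:
print(f"At 14 h/frame:     ~{render_years(103, 24, 14):.0f} years")
```

So even the 1995 per-frame rate puts a single machine well past a decade for the whole film.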
Also, note that the WSJ article you linked to is talking about Toy Story 3D which is a re-rendering of the original film in stereoscopic 3D.
Mobile phones lagging PCs by roughly ten years, with PCs in turn lagging supercomputers by roughly ten years, has been a fairly consistent pattern for quite some time. At least for the decade I've been working with mobile phones.
The Apple /// is a bit misleading - the typical entry-level Apple computer in 1980 was an Apple ][+, at about half the specs (memory and CPU performance) and a cost of around $2k, with the monitor and floppy drive thrown in.
Your statement is more fallacious: the iPod Touch has the same hardware, but a TCO exactly equivalent to its retail price. Similarly, you could buy an unlocked iPhone and then just turn off the 3G radio. It would still be strictly more powerful.
To pick nits, the iPod touch lacks the camera and GPS. An unlocked iPhone is more expensive than a locked iPhone. None of that makes the grandparent post correct, but it does bear pointing out that your alternatives are either inferior or cost more than $200.
The N900 works great as a wifi phone (since it has Skype, Google Talk and Google Voice) and costs $580. Of course, it's also a nearly complete Linux computer.