2000 iMac compared to the 2010 iPhone (arstechnica.com)
124 points by jlangenauer on June 21, 2010 | hide | past | favorite | 31 comments



You even had to run unapproved software on the 2000 iMac! How far we've come.


FruityLoops and Reason compete with Garage Band too closely, application denied. Firefox competes with Safari, application delayed and then withdrawn...


Yes, it is a damn shame that Apple imposes its crazy App Store rules on the Mac as well.

(I get what you're trying to say, but that's hardly the point of the post)


Downvotes? Uh, okay.


Which led to terrible things like being able to run Flash on it, if you wanted to.


Don't forget: GPS, accelerometer, gyroscope, microphone, second microphone, still/video camera, second still/video camera, flash/light, touchscreen, proximity sensor, light sensor, and to top it all off an internal battery with a minimum life of 6 hours.


Apropos the prognostications about circa-2020 brain implants, when I was a twenty-something, back in the mid-1980s, I read Neuromancer.

And I wanted to be the first guy on my block to have a cyberspace jack in his skull.

Now that I am middle-aged and cynical and slightly chewed up by the dot-com 1.0 startup experience, I want to be the first guy on my block to have, for a brain implant ... a firewall!

(One lesson I've drawn from 20 years of reading comp.risks is that the closer we integrate software into our lives, the deadlier the consequences of security exploits. Also? The exploit may be transient but the fallout from it can potentially last for a lifetime.)


The guys behind it'll be the dealers. And given their record - basically irreversible Parkinsonism from one tainted shot of a designer drug (http://en.wikipedia.org/wiki/MPPP, http://en.wikipedia.org/wiki/MPTP) - you first.


Taking another stride forward, your brain implant in 2020 will, then, be faster than an i7, have 16GB of RAM, 2TB of storage, and be able to re-render the whole of Toy Story 3 in 10 minutes. Somehow, that doesn't even sound surprising to me anymore.
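The prediction above is essentially a Moore's-law extrapolation. A back-of-envelope sketch, assuming a roughly two-year doubling period and iPhone 4-class 2010 baseline figures (both assumptions mine, not from the thread): it lands exactly on the 16 GB RAM figure, though only around 1 TB of the predicted 2 TB of storage.

```python
# Rough Moore's-law extrapolation from 2010 phone specs to 2020.
# Assumed: a ~2-year doubling period and iPhone 4-class baselines.
def extrapolate(value_2010, years=10, doubling_period=2.0):
    """Scale a 2010 spec forward assuming exponential doubling."""
    return value_2010 * 2 ** (years / doubling_period)

ram_gb_2010 = 0.5      # iPhone 4: 512 MB of RAM
storage_gb_2010 = 32   # largest 2010 storage option

print(extrapolate(ram_gb_2010))      # -> 16.0 GB
print(extrapolate(storage_gb_2010))  # -> 1024.0 GB, i.e. ~1 TB
```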


Except that it will not be an implant because individual neurons in the brain have an irritatingly low bandwidth of only a few bits per sec.

Instead, it is much cheaper and more hassle-free to use the firehose connection that already exists into the brain: the eyes, for which communication protocols are already established. Screens in contact lenses are my bet on where the future of computers is headed. Over time, that will largely remove the need for physical objects that transfer information.
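The "firehose" framing can be sanity-checked with rough numbers. Assuming roughly a million fibers per optic nerve and on the order of ten bits per second per fiber (both commonly cited order-of-magnitude estimates, not figures from the thread), the eye's aggregate input dwarfs a single-neuron tap:

```python
# Back-of-envelope check: even at a few bits per second per neuron,
# the optic nerve's ~1 million parallel fibers add up quickly.
# Both figures below are rough order-of-magnitude assumptions.
fibers_per_optic_nerve = 1_000_000
bits_per_fiber_per_sec = 10

bandwidth_bps = fibers_per_optic_nerve * bits_per_fiber_per_sec
print(bandwidth_bps / 1e6)  # -> 10.0, i.e. ~10 Mbit/s per eye
```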

The interesting part is how Apple in 2020 will spin the fact that, ten years earlier, they named a display far too bulky to fit into your eye the "Retina display".


It might be possible to connect the brain implant to the brain in a similar way to how the eyes are connected, thus creating an additional input instead of reusing an existing one. Brains seem to have an uncanny ability to adapt to new inputs.


We're already using the tongue to replace optical input for people who have been blinded.

There's also the possibility of tapping in at the ears or the spinal cord; hell, it's entirely conceivable you could cut off a hand and attach an implant there. When you think about how much data flows between your hand and your brain (seriously, move your hand a bit and consider how much information that is in absolute terms), there's a whole lot of bandwidth in established connections - especially for someone who has been maimed and has dormant brain pathways for controlling a limb.


What numbers did you base the Toy Story claim on? I'm curious.


That's the one piece I pulled out of my ass to conclude with a bit of hyperbole :-) I doubt a 2010 PC could render Toy Story 3 in a year if things are similar now to how they were with Toy Story 1 in 1995. I'd also be curious to see how modern PCs would cope with Toy Story 1, though...

[Update: If it helps rescue me, I found a page about the difference between rendering TS1 and TS3 in their respective times:

http://watchingapple.com/2009/09/pixars-blistering-rendering...

It turns out the original Toy Story took an hour per frame to render, whereas Toy Story 3 can render in real time (or at 24fps, at least). Given that's probably on a server farm, my claim is still hyperbole, unless you expect cloud processing to be mainstream on all devices by then ;-)]


The Wired article claims that the average render time is seven hours per frame: http://www.wired.com/magazine/2010/05/process_pixar/all/1

Assuming a single machine (which is obviously not true, but I can't find any details about the size of the render farm), that'd mean the entire film would take something like two hundred and thirty-seven years to render.

Also, note that the article you linked to is talking about Toy Story 3D, which is a re-rendering of the original film in stereoscopic 3D.


Sorry! I hand in my reading comprehension card forthwith :-(

However, it was all worth it for your great citation. Thanks!


The article talks about the re-render of Toy Story 1 in 3D, not Toy Story 3.


Mobile phones lagging PCs by roughly ten years, which in turn lag supercomputers by roughly ten years, has been a fairly consistent pattern for quite some time - at least for the decade I've been working with mobile phones.


Hopefully in 2020 we can do this comparison with our physiological selves and be amazed - the technology leaps don't even wow me anymore.

You can call me spoiled.


1990: Mac LC

- $2400

- Mac OS System 6

- Motorola 68020 @ 16 MHz

- 40 MB Hard Drive

- 2 to 10 MB RAM

- 256 kB Graphics Memory

1980: Apple ///

- $7800

- Apple SOS

- Synertek 6502A @ 2 MHz

- 140 kB 5.25" floppy disk

- 128 kB RAM


The Apple /// is a bit misleading - the typical entry-level Apple computer in 1980 was an Apple ][+, at about half the specs (memory and CPU performance) and a cost of around $2k, with the monitor and floppy drive thrown in.


Very soon we won't need computers at all, just chips implanted in our skulls.


Ray Kurzweil was right!


perhaps you meant Gordon Moore?


Crazy to think my $200 phone is now more powerful than my $1200 desktop Mac was only ten years ago.

Same old fallacy again: the iPhone's total cost of ownership is much more than $1200.


Your statement is more fallacious: the iPod Touch has the same hardware, but a TCO exactly equivalent to its retail price. Similarly, you could buy an unlocked iPhone and then just turn off the 3G radio. It would still be strictly more powerful.


To pick nits, the iPod touch lacks the camera and GPS. An unlocked iPhone is more expensive than a locked iPhone. None of that makes the grandparent post correct, but it does bear pointing out that your alternatives are either inferior or cost more than $200.


Do you pirate all the software for your iPod touch?

Also, please let me know when I can edit video and the like on it.


No matter what phone you use, you have to pay a monthly fee to use it... If you buy the hardware alone, it's less than $1200.


The N900 works great as a wifi phone (since it has Skype, Google Talk and Google Voice) and costs $580. Of course, it's also a nearly complete Linux computer.


ever heard of prepaid SIM cards?




