It's an interesting article because it suggests it might be possible for smart enough software running on a general-purpose parallel CPU to compete with a GPU when rendering graphics, using the same amount of silicon.
...so GPUs are just a temporary hack, waiting for smart enough software (and some tweaking of the CPU instruction set) to catch up.
Hardware is always a temporary hack.
There's always a back-and-forth going on between specialized hardware and the general-purpose CPU. In the end, though, software always (slightly) outpaces the hardware.
When all is said and done, though, we'll be moving to realtime raytracing/radiosity for 3D rendering.
That's silly. Programmers are being brow-beaten into radically changing their programming style to accommodate a drastic 90-degree turn in the evolution of hardware, from general-purpose serial to parallel and SIMD. Look at this rasterizer, for instance. It's comically inefficient on most hardware of the last decade. If this keeps up, optimal serial algorithms are going to become just academic ideals like Turing machines, never to be implemented.
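For reference, here's a minimal scalar sketch of the half-space (edge-function) style of rasterizer under discussion; the function names, struct, and framebuffer layout are my own assumptions, not code from the article. Every pixel in the bounding box is tested independently against the three edges, which is exactly what makes it SIMD-friendly, and exactly what makes it wasteful when run one pixel at a time:

    #include <stdint.h>

    static int mini(int a, int b) { return a < b ? a : b; }
    static int maxi(int a, int b) { return a > b ? a : b; }

    typedef struct { int x, y; } Vec2i;

    /* Signed-area test: which side of the directed edge a->b does p lie on? */
    static int edge(Vec2i a, Vec2i b, Vec2i p) {
        return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
    }

    /* Fill a triangle whose vertices are wound so that interior points
       make all three edge() values non-negative. */
    void fill_triangle(uint32_t *fb, int stride, uint32_t color,
                       Vec2i v0, Vec2i v1, Vec2i v2) {
        int minx = mini(v0.x, mini(v1.x, v2.x));
        int maxx = maxi(v0.x, maxi(v1.x, v2.x));
        int miny = mini(v0.y, mini(v1.y, v2.y));
        int maxy = maxi(v0.y, maxi(v1.y, v2.y));

        for (int y = miny; y <= maxy; y++) {
            for (int x = minx; x <= maxx; x++) {
                Vec2i p = { x, y };
                /* Three independent tests per pixel, no state shared with
                   neighboring pixels: 4, 8, or 16 pixels can be evaluated
                   per SIMD instruction. A serial scanline rasterizer does
                   less arithmetic overall but has a dependent inner loop. */
                if (edge(v0, v1, p) >= 0 &&
                    edge(v1, v2, p) >= 0 &&
                    edge(v2, v0, p) >= 0)
                    fb[y * stride + x] = color;
            }
        }
    }

The design trade is the whole point of the thread: the parallel version deliberately does more work per pixel to remove the serial dependencies.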
You could say there's a back-and-forth between hardware and software, but to me it looks like software is playing catchup. "Oh I've got 16-wide vectors now? Ok let me go back and completely change everything." "What, I can't load a vector from an odd-numbered array position? Hmm, there goes THAT algorithm." The electrical engineers are running us.
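To make the alignment gripe concrete: with the original SSE intrinsics (a real API, though this snippet and its names are just an illustration), the "fast" 16-byte load simply faults if you point it one float into the array:

    #include <xmmintrin.h>

    /* GCC/Clang-style alignment attribute; MSVC spells it
       __declspec(align(16)). */
    float data[8] __attribute__((aligned(16)));

    void loads(void) {
        __m128 a = _mm_load_ps(&data[0]);  /* aligned load: fine           */
        __m128 b = _mm_load_ps(&data[1]);  /* misaligned: faults at runtime */
        __m128 c = _mm_loadu_ps(&data[1]); /* unaligned load: legal, but   */
                                           /* notably slower on older CPUs */
        (void)a; (void)b; (void)c;
    }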
The silicon just won't go any faster. What do you want the electrical engineers to do? Moore got you hooked on his junk and now his supply's been cut off. People tried to switch you to something more sustainable, i.e. anything but x86 and C, but you couldn't wait for your fix.
A real hacker should be happy that there are fresh new problems to solve. Or, if you just want to get shit done, use a dynamic language and be glad that CPU speed is rarely an issue any more.
Yes, but you'll notice that it hasn't been a one-sided thing; the GPU and CPU have moved toward each other as it became clear that the general-purpose problems that needed to be solved were addressed rather nicely by both.
A young woman is preparing her first Thanksgiving dinner. As she gets everything ready for Thanksgiving Day, she very sternly reminds herself to let the turkey finish thawing in the sink overnight. She puts it in and places the dish rack over the top of the bird. Her husband walks into the kitchen and sees this. "Why are you doing that?" he asks.
"My mom always did that to help the turkey thaw," she told him.
The next day Mom calls to see how everything is going. "Fine, Ma. I have everything ready to go in the oven. I even remembered to put the rack over the turkey last night."
This seemed to confuse her mother a bit. "What are you talking about?" she asked.
"Oh, I remember you always put the dish rack over the turkey when it was thawing in the sink," she said.
There was a pause at the other end of the line. "Yes, but honey, we had cats!"
This isn't a perfect metaphor, but turkey == rasterizer and cats == hardware. Circumstances change, and that moves around where the optimum sits in the solution space. (This was far less obvious than "you don't need the dish rack," but the idea is the same.)
I miss the classic Abrash where he begins with some personal anecdote out of nowhere:
http://www.bluesnews.com/abrash/chap68.shtml
Hey, I was going to post a link to the PDF chapters of his Black Book, the massive anthology of his 90's PC game programming articles, but all I can find are broken links and a mirror on gamedev.net with no links to it. So here you go, for all you people living in 1997:
http://abrashblackbook.infogami.com/