It seems like a sound principle, in theory, to trade memory for programmer time. After all, one can buy a lot of RAM for the cost of a programmer's workday (even the expensive ECC RAM used in servers).
However, most people don't seem to be able to tell where that stops making sense, and the end result is software that's slow no matter how much memory you throw at it.
Anyone who has worked for any amount of time with "enterprise" software knows what I'm talking about.