I know precious little about hardware, so I'm curious: how faithful an emulation does this sort of methodology produce? I remember reading this article (http://arstechnica.com/gaming/2011/08/accuracy-takes-power-o...) about the difficulties inherent in true software emulation, and I was wondering if someone could relate the two in a way a hardware/emulation newbie could understand.
Since it's implemented on an FPGA, it's more or less recreating the original (digital) hardware. As the author points out, implementing the original CPU instructions per the spec isn't that challenging; in my 2nd and 3rd year computer engineering courses we implemented CPUs of similar complexity. Admittedly, getting them to run in actual hardware rather than simulation is sometimes a hurdle.
The thing is, game designers were all about squeezing extra functionality out of the hardware. If they could take advantage of something that isn't in the spec (like the extra, undocumented instructions the author mentions), they probably did. So you have to find and reproduce all the common misbehaviours, some of which may have been side effects of analog components in the console. And the analog hardware on your FPGA board is probably nothing like what the original console contained, so the sound and video output are hard, maybe impossible, to get exactly right.
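To make that concrete, here's a minimal VHDL sketch of what a decode step looks like, and why the "treat anything unrecognised as a NOP" default that a spec-only implementation might use is exactly what bites you. The opcode values are real 6502 ones, but the entity, port names, and alu_op encodings are made up for illustration, this is nothing like a complete core:

    library ieee;
    use ieee.std_logic_1164.all;

    entity decode_sketch is
      port (
        opcode : in  std_logic_vector(7 downto 0);
        alu_op : out std_logic_vector(2 downto 0);  -- made-up encoding
        wr_a   : out std_logic;   -- write enable, accumulator
        wr_x   : out std_logic    -- write enable, X register
      );
    end entity;

    architecture rtl of decode_sketch is
    begin
      process (opcode)
      begin
        -- defaults: anything unrecognised behaves as a NOP
        alu_op <= "111";
        wr_a   <= '0';
        wr_x   <= '0';
        case opcode is
          when x"A9" =>               -- LDA #imm, documented
            alu_op <= "000"; wr_a <= '1';
          when x"A2" =>               -- LDX #imm, documented
            alu_op <= "000"; wr_x <= '1';
          when x"A7" =>               -- unofficial "LAX": loads A and X
            alu_op <= "000";          -- at once, commonly attributed to
            wr_a <= '1';              -- overlapping decode lines in the
            wr_x <= '1';              -- real chip's decode PLA
          when others =>
            null;                     -- a spec-only core stops here and
        end case;                     -- silently breaks any game using x"A7"
      end process;
    end architecture;

The documented cases come straight from the datasheet; the x"A7" case only gets in there if someone has reverse-engineered what the real silicon happens to do with that bit pattern.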
In short, conditions like the deadlocks they discuss are easier to avoid in a VHDL design, but things like the exact colours in the video output may be impossible to get right (the author points this out as well).
On a side note, I don't fully agree with the linked article's 'twice as accurate, twice as slow' hypothesis. A lot of the cases it discusses are either analog effects, which are expensive to reproduce in a digital emulator, or hardware edge cases, which add little processing overhead. Recreating a quirk can be computationally very expensive or nearly free, and there's no reason to expect those costs to average out over time.