> err, can you link me to the other thread?

https://news.ycombinator.com/item?id=7141039

They just mentioned Bitcoin mining and didn't go into any detail.

> Does AMD GPU hardware have a general advantage mining?

Yes. Disclaimer: I don't mine anything, am not that familiar with how Bitcoin works, and mostly heard about this stuff on the grapevine, so some of this is probably off. Corrections are welcome, if there are any cryptocurrency experts lurking.

At first, Bitcoin mining was done on CPUs. It uses SHA-256 for hashing, which (relatively speaking) isn't that difficult to compute. At some point, someone developed a GPU implementation for it, after which GPU mining quickly overtook CPU mining in terms of cost efficiency. The problem is easily parallelizable (just run a separate hashing procedure on each of the many processing elements in the average GPU), making it a better fit for GPUs than CPUs.
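
To make the parallelism concrete, here's a minimal sketch of the proof-of-work search in Python (the header prefix and target are placeholder assumptions; a real miner gives each GPU processing element its own nonce range instead of running a serial loop):

    import hashlib
    import struct

    def mine(header_prefix: bytes, target: int, max_nonce: int = 2**32):
        """Scan nonces until double-SHA256(header) falls below the target."""
        for nonce in range(max_nonce):
            # The 80-byte Bitcoin block header = 76-byte prefix + 4-byte nonce.
            header = header_prefix + struct.pack("<I", nonce)
            # Bitcoin hashes the header twice with SHA-256.
            digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()
            # The digest is compared to the target as a little-endian integer.
            if int.from_bytes(digest, "little") < target:
                return nonce, digest
        return None

Every nonce is independent, which is exactly why the search maps so well onto thousands of GPU processing elements.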

The short reason[1] that AMD GPUs are better than NVidia GPUs for this purpose is that NVidia's GPUs use fewer, more powerful processing elements, while AMD's use more numerous, individually weaker ones (a processing element is basically marketing speak for each of the dozens or hundreds of small, specialized "CPUs" that make up a GPU). For mining Bitcoins, the extra capabilities of NVidia's processing elements basically go to waste, while AMD's cards, with their more numerous, less power-hungry elements, give you both "more hash for the dollar" and "more hash for the megawatt-hour." This caused a huge spike in the price of ATI cards, completely unrelated to demand for PC gaming.

However, because this was a trivially parallelizable problem with big money at stake, miners came up with FPGA-based solutions (and later ASICs) dedicated to mining Bitcoins, which in turn took the Bitcoin mining throne from GPUs. As I understand it, at this point mining Bitcoin with a GPU is a net loss (the electricity costs more than the coins it yields), and you need an ASIC farm to actually make anything off the operation.

At another point, Litecoin came around, and one of its design goals was to be practical to mine only on CPUs, so that Bitcoin miners could make use of the underutilized CPU in their PC-based mining setups. For this purpose it used the scrypt algorithm in place of SHA-256; scrypt was designed to be computationally expensive and impractical to implement on FPGAs or ASICs (and therefore more resistant to brute-force password hashing attacks), in particular by requiring far more memory than it would make sense to allocate to each hashing unit on dedicated hardware. Unexpectedly, someone came up with a performant GPU implementation for that as well, giving AMD GPUs back the throne for mining (of Litecoins, and the Litecoin-derived Dogecoin). At this point, there is no sign of a cost-effective FPGA- or ASIC-based scrypt mining device, so it looks like things will remain that way for a while.
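
For a sense of the scale involved, here's a sketch of a Litecoin-style scrypt hash using Python's hashlib. The N=1024, r=1, p=1 parameters are the ones commonly cited for Litecoin, and the block header doubles as the salt; treat those specifics as assumptions:

    import hashlib

    def scrypt_pow_hash(header: bytes) -> bytes:
        # The cost parameter n forces roughly 128 * n * r bytes of scratchpad
        # per hash (about 128 KiB here): the memory-hardness that makes each
        # hashing unit on dedicated hardware expensive.
        return hashlib.scrypt(header, salt=header, n=1024, r=1, p=1, dklen=32)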

[1] The gory details here, including something I wasn't aware of until now: NVidia GPUs lack a single-instruction bitwise right-rotate, an operation SHA-2 hashing leans on heavily, so emulating it costs them a couple of extra instructions per loop iteration. AMD GPUs do have such an instruction, which automatically gives them another advantage over NVidia GPUs for mining purposes. Not sure if this applies to scrypt as well, but I'd guess so, since scrypt uses SHA-256 internally (in its PBKDF2 step).

https://en.bitcoin.it/wiki/Why_a_GPU_mines_faster_than_a_CPU...
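
For illustration, the emulation the footnote describes looks roughly like this in Python (on AMD hardware of that era, a single bit-align instruction reportedly covered the whole operation):

    def rotr32(x: int, n: int) -> int:
        # Two shifts plus an OR: roughly the sequence NVidia GPUs had to emit
        # in place of a single hardware rotate instruction.
        return ((x >> n) | (x << (32 - n))) & 0xFFFFFFFF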



As far as I understand, this advantage of AMD's was only (or mostly) present on its 5xxx and 6xxx series of GPUs. The newer 7xxx (and 8xxx) series ditch VLIW: http://www.anandtech.com/show/4455/amds-graphics-core-next-p...



