
It's a relief that AMD has a performance/watt alternative to Bulldozer. I sure hope they can stay in business so I have someone to buy hardware from that doesn't fuse off features to screw us out of a 65% margin.

Either way, I'm hoping ARM64 will trickle up from iThingies to the desktop so I can buy a CPU with Virtual MMIO without paying an extra hundred bucks.




Maybe their desktop and server CPUs aren't so hot right now, but I don't think you have to worry about AMD for a while. All of the current generation of consoles have AMD GPUs, those GPUs are on the same die as an AMD CPU for two of the three (PS4 and Xbone), AMD GPUs remain competitive with NVidia's offerings, and they seem to be winning mindshare with their lower-level Mantle graphics API.

EDIT: And the whole Bitcoin-mining thing (or Litecoin/Dogecoin mining thing, these days), as mentioned in another thread.


Does AMD GPU hardware have a general advantage in mining?

err, can you link me to the other thread?


Nvidia intentionally cripples the double precision floating point performance of their "gaming" cards to protect the market for Tesla cards.

AMD doesn't, which is why you don't see much FirePro, and why their entire consumer line sells for so much right now while hash mining is big.


Neither sha256 (bitcoin) nor scrypt (litecoin) mining uses floating point operations.

AMD cards are faster because they are built with more, simpler "cores", compared to Nvidia's fewer, more complex "cores". Mining benefits from the increased parallelization and doesn't benefit from Nvidia's "fancy" "cores". For sha256 mining, AMD cards gain an additional advantage by supporting a bit rotation instruction that Nvidia cards do not.
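To illustrate the bit rotation point: SHA-256's compression function uses right rotations on every round. A sketch in Python (function names are mine, not from any particular miner) showing how a rotate has to be emulated with two shifts and an OR when there's no native instruction for it:

```python
def rotr32(x, n):
    """Rotate a 32-bit word right by n bits.

    With a native rotate instruction (as on the AMD cards discussed
    above) this is one hardware operation; without one it costs a
    shift right, a shift left, and an OR -- extra work on every one
    of SHA-256's 64 rounds.
    """
    return ((x >> n) | (x << (32 - n))) & 0xFFFFFFFF

def big_sigma0(x):
    """One of SHA-256's round functions: three rotates and two XORs."""
    return rotr32(x, 2) ^ rotr32(x, 13) ^ rotr32(x, 22)
```

Multiply that per-round overhead by billions of hashes per second and the instruction-set difference shows up directly in the hash rate.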


AMD GPUs are about 4x faster for bitcoin mining and about 2x faster for litecoin mining compared to similarly priced/energy consuming Nvidia GPUs.


> err, can you link me to the other thread?

https://news.ycombinator.com/item?id=7141039

They just mentioned bitcoin mining and didn't go into any detail.

> Does AMD GPU hardware have a general advantage in mining?

Yes. Disclaimer: I don't mine anything, am not that familiar with how Bitcoin works, and mostly heard about this stuff on the grapevine, so some of this is probably off. Corrections are welcome, if there are any cryptocurrency experts lurking.

At first, Bitcoin mining was done on CPUs. It uses SHA-256 for hashing, which (relatively speaking) isn't that difficult to compute. At some point, someone developed a GPU implementation for it, after which GPU mining quickly overtook CPU mining in terms of cost efficiency. The problem is easily parallelizable (just run a separate hashing procedure on each of the many processing elements in the average GPU), making it a better fit for GPUs than CPUs.
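The structure of the search is easy to see in code. This is a toy sketch, not real Bitcoin mining (a real block header is 80 bytes hashed against a full 256-bit target), but it keeps the key shape: double SHA-256 over (header || nonce), where every nonce is independent, so the work splits cleanly across thousands of GPU processing elements:

```python
import hashlib
import struct

def mine(header_prefix: bytes, difficulty_zero_bytes: int, max_nonce: int):
    """Try nonces until double-SHA-256(header || nonce) starts with
    enough zero bytes. Returns (nonce, digest) on success, None otherwise."""
    for nonce in range(max_nonce):
        candidate = header_prefix + struct.pack("<I", nonce)
        digest = hashlib.sha256(hashlib.sha256(candidate).digest()).digest()
        if digest[:difficulty_zero_bytes] == b"\x00" * difficulty_zero_bytes:
            return nonce, digest
    return None

# With a 1-byte target (~1 in 256 odds per nonce) a hit comes quickly;
# real difficulty makes trillions of attempts necessary, which is the
# whole reason raw parallel throughput wins.
result = mine(b"example block header", 1, 1_000_000)
```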

The short reason[1] that AMD GPUs are better than NVidia GPUs for this purpose is that NVidia's GPUs use fewer, more powerful processing elements, while AMD's use more numerous but simpler ones (a processing element is basically marketing speak for each of the dozens or hundreds of small, specialized "CPUs" that make up a GPU). For mining Bitcoins, the extra capabilities of the NVidia processing elements basically go to waste, while AMD's cards, with their more numerous, lower-power elements, give you both "more hash for the dollar" and "more hash for the megawatt hour." This caused a huge spike in the value of ATI cards, completely unrelated to demand for PC gaming.

However, because this was a trivially parallelizable problem and there is big money at stake, miners came up with FPGA-based solutions (and later ASICs) dedicated to the purpose of mining Bitcoins, which in turn took the Bitcoin mining throne from GPUs. As I understand it, at this point mining Bitcoin with a GPU is a net-negative, and you need an ASIC farm to actually make anything off the operation.

At another point, Litecoin came around, and one of its design goals was to be practical to mine only on CPUs, so that Bitcoin miners could make use of the underutilized CPU in their PC-based mining setups. For this purpose it used the scrypt algorithm in place of SHA-256. scrypt was designed to be computationally expensive and difficult to implement practically on FPGAs or ASICs (and therefore more resistant against brute-force password hashing attacks), in particular by requiring far more memory than it would make sense to allocate to a hashing unit on dedicated hardware. Unexpectedly, someone came up with a performant GPU implementation for that as well, giving AMD GPUs back the throne for mining (of Litecoins, and the Litecoin-derived Dogecoin). At this point, there is no sign of a cost-effective FPGA or ASIC-based scrypt mining device, so it looks like things will remain that way for a while.
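The memory-hardness is visible in scrypt's cost parameters, which Python's standard library exposes directly. A small sketch (the input bytes here are placeholders, not a real block header):

```python
import hashlib

# scrypt's cost parameters: n (CPU/memory cost), r (block size), p
# (parallelization). Memory use is roughly 128 * r * n bytes, which is
# what makes dedicated scrypt hardware expensive to build. Litecoin
# uses n=1024, r=1, p=1 -- about 128 KB of state per hash, which is
# small by scrypt standards but still vastly more than SHA-256's
# handful of 32-bit words.
digest = hashlib.scrypt(b"block header bytes", salt=b"block header bytes",
                        n=1024, r=1, p=1, dklen=32)
```

(Requires Python 3.6+ built against OpenSSL with scrypt support.)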

[1] The gory details here, including something I wasn't aware of until now: NVidia GPUs lack an instruction for a certain operation necessary for SHA-2 hashing that costs them a couple of instructions each loop to emulate. AMD GPUs do have an instruction for it, so this automatically gave them another advantage over NVidia GPUs for mining purposes. Not sure if this applies to scrypt as well, but I'd guess so, since it was derived from SHA-2.

https://en.bitcoin.it/wiki/Why_a_GPU_mines_faster_than_a_CPU...


As far as I understand, this advantage of AMD's was only - or mostly - present on its 5xxx and 6xxx series of GPUs. The new 7xxx (and 8xxx) series ditches VLIW: http://www.anandtech.com/show/4455/amds-graphics-core-next-p...



