
I love how crypto has become the easy thing to blame, when in fact datacenter after datacenter is being stuffed to the brim with chips for ML workloads. It's important to recognize when consumers are being squeezed by multiple market forces.



The issue is that crypto mining is the only system designed to have negative marginal utility for each GPU consumed. It literally pays people to take energy and GPUs away from the open market for zero incremental gain to the network.

The difference is that each additional iPhone or ML accelerator or cloud server CPU provides incrementally more utility. More ML accelerators means more ML processing.

Proof-of-work mining is the only system designed to get less efficient with each additional unit of compute power. Add another GPU to the mining network and the network will automatically become less efficient. More GPUs doesn’t mean more transactions, but the system pays people to essentially burn these GPUs.

That’s the complaint. Not that chips are being used, but that the marginal output of each additional GPU is negative.
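To make that concrete, here's a toy model (every number is made up for illustration; real chains differ in the details). Difficulty retargeting holds block, and therefore transaction, throughput fixed no matter how many GPUs join; the only thing that changes is how thinly the fixed reward is spread:

    # Toy proof-of-work model; all numbers are illustrative.
    BLOCKS_PER_DAY = 144      # held fixed by difficulty adjustment
    TXS_PER_BLOCK = 2500      # capped by block size, not by hashrate
    REWARD_PER_BLOCK = 6.25   # fixed issuance schedule

    def daily_stats(n_gpus):
        txs = BLOCKS_PER_DAY * TXS_PER_BLOCK   # independent of n_gpus
        reward_per_gpu = BLOCKS_PER_DAY * REWARD_PER_BLOCK / n_gpus
        return txs, reward_per_gpu

    print(daily_stats(1_000_000))  # (360000, 0.0009)
    print(daily_stats(2_000_000))  # same 360000 txs, half the reward per GPU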


> more ML processing

I would argue that most ML processing by most "big data" companies also has marginal negative output for each additional GPU. Certainly anything to do with social media or adtech and a lot of analytics in general is a net negative for the world, while consuming electricity and processing hardware.


Right, the ML processing is competitive in nature, just like the crypto mining is. Much of it also rests on illegal activity that just hasn't been discovered yet, whether overtly illegal (like botnets sloshing around ad money under fraudulent pretenses) or mildly to moderately illegal, like prima facie unauthorized database usage or use of scraped data. There are also uses of data that are illegal if the data comes from US citizens and ?? if it comes from non-US citizens. Just because the government is collectively a fat slug when it comes to law enforcement in these areas doesn't mean the laws aren't on the books.


You can't seriously be conflating all the money laundering and ransomware attacks enabled by crypto with a tiny bit of crime which happens at the margins of ML (like it might in any industry) can you?


That's a value judgement, which is different from the case with PoW, where the output of each additional GPU is structurally devalued.


It structurally pays less per GPU as you add more GPUs, because the security budget is a zero-sum budget. (though, that's only if you ignore extrinsic effects, and some people argue that more security = higher coin price = higher security budget).

Even if the security budget is zero-sum, the amount of actual security added to the network with each additional GPU is constant.
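A rough sketch of both halves of that claim (idealized: identical GPUs, a fixed subsidy, and an attacker who must match the honest hashrate):

    # Idealized: identical GPUs, fixed subsidy, attacker matches honest hashrate.
    DAILY_SUBSIDY = 900.0  # coins/day paid to all miners combined (zero-sum)

    def reward_per_gpu(n_gpus):
        return DAILY_SUBSIDY / n_gpus     # shrinks as GPUs are added

    def attack_hashrate_needed(n_gpus, hashrate_per_gpu=100e6):
        return n_gpus * hashrate_per_gpu  # grows linearly: constant security per GPU

    for n in (1_000, 2_000, 4_000):
        print(n, reward_per_gpu(n), attack_hashrate_needed(n))

The payout pool is fixed, so reward per GPU halves with each doubling, while the hashrate an attacker must assemble doubles.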


It seems like your argument is that cryptocurrency needs to use over half of the computing power on earth at any given time in order to be considered secure. That's a very bad standard. Visa and MasterCard don't need this much computing power for security.

Security is an easy argument to reach for when you want to justify something stupid.


Even those things you have decided provide only marginal value still employ people, and you never know where innovation will come from. Crypto is comparable to a Ponzi scheme and is an asset with almost no real value, trading only on the greater fool theory.


If you are this reductionist you can argue all electricity consumption is net negative.


They are not being reductionist. Each additional hashing unit cancels out at the next difficulty adjustment, having a net effect of zero on the overall working of the network.

Yes, the hash rate of the network increases, and with it the difficulty for an attacker to perform a double spend. But, arguably, that's already an insane level of resources, many orders of magnitude higher than what a bitcoin-like payment system would reasonably require.

So even if you consider the network socially useful, pouring thousands of times more hashing power into it than the network actually requires to perform its task securely is still a waste of resources.
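For reference, this is roughly how Bitcoin's retargeting rule works (simplified sketch; the real implementation operates on compact "bits" encodings, but the clamped ratio is the same):

    # Every 2016 blocks, scale difficulty by how fast the last window was
    # mined, clamped to a factor of 4 in either direction.
    TARGET_TIMESPAN = 2016 * 600  # seconds: 2016 blocks at one per ~10 minutes

    def retarget(old_difficulty, actual_timespan):
        clamped = min(max(actual_timespan, TARGET_TIMESPAN // 4),
                      TARGET_TIMESPAN * 4)
        return old_difficulty * TARGET_TIMESPAN / clamped

    # Hashrate doubles -> window mined in half the time -> difficulty
    # doubles, and throughput returns exactly to where it started:
    print(retarget(1.0, TARGET_TIMESPAN // 2))  # 2.0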


The amount of value extracted by each miner falls as more compute power starts mining, but that's actually a negative feedback loop that will reduce the total spending on chips for crypto mining.

The amount of security each additional GPU provides to the network is constant: as you add more GPUs, the network gets harder to attack. Does the network need as much security as it's paying for? Nobody is really sure; it's an open debate in the crypto industry.


I really don't see how the current hashrate is required to prevent a 51% attack, that seems way beyond what is necessary.

These Proof of Waste systems aren't really a negative feedback loop; ultimately, asset prices determine mining yield. Capacity will keep being added to the network until the difficulty rises such that you are required to destroy $99.99 of resources for $100 of crypto. Until that point, it's worth buying more hardware and energy, as you get a positive expected return.
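A back-of-the-envelope version of that equilibrium (made-up numbers; the point is the dynamic, not the magnitudes):

    # Rigs keep joining while expected reward per rig exceeds its cost.
    DAILY_ISSUANCE_USD = 100_000.0  # coin reward per day at the market price
    COST_PER_RIG_USD = 10.0         # daily power + amortized hardware per rig

    def equilibrium_rigs():
        # n grows while DAILY_ISSUANCE_USD / n > COST_PER_RIG_USD,
        # so it settles where reward per rig ~= cost per rig
        return DAILY_ISSUANCE_USD / COST_PER_RIG_USD

    print(equilibrium_rigs())  # ~10,000 rigs burning ~$100k/day for $100k of coin

Note that the coin price only enters through DAILY_ISSUANCE_USD: double the price and the equilibrium spend doubles with it.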


The network gets less efficient, but harder to attack. Each incremental spend on a miner incrementally increases the network security.


>The issue is that crypto mining is the only system designed to have negative marginal utility for each GPU consumed. It literally pays people to take energy and GPUs away from the open market for zero incremental gain to the network.

That's because the utility is not in processing transactions but in the 'security' of the blockchain, which is funny considering that most of these blockchains are still looking for a use case that justifies their market cap, yet still run into performance limitations processing transactions.


I have capital I want to deploy. Zero/negative interest rates are a reality. Traditional broker-managed investments are a hot zone for fees. Newer tools for market access are covered by reddit day traders; real estate is also highly competitive due to airbnb businesses and other forces.

Capital will always find the easiest path, and right now that's crypto and no one on HN is complaining about micro-VC activity.


It's mostly not ML either. There were production hiccups due to covid, increased consumer demand for chips due to covid, and increased demand for mobile chips because the mobile industry is still growing massively, among other issues.

Crypto and ML are each rather small aspects of the issue at hand.


Datacenters are not deploying consumer GPUs.


But datacenter CPUs/GPUs take the same wafers, machines, and factory time slots as consumer GPUs. And that competition is the price increase this article is about.


Ok - I wasn’t replying to the article - I was replying to a comment about crypto-induced price increases of consumer gear being blamed on datacenters, which just isn’t true. It’s not that OEM prices are going up, it’s that retail prices are. Nvidia might sell a consumer GPU which is immediately resold on the retail market at 100% markup.

This is driven by crypto, not DL.


> Nvidia might sell a consumer GPU which is immediately resold on the retail market at 100% markup.

The same might be said of event ticket resellers making a big markup.

But the reality is that, in the case of event tickets at least, it's very common for the original sellers and the resellers either to be secretly the same party, or at least have some kind of profit sharing agreement.

It's simply that the original seller doesn't want to be seen to be price gouging, so doesn't want to put the list price of an item up, but still wants to gain. That's frequently done by making it very expensive to become an 'approved seller' or similar.


Agreed. I actually don't have a problem with ticket resellers for the very reason that if there is an active resale market the tickets were too cheap to begin with (or more likely, as you note, were never really available at those prices anyway as the promoters are selling at 2x face through the back door).

If someone is willing to pay 2x MSRP for a GPU then the MSRP is wrong (or rather, it's just a "suggestion" as the S indicates). My only point is that up until recently, there was not a long-term secondary market for GPUs. That is a new phenomenon that has been driven by crypto (as opposed to a big reduction in supply or huge increase in PC gamers).


It's still mediated by the same bottlenecks, right? Nvidia would be able to produce more consumer GPUs if they weren't competing for capacity at TSMC and Samsung.


Not just ML: with everything moving to cloud services while consumers still use phones plus laptops, the number of chips used per person keeps growing.


I'm not an advocate for cloud services by any means, but aren't they generally much more efficient in using their resources than traditional on-premise solutions, due to the sharing of the underlying infrastructure?


> when in fact datacenter after datacenter is being stuffed to the brim with chips for ML workloads.

I don't see this at all. All these GPUs for non-gaming use are a tiny drop in the ocean compared to the mainstream market.



