GPU Price Tracker (unitedcompute.ai)
50 points by ushakov 11 hours ago | 39 comments





A price tracker should be more sophisticated than pulling a single listing from Amazon and dumping whatever price a random third-party reseller happens to be asking.

The prices here are not indicative of the market at all. They're a single sample from the most egregious offender.

More data points for a 5090 Founders Edition:

   - Amazon $4,481

   - StockX $3,447

   - eBay   $3,500-$4,000

I hope whatever product "United Compute" is selling is more thoughtfully constructed than this.
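
For what it's worth, aggregating a few sources and taking the median would already be far more robust than a single listing. A minimal sketch in Python, hard-coding the data points above as placeholders:

    # Estimate a market price from several listings instead of trusting
    # a single Amazon reseller. Prices are the data points quoted above;
    # the eBay range is kept as its two endpoints.
    from statistics import median

    listings = {
        "Amazon": [4481.0],
        "StockX": [3447.0],
        "eBay": [3500.0, 4000.0],  # low/high of the observed range
    }

    # Flatten all observed prices into one sample.
    prices = [p for quotes in listings.values() for p in quotes]

    # The median resists a single egregious outlier far better than
    # whatever the first Amazon listing happens to ask.
    print(f"median market price: ${median(prices):,.0f}")   # -> $3,750
    print(f"single Amazon sample: ${listings['Amazon'][0]:,.0f}")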

I find skinflint[0] is good for this sort of long-term tracking.

[0] https://skinflint.co.uk/?cat=gra16_512&view=gallery&pg=1&v=e...


It's a .ai domain... they're selling a "Wrapped LLM".

Furthermore, the menu is missing a close button, the components look like shadcn or AI-generated output, and overall the site is not well optimized for mobile.

Also, listing Coca-Cola in the team section without any indication of a partnership or investment - likely as a joke - is not a smart move.

It looks like - and probably is - a random assortment of projects from a single person; the "branding" simply doesn't reflect this.


I own Nvidia stock; I believe it will still climb higher than ever before. But at home, my setup has AMD components because they're better value.

I'm more into AMD cards than anything; I wish this site tracked AMD prices as well.


> I believe they will still climb higher than ever before.

I think this expectation is already priced in. I invested when I saw LLMs kicking off with no reflection in the NVIDIA share price and made 10x when the market caught up.

Now, with tariff uncertainty and Trump signalling to China (via Russia) that there would be no repercussions for invading Taiwan, I'm less convinced there is growth there, and there's the possibility of massive loss. In the days of meme stocks this might not matter, of course.

Note that an invasion of Taiwan would have huge implications everywhere, but any company that needs leading-edge semiconductors to directly sell its products would be screwed more than others.


Why is memory stuck at such low values while applications clearly demand more?

My guess would be 'artificial scarcity for the purpose of market segmentation', because people probably wouldn't buy that many of the expensive professional cards if the consumer cards had a ton of VRAM.

I'm surprised the RTX cards don't have Terms of Use that prohibit running CUDA on them. They already removed NVLink from the 40-series onward. Maybe running 8K VR could use the 32GB on the 5090, but I can't imagine much else that isn't compute.

I'm looking forward to newer APUs with onboard 'discrete' GPUs and quad or more channel LPDDR5X+ and 128GB+ unified memory that costs less than an M3 Ultra.


HBM is in very limited supply, and Nvidia tries to buy all the stock it can find at any price[1][2]. So the memory literally couldn't be increased.

[1]: https://www.nextplatform.com/2024/02/27/he-who-can-pay-top-d...

[2]: https://www.reuters.com/technology/nvidia-clears-samsungs-hb...


how about GDDR?

Ok, time to start supporting another brand folks.

Nvidia indeed does this with the *60 cards, which are limited to 8 GB. They probably copied this upselling strategy from Apple laptops.

Except now Apple, with its shared VRAM/RAM model, has better deals than Nvidia, especially past 24GB of VRAM (for inference at least).

A Macbook or Mac Mini with 32GB as a whole system is now cheaper than a 24GB Nvidia card.


Applications that consumers use (games and desktop) work fine with the amount of memory that consumer GPUs have.

GPUs targeting more RAM-hungry applications exist, but they’re quite a bit more expensive, so people who play games buy gaming GPUs while people who need more VRAM buy cards targeting that application.

Why would a consumer want to pay for 40GB of VRAM if 12GB will do everything they need?


> work fine with the amount of memory that consumer GPUs have

Most consumers buy GPUs to play videogames. Recently, nVidia launched two flavors of the 5060 Ti consumer GPU, with 8GB and 16GB memory; the cards are otherwise identical.

Apparently, the 8GB version is only good for 1080p resolution with no DLSS. In many games, the difference between the versions is very substantial. The article says the 8GB version is deprecated right at launch: https://videocardz.com/newz/nvidia-geforce-rtx-5060-ti-with-...


It looks like the 8GB cards are about $60 (10-12%) cheaper than the 16GB cards.

I sure don't want a world where we only have 32GB 5090s and nVidia reaching farther down the price-vs-performance curve to offer consumers a more affordable (but lower performing) choice seems like a good, rather than a bad, thing to me. (I genuinely don't see the controversy here.)


> 8GB cards are about $60 (10-12%) cheaper

nVidia says the launch prices for them are $400 and $500, respectively.

> seems like a good, rather than a bad, thing to me

The bad thing is the most affordable current generation card capable of playing modern videogames in decent quality now costs $500. That’s too expensive for most consumers I’m afraid.

The Steam hardware survey says 7 of the 10 most popular cards are nVidia [234]060s, which sold for around $300. Although most of those also have 8GB VRAM, when consumers bought them a few years ago, 8GB was good enough for the videogames released at the time.


If you're defining 4K@60Hz or 4K@120Hz as the left extreme of "decent quality", then sure.

Legacy Blu-Ray discs maxed out at 1080p30. Most people would consider those discs to be "decent quality" (or more realistically even "high quality") video, and a $400 video card is well capable of playing modern games at (or even above) that resolution and framerate. The entry-level 5060 cards are also good enough for video games released at this time, in either memory trim.


> If you're defining 4K@60Hz or 4K@120Hz as the left extreme of "decent quality", then sure.

The 8GB version struggles in 1440p, and only delivers playable framerates in 1080p with some combination of in-game settings. Here’s the original article: https://www.techspot.com/review/2980-nvidia-geforce-rtx-5060...

I agree with the author: that level of performance for $400 is inadequate. BTW, I remember buying an nVidia 960 for about $200 ten years ago. For the next couple of years, the performance stayed pretty good even for newly released games.


There are supply constraints at almost every single step in the GPU supply chain.

An earthquake three months ago, production issues, and insatiable demand mean that every single GDDR/HBM chip being made at factories already operating at maximum capacity has been sold to a waiting customer.

If Nvidia wanted to double the amount of VRAM on their products, the only thing that would happen is the supply of finished products would be halved.

No amount of money can fix it, only time.


What about AMD cards?

CUDA.

The website has an ".ai" domain. It's aimed at people wanting to run inference, and maybe mine cryptocurrency, and for some reason only NVIDIA cards are used for that.

You can run inference on AMD cards; ROCm[1] is a thing. I am running inference on AMD cards locally. Plus, the highest-performing cards for computational workloads are AMD's[2], though of course you can't buy those on Amazon.

[1]: https://rocm.docs.amd.com/en/latest/index.html
[2]: https://www.amd.com/en/products/accelerators/instinct/mi300....
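
For anyone who wants to try: PyTorch's ROCm builds reuse the torch.cuda namespace for HIP devices, so the usual CUDA-style checks work unchanged. A minimal sketch, assuming a ROCm build of PyTorch is installed:

    # Verify that a ROCm build of PyTorch sees an AMD GPU. On ROCm builds
    # the torch.cuda API is backed by HIP, so CUDA-targeted code mostly
    # runs unchanged on AMD hardware.
    import torch

    print("HIP version:", torch.version.hip)      # set on ROCm builds, None on CUDA builds
    print("GPU available:", torch.cuda.is_available())
    if torch.cuda.is_available():
        print("device:", torch.cuda.get_device_name(0))
        x = torch.randn(1024, 1024, device="cuda")  # "cuda" maps to the AMD GPU here
        print("matmul ok:", (x @ x).shape)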


Some reason: CUDA

It really is amazing how much these have increased. An NVidia 3090 for almost as much as the MSRP of a 5090? Incredible!

This is giving me a scarcity mindset; when prices go back to normal, I'll only buy top tier so it lasts longer before needing to upgrade. I screwed that up last time; I bought a 4080 when the window opened for a few weeks. (You could buy them direct from Nvidia's page for a bit.)

A 3090 is like $600-800 used, and there's basically no new stock.

They have shit data, since Amazon doesn't really sell most of those cards and they do no validation.


Any plans to make this available as an API, or to link to common purchase pages for each card to back the "live" pricing data?

It would be nice to have something like a score indicating how powerful each card is relative to its price, to see which one is the best value.
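
Even a plain FLOPS-per-dollar number would help. A minimal sketch with made-up placeholder specs, just to show the idea:

    # Toy price/performance score: higher TFLOPS per dollar is better.
    # The spec and price numbers below are placeholders, not values
    # taken from the tracker.
    cards = {
        "Card A": {"tflops": 80.0, "price": 2000.0},
        "Card B": {"tflops": 40.0, "price": 700.0},
    }

    ranked = sorted(cards.items(),
                    key=lambda kv: kv[1]["tflops"] / kv[1]["price"],
                    reverse=True)

    for name, c in ranked:
        print(f"{name}: {1000 * c['tflops'] / c['price']:.0f} GFLOPS per dollar")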

It would be neat if clicking on the name redirected me to Amazon. The link on the far right was hard to find. :)


Not really a GPU tracker - more like an Nvidia card comparison.

AMD has some really interesting things on the drawing board, and Apple should definitely be in the mix.


Yeah, Apple devices and Nvidia Spark would make for interesting additions. I wish we had something like a performance/$ comparison between architectures.

Not all of Nvidia either; it's missing e.g. the 3060.

I'm curious what's responsible for the current uptick.

Maybe Moore's law is dead, but its sibling about computing hunger doubling every year seems alive and well. And wait until we get bored of using GPUs to make artists starve and finally leverage AI for something fundamentally useful (or terrible), like helping billionaires live forever...

LAWL "It's not a bubble"

Strange specs table - it seems to ignore the tensor core FLOPs, which is what you'd be using most of the time if you're interested in computational throughput.

would love to see this become an arena where I can let my local GPU "fight" against other ones


