A price tracker should be more sophisticated than pulling a single Amazon listing and dumping whatever price a random third-party reseller happens to list.
The prices here are not indicative of the market at all. They're a single sample from the most egregious offender.
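To avoid that single-sample bias, a tracker could aggregate several listings for the same SKU and report a robust statistic like the median. A minimal sketch (the listing data and field names are hypothetical):

```python
from statistics import median

def robust_price(listings):
    """Return the median asking price across in-stock listings.

    Unlike a single sample, the median is not dragged around by
    one egregious scalper outlier.
    """
    prices = [l["price"] for l in listings if l.get("in_stock")]
    if not prices:
        return None
    return median(prices)

# Hypothetical listings for the same GPU model
listings = [
    {"seller": "A", "price": 1999.00, "in_stock": True},
    {"seller": "B", "price": 2049.00, "in_stock": True},
    {"seller": "C", "price": 3499.00, "in_stock": True},  # scalper outlier
]
print(robust_price(listings))  # → 2049.0
```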
I own Nvidia stock; I believe it will still climb higher than ever before. But at home my setup has AMD components because they're better value.
I'm more into AMD cards than anything; I wish this site tracked AMD prices as well.
> I believe they will still climb higher than ever before.
I think this expectation is already priced in. I invested when I saw LLMs kicking off with no reflection in the NVIDIA share price and made 10x when the market caught up.
Now, with tariff uncertainty and Trump signalling to China (via Russia) that there would be no repercussions for invading Taiwan, I'm less convinced there is growth there, but there is the possibility of massive loss. In the days of meme stocks this might not matter, of course.
Note that an invasion of Taiwan would have huge implications everywhere but any company that needs leading edge semiconductors to directly sell their products would be screwed more than others.
My guess would be 'artificial scarcity for the purpose of market segmentation', because people probably wouldn't buy that many of the expensive professional cards if the consumer cards had a ton of VRAM.
I'm surprised the RTX cards don't have a Terms of Use that prohibits running CUDA on them. They already removed NVLink from the 40-series onward. Maybe running 8k VR could use the 32GB on the 5090 but I can't imagine much else that's not compute.
I'm looking forward to newer APUs with onboard 'discrete' GPUs and quad or more channel LPDDR5X+ and 128GB+ unified memory that costs less than an M3 Ultra.
Except now, with its shared VRAM/RAM model, Apple offers better deals than Nvidia, especially past 24GB of VRAM (for inference at least).
A Macbook or Mac Mini with 32GB as a whole system is now cheaper than a 24GB Nvidia card.
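For rough sizing, a model's inference footprint is approximately parameter count × bytes per weight, plus headroom for KV cache and activations. A back-of-envelope sketch (the 20% overhead factor is a guess, not a measured number):

```python
def model_vram_gb(params_billion, bytes_per_weight, overhead=1.2):
    """Rough memory estimate in GB: weights times per-weight size,
    with ~20% headroom for KV cache and activations (a guess)."""
    return params_billion * bytes_per_weight * overhead

# A 13B model quantized to 4 bits (0.5 bytes/weight) needs ~8 GB,
# while the same model at fp16 (2 bytes/weight) needs ~31 GB --
# past a 24GB card but within a 32GB unified-memory Mac.
print(round(model_vram_gb(13, 0.5), 1))  # ~7.8
print(round(model_vram_gb(13, 2.0), 1))  # ~31.2
```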
Applications that consumers use (games and desktop) work fine with the amount of memory that consumer GPUs have.
GPUs targeting more RAM-hungry applications exist, but they’re quite a bit more expensive, so people who play games buy gaming GPUs while people who need more VRAM buy cards targeting that application.
Why would a consumer want to pay for 40GB of VRAM if 12GB will do everything they need?
> work fine with the amount of memory that consumer GPUs have
Most consumers buy GPUs to play videogames. Recently, nVidia launched two flavors of 5060 Ti consumer GPU with 8GB and 16GB memory, the cards are otherwise identical.
Apparently, the 8GB version is only good for 1080p resolution with no DLSS. In many games the difference between these versions is very substantial. The article says the 8GB version is obsolete right at launch: https://videocardz.com/newz/nvidia-geforce-rtx-5060-ti-with-...
It looks like the 8GB cards are about $60 (10-12%) cheaper than the 16GB cards.
I sure don't want a world where we only have 32GB 5090s and nVidia reaching farther down the price-vs-performance curve to offer consumers a more affordable (but lower performing) choice seems like a good, rather than a bad, thing to me. (I genuinely don't see the controversy here.)
nVidia says the launch prices for them are $400 and $500, respectively.
> seems like a good, rather than a bad, thing to me
The bad thing is the most affordable current generation card capable of playing modern videogames in decent quality now costs $500. That’s too expensive for most consumers I’m afraid.
Steam hardware survey says 7 out of the 10 most popular cards are nVidia [234]060s, which sold for around $300. Even though most of them also have 8GB VRAM, when consumers bought these cards a few years ago, 8GB was good enough for the videogames released at that time.
If you're defining 4K@60Hz or 4K@120Hz as the left extreme of "decent quality", then sure.
Legacy Blu-Ray discs maxed out at 1080p30. Most people would consider those discs to be "decent quality" (or more realistically even "high quality") video, and a $400 video card is well capable of playing modern games at (or even above) that resolution and framerate. The entry-level 5060 cards are also good enough for video games released at this time, in either memory trim.
I agree with the author: that level of performance for $400 is inadequate. BTW, I remember 10 years ago I bought nVidia 960 for about $200. For the next couple years, the performance stayed pretty good even for newly released games.
There are supply constraints at almost every single step in the GPU supply chain.
An earthquake three months ago, production issues, and insatiable demand mean that every single GDDR/HBM chip being made at factories already operating at maximum capacity has been sold to a waiting customer.
If Nvidia wanted to double the amount of VRAM on their products, the only thing that would happen is the supply of finished products would be halved.
The website has an ".ai" domain. It's about people wanting to run inference, and maybe mine cryptocurrency, and for some reason only NVIDIA cards are used for that.
You can run inference on AMD cards; ROCm[1] is a thing. I am running inference on AMD cards locally. Plus, the highest-performing cards for computational workloads are AMD's[2], though of course you can't buy those on Amazon.
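For what it's worth, the ROCm builds of PyTorch expose AMD GPUs through the same `torch.cuda` API, so most CUDA-targeting code runs unchanged. A minimal sketch (falls back to CPU when no GPU is present):

```python
import torch

# On a ROCm build of PyTorch, AMD GPUs answer torch.cuda.is_available(),
# so the same device-selection idiom covers NVIDIA, AMD, and CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

model = torch.nn.Linear(4096, 4096).to(device)
x = torch.randn(1, 4096, device=device)
with torch.no_grad():
    y = model(x)
print(y.shape)  # torch.Size([1, 4096])
```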
This is giving me a scarcity mindset: when prices go back to normal, I'll only buy top tier so it lasts longer before needing an upgrade. I screwed that up last time; I bought a 4080 when the window opened for a few weeks. (You could buy them direct from Nvidia's page for a bit.)
Yeah, Apple devices and Nvidia Spark would make for interesting additions. I wish we had something like a performance/$ comparison between architectures.
Maybe Moore's law is dead, but its sister law about the doubling of computing hunger every year seems to be alive and well. And wait until we get bored of using GPUs to make artists starve and finally leverage AI for something fundamentally useful (or terrible), like helping billionaires live forever...
Strange specs table - it seems to ignore the tensor core FLOPs, which is what you'd be using most of the time if you're interested in computational throughput.
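Theoretical throughput is just unit count × clock × operations per unit per clock, so tensor-core figures dwarf the shader FLOPs a specs table usually quotes. A sketch with made-up illustrative numbers (not any real card's specs):

```python
def theoretical_tflops(units, clock_ghz, flops_per_unit_per_clock):
    """Peak TFLOPS = units * clock (GHz) * FLOPs per unit per clock / 1000.
    An FMA counts as 2 FLOPs; that factor is folded into the last argument."""
    return units * clock_ghz * flops_per_unit_per_clock / 1000

# Illustrative only: 512 tensor cores at 2.5 GHz, each doing 512
# dense FLOPs per clock, would give ~655 TFLOPS.
print(round(theoretical_tflops(512, 2.5, 512)))  # → 655
```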
> The prices here are not indicative of the market at all. They're a single sample from the most egregious offender.
More data points for a 5090 Founders
I hope whatever product "United Compute" is selling is more thoughtfully constructed than this.