The paranoid conspiratorial tone of TFA is a symptom of what's wrong with the Internet and therefore humanity:
> The big brands that are ultimately at stake (Sony, Microsoft, NVIDIA, AMD) all talk about “demand exceeding supply”. Oh, really? [New paragraph.] It is a rather evasive explanation, without any communication about the actual state of their supply and production lines…
The dark secret? Not only is demand GREATER THAN supply, but supply is — wait for it! — LESS THAN demand! OMG, they've been covering this up the whole time! And not only that, current events and world affairs are contributing to this situation! When will the sheeple wake up?!
You mentioned this briefly, but: I think the big problem is that the stay-at-home orders, combined with massive improvements in technology, resulted in far more people wanting to upgrade immediately than initially expected (based on previous releases).
It wouldn’t make sense for Nvidia to produce less than the demand on purpose, because that results in less money. If we take the view that companies want to maximize profit, it would make sense that they’d try to fulfill demand. Another big problem is scalpers and websites not doing enough to stop them. They create an artificial scarcity so they can make a quick buck. Sure, a PS5 isn’t “essential” like toilet paper is, but it’s still a dick thing to do.
Scalpers aren't powerful enough to create an artificial scarcity. If there weren't really increased demand for the products, no one would pay their inflated prices. When demand outstrips supply, the logical thing for manufacturers and (e-)retailers to do would be to raise prices themselves to balance out the situation, but for various reasons (one of which is possible negative PR) they are reluctant to do that, so the scalpers profit...
Your argument seems to be that scalpers don’t have power because OEMs could just raise the price themselves and capture all that scalp profit.
That’s good in theory but in practice you have complex sales channel configurations and contractual agreements. If Best Buy is selling your GPU at $400 because that’s the agreed upon MSRP, they’re going to be pretty unhappy if they’re out of stock and suddenly you decide to increase MSRP to $800 after they’ve sold out because then they’ve made even less money.
So your job now is to figure out what the scalp price will be in the future before there’s any demand.
Additionally, please factor in your pissed off customers who paid you $800 for launch day delivery for the price to potentially drop to $400 a few months after. What’s the cost of that brand damage?
Scalpers have numbers (if demand and supply are close, they have the collective power to restrict supply artificially) and leverage (OEMs have to tread a delicate line with many constituencies that scalpers don’t; you can “blame” scalpers, but that doesn’t hurt anyone’s actual reputation).
For all we know scalpers are part of the business plan. NVidia gets record sales numbers and press articles about how the GPU is sold out. All the OEMs offer preferential access to buy the GPUs giving them better margins, getting rid of inventory, and no reputation risk beyond at worst some marginal grumbling that they’re not doing enough to stop scalpers. The only ones getting screwed are early adopters who are eager to pay higher prices.
It’s the perfect way to segment the market by price sensitivity without paying for it or running afoul of any laws if it’s being done intentionally.
So, I heard an interesting solution to the scalper price problem, where suppliers are unable to raise prices.
What Nvidia could do is sell a special "Doctors Without Borders edition" of their graphics card that is $500 more expensive, with the extra money donated.
That way, the extra money goes to charity instead of the scalpers, and nobody can really complain about it, because what, do you hate charity?
I would think that you would not be allowed to deduct the full $500, because you are getting a service: the ability to "skip the line" and grab a graphics card. You can only deduct $500 minus the FMV of that service, which a reasonable person might say is the difference between the street price of the scalped non-DWB cards and the DWB cards.
If the street price of the scalped cards is higher than the DWB edition, then you would not be able to deduct anything.
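Rough sketch of that math in code (all the dollar figures here are made up for illustration, and actual deductibility depends on the tax code):

```python
# Hypothetical numbers for the "charity edition" idea above. The
# deductible portion of the premium is the donation minus the fair
# market value (FMV) of the benefit received (queue-skipping).
def deductible_amount(premium: float, fmv_of_benefit: float) -> float:
    """Deduction = premium paid minus FMV of the benefit, floored at zero."""
    return max(premium - fmv_of_benefit, 0.0)

msrp = 400.0
street_price = 700.0                   # hypothetical scalped price of the plain card

# FMV of skipping the line ~= street premium over MSRP on the plain card
fmv = street_price - msrp              # 300.0
print(deductible_amount(500.0, fmv))   # 200.0: only $200 is deductible

# If scalped cards cost more than the charity edition, nothing is deductible:
print(deductible_amount(500.0, 950.0 - msrp))  # 0.0
```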
This would be for people who want to pay the market price, but who don't want to participate in scalping. If you'd rather wait, then you're not the target market.
I think AMD/NVIDIA are stupid (in a capitalistic sense) for selling their products well below market price, and leaving money on the table for scalpers to collect.
> in practice you have complex sales channel configurations and contractual agreements. If Best Buy is selling your GPU at $400 because that’s the agreed upon MSRP
Just to be clear, you cannot contractually obligate a vendor to sell your product at the MSRP. (Or any other price.) That's the whole reason the term is "manufacturer's suggested retail price".
> ...but in practice you have complex sales channel configurations and contractual agreements...
If simple makes more money, these companies can do simple. Every company has a small group of people who can make things happen quickly (which is a couple of months in a big corporation). A price change from $400 -> $800 is going to get their attention.
It seems to me more likely that the reason for a slow response is that the governments of the world seem to hate change and will likely penalise companies for responding quickly to extreme market signals. The safe way to do it is to release a 'new' product at a vastly inflated price. Basically that is just working around cognitive biases, but it works.
The point about scalpers being part of the business plan gave me a few moments of thought; thanks for that.
> Scalpers aren't powerful enough to create an artificial scarcity.
Of course they are. Back in ye olde days when GPU mining for bitcoin was still viable (basically, FPGAs were not widespread and ASICs not even on the horizon), the number of people who bought out all usable GPUs and resold them on eBay was massive. It was almost impossible to get anything for months unless you were willing to pay many multiples of the ordinary price.
In concert and other event tickets it's even worse. Some tickets have been known to sell at 20x markups; Rammstein concerts or Wacken tickets usually sell out in a matter of hours and end up on eBay the day after.
> In concert and other event tickets it's even worse. Some tickets have been known to sell at 20x markups; Rammstein concerts or Wacken tickets usually sell out in a matter of hours and end up on eBay the day after.
That's because there's built-in scarcity by virtue of physical limitations of a venue.
If your stadium only holds 15,000 people, but your city has 11 million or more like, say, the Dallas-Fort Worth metroplex, you can easily find 15,000 metalheads. Probably an order of magnitude more.
People keep talking about "artificial scarcity" with these GPUs, but it's not artificial. The foundries that make these are running at max capacity 24/7. The manufacturers are trying to get these cards into the hands of the people who want them; there's simply not enough foundry capacity available.
There isn't some grand conspiracy in this case. This is the effect of increasing consolidation of the computer industry as a whole.
Look back at the 1980s at the number of hard disk manufacturers. There were easily 10+ companies: Western Digital, Seagate, Conner, Maxtor, Quantum, etc. Now what are there? Three? Seagate, Toshiba, and Western Digital?
This must mean that these concert tickets aren't being sold at market prices to start with...
Now, a much worse situation is for critical health products, like what happened with masks and alcohol-based hand sanitizer at the start of Covid. Not surprising that governments intervened, made scalping of those illegal, and relaxed some rules on their manufacture.
(But on the other hand the governments did some "scalping" themselves for the medical workers, by not only requisitioning masks, but also making propaganda about how masks weren't effective!)
> This must mean that these concert tickets aren't being sold at market prices to start with...
The "market" is more than just the price of goods sold, it's always also the environment. Events have to at least show some level of affordability for the common man or public backlash occurs.
Hardly anyone would go to a soccer match if the starting price for the ticket was in the four-digit range, any band that dared to put up scalper-market prices as base prices would be flamed to death as "elitist" by the media and the fans alike. I mean, people are already claiming that Wacken has gone too elite and Rammstein sold out to the rich.
Also, politics would intervene because many venues have been built entirely or largely with taxpayer money or the operation is supported by taxpayer money (tax credits, public transport, public parking).
Easy access to code to run your own bot doesn’t help. In 2000, the internet was pretty new, but now, YouTube exists and can teach anyone to set up their own bot for a chance to “get rich quick!”
I haven't looked, because I would get too depressed, but I'm sure YouTube contains videos teaching people how to get rich quick by creating videos that teach people how to get rich quick by…
Just my own tin-foil observation: the prices of components in high demand tend to remain much higher than the originally announced price for years.
Scalpers cannot account for that.
Surely Nvidia is actually charging premium prices as well and blames it on scalpers and unexpected demand.
Nvidia just tricks the world with the low announced price, then baits and switches; realtors do that all the time.
There's some relevant backstory to what was said. Nvidia's CEO kicked off the RTX 3000 launch (the first of the recent chain of products that have been hard to get) by saying "The 3080 and 3090 have a demand issue, not a supply issue." As you just pointed out, that framing doesn't make any sense, yet here we are having to point out that there are real-world reasons Nvidia can't produce enough GPUs, since that apparently sounds too negative for them to say outright.
Companies have since at least admitted there isn't going to be a lot of supply (some only after reality hit them, but they eventually said it), while still dodging any admission that it's simply because of the reasons in the article. Hence the article.
I think it's a valid point, and it certainly wouldn't be conspiratorial or a symptom of what's wrong with humanity even if it weren't.
Yea, this also happens periodically, granted to a lesser degree: incorrect projections, retooling, mundane management changes that HAVE to make their immediate mark on production, and other crap. Given the ever so minor hiccups to the global supply chain we experienced throughout the year, on top of a plant catching fire... this isn't that surprising. This too shall pass.
I think you're overly simplifying the issue. The interesting part here is about why there are these supply issues - are there technical limitations, lack of manufacturing capacity, manufacturing difficulties, supply issues for source materials, transportation problems or costs, etc.
> With the exception of the current RTX 3000 (Samsung), all market launches of the year 2020 rely on the 7-nm chips from TSMC. ARM-SoC manufacturers such as Qualcomm also use 7 nm and have only recently moved capacity from Samsung to TSMC. TSMC is currently pretty much alone in the market with this technology and the production lines are therefore working at full capacity. A sudden increase in production is therefore not possible.
> A special insulating film, the so-called ABF substrate (Ajinomoto Build-up Film), which is indispensable for etching chips in 7 and 5 nm, has also become increasingly scarce in recent months. The prices for this film are said to have risen by 40% and the waiting time is said to have increased to up to 4 months.
The company behind it, Ajinomoto, has over the last century built a business in food products (seasonings, oils, beverages, dairy) and chemicals; ABF came about through internal innovation. There's a really good (perhaps unintentionally funny) video of it here:
I feel like this is the kind of video they would show to every employee on their first day at work, and to every school class that wants to pay them a visit at their factories. And it's really entertaining too!
Interesting: even general-purpose chip makers like ST, NXP, and Infineon are experiencing shortages.
> Other components make things even more complicated: The small additional chips (controller, supervisor, etc.) from ST Micro, NXP or Infineon are also missing due to another shortage: 8-inch-diameter silicon wafers. Many products suffer from this, such as PC power supplies, but also the PS5 and Xbox Series X consoles and the graphics cards. Flash memory and DRAMs, on the other hand, do not seem to be in real short supply [...]
It's a real problem. There's a worldwide shortage of high-end fab capacity. That's not an easy business to enter. TSMC is spending US$20 billion on their newest fab. They started in 2018, and they expect to get parts out in volume in 2021 or 2022.
If you want to duplicate TSMC, you'd need a huge amount of money. TSMC is valued at around US$500 billion. Anyone not already a major fab operator would probably fail. They'd have to go through several generations of fabs to catch up. Meanwhile, TSMC is moving forward. China is trying to catch up.
There's never been an industry like this before, where it costs half a trillion dollars to be a player. Not just to be #1, but to even be in the game.
The shortage is actually a direct outcome of Intel's problems. Historically Intel had by far the highest capacity for leading edge fabrication processes - just for their own CPU production. It didn't make sense for companies like TSMC to invest in additional fabs as long as Intel had the lead. AMD taking the lead and moving ~25% of x86 CPU production to TSMC is a huge shift.
I suspect this problem is only going to get worse.
Each node shrink has come at incredible costs and there does not look to be any path towards decreasing those costs.
Eventually, there will simply be no room left to shrink. After all, we have a lower limit of single atoms, which have radii of around 0.06 to 0.5 nm. That's not terribly far from the current state-of-the-art 7 nm processes. Further, I doubt we'll ever get to that level in the first place without some really clever engineering; one issue to solve is that electromagnetic radiation will induce currents in surrounding conductive material.
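Back-of-the-envelope, in code ("7 nm" is a marketing label rather than a literal feature size, and the radii are the rough range above):

```python
# How many atomic diameters fit across a 7 nm span, using the
# radius range quoted above (0.06 to 0.5 nm)? Marketing node
# names aside, this shows how little headroom remains.
node_nm = 7.0
for radius_nm in (0.06, 0.5):
    diameter_nm = 2 * radius_nm
    print(f"radius {radius_nm} nm -> ~{node_nm / diameter_nm:.0f} atoms across")
# radius 0.06 nm -> ~58 atoms across
# radius 0.5 nm -> ~7 atoms across
```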
I really don't know where the industry ends up going when we hit that brick wall. Different materials? More exotic CPU designs?
Honestly, that will probably end up being what happens. Once node shrinks are exhausted we'll see efforts poured into improving yield and using better materials. It will be less about getting smaller and more about refining everything.
Once the node shrinking stops, that's when you're likely to see R&D funds shift away from manufacturing processes and toward chip and architecture design, because that's the only place left to find improvements: a better-designed chip.
Aircraft manufacturing, maybe. Only two companies in the world can make a good large jet airliner - Boeing and Airbus. It took Airbus 21 years of government support to get to the first good product. COMAC in China is now trying.
Container shipping has hundreds of players. Anyone who can afford a container ship can play.
There are lots of little car companies. China has a huge number of car makers.
It's also worth mentioning that in 2020 an unusually high number of tech product launches landed really close to the end of the year, when demand is naturally high. Last year AMD had around half a year between launch and the shopping season to build up stock; this year it was less than two months, with additional competitors for chips launching at around the same time.
> Incidentally, the price per kilo for air transport has doubled worldwide and Europe is increasingly resorting to the new Silk Road, which China has pushed very hard: conventional road and rail transport, the prices of which have even tripled in the meantime! Especially bulky peripherals, such as PC screens, are also suffering greatly: according to a well-known brand, prices rose by 15 or 25% in France, for example.
So we are told by the FED that there is no inflation, and that's one of the justifications to keep rates low. If this is not inflation then what is?
What's that? Do you mean the Fed (short for “Federal Reserve”)?
> that there is no inflation
The Fed doesn't produce inflation measures, it consumes them. It's the Bureau of Labor Statistics that produces the data indicating (e.g.) a 1.2% (not “no”) annual inflation by the main headline price index from Nov. 2019 through Nov. 2020.
> If this is not inflation then what is?
It is inflation in narrow specific markets; narrow markets often have rates of inflation very different from the broad aggregate measures. Relative prices of different goods are not fixed. Fed policy responds primarily to broad price indexes, not to any of the specific narrow markets you point to.
>So we are told by the FED that there is no inflation, and that's one of the justifications to keep rates low. If this is not inflation then what is?
Because inflation/CPI is measured using a basket of goods, PC parts being inflated by 25% doesn't mean there's 25% inflation overall. That said, the money printing/ZIRP did cause inflation; it's just that it was mostly in assets rather than in consumer goods, hence why the CPI hasn't budged much.
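A toy example of how the basket works (category weights and price changes invented purely for illustration):

```python
# Toy CPI: headline inflation is a weighted average of category
# price changes. Weights here are made up; the point is that a
# 25% jump in a small-weight category barely moves the headline.
basket = {
    "housing":     (0.40, 0.02),   # (weight, annual price change)
    "food":        (0.30, 0.01),
    "transport":   (0.25, 0.01),
    "electronics": (0.05, 0.25),   # PC parts up 25%
}
headline = sum(w * change for w, change in basket.values())
print(f"headline inflation: {headline:.1%}")  # ~2.6% despite the 25% spike
```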
Don't think it's relevant to the Fed, because it's clear there is an actual supply chain shortage which is being exacerbated by much higher than normal demand. Hence the low rates are not causing that. In fact, raising rates would make it harder to borrow to increase production or to pay for the now more expensive components.
I mean, if you believe the FED then you have to just dismiss this as inflation. But I think it's a lot more complicated.
1. Basket of goods doesn't include investment goods. Economists have some fancy schmancy reason for it (basically you get your money back when you sell), but I don't buy it.
Increasing the monetary mass dilutes the purchasing power of the dollar. No one would argue that inflation doesn't increase the price of financial instruments; everyone corrects returns for CPI. But we're correcting the returns with a metric that denies that those instruments can have inflation.
Meanwhile, what have we seen in the past 20 years?
Price of housing (not included in the cpi) and financial markets have seen great returns. This despite the '08 financial crisis.
2. Printing money isn't the only thing that affects the price of goods. The cost of producing them does as well. Inflation is hidden by the dismantling of industry.
Well, what have we done over the past 30 years? We've outsourced everything to countries that exploit slave labor. So if (as an example) the dollar is half as good, but the price of labor goes from 50% to 10% of the purchase price, most of the inflation is now masked. Does the CPI track that my 1.1x-more-expensive sweater was made with slave labor? Or does it just track that the hourly cost of a handyman (local labor) has stagnated?
Meanwhile things that cannot be off-shored have become expensive. My house. My education. My health care.
3. Hidden inflation. It is universally agreed that appliances made today are of piss poor quality. They're poorly made, with cheaper parts, and thinner sheet metal. How do you track that properly?
Food packages have become smaller (rounding down when doing unit conversions is an easy freebie). A 125 g bag of chips becomes a 110 g bag of chips.
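The shrinkflation math, as a quick sketch (price and sizes are made-up examples): the effective increase shows up in the unit price, not the sticker.

```python
# Same shelf price, smaller package: the hidden price increase is
# the change in price per gram.
price_eur = 2.00
old_grams, new_grams = 125, 110
old_unit = price_eur / old_grams
new_unit = price_eur / new_grams
hidden_inflation = new_unit / old_unit - 1
print(f"hidden inflation: {hidden_inflation:.1%}")  # ~13.6%
```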
> Basket of goods doesn't include investment goods. Economists have some fancy schmancy reason for it (basically you get your money back when you sell), but I don't buy it.
The reason the basket of goods is focussed on consumer goods is because the price level of concern (where it is used, there are many other price indexes used for other purposes) is the cost of living, not cost of investing. Hence, the Consumer Price Index, the main inflation index monetary policy is concerned with. (There are also broad and industry-specific Producer Price Indexes, which are used for other purposes.)
The cost of investing wouldn't be measured by asset prices, but interest rates (inverted: low interest rates are a high cost of buying future money directly, which also drives up the cost of speculative instruments which may be redeemable for money in the future.) And, yes, the cost of investing is deliberately high currently, there is no secret or conspiracy about that. It's the overt policy.
> Price of housing (not included in the cpi)
Cost of housing is included in the CPI (both actual and imputed rent are part of the calculation.) It's a major component of the overall CPI. [0]
I thought they explicitly said they are letting inflation run above the target value for a short period of time in order to average out to the target value since it was allegedly so low for so long.
(Personally I'm not sure that it is, due to CPI changes and the like, but that's another debate.)
If (like me) you're on a long waiting list for a new Nvidia GPU or AMD processor, this piece sheds some light on the issue from a supplier perspective.
Over the past three weeks I've had 7 store pages on auto refresh trying to catch a 5950x in stock. I've finally got one on order but it's a pre-order and its earliest ETA is Jan 12th. I guess this gives me time to try and order the GPU I want.
It isn't just CPUs and GPUs though. The cooler I want won't be in stock until July per the manufacturer. I had to get lucky to get the PSU I wanted. People are employing crazy techniques to get each piece of their machine, not just the critical components.
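For what it's worth, the auto-refresh hunting people are doing boils down to polling a product page for an in-stock marker. A minimal sketch (the URL, marker string, and heuristic are all assumptions; real retailer pages differ, and many rate-limit or block bots):

```python
import time
import urllib.request

IN_STOCK_MARKER = "Add to cart"   # hypothetical; varies per retailer


def looks_in_stock(html: str) -> bool:
    """Heuristic: page contains the add-to-cart marker and no out-of-stock notice."""
    lowered = html.lower()
    return IN_STOCK_MARKER.lower() in lowered and "out of stock" not in lowered


def poll(url: str, interval_s: float = 60.0) -> None:
    """Fetch the page every interval_s seconds until it looks in stock."""
    while True:
        with urllib.request.urlopen(url) as resp:
            if looks_in_stock(resp.read().decode("utf-8", "replace")):
                print(f"Possible stock at {url}")
                return
        time.sleep(interval_s)


# The heuristic on sample markup (no network needed):
print(looks_in_stock("<button>Add to cart</button>"))  # True
print(looks_in_stock("<span>Out of stock</span>"))     # False
```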
Scalpers are magnifying the issue. The scalping problem will be solved eventually when supply catches up, but the longer that takes, the more stock the scalpers will obtain, thus prolonging the time window scalping will be profitable.
I didn't know what I signed up for last month when I decided to do a build but I sure as hell wasn't expecting this. I only need the GPU now so at least the end is in sight.
I built a new machine back in August and the PSU was the hardest part. I tend to only buy Seasonic power supplies, but there just weren't any in stock anywhere. I ended up finding an Asus power supply on B&H and it served me well, but I really hate deviating from my approved vendor list. (So far it has treated me well. It's a 1200W unit and I can pull 1000W from it without any problems at all. The same cannot be said of other non-Seasonic units I've had over the years. I was plagued with flakiness with a Corsair 1000W unit many years ago. I wasted so much time debugging that problem; finally connecting an oscilloscope to the 12V rail and noticing that the 12V rail fell to 10V when running AVX instructions. That explains why the computer just hard powered off when I hit an AVX workload :)
ECC memory was also in a bad state then, so I suffer with non-ECC memory. So far, that is also treating me well. Can't worry about bit flips you don't know about! (taps forehead)
Yeah, I'm now happy to have switched my decade-old 400W Corsair to a (perhaps oversized) 750W one just before Covid hit. It didn't solve my RAM issues (memtest just hangs with 2×8 GB, and the PC randomly reboots with 4×8 GB), so overclocking is still out, but at least I don't have to worry about my PSU anymore...
I've had consistently bad luck with memory, though I never went out of my way to check which chips the module was using, or whether the modules were on my motherboard's QVL. I did both of those things with my most recent build and things are working better than usual.
I haven't heard of a lot of problems with power supplies causing bad memory; usually the memory itself is just bad. RMA that stuff and re-roll.
It's interesting because I live in Tunisia (which has lots of challenges for importing stuff, one of them is lack of foreign currency) but the new Ryzen series is widely available including the 5950x. You can find it in most of the gaming suppliers (like this one https://skymil-informatique.com/processeur-amd/3357-amd-ryze...). The RTX series are also available, though I'm still waiting for AMDs.
I wanted a new GPU so I could run Cyberpunk 2077. Turns out that even people with those GPUs are having issues, so by the time I find one, the game should be sufficiently patched.
I have a massive Steam backlog, so I'm not too frustrated.
Have you tried overclocking the everloving shit out of your current card? I mean throw everything at it, best thermal paste/pads, cooling, hardware mods. Genuine question, as that's what I do before deciding to buy a new, well, anything :D
Surprisingly, it works pretty well... A few burned out GPUs and motherboards over two decades seems like a good risk/reward ratio.
But if it goes (really) wrong, you have nothing until the supply issues are resolved. Sounds like a bad plan right now. Especially if there's a perfectly good Steam backlog to get through (when is there ever NOT a Steam backlog!).
If you're _just_ overclocking, then I think that's safe to say. However, the OP was referring to things like installing new thermal pads and installing hardware mods. That sounds meaningfully more risky to the lifetime of your card :)
Admittedly I haven't tried anything newer than a GTX 1070, but on most cards you can softmod or hardmod the voltage to dangerous levels, enough to burn some VRM components or the GPU itself. It's quite fun :D
Huh, strange. I would have thought it runs alright on PC, especially since I had no trouble whatsoever with my Radeon RX 470 (2016), running on the settings the game gave me (more or less middle of the road at a resolution of 2560×1440).
So far the only way I've seen people get one is by standing outside microcenter before they open. My friend is doing that just this moment. He snagged me a 3060ti, he's probably about to get another GPU for himself. The bots online have been ridiculous and scalpers see blood in the water.
I often find Nvidia GPUs in stock at retailers in Europe, 3060/3070 mostly, but the prices are 500 to 700 euros, well above the recommended retail price.
At the risk of asking a stupid question: why have queues instead of raising prices until demand gets nearer to the level of supply?
Not that I would want this of course, but I'd also want free bubble gum, and the gum companies... they're there to make money. Why not chip makers? I don't see auctioning systems popping up; I see people in queues for a GPU, "out of stock" pages, and purchase quantity limits being imposed.
Sure, but why bother with preorders and backlogs if you can just sell your product for more money? If one party increases prices they'll get fewer sales, but if the product isn't available to begin with that doesn't matter (up to a point). I can see the appeal of having the money before you have to produce the product (due to preordering) but it doesn't seem beneficial for the bottom line.
It's manufactured scarcity that helps them make money most of the time. They have to create hype, desire, and exclusivity... and repeat it indefinitely.
They could break out of that cycle now that the product is actually scarce... but most of these chip manufacturers are too big to be flexible like that.
> They have to create hype and desire and exclusivity
That would also be the case with absurd prices, right? (See brand clothing, brand phones, brand mice, etc.) It's not as if production increases when prices increase.
> You cannot add features and features and get each user a new chip every 2 year.
Except we can.
Video games continue to get more graphical enhancements, and people continue to want to buy stronger GPUs (raytracing, VRS, etc.) so that it's possible for them to see these improved graphics.
Be it high-framerates for esports, or 4k graphics, or virtual reality, or raytracing. Video gamers have always pushed for better-and-better.
There's nothing wrong with being excited for these new features. It probably sucks to reprogram these engines every few years to take advantage of these features... but that's a job lol.
There are plenty of video-game developers who don't care much about the latest-and-greatest graphics. Instead, they build a "retro" look and try to compete on gameplay alone. That's also good and fine (and I usually tend to buy these games instead).
I mean, they're games. They're supposed to innovate and excite us in new ways. They're entertainment. Some developers choose the graphical route, others choose gameplay. Nothing wrong with that.
I think it can be done without obsoleting previous hardware. I'm not saying to stop improving performance, just that it could be done more responsibly, avoiding a cycle of growth and short product lifetimes driven by profit motives.
This would be much easier if they published/maintained drivers for older chips but many companies do not because they "want to protect IP."
I would argue most of the "IP" is really just forced API incompatibilities, so that most software works only on their devices and anything that isn't theirs is irrelevant to reverse engineer. But it doesn't matter: there's surprisingly little money to be made manufacturing things that work well, and a shocking amount to be made shoving things that barely work down everyone's throats.
I'd like to hear more from companies, not consumers, that can't get the computers they want. How do they manage? More reuse? I traded up my computer at work, but it turns out they had to give my old one to a new employee due to shortages.
I started wondering as soon as Apple M1 was announced. Under a comment about porting Linux to Apple Silicon [0], citing the lack of official support and documentation, my closing remark was
> Unless the circumstances change and indicate otherwise, I think the Apple Silicon [Linux] port will be a serious waste of community time, talents, and resources. It's better to spend time on a platform where vendor support and documentation exists... However, the conclusion assumes that a serious competitor of Apple Silicon will eventually emerge and more supportive to the community, but it won't necessarily happen. I'm somewhat afraid that biting the bullet and reverse engineering the Apple Silicon could be the only way to have a high-performance Linux desktop on ARM - I hope not.
Will there ever be a serious and more community friendly competitor to Apple Silicon in the next five years? And if so, who is it going to be?
Replying to myself because I'm too late to edit it:
From March 2020: "The Surface Pro X Should Scare Apple" [1]. The headline looks pretty amusing in hindsight. Apple entering this market will likely validate Windows-on-ARM as a strategy. Microsoft made a pretty big strategic blunder by not working to make sure a Chrome port was available for the first generation, though it was available on the second-generation Windows-on-ARM laptops.
I can't manage to read this right now and skimming doesn't get me the answer I want and an internet search doesn't either. But I can't help but wonder how much precious metal bottlenecks factor into this.
Isn’t it really the semi industry being unwilling to invest because they don’t want to get burned just a couple of years down the road? I just don’t see this demand lasting.