If that's true, how would one rank pumping supercritical liquid nitrogen at high rates through a heatsink? Super-grandmaster?
Seems like the heat flow would be substantially impeded by any boiling of the LN2.
Or, for that matter, simply using a chilled copper ingot as the heat sink? There must be some threshold at which the limiting problem is getting the heat out of the die, not getting the heat out of the chip's package.
Carrier freeze-out actually makes this non-workable -- there's a limit to how cold you can make CMOS devices before they stop functioning. To say nothing of the specific heat of liquid helium, which is minuscule compared to LN2's.
Using a blowtorch as a heat source is common in extreme overclocking, to heat VRMs and memory modules up enough to boot the machine and keep it working.
Liquid helium might get into the device itself and cause other problems (e.g. iPhones stop working in the presence of helium).
Due to the Leidenfrost effect, incidental contact of bare skin and LN2 is not usually a problem (it boils off locally and creates a gaseous N2 barrier). Other things superchilled by it will not be as forgiving, however.
You insulate the motherboard. The aim is to let only the die contact the cup your LN2 is in.
There are still problems like the CPU/board/components not working due to low temperature or condensation. Generally those are solved with better insulation and by trying different components until you find ones that work.
Overclocking is 25% cooling and 75% picking parts that can operate at low temperatures and high frequencies.
Don't forget the crazy binning for RAM sticks, CPU dies, and graphics cards. There's a reason all the top guys have major sponsorships: it takes a lot of money to find the top 1% of parts.
Today we didn't actually learn that. They don't mention it being stable:
"It's a lot easier to control a benchmark, which is always the same," explains Rywak. "A game makes the whole process less predictable in terms of hardware load, which could lead to stability issues while working on frozen hardware."
Not every high-hertz LCD will be better than a CRT in every dimension - black levels and off-angle viewing being the two points a CRT could hope to lead on - but a 120Hz OLED, on the other hand...
Human eyes don't have a refresh rate, per se. I recall hearing that they can discern differences in frequency up to 3 kHz or so; though obviously you can go much slower than that and still feel smooth and relatively responsive.
Object recognition requires you to see an insane amount of detail. You can reconstruct a lot, but you still have to see a lot before you can recognize an entire object.
That puts the threshold for basic perception much higher; you may not be able to see the details of the objects flashed at you for 1/500 of a second, but you can tell that something was flashed. If a bright light is strobing at 500Hz, you can tell it's strobing, not just on.
What is interesting, from the screenshot, is that the game is actually CPU-bound, contrary to an often-held belief in high-end video game optimization circles.
No. I just bought a 4k screen only to find out that 12k is coming down the pipe. I do not want to think about what 1000hz 12k screens will cost. Stop this madness now.
Something is always coming down the pipe. Fortunately you have no need to walk the treadmill, and if you're only just getting a 4K screen now, then you're probably not the kind of insane early adopter who does.
12k is possible, but it's hardly around the corner. Even 4K still has both content and hardware issues around it. 8K is going to be the next mass-market push, but we're not even done with the 4K party.
Also, 4K display devices are available at a modest price point now. The bigger issue is content. We're mostly there with mass-market media, but if you want to drive a AAA video game at 4K resolution you have to make compromises and spend a lot on the hardware to drive it.
They're going to keep making new things. And the new things are going to have bigger numbers. It's okay.
12k? Never heard of it. Let's go with 8k, which is 7680x4320.
Assuming our current standard of 8 bits per color with no alpha (3 bytes per pixel), which may be too low if you care so much about your monitor, your required bandwidth becomes:
7680 * 4320 * 3 * 1000 = 99532800000
99532800000 / 1024 / 1024 / 1024 = 92 gigabytes per second of bandwidth you will consume just to pump stuff to your monitor. Better not use integrated graphics!
To give a comparison, here's 4k@60hz:
3840 * 2160 * 3 * 60 = 1492992000
1492992000 / 1024 / 1024 ≈ 1424 MB/s.
Also notice that 8k monitors already employ tactics such as "visually lossless compression" (which means: lossy compression they think you won't notice) and other tricks aimed at not really submitting full frames all the time.
Forget your 12k. It will only be useful to increase your energy bill.
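If anyone wants to play with those numbers, here is the same arithmetic as a quick Python sketch (a back-of-the-envelope only: it assumes raw 8-bit RGB at 3 bytes per pixel and ignores blanking and protocol overhead):

  def raw_bandwidth(width, height, bytes_per_pixel, refresh_hz):
      # Bytes per second needed to push uncompressed frames to the panel.
      return width * height * bytes_per_pixel * refresh_hz

  # Hypothetical 8k @ 1000Hz
  print(raw_bandwidth(7680, 4320, 3, 1000) / 1024**3)  # ≈ 92.7 GiB/s

  # 4k @ 60Hz for comparison
  print(raw_bandwidth(3840, 2160, 3, 60) / 1024**2)    # ≈ 1423.8 MiB/s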
In real life it would be subsampled 4:2:0, and 1000Hz is nonsense because it's not divisible by 24 or 30. So a more reasonable 8k@960Hz 4:2:0 is (7680 * 4320 * 960) * (8 + 2 + 2) = ~356 Gb/s, or only 35.6 Gb/s if you pick a reasonable 96Hz.
At 960Hz even a lossless delta-coding scheme on the wire could reduce the bandwidth by over 10x for any normal footage.
But if you add some kind of encoding to reduce the bandwidth, you sacrifice the monitor response time that a gamer who wants to run Doom @ 1000fps would demand, because decoding takes time.
For comparison, Netflix was just barely able to saturate a 100Gbps link (that's gigabits per second, so only 12.5 gigabytes per second) from one computer, and that's just pumping data without having to render anything.
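For reference, the subsampled version of that arithmetic as a small Python sketch (the 12 bits per pixel come from 8 + 2 + 2 for 8-bit 4:2:0; binary prefixes are assumed so it matches the ~356 figure above):

  def subsampled_bandwidth_gbit(width, height, refresh_hz, bits_per_pixel=12):
      # Gigabits per second on the wire, uncompressed, no blanking overhead.
      return width * height * refresh_hz * bits_per_pixel / 1024**3

  print(subsampled_bandwidth_gbit(7680, 4320, 960))  # ≈ 356 Gb/s
  print(subsampled_bandwidth_gbit(7680, 4320, 96))   # ≈ 35.6 Gb/s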
At some point, it’s not worth upgrading resolution. I don’t know what that point is for you, but eyes only have a certain angular resolution, beyond which any additional resolution is meaningless.
For me, that’s a bit more than 1440p at 3 feet at 27”.
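For anyone curious where a figure like that comes from, here's a rough pixels-per-degree sketch in Python (the 16:9 aspect ratio, 36-inch viewing distance, and the common ~60 pixels-per-degree rule of thumb for 20/20 acuity are all assumptions):

  import math

  def pixels_per_degree(diag_in, horiz_res, aspect=(16, 9), distance_in=36):
      # Horizontal pixels packed into one degree of visual angle at this distance.
      w, h = aspect
      width_in = diag_in * w / math.hypot(w, h)
      ppi = horiz_res / width_in
      inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
      return ppi * inches_per_degree

  print(pixels_per_degree(27, 2560))  # ≈ 68 px/deg, a bit past the ~60 px/deg acuity threshold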
I had a quick look about to see what the eye/brain can actually perceive and the below is interesting. We can appreciate frame rates far higher than I thought.
A pilot identifying a plane displayed for 1/220th of a second (reddit link) is pretty impressive.
Yeah, and those tests were to comprehend the image to the point of identifying an aircraft. If you're just trying to identify motion, you could probably perceive much higher frame rates.
There is no proven limit on how many frames per second our eyes can see, and I'm sure you would be able to discern a difference between 144Hz and 1kHz. You may not be able to fully comprehend each still image, but the procession would almost certainly appear smoother, especially for fast-moving objects.
This is probably good enough in practice, although you can see differences even beyond 1000Hz by observing the phantom array effect of flickering signals during fast eye movement.
Nonsense. We wouldn't cope with it in the same sense that we can't cope with the millions of colors in a 24-bit color space. Do we distinguish each individual color? No, but the full spectrum enables continuous color flow.
When it comes to such a high framerate, the upgrade is akin to going from 256-color palette-swapped VGA to 24-bit HD, or rather from stop motion to realistic motion blur.
I also learned that literally pouring liquid nitrogen over a CPU from a cup is a grandmaster overclocker move.