Curious about the latest (?) tech, Nvidia's RTX, I found this:
> RTX causes a visible performance hit, which is offset by a technology known as DLSS (Deep Learning Super Sampling). To create it, Nvidia trains a neural network on pre-release game-engine images rendered at lower and higher resolutions. The resulting network weights are delivered to the Tensor Cores in consumer GPUs through driver updates.
> When DLSS is turned on, the game is rendered at a lower resolution, and the Tensor Cores upscale the result to the target resolution using deep learning. This yields a higher frame rate at the cost of slightly worse image quality at high resolution. It can be used in conjunction with ray tracing to recover frame rate: Nvidia claims that with DLSS and ray tracing both on, users can achieve performance similar to running with ray tracing off.
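To get a sense of the trade-off being described, here is some back-of-the-envelope arithmetic (not Nvidia's actual pipeline, which is a proprietary network; the "Quality mode" internal resolution below is the commonly cited 2/3-per-axis figure):

```python
# Shading cost scales roughly with pixel count, so rendering at a lower
# internal resolution and upscaling frees up frame time for ray tracing.
target = 3840 * 2160    # 4K output resolution
internal = 2560 * 1440  # commonly cited "Quality mode" internal resolution

print(f"pixels shaded: {internal / target:.0%} of native")  # ~44%
print(f"ideal shading speedup: {target / internal:.2f}x")   # ~2.25x
```

The upscaler has to win back the image quality lost to that 44%, which is exactly where the arguments start.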
AMD in turn whipped out a contrast-adaptive sharpening (CAS) shader that does much the same thing without the buzzword bingo. https://www.techspot.com/review/1903-dlss-vs-freestyle-vs-ri... This move forced Nvidia to work on its own sharpening filters, making the fancy DLSS obsolete next to a dumb brute-force filter. So much for AI.
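The "dumb" part is genuinely simple. Below is a simplified CPU-side sketch of the contrast-adaptive idea, in the spirit of AMD's FidelityFX CAS but not the shipping shader code (constants and names are mine): sharpening strength is scaled down per pixel where local contrast is already high, so edges don't ring while flat areas get crisped up.

```python
import numpy as np

def cas_sharpen(img, sharpness=0.5):
    """Contrast-adaptive sharpen of a single-channel float image in [0, 1].
    A simplified sketch of the CAS approach, not AMD's exact code."""
    p = np.pad(img, 1, mode="edge")
    c = p[1:-1, 1:-1]                       # center pixel
    n, s = p[:-2, 1:-1], p[2:, 1:-1]        # cross neighbours
    w, e = p[1:-1, :-2], p[1:-1, 2:]
    mn = np.minimum.reduce([c, n, s, w, e]) # local min over the cross
    mx = np.maximum.reduce([c, n, s, w, e]) # local max over the cross
    # Adaptation: how much headroom the neighbourhood has before clipping.
    # High-contrast regions (small headroom) get little extra sharpening.
    amp = np.sqrt(np.clip(np.minimum(mn, 1.0 - mx) / np.maximum(mx, 1e-6),
                          0.0, 1.0))
    # Negative-lobe weight for the cross neighbours; `sharpness` is the
    # developer-tunable knob, mapped here to a lobe strength I picked.
    lobe = -amp * (0.125 + 0.075 * sharpness)
    out = (c + lobe * (n + s + w + e)) / (1.0 + 4.0 * lobe)
    return np.clip(out, 0.0, 1.0)
```

No training data, no driver updates, no Tensor Cores: one pass over the neighbourhood per pixel, which is why it runs as a cheap post-process shader on anything.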
https://clara.io/view/5c7d28c0-91d7-4432-a131-3e6fd657a042
It is based on the DICE method described here: https://colinbarrebrisebois.com/2011/03/07/gdc-2011-approxim...
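The core of that DICE trick is only a few lines: distort the light vector along the surface normal, raise the view-versus-inverted-light dot product to a power, and attenuate by a local thickness term. A numpy transcription of the formula from that post for clarity (variable names and parameter defaults are mine):

```python
import numpy as np

def approx_translucency(L, V, N, thickness,
                        distortion=0.2, power=4.0, scale=1.0, ambient=0.0):
    """Per-pixel translucency term after Barre-Brisebois & Bouchard's
    GDC 2011 approximated translucency (the DICE method linked above).
    L, V, N: normalized light, view, and normal vectors, shape (..., 3).
    thickness: baked local-thickness value, inverted so that
    1 = thin/translucent and 0 = thick/opaque."""
    Lt = L + N * distortion                          # distorted light vector
    vdoth = np.clip(np.sum(V * -Lt, axis=-1), 0.0, 1.0)
    return (vdoth ** power * scale + ambient) * thickness
```

As in the original post, the returned term is then multiplied by the light's attenuation and color and added to the regular diffuse lighting, giving a convincing thin-surface subsurface look at essentially no cost.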