
> HDR can't give you more range,

It gives you more range.

> so what does it actually do?

If you have done a little C programming, you will know (or can check) that a 32-bit `int` represents values from -2147483648 to 2147483647, while a 32-bit `float` goes from -340282346638528859811704183484516925440.0 to 340282346638528859811704183484516925440.0 (roughly ±3.4e38) - clearly more "range". This is almost exactly what is happening here.

(If you haven't done a little C programming, you should!)
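
For the curious, a minimal C program makes the comparison concrete, using the standard `<limits.h>` and `<float.h>` constants:

```c
#include <stdio.h>
#include <limits.h>
#include <float.h>

int main(void) {
    /* Both types are 32 bits wide, but the float trades precision
     * for a vastly larger range of representable magnitudes. */
    printf("int max:   %d\n", INT_MAX);    /* 2147483647 */
    printf("float max: %.0f\n", FLT_MAX);  /* ~3.4e38, printed in full */
    printf("smallest positive normal float: %g\n", FLT_MIN); /* ~1.2e-38 */
    return 0;
}
```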

> The only thing I've seen it do is override my brightness setting to make my screen go to full brightness when I've set it lower.

That is too bad. My experience is very good: only the QR code becomes much more striking, while the rest of the display remains the same. Perhaps it is implemented by raising your display's brightness to compensate for the reduced range left by your brightness setting; but if that is also what is happening on my display, something else must be darkening the non-HDR areas to compensate, because I cannot notice it.



A 32-bit float can only represent 2^32 numbers (slightly fewer, in fact, since many bit patterns are NaNs), same as a 32-bit int.

What’s happening here is more like using a short (16-bit) rather than a byte (8-bit) for each channel, although not quite to the same extent. HDR allows you to represent 2^6, 2^12, or 2^18 times as many colors as SDR (for 10-, 12-, or 14-bit channels respectively).
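
A sketch of the arithmetic behind those exponents, taking the 10-bit case as an example (2 extra bits per channel times 3 channels gives the 2^6 factor; 12- and 14-bit depths give 2^12 and 2^18 the same way):

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* With b bits per channel and 3 channels, an image can encode
     * 2^(3b) distinct colors. */
    uint64_t sdr   = (uint64_t)1 << (3 * 8);   /* 8 bpc:  2^24 colors */
    uint64_t hdr10 = (uint64_t)1 << (3 * 10);  /* 10 bpc: 2^30 colors */
    printf("10-bit vs 8-bit color ratio: %llu (= 2^6)\n",
           (unsigned long long)(hdr10 / sdr));
    return 0;
}
```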


HDR10 is 10 bits per component, and there are plenty of 16-bit images that aren't HDR, for example: http://i.imgur.com/Wm2kSxd.png -- the main thing about HDR is increasing the range of values, i.e. how big and small the representable values can be, not merely how many distinct values you can represent.
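
To illustrate that range-vs-precision distinction, here is a small sketch; the 4x value is a made-up example of a pixel brighter than SDR white, the kind of value only a float/HDR-style encoding can hold:

```c
#include <stdio.h>

int main(void) {
    /* Integer pixels, whatever their depth, map onto the same
     * [0, 1] interval: more bits means finer steps, not a wider range. */
    unsigned char  sdr8   = 255;    /* brightest 8-bit value  */
    unsigned short deep16 = 65535;  /* brightest 16-bit value */

    /* A float pixel can exceed 1.0, i.e. be brighter than SDR white.
     * That headroom, not the step count, is what HDR adds. */
    float hdr = 4.0f;               /* hypothetical pixel at 4x SDR white */

    printf("8-bit white:  %f\n", sdr8   / 255.0);
    printf("16-bit white: %f\n", deep16 / 65535.0);
    printf("HDR pixel (in units of SDR white): %f\n", hdr);
    return 0;
}
```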


Well, that’s just the colour space; you’ve also got the gamma curve to consider when you think about how something is actually perceived.
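
A sketch of why the gamma (transfer) curve matters, using the common 2.2 approximation rather than the exact piecewise sRGB curve or the PQ curve that HDR formats use (link with -lm):

```c
#include <stdio.h>
#include <math.h>

/* Encode a linear-light value with a simple 2.2 gamma curve. */
static float gamma_encode(float linear) {
    return powf(linear, 1.0f / 2.2f);
}

int main(void) {
    /* Dark linear values get a disproportionately large share of the
     * encoded scale, roughly matching how the eye perceives brightness. */
    for (float v = 0.0f; v <= 1.0f; v += 0.25f) {
        printf("linear %.2f -> encoded %.3f\n", v, gamma_encode(v));
    }
    return 0;
}
```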



