Forget 4K, 6K, or 8K. Pixel density is the spec to look at when considering any display. It doesn't matter if it's a wearable, phone, laptop, monitor, TV, or stadium-wrapping LED surface: the only factors that matter for resolution are the distance to the viewer's eyes and the distance between the pixels.
For context, 20/20 vision is the ability to resolve detail about 1 arcminute across. Beyond that point there are diminishing returns.
To see what this looks like for different viewing distances: https://www.desmos.com/calculator/dtrszipgmd (black line is mm, red line is freedom units, green is a 220PPI reference).
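For the curious, here's a minimal sketch of what that calculator is doing (the function name and example distances are just illustrative): the PPI at which one pixel subtends 1 arcminute at a given viewing distance.

```python
import math

def limit_ppi(viewing_distance_mm, arcmin=1.0):
    """PPI at which one pixel subtends `arcmin` minutes of arc
    at the given viewing distance (simple small-angle geometry)."""
    theta = math.radians(arcmin / 60)              # 1 arcmin in radians
    pixel_pitch_mm = viewing_distance_mm * math.tan(theta)
    return 25.4 / pixel_pitch_mm                   # 25.4 mm per inch

for d in (300, 400, 600, 3000):                    # phone, laptop, desktop, TV-ish
    print(f"{d:>5} mm -> {limit_ppi(d):5.0f} PPI")
```

At 400 mm it lands right around 218 PPI, which is where the ~220 PPI / ~40 cm figures discussed below come from; at 3 m (TV distance) it drops to roughly 29 PPI.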
20/20 vision is a low bar though. I’m bothered by how much my vision has deteriorated since I was a teenager. The difference is obvious, as I now need to walk up to things I used to read from a distance. I tested 20/20 a few weeks ago.
Absolute preach right here. I want to laugh when someone says it's not perceptually possible for me to differentiate the pixels of my 4K display from a typical viewing distance, but that assertion has become so common that I'm dying to know whether my vision is just that much better or theirs is just that much worse. :/
That's a good point to highlight: 20/20 is not perfect vision. It's simply a benchmark for "good enough" vision. IIRC there are physical limits around 0.4 arcmin. People with good eyesight sit somewhere between the two.
This happens naturally with age, usually starting sometime between ages 45 and 50, as your eyes shift from variable focus to fixed focus (at distance). Google presbyopia.
So viewing from 40cm away is about where 220PPI crosses the threshold beyond which you can't tell any difference? That makes me think 220 is quite a well-picked number, as that's about how far away someone might typically be viewing.
There's no sudden switch. All perception is a little squishy and dependent on the individual. For most people though, once you're creating an image that's 220PPI and viewed from 400mm or so, it's probably better to dedicate bandwidth to better temporal resolution, colour depth, or lessening the compression/compute/power/cost needed to balance those three.
> So viewing from 40cm away is about where 220PPI crosses the threshold beyond which you can't tell any difference?
There is a difference between your ability to resolve detail and whether you can tell _any_ difference. The usual measure of sight is visual acuity: your ability to discern details of a certain size at a certain distance. Specifically, in most exams, it's the ability to tell strokes from blobs.
However, even when the pixel density is beyond your ability to discern details, it can still affect your perception in other ways. For example, misalignment of borders can be detected at much finer scales than your visual acuity would suggest.
On computer screens particularly, pixel density also affects how things are rendered: think diagonal lines, curves, hollow spaces, etc. To take an extreme example, a box rendered on a 4x4 screen is going to look better than on a 2x2 screen even if the former is twice the distance away.
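A toy sketch of that rendering effect (not any real rasterizer, just supersampled coverage of a diagonal edge): the 4x4 grid approximates the edge noticeably better than the 2x2 one, even though both are "antialiased" the same way.

```python
def rasterize_diagonal(n, samples=16):
    """Coverage (0..1) of the half-plane y > x on an n x n pixel grid,
    estimated by supersampling each pixel."""
    grid = []
    for row in range(n):
        cells = []
        for col in range(n):
            hits = 0
            for sy in range(samples):
                for sx in range(samples):
                    x = (col + (sx + 0.5) / samples) / n   # sample position in
                    y = (row + (sy + 0.5) / samples) / n   # unit screen space
                    hits += y > x
            cells.append(hits / samples**2)
        grid.append(cells)
    return grid

for n in (2, 4):
    print(f"{n}x{n}:")
    for row in rasterize_diagonal(n):
        print("   " + " ".join(f"{c:.2f}" for c in row))
```

The 2x2 version collapses the whole edge into two half-grey pixels; the 4x4 one at least shows a recognisable staircase.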
> Forget 4K, 6K, or 8K. Pixel density is the spec when considering any display.
I'm not really sure what you're refuting, if anything. Either I personally find 220ppi now rather unimpressive, or I found the 2015 MacBook display to have been especially impressive, but I am not sure which; all I was really saying is that they do indeed have quite similar pixel densities.
Nothing to refute - I was simply continuing from your highlight of pixel densities.
While PPI has stayed mostly static (which is a good engineering choice), plenty of other display parameters have improved: better contrast, faster response times, deeper colour, larger continuous surfaces. The 2015 MacBook display was impressive. Modern panels are too!
> The 2015 MacBook display was impressive. Modern panels are too!
Apple's always been pretty impressive with their color reproduction. Which makes sense, since they are really, really pushing hard for designers and content creators (that's one of the big central themes of nearly all Mac-related advertising). Their newer, more premium machines are mostly sporting HDR & WCG displays, and macOS has had incredibly good software support for many years.
The MacBook was a premium product for its time, so even 5 years later, there were brand new higher-resolution (and higher-PPI) panels that performed much worse! My 2020 ASUS laptop is a great example of that. The display was 4K, Adobe RGB, and had a higher PPI, so it sounds like it should have been better, but it only supported 24 bpp and had awful ghosting (around 6 frames to change a pixel).
But at least then I won't screenshot something and have the file saved to disk as Adobe-RGB-interpreted-as-sRGB. That's literally what would happen: the dedicated GPU did the color space conversion, but its output was then taken as an input by the Intel APU and interpreted as sRGB, even though it had been rendered for an Adobe RGB display. Even if you convert the file back, it still loses a ton of precision because Adobe RGB was squeezed into 24 bits in the middle. I care about that a lot. Something that annoys only me is possibly tolerable, but something that affects the content I share with others is life and death (in terms of importance to me).
They also connected that display to the Intel APU, so Windows could not properly do the color space conversion in hardware, resulting in artifacts so terrible that I had to turn off the conversion entirely and just deal with viewing everything in sRGB-interpreted-as-Adobe-RGB. (Yes, I know "switchable graphics" is a near-universal configuration in laptops, but it's incredibly cursed and causes so many fucking problems in a completely irreparable fashion. Why not just connect the display input to the display output as it was intended, ASUS turds?)
They market it as "Pantone validated", so I guess that means that they added negative value for the sake of a marketing sticker. (Honestly, put that way, it definitely sounds like something that they couldn't possibly not do. Companies just can't help themselves when it comes to this stuff.)
When I switched to a desktop, I grabbed a random 15" 4K monitor from China for $70. It's sRGB, but it's proudly sRGB, and didn't half-ass itself into something that is hopelessly broken like ASUS did.
When I have a spare $400 I'll shell out for one of the 10-bit DCI-P3 displays on AliExpress and then be happy.
Generally Apple picks the optimal PPI, and they don’t market the PPI value, only that it’s “Retina”. They will prefer odd resolutions over suboptimal PPI. For example, the M2 MacBook Air is 2560 by 1664 at 224 PPI. The maximum PPI you can get on an iPhone is 460, so I assume in their testing they found that even people with perfect vision holding it really close had no use for sharper screens than that.
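Turning the small-angle geometry above around gives a rough sanity check on those numbers (this assumes the 1-arcmin / 20/20 criterion; sharper eyes push the distances out):

```python
import math

def limit_distance_mm(ppi, arcmin=1.0):
    """Viewing distance at which one pixel of a display with the given PPI
    subtends `arcmin` minutes of arc (inverse of the PPI-from-distance formula)."""
    pixel_pitch_mm = 25.4 / ppi
    return pixel_pitch_mm / math.tan(math.radians(arcmin / 60))

print(f"460 PPI: ~{limit_distance_mm(460):.0f} mm")   # ~190 mm, closer than most people hold a phone
print(f"224 PPI: ~{limit_distance_mm(224):.0f} mm")   # ~390 mm, roughly laptop viewing distance
```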
> Generally Apple picks the optimal PPI, and they don’t market the PPI value, only that it’s “Retina”.
First world problem, but I hate when companies choose a one-size-fits-all value that is allegedly supposed to reach the limits of human perception, but that value still turns out not to be enough for me because (surprise) some people are naturally slightly better than average.
For example, the new Apple VR headset, despite having a resolution of around 4k*4k pixels per eye, would probably still have to make up for it in other ways, because I can still see significantly more pixels than that.
Apple actually seems to know this, and they know it so well that their entire marketing push is based on augmented reality and immersion. Blending things with the environment, moving around and interacting: basically experiences that can't really be let down by the screen resolution being too low.
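A back-of-the-envelope check on the per-eye figure (both numbers are assumptions, not official specs: the ~4k horizontal pixels per eye quoted above and a guessed ~100° horizontal field of view):

```python
# Rough pixels-per-degree estimate for a headset. Assumed values, not specs:
per_eye_px = 4000   # horizontal pixels per eye (figure quoted above)
fov_deg = 100       # guessed horizontal field of view

ppd = per_eye_px / fov_deg
print(f"{ppd:.0f} px/deg vs the ~60 px/deg needed to hit 1 arcmin per pixel (20/20)")
```

Around 40 pixels per degree is well short of the 20/20 threshold, never mind better-than-average eyes, which fits the point above about leaning on immersion rather than raw sharpness.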
The "human perception" crap is always marketing. Everything is supposedly at the limit of human perception, until the next generation of technology comes out, and then this time it's for real at the limit of human perception! I think the last 20 years of tech products I've bought were always marketed as at the limits of human perception.
Step aside 8-bit color, 16-bit color displays are here: 65,536 colors! Nobody can perceive more than that! Then 32-bit color came along: 16+ MILLION colors!! How can it get better than that? Humans can't perceive more! 30FPS gaming was the gold standard for what the eye can behold, then it became 60FPS, now it's 120FPS! The PPI arms race is the same thing.
And none of those devices have a PPI that you can customize when you purchase the device, in contrast to many PC laptops that generally offer multiple options (say, either 1080p or 4K on the same device), meaning each device indeed has a one-size-fits-all PPI. I never said Apple uses the same PPI for all their devices.
The iPhone is still pretty much the same ~330 PPI as before; only the OLED numbers are inflated. If you count only the PPI of the sparsest subpixel in the OLED layout, newer iPhones are still a magical ~330 PPI.
218 ppi matches the native displays on current Macs.
The Dell 8K has 288 pixels per inch, which is 4x the linear detail of the original Mac.
The Macintosh 128K was released with a 72 ppi display, so every pixel was the size of a "point", a unit used in more traditional, paper-based graphic design.
The history of the point [1] is equally interesting and messy. Back in school, my old teachers (mostly old printing machine operators and typesetters) preferred Apple over anything else, especially because of its native 72 ppi vs Windows' usual 96 ppi.
It's as if the Windows spec were retconned into being the historical norm all along, but I need to dig around to find the origin story here... Why 96 dpi?
My simple guess is that 96 is also divisible by 12, and thus easy to split up into 2, 3, 4, 6, 8, 12, 16, 24 parts... you also get three pixels at 72 dpi for every four at 96 dpi.
In 1996, it was adopted by the W3C for Cascading Style Sheets (CSS), where it was later related at a fixed 3:4 ratio to the pixel due to a general (but wrong) assumption of 96 pixel-per-inch screens.
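To make that 3:4 ratio concrete, here's a tiny sketch of the conversion CSS ended up standardising (1 in = 96 px = 72 pt):

```python
# CSS fixes 1 inch = 96 px = 72 pt, so 1 pt = 4/3 px (a 3:4 point-to-pixel ratio).
def pt_to_px(pt):
    return pt * 96 / 72

def px_to_pt(px):
    return px * 72 / 96

print(pt_to_px(12))   # 16.0 px -- the classic 12 pt / 16 px default text size
print(px_to_pt(96))   # 72.0 pt -- one CSS "inch" either way
```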
So close... The display with the smallest pixels in my collection has been the quirky Essential PH-1; its display is a silly 504 ppi. I loved that thing, but the device is no longer supported by official updates.
It looks mighty nice, just 4 ppi short of exactly 2x pixels per linear inch of these new Macs. I have no idea why.
For context, my MacBook from 2015 had a PPI of 220, so this comment is extremely accurate.