Although I haven’t used these monitors, my experience with high DPI on Windows and Linux has been a nightmare compared to OS X. It surprises me that the non-Mac world hasn’t made this a bigger priority.
I actually think Windows does it better than OS X (and Gnome, Wayland, and anything that does not support true fractional scaling). OS X just scales the entire surface and as a result it always looks blurry.
> OS X just scales the entire surface and as a result it always looks blurry.
I genuinely have zero idea what you are talking about. Typing this from my MacBook connected to a 5K LG ultrawide monitor, and it is as crystal sharp as it can get. As opposed to my Windows 10 desktop (connected to the same monitor), where the occasional application window renders fairly blurry and inconsistently (one of the main offenders, ironically, is Task Manager). And don't even get me started on font rendering in general.
When I used a 5K LG, on the lowest scaling above 100%, I would get shimmering effects when I moved windows. You could see the same art/glyph rendered differently depending on whether it was on an even or odd line; move the window one pixel and the text totally changed. If you only ever run at integer scaling, this wouldn't be apparent.
Windows does a much better job with non-integer scaling, because hairlines are 1px no matter the scaling and text is rendered with pixel hinting instead of macOS's new, lame strategy of supersampling.
Surprisingly, Macs can’t actually scale the UI like Windows. All you can do is simulate higher or lower resolutions. Which is fine if your DPI is sky-high, but a real pain in the arse if you’re working with a QHD 24”, for example, and just want everything to be a bit bigger.
OS X not only uses a lame hack to scale, it completely muddles the issue by introducing the concept of "HiDPI" UI elements. Somehow I can set my 4K monitor to use "native resolution" at 3840 x 2160, and yet the UI and fonts look fuzzy! Absolutely terrible, and a complete embarrassment for Apple IMO, since they are supposedly the UI kings. You only don't notice the issue because you're using a 27" 5K display, which has been "blessed" by Apple as the "correct" DPI to match native screens. For those of us with 4K screens (the vast majority of the market), I guess we're just supposed to enjoy a subpar experience. Even X11 looks better.
I only closed the book on this issue after finding BetterDisplay [0]. Basically a third-party program that gives you complete control over resolution, display density, and a ton of other options on macOS. It has a trial mode, but it is well worth the money. With that + the CLI tweak to set font smoothing to 0, the 4K experience on macOS looks decent. You can even decrease the effective scale of the native screen past "More Space", so those of us with good eyes can actually take advantage of the screen real estate.
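For anyone hunting for that CLI tweak: a minimal sketch, assuming the comment above means the widely documented AppleFontSmoothing global default (0 turns off macOS's extra font smoothing). The `set_font_smoothing` helper is hypothetical and only wraps the `defaults` command for scripting convenience; running it by hand in a terminal does the same thing.

```python
import subprocess

def set_font_smoothing(level: int = 0) -> None:
    """Write the AppleFontSmoothing global default (0 disables extra smoothing).
    Hypothetical helper; equivalent to running the `defaults` command directly."""
    subprocess.run(
        ["defaults", "-currentHost", "write", "-g", "AppleFontSmoothing", "-int", str(level)],
        check=True,
    )

if __name__ == "__main__":
    set_font_smoothing(0)  # restart apps (or log out/in) for the change to take effect
```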
Also, if you're curious to explore this issue beyond my subjective thoughts here, these blog posts [1] [2] do a great job diving into what is so bad about macOS scaling, why 4K 27" or 32" screens end up looking bad, and why 5K 27" looks okay.
macOS will render at the next-highest integer scale factor and then downscale to fit the resolution of your monitor, instead of just rendering at the fractional scale in the first place.
There are several scenarios where it clearly doesn't look that good, and where Windows objectively does a much better job.
Most people (and companies) aren't willing to spend $1600 on Apple's 5K monitor, so they get a good 27" UHD monitor instead, and they soon realize macOS either gives you pixel-perfect rendering at a 2x scale factor, which corresponds to a ridiculously small 1920x1080 viewport, or a blurry 2560x1440 equivalent.
The 2560x1440 equivalent looks tack sharp on macOS. It renders at 5120x2880 and scales it down to native; as I said, it’s effectively supersampling. I used this for years and never experienced a blurry image. I now run a 5k2k monitor, also at a fractional scale, and again it looks excellent.
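To put rough numbers on this exchange: both sides are describing the same mechanism, where macOS renders the chosen "looks like" resolution at 2x into a backing store and then resamples that to the panel. Here is a minimal sketch of the arithmetic as described in this thread (not Apple's actual pipeline); `macos_scaled_mode` is a made-up name for illustration.

```python
def macos_scaled_mode(looks_like, panel):
    """Render at 2x the 'looks like' size, then resample to the panel's native
    resolution. Returns the backing-store size and the final downscale ratio."""
    backing = (looks_like[0] * 2, looks_like[1] * 2)
    return backing, backing[0] / panel[0]

# "Looks like 2560x1440" on a 27" 4K (3840x2160) panel:
print(macos_scaled_mode((2560, 1440), (3840, 2160)))  # ((5120, 2880), ~1.33) -> non-integer resample
# The same setting on a 27" 5K (5120x2880) panel:
print(macos_scaled_mode((2560, 1440), (5120, 2880)))  # ((5120, 2880), 1.0) -> maps 1:1 to hardware pixels
# Native 2x on the 4K panel:
print(macos_scaled_mode((1920, 1080), (3840, 2160)))  # ((3840, 2160), 1.0) -> sharp, but a small 1920x1080 workspace
```

Whether that ~1.33x resample on a 4K panel reads as "blurry" or as "effectively supersampled" is exactly the disagreement here; on a 5K panel the question never comes up because the backing store maps 1:1 to hardware pixels.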
Modern Linux DPI support is a nightmare. It's a shame, since if you just run an old-school software stack (X11; minimal window manager; xrandr to adjust DPI if you hotplug a monitor), then it has much nicer font rendering than Mac OS.
This is particularly frustrating since I've been using high DPI displays since the CRT days. Everything horribly regressed about a decade ago, and still isn't back to 1999 standards.
IDK, high DPI worked fine for me under Linux. I just set the desired DPI in Xfce settings, and everything scales properly. (Except Firefox, which has its own DPI setting! But it works equally painlessly.)
Where things go haywire is mixed resolutions. It's best avoided :-\ Hence I now have a 28" 4K external screen which is exactly like four of my laptop's 14" FHD screens, so the DPI stays strictly the same.
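The "exactly like four 14" FHD screens" claim checks out arithmetically. A quick sketch using the generic PPI formula (the `ppi` helper is just for illustration, nothing distro- or Xfce-specific):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from the pixel dimensions and the diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 28), 1))  # 157.4 -> 28" 4K external display
print(round(ppi(1920, 1080, 14), 1))  # 157.4 -> 14" FHD laptop panel, identical density
```

Identical density on both screens means a single global DPI value covers everything, which is why this setup sidesteps the mixed-DPI mess entirely.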
I'm actually not sure what your complaint is at this point. I've long been using three mixed-DPI displays on Windows 10 for gaming as well as normal desktop stuff. Any relatively modern software scales fine to high DPI. Some old software using old APIs has to be upscaled by the OS and is blurry, but that's stuff like... Winamp.
I guess you've not used VMware, VirtualBox, DaVinci Resolve, or most anything written in Java. There's more, but that's off the top of my head. There's plenty of software out there with unusably small text/displays even with just one display.
At high DPI the differences in font rendering between ClearType, FreeType, and macOS diminish greatly; it's mostly a matter of taste, and at least Microsoft hasn't crippled low-DPI rendering in recent Windows versions like Apple did with macOS.
I'd guess gaming is at least partially responsible. For anything more than 2K you need a high-end/expensive video card, and those just aren't that common. Just look at the Steam stats right now: 62% of users are on 1080p.