Q&A with the developer of BetterDummy: from macOS secrets to his motivations (theregister.com)
95 points by laktak on Dec 7, 2021 | 154 comments



I think Apple is trying to force the hand of monitor manufacturers towards 4k or higher. The entire external monitor industry is such a no-innovation, customer-hostile dumpster fire.

Buy the GHR1050PQS75-UHW204773! It’s got garbage white uniformity, terrible black levels and IPS glow, cheap plastic housing and settings software that feels like it was created in 1994. Wirecutter recommended!

Compare this to the TV industry, where you can get amazing picture quality with OLED and at least somewhat decent levels of software competence.

Meanwhile desktop monitors (something most of us spend 8 hours a day in front of) are stuck using 30-year-old LCD tech, and most aren’t even 4k yet.

The entire PowerPoint/excel crowd seemingly has no problem spending most of their waking hours staring at giant pixels all day, even if the laptop screen they have sitting next to the monitor has almost 3X the pixel density.


> I think Apple is trying to force the hand of monitor manufacturers towards 4k or higher. The entire external monitor industry is such a no-innovation, customer-hostile dumpster fire.

Please stop apologizing for Apple's customer-hostile behaviour here. A $2 trillion company should be able to make an external monitor not look like blurry trash when connected to their hardware.

The fact that it takes someone to make a third-party app to fix what Apple can't speaks volumes to how much they care about/understand their customer base. Apple should have tested M1 hardware with hot garbage external monitors, realized the customer experience sucked, and fixed it.


Don't know about Apple, but maybe those external monitors shouldn't be such "hot garbage"?!


Not all external monitors are "hot garbage" and it was probably my mistake to use that term.

Apple could always choose to add some "innovation" to the external monitor market by manufacturing their own line of displays. [1] For example, they have a nice 4.5K display in the iMac, but they won't sell it as a monitor for some reason.

Instead, they offer exactly one external monitor that starts at $4999 USD.

[1] https://en.wikipedia.org/wiki/Apple_Cinema_Display


Oh, from my PoV all those 4K displays are hot garbage.

It'd be quite nice if Apple would upend the external monitor market but usually they prefer serving their own market instead. And as an Apple user today you are quite spoiled indeed...


> The entire external monitor industry is such a no-innovation, customer-hostile dumpster fire.

In the last few years we've seen a massive shift towards ultrawide, high (dynamic) frame rate and HDR screens from pretty much all manufacturers. Many of those technologies Apple just didn't support until recently. They still don't support DisplayPort MST for daisy-chaining multiple monitors.

So, what are you talking about? Did you even check the market outside the Apple store?


Yes, I spent almost a month researching the entire market and every single offering available.

Nearly every ultrawide you speak of has very poor pixel density. 4k spanned across 34+ inches diagonally looks pretty terrible when you view it at the distance most people sit from their monitors.

I can’t seriously be the only one who can tell the difference. Have you ever compared how text looks on your phone or laptop screen compared to most monitors available today?


Agreed, there are few/no options if you have decent eyesight and want high pixel density.


Even if you don't have decent eyesight crisper text is easier to read than a blurry mess of pixels representing letters.


Indeed, still nobody makes the monitor I want – a double-wide 4K (or 5K) 27".

The closest we have now are some wide 5K's with good DPI and the Dell 8K.


Yep, text looks nicer in 4k. So what? 1080p is still plenty workable. I remember the jump from my 1024x768 CRT Sony to my first LCD monitor (I think it was 1920x1200?). Now THAT was an upgrade. 4k is just trivial in comparison. I for one intend to use my 1080p monitors until they die and cannot be fixed.


Never thought I would see the time 4K isn't enough anymore!


It all depends on how close you sit to the screen. What matters is pixel density, and that’s a function of both the resolution and how close you’re sitting.

So if you have a 4k TV and a 4k monitor, there’s actually quite a difference. If your monitor is bigger than 24” you’re getting way worse pixel density and visible pixels at the distance you’re sitting.

On TVs it’s typically never a problem because you’re sitting far enough away.
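A rough sketch of that relationship, in plain geometry (the 24-inch desk viewing distance is an assumption, not a measurement; adjust for your own setup):

    import math

    def ppi(width_px, height_px, diagonal_in):
        # pixels per inch from native resolution and panel diagonal
        return math.hypot(width_px, height_px) / diagonal_in

    def px_per_degree(density, distance_in):
        # pixels that fit into one degree of visual angle at a given viewing distance
        return density * distance_in * math.tan(math.radians(1))

    DESK = 24  # assumed desk viewing distance in inches (~60 cm)
    for name, w, h, diag in [('4K 24"', 3840, 2160, 24),
                             ('4K 27"', 3840, 2160, 27),
                             ('4K 32"', 3840, 2160, 32),
                             ('5K 27"', 5120, 2880, 27)]:
        d = ppi(w, h, diag)
        print(f'{name}: {d:.0f} ppi, {px_per_degree(d, DESK):.0f} px/deg at {DESK}" away')

The same 4K panel drops from roughly 184 ppi at 24" to about 138 ppi at 32", while a TV across the room subtends far fewer pixels per degree of your vision even at lower ppi, which is why it never looks coarse there.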


4K was only ever enough for 24" monitors, where you can use a clean 200% scaling.

If you've already moved on to 27" or higher, 4K was never enough!


That's why Apple makes their 27" iMac 5K. 4K is almost enough.


Ironically it's not easy to find a current-gen 4k 24"


They stopped selling the Dell UP2414Q! What a shame. So it looks like the only option is the pricy LG Ultrafine 24MD4KL.

There's also only a single 5k display on the market (also from LG).

It's a shame that there are so few high res displays on the market, and they are all 16:9...


LG also has the 24UD58 for $300


It wasn't enough to start with for large monitors. High pixel density is important for text readability. A 30" screen at typical monitor distance gives you relatively few pixels per arcminute.


Not the OP but for me: I don't watch movies on my monitor so I don't care about HDR, I don't play games so I don't care about dynamic frame rate.

What I need is sweet, sweet resolution! Those ultra-wide or high-frame-rate options all come at the expense of DPI, which ends up the same as the DPI of 2K monitors...


> They still don't support DisplayPort MST for daisy-chaining multiple monitors.

The worst part about this is that macOS does in fact support MST, but only for their 6K display (and the 5K on older machines, which no longer requires MST since DP 1.3).


If that was the case, why doesn't macOS support 8k displays?

If you want an 8k display at 60Hz, your only option is to use Windows.

Microsoft is also the only company that offers a high-res 3:2 display (the Surface Studio at 4500x3000 pixels), an aspect ratio more geared towards productivity than streaming movies.

It's weird that if you want to get nice displays, Windows is today a better option than macOS.


I've been in the market for a new monitor for quite a while now, but haven't found a good fit, precisely because most monitors evidently kinda suck for the price. My requirements are pretty straightforward:

- 400+ nits peak brightness (my room gets really bright during the day, and my current monitor at 250 nits is practically unusable near noon)

- 4k resolution at 32 inch size (so I can use it without scaling while not taking too much desk space)

- < $500 (I refuse to pay more for a 32 inch screen than a 40 inch+ screen in a TV with much better specs)

As far as I can tell, this monitor doesn't exist (suggestions welcome!). Especially problematic is the brightness requirement; every monitor with more than 300 nits of brightness I've seen has cost over $1000 for some reason.

At this point I'm probably going to end up settling for a 40-43 inch TV instead; plenty of those meet the requirements within the $500 budget. But the last time I used a TV as a monitor, I remember being really annoyed at having to use the remote to turn the TV back on after it went to sleep (turned off?), compared to a monitor, which gets woken up by input from the connected computer.

Are there any other folks using TVs as monitors here? Curious if people have ideas on how to get around this issue in their own setups?


IIRC, if the TV supports commands via HDMI, it can be woken up from sleep without a remote. It requires extra setup on the computer to work.

I "solved" the issue of my preferred screen size not being available by going retro. It turns out that 12+ year old formerly-expensive flatscreens are a bit rubbish with brightness and power consumption, but they are better than "modern" screens for my preferences.


> Are there any other folks using TVs as monitors here?

YMMV, but in my experience cheapo TVs tend to not support 60Hz refresh rate. Most often it's either 24, 30 or 50, and that's absolutely a dealbreaker. It's fine for a TV, not for a computer monitor.

So if you're trying to save a buck, TV is probably not an option.


I definitely remember a time when this was something to watch out for (early 4k days), and made most TVs non-starters as computer monitors.

I don't think it's true anymore though as most TVs I've seen, even the cheap ones, can do 4k 60hz over HDMI no problem.

This one I've been eyeing is $500 at Amazon for the 43-inch version: https://www.amazon.com/SAMSUNG-43-Inch-Class-QLED-Built/dp/B...

According to rtings it does 4k at 60Hz, 500+ nits peak brightness, and even has a 32 inch version (at an even lower $400 price)!

https://www.rtings.com/tv/reviews/samsung/q60-q60a-qled

It's an absolute steal in terms of specs/dollar compared to pretty much every 4k monitor out there. I'm just hesitant to pull the trigger because of the manual wake up thing, which has really frustrated me in the past.


Hot garbage is a pricey laptop that by design won't play well with very common peripherals to force owners into buying another pricey peripheral.

There are absolutely zero reasons why an M1 shouldn't work with 1080p and above, except the usual Apple greed and lock-in.


Yeah, it's really interesting to read through a thread about MacOS not playing well with third-party displays turn into a discussion about how only Apple makes good displays. That's not the point: as a customer, I prefer vendors to enable me to put together a setup that works for me. Apple seems to be intentionally failing in this regard.


I have a 1920x1080 144Hz display and another 3440x1440 60Hz display with a Mac mini M1 without issues. I'm curious what issues you're having with 1080p?


I don't give a shit about 4K, colour accuracy, ultrawide, or 120+Hz refresh rates. My cheapo monitors look good enough to me, and I have no interest in dealing with the dumpster fire that is HiDPI.

Applications already know how to deal with subpixel AA, and dealing with one application with poor antialiasing is still vastly preferable to dealing with one application with broken HiDPI scaling.

If monitors could just stagnate in ~2010-2015 tech for all eternity, that'd be great. It's a solved problem, let's collectively move on to problems that actually deserve the attention.


>I don't give a shit about 4K, colour, ...

I can't go back now, once I got used to high-dpi everything else looks like garbage.


I often switch between a ~6" 2560px phone screen and a ~7" 1920px one, and the latter can clearly display more information at the same level of readability than the former.

Same with 15" 4k hi quality laptop screen vs 40" 4k display.

So by the only metric that matters, HiDPI appears to be irrelevant.

The only thing that I find to matter (as a dev) is the angular size of the display at a distance that doesn't make your eyes strain.


Settings software stuck in 1994? Good. I want it stuck this way.

The last thing I want in my monitor is any "software". It's meant to do one thing - display pixels. IoT-ization of screens will just turn them into another "smart TV" product category with unusable UX, ads, always-on connectivity and data collection, cloud profiles (see Razer mice), and 30+ second startup times.


Maybe it'd be nice if the calibration data could be sent from the OS to be applied directly on the monitor instead of the Video Card?

And I really don't mind when manufacturers gather (anonymized) telemetry data about my usage: maybe that'll get them to improve those damn products.


> Maybe it'd be nice if the calibration data could be sent from the OS to be applied directly on the monitor instead of the Video Card?

That would not be nice.

When you "calibrate" a monitor, you're really creating an output profile... a record of what signals create which colors. You can also calibrate a camera, and get an input profile, which is a record of which colors create what signals.

If you display a picture on-screen with a color profile, and your monitor has a color profile, the software translates the image from the image's color profile to the monitor's profile, and displays that on-screen. The problem here is that different pictures on screen will have different profiles, so you can't just load a profile into the monitor and do the conversion there.

For example, you might open a website, and see a bunch of pictures with sRGB profiles. And then you might open another website, and see pictures with Adobe RGB, which covers a wider gamut. The Adobe RGB pictures can be more colorful, if your monitor supports it, but both sets of pictures can be on-screen at the same time. Then there are the different ways to convert from one profile to another, depending on what you want (perceptual, relative colorimetric, absolute colorimetric).
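A minimal sketch of that per-image conversion using Pillow's ImageCms bindings (the file names and the monitor profile path are placeholders, not anything from the thread):

    from PIL import Image, ImageCms

    src_profile = ImageCms.createProfile("sRGB")                       # the image's assumed source space
    monitor_profile = ImageCms.getOpenProfile("/path/to/monitor.icc")  # profile produced by calibrating this monitor

    img = Image.open("photo.jpg").convert("RGB")
    # The conversion happens per image, source profile -> monitor profile
    # (default rendering intent is perceptual), which is why it can't simply
    # be baked into the monitor once and forgotten.
    out = ImageCms.profileToProfile(img, src_profile, monitor_profile)
    out.save("photo-for-this-monitor.jpg")

An Adobe RGB image on the same screen would need its own conversion with its own source profile, which is exactly the point being made above.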


I do mind telemetry, especially on something as private as my display (or keyboard).

Realistically, why does it matter what the exact mechanism of applying calibration data is? As long as it works with the appropriate .ICM file, I'm happy. Or, (sounds crazy) just have monitors introduce a setting that treats the video input as a dumb, uncalibrated RGB stream and do the colour mapping internally in the monitor.


We also had display problems on my wife's MacBook Air. If I remember correctly, you had to hold the Option key while clicking somewhere to get better resolutions.

https://osxdaily.com/2015/08/27/show-all-display-resolutions...

"Under the ‘Display’ tab, hold down the OPTION / ALT key while you press on the ‘Scaled’ button". Talk about UI.

Before that we had spent quite some money on HDMI cables, USB-C to HDMI adapters and the like, blaming third-party products along the way.


I think this is probably intentional to try to get people to spend money on a Mac display.


When you say a "Mac" display, what are you referring to? Apple sells exactly one display - the XDR - and its pricing is such that no one casually buys because their $300 external display isn't working correctly. I very much doubt that is their motivation; it's likely just a (possibly misguided) effort to keep the UI simpler.


It was more of a $1500 external display ;-)


I'm confused. If this isn't a joke I don't get, would you mind sharing some advice on how to get it for that price?


I was replying to

"buys because their $300 external display isn't working"

with the fact that our external display was $1500. I'm confused now :-)


I thought you meant that the Apple XDR Display could be had for 1500. My bad.


No, sorry for not being clearer.


Still can't understand why I can't buy a 5K or 6K display at 27", 30" or 32" in the non-Mac world.

4K is a rather low DPI, and using it at 150% OS scaling is not that great.

Apple offers such great displays for the MacBook and iMac, with non-fractional scaling and such a good density that sub-pixel anti-aliasing is no longer needed for text rendering.

Why do I have ZERO options for that outside Apple's offers?!


Dell has the UltraSharp UP3218K 8k screen for about 4 grand.


You can't even buy the Dell 4K 24" anymore. My guess is buyers don't know what they are getting so just go for size.


How close is your screen that 4k/27" seems like low DPI? What are you comparing it to?

But I did monitor shopping recently, and what I wish I could get outside the mac world (or even in it) is the form-factor.

Like there are a lot of very nice 4k HDR IPS displays out there today, but nothing I could find in the thin form-factor of an iMac or laptop display.


Regular distance on my desk, 75-100 cm I guess.

I have a 32" since I want larger text at my age. But it doesn't compare to the Retina display on my MacBook: I can often see the pixels and scaling artefacts.

What I would've liked was 5k at 32" for a perfect doubling of my preferred resolution: 2560x1440.


Because you get what, 60Hz at best on those Apple displays? 60Hz is not enough any more.


60 Hz is definitely enough. The vast majority of video content stops at 60 Hz. Applications and games may benefit from 120 or 144 Hz refresh rates, but for the latter, it also means that you're pushing a lot more pixels through your graphics card - which may well not keep up.

IMO, 4K60 is a reasonable choice for those that prioritise resolution over refresh rate.
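Rough numbers behind that trade-off (raw pixel rate only, ignoring blanking and encoding overhead):

    for label, w, h, hz in [("4K @ 60 Hz", 3840, 2160, 60),
                            ("4K @ 144 Hz", 3840, 2160, 144),
                            ("1440p @ 144 Hz", 2560, 1440, 144)]:
        print(f"{label}: {w * h * hz / 1e9:.2f} Gpx/s")
    # 4K144 pushes ~2.4x the pixels of 4K60, and more than double 1440p144.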


I like high frame rates, but if I have to choose, I will go for high pixel density every time. But I'm not a gamer.

Here's to hoping that someday we'll get to enjoy both!


I would just like to plug my very very expensive 2 year old Intel MacBook Pro into a monitor without the fans immediately running at 100% and cooking my desk. A problem that has gone completely unacknowledged by Apple despite the several hundred pages of forum posts all over the web about it.

https://forums.macrumors.com/threads/2019-16-is-hot-noisy-wi...


Plug the charger in on the right side. [1]

1. https://www.imore.com/heres-why-you-should-probably-charge-y...


Nice to know, as I'm at this moment charging my 2020 top-spec Intel MBP from the left-side port.

No matter. I'm going to sell it anyway. What a fan monster.


I had this with my mid-2015 MacBook Pro. WindowServer would be using 40% of my resources when using 2 external displays.


I remember having this issue a few years back with a 1080p display and a MacBook Pro from 2016. Connecting a Raspberry Pi Zero W yielded better display results than a $3k MacBook Pro, which was really disappointing. I can't believe it's still an issue to this day.


That's a different issue altogether: this one is about macOS not supporting 2K and 4K monitors' native resolutions; that one was about antialiasing. At some point, when Retina was introduced, Apple decided to completely remove its subpixel antialiasing, so low-resolution monitors (if 1080p can be considered low) that can render sharp text on Linux and on Windows look ugly and blurry on macOS.


>At some point, when Retina was introduced

It was later than that; if I recall correctly, it was available until Catalina, although you did need to modify a config file to enable it.


Yep, they made it gradually less discoverable: leaving an option in System Preferences first, then removing it but allowing it to be enabled from the command line, then removing it altogether. I believe it went away completely with Big Sur.


And here I thought this was going to be about the HDMI YCbCr issue. https://spin.atomicobject.com/2018/08/24/macbook-pro-externa...


macOS still has a completely retarded implementation of DisplayPort MST (i.e. the only supported display is their own 6K display).

It's been growing on me for years, as macOS is my favourite platform, but I could go on a full Linus Torvalds rant about this. Why the fuck can't I use my docking station that has 2 DisplayPort outputs and 1x HDMI to connect 3 external monitors? It literally works on Windows on the same fucking machine. On macOS all you get is mirroring.

So ultimately the reason is that Apple is completely oblivious to the millions of people who don't use monitors that are made or sold by Apple. Either oblivious or they flat out don't care.


> So ultimately the reason is that Apple is completely oblivious to the millions of people who don't use monitors that are made or sold by Apple. Either oblivious or they flat out don't care.

They are not strangers to limiting or breaking existing protocols or standards in favor of their own implementation, and then saying "see? Apple just works".

Take their broken SMB implementation in OSX as an example.


>Take their broken SMB implementation in OSX as an example.

That doesn't make sense as they completely rewrote their SMB implementation to improve it.


Quite a lot of people would suspect that the initial reason for reimplementing SMB on Mac OS X with smbx was that Samba chose to use GPLv3 for newer versions, rather than any direct technical reason.


I've got 2 AOC monitors (2369M + 2490W1). On my mid-2015 MacBook Pro I had no issues (apart from WindowServer using all my resources when they were plugged in). On the new M1 MacBook Pro (work laptop) I'm quite often getting small vertical bands on both monitors at random; these are more noticeable when using dark themes. I've changed cables and colour profiles and not really fixed it. I've not had this with my old laptop or even my desktop PC.

I would swap them out for 4k displays but it's just too expensive for me. I have been using my glasses more with my monitors, thinking my eyesight is getting worse, but it could just be the monitors...


I know this is at least somewhat a software issue, in that on many monitors the options for display scaling don't even appear. My Dell U3818DW does have retina scaling options, thankfully, but my Asus PG278Q did not when I was using it. I ended up following along with this article [1] to get some retina display options for it.

With that being said, I think the larger issue is that there simply aren't that many high dpi external monitors available, and the ones that do exist are expensive. There are a ton of options for 4k panels in PC laptops that are like 13-17 inch and Apple's retina displays on their iMacs and Macbooks are also high dpi but I can count on my fingers how many true high dpi screens there are for external display options.

Gamers don't really want higher resolution, since GPUs can't drive it well natively and games don't really need the sharpness, and I imagine that corporations buying hundreds of monitors at a time don't really care about it either, especially since Windows renders text on them quite adequately. Mac users are certainly longing for some decent monitor options that aren't 5000 dollars, though.

[1] https://medium.com/comsystoreply/force-hidpi-resolutions-for...


Here are 118 different 4K monitors starting at 240€:

https://www.worten.pt/informatica-e-acessorios/monitores/mon...

I've bought 3 LG ones during the pandemic and they're great. What other options do you need?


Make that an IPS panel and it's only 7 monitors under €300


So basically what you're complaining about is that high-end products are expensive? How many 4k IPS displays would you have found at the $500 price point even a few years ago?


Irrelevant, the point was that they are hardly “affordable”; if you’re going to get more than 1 external monitor, viewing angles are going to be a concern, and that leaves IPS as the only viable option.

I’m personally running 3 24” external monitors, and I’m fairly sure that I’d have a viewing angle issue had my panels been TN.


Why are viewing angles a concern with a multi-monitor setup? Don't you turn every monitor to face you directly, so that the viewing angle is 0?


This is what I do with two 1080p monitors; I don't need IPS at all, since I'm always viewing them close to perpendicular to the screen. Has worked just fine for me the past couple of years.


How is IPS high-end, exactly? It's table stakes for the modern-day. The technology is over a decade old, and TN is a strictly worse tech. If it isn't cheap, it should be cheap.


And ultimately this: there's no way a 4K 60Hz IPS panel could be described as “high end”.

High end for 60Hz these days is OLED, IPS is the de-facto standard, and TN or VA aren’t worth buying.


Sorry, but those are not hiDPI. For hiDPI, you need at least 5k at 27 inch. There was a single (!!) LG display (27MD5KL) that met that criteria, but it was discontinued.

The only external hiDPI displays that exist are the Apple Pro XDR (only works on macOS), and the Dell UP3218K (doesn't work on macOS).

Of course the iMac is hiDPI, but it's not an external display.

It's bewildering how manufacturers keep pushing for this low resolution 4k crap 10 years (!!!!) after hiDPI displays first appeared in laptops.

Reference: https://geizhals.eu/?cat=monlcd19wide&xf=12018_200


> There was a single (!!) LG display (27MD5KL) that met that criteria, but it was discontinued.

It was updated in 2019 as the 27MD5KL-B, specifically designed for Macs:

https://www.lg.com/us/monitors/lg-27md5kl-b-5k-uhd-led-monit...

I saw it listed on apple.com two months ago, but it was a month+ back-ordered. I ended up buying one via eBay (for my Mac Mini M1).

EDIT: it's still available on apple.com: https://www.apple.com/shop/product/HMUB2LL/A/lg-ultrafine-5k...


Sadly it's out of production, everything you can find (if you can find it) is new-old stock.

And also unfortunately, the build quality is very, very bad.

But yes, better get that one while you still can! But be warned, my friend recently got one, and had to go through about 10 retailers until he found one that could actually sell him one. Many retailers put these on their websites in the hope that LG will be able to deliver stock, but LG can't deliver stock.


Uhm well, I upgraded from a Cinema Display. Granted the LG has a plastic frame, but I've paid for 3 replacement glass screens for my Cinema Displays because they chip so easily.

Color-wise, the LG is fantastic - and I was very skeptical about getting it. The old one wasn't very good when I compared it side by side with an iMac Pro (Apple Store, Union Street, SF). I'm not sure if they changed the screen quality, but it seems on par, if not better than my old Cinema Display.

EDIT: yeah, my first purchase was via Amazon as "Used - Like New", which turned out to be a 2 year old model that couldn't talk to the Mac. I found a new one at a great price via eBay which works great, but the manual was in Cyrillic...


>And also unfortunately, the build quality is very, very bad.

I think the early models had some problems. I've got the later version and it is rock solid.


We need better definitions then. Apple uses 1440p/1600p on laptop screens. At normal viewing distances, using that on a 14" laptop and 4K on a 27-inch external screen works perfectly with the same scaling. I know because that's what I use, with just font scaling that's the same across screens. With my work 14" 1080p Windows laptop, 2x scaling is needed on the external screen. If 4K 27" is not HiDPI, what do we call 1080p at that size, which is still the normal resolution for those screens?


> We need better definitions then.

We have a definition, it's ppi, and you need at least 220ppi.

> Apple uses 1440p/1600p on laptop screens.

No, it doesn't.

The current 14'' screen is 3024x1964 and the current 16 inch is 3456x2234 (both 254ppi).

The previous generation used about 220ppi displays.


The current 13.3" screens are 1600p, the original Retina displays were also 1800p on 15". Only the last generation is a little higher and still not 4K like in PCs. Apple itself has always branded "Retina" based on angular resolution and not surface density, so viewing distance makes a huge difference.


No, their 13'' screens have been 2560x1600 (226ppi) since 2012. You are confused because the nomenclature like 1080p refers to vertical resolution while 4k refers to horizontal resolution.


No to what? I know what 1080p/1600p/1800p is. Here's a direct quote from the Macbook Air tech specs:

Retina display: 13.3-inch (diagonal) LED-backlit display with IPS technology; 2560-by-1600 native resolution

If 1600p on 13" is not Retina you need to argue this with Apple instead.


> If 1600p on 13" is not Retina you need to argue this with Apple instead.

It's 226ppi display. Why would it not be a retina display?


1280x720: 720p

1920x1080: 1080p

2560x1440: 1440p

2560x1600: ...

EDIT: if you're going to edit your comment once called out on talking nonsense, at least have the decency to note it.


4K 24" will be HiDPI because it's 2x 1080p.

5k 27" is HiDPI, because it 2x the old 1440 screens.


4K 27" is also 2x1080p. 1080p panels come in all sizes, including 27". And Apple uses 1600p on their laptops, when most laptops these days are 1080p. I guess Apple laptops are also not HiDPI as 1600p is not 2x1080p. Angular resolutions matter much more than surface DPI. You don't use a 27" screen at the same distance as a 24" one the same way you don't use a 14" laptop screen the same distance as your phone.


Yes, angular resolution is all that matters, which is why their small displays are now 254ppi, while the iMac and the Pro XDR is 218ppi.


Don't take it from me, you can read Steve Jobs himself on the introduction of the Retina term:

https://en.wikipedia.org/wiki/Retina_display#Rationale


Do you really believe that people use these 4k@27'' displays at a different distance than mac users use their 27 inch iMac?


I use a 4K 27" screen at a distance where it uses the same scaling as a 1440p 14" laptop. Looks great to me, but if your vision is better or you're using it closer then it may not be enough. Other people use 4K TVs or 65" screens so they can set them farther away and keep their vision more rested at the same angular resolution.

Resolution requirements depend massively on how you set up your workspace and the quality of your vision. There are plenty of people who use software scaling for 1080p 14" laptop screens because they consider normal sizes too small, for example. The range of configurations is very wide and you're at one extreme of that range. There's nothing wrong with that, but that's probably why you're frustrated with the lack of options on the market. For most people even 4K 27" is already overkill, and 1080p external screens sell well instead.


Please answer the question I posted. "Do you really believe that people use these 4k@27'' displays at a different distance than mac users use their 27 inch iMac?"


Yes. I do believe an external 4k 27" on a laptop is used differently than a single screen on an iMac. I also believe people use iMacs in ways that are different than what you use, which was the whole point of my answer. People use screens in all sorts of different ways. Most of them have resolution requirements much below yours, which is why the market apparently doesn't cater for your needs.


For most people not using macOS, 4K is overkill and 1080p can be enough, not because of the panel itself but because the other two major OSes (Windows and Linux) antialias text with subpixel rendering, i.e. they can exploit the pixels' internal structure to display sharp text even on low-resolution monitors. Apple deliberately decided to ditch its own implementation, since you don't need it for Retina displays and they only sell and care about Retina displays.


Almost no 4k monitors would be considered high dpi. To achieve 220dpi you need a 4k monitor to be 20 inches. A 27 inch 4k monitor with 4:1 pixel mapping appears to be 1080p, which many people would not consider appropriate for a monitor of that size. You can use fractional scaling, but that comes with its own set of issues because the pixels don't map correctly.

27 inch 1440p at 100% scaling in Windows or macOS is what most would consider appropriate sizing for UI elements and text by default, so in order to achieve high dpi you need a 5k monitor, like the LG UltraFine or the 27 inch iMac.
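A quick check of those figures, i.e. the largest diagonal at which each resolution still reaches roughly 220 ppi:

    import math

    def max_diagonal(w, h, target_ppi=220):
        return math.hypot(w, h) / target_ppi

    print(f"4K (3840x2160): {max_diagonal(3840, 2160):.1f} in")   # ~20 in
    print(f"5K (5120x2880): {max_diagonal(5120, 2880):.1f} in")   # ~27 in
    print(f"6K (6016x3384): {max_diagonal(6016, 3384):.1f} in")   # ~31 in, Pro Display XDR territory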


Watch the corners.

I bought 3 4K 27” LG UP-series ones; on one the corners blew out (massive bleed till you can’t see the pixels for the white shining through), one is going the same way, and only the newest shows no sign of it (yet).

Going Samsung next time since LG make panels for everyone and I simply don’t trust them now.


Can you confirm they work in Retina mode with Macs? E.g. has anyone tried the 27UL500 and can confirm text looks good?


No, I tried these 4k@27 inch displays, and you can enable scaling, but it looks really really poor compared to my 5k iMac. 4k at 27 inch is not retina-quality, period. This is the only reason I use an iMac, I simply can't get an equally good display without paying €5k+.

I understand that not everybody understands hiDPI displays, but it really grinds my gears to see low-resolution displays advertised as solutions to people who do understand hiDPI.


Ok, I understand it can't be comparable with a 5k iMac. But is the text good enough? With €5k+ I can build a pretty serious Linux rig with a couple of 4k monitors and keep the MacBook Air M1 just for mobile use. It's a shame that my 10+ year old MacBook Pro with High Sierra can render text better on any external monitor than my brand new Air M1.


I have a 4k 27" LG monitor and it is awesome with my Mac. I use it at 150% scaling and the font is ultra sharp; best monitor I've ever seen so far.


Please check your eyesight. I can easily tell the difference between a 220ppi and a 280ppi display when viewing text even at one meter viewing distance. 4k@27'' is only 163ppi. And my eyesight is not even what it used to be.

To me, 163ppi looks more similar to how old ~100ppi displays looked than to my 224ppi iMac.


Have they fixed the performance issues with non-integer scaling?


In my opinion, text rendering on 4k@27'' is nowhere near sharp enough. In my experience there isn't a continuum from low-dpi to high-dpi, but rather there's a sharp transition where text suddenly becomes sharp and 4k@27'' is below that line.

In fact, once you see these new 280dpi displays, even 5k@27'' barely makes the cut. I wish it were sharper.


Angular resolution is much more important than the actual surface DPI. Depending on your viewing distance and the quality of your vision, 4K on a 14" laptop may be needed, or 1080p on a 27" screen may be more than enough. Your requirements are extreme compared to most.


Yes, angular resolution is all that matters, which is why books, which are supposed to be read closer than any screen, have been at least 300dpi since we invented phototypesetting.


We can discuss how much angular resolution is enough but we clearly don't need 300dpi for a billboard. To get the same font quality when looking at a phone, laptop or external screen a different dpi will be needed.

Here's Apple's range of Retina displays:

https://en.wikipedia.org/wiki/Retina_display#Models

The DPI difference between highest (iPhone 12) and lowest (Pro Display XDR) is 2.2x. The angular resolution difference at typical viewing distance between those two screens is 1.1x. Clearly the angular resolution at typical viewing distance is a much better definition of what's good enough resolution than the DPI.
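Plugging rough numbers into that (the viewing distances are assumptions, roughly a phone at ~30 cm and a desktop display at ~70 cm, so treat the exact ratio as illustrative):

    import math

    def px_per_degree(ppi, distance_in):
        return ppi * distance_in * math.tan(math.radians(1))

    phone = px_per_degree(460, 12)  # iPhone 12: 460 ppi, assumed ~12 in away
    xdr = px_per_degree(218, 28)    # Pro Display XDR: 218 ppi, assumed ~28 in away
    # Roughly a 2x gap in ppi, yet the angular resolutions land within ~10% of each other.
    print(f"iPhone 12: {phone:.0f} px/deg   Pro Display XDR: {xdr:.0f} px/deg")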


Yes, and with the exception of a discontinued LG monitor, and a $4k Dell display that doesn't work on macOS, nor on Linux with OSS drivers, no external non-Apple display can provide the required angular resolution.


That totally depends on what angular resolution you require and what distance you are comfortable with. Using that list Apple's Retina concept starts at 57 and 4K 27" is ~60. And even that is assuming you place a 27" screen the same distance away from your eyes as a 24" or even 21" screen as all the iMacs use the same viewing distance.

But I'm glad we got over the idea that 220dpi is a magical number. Otherwise not even the iMac or the Apple Pro XDR displays counted as HiDPI.


My 4K monitor looks and works perfectly fine on my M1 Mac ... when it works. I've had many connection issues with the dongle I haven't had on another computer running Linux. Some days it just switches on and off seemingly randomly in the span of 10 seconds to 30 minutes. In the last few days it has been fine. I won't be surprised when it appears again.


BetterDummy solves mouse stutter on external displays as well! (M1 air with 1080p monitor)

Does nobody at Apple use external monitors??


I'm guessing they use the 32" 6K XDR...


Unrelated, but newer MacBook Pros (with M1 Max chips) as well as the Apple Pro Display XDR have entirely gotten rid of ICC profiles, which seems crazy to me. Apple's definition of a professional is more like a hipster. Real pros want no bullshit abstractions and hidden stuff; they want all knobs and switches exposed in all their glory. Btw, if you want to see how bad the Apple Pro Display XDR is, look no further than the legendary HDTV Test channel on YT. Totally crushes Apple with straight up numbers and pure unadulterated objectivity: https://www.youtube.com/watch?v=rtd7UzLJHrU


> Totally crushes Apple with straight up numbers and pure unadulterated objectivity: https://www.youtube.com/watch?v=rtd7UzLJHrU

I don't know a lot about either of these monitors, but I don't find it surprising at all that a 30,000€ monitor would be better than a 6000€ monitor. Saying that the reviewer objectively crushes Apple by using these two monitors in a comparison seems quite misleading.


The reviewer is just testing the claim the keynote itself made, against the exact model of the Sony monitor Apple compared against.


Wait, do I get this right that M1 Macs cannot work with monitor calibration tools? I am not a graphics professional but my wife is, and we are thinking about buying an M1 Mac - but she needs a calibrated display.


ICC profiles still exist, but maybe they don't show by default? You might need to option-click to view settings if they go missing compared to previous macOS versions; I've been noticing a bit of a streamlining or reorg of settings in recent macOS releases. It's still documented the normal way though https://support.apple.com/en-ca/guide/mac-help/mchlf3ddc60d/... - and I can confirm it still works: I needed a specific HDMI output to be rec.709, as it was the only way to get a DVD-quality stream to play correctly on an older HDTV. And it worked great. I also haven't heard any reports that X-Rite calibration software has stopped working or similar. As far as I know, it works great. :)


I mean, that video is comparing it against a Sony HX310, which costs $30,000. If you have a budget of $30,000 to spend on a single monitor, by all means, go for it.


Dude, did you watch the Apple Keynote when they released the Pro Display XDR?

They literally claimed to be better than that $30k monitor. It is totally fair to compare it and prove that Apple was wrong.


An Apple Keynote is marketing fluff. Nobody should be taking that at face value in the first place.


Using Citrix is also catastrophic both on the 5K iMac and the Retina MacBook Pro (and probably others too). Not sure who's to blame for that. Curious to see if this tool solves the issue.


I'm suffering from underscan on a 1440p monitor while using an Intel MBA (2015). The mini-DP port would only work with a VGA adaptor, and the 1080p resolution ends up with a vertical black bar on the left as a result of the underscan.

No way to fix this either, since macOS Monterey doesn't have any settings to change; it ends up showing that a 40" HDTV is plugged in when it's just a 27" 1440p display...


Ever tried this EDID patch? https://gist.github.com/ejdyksen/8302862

Had to run this script on every MacOS update till Big Sur or Catalina in order to get a perfect picture, otherwise MacOS would detect a TV and not a monitor.


Thanks, I'll check if that fixes it.


These are issues I had with Linux... so I will just stick with it :)


Gnome does have an option for fractional scaling but any legacy x11 apps will look blurry. Last I checked this was pretty much just electron and chrome holding things up.


>just electron and chrome holding things up

There are flags you can pass on the command line to tell Chrome to bypass XWayland. I have been running Chrome (stable channel) that way for many months -- on GNOME 40 set to a fractional scaling factor. No blurriness.
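Presumably the flags in question are the Ozone/Wayland ones; a sketch of launching Chrome that way (flag names have shifted between Chrome versions, so treat the exact spelling as an assumption):

    import subprocess

    # Ask Chrome to use its native Wayland backend instead of running under XWayland.
    subprocess.Popen([
        "google-chrome",
        "--enable-features=UseOzonePlatform",
        "--ozone-platform=wayland",
    ])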


Ubuntu 21.10 doesn't recognise my Asus QHD monitor bought a year ago; the max resolution detected is FHD. No issues with Windows 10.


At least Linux has subpixel antialiasing, so text still looks crisp on perfectly good Full HD monitors...


[flagged]


At the risk of joining the dog-pile against your comment, not once have I used this particular insult with any "ableist" subtext in my life. I interpret it to be a stronger form of "stupid" or "moronic", and that is it. I am aware that there is a modern trend to mark usages of this word as ableist, but I simply disagree.


Interesting that despite never considering it to be ableist you knew exactly which word I was referring to.

> I interpret it to be a stronger form of "stupid" or "moronic"

And why do you suppose that is?


> you knew exactly which word I was referring to.

I have come across this issue several times in the past couple of years by virtue of reading comments under articles. These criticisms are a part of the public record.

> And why do you suppose that is?

Most people learn words by figuring out what others call a certain thing, and then reusing that term to build a shared understanding of the world. This word is no different. At no point did this label acquire (in my perception) any connection to ability (mental or otherwise) until I first heard of it from US-centric English speakers. In my perception, it's the opposite: anything that is "retarded" is perfectly capable of being better designed/implemented or of making better choices, but instead actively chooses to do the worst possible thing for seemingly irrational reasons.


So you have read several times that this term is criticized as offensive, yet claim that in your perception the word is the opposite of offensive. What more would it take to get through to you?

> In my perception, it's the opposite: anything that is "retarded" is perfectly capable of being better designed/implemented or of making better choices, but instead actively chooses to do the worst possible thing for seemingly irrational reasons.

This is such a strange thing to double down on.

Not everyone is "rational" as you perceive them. That's the point. And it's not always an active choice. Some people are disabled. Some people think and act differently than you. You don't get to call them irrational any more than they get to call you intolerant.


> So you have read several times that this term is criticized as offensive, yet claim that in your perception the word is the opposite of offensive. What more would it take to get through to you?

Why do you want to "get through" to me? I heard the discourse, and disagreed with the arguments, it's that simple. I am perfectly capable of understanding a conflicting point of view without being convinced by the arguments.

> This is a such strange thing to double down on.

There is nothing for me to double down on, this is simply how I see the situation. Whether you choose to believe that I'm sharing my view in good faith, is up to you.

> Some people are disabled. (...) You don't get to call them irrational any more than they get to call you intolerant.

I already made it clear that I consider irrationality in the light of whether someone is capable of making the opposite (rational) choice. If they are not (for any reason, including disability), then there is no reason to call them rational or irrational, because you cannot measure their behaviour with this metric.


The Google definition for "retard" is "delay or hold back in terms of progress or development." This exactly matches what the OP was trying to convey about the Mac DisplayPort implementation.


Look up "retarded" in Merriam-Webster:

https://www.merriam-webster.com/dictionary/retarded


Merriam-Webster seems politically biased IMO, as this fails to explain the meaning of many modern-day uses like "fire retardant" or the numerous uses in physics. The Google definition from Oxford perfectly explains its formal meaning as well as hinting at how it could also be used as an insult.

Holding back a fire in terms of progress also seems like a perfect fit.


Indeed, it is a common word in the subject areas you mentioned. Strange they didn't have an inflammatory or hostile reply when given a legitimate example of correct usage of language.


We're not talking about 'fire retardant'. The quote is:

> macOS still has a completely retarded implementation of DisplayPort MST

> completely retarded

It's extremely frustrating how many people in this community chose to defend this comment as if it were an eloquent description of the issue.

Calling attention to problematic language is not hostile or inflammatory. This term is flagged as offensive in the dictionary in this context. Don't shoot the messenger.


I think deliberately interpreting other people's speech only within the framework that your mind works in is problematic. It doesn't even matter if there's consensus — you're choosing to interpret it as ableist despite it clearly not being used in that manner and you refuse to acknowledge that people different from you use language differently. How is this not a bigoted stance to hold? It's language imperialism.


Exactly right; this is why such snowflake identity politics that redefine words should not be allowed on HN. It's effectively ruined Twitter and a good chunk of the internet, so it should stay over there.

It's silly that the OP wishes to turn a conversation about computer monitors into a conversation about bad or good words.


They are biased. By their definition of “anti-vaxxer” people who merely oppose vaccination mandates are anti-vaxxers as well. [1]

Which is, frankly, quite insulting to me, because I'm vaccinated and consider myself by no means an anti-vaxxer; I merely oppose vaccination mandates.

But Merriam-Webster says I’m an anti-vaxxer so I guess I must be one ¯\_(ツ)_/¯

[1] https://www.merriam-webster.com/dictionary/anti-vaxxer


Sorry but I think you’ve lost the plot. This has nothing to do with anti-vaxxing.


The first definition is "very stupid or foolish" which seems quite apt given the context. The second one talks about a disability, which does not really seem applicable since we're talking about monitors and not people.

So where is the issue exactly? He used the word correctly as far as I can tell.


You are overlooking the fact that the word is flagged as offensive. That is the entirety of the issue.

Definition of retarded

1 informal + offensive : very stupid or foolish

2 dated, now offensive : affected by intellectual disability : INTELLECTUALLY DISABLED


That still fits. The OP is describing a product which they believe to be substandard and likely wants to direct offense at it. There is a difference between "This product is shit" and "You are shit". Both send offense toward a target, but one of them is a computer with no feelings.


[flagged]


>OFFENSIVE very foolish or stupid.

Still seems very apt, no?


No. It's flagged as offensive in all caps. That means using it is the opposite of appropriate.


We detached this subthread from https://news.ycombinator.com/item?id=29470254.

It's not a net win for the forum to have flamewars about this.

https://news.ycombinator.com/newsguidelines.html


it's ok, people can swear on the interwebs


[flagged]


Please don't post flamebait here or take threads further into flamewar.

https://news.ycombinator.com/newsguidelines.html


Have you ever considered that some people here actually have disabled family members?


[flagged]


Please stop taking HN threads further into flamewar. You've done it a lot in the past, and your account seems unfortunately to be swerving back into it. It's not what this site is for.

https://news.ycombinator.com/newsguidelines.html





