Windows actually modified the old-school Win32 APIs to support DPI scaling recently, so even most WinForms apps get proper DPI scaling.
It’s absolutely possible. And as mentioned, switching from DPI-as-an-int to scale-factor-as-an-int was considered reason enough to break all software, so going back should be just as justifiable.
If you know the technical details of how that was done in Windows, then please help get that implemented. AFAIK those Win32 APIs were not changed to take floating point, so there must be some other kind of heuristic going on there, and I expect it won't work in all cases, e.g. if the application renders its own backing surfaces and never takes scale into account.
AFAIK it’s not much different from e.g. Qt. Since the protocol between compositor and app includes a full DPI value, the toolkit can just render everything at a larger scale, with a consistent rounding pattern to ensure everything still fits. Sure, some parts will be slightly offset, and any widget that first renders, requests the rendered data, and post-processes it will be slightly blurry, but it works fine.
The same can be done in any environment where the protocol between toolkit and compositor contains enough metadata and applications can tell the compositor that they’re rendering at native scale.
So X11 works; Wayland won’t work until the protocol is significantly extended.
I had a similar patch for my compositor, the protocol libs, and Qt where I just let windows specify that they’re rendering at native scale, and I let Qt handle the scale factor as if QT_SCREEN_SCALE_FACTORS had been set accordingly.
If you have a working patch, then I would suggest working with other developers to get something similar implemented elsewhere. Also, I don't think this needs significant changes to the Wayland protocol at all, so I'm really confused by what you mean; it seems like all you need is one additional message to communicate the DPI to the application. The hard part is getting this implemented everywhere.
So first you need a way to have the compositor notify the application which dpis it should render at (because it’d be multiple).
And then you’d need a way for applications to tag each buffer it renders with the DPI it’s rendered at, so the compositor can show the right buffer natively on the right screen.
For performance improvements, it even makes sense to provide a clip mask telling which part of the window each DPI should be rendered for.
Now, that’s complicated, but doable. I did it.
What’s more complicated is handling situations where compositors don’t support this, or only support parts of it.
That whole backwards compatibility part is more work than getting the whole feature built and shipped, and it’s entirely unnecessary.
Can you please publish this work somewhere so somebody else can use it, or consider writing a blog post about it? I think that would be a great way to help out. Also, which compositor did you use? If it wasn't Weston, that could be a problem: if you want this to have a chance of landing in stable Wayland, you'd usually start with a working patch for Weston. If you don't want to do that work, then maybe you can send your patches to someone who is able to adapt them and get them working in Weston or KDE, or anything else really.
Edit: Also I'd like to give some feedback.
"a way to have the compositor notify the application which dpis it should render at (because it’d be multiple)."
Right, so that's one additional message. I assume by multiple you mean the case when you have multiple monitors at different scales.
"a way for applications to tag each buffer it renders with the DPI it’s rendered at, so the compositor can show the right buffer natively on the right screen."
I think this would be one or two additional messages. But I'm actually confused by why you would do this because it seems like it would still cause performance issues, if you're now rendering every program two or more times every frame. I think you may want to save this for an "accuracy mode" or something, and normally have it so only the max DPI applies.
"it even makes sense to provide a clip mask telling which part of the window each DPI should be rendered for"
This won't work in Wayland because the client never clips windows, it just redraws the whole window each time.
"What’s more complicated is handling situations where compositors don’t support this, or only support parts of it. That whole backwards compatibility part is more work than getting the whole feature built and shipped, and it’s entirely unnecessary. "
I don't see how that's complicated or why it's more work; programs would just work as they do now, i.e. you just assume those programs always have a DPI of 96 * buffer_scale.
Also depending on how you design this, you may have to consider how this would interact with OpenGL and Vulkan, and maybe consider the possibility of creating additional extensions there in order to handle this. But that may or may not be necessary, I'm not sure.
> I think this would be one or two additional messages. But I'm actually confused by why you would do this because it seems like it would still cause performance issues, if you're now rendering every program two or more times every frame. I think you may want to save this for an "accuracy mode" or something, and normally have it so only the max DPI applies.
Some compositor/toolkit combinations today render at both 1x and 2x, or similar scales, if multi-monitor setups are used.
Now, to improve that situation, you’d need to render at each of the scales of the monitors the window is on right now instead.
To improve the performance of that, it makes sense to send a clip mask from compositor to toolkit saying "hey, only render the left half at 2x and the right half at 1x"; if the toolkit decides to support it, that improves performance at no cost.
> or only support parts of it
What if the compositor sends the info about each monitor’s DPI, but doesn’t support DPI-tagged buffers, and instead only supports viewport-tagged or integer-scale-tagged buffers?
Hey, I just wanted to say that I'm also very bothered by this issue! I want to echo the sentiment that if you have a repo on GitHub or elsewhere with your patches, I would be glad if you shared it.
Windows has one advantage: the display server API is private, so no app talks directly to the window server; everything goes through gdi32/user32. As a result, Microsoft could make changes there that Linux toolkits cannot. Linking against GTK (or even libx11/libxcb/libwayland-client) is not mandatory in the Linux world, so the toolkits can't fix things behind apps' backs.
And the Windows solution is not reliable either; the only reliable solution was the macOS one.