
Eh, no. Gnome doesn’t actually do fractional scaling – as said earlier in the thread, it’ll render at the next integer resolution and scale down on the GPU.

Which I don’t really have the performance for (I’m already using Budgie’s option to do exactly that under X11, and it’s not exactly the most performant solution).

And even under Wayland, Gnome still renders at the next integer scale.

While KDE under X11 actually renders at the correct resolution, with proper scaling and perfect sharpness, which I really miss.
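To put numbers on what "render at the next integer and scale down" costs, a quick sketch of the arithmetic for a 4K panel at 1.5× (the panel size and scale here match the setup discussed later in the thread):

```python
import math

phys = (3840, 2160)          # 4K panel
scale = 1.5                  # desired fractional scale
logical = (phys[0] / scale, phys[1] / scale)       # 2560 x 1440 logical points

# GNOME-style: render at the next integer scale, then downscale on the GPU.
render_scale = math.ceil(scale)                    # 2
render = (logical[0] * render_scale, logical[1] * render_scale)

overdraw = (render[0] * render[1]) / (phys[0] * phys[1])
print(render, round(overdraw, 2))   # (5120.0, 2880.0) 1.78 -> ~78% extra pixels
```

So a 1.5× desktop is drawn at "6K" (5120×2880) and resampled down, shading roughly 78% more pixels than the panel actually has.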




That is working as intended, and either you don't have a graphics card that is good enough to drive your monitor, or you're using CPU-rendered apps that will always be slow at higher resolutions. This is a problem with a lot of cheap hardware and older software. It worked for Apple because IIRC they've been porting all the low-level drawing over to Metal. GNOME is a lot slower to do that, unfortunately; you won't see everything GPU-accelerated until GTK4 becomes widely adopted.


The Metal (Apple) or OpenGL/Vulkan (GTK) renderers are not that important for apps that were fine with software rendering for years. The final composition of the framebuffer is done in hardware anyway (in most cases, unless you have really old hardware without working OpenGL 3 or so).

The downscaling, when done properly, is done by the output encoder, not the GPU. It should be transparent for any graphics card that shipped with a DP or HDMI port. Maybe except the original RPi.


In my experience it becomes an issue if you do something like maximize the window on a 4K screen, GTK3 has noticeable slowdown there whereas GTK4 is faster.


That’s not what I, as mostly android/web/qt dev, want though. Why should I get 50% less performance just to fix some tiny single-pixel issues on a screen where I can’t even see individual pixels?


I'm not sure how to answer that question, you don't have to do anything you don't want to do. GTK and its apps currently use integer coordinates, so in order to change that it needs to be changed to use floating point, and all the apps need to be changed to use it too. This is an API break and another thing that will take a long time to plumb through the whole stack and I doubt it will happen until at least GTK5. Maybe use KDE or Windows until then? Or get different hardware?

Edit: comment from a GTK developer here that goes more into detail: https://gitlab.gnome.org/GNOME/mutter/-/issues/478#note_7939...

I guess if you want to know why should things be this way, you could say it's because of technical limitations? Though that may not be a satisfying answer to you.


GTK2 used float coordinates and supported fractional rendering.

GTK3 breaking that is a bug, not a feature.

Especially text should NEVER be scaled after rendering.

And re: performance, that’s great that you’ve got hardware that can run 3D in-browser shader sites or games at 6K resolution at 60fps just fine; mine can’t. If it were rendering at 4K, as it’s supposed to, I’d get significantly more fps. Especially at a time when GPUs are scarce, and Gnome still decides I should give up 50% of my performance just so they save some work.


I think you might be thinking of Cairo when used with one of the print backends, not GTK. GTK2 did not use float coordinates or support fractional rendering. See for example here:

https://developer-old.gnome.org/gtk2/2.24/GtkWidget.html#gtk...

https://developer-old.gnome.org/gtk2/2.24/GtkWidget.html#gtk...

Those would need to be floats if you wanted to have a non-integer scale. Changing everything to use float coordinates is a large undertaking that touches the whole stack and will require a lot of work. It still hasn't been done yet as of GTK4. If you'd like to spend your time helping out then please do so. Please refer to the gitlab comment above for a description of some of the technical issues that you would have to solve. But it may be more cost effective for you to just get some new hardware with a better GPU, or use something else.
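To illustrate why the integer-coordinate assumption matters (a toy sketch, not actual GTK code): at a fractional scale, integer logical positions can't all land exactly on device pixels, so independent rounding makes nominally identical widgets come out different sizes:

```python
# Toy model: three nominally equal 10-point widgets stacked edge to edge,
# mapped to device pixels at a 1.25x fractional scale.
scale = 1.25
edges = [0, 10, 20, 30]                            # integer logical coordinates

# An integer-coordinate toolkit must round each scaled edge independently:
device_edges = [round(e * scale) for e in edges]   # [0, 12, 25, 38]
widths = [b - a for a, b in zip(device_edges, device_edges[1:])]
print(widths)   # [12, 13, 13]: "equal" widgets end up different pixel sizes
```

Fixing that properly means carrying float coordinates (or a consistent rounding policy) through layout, drawing, and input handling, which is the whole-stack change being described.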


It is not 50% less performance; on hardware from the last 10+ years, it should not be noticeable. Neither in performance, nor in energy. If anything, it should be more performant and less buggy, because instead of complicating the software, you solve the problem with dedicated circuitry that is already part of the output encoder anyway.


Not noticeable is interesting.

What hardware from the past 10 years are you using that rendering 3D games in browser or shader websites at 6K vs 4K resolution makes no difference?

I’m already struggling to get enough fps at 4K; with Gnome rendering it at 6K and downscaling to 4K, my performance is even worse. That’s using an RX 5700XT, a GPU that in the current absurd market retails for over $1000.

And as Gnome ensured that neither GTK nor Wayland exposes or supports any non-integer scale factors, browsers on Linux have no way to render at native resolution and handle scaling themselves, which would improve my performance situation significantly.


If you have games, run them fullscreen. Fullscreen surfaces get lifted out of the composited desktop, are scaled independently, and can run at arbitrary resolutions.


How do I run https://www.shadertoy.com/ in native fullscreen?

How do I actually multitask while doing that?

At 1.5x even my Ryzen 9 3900X and RX 5700XT are stuttering to show many shadertoy examples at 4K, where the performance at 1x is more than acceptable.

Simply: Gnome (and Wayland) is broken, and this API must be changed.

EDIT:

For https://www.shadertoy.com/view/ss3SD8 the difference on my laptop for example is 1280x720 at 1.4fps (1.25x scale) or 850x450 at 17fps (1.0x scale).

That’s a massive difference, and enough to be the difference between usable and unusable.
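For what it's worth, the raw pixel arithmetic of that laptop example (the extra slowdown beyond pixel count suggests something else, e.g. thermal or memory limits, is also in play):

```python
# Pixel counts from the shadertoy example above:
hi = 1280 * 720    # canvas at 1.25x scale -> 921,600 shaded pixels
lo = 850 * 450     # canvas at 1.0x scale  -> 382,500 shaded pixels

print(round(hi / lo, 2))                 # ~2.41x the fragment-shader work
print(round((17 / 1.4) / (hi / lo), 1))  # fps fell ~5x more than pixels alone predict
```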


"Gnome (and Wayland) is broken, and this API must be changed."

Again I would say no, that's wrong, it's just not currently meant to run on your hardware. If you know how to change this API then please help. AFAIK there is no one working on this currently. To solve the problem, it really needs someone who is committed to getting it working on the type of hardware that you have and who can champion that use case. If that's not you then you'll have to wait (indefinitely) until that person shows up, if they do at all.

Also, shadertoy is a really bad test case as those are more like demos, they aren't optimized and are just made to show how much you can do by computing things on the fly in a shader. A real program you'd see in production would make better use of GPU memory and wouldn't hammer the GPU cores so hard.


What’s so complicated about just copying Windows, Android or Qt 1:1?

I’ve got all my time already full with jobs, projects, and other stuff I’m working on, if I’d find time, I’d love to help extending wayland, deprecating GTK3 and GTK4, and modifying the software accordingly.

I’ve made custom patches for the same purpose in the past already, but I just don’t have the time.

Especially since with X11 it was at least somewhat possible to do it, but now with wayland it’s sadly enforced in the protocol to only use 8-bit sRGB and integer scale factors.

If I ever find the person responsible for that decision...


It's not hard to copy an API specification, but actually implementing the API and changing drawing in every app, toolkit and compositor to support floating point is what is going to be complicated and will take time.

Edit: The decision to use integer coordinates I believe was made long ago in GTK1 because that's what X11 used, and it just hasn't been changed since.

"Especially since with X11 it was at least somewhat possible to do it"

X11 never actually supported this and also only uses integers for screen and input coordinates so I'm not sure what you're referring to. I think you are thinking of the app itself doing its own DPI scaling which will also work in wayland, but it will have the same problems where the DPI could mismatch with other apps and with the rest of the desktop.

"I’d love to help extending wayland, deprecating GTK3 and GTK4"

Well GTK3 is already in maintenance mode, and apps are currently in the process of being ported to GTK4, so first they'll probably want to finish that and get some feedback before starting on GTK5... it will probably be at least a few years before anything related to this could ship.


GTK2 had an integer DPI variable. GTK3 now just has an integer scale variable.

GTK2 could actually render at 144dpi, or 108dpi, just fine. Everything was slightly misaligned, not everything scaled perfectly, but it worked.

X11 had an integer Xft.dpi variable. Wayland now just has an integer scale variable.
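The point about the integer DPI variable still encoding fractional scales can be made concrete (96 dpi being the conventional 1× baseline):

```python
# X11's Xft.dpi is an integer, yet it encodes fractional scales relative
# to the 96 dpi baseline: scale = dpi / 96.
for dpi in (96, 108, 120, 144, 192):
    print(dpi, "->", dpi / 96)   # 108 -> 1.125, 144 -> 1.5, 192 -> 2.0
```

So an integer dpi variable expresses 1.125×, 1.25×, 1.5× and so on, while an integer scale variable can only express 1×, 2×, 3×.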

WHY was this downgrade made? If a protocol break could be made to remove this functionality, a protocol break to add it back should be just as justified.

Break everything, apparently removing functionality was a good enough reason to do it, re-adding it should be just as well.

Yes, I am extremely angry after having spent years yelling at the wayland-wg and gnome devs not to make these decisions, not to put these things into the protocol, and complaining that if an API break is necessary, it should be extensible for these use cases. I’ve been complaining for a decade now and all of the complaints got ignored and instead the mistakes I warned about were made.


That DPI value AFAIK only changed the text scale, not the scale of widgets. You are thinking of two different things, the GTK3 integer scale actually affects widgets. The Xft.dpi setting also originally only meant text scale but has apparently been overloaded to apply to widgets in Qt? I'm not sure, I haven't tried this recently and I don't use X11. But it's right there in the name: Xft is the X freetype font rendering.

I understand your frustration but from what I have seen, nothing has actually been removed. This is just a feature that was never implemented because nobody signed up to implement it. Wayland could be extended to support it eventually but somebody actually has to put in the work. I think the best bet for somebody working on this would be to get it fully working and stable in KDE first, since Qt apparently has the toolkit support for it, and then maybe it can be adapted to work for GTK.


> I understand your frustration but from what I have seen, nothing has actually been removed. This is just a feature that was never implemented because nobody signed up to implement it. Wayland could be extended to support it eventually but somebody actually has to put in the work. I think the best bet for somebody working on this would be to get it fully working and stable in KDE first, since Qt apparently has the toolkit support for it, and then maybe it can be adapted to work for GTK.

The issue with that is that first of all, the wayland protocol needs to be modified to add support for this, and the entire freedesktop/gnome community refuses to even consider adding support for anything like this because gnome doesn’t need it/can’t support it.

I’ve talked with vendors of devices using Linux with Wayland, who’ve been trying to get the committee to add support for years.

I’ve talked with KDE devs who’ve been frustrated without limits.

There’s a lot of people who are absolutely burnt out and angered by the constant stonewalling from Gnome, pretending for a decade that no one needs this and so it shouldn’t even exist.


I really have no idea what you're talking about or what committee you're referring to. KDE can develop its own Wayland extensions and already has a lot of them. They don't need GNOME's approval. I believe they are housed in this git repository. https://invent.kde.org/libraries/plasma-wayland-protocols

Once they become stabilized and if other desktops took an interest, then they could be proposed as standard. But KDE does not need to wait for GNOME to implement a feature in Wayland, or vice versa. If you could mention which KDE developer you were talking to then maybe we could help them get this implemented in KDE first? Then afterwards if they can offer some tips to offer GTK (or other toolkits) on how to implement this, that would be valuable to everyone.


Honestly, you're right. I should just stop trying to discuss with the wayland-wg and gnome devs and just create my own protocol extension without asking anyone, submit patches to Qt, KWin, and certain web browsers, and then just use those myself.

Sure, Gnome still won't adopt them ever because they're stubborn and think they know everything better, but at least I'd have a somewhat working solution.

Maybe elementary OS would actually use it too, they've already got support for just scaling everything based on the font size.

Thing is, I need this functionality yesterday. And it's been an absolute pain hearing that even if you spend an excessive amount of work today, in a community where all of gnome is against you, maybe someday in a decade it'll get better.


I think that sounds like a great idea. But I just don't understand why you were so invested in GNOME adopting them when it seems clear that they weren't really that interested, and now it seems like it actually doesn't even matter to you? I'm sure you know not to "put all your eggs in one basket" as it were.


Honestly, the big issue is that Gtk is used as the basis to obtain the scaling factors for all browsers, Java, and several other tools, including everything based on Electron.

As long as Gtk refuses to support this, I can fix KDE, but I can't fix Java or browsers.

Java, Firefox and Chrome reject patches, saying Gnome is responsible. Gnome says they don't care.

And I don't want to run custom builds of Firefox and Java anymore. It was such a hassle constantly building them from source with my old patchset.


I think if KDE actually had a working implementation of this then that would go a really long way towards convincing those other things to support it. IIRC Chrome at least doesn't get the scale from GTK. But KDE's Wayland implementation is still unstable (in part because it seems to have a lot more features than GNOME's wayland implementation) so I think it will be a while before that happens too.

If you are really committed to using Linux and don't want to wait then I said it before but I think a better use of your time would be to spend a few hundred dollars on some different monitors that can work at just 2x scale, and then let someone else deal with this.


I've got a 27" 4K HDR10 DCI-P3 1000nits monitor with builtin KVM.

Usual price point around $1600-$2200.

Comparable 2× monitors start at around $4500, and the cheapest option only supports macOS and has a $1000 monitor stand.

A "few hundred dollars" won't be enough.

At the moment, I just work all day at ~10-15fps using Budgie with fractional dpi on X11, but it's definitely not a great experience.

I used to be 100% in on KDE, but as you mentioned, Wayland support is limited.

Budgie is atm sadly the only option for multi-monitor hidpi that's not using gnome's broken top bar concept.


Ah okay. I think the work required to get this implemented across the stack will still cost a lot more than $11,000, just saying. Also in my opinion 4K monitors at that size are an unfortunate purchase because the PPI is not high enough to look crisp at that viewing distance. Apple upgraded theirs because you need to get above 200 PPI range in order to have it look comparable to print. For me personally, I couldn't justify spending my spare time working on that when other monitors exist that don't have that problem.


4K at 27" is an unfortunate resolution / density. I know, I also have such a monitor :(.

That said, 10-15 fps on X11 is a pathological case, something is wrong there. When I ran X11 and played with xrandr, the slightly larger framebuffer had zero impact (outside games) on AMD Vega64.


> GTK2 could actually render at 144dpi, or 108dpi, just fine.

Your definition of just fine must differ from everyone else's. Most GTK2 apps shipped assets for 96 dpi and that's it. Just try running GIMP on a 192 dpi display, with "properly set dpi", for example. You will see for yourself that it was not just fine.

> WHY was this downgrade made?

Because it didn't work. That's why.

> If a protocol break could be made to remove this functionality

It wasn't a protocol break. It was apps either ignoring the facilities (in the better case) or being outright broken -> desktops locking onto constant values where everyone was able to test their wares.

> protocol break to add it back should be just as justified.

Not going to happen.

> Break everything, apparently removing functionality was a good enough reason to do it, re-adding it should be just as well.

You can start. Show 'em.


> You can start. Show 'em.

I have.

I’ve made a custom build from source of my entire desktop environment, changing the scale factor back to a dpi value, and modifying Qt and Gtk2 apps to use that.

And it works.

You just need to break the wayland protocol and rebuild everything from source, but for over 2 years I ran that as my daily-driver setup because I was so pissed off by this decision.


> modifying Qt and Gtk2 apps

You don't get to modify apps -- that's exactly part of the problem being solved; your solution has to work with whatever the user throws at it, from Chrome to xeyes, sorry.

Oh, and with staged updates too -- it might take some 2 years to adopt your solution, if everyone cooperates (which is a big if); not all distros update at the same time and some do not bother updating for another cycle (see Ubuntu and Wayland).


> You don't get to modify apps -- that's exactly part of the problem being solved; your solution has to work with whatever user throws at it; from Chrome to xeyes, sorry.

That's not how open source works. We get to modify apps. That's the whole point.


> That's not how open source works. We get to modify apps. That's the whole point.

1) people are running all kinds of applications on Linux, not just open source ones. The solution has to work with Siemens NX or Autodesk Maya just like it works with Gtk Demo.

2) Even if you can modify apps, are you going to modify them all? No, most of them would be never fixed.

3) so you are mistaking "we get to modify apps" with "we get to rebuild the world, and everyone's system is part of the rebuild".


> 1) people are running all kinds of applications on Linux, not just open source ones. The solution has to work with Siemens NX or Autodesk Maya just like it works with Gtk Demo.

Applications which are extremely performance critical and accuracy-critical, where the compositor shouldn’t touch rendered pixels at all? Yeah, that works great, right?

We can just enforce the functionality in all new toolkits and deprecate old ones. Existing, old software will just render badly, like it does today, and everything new or updated will get all the advantages.

We can even enforce a new Wayland2 protocol: everything old can just render through XWayland as a blurry mess, Gtk3/4 included (they do still support X after all), and everything new can use the improved protocol immediately.


Just make it a required feature in Gtk4 and distros and apps will be forced to support it.


It's not going to happen in GTK4 because it would be an API break and GTK4 has already shipped. So that's why it will have to wait for GTK5 or greater.

Edit: For further explanation see here for a longer description of GTK's release strategy. https://blog.gtk.org/2016/09/01/versioning-and-long-term-sta...


See? I’ve complained to the Gtk devs since before the release of Gtk3 that this would be an issue, I’ve even made suggestions how to avoid it, and yet despite all that all I’ve got were "it’s unnecessary" for years. And now the issue is baked in, I’m supposed to wait for even more years and spend lots of time to fix the issues other idiots made because they couldn’t listen.


In my experience you can't really tell open source developers what to do, you can give suggestions, but if they don't agree with you then the most effective way to get your point across would be to write the code yourself. Also, if you want someone to listen to what you have to say, I would recommend against calling them idiots and other insults.


If every day every single interaction with your computer becomes a pain purely because of some stubborn people who refuse to listen, it becomes very hard to stay calm. Especially after almost a decade.


I'm really not sure I understand, you don't have to use GNOME or Linux. Don't put yourself through any unnecessary pain. Also please remember that nobody is obligated to listen to you in open source, participation is entirely voluntary.


I can’t reasonably do dev on windows, I’ve tried.

So, I have to use linux.

And I have to use something which supports my 27" 4K monitor at 1.5x scale today, not at some point in the future.

I can’t just not work until this is fixed, I have to deal with what broken tech exists today. Sadly.


Again, it is not a matter of new API! New API solves exactly nothing. It won't solve the libXaw/Motif/GTK2/GTK3/whatever apps. They still have to run correctly.

Your new API won't be adopted by everyone overnight, that's why it is a dead end.


Those old toolkits can just be deprecated (most already are). Most of them don't even run on Wayland: by rendering through XWayland, they will be blurry anyway. You can just continue to have a fallback path that renders a blurry mess.

GTK3 is the exception but you can just let it be blurry; everything else can be sharp.


Sure, they are deprecated, but XWayland still has to display them at the correct scale.

One of the issues in X11 is that the display server can tell clients what the screen dpi is, but it never knows whether they honor it -- and they never tell. Many of them don't, some do, so you cannot handle them uniformly; and if you invent a mechanism where they can tell, how do you retrofit it into old clients? You won't.

So yes, putting the mechanism into a current toolkit is easy, but that means the Linux desktop is going to be broken for decades to come, because the old clients won't disappear overnight. If they did, we would have been on pure Wayland for years already...


> and changing drawing in every app

Exactly. Nobody is going to update everything that's already out there. For many things there isn't even source available, even if there were the will. There's unbelievable dragging of feet in the Linux community, so even that will is a big question.

These compositors have to play with the cards they were dealt, not the cards they wish they had.


Why was it possible to break the world to remove these things when switching from Gtk2 to Gtk3, but now it’s not possible to re-add them? Why was it possible to remove it when switching from X11 to wayland, but now it’s not possible to re-add it?

I’ve complained about this since before Gtk3 and Wayland were stable, constantly warning against making these decisions, and now that the bad decisions have been made, doing another protocol break to fix them is apparently out of the question?


You don't understand.

When GTK3 came, it didn't break GTK2 and GTK2 apps. They could work side-by-side, as they were before.

What you suggest is either kicking out the world from beneath them, or having all of them updated, at the same time.

That's not going to happen.


Sadly I think you will have to accept some of those apps will just never have Windows-style DPI scaling, sorry. It's the same thing with various older apps on Windows that will just not get updated because they use Win32 API functions that are hardcoded to use integers. Maybe you could come up with some clever solution that tricks them into rendering at a different size? I'm really not sure, if you are an expert in this then please help.


Windows actually modified the old-school Win32 APIs to support DPI scaling recently, so even most WinForms apps get proper DPI scaling.

It’s absolutely possible. And as mentioned, switching from dpi as int to scale-factor as int was reason enough to break all software, going back should be just as justified.


If you know the technical details of how that was done in Windows then please help get that implemented. AFAIK those win32 APIs were not changed to take floating point, so there is some other kind of heuristic or something going on there, and I expect it won't work in all cases e.g. if the application itself renders its own backing surfaces and never takes scale into account.


AFAIK it’s not much different from e.g. Qt. As the protocol between compositor and app includes a full dpi value, the toolkit can just render everything at a larger scale, with a consistent rounding pattern to ensure everything still fits. Sure, some parts will be slightly offset, and any widget that first renders, requests the rendered data, and post-processes it will be slightly blurry, but it works fine.

The same can be done in any environment where the protocol between toolkit and compositor contains enough metadata and applications can tell the compositor that they’re rendering at native scale.

So, X11 works, Wayland won’t work until the protocol gets significantly changed.

I had a similar patch for my compositor, the protocol libs, and Qt where I just let windows specify that they’re rendering at native scale, and I let Qt handle the scale factor as if QT_SCREEN_SCALE_FACTORS had been set accordingly.
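For context, Qt already accepts per-screen fractional factors via the QT_SCREEN_SCALE_FACTORS environment variable, a semicolon-separated list of name=factor pairs (the output names below are made up for illustration). A minimal sketch of parsing that format:

```python
def parse_screen_scale_factors(value):
    """Parse a QT_SCREEN_SCALE_FACTORS-style string, e.g. "DP-1=1.5;HDMI-1=1"."""
    factors = {}
    for item in filter(None, value.split(";")):
        name, _, factor = item.partition("=")
        factors[name] = float(factor)
    return factors

print(parse_screen_scale_factors("DP-1=1.5;HDMI-1=1"))
# {'DP-1': 1.5, 'HDMI-1': 1.0}
```

The point being: the toolkit side of fractional scaling already exists in Qt; what's missing is a compositor protocol to feed it these values automatically.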


If you have a working patch then I would suggest working with other developers to get a similar thing implemented elsewhere. Also I don't think this needs significant changes in the Wayland protocol at all so I'm really confused what you mean, it seems like all you need is one additional message to specify the DPI to the application. The hard part is getting this implemented everywhere.


So first you need a way for the compositor to notify the application which dpis it should render at (because there’d be multiple).

And then you’d need a way for applications to tag each buffer it renders with the DPI it’s rendered at, so the compositor can show the right buffer natively on the right screen.

For performance improvements, it even makes sense to provide a clip mask telling which part of the window each DPI should be rendered for.

Now, that’s complicated, but doable. I did it.

What’s more complicated is handling situations where compositors don’t support this, or only support parts of it.

That whole backwards compatibility part is more work than getting the whole feature built and shipped, and it’s entirely unnecessary.
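For illustration, here is a compositor-side sketch of the buffer-selection part of that scheme. The message names in the comment are invented, not an existing Wayland protocol; the logic just shows how DPI-tagged buffers could be matched to outputs:

```python
# Hypothetical protocol sketch (names invented, not real Wayland messages):
#   compositor -> client: preferred_scales(surface, [1.0, 2.0])
#   client -> compositor: attach_buffer(surface, buffer, rendered_scale)

def pick_buffer(tagged_buffers, output_scale):
    """Pick the buffer whose rendered scale best matches an output.

    tagged_buffers maps rendered_scale -> buffer. Prefer an exact match;
    otherwise the closest scale at or above the output's, so the
    compositor only ever downscales, never upscales."""
    if output_scale in tagged_buffers:
        return tagged_buffers[output_scale]
    above = [s for s in tagged_buffers if s >= output_scale]
    best = min(above) if above else max(tagged_buffers)
    return tagged_buffers[best]

buffers = {1.0: "buf@1x", 2.0: "buf@2x"}   # window straddles two monitors
print(pick_buffer(buffers, 1.0))   # buf@1x: exact match on the 1x monitor
print(pick_buffer(buffers, 1.5))   # buf@2x: downscale rather than upscale
```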


Can you please publish this work somewhere so somebody else can use it? Or consider writing a blog about it? I think that would be a great way to help out. Also which compositor did you use? If it wasn't Weston then that could be a problem, if you want to have a chance for this to be stable in Wayland then usually you'd start with a working patch to Weston. If you don't want to do that work then maybe you can send your patches to someone who is able to modify them and get them working in Weston or KDE, or anything else really.

Edit: Also I'd like to give some feedback.

"a way to have the compositor notify the application which dpis it should render at (because it’d be multiple)."

Right, so that's one additional message. I assume by multiple you mean the case when you have multiple monitors at different scales.

"a way for applications to tag each buffer it renders with the DPI it’s rendered at, so the compositor can show the right buffer natively on the right screen."

I think this would be one or two additional messages. But I'm actually confused by why you would do this because it seems like it would still cause performance issues, if you're now rendering every program two or more times every frame. I think you may want to save this for an "accuracy mode" or something, and normally have it so only the max DPI applies.

"it even makes sense to provide a clip mask telling which part of the window each DPI should be rendered for"

This won't work in Wayland because the client never clips windows, it just redraws the whole window each time.

"What’s more complicated is handling situations where compositors don’t support this, or only support parts of it. That whole backwards compatibility part is more work than getting the whole feature built and shipped, and it’s entirely unnecessary. "

I don't see how that's complicated or why it's more work, programs would just work as they do now, i.e. you just assume those programs always have a DPI of 96 * buffer_scale.

Also depending on how you design this, you may have to consider how this would interact with OpenGL and Vulkan, and maybe consider the possibility of creating additional extensions there in order to handle this. But that may or may not be necessary, I'm not sure.


> I think this would be one or two additional messages. But I'm actually confused by why you would do this because it seems like it would still cause performance issues, if you're now rendering every program two or more times every frame. I think you may want to save this for an "accuracy mode" or something, and normally have it so only the max DPI applies.

Some compositor/toolkit combinations today render at both 1x and 2x, or similar scales, if multi-monitor setups are used.

Now, to improve that situation, you’d need to render at each of the scales of the monitors the window is on right now instead.

To improve performance of that, sending a clip mask from compositor to toolkit to say "hey, only render the left half at 2x, right half at 1x" makes sense, if the toolkit decides to support it, that’d improve performance at no cost.

> or only support parts of it

What if the compositor sends the info about each monitor’s DPI, but doesn’t support the dpi-tagged buffers, instead only supporting viewport-tagged or integer-scale-tagged buffers?


Hey, I just wanted to say that I'm also very bothered by this issue! I want to echo the sentiment that if you have a repo on Github or elsewhere with your patches, I would be glad if you shared it.


Windows has one advantage: the display server API is private, so no app talks directly to the window server; everything goes through gdi32/user32. As a result, they could make changes that Linux toolkits cannot; linking to GTK (or even libx11/libxcb/libwayland-client) is not mandatory in the Linux world, so things cannot be fixed behind apps' backs there.

And the windows solution is not reliable either; the only reliable solution was macOS one.


Android started with it from the start and developers were aware of it.

Windows and Qt were retrofitted and it shows. Both are buggy and unreliable.


> "Gnome (and Wayland) is broken, and this API must be changed."

> Again I would say no, that's wrong, it's just not currently meant to run on your hardware.

It's absurd that not running on common hardware is, somehow, not considered broken.


We can point out corner cases for both approaches till the cows come home.

Yes, both of them have advantages and disadvantages; engineering is about picking the right compromise. So yes, there will be cases where you would be better off with the other approach... for a while.

But the approach is not being picked just for today and for the current state of tech. For the same reason, you won't lock yourself into unnecessarily complicated software that is going to be permanently buggy and soon obsolete -- just like you would not appreciate being locked into Psion/Symbian-style memory management today, despite it making sense, solving a problem, and being more efficient years ago.

This is exactly the same case. That software is going to stay around for decades... your RX 5700XT is not.


I mean, it works, and the end result is a fractional scale. But, yeah, it has some overhead.

When I tried it some months ago, KDE with fractional scaling wasn't as sharp as Gnome's upscale-then-downscale (I'm not really sure why; downscaling should involve some blurring). I really wanted to use KDE, but text rendering with KDE's fractional scaling appears to be blurry.


> Gnome doesn’t actually do fractional scaling

Gnome actually does fractional scaling. It doesn't do fractional rendering. Two different things; you can do one without the other.

As I wrote elsewhere, you cannot get perfect sharpness at fractional resolutions. How are you going to render a 1px-wide line exactly?


> As I wrote elsewhere, you cannot get perfect sharpness at fractional resolutions. How are you going to render a 1px-wide line exactly?

With downscaling, I get a blurry mess. With fractional rendering, I get a line that may be a tiny bit too wide or too thin, and may be one pixel off, but it’s going to be perfectly sharp and clear.

At least render fonts at native resolution and only up/downscale the rest of the widgets. Scaling fonts can NOT be done by post-processing or it WILL be wrong.
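To see why post-scaling smears a line, here is a box-filter resample sketch (real compositors use fancier filters, but the effect is the same in kind) of a 1-logical-px line rendered 2 device px wide in a 2× buffer, then downscaled by 2/1.5 for a 1.5× display:

```python
def box_downscale(row, factor):
    """Resample a 1-D row of pixel intensities with an area (box) filter."""
    n_out = round(len(row) / factor)
    out = []
    for j in range(n_out):
        lo, hi = j * factor, (j + 1) * factor   # input span of output pixel j
        acc = sum(max(0.0, min(i + 1, hi) - max(i, lo)) * v
                  for i, v in enumerate(row))
        out.append(acc / factor)
    return out

# A sharp 1-logical-px line, drawn 2 device px wide in the 2x buffer:
hidpi_row = [0, 0, 0, 1, 1, 0, 0, 0]
# Downscaling by 2/1.5 smears it into two ~75%-intensity pixels instead of
# leaving a crisp full-intensity line:
print([round(v, 2) for v in box_downscale(hidpi_row, 2 / 1.5)])
```

Fractional rendering instead snaps the line to 1 or 2 whole device pixels: possibly the wrong width, but fully sharp.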


> At least render fonts at native resolution and only up/downscale the rest of the widgets. Scaling fonts can NOT be done by post-processing or it WILL be wrong.

They are fine, but you just cannot downscale to arbitrary sizes. Note that macOS doesn't do 125%, for example, because that's one of the worst cases - you have 5 pixels to do the job of 8.


If you do that, you throw out all the hinting and scaling code the font authors may have added.

Many fonts intentionally change weight slightly at small sizes, by scaling it afterwards you break this functionality.

Also, by scaling afterwards in the compositor in sRGB space, you create brightness issues, as the compositor (at least under Gnome) does not take gamma into account.

The amount of tradeoffs is extreme, compared to a few UI widgets getting slightly shifted around.

And I’ve already mentioned the performance issue in multiple other places in this thread.
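The gamma point is easy to demonstrate: averaging sRGB-encoded values directly, as a gamma-unaware compositor does when downscaling, darkens antialiased edges, because sRGB is nonlinear. A sketch using the standard sRGB transfer functions:

```python
def srgb_to_linear(c):
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

# Downscaling averages neighboring pixels. Take a white pixel next to a
# black one (e.g. a 1px text stem against its background):
naive = (0.0 + 1.0) / 2    # averaged directly in sRGB: 0.5
correct = linear_to_srgb((srgb_to_linear(0.0) + srgb_to_linear(1.0)) / 2)
print(naive, round(correct, 3))   # 0.5 vs ~0.735: the naive result is too dark
```

Scaled-down text therefore comes out both blurrier and darker than text rendered at the target size.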


Hinting is not really used nowadays; it made sense on low-res displays, but not on HiDPI. You are better off with the autohinter now.

Wrt weights, fonts are not defined in pixels; so adjusting for this is the easy part.

Yes, ignoring gamma is a problem, and I'm not sure whether anyone in Gnome/FreeType/HarfBuzz is working on it; probably not. But this is a problem for low dpi and integer-scaled hidpi too, not just for fractional scales.

Widgets being slightly shifted also means your mouse is going to be shifted, quite possibly in a different direction. Now that's going to be a problem that users can clearly reproduce.



