I very much appreciate all of the hard work that goes into striving for a consistent, beautiful UX in ElementaryOS. It feels like there is a driving vision behind all of the design and usability choices.
The move to Inter as a UI font will be a good step forward, as Open Sans has not aged well. Especially as displays get higher and higher DPI, another good step forward would be to completely disable font hinting. It's subjective, but it would seem consistent with the rest of the distro's approach to visual design.
But those are details that some like me deeply appreciate, while others don't see what the fuss is about one way or the other, for better or worse. What really matters is that I can throw ElementaryOS on my wife's or mom's laptop, and they can figure it all out no problem, without all of the terrible anti-user everything that is Windows.
Inter, IMO, is a terrible choice for a UI font. It’s the exact reason why Apple switched away from Helvetica [1]. If you want to learn more, see the difference between Apple’s SF Pro, SF Pro Display and SF Pro Text. Inter is a font that works neither as body text (the fitting is awfully inconsistent), nor as a display typeface, and certainly not as an interface typeface (the counters are too tight, the fitting needs to be redone specifically for interfaces, hinting is lacking, and it is untested across the various DPI settings and optimizations that someone like Apple has put effort into).
For the record, I think Apple’s UI has gotten worse, but they’ve hired one of the best typographers in the world. They did several presentations at WWDC 2016.
I absolutely agree. I tried Inter the other day for my interface, and everything immediately felt wrong. However, I then tried using it on my résumé, and it looked fantastic! On the Apple note, I agree that SF is a pretty good font, but it's also not without its issues (which are being exacerbated in updates like Big Sur).
I'd love to use Elementary OS, but their use of Vala is just such a turn off.
That language is dead except for a tiny gnome niche.
If they wanted the Mac devs for their sleek look and feel, they could have switched to Swift.
And if they wanted to gain easy popularity, they could have switched to Rust. But Vala... there is no developer pool to draw people from, and they're not Apple, where Obj-C may have been obscure but was, in the end, a really productive ecosystem.
> I'd love to use Elementary OS, but their use of Vala is just such a turn off. That language is dead except for a tiny gnome niche.
I'll go out on a limb and say that while this might have been closer to the truth 3 years ago, there has been an uptick in activity on the language recently, especially with improvements in tooling. If you look here (https://github.com/topics/vala), while you'll see it used in smaller projects, you'll also see there's a fair amount of high-quality software written in Vala. Clearly the language is capable for serious use.
As far as the language itself is concerned, I think it has a number of good points that aren't very common outside of Swift or Rust: high-level abstractions with little to no runtime cost, it's easy to learn, it's C-compatible, and it binds easily to practically every language under the sun thanks to gobject-introspection.
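For a tiny, untested sketch of what that means in practice: the class below is plain Vala, and valac compiles it straight to ordinary GObject C (you can inspect the generated code with valac -C), which is also the layer gobject-introspection-based bindings build on. The names here are made up for illustration.

    // A GObject class with a property and a signal, no VM or extra runtime.
    public class Counter : Object {
        public int count { get; private set; default = 0; }

        public signal void incremented (int new_value);

        public void increment () {
            count++;
            incremented (count);
        }
    }

    void main () {
        var counter = new Counter ();
        counter.incremented.connect ((v) => {
            stdout.printf ("now at %d\n", v);
        });
        counter.increment ();
        counter.increment ();
    }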
I'm an iOS developer and don't mind Vala at all. The language directly encodes the GTK object model, and I'd rather pick up a boring C#-ish language over a weekend than learn GTK and then work out how it is bridged into a language that already has its own, conflicting object model.
JavaScript is way more popular than Rust, and I don't feel gjs projects are doing much better than elementary OS.
Vala seemed very simple and if it's still solid enough for them to release a stable DE, I guess it's not a big issue. I hear your concern but it seems most people can adapt to Vala in a week.
Yeah, I don't get it. While it's certainly not the most popular language out there, it's not like you as a user are impacted by the choice of the programming language it's written in.
Default apps are mostly good. Music is the only one I don't like in general, and while Epiphany is not my preferred browser, it seems perfectly capable (hell, it even syncs with Firefox Sync). The rest of the apps feel very pleasant to use and I never had to bother with looking for an alternative.
We're not talking about a simple piece of software here but an entire ecosystem which has Vala as its language of choice.
Using a certain operating system comes with the implicit assumption that I'm also gonna write programs for said operating system. And I'm not gonna learn some obscure version of C#/Java for that, just because the gnome project felt that they had to n.i.h. an entire programming language.
> gnome project felt that they had to n.i.h. an entire programming language.
This isn't what happened. When Vala came out (2006), it was at the time the only programming language advertising such high-level concepts as found in C# (like async/await) while guaranteeing you native performance without a VM. At the time it was billed as a better C++ and a faster C#. When the elementary project adopted Vala (2007), Swift and Rust weren't things, and they would remain unstable for at least the next decade.
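To make the async/await point concrete, here is a rough, untested sketch of what it looks like in Vala; yield plays the role of await, and the whole thing compiles down to callback-based C with no VM. The file name is just a placeholder.

    // Build with: valac --pkg gio-2.0 read.vala
    async string read_file (string path) throws Error {
        var file = File.new_for_path (path);
        uint8[] contents;
        // Suspends here without blocking the main loop.
        yield file.load_contents_async (null, out contents, null);
        return (string) contents;
    }

    void main () {
        var loop = new MainLoop ();
        read_file.begin ("notes.txt", (obj, res) => {
            try {
                stdout.printf ("%s\n", read_file.end (res));
            } catch (Error e) {
                stderr.printf ("error: %s\n", e.message);
            }
            loop.quit ();
        });
        loop.run ();
    }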
Today the world has changed, and while I'd argue that Rust is one of the best languages invented, Vala still has some things over it: easier to learn, binds well with C and a million other languages, faster compilation and a simpler toolchain, has built-in type annotations for Gtk and DBus, all while guaranteeing you native performance.
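And a sketch of the built-in DBus annotations: the [DBus] attribute below is enough for valac to generate the proxy plumbing at compile time. The bus name, object path and method are hypothetical placeholders, not a real service, and I haven't compiled this exact snippet.

    // Build with: valac --pkg gio-2.0 demo-client.vala
    [DBus (name = "org.example.Demo")]
    public interface Demo : Object {
        public abstract string greet (string name) throws DBusError, IOError;
    }

    void main () {
        try {
            Demo demo = Bus.get_proxy_sync (BusType.SESSION,
                                            "org.example.Demo",
                                            "/org/example/demo");
            stdout.printf ("%s\n", demo.greet ("world"));
        } catch (Error e) {
            stderr.printf ("error: %s\n", e.message);
        }
    }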
You admittedly made me change my mind a bit about Vala.
Apparently it was ahead of its time, which is a bit sad. Nevertheless, elementary is the closest thing I know to a working open-source desktop OS, and Vala hinders its adoption.
At some point you gotta pick the hill you're gonna die on, and that shouldn't be dev mind-share.
The horse they bet on didn't make it, so why fall into the sunk-cost trap? Either bet on a different horse (Rust), or pick a tried-and-true one (C) and write a decent FFI for alternative languages.
It's not sunk cost. Vala is performant, plays well with a multitude of languages, doesn't require a VM, and makes it easy to write Gtk applications. Unlike Rust, it's very simple to learn, and the elementaryOS project likes it for that reason, since it lowers the barrier to entry for app developers. Based on this I'd say it's still got a lot of things going for it.
Now, is there more to improve? Absolutely. Despite real-world use, development of the language itself suffers from chronic underinvestment. But I think that's changing a bit recently. (Full disclosure: I'm involved in developing the language and tooling. We could always use more contributors.)
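To give a sense of what "easy" and "low barrier" mean here, this is roughly what a complete Gtk app looks like in Vala. It's a sketch against the Gtk 3 API; the application id and strings are made up.

    // Build with: valac --pkg gtk+-3.0 demo-app.vala
    public class DemoApp : Gtk.Application {
        public DemoApp () {
            Object (application_id: "com.example.demo",
                    flags: ApplicationFlags.FLAGS_NONE);
        }

        protected override void activate () {
            var window = new Gtk.ApplicationWindow (this);
            window.title = "Hello";
            window.set_default_size (400, 300);

            var button = new Gtk.Button.with_label ("Click me");
            button.clicked.connect (() => {
                button.label = "Thanks!";
            });

            window.add (button);
            window.show_all ();
        }

        public static int main (string[] args) {
            return new DemoApp ().run (args);
        }
    }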
I find it easier to develop Gtk apps in Python. But I can see how Vala might be more performant.
Towards this end I've worked on a transpiler (py2many). Any suggestions on what language Python should be transpiled to for the best Gtk runtime experience?
Somewhat off topic: whenever I see these user-friendly Linux distributions I think they would be great for mass deployments in schools or similar. Is there any technology that gives Elementary (or similar distributions) a deployment workflow like Chrome OS, i.e. zero-config (for the user) login against a central service and central configuration of the user account, while being easy to administer?
I've got some thoughts on this. I've found that there are two approaches to centralised management of computers: installing a pre-made all-in-one product with pre-programmed configuration options (Windows, mostly with AD and GPOs), or having numerous automatable components that can be stitched together with scripts (UNIX, sometimes Windows with PowerShell). I prefer the scripting approach, as it gives me much more flexibility beyond one vendor's products. The scripts can also be shared: I'm setting up a second "home" server right now and can mostly copy-paste Ansible roles from the first, so a school admin who's somewhat familiar with Linux could start with a complete template. The problem is that requirement of familiarity with Linux; some people will run at the first sight of scripting.
The other problem is that I don't think there's money in developing a shiny boxed solution with marketing. Schools mostly go to Linux for cost savings; you couldn't convince them to pay enough to cover all the expenses of a software company. A "support group" of school admins who administer Linux desktops might work, though, which is where the script-sharing could come in. Less shiny, less "just buy it and we're done", but more likely in my opinion.
I agree about the two paths and I share a preference for the non-vendor one. Yet at the very least I need an OS that, out of the box, accepts say an email address user@org.tld as the username and automatically contacts org's directory to check for authorisation and pull email credentials. This seems so basic to me that it should exist outside of commercial solutions.
Well, there is FreeIPA, best compared to AD, but it's still much more complicated than what Chrome OS offers. I agree something as zero-setup would be great.
Then there's Canonical's recent announcement that Ubuntu will support group policies via Active Directory, which will probably bleed into other Debian/Ubuntu derivatives.
It looks like there is still no fractional scaling out of the box (always possible with hacks). The elementary team has defended the choice not to introduce it before [1], but it just makes it unusable for me and many other users.
KDE Plasma handles fractional scaling well. Gnome does it mostly fine on Wayland. Now you can also run Gnome on X11 with fractional scaling (at least with Pop OS [2]).
There's a cost of running the DE with fractional scaling of course. But I am fine with that if that makes the DE usable!
Agreed on the need for fractional scaling, there is no escape from 13" 1080p and 27" 4K screens. Dell used to sell some laptops and monitors that worked well at 200% zoom, but even they stopped. I've rolled my eyes at the "What is HiDPI" blog post for so long.
More recent threads about fractional scaling on elementary include the comment "[fractional scaling] is blocked by moving to Wayland".
I can understand the developers not wanting to implement a FS hack that will likely be replaced soon anyway, when the project priority is supposed to be UX innovation. I can also understand delaying the move to Wayland until it meets the standards for stability. But in the meantime it's a frustrating situation, since the likely catalyst will be Ubuntu's 22.04 LTS moving to Wayland, and it's a barrier for many users.
Well, how about this. I'm going to be on chemotherapy for six weeks and I'm not working. Point me to the files in X.Org that need to change for fractional scaling and maybe I'll write a patch. It's not hard to implement, right?
Does KDE handle fractional scaling better than Gnome? I use Gnome on X11 and there are problems I run into (AMD GPU, if that matters). I end up having to run my 4k display at 1440p to make it usable, which is really unfortunate. Wayland at 4k with fractional scaling has a bunch of visual bugs for me as well.
If KDE handles these things better, I'll jump ship in a heartbeat.
I think it’s a fair philosophical choice. Fractional scaling is really hard. macOS doesn’t bother, choosing instead to render at an integer scale factor and downsample. Doing something similar on Linux would be hard, for mostly historical reasons.
At first blush, it felt like a copout to have the compositor downsample. After all, wouldn’t that be blurry? Then it occurred to me: it’s always going to be blurry. You can’t draw a “1px border” at 150% without blurry edges. I am fully aware that rendering at a higher resolution and then downsampling will produce a different result, but still, as long as we are piggybacking off the fact that our layouts are constrained to a pixel grid to make our UIs look nice and sharp, fractional scaling will be blurry.
The story only keeps getting worse for fractional scaling, though. People who have used VSCode at a fractional DPI have likely already witnessed a somewhat confounding problem: the terminal is not just blurry, it is scaled such that it sometimes looks like it is 1px off in height or width. Because it is. HTML is scalable and DPI-independent in part because you are only ever exposed to the virtualized pixels. But on the flip side, this means that if you have a canvas somewhere, determining how many actual pixels you need for the buffer is not trivially possible. The grid fitting could be different at different corners of the canvas. This is obviously an example specific to HTML, but similar conundrums naturally appear any time only DPI-independent units are exposed.
Using floating point can get you into trouble too. I have not personally experienced it but I have seen scaling artifacts in Qt that are apparently spurred on by floating point error. While I don’t know the exact mechanism by which this happens, it’s not too hard to construct a scenario in your head.
All of this adds up to a bleak picture in my opinion. Wayland has a very good implementation of DPI awareness that lets unaware clients fall back by simply doing nothing[1], which is a big step up from X11. In theory, Wayland compositors could try to avoid fractional scaling by picking an integer scale to advertise and downsampling all of the surfaces. But even failing that, at least you could push legacy apps to just see everything at 1x and get the extra-blurry visuals.
Until Wayland is a reasonable successor for all users, though, choosing between Wayland for better fractional scaling and Xorg for better software and hardware support is a tough call. It may not directly factor into Elementary OS and their decision not to support fractional scaling, but I can only imagine it’s not helping any. There are plenty of problems to solve and improve on in Linux that are a bit more tractable.
And then, after we’ve all finally solved all fractional DPI issues, the world will probably have virtually eliminated displays in consumer devices that look too big at 200% scaling. Of course.
P.S.: I am fully aware I left out details here about font rendering, and some GNOME-powered Wayland issues like their horrid dbus API for cursor size change notifications. It’s already a long enough post that goes into the weeds too much.
> I think it’s a fair philosophical choice. Fractional scaling is really hard. macOS doesn’t bother, choosing instead to render at an integer scale factor and downsample. Doing something similar on Linux would be hard, for mostly historical reasons.
What macOS does is the cleanest and most correct way to fractionally scale in my book.
Why would it be hard on Linux? You just blit the buffer at whatever resolution you want.
In Wayland this is simple. In Xorg it is not because the X protocol contains no knowledge of what scaling factor your app is rendering at. The server doesn’t know about DPI scaling. You can’t just make things “appear” as if they’re a different scale factor; X apps get real pixel coordinates in their events, directly from the X server.
The Wayland compositor knows exactly what scale every surface is rendered at in a given frame, and it has a unified protocol to advertise it.
What macOS does is not terrible, but downsampling from 2x to 1.5x will look a decent bit worse than, say, 3x to 1.5x or rendering at an actual 1.5x. In theory, with today’s Wayland protocol, as far as I know, a surface could set its own scale to 2x or 3x and get the desired behavior. However, it definitely isn’t free...
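Quick back-of-the-envelope numbers for that, as a toy sketch (not real compositor code) for a hypothetical 1440x900-point window at an effective 1.5x:

    void main () {
        double logical_w = 1440;  // layout size in points
        double logical_h = 900;
        double target = 1.5;      // effective scale the user asked for

        // Render at 2x, downsample to 1.5x:
        //   2880x1800 buffer -> 2160x1350 on screen (resample factor 0.75)
        stdout.printf ("2x: %.0fx%.0f -> %.0fx%.0f\n",
                       logical_w * 2, logical_h * 2,
                       logical_w * target, logical_h * target);

        // Render at 3x, downsample to 1.5x:
        //   4320x2700 buffer -> 2160x1350 on screen (resample factor 0.50,
        //   i.e. each output pixel averages an exact 2x2 block, which is
        //   part of why it holds up better than the 2x case)
        stdout.printf ("3x: %.0fx%.0f -> %.0fx%.0f\n",
                       logical_w * 3, logical_h * 3,
                       logical_w * target, logical_h * target);
    }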
Because of text rendering. Apple is a very... idealistic company. From the very beginning of their use of anti-aliased vector fonts, it’s clear that they did not want hacks like pixel grid fitting or subpixel anti-aliasing that compromise the integrity of the font or tie the rendering closely to the way it is blitted. The way they made up for this was ultimately by shipping HiDPI displays in most of their devices. That way, the blurry rendering became more palatable.
Now don’t get me wrong: I’m not saying it looks unusable at 1.5x or 2x scaled down to 1.5x. It just looks dramatically less crisp than actual 1.5x with subpixel anti-aliasing. But you can’t scale a surface with subpixel anti-aliasing without defeating it because the process of scaling will treat the different subpixels of rendered text as just RGB values. This looks terrible. Even worse is if you are using hinting on top of that - now you are grid fitting to a grid that has no correlation to the actual grid of the display, thus, hinting actually just makes things look worse while compromising the integrity of the outlines.
Granted, upscaling 1x has some of these issues and it is passable. Still ugly, but it doesn’t look unusably bad. But it is still quite wrong.
Turning off hinting and subpixel AA might seem like the right call, but I don’t think so. For some languages, like East Asian languages, subpixel AA can add visual clarity that makes it much easier to read complex glyphs. While for English it is merely a creature comfort, for some languages it might provide a much needed legibility improvement.
All of this is what I was trying to subtly allude to near the end, but oh well :)
tl;dr: The X server itself is not aware of window DPI, and cannot virtualize pixels in events and protocols. Wayland compositors are, but it is still maturing and software and hardware compatibility is gradually working its way up. And OSes other than macOS use tricks like hinting and subpixel AA to improve font legibility which do not play terribly well with framebuffer scaling.
Tl;still dr: doing this would be hard for historical reasons :)
Fractional scaling is mostly needed on High DPI displays. High DPI displays also happen to be the case where you don't need subpixel AA, nor do you need native FS rendering. Even for East Asian languages.
It's ironic, I think, you say Apple is an idealistic company, but they have implemented a very simple and pragmatic approach, and with great results.
While here we're discussing why Linux can't do the same due to a dozen self-imposed restrictions that don't matter in practice. The concept of (non-fractional) logical pixels can be implemented very neatly into any rendering subsystem without this affecting the concept of pixels, or events, etc. Seems no one can see the forest for the trees in this case.
> Fractional scaling is mostly needed on High DPI displays. High DPI displays also happen to be the case where you don't need subpixel AA, nor do you need native FS rendering. Even for East Asian languages.
2x scale displays certainly make East Asian text readable, but rendering 2x and downscaling to 1.5x will do a lot of damage to the legibility. It’s far from ideal.
> It's ironic, I think, you say Apple is an idealistic company, but they have implemented a very simple and pragmatic approach, and with great results.
When they first implemented it, it shipped on displays that could only handle 1x scaling, for a very long time. The first “Retina” MacBook is from 2012, while macOS dates back to 2001. I have an iBook G4 sitting next to me. It’s a nice piece of hardware, but yes, text is blurry. Apple is “idealistic” in that they favored elegant engineering over practicality. Not everything Apple has ever done embodies this, but they definitely try to change the status quo rather than make the most of it, and this is surely no different.
I didn’t say it as necessarily a jab, but it had its tradeoff. It’s a distinctly Apple move.
> While here we're discussing why Linux can't do the same due to a dozen self-imposed restrictions that don't matter in practice. The concept of (non-fractional) logical pixels can be implemented very neatly into any rendering subsystem without this affecting the concept of pixels, or events, etc. Seems no one can see the forest for the trees in this case.
The self-imposed restrictions you refer to are issues with the design of the Linux graphics stack. Like I said, the X11 protocol does not give the X server any awareness of the scale an application renders at. The deeper you dig, the clearer it is that it simply couldn’t be fixed. In the X11 world, screen coordinates are exposed to all clients. In the Wayland world, clients only use surface-relative coordinates. This might sound like an unimportant distinction, but when you are trying to implement DPI virtualization, the API matters. If applications running at different scale factors suddenly have their drop-down and context menus appearing all across the screen, that’s not a good experience.
So why is X11 like this? Well, it’s ridiculously old. The X protocol has changed and evolved a fair bit, but it’s so old that its lineage predates the very first Mac altogether. Xorg has improved plenty in the last 20 years, but the protocols have been largely frozen for quite a while, with some minor exceptions. Trying to fix these issues now, 40 years down the road, is a losing battle. It would be possible to try to pull a Microsoft, invent and implement a new DPI virtualization layer. But it’s just not that simple. Xorg does not have a builtin compositor, and it does not force you to use compositing. From the server PoV, there’s no obvious place to scale surfaces. There aren't even "surfaces."
The list goes on. So if it’s so old and problematic, why didn’t they try to fix or replace it? Well, it turns out they have. It’s called Wayland. It’s not new; it’s been cooking since 2007. Wayland provides a global compositor that is aware of the scaling factor of surfaces and old X11 apps are DPI-virtualized by virtue of the way Xwayland works.
So why isn’t Wayland everywhere? Because open source developers can’t will it into existence. They have to push things forward incrementally. The Linux desktop is not exactly the best-funded thing in existence. Canonical decided to pursue its own display server, Mir, instead of collaborating on Wayland, so its resources were not going into it. Meanwhile, semi-political squabbles with NVIDIA and the Linux kernel developers have led the proprietary NVIDIA drivers to be very slow to support Wayland. NVIDIA tried to forge a path around the unified buffer management APIs used by every other graphics vendor and proposed EGLStream, which ultimately hasn’t worked out, and they are now pursuing a better path forward.
The lesson here is that you can’t say “why didn’t they just do the obvious thing? It’s sad everyone is too stupid to understand this.” Everyone gets how to implement DPI virtualization; that isn’t the hard part. But in 1984, display scaling wasn’t at the top of everyone’s mind. It wasn’t in the early 2000s either. So it’s not surprising that the decades-long legacy of X has left it hard to handle scaling.
It's worth adding on here: the reason they did not simply try to use the existing X drivers is that those were an equally bad thorn in their side, fraught with plenty of peril. But let's please not get into that. I am not an expert, and when I dug into it, it felt like enough yak shaving to last a lifetime.
In other words:
Tl;dr: it’s hard to do under Linux for historical reasons.
> 2x scale displays certainly make East Asian text readable, but rendering 2x and downscaling to 1.5x will do a lot of damage to the legibility. It’s far from ideal.
I'm typing this on a MacBook with scaled display and legibility is just fine. Asian scripts look fine, too. iPhones also use a scaled output (most recent models all do, there's no match between physical pixel and GPU pixel at all anymore on OLED, and there's no match between GPU and logical pixel either). You'll never hear customers in Japan or China complain about poor legibility.
That's the difference between theory and practice. If you try it, you realize there's no problem. Otherwise I'm sure we can keep writing about how bad it is in theory all day.
> I have an iBook G4 sitting next to me. It’s a nice piece of hardware, but yes, text is blurry.
You're talking about an iBook that 1) isn't high DPI 2) using OSX which isn't scaled (fractionally or in any way at all).
I'm sorry but your iBook is irrelevant to the discussion. We're not discussing here legacy text AA on legacy low DPI display. None of this matters.
>I'm typing this on a MacBook with scaled display and legibility is just fine. Asian scripts look fine, too. iPhones also use a scaled output (most recent models all do, there's no match between physical pixel and GPU pixel at all anymore on OLED, and there's no match between GPU and logical pixel either). You'll never hear customers in Japan or China complain about poor legibility.
>That's the difference between theory and practice. If you try it, you realize there's no problem. Otherwise I'm sure we can keep writing about how bad it is in theory all day.
I do not appreciate the way you are treating me as if I clearly do not have any experience or knowledge on the subject matter. I in fact use an iPhone and have owned multiple Mac computers, including my M1 Mac Mini. I also studied Japanese in college and font rendering was always a kind of issue.
Comparing Linux and macOS is not too interesting since Linux actually is closer to macOS in many regards with font rendering. Instead, it would be instructive to first compare Windows 10 to macOS (Big Sur).
This comparison shows the default fonts in macOS and Windows 10 rendering the word 「醤油」. macOS Text Edit is on the left, and Windows 10 Notepad is on the right. I set the font size to 10pt on Windows to help make it a more even comparison; Windows defaults to 12pt which makes 1x plenty legible. However, in a head-to-head comparison at a similar font size, 1x is not terribly legible here on either side. Still, at 1.5x you can see a clear difference in legibility already. This is with a common word that is mildly complex, but still not nearly the worst case scenario for typography. And yet, some of the features are quite difficult to distinguish. It's good enough, but it's not great. The Windows typography here is ugly, but practical.
What if macOS could scale 1.5x natively? Well, the comparison would look like this:
On the left is 12pt at 1x, and on the right is 18pt at 1x. At first glance it does look similar, and neither is ideal. However, this is still a noticeable improvement. The two strokes near the bottom of the left kanji are now distinguishable with the human eye, and it is generally less blurry. This is without the advantage of hinting or subpixel AA, which should make it even less blurry and increase the horizontal spatial resolution enough to make more features easily distinguishable. And this is still not a worst case example. It’s an example of something common.
By the way, regarding iPhones. I have an iPhone XS. The iPhone XS has a DPI of 458! That is insane. The monitor I'm currently using, by comparison, has a DPI of around 110. So yes, you can get away with quite a lot on an iPhone XS without any noticeable artifacts or blurriness. But that's not realistic. Laptops and phones are different devices with different hardware and different distances that we view them at. OLED displays on computers are still the exception.
> You're talking about an iBook that 1) isn't high DPI 2) using OSX which isn't scaled (fractionally or in any way at all).
> I'm sorry but your iBook is irrelevant to the discussion. We're not discussing here legacy text AA on legacy low DPI display. None of this matters.
I don't know what to tell you, I was trying to illustrate the pitfalls of Apple's early choice to not integrate techniques like hinting and subpixel anti-aliasing into their font rendering. It's not like the font rendering looks meaningfully different on an iBook G4 than it does today, it's that the screens have gotten higher resolution. They waited it out. If you want to talk about impractical decisions, the 10 year stretch when Apple laptops had blurry fonts and no HiDPI displays is certainly relevant to the discussion.
Also, Linux still needs to support 1x rendering well. It does not have the luxury of choosing the hardware it runs on.
I'm sorry the conversation tone is changing; I didn't do this intentionally. But there are a few things that don't logically connect for me here.
Supporting FS using the macOS algorithm doesn't affect 1:1 rendering at all. You can enable subpixel AA when rendering at non-fractional scale, and you can disable it when you do fractional scale.
Windows 10 already does this per control, i.e. when you render a control in RGBA it disables subpixel AA; otherwise it supports it. In this case what I propose is vastly simpler: do it per desktop, or even per X server instance.
It's literally an afternoon project, and let people pick their settings.
Anyway none of this matters, because Linux on the desktop doesn't matter either (yet). And with that attitude honestly it'll never matter.
As a last note, you can do subpixel AA with the macOS algorithm. It'll be more performance intensive as it'll be a custom shader to scale this way, but it's mathematically 100% doable. I just don't think it's worth the bother.
What I am trying to tell you is that 1:1 font rendering on macOS is simply not very good. It made Apple the butt of jokes for a while. If you think that 1:1 rendering of common kanji is good, well, we’ll just have to agree to disagree. As for downsampling for fractional scale... I was only ever trying to demonstrate that it is not ideal, and not everyone wants it. I still (as I said earlier in the thread) find it to be an acceptable tradeoff.
And yeah, UI toolkits could switch off hinting and subpixel AA at 2x. I acknowledge that this would be logical anyways, since subpixel AA is not very useful at 2x.
But honestly, as far as Apple typography goes, it’s all beside the point. This was in a thread where I was initially trying to demonstrate that doing it in Linux is hard for historical reasons. I believe I made that point. As for will it ever be done? Well, like I said, it has been done. Wayland supports what you are describing and more just fine, right now, and it works very well. (Full disclosure that most programs will actually just use fractional scale, but you can also round up to the next integral and the compositor will downsample as you would hope.) The actual hard problem is getting Wayland working for everyone, on their hardware and with their software. Progress is being made every year, but it’s been a long road. Until then, I just think it would make more sense for Elementary OS to work on more tractable problems with the Linux desktop.
In the age of high density displays, fractional scaling is a must. It's the first thing people will encounter that indicates that a DE is antiquated.
On a 11" display at 1920x1080, the alternative is to suffer resolution mismatch (which can introduce any number of artifacts depending on how the panel handles the input, the least of which is a blurry/fuzzy display) or drop the resolution fractionally for pixel accurate rendering.
I did not buy a 11" laptop with a FHD display to browse at 960x540.
This is why I’m totally against using such high resolutions on such small displays in the first place. Sure fractional scaling would help—but sticking to a perfectly reasonable 900p or even 768p would look even better _and_ save battery and processing power!
(Remember, we’re talking about an 11" laptop. 1366x768 is perfectly reasonable at that size; totally different from the crappy 15" devices that use that resolution to cheap out.)
Elementary OS has truly come so far. I still remember the old days when it started as a Gtk theme, and how excited I was back in the early 2010s to try out the beta versions of Luna. Unfortunately I haven't used it since I bought a MacBook, but I truly appreciate the effort by the Elementary team to provide a well-thought-out, aesthetically pleasing Linux distribution.
I'm honestly always confused as to what makes Elementary (or Mac for that matter), so much more pleasing than the much maligned GNOME?
I'd like to understand because these discussions which often include "I'd use Linux but awful desktop" always leave me feeling like people are tapped in to a dimension I'm not privy to.
I want a UX which is mostly unified (outliers will happen, but should be few), clean and crisp, and easy on the eyes. This is of course subjective, and I've never found an equal on Linux, and Windows has never measured up for me, including Windows 10. Font rendering has always looked oddly antialiased and mostly washed out on Linux. It's never felt as refined as MacOS' UX. But again, that's my personal take.
The turnoff for me is usually UI scaling, i.e. I have a 4k monitor and I want to scale the entire UI up to maybe 175% of the normal size. On Ubuntu, with GNOME, I can do this in the settings, but at the expense of performance and sharpness. I'm not sure how Cinnamon or Elementary fare. By contrast, Mac OS X handles this without any issues: there is no blurriness and no performance degradation at any UI scale.
I did a 360. I was a GNOME 2 fanboy, switched to FUDUNTU because of its GNOME 2 support (before the MATE/Cinnamon forks), left GNOME because of Unity, etc., had a Macintosh, used Xubuntu.
10 years later I'm on vanilla GNOME and I think it's amazing. Truly perfected.
I think it’s subjective; both Gnome and Elementary seem like interesting designs. But Elementary is obviously inspired by macOS, so it’s going to attract people who like that over Gnome’s take, or who would otherwise prefer the Windows XP end of things.
Lots of negativity, as is expected on HN. For what it’s worth, I like Elementary OS and hope version 6 is a success. I’m going to kick the tires today.
Tried elementary, but the HiDPI scaling is useless! Also it was unable to handle a different DPI for each display.
Has any of this changed in this version?
I always run Elementary shortly after each release and then after a few months find myself coming back to pure Ubuntu after the rest of the ecosystem moves on and updates.
I'm very excited for the multi-gesture trackpad support. It's such a small change, but I just haven't been able to find good support for this outside of Mac OS.
I'm tempted to install this version the second they hit their first release candidate for that feature alone.
I ran elementaryOS for a couple of years when I worked in research. I switched off when I bought an Intel NUC: the Skylake platform wasn't supported yet (not their fault), so I had to run the latest kernel, and I never went back.
I like elementary. It's pretty, works well, etc. One thing that did irritate me was when I had to install something that wasn't "default" - a Gnome or Qt application - and it looked too fugly. It just didn't work.
I wish these guys the best. I think what they are doing is really cool, and I admire all the work they've done in Vala and whatnot.
Quite looking forward to this as my Linux laptops all run it. I’ve found it to be a great DE with a clean, no-fuss experience, and have often considered moving to it (plus a Windows VM with pass-through graphics) for a desktop.
I don’t necessarily like Flatpak, but these childish websites with badly written arguments are the things that should be killed.
These are toxic and don’t help anyone at all.
Also anecdotally I quite like flatpaks as a user. For example having Lutris as a flatpak (it's in flathub beta) is quite nice because it doesn't pollute my system with all those Wine dependencies.
Pro Tip: Use Flatseal to manage Flatpak permissions, e.g. give some apps access to specific folders you use that they don't have by default.
Wow! I never saw that site but it is FULL of misinformation and purposely claiming misleading "facts". I like how there is absolutely no contact info, as this tool knows he's full of crap.