This is also likely a performance nightmare. Funny that they mention that "new hardware has enabled us to..." which means that this will perform poorly on old devices.
At a previous company, we were forbidden from using translucency (with a few exceptions) because of the performance cost of blending. There are debugging tools we'd use fairly often to confirm that all layers were opaque.
Unlikely. Frosted glass blur was introduced almost twelve years ago in iOS 7, and was supported all the way down to the iPhone 4. Many apps like control center have used a full screen blur without any performance issues for a long time.
Apple at the time created their own 'approximate gaussian blur' algorithm specifically to enable this, and it ran crazy fast on devices where a simple gaussian blur would barely achieve double digit FPS. Even if this 'liquid glass' effect is heavier to compute, on the hardware we have today it will be a negligible performance concern.
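Apple never published the exact algorithm, but the usual way to get an "approximately Gaussian" blur cheaply is to work on a downsampled copy and run a few box blurs (repeated box filters converge on a Gaussian). A rough NumPy sketch of that general trick, not Apple's actual implementation:

    import numpy as np
    from scipy.ndimage import uniform_filter, zoom

    def approx_gaussian_blur(img, radius, passes=3, downscale=4):
        """Cheap stand-in for a Gaussian blur: blur a downsampled copy with a
        few box filters (repeated box blurs converge on a Gaussian), then
        scale back up. `img` is a 2-D float array (one channel)."""
        small = img[::downscale, ::downscale]          # work at 1/downscale resolution
        size = max(3, 2 * (radius // downscale) + 1)   # box width at the reduced scale
        for _ in range(passes):
            small = uniform_filter(small, size=size, mode="nearest")
        return zoom(small, downscale, order=1)[:img.shape[0], :img.shape[1]]

The downsampling is where most of the savings come from; a full-resolution Gaussian at a large radius is exactly the kind of thing that would have tanked the frame rate on an iPhone 4-class GPU.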
> Unlikely. Frosted glass blur was introduced almost twelve years ago in iOS 7, and was supported all the way down to the iPhone 4. Many apps like control center have used a full screen blur without any performance issues for a long time.
"Without any performance issues"? Entirely false - reviews at the time noted iOS 7 dramatically reduced battery life - all across the board for Apple devices, even for the then latest iPhone 5S and 5c (https://arstechnica.com/gadgets/2013/09/ios-7-thoroughly-rev...).
The abuse of transparency/translucency in the UI was the primary reason - you could go to Accessibility settings and disable animations + transparency/translucency and get notable increases in both runtime speed of the OS UI and battery life.
Indeed, I remember the switch to iOS 7, for me battery life seemed to get slightly worse but there were conflicting opinions at the time. It's fresh in my memory as it was around the same time I binged on all five seasons of Breaking Bad :)
It's also true that iOS 7 made the 4/4S seem much slower, but the frosted glass effect still ran at 60FPS - that was my point. It was really impressive at the time. Though unless you spent hours sliding Control Center up and down, it's hard to blame the blur effect for the reduced battery life, as it rarely appeared inside apps. More likely it was the result of increased OS bloat and the proliferation of background services.

You can’t judge battery life and performance off a .0 release when the priority is on delivering features with the minimum number of showstopper bugs. At least wait until the .1.
It has been like this for every Apple release for over 20 years.
Maybe for "Apple", but there's one team that takes performance seriously. The WebKit team has a zero tolerance policy for performance regressions (https://webkit.org/performance/) dating back to the implementation of the Page Load Test in 2002 (Creative Selection, p. 93).
WebKit sounds like the kind of scrappy startup Apple might want to acquire and gain some hard-earned engineering knowledge.
If Apple has been shipping betas for 2 decades that do not meaningfully prepare the release candidate for users, something is horribly wrong. They're either not listening to the feedback they receive or they're not giving themselves enough time; both are firmly within Apple's control.
Well, firstly, this is a developer beta, so the target audience is developers who want a head start on getting their app(s) ready. Measuring battery performance on those dev betas is dumb.
Also, they do gather feedback and they do listen to it. They won't change the entire design language now, though.
The parent comment wasn't talking about the developer beta, they were talking about the .0 release. They should use the release candidates as an opportunity to dogfood new solutions instead of shipping an MVP to prod.
It’s almost certain to be a fairly cheap thing, at least for a GPU that can sling pixels at the gigabytes per second necessary to get smooth touch scrolling at these screen resolutions.
The demos only show a very limited array of shapes. Precompute the refraction, store the result in a texture, and the gist should be sample(blur(background), sample(refraction, point)). Probably a bit more complicated than this—I’m no magician of the kind that’s needed to devise cheap graphics tricks like this—but the computational effort should be in that ballpark. Compared to on-device language models and such, I wouldn’t be worried.
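As a minimal sketch of that sample(blur(background), sample(refraction, point)) idea, assuming the refraction is baked into a per-pixel offset texture (everything here is my own guess at the shape of the computation, not Apple's pipeline):

    import numpy as np

    def glass_composite(background_blurred, refraction_map, origin):
        """`refraction_map` is an (H, W, 2) array of baked per-pixel sample
        offsets for one glass shape; `origin` is where the element sits on
        screen. Each output pixel just re-samples the pre-blurred background
        at a displaced position."""
        h, w = refraction_map.shape[:2]
        oy, ox = origin
        ys, xs = np.mgrid[0:h, 0:w]
        src_y = np.clip(ys + oy + refraction_map[..., 0], 0, background_blurred.shape[0] - 1)
        src_x = np.clip(xs + ox + refraction_map[..., 1], 0, background_blurred.shape[1] - 1)
        return background_blurred[src_y.astype(int), src_x.astype(int)]

On a GPU this amounts to one extra texture fetch into a blurred backing layer per output pixel, which is in the same cost ballpark as the iOS 7 frosted glass.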
(Also, do I need to remind you of the absolute disdain directed by 95/98/Me/2000 users at the “toy” default theme of XP? And it was a bit silly, to be honest. It’s just that major software outfits don’t dare to be silly anymore, and that way lies blandness.)
> It’s just that major software outfits don’t dare to be silly anymore, and that way lies blandness
Great observation! We need some of that silliness back. Everything is all serious and corporate nowadays, even 'fun' stuff like social media or games. Even movies can't be silly anymore.
Not sure about 'serious and corporate'; the big corps like to appear cute, folksy, etc., and recently we even saw the new Google Material Design advertised as having been judged more "rebellious" by focus groups. Maybe bland and toothless is just the general direction of contemporary culture and style that they follow.
Myself, I can appreciate corporate stuff presenting corporate. More truthful, feels a little less manipulative.
>It’s almost certain to be a fairly cheap thing, at least for a GPU that can sling pixels at the gigabytes per second
Okay, but what about the battery connected to the GPU? The battery in my iPhone has already degraded below 80% health in the 2.7 years I've had it, so I'd rather not waste its charge on low-contrast glass effects.
> And all of this just to make the whole UI white and generic.
3:30–3:45 in the video is painful. Describing “giving you an entirely new way, to personalise your experience”, while showing… white. White white white. Oh, and light tinted backgrounds to set your white on. I hope the personalisation you wanted was white.
We used to have such customisation, then it kinda went away for a while because it was too hard and limited development, and then dark mode was hailed as a brilliant new invention.
But it is worth remembering that dark mode does actually get you some things; it’s not all bad: the restrictions do have some value.
Full customisation became paradoxically limiting: when you give too much power to the user, the app is essentially operating in a hostile environment. Of course, a lot of it was laziness on app and UI framework developers’ parts, but it really did limit innovation, too.
Dark mode gets you a pair of themes that you can switch between easily, and an expectation that there are only two themes you need to consider, with well-defined characteristics. This is a much more practical target, a vastly easier sell for app and framework developers.
The funny thing with monochrome icons is that in some ways they were actually a better fit for a full-customisation environment, where you had arbitrary background and foreground colours. Once it’s just mundane light and dark themes, you could more safely have full colour in two variants.
Certainly light mode and dark mode does not mean things need to be monochrome.
From what I've seen, the refractions happen in predictable contexts, so I suspect they'll be able to create shaders, etc. that will limit the performance hit.
I would imagine that for a known geometry of glass, you can do the ray tracing once, see where each photon ends up, and then bake that transformation into the UI. If you do this for each edge and curve your UI will produce, you can stitch them together piecewise to form UI elements of different shapes without computing everything again from scratch.
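A sketch of what that piecewise stitching could look like, with a corner patch and an edge profile baked once and then mirrored into place (the shapes and sign flips are illustrative assumptions, nothing more):

    import numpy as np

    def stitch_rounded_rect(corner, edge_profile, h, w):
        """Assemble an (h, w, 2) displacement map for a rounded rectangle from
        pieces baked once: `corner` is an (r, r, 2) offset map for the top-left
        corner, `edge_profile` an (r, 2) offset-per-distance profile for a
        straight edge. Offsets are (dy, dx); mirroring flips the matching sign.
        Assumes h, w >= 2 * r."""
        r = corner.shape[0]
        disp = np.zeros((h, w, 2))
        disp[:r, :, :] = edge_profile[:, None, :]                      # top edge
        disp[h - r:, :, :] = edge_profile[::-1, None, :] * [-1, 1]     # bottom (flip dy)
        disp[:, :r, :] = edge_profile[None, :, ::-1]                   # left (offsets along x)
        disp[:, w - r:, :] = edge_profile[None, ::-1, ::-1] * [1, -1]  # right (flip dx)
        disp[:r, :r] = corner                                          # top-left corner
        disp[:r, w - r:] = corner[:, ::-1] * [1, -1]                   # top-right (flip dx)
        disp[h - r:, :r] = corner[::-1, :] * [-1, 1]                   # bottom-left (flip dy)
        disp[h - r:, w - r:] = corner[::-1, ::-1] * [-1, -1]           # bottom-right
        return disp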
Early iPhone hardware was barely keeping up with rendering the UI even with a total ban on transparency. The iPhone 4 improved the hardware a lot, but it also increased the number of pixels that had to be pushed around.
And yes, later iOS versions on early hardware were a huge PITA and a real slowdown.
I suspect that their new technique implements the existing fast gaussian blur, and since the patent is about to expire, it was a good time to spice it up.
I suspect as others have mentioned here, they use a "Liquid Glass" shader which samples the backing layer of the UI composition below the target element and applies a lens distortion based on the target element's border radius, all heavily parameterized so as to be used with the rest of the system's Liquid Glass applications like the new icon system.
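Something in the spirit of that parameterization, where the border radius drives how far the "lens" reaches in from the edge - the names and the falloff curve are pure guesses on my part, not anything Apple has documented:

    import numpy as np

    def lens_offsets(h, w, corner_radius, strength=0.5, falloff=1.5):
        """Per-pixel (dy, dx) sample offsets for a rounded-rect 'lens': pixels
        near the border get pulled toward the centre, with the reach of the
        distortion set by the corner radius."""
        ys, xs = np.mgrid[0:h, 0:w].astype(float)
        cy, cx = (h - 1) / 2, (w - 1) / 2
        # distance to the border of a rounded rectangle centred in the buffer
        qy = np.abs(ys - cy) - (h / 2 - corner_radius)
        qx = np.abs(xs - cx) - (w / 2 - corner_radius)
        dist = np.hypot(np.maximum(qy, 0), np.maximum(qx, 0)) - corner_radius
        # distortion is strongest right at the edge, dying off toward the middle
        edge_weight = np.clip(1 + dist / corner_radius, 0, 1) ** falloff
        mag = strength * corner_radius * edge_weight
        norm = np.hypot(ys - cy, xs - cx) + 1e-6
        return np.stack([(cy - ys) / norm * mag, (cx - xs) / norm * mag], axis=-1)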
Surely it's a performance nightmare, because whatever is behind the frosting has to be rendered in full. Without that, the compositor can see it's occluded and skip rendering it. Or does macOS not do that?
I don't know how long you've been following Apple, but with previous "high cost on old hardware" features, they just disabled them on old hardware.
Apple loves their battery life numbers, they won't purposefully ship a UI feature that meaningfully reduces them. Now bugs that drop framerates and cause hangs, they love shipping those.
Maybe in the past, but my iPhone 13 still has pretty good battery life considering the battery has physically degraded over the years. No update felt like it killed the battery.
Eh, I use an iPhone 11 that's 5.5 years old, with the original battery and to this day the battery life is not noticeably different from when it was new.
It's the first iPhone I bought and has lasted longer than any of the three Android phones I had before it.
Literally impossible for your battery life not to have degraded in 5.5 years; battery tech just degrades - my 14 Pro was noticeably worse in less than a year.
Windows Vista introduced this same concept. Performance was awful unless you had compatible graphics acceleration. 20 years later, I think most devices should be fine, especially Apple devices.
Vista was dogged by issues caused by migrating display drivers from XPDM to WDDM 1.0, something that was only finished by 7 (which dropped XPDM fully and introduced WDDM 1.1) and 8 (which afaik had mandated WDDM 1.1 only).
Unlike the previous GDI acceleration, DWM.EXE could composite the alpha channel quickly on the GPU and generally achieved much higher fill rates on the same hardware - if the drivers worked properly.
Yeah, one of the easiest ways to make Windows Vista/7 perform better was to simply disable all the fancy UI graphics that add nothing. I don't care if my window title bars have a gradient and animated transparency. It's actually a bit distracting and makes the system perform worse, so I just turned it off.
Even on modern devices though which have more computation and graphics power to the point that they aren't going to actually lag or anything while rendering it, why waste cycles and battery animating these useless and distracting things? There's no good justification.
These performance-hungry "improvements" are introduced deliberately to slow down older devices and force a device refresh across the user base.
I have been using an 8-year-old iPhone just fine, but features like these will make the experience slower and slower over time, until I am forced to refresh my iPhone.
I think probably a much bigger problem is app bloat. Devs are usually using very recent if not brand new top end devices to test and develop against which naturally makes several types of performance degradation invisible to them (“works on my machine”). Users on old and/or low end devices on the other hand feel all of those degradations.
If we want to take increasing device lifetimes seriously we need to normalize testing and development against slow/old models. Even if such testing is automated, it’d do wonders for keeping bloat at bay.
More likely it's a result of pressure to ship highly visible "improvements," combined with a lack of ideas that could improve the experience in a meaningful way. What do you do in that situation? Ship an obvious UI update that wouldn't have performed on the last gen hardware.
I haven’t used the new UI, so don’t assume this to be an endorsement of it, but even if you have good ideas about UI improvements and implement them, there still is pressure to make the UI look different because that, at a glance, shows users that they get something new.
And yes, “looking different” doesn’t have to mean “requires faster hardware”, but picking something that requires faster hardware makes it less likely that you will be accused of being a copy-cat of some other product’s UI.
And you base your first sentence on…? Surely not the ol’ “my phone slows down when my battery is failing so that I’ll buy a new phone” canard?
To be clear, these are new features that will likely have a setting to turn off. There’s no conspiracy, nothing “forcefully” added for the purpose of driving upgrades. (Ah, ninja edit): There’s not even a guarantee these features will be supported on an eight year old phone. EDIT: wait a minute...your eight year old phone won't even be supported.
(EDIT: reworded first paragraph to account for the ninja edit.)
When is the last time a company has admitted wrong-doing? No, Apple admitted to slowing down phones when the battery was shot so it wouldn’t just suddenly shut down.
I adamantly believe this was the right call for Apple to make. I frequently switch between Apple and Android phones across different generations. At the time I had an aging flagship Samsung that did NOT do this. My battery indicator would say "18%" and it would last however long that implies...if I didn't do anything remotely CPU-intensive. If I did anything that boosted the CPU, the current draw caused the battery voltage to fall off a cliff and the phone would instantly shut down without warning.
The worst part was that during the boot sequence, the CPU ran at full-throttle for a few moments until the power-management components were loaded. So I couldn't restart it. As long as I didn't open a game or YouTube or a wonky website with super awful javascript, I could continue using the phone for another couple hours. But if the phone turned off, it couldn't be turned back on without charging it more ... even though it had "18%" battery left (as determined by voltage, not taking into account increased internal resistance in the battery as it ages).
I was envious of iPhone users that got a real fix for this (Apple slowing down the phone when the internal voltage got low). I would have greatly preferred that Samsung had done the same for my phone too.
I agree, it was the right call to make -- a temporarily-impaired device is always better than a temporarily-failed device, especially when you're talking about something you may need in an emergency situation.
That said, Apple _significantly_ erred in not over-communicating what they were doing. At that point, the OS would pop warnings to users if the phone had to thermal throttle, and adding a similar notification that led the user to a FAQ page explaining the battery dynamics wouldn't have been technically hard to do.
That was fake, tho. They slowed down old iPhones to make you buy a new one. My iPhone 7 wasn't auto shutting down, battery health was good, but they still made it so slow it was unusable the same week they released the iPhone X.
There is literally a zero percent chance it was anything to do with batteries. This is not a conspiracy theory. It's an objective fact.
They didn't admit bad intent. They admitted to doing something with good intent (the slowing was to stop crashes with near-EOL batteries) but not being transparent about it.
I'd much rather us have progress and people with 8 year old phones suffer than ensure that everything continues to run smoothly on any old device for eternity.
I would prefer to be told that my battery is weak so I could make a decision on if I want to replace the battery, replace the phone, live with the phone shutting down randomly when battery is low, or continue with a slower phone. That's just me.
Apple absolutely effed up by not communicating the specifics well, but that’s corporate policy. Apple docs have always been targeted at the non-technical user and therefore inadequate for others.
But this one is true. Apple obviously puts out slowdown updates right as they release a new phone. They made my iPhone 7 unusable the same week they released the iPhone X.
I don’t think your overall take is wrong (it’s about money), but maybe the simplicity of it is.
Reality is that designers, product managers, engineers — they all wanna build cool things, get promoted, make money etc.
You don’t do that by shipping plain designs, no matter how tried and true. The pressure to create something new and interesting is ever present. And look we have these powerful Apple silicon chips that can capably render these neat effects.
So no I don’t think it’s a shadowy conspiracy to come after your iPhone 8. Just the regular pressure of everyday men and women to build new and interesting things that will bring success.
In the late 90s/early 2000s, desktop computing was moving at such a pace that an 8-year-old PC was near unusable. Over time progress slowed, and it's not unusual to have a decade-old desktop now. The problem is thinking that mobile has slowed that much too. Mobile is still progressing quite rapidly, so yeah, an almost decade-old device is going to feel slow.
You have what, an iPhone 6? 1GB of RAM vs 8GB for modern devices; the first A chip came out two generations after yours and has 2% of the power of a current chip, so modern chips are likely close to 100x as powerful as your phone.
Why should we hold back software to support extreme outliers like you?
> Why should we hold back software to support extreme outliers like you?
What are apps and mobile sites doing differently today besides loading up unnecessary animations and user tracking? How has user experience improved for those operating on devices fast enough to make up for developer laziness?
If I want to play games, I will buy the latest iPhone.
If I want a smartphone with a couple of simple, primitive apps that just send JSON and call REST APIs in the cloud, I don't want to be forced to shell out $1500 every couple of years.
Yes, everything has a lifetime, 10 years is a very good run for a complex piece of technology you can carry in your pocket. Send it in for recycling.
So that we can have better features and functionality in our future systems. Backwards compatibility is an anchor. If you want new things, then expect to get new platforms to run them on; don't expect everyone to limit their possibilities to support you.
The vast majority of things don't get recycled properly.
We are not talking about new features. Of course no one expects to run an LLM on a ten-year-old phone; again, we are talking about fashion. It is change for change's sake. It is not providing value to users; it is so that the designer gets to eat and management and shareholders are kept happy.
There is a difference between actual technical progress and you throwing out your skinny jeans because baggy pants are now in fashion.
Why shouldn't we build phones that last ten years, twenty years, or even more?
Windows 10 keeps telling me I need to buy a new desktop in October. I don't remember when I bought it, but it runs fine for everything I do. I've been running Linux for ages on my laptops; I'll be upgrading my desktop to Linux too!
Windows 10 is EOL. As a fellow internet user I'm glad Microsoft is taking a harder line these days on people running EOL software. The internet has a history of being swamped by people running EOL versions of Windows full of security issues causing problems for everyone else.
No one is holding back software. You're not running a local LLM or anything useful; you're adding performance cost merely for displaying icons on screen.
No one is holding back software right now because no one is being made to support old devices. If we were forced to support decade-plus-old devices, though, software would for sure be held back.
Laggards cost society by running insecure devices that generally impact the rest of the world, on top of complaining that no one continues to support their devices long after their useful life.
> Laggards cost society by running insecure devices that generally impact the rest of the world
Maybe there's also a cost to updating phones as frequently as people do, and inefficient software running across billions of devices.
I wouldn't blame people who make their hardware last longer and call them "laggards". And it's not their responsibility to write security patches for their device, that falls on the manufacturer.
For these people, me included, they don't need the latest hardware features to ray trace a game or run some local LLM. We're just taking some photos, making calls, getting map navigation, messaging, interacting with CRUD apps, and web browsing. None of that requires the latest hardware, and especially Apple hardware from 8 years ago is more than capable of handling it smoothly.
Ask anyone who had to deal with supporting IE back in the day what the cost to the world is of supporting tech laggards. They are an anchor on tech growth and a real issue.
If you're running an insecure device past its support life, it's your responsibility and your fault if it's used to attack others. You are fully to blame for choosing to use something past its service life. You cannot expect companies to support old software forever.
Currently replying from my iPhone 16 pro (granted, not old by any means) on the iOS 26 dev beta. MOST things actually feel smoother/snappier than iOS 18. Safari is a joy to use from a performance perspective.
It’s in beta so ofc I’m getting a ton of frame hitches, overheating, etc. but my summarized initial thoughts are “it’ll take some getting used to, but it feels pretty fast”
> MOST things actually feel smoother/snappier than iOS 18
I have a feeling the whole smooth animations thing contributes to this a lot. Obsessing about the reaction time and feeling of how stuff comes on the screen. But yeah iPhone 16 pro is probably a bad performance test case
After using it for a couple more days, battery life hasn't really changed from 18. I'm tempted to say that it's better but I don't want to make any claims before I actually track battery life across a week and compare it to my battery life pre-update.
The overheating is a common occurrence, but it doesn't persist. It seems to be certain things (setting the animated backgrounds in iMessage is a good example), but the moment I'm not doing one of those things the temp feels fine. My battery does drop a percent or two during those cases (which sucks), but my typical use of the phone hasn't yielded any noticeable battery life loss compared to 18.5
There's a difference between something like a transparent background (you can run i3/picom on a potato) and having to composite many little UI elements to render a frame.
I can think of a couple of creative ways to dramatically optimize rendering of these effects. There is probably quite some batching and reordering possible without affecting correctness.
Ceteris paribus your performance is always going to be substantially worse even with tons of fancy tricks. Those also get much harder to implement when you're building a complete UI toolkit that has to support a ton of stuff rather than just writing first-party apps/OS components.
I think that the batching that I have in mind would work especially well with complex layouts. The thing to realize is that even if you have tons of elements on a screen, their visual components aren't actually stacked deeply in most cases and the type and order of applied effects is quite similar for large groups of elements. This allows for pretty effective per-level batching in hierarchies, even if elements don't have the same parents.
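Roughly the grouping I mean, as a toy sketch (the Layer fields and the idea of one pass per (depth, effect) bucket are my own assumptions about how a compositor could batch this, not how Apple's actually works):

    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass
    class Layer:
        depth: int      # depth in the element tree
        effect: str     # e.g. "opaque", "blur", "glass"
        quad: tuple     # screen-space rectangle (x, y, w, h)

    def batch_draw_order(layers):
        """Group layers by (tree depth, effect) so one blur/glass pass can be
        shared across many siblings instead of running once per element."""
        batches = defaultdict(list)
        for layer in layers:
            batches[(layer.depth, layer.effect)].append(layer.quad)
        # composite back-to-front by depth; within a level, one pass per effect
        for (depth, effect), quads in sorted(batches.items()):
            yield depth, effect, quads   # e.g. bind the glass shader once, submit all quads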
Right. My point is the response to this is "well if we optimize it more we'll improve performance", but oftentimes if you optimized the existing code you would also improve performance. Your end state is still worse.
Is it really worse if the GPU spends maybe 0.5ms more per frame on these effects? I'd be surprised if a good implementation adds much more to the per frame rendering time.
> At a previous company, we were forbidden from using translucency (with a few exceptions) because of the performance cost of blending.
I imagine this was on mobile devices.
Blending was relatively expensive on GPUs from Imagination Technologies and their derivatives, including all Apple GPUs. This is because these GPUs had relatively weak shader processors and relied instead on dedicated hardware to sort geometry so that the shader processor had to do less work than on a traditional GPU.
Other GPUs vendors rely more on beefier shader processors and less on sorting geometry (e.g. Hierarchical-Z). This turned out to be a better approach in the long term, especially once game engines started relying on deferred shading anyway, which is in essence a software-based approach that sorts geometry first before computing the final pixel colors.
Interestingly, in iOS 18, suppressing transparency (there’s a setting for it) makes performance worse, not better. The UI lags significantly more with transparency disabled. I expect it will be the same with iOS 26: there will be a setting to reduce the transparency (which I find highly distracting) but it will actually make performance worse…
Thanks for this insight. It's very counter intuitive. Normally transparency is additional work for a GPU.
I had "Reduce Transparency" check-box in settings turned on because I distaste semit-transparent interfaces. Was not noticing performance problems except one application - Ogranics Maps which were unusably slow after switching to another app and returning to maps so I had to restart it freqently (swipe up). I was thinking that the problem is with Ogranics Maps code.
After seeing this comment re-enabled transparency (iOS default) and Ogranics Maps working fast even if I switch between Organic Maps and other apps!
My phone is always in power save mode. Re-enabling transparency actually made the UI less jerky. It was mostly the keyboard that became unresponsive: I could type 15-20 letters while it froze, and it would then "catch up".
Re-enabling transparency improved this a lot, though the keyboard still hangs a bit from time to time. I'm always in power save mode, on an iPhone 12, running iOS 18.
> This is also likely a performance nightmare. Funny that they mention that "new hardware has enabled us to..." which means that this will perform poorly on old devices.
Not sure if it is planned obsolescence but it certainly is an upsell to upgrade.
Translucency being a main feature of Mac OS X is decades old at this point. I remember a magazine article touting it as an advantage over the upcoming release of Windows XP!
> At a previous company, we were forbidden from using translucency (with a few exceptions) because of the performance cost of blending. There are debugging tools we'd use fairly often to confirm that all layers were opaque.
I feel like a few years back, when I still used an Intel MacBook, I noticed an increase in battery life and fewer dropped frames (like during 'Exposé' animations) by disabling transparency in Accessibility settings.
Yes, but I think it’s about giving the consumer “more” so that the upgrade train doesn’t stall out and stop moving. They need everyone upgrading iPhones every 3 years, and people won’t do that for just an abstract “it’s faster.”
Meh, Vista laptops could run lots of translucency fine (well, as long as they were actually Vista-era laptops and not just XP-era laptops with Vista installed).
You just proved that MSFT released a slow OS to force people to refresh hardware.
Plus, Vista was released in 2007 and XP SP2 (the most popular version) in 2004, so it's only about a 3-year difference. It's not like hardware progressed that much in 3 years; it's more that the new software got significantly slower.
I don't think upgrading was the reason for Vista performance. MS wasn't in the hardware business back then (and is just a marginal player even today).
They WAY overreached in their goals with Longhorn. When they finally decided to cut back features to something actually attainable, they didn't have enough time to make a high-performance OS.
Windows 7 was a well-loved rebrand of what was essentially just a Windows Vista service pack and improved performance (though it was still too heavy for a lot of the older machines people tried to upgrade to Vista). If they'd have cut back on their goals earlier, Windows 7 is likely a lot closer to what would have shipped as Vista.