Browsers and developers should try to do their best with limited resources. So in most cases that means making predictions about user behavior. Sometimes that means preloading content (e.g. if you're on a landing page, preloading content from the next likely page will make it feel much more performant) and sometimes lazy loading, so resources aren't wasted if a user isn't going to see the content in the first place.
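For the preloading side, browsers already expose hints for exactly this kind of prediction. A minimal sketch (the URL is just a placeholder):

```html
<!-- Hint that the likely next page is worth fetching during idle time;
     it's a guess, so a wrong prediction only costs some bandwidth. -->
<link rel="prefetch" href="/checkout.html">
```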
But of course there will be "bad branch predictions" sometimes. I think "having a page load before I get into the tunnel but not scroll down enough to have content load before I lose Internet connection" certainly seems like a case that it's fair not to optimize for.
Instead of all the complexity and code and branch prediction, how about just sending the page and the images? If lazy loading is super important, the browser can lazy load and render off screen. Don't make thousands of solutions for thousands of platforms on the server side.
Before the loading attribute on img, developers did have lots of custom solutions for lazy loading. A major point of lazy loading is to save resources on the server that would be wasted when serving an image that is never seen.
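The native attribute that replaced many of those custom solutions is a single hint on the img tag. A minimal sketch (the filename and alt text are placeholders):

```html
<!-- The browser defers fetching this image until it is near the viewport. -->
<img src="chart-large.png" alt="Quarterly traffic chart" loading="lazy" width="800" height="450">
```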
I like the loading attribute, but it wasn't supported in Safari until very recently, so the workaround has had to be a bespoke server + JavaScript solution to please Google. I still think a better approach is for the browser to optimize what is "below the fold" and make the decision based on user preference (I want to load the whole page). And most importantly, Google shouldn't weigh in on how to implement pages.
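The bespoke fallback usually looks something like this: keep the real URL in a data attribute and swap it in as the image approaches the viewport. A rough sketch, assuming a data-src convention (not any particular library's API):

```html
<img data-src="photo.jpg" alt="Photo of the venue" class="lazy">
<script>
  // Use the native attribute where it exists; otherwise observe scroll position.
  if ('loading' in HTMLImageElement.prototype) {
    document.querySelectorAll('img.lazy').forEach(img => {
      img.src = img.dataset.src;
      img.loading = 'lazy';
    });
  } else {
    const observer = new IntersectionObserver((entries, obs) => {
      entries.forEach(entry => {
        if (entry.isIntersecting) {
          entry.target.src = entry.target.dataset.src; // swap in the real URL
          obs.unobserve(entry.target);
        }
      });
    });
    document.querySelectorAll('img.lazy').forEach(img => observer.observe(img));
  }
</script>
```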
I agree, but changing the world to that would be quite a fuss. If that were already where we are, it would be better. But the JS-as-an-operating-system-per-site ship has sailed.
I agree. A simple way to solve it is to make the article readable without images. Good old alt tags! A data-science-heavy article may not benefit, but a dozen words can paint a picture.
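For example, a descriptive alt attribute keeps the point even when the image never loads (the wording here is just an illustration):

```html
<img src="growth-chart.png" loading="lazy"
     alt="Line chart: monthly signups roughly doubling from 2k in January to 15k in August">
```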