You don't even need a fancy "send html fragments over the wire" approach to create a better user and developer experience.
Just sending full pages, server side rendered, like Hacker News and Wikipedia do is fine:
Going from the HN homepage to this topic we are on:
36 KB in 5 requests.
Going from the Wikipedia Homepage to an article:
824 KB in 25 requests.
Going from the AirBnB homepage to an apartment listing:
11.4 MB in 265 requests.
Going from the Reddit homepage to a Reddit thread:
3.74 MB in 40 requests.
In comparison to AirBnB and Reddit, HN and Wikipedia feel blazingly fast. And I am sure the developer experience is an order of magnitude nicer as well.
Is that AirBnB stat including “content” stuff like images or embedded maps? It’s still worth critiquing the bandwidth spent on that stuff, of course, but it wouldn’t really be fair to include it here when the discussion is about the architecture of page transitions.
Page transitions are much faster for me on Wikipedia than on AirBnB.
Whatever I click on AirBnB, I get a white screen, then some placeholders, then the placeholders one after the other get replaced by the actual content.
Whatever I click on Wikipedia, it's just there instantly.
"Plain HTML" gmail vs. any other version. It's way faster to navigate.
Remember when AJAX was supposed to make pages faster? LOL. Of course, back then we were just sending HTML to be injected, not JSON that has to be processed by 50+ Javascript functions before finally becoming HTML (or, some DOM entities, anyway).
> "Plain HTML" gmail vs. any other version. It's way faster to navigate.
And offers far fewer features. Attaching a file to an email takes ages without drag and drop; this alone is worth waiting a few more milliseconds to most people (Gmail loads in less than 2 seconds on my crappy hotel wifi).
Bear in mind that serving Wikipedia pages (static text documents) is much simpler than AirBnB listings (dynamic image-heavy content).
It is possible to create a faster AirBnB by changing some of the underlying constraints imposed on the system (e.g. by agreeing to serve cached, stale listings, or by lowering the amount of telemetry), but apparently AirBnB management does not think the tradeoff is worth it.
Of course you are entitled to think that these decisions from AirBnB are wrong and that they are hurting their sales. But you know the saying: "The less you know about a problem, the more convinced you are that you have the right answer."
This disregards the business model behind the companies. There are no transactions for Wikipedia or Hacker News.
Both AirBnB and Reddit are incentivized to collect a massive amount of data on their users and to optimize the experience in any way possible. They need the data to understand user behavior and feed it back into the recommendation algorithms. For both hospitality and social media, users look for image/video content, since it is much more powerful than text for most people. All of their competitors have equally bad performance for similar reasons, so they don't lose a competitive edge.
Hacker News specifically goes for a retro look, with an interface that has not changed significantly in decades. Wikipedia simply doesn't have the money or engineering capacity to support a more robust platform like AirBnB or Reddit.
I am incredibly annoyed by slow websites. I noticed that I have started using my browser in a very asynchronous way where I open multiple tabs, do a single action, and move to the next one. When I get to my last tab, the first is hopefully ready.
For instance, on GCP, I:
0. Open 2 tabs for 2 Google Cloud instances
1. For each tab, I click on the action I want, e.g. Logs, Deployment, or Networking.
2. Then, for each tab again, I click on the nested action I want
3. Repeat step 2
It might sound like a lot, but each click easily takes 5-10 seconds, which is an excruciatingly long time if you spend a lot of the day on that site -- and yes, it takes just as long doing it sequentially. Furthermore, I usually use fairly capable machines, so performance should not be an issue IMHO.
It does not happen on all sites, and I know GCP is a very complex site, but it happens often enough that it has become a habit for me.
I love sites like Wikipedia or HN where everything is snappy.
"HTML over the wire" generally refers to tech like [0] LiveView, [1] Hotwire, [2] LiveView, [3] Blazor, etc. They aren't about about ditching JS and more about not writing your HTML in JS (and yes, SSR).
I’ve never tried it, but my understanding is that when JS is disabled, HN fires off a GET request (via a link) and returns a 204 to prevent browser navigation, right?
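If so, the server side would look roughly like this (a hypothetical Express sketch, not HN's actual code; the route and in-memory storage are made up):

    const express = require('express');
    const app = express();
    const votes = new Map(); // in-memory stand-in for real storage

    // Hypothetical no-JS vote endpoint. The vote arrives as a plain
    // <a href="/vote?id=123&how=up"> link click; responding with
    // 204 No Content tells the browser to stay on the current page.
    app.get('/vote', (req, res) => {
      votes.set(req.query.id, req.query.how);
      res.status(204).end();
    });

    app.listen(8080);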
Interestingly enough, although JavaScript gets used, the vote links don't trigger an XHR request. Instead, the JavaScript parses the vote URL from the vote button element, and the GET request is made by creating an Image object in JS.
You can see votes appear in your network tab under the Img tab instead of Fetch/XHR, even though the response type is text/html. An interesting little hack, although I'm not quite sure why it's done this way instead of fetching, since it already requires JS.
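Roughly this pattern (a sketch of the technique described, not HN's actual source; the .votelink selector is an assumption):

    document.querySelectorAll('a.votelink').forEach((link) => {
      link.addEventListener('click', (event) => {
        event.preventDefault();      // keep the browser from navigating
        new Image().src = link.href; // assigning src fires a GET; reply is ignored
      });
    });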
The vote/downvote buttons do not require JS. You can verify this in the same way, under the network tab: there is a blocked JS request, but the vote succeeds even though the JS is not allowed to run.
AJAX stands for Asynchronous JavaScript And XML--it was originally designed for asynchronously manipulating XML data using JavaScript because at the time, the future was XML. The act of using JavaScript to send requests is still sometimes referred to as AJAX, despite usually not dealing with XML in any way. The most common use for sending asynchronous requests is making API calls, which today generally return JSON, but you could use AJAX to load HTML and dynamically add it to the page.
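For the HTML flavor, the whole client side can be as small as this (a sketch; the endpoint and element id are made up):

    // Hypothetical endpoint and element id, for illustration only.
    fetch('/comments?page=2')
      .then((response) => response.text()) // the server returns an HTML fragment
      .then((html) => {
        document.querySelector('#comments').insertAdjacentHTML('beforeend', html);
      });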
The person I replied to said "we don't need any of this fancy hotloading" and then gave examples of fancy hotloading, hence me asking, and pointing out the contradiction.
Like you said, AJAX isn't limited to XML, and really just refers to any use of XMLHttpRequest to hotload data, whether it's XML, JSON, etc.
I wasn't asking what AJAX was. I was asking if it falls into the definition of "fancy."
I think my parent proved the opposite of the point they meant to. Old Reddit, HN, and Wikipedia ARE the fancy HTML-over-the-wire sites, and AirBnB and new Reddit are NOT.
Refreshes can be very fast, and often essentially invisible in modern browsers, when you don't bloat the page unnecessarily, but I think AJAX still has a use for things like the voting button on HN or "add to cart" buttons in a web shop. It would be annoying to lose the scroll position (and perhaps other state) for each item you want to add/upvote.
(Although modern sites that try to sweep latency under the rug by immediately updating the state on the client side and later reverting the change when the server responds are perhaps more annoying.)
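That optimistic-update pattern looks roughly like this (a sketch; the data-vote-url attribute is made up):

    // Optimistic update: flip the UI immediately, roll back on failure.
    async function upvote(button) {
      button.classList.add('voted'); // instant feedback, before the server replies
      try {
        const res = await fetch(button.dataset.voteUrl, { method: 'POST' });
        if (!res.ok) throw new Error('vote rejected');
      } catch (err) {
        // The annoying part: the UI silently reverts, sometimes seconds later.
        button.classList.remove('voted');
      }
    }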
"... where the browser waits briefly before starting to paint, especially if the page is fast enough. This ensures that the page renders as a whole delivering a truly instant experience."
This is why you don't see flashes of a blank white page while a new page is loading on most sites these days.
Also, browser refreshes should definitely not lose scroll position, though I can understand how, with full ones, the former position might not be the "correct" one any more with dynamic content like HN... or Substack comments (and how did those manage to be so slow and buggy displaying what is almost exclusively text?!)
Not a refresh of the same page, but if you perform some action (e.g. adding an item to a shopping cart), that would imply a POST request which then (according to the common pattern) redirects to the original page... so then you do lose the original scroll position, I think?
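That's the classic POST/redirect/GET pattern. One mitigation is to redirect back to an anchor near the item, roughly like this (a hypothetical Express sketch; the routes and cart storage are made up):

    const express = require('express');
    const app = express();
    app.use(express.urlencoded({ extended: false })); // parse form POSTs
    const cart = []; // in-memory stand-in for real cart storage

    app.post('/cart/add', (req, res) => {
      cart.push(req.body.itemId);
      // 303 makes the browser follow up with a GET (so a refresh won't
      // re-POST), and the #fragment lands the user back near the item,
      // which partially restores the lost scroll position.
      res.redirect(303, '/shop#item-' + req.body.itemId);
    });

    app.listen(8080);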
Latency between the page going blank and FCP, plus poor recovery mechanisms. A significant number of users are still on occasionally bad connections (e.g. on a train, or walking into a part of the building with poor signal). If an interaction in an SPA does not work, they can try again, because they still have the UI. If the page goes blank and does not load, not every user will recover from this.
I'm not convinced that it is... but it's not as smooth as client load-once rendering can be.
What is/was painful was, as an example, the first versions of ASP.NET Web Forms, where every client action was a full server round-trip... Server-rendered Blazor over the wire (not a local or in-office server) is roughly as painful by today's standards.
I do think that it's a mix... start with basic html and enhance, or start with a component toolkit with client rendering, and inject server donut or other pre-rendering. With Cloudflare and Deno, I think the latter definitely has some traction.
If you have a complex user interface with the possibility that many parts might be in use at any given moment, full page refreshes mean your users could lose work.
This would definitely be a bad user experience.
But it doesn't mean refreshes are bad, just that the architect of that application hasn't factored them into the user experience appropriately, or they've made their interface unnecessarily complex.
All modern browsers have enough tools baked in that we can work around this without reinventing the execution and rendering pipeline (e.g. localStorage).
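For example, persisting a draft across full refreshes is a few lines (a sketch; the storage key and textarea id are assumptions):

    // Restore any saved draft on load, and save on every keystroke.
    const draft = document.querySelector('#comment');
    draft.value = localStorage.getItem('comment-draft') || '';
    draft.addEventListener('input', () => {
      localStorage.setItem('comment-draft', draft.value);
    });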
If the page is large and you only need to bring back a little data, it's probably better to just grab the little data. Or an upvote, where you're just sending a little bit of info back to the server.
What's funny is that the old Reddit makes use of dynamic loading of content here and there the right way. The "load more comments" of the old Reddit is nice and fast. While the "continue this thread" of the new Reddit is annoying and slow.
Frontend frameworks developed in response to the need to develop/maintain very complex browser-based applications like Facebook across a large team of developers. AirBnB/Reddit have dramatically more complex user behavior and are much more input- and media-oriented. You cannot wait on round trips to the server for user interactions, and you cannot just do hybrid client/server the old jQuery way on big applications because you get spaghetti code.
Comparing Reddit and Wikipedia is like comparing a motorcycle to a bicycle. They are different, with specific strengths and weaknesses. Both design/engineering paradigms have a place, imo.
> You cannot wait on round trips to the server for user interactions
Old Reddit still works fine, and I’d argue better than the new front-end-heavy Reddit. You are right that it may be for the benefit of developers, but Reddit isn’t complicated enough that a heavy front end provides much benefit to the users.
Old Reddit is significantly faster than new Reddit and I found it far nicer to use, but people in general didn’t like the “old, clunky” UI. Old Reddit was much maligned for how it looked compared to modern sites.
I do much prefer the look of new Reddit, but the performance is painful. I don't see why you couldn't have a nicer looking UI, but with the performance of old Reddit.
Yeah, I cannot even begin to imagine what it would be like to develop AirBnb without some sort of front end framework. I used to write pure JS and jquery back in the day, and it was super easy to end up with a rat’s nest of spaghetti code. And those apps were several orders of magnitude less complex!
Too much of the web is developed on fast machines with fast internet and high quality monitors.
I spend a lot of time in trains, hotels and airports, and I carry a 2017 12" MacBook. It strongly influences how I build websites. It's infuriating to wait for pure content to load because it's tied to a bunch of crap that isn't what you want to read.
In that context, developer docs are often the best kind of websites.
I did a similar test on news websites at https://legiblenews.com/speed about 6 months ago that made the rounds here. It’s insane how many MB some news sites push over the wire and the number of requests they make from ads and other JS “features” that should be a few hundred KB.