Sorry :D nothing wrong with server-rendered HTML and forms for many use cases. Unfortunately, given the limits of connectivity, if you want to build something that is interactive, data-driven, and responsive all at once, you will always be forced to render on the client side at some point.
> if you want to build something that is interactive, data-driven, and responsive at the same time you will always be forced to render on the client-side
I'm not sure I personally agree with this blanket statement. Connection times can be, for a vast majority of users, measured in tens of milliseconds, giving us a pretty good budget for processing and rendering while still appearing to be "instantaneous" (100ms) or "fluid" (1s). Even the worst case connection times (satellite & mobile) are still measurable using hundreds of milliseconds.
The worst ranked countries still average in excess of 1.5 Mbps transmission rates - more than enough for compressed text.
And, given how unresponsive so many "top 100" SPA pages are (such as blogs that take whole seconds to display their initial content), I can't agree that doing that processing on the server would actually be less interactive or responsive.
Even the test application from this article can load from scratch in under a second, most of that time being the DNS resolution and the server processing for serving the page.
I'm not an expert but I'm certain we aren't even close to providing this level of connectivity. Throttled data, sucky hotel wi-fi, underground tunnels, bad service in rural areas of e.g. Germany etc. are real, unfortunately.
You cannot hope to get enough uptime and guaranteed response time ranges for an app's interactions from any server; you will always get inferior UX to client-side rendering (which, in a way, has 100% uptime and guaranteed response times only bound by CPU/RAM/bugs in your code).
It’s moderately ironic to talk about poor connections (largely due to cell phone situations) without discussing the lackluster processing, memory, and battery life available on those cell phones. Not to mention the fact that SPAs tend to be significantly more heavyweight in their initial download requirements than server-powered forms, taxing those poor connections before you even see a single line of text.
Again, the performance bar for SPAs has been set so low by top-100 sites (like Medium and other SPA blogs) that server-powered forms would have to work very hard to be worse.
As a side note, we can’t forget that even these “100% uptime” SPAs in the real world largely still rely on requests and responses from a server backend; still rely on prompt responses to their ‘XMLHttpRequest’ calls (hello, animated spinners!).
True, we should respect a device's resources wherever possible. Also true for desktop (Slack's memory usage comes to mind).
A major result of the study is greatly reduced bandwidth usage (and consequently, shorter parse times) compared to the original TeuxDeux, so I'm working towards respecting these resources, for what it's worth.
To be clear, the study does not care about doing SPAs or not. The results are applicable to server-rendered HTML as well (write a function that enables some behavior on an element, mount it by class name, done). I agree that many use cases (e.g. blogs) should mostly be server-rendered and only progressively enhanced with JS for some UX improvements.
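The "mount by class name" idea can be sketched in a few lines. This is a hypothetical illustration, not the study's actual code; `mount` and the class name are made-up names, and `root` is anything with a `querySelectorAll` method (in the browser, `document`):

```javascript
// Minimal progressive-enhancement sketch (illustrative, not the study's code):
// the HTML already exists (server-rendered); JS only wires up behavior.
function mount(root, className, behavior) {
  // root: document in the browser, or any stub exposing querySelectorAll
  for (const el of root.querySelectorAll('.' + className)) {
    behavior(el);
  }
}

// In the browser this might look like:
//   mount(document, 'collapsible', el => {
//     el.addEventListener('click', () => el.classList.toggle('open'));
//   });
```

No templating or virtual DOM involved; if JS never loads, the server-rendered page still works.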
But highly interactive apps (drag & drop just being one example) will not have comparable UX without client-side rendering. I do not want a 100-1000ms delay (or an error message) after dropping an item in some list because of a server-roundtrip. This is not good UX.
Also, when filling out a form, I do not want to lose data or context when clicking submit while I'm in a tunnel. I'd rather have a client-side rendered UI that keeps my context and tells me "Sorry, try again when you're connected again".
Even better if it works fully offline and syncs the transactions I've done with a backend once I'm connected again.
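The offline-then-sync idea could be sketched as a small queue of pending transactions, flushed once the connection returns. All names here are illustrative assumptions (the `send` function would wrap a real network call; storage could be `localStorage` or IndexedDB in practice):

```javascript
// Illustrative offline-first queue: user actions accumulate locally and are
// flushed to the backend when connectivity returns. The transport is injected
// as an async `send` function so the idea stays testable without a network.
class SyncQueue {
  constructor(send) {
    this.pending = [];   // transactions not yet acknowledged by the server
    this.send = send;    // async (item) => true on success, false if offline
  }

  push(item) {
    this.pending.push(item); // UI updates immediately; sync happens later
  }

  async flush() {
    while (this.pending.length > 0) {
      const ok = await this.send(this.pending[0]);
      if (!ok) return false;   // still offline: keep context, retry later
      this.pending.shift();    // acknowledged, safe to drop
    }
    return true;
  }
}
```

In a real app, `flush()` might be triggered by the browser's `online` event, and the queue persisted so it survives a reload.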
A small note: 100ms is effectively imperceptible to a human. Even 1s is acceptable in most cases. There’s an HCI study from the late ’60s that defines this.
WRT losing data when filling out a form, that hasn’t been an issue for years now (except when the “smart client renderer” decides that it is). Most browsers will not lose data in an interrupted form transmission.
Plus, most browsers (especially mobile ones) handle interruptions like a tunnel fairly well, waiting for the connection to return without losing data. And without having to think about that use case as a web developer.
It’s funny to me that you mentioned slack above; some of my favorite old-school chat experiences were server-side rendered pages. They worked remarkably well for the limitations they faced.
All this said, the proper compromise is probably doing both client- and server-side rendering. I’m reflexively against client-side rendering because of how it’s typically implemented: slow to download up front, each SPA creating its own interaction primitives, and, finally, SPA-level interactivity only rarely being required yet used everywhere (read-only SPAs are the worst).
Javascript - and client-side rendering - is the power hammer of the frontend development world.
Not sure about the interruption handling, maybe I need to do more research there.
But totally agree with the last parts - currently, typical SPA implementations are often misguided and create more problems than they solve (if any) compared to a server-side approach. That does not mean that pure server-side is always enough to provide good UX, especially with interactive/offline apps.
In the real world, my crusty old-fashioned drag-and-drop works far better on low/spotty connections than pretty much all SPAs (which are often only trying to display plain text). SPAs usually fail to display anything but a blank screen on a slow connection.
Losing form data hasn't been an issue in forever... The browser saves it when you go back or forward. You don't even need to press "back" - you can just refresh the error page and, as long as you click "yes" on the pop-up telling you you're resubmitting a POST, the form will submit with the original data entered, no problem... Of course, SPAs like to break this for no reason; stop doing that.
This is so true. If you want to feel again what's possible, make a vanilla website (without JS and with very few images) and do all form processing server side. You will be astonished how fast and actually reactive such a site can feel. Part of this feeling could be that animations (thinking of Google's popular Material Design) themselves take 10ms to 100ms. Within this duration, the browser can easily load a new page.
Completely agree. I still build websites with server-side rendering and just enough JS for Ajax, which changes innerHTML and occasionally adds an element. They are astonishingly fast, and they still function if the client has JS turned off.
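That "just enough JS" pattern can be sketched roughly like this. Assumptions here: the server renders an HTML fragment at some URL, and the fetch function is injectable (defaulting to the browser's `fetch`) purely so the sketch works without a real network; `refresh` and the URLs are made-up names:

```javascript
// Illustrative "just enough JS for Ajax" helper: fetch a server-rendered
// HTML fragment and swap it into an existing element via innerHTML.
async function refresh(el, url, fetchFn = globalThis.fetch) {
  const res = await fetchFn(url);
  if (!res.ok) throw new Error('HTTP ' + res.status);
  el.innerHTML = await res.text(); // fragment arrives pre-rendered
}

// In a page this might be:
//   refresh(document.querySelector('#todo-list'), '/todos/fragment');
```

The nice property: with JS disabled, the same URL (or a full-page variant of it) can be served as a normal navigation, so nothing breaks.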
I remain convinced that for 90% of websites, using React is shooting a sparrow with a cannon.
For what it's worth, the study was not targeting SPAs specifically. The patterns I found can very well be used to add some minimal behaviour to some HTML generated completely on the server side. I'm by no means advocating SPAs or client-side rendering for each and every use case.
Understood. I do appreciate the effort you put into this and the fact that you built it from scratch. FWIW your example has convinced me to explore client-side rendering now that I know it doesn't require a monster library.