
Rather than sending the full dataset (which could be fine if it's small), just do server-side filtering. And server-side rendering, for the most part.
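Roughly, this is all I mean by server-side filtering; the `products` table, its columns, and the node-postgres client are placeholders for whatever stack you already have:

```typescript
// Sketch of server-side filtering: the database returns only the rows that match,
// so the client never downloads the full dataset. Table and columns are hypothetical.
import { Pool } from "pg";

const db = new Pool(); // connection settings come from the environment

export async function searchProducts(maxPrice: number, page: number, pageSize = 20) {
  // WHERE does the filtering, LIMIT/OFFSET does the pagination, all on the server.
  const { rows } = await db.query(
    `SELECT id, name, price
       FROM products
      WHERE price <= $1
      ORDER BY price
      LIMIT $2 OFFSET $3`,
    [maxPrice, pageSize, (page - 1) * pageSize]
  );
  return rows;
}
```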

It drives me nuts how much completely useless JS is written. 90% or more of web apps could easily be server-side rendered with maybe a few small scripts added.

Instead we have this shit: people writing hundreds or even thousands of lines of JS to build a dumb-ass auto-refreshing search/filter page whose only purpose is to fire off a bunch of pointless requests and computation while I'm still in the middle of building my query.

Just have the filters be a form that the search button submits. The backend executes the query, builds the page, and returns it. Easy peasy: no pointless requests, no sending megabytes of pointless data with every request. No complex JS logic (and JS, by the way, is an absolute shit-tier language for anything beyond 100 LOC).
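To be concrete, the whole thing can be one handler and zero client-side JS. Express is just an example choice here, and `searchProducts` is the query helper sketched above; a real app would also HTML-escape anything it interpolates into the markup:

```typescript
// Sketch: filters are a plain GET form, the server runs the query and renders the page.
// Express is an arbitrary choice; searchProducts is the helper from the previous sketch.
import express from "express";
import { searchProducts } from "./searchProducts";

const app = express();

app.get("/search", async (req, res) => {
  // The form submits via GET, so the filters arrive as ordinary query parameters.
  const maxPrice = req.query.max_price ? Number(req.query.max_price) : 10000; // arbitrary default cap
  const page = Math.max(1, Number(req.query.page) || 1);

  const rows = await searchProducts(maxPrice, page);

  // Return a complete HTML page: the form, the results, and a link to the next page.
  // (Real code would HTML-escape the interpolated values.)
  res.send(`<!doctype html>
    <form method="GET" action="/search">
      <label>Max price <input type="number" name="max_price" value="${maxPrice}"></label>
      <button type="submit">Search</button>
    </form>
    <ul>${rows.map((r) => `<li>${r.name}: ${r.price}</li>`).join("")}</ul>
    <a href="/search?max_price=${maxPrice}&page=${page + 1}">Next page</a>`);
});

app.listen(3000);
```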

I may be biased by my personal experience, but I find that the vast majority of JS I'm forced to deal with shouldn't exist. It's either the result of bad application design, bad API design, JS that just does things HTML and CSS do better, or dumb-ass requirements like "we need this filter page to update the results every time the user does anything". No you don't, you just need a way to build a query and a way to submit it. And paginate it.

"But page loads take too long that's why we use SPAs" they take too long because you're sending megabytes of pointless JS that shouldn't exist. Remove the JS and they're fast. Plus SPAs are frequently slow as shit anyway, because most web developers just kind of suck and write shitty code. Like sending a huge Json document with every request rather than handling it in the backend.



But it isn’t huge. I just checked how much data my local PC store fetches for the first page of RAM, 18 items total. It does too much AJAX-in-JSON BS to measure exactly, but let’s assume each product takes 500 bytes, which is a generous estimate. That’s roughly 9 kB total. The first RAM image on that page is 8 kB, so the pictures on that page alone are roughly 16x bigger than the JSON. That means the JSON for the whole catalog, even if there were 18 pages of RAM (there were actually 35), is only about as big as the images from one or two pages. In other words, unless a user changes just one filter and then buys immediately, shipping the whole dataset is the more efficient option.
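Spelling the arithmetic out, with every figure being my rough estimate from that one store:

```typescript
// Back-of-envelope numbers for that RAM listing; all figures are rough estimates.
const itemsPerPage = 18;
const bytesPerItem = 500;     // generous guess for one product's JSON
const bytesPerImage = 8_000;  // the first RAM image on the page

const jsonPerPage = itemsPerPage * bytesPerItem;     // ~9 kB of JSON per page
const imagesPerPage = itemsPerPage * bytesPerImage;  // ~144 kB of images per page

console.log(imagesPerPage / jsonPerPage);            // 16: one page's images vs. its JSON

// The JSON for the ENTIRE catalog vs. the images of a single page:
console.log((18 * jsonPerPage) / imagesPerPage);     // ~1.1 if there were 18 pages
console.log((35 * jsonPerPage) / imagesPerPage);     // ~2.2 with the actual 35 pages
```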

As a consumer, this all feels like being caught between two fires. One side ships the stupidest UX possible when a single change could fix it, while the other claims it’s all BS anyway and we must go medieval. Can’t we just listen to the user for once?


I said:

> sending the full dataset (*which could be fine if it's small*)

In the specific case of search/filter pages, I prefer the server-side rendered experience as a user. If I just want to check the box for RAM and search instantly, I can do that. But if I want to build a more complex query, I don't need it to keep updating over and over.

I don't necessarily hate the auto-updating pages if they're implemented well. I just don't think they're any better, so why waste dev hours on them? It costs money to maintain that code, and the more of these pointless little JS applications you bake into your website, the more expensive it is to build and maintain.



