Hacker News
How we switched our template rendering engine to React (pinterest.com)
122 points by eldude on Nov 18, 2016 | hide | past | favorite | 39 comments


Best web app I've developed was basically Django, with the bare minimum of JavaScript to handle some in-page effects. People have been trying to shoehorn desktop concepts into the web forever, and it will never work. A desktop app is installable; a web site is not. It makes no sense to wait for your JS app to download for the sake of smooth page transitions.

Take a look at reddit's new mobile site, written in react. It takes around 7 seconds to load, to display text and links. Compare that to HN's, which loads in 0.1 sec, for basically the same content.

SPAs have their place, for things like games or the likes of Google Maps. But for the rest, please respect the web paradigm. Your site will be simpler to build, faster, and most importantly, lighter. The web world has become mobile-first, not JS-centric...


I don't agree with this logic one bit. What you seem to be experiencing is SPAs that could use improvement. There is no reason why a SPA is required to load the entire app at once. There are many ways to split the code so that each page only loads the bare minimum of what it needs. It is entirely possible to provide a better user experience on mobile devices when the app only has to download the JSON required to render the next page. There are many tricks that take advantage of this: the SPA can load the bare minimum to render the page requested and then start preloading the rest of the app, or leave it on demand.
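The per-route loading described above can be sketched roughly like this. This is a hypothetical route loader, not any particular framework's API; in a real bundler setup each loader would be a dynamic `import("./page")` producing its own downloadable chunk:

```typescript
// Hypothetical sketch of per-route code splitting: each route maps to a
// loader that fetches its chunk on first visit.
type PageModule = { render: () => string };

const routes: Record<string, () => Promise<PageModule>> = {
  "/home": async () => ({ render: () => "home page" }),
  "/settings": async () => ({ render: () => "settings page" }),
};

const loadedChunks = new Map<string, PageModule>();

async function navigate(path: string): Promise<string> {
  // Download the chunk only the first time this route is visited;
  // later visits render from the already-loaded module.
  if (!loadedChunks.has(path)) {
    loadedChunks.set(path, await routes[path]());
  }
  return loadedChunks.get(path)!.render();
}
```

The point of the sketch: only the route actually requested costs download time, and everything else can be preloaded in the background or fetched on demand.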

While I agree that there are many situations where a straight port of a static site to a SPA doesn't give much benefit, it sets those sites up to move to more advanced interactions going forward.

The web, although maybe not originally intended for this, has become the most powerful software platform we have. Things are getting dramatically better, and the rate of innovation isn't showing any signs of slowing down. It is already amazing and it's going to get better.


I honestly feel the same. I've been working as a web developer for 8 years now, so it's not like I've been locked in a basement for decades and dislike all this newfangled stuff because it's not what I'm used to. I just don't get what SPAs provide over server-rendered pages with a sprinkling of JavaScript to make things more interactive.

Ok maybe it makes sense if you are building a game. Or something that needs to run offline. Or Facebook. But most web sites aren't like that.

The other day there was an Ask HN about what stack a CMS should be built on. Most people were recommending their favourite client-side JavaScript framework. I just don't get it...


A CMS seems to me like exactly the type of thing where a SPA is a large UX improvement. It's an app you'd frequently spend a large amount of time in, which easily offsets the few extra seconds of initial page load.


The web paradigm also supports accessibility for people with visual disabilities.


Single-page apps are part of the web paradigm. They benefit from this too.


As someone who spent the whole week migrating a static site to React, I hear you. I don't care if it means I have to learn another language or framework, just stop making SPAs for no reason!


You're betting against history.


SPA is mostly useful for reducing the cost of large application deployments. It improves the speed of the initial bootstrapping of the application: the index is delivered almost instantly, and depending on how the code is loaded asynchronously, the rest can be dramatically sped up. A SPA also reduces overall server rendering to JSON results, which compress well, and JSON is smaller than HTML markup. And a SPA allows for a more sophisticated user experience.

Like all things: the original cars got people from A to B, but not in style or comfort. As technologies improve over time, so will the reasons for using a SPA. Browsers will eventually allow offline browsing as a standard feature, and having the SPA's JS installed and its JSON written to localStorage/IndexedDB will make that UI rather nice to use, instead of always getting an error when you have no internet connection. SPA is about control; server rendering requires connectivity and makes it difficult to handle failures gracefully.

The first major SPAs were email clients; consider Google's Gmail or Microsoft's web Outlook service for Exchange servers. SPAs are a huge benefit there, because the UI is complex and should not be constantly refreshing, which can cause contextual changes to what is displayed and reset in-app state.
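The offline idea can be sketched like this. This is a hedged illustration, not a real offline implementation: a `Map` stands in for localStorage/IndexedDB, and the fetch function is injected so the sketch stays self-contained.

```typescript
// Keep the last successful JSON response cached so the UI can render
// stale data when the network fails, instead of showing an error page.
const responseCache = new Map<string, string>();

async function fetchWithOfflineFallback(
  url: string,
  doFetch: (u: string) => Promise<string>, // injected; fails when offline
): Promise<string> {
  try {
    const body = await doFetch(url);
    responseCache.set(url, body); // refresh the cache on success
    return body;
  } catch {
    const stale = responseCache.get(url);
    if (stale !== undefined) return stale; // offline: serve the stale copy
    throw new Error("offline and nothing cached for " + url);
  }
}
```

In a real app the cache would persist across sessions (localStorage/IndexedDB) and probably carry a timestamp so the UI can flag the data as stale.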


This is totally anecdotal, but my wife and my mom both commented to me recently on how Pinterest's latest UI changes have really slowed the site down in their experience. I wonder if React is partially at fault there?


Would have liked to have seen some figures or graphs visualising the React performance boost.

Also I thought one of the advantages of using a js framework is that rendering can be done on the client? It sounds from this article that server rendering is more advantageous?

Also SEO was mentioned, but I thought javascript was taken into consideration by web crawlers?


>It sounds from this article that server rendering is more advantageous?

Rendering on the server feels faster to the user. With client side rendering, you will often get a skeleton of the site that loads, and then the content that fills that skeleton 100ms or so apart from one another. This sounds like nothing, but it is perceptible to the user.

There are absolutely ways around this, but even the Big Guys get it wrong sometimes (for instance: facebook).


Initial page load feels faster when rendered on the server. The very moment you navigate to another part of the site, client-side rendering is a huge win for apparent load times. Not only can you give the user feedback that something's loading, but you only need to retrieve the delta from the server between the page you're on now and the one you're going to. Within a few clicks, the amortized cost of the SPA in terms of both latency and over-the-wire size is already lower.


> Also I thought one of the advantages of using a js framework is that rendering can be done on the client? It sounds from this article that server rendering is more advantageous?

Server rendering has advantages when first loading the SPA from the server: a better user experience (faster loading, because you don't need to wait for AJAX calls to make their round trips between browser and server to fetch the data that populates the page) and better SEO (you don't need to rely on the search engine to correctly render the whole page).

With state changes after the initial load, though, client rendering is much better for UX (responsiveness). The whole point of "universal javascript" is that you can get both (initial server rendering and subsequent client rendering) with the same code base.


Can't one populate the initial state from the server to skip the extra round trip of the AJAX call? E.g. dump a JSON blob somewhere on the page and have React load it from there, like a global JS variable.
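That pattern can be sketched like this. The helper name and `window.__INITIAL_STATE__` are illustrative conventions, not any framework's API; the one real subtlety is escaping `<` so user-controlled data can't close the script element early:

```typescript
// Hypothetical server-side helper: serialize the initial state into an
// inline <script> tag embedded in the HTML, so the client reads
// window.__INITIAL_STATE__ instead of making an extra AJAX round trip.
function embedInitialState(state: unknown): string {
  // Escape "<" so a value like "</script>" can't break out of the tag.
  const json = JSON.stringify(state).replace(/</g, "\\u003c");
  return `<script>window.__INITIAL_STATE__ = ${json};</script>`;
}

// A hypothetical client bootstrap would then do something like:
//   const initialState = (window as any).__INITIAL_STATE__;
//   render(App, initialState);
```

The escaping step matters: without it, embedding user data this way is an XSS hole.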


I suspect that in most cases the time spent loading the initial data is dwarfed by downloading the initial app payload and turning the data into HTML.

Server-side rendering solves this by sending you a full 'screenshot' of the initial app, even before the app is downloaded and 'fresh' data is requested.

The clever thing React does in this use-case is that even though the initial load is 'finished' html, React won't have to re-render the entire thing once your app is loaded and requests new data. Instead, it will be able to seamlessly take over and update only what's new, and do its usual thing from there.

One issue you could run into is that users start trying to interact with your app before React has taken over. I'm not sure if that's a problem in practice though, perhaps others can chime in.


You can achieve that with almost any framework.


I never claimed otherwise, but yeah, it's not something unique to React.

Could you elaborate on how this works with other frameworks? Does it work out of the box in most cases, or require some plugin/module?


Yes, that's what's done, but you also want the page to be server-side rendered for SEO, and maybe for those who have JS disabled.


There's that, or use something like python-react[0]. No js on the server required ;)

[0] https://github.com/markfinger/python-react


Crawlers/indexers take JS into account to varying degrees. In my experience they will execute client side code, but AJAX calls are ignored. So at the very least you want to send down data for the initial load.


I've definitely seen Googlebot hitting the API endpoints of a site via AJAX calls. I can't say definitively how they treat that data, but they definitely call it.


Real-world experience with client-side rendering for SEO shows tons of downsides: inconsistency in indexing, delayed indexing (yes, Google), some (many) bots not doing it at all... etc.


It isn't solely server-side rendering: the power of React and Node is that you can render on both the server and the client, and get the best of both worlds.


Creating a universal JS app on the web has some extra cons too: not only is the code run on the server, but a not-insignificant JS payload is also downloaded and executed on the client, and app complexity increases in order to maintain client/server separation at extension points.

One should be cognizant that one doesn't get benefits for free with the choice to go with universal JS.


>as well as increased app complexity in order to maintain client/server separation at extension points.

Can you explain what you mean by this?

And are you comparing client-only vs universal, or client-only vs server-only?

Large JS can be solved with code-splitting. The necessary code is downloaded in chunks when needed.


Not OP but think about the window object in the browser. In a client-side only JS application, you can assume that window will always exist. In a server-side PHP application, you never have to worry about interacting with window. In a SSR React application, you have to manage that complexity of window existing sometimes and it not existing other times if you want to share components between the client and the server.
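The usual workaround for that complexity can be sketched like this (a hypothetical helper; `globalThis` is used so the snippet also compiles and runs on a server without the DOM type library):

```typescript
// Guard for a component shared between server and browser: on the
// server there is no window, so feature-detect before touching it.
function getViewportWidth(fallback = 1024): number {
  const win = (globalThis as any).window;
  if (typeof win === "undefined") {
    return fallback; // server render: assume a default width
  }
  return win.innerWidth; // browser: real viewport width
}
```

Every such branch is exactly the kind of extra complexity the parent comment is describing: the component now has two code paths to keep correct.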


So then there's:

- code-splitting needed for larger codebases

- taking the window object into account

Other stuff I can think of:

- more limited options for routing/etc. because you want to avoid having to maintain two codebases. This means you might have to use Redux+redux-router+whatever instead of express/hapi/etc. because you can't (?) run those frameworks client-side.

- less freedom to modularize state in your app, because last I checked it was still rather messy to ensure that all the separate data-loading events are taken care of server-side before everything is sent to the client. If you're a fan of the Redux approach this shouldn't be a problem though.

- Possible 'issues' with using ESNext features server-side without going through Babel.

All in all I think it's worthwhile and, nowadays, pretty doable to create a 'universal app'. But it has its own complexities and at least last time I tried, I could find very few examples of how to best do this, and we're probably still far away from a 'canonical' solution.

Furthermore, in practice I've found that in many cases it's not really worth the extra effort. If your app is large/complex enough to have a noticeable load-time, it's probably fine for the user to load the entire app first (it might even still be cached), and it's often the case that SEO is not really important.

In my experience, the few cases where a big-ish app needed to load quickly involved going straight to some subpage in the app. In that situation I often found it easier to just render that bit server-side the old-fashioned way, with its own logic (in which case React is still a good option).


The data is transmitted yes. In practice, after gzip, this is negligible compared to say... any average image loaded on that page.


> but I thought javascript was taken into consideration by web crawlers?

It's still kind of crappy. So people end up building more complex solutions like this, or using stuff like prerender.io to work around the issue. It just adds another layer of complexity to the already way too complicated stack of modern SPAs (in my opinion).


JavaScript taken into consideration by crawlers is still likely to be less efficient and less reliable than just giving the crawlers HTML they can understand. With universal JavaScript, the same code can render on both client and server, giving crawlers a server rendering of what a non-crawler will see with client rendering. (It's what I do; I find the benefit minimal, but there is a benefit.)


I saw a presentation about REST that included the concept of 'code on demand': why do we need to serve up all our React components in one JS blob? Can't we serve minified and compressed components preemptively on the wire to keep load times small? Would this offer a benefit, or am I missing something?


Congrats to the team!


Downvoted for positivity, damn HN. That's harsh.


More likely downvoted for not contributing to the discussion. While I'm sure it's appreciated, imagine comments like this being at the top of the comment thread. I don't think anyone reading HN goes to the comments to scroll through a dozen "Well done!" and "Congratulations!" comments before seeing comments with substance. It's probably safe to say people on this site go to the comments for extra information, insight, and different perspectives on the topic.


I wish there were a way to share a voice of appreciation for things like that. Sometimes I really appreciate what a comment or submission offered, but I'm not sure how to express it. Upvoting does that, but it's anonymous to the receiving user. Though on the other hand, I guess that's what keeps HN from becoming only about the 'likes'.


I absolutely share the sentiment, but like you said, not having this feature is the sacrifice to make in return for a high(-ish) quality discussion board. The other end of this spectrum would be reddit, where you need [SERIOUS] tags and heavy moderation, and still get inside jokes, memes and circlejerking for the sake of karma/likes.


From the hn guidelines: "Please resist commenting about being downvoted. It never does any good, and it makes boring reading."


Well it was that or just assume that HN comments are only for bitching about minor discrepancies in the article, nerdy holy wars, or for general negativity.

Of course that's the general impression outside HN so perhaps they're correct and I'm just fighting a lost cause.



