API requests at page-load are definitely going to lower the page speed score. Ideally, no API requests should happen at all, and all the script and CSS needed to render everything "above the fold" should be loaded in-line. Nothing that is visible "below the fold" should ever run or load until the page is scrolled down by the site visitor. Only the bare-minimum script parsing required for the content "above the fold" should happen. Sure, you can load scripts in-line for stuff below the fold, but make sure they don't actually get parsed by the browser until that feature is likely to be visible on the screen.
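Roughly the kind of thing I mean, as a minimal sketch (the `text/deferred-js` type, the element id, and the widget body are just placeholders for illustration, not anyone's actual production code):

```html
<section id="reviews-widget"><!-- SSR'd placeholder markup --></section>

<!-- The script text ships inline with the HTML, but the non-executable
     type keeps the browser from parsing or running it at page load. -->
<script type="text/deferred-js" data-target="#reviews-widget">
  // Stand-in for the heavy below-the-fold widget code.
  document.querySelector('#reviews-widget').textContent = 'widget is live';
</script>

<script>
  // Parse/execute each deferred script once its target section nears the viewport.
  const io = new IntersectionObserver((entries) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      io.unobserve(entry.target);
      const src = document.querySelector(
        'script[data-target="#' + entry.target.id + '"]'
      );
      if (!src) continue;
      const s = document.createElement('script');
      s.textContent = src.textContent; // parsing happens here, not at page load
      document.body.appendChild(s);
    }
  }, { rootMargin: '200px' }); // kick in a little before it's actually visible

  document.querySelectorAll('script[type="text/deferred-js"]').forEach((el) => {
    const target = document.querySelector(el.dataset.target);
    if (target) io.observe(target);
  });
</script>
```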
It doesn't matter what library or framework is used - even jQuery can score 100/100 on Google Lighthouse. It really just comes down to following every single nitpicky thing that Google Lighthouse recommends and finding a solution for it. I took a site (actually a few thousand sites) that was scoring about 5 to 35 out of 100 on Google Lighthouse (depending on the customizations) up to 100/100. It took a lot of work, but the clients are happy now.
Where React doesn't do well is with lots of DOM elements. In another project I created a complex web application that had about 3000 div elements, a drag-and-drop interface, and all kinds of stuff going on. Well, React couldn't handle it: the browser started crashing due to memory issues if the page was open longer than 20 minutes or so, and it slowed to a crawl and eventually froze. I ended up switching that one component to a canvas-based solution (Konva) and it solved the problem completely. I still use React for all the other simple UI stuff, but I learned my lesson about what React is really good at and what it's not.
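For anyone wondering what the canvas switch looks like in practice, here's a rough sketch with Konva (this isn't the actual project code; the container id, shape count, and styling are made up):

```js
import Konva from 'konva';

// One canvas stage replaces the thousands of individual divs.
// Assumes a <div id="board"></div> exists in the page.
const stage = new Konva.Stage({
  container: 'board',
  width: window.innerWidth,
  height: window.innerHeight,
});

const layer = new Konva.Layer();
stage.add(layer);

// Thousands of draggable shapes drawn on a single canvas layer;
// Konva handles the drag hit-testing, so there are no per-item DOM nodes.
for (let i = 0; i < 3000; i++) {
  layer.add(new Konva.Rect({
    x: Math.random() * stage.width(),
    y: Math.random() * stage.height(),
    width: 20,
    height: 20,
    fill: '#4a90d9',
    draggable: true,
  }));
}

layer.draw();
```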
> Nothing that is visible "below the fold" should ever run or load until the page is scrolled down by the site visitor.
On the other extreme, completely deferring any loading of 'below-the-fold' content until it's visible can also have horrendous consequences, if that loading involves downloading any external resources. Not every visitor can just make further requests near-instantly, and it's those RTTs that really slow pages down, in my experience. Excess API calls are just one (very common) source of excess RTTs.
The obvious compromise would be to load 'below-the-fold' content ASAP once all 'above-the-fold' content is finished, unless it's unacceptably heavy for the target device. (Then again, some people still won't be happy: I recently talked to one person who derided the load times of a certain blog, but I found that all of its meaningful content was loaded and visible very quickly: they were only taking issue with the Disqus comment widget at the bottom of the page.)
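As a purely illustrative sketch of that compromise (the endpoint and render function below are made up, not from any of the sites being discussed): kick off the below-the-fold work once the initial load has finished, using idle time if the browser offers it, rather than waiting for a scroll event.

```js
// Start fetching below-the-fold data as soon as the above-the-fold work
// has finished, instead of waiting for the visitor to scroll.
window.addEventListener('load', () => {
  const warmUp = () => {
    fetch('/api/below-fold-items')              // hypothetical endpoint
      .then((res) => res.json())
      .then((items) => renderBelowFold(items)); // hypothetical renderer
  };
  if ('requestIdleCallback' in window) {
    requestIdleCallback(warmUp); // use idle time when available
  } else {
    setTimeout(warmUp, 0);
  }
});
```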
> Not every visitor can just make further requests near-instantly
We use a "loading" spinner. People know they have a shitty computer, and they'll wait an extra second instead of spending $1000 on a new computer. That kind of page speed problem isn't one we could ever fix, and it's something the user is typically well aware of being on their end, because every site is slow, not just ours.
> completely deferring any loading of 'below-the-fold' content until it's visible can also have horrendous consequences
We use a combination of lazy loading and SSR (not React SSR, it's hosted on a custom CMS). Content is server-side rendered to maximize SEO. In some places we'll only render the first 3 or 4 items and then lazy-load the rest if it's a lot of data, because too many DOM elements at page load is also bad for the page speed score.

JavaScript is lazy-parsed, meaning the text of the script can be loaded in-line but not parsed until the page is scrolled and the content is in view. Loading the text of a script isn't what slows the page down; it's the parsing of larger scripts that causes page speed issues. When the page scripts can be parsed in smaller chunks, the scrolling experience is more fluid and the page speed test is satisfied.

All images below the fold are lazy loaded. All 3rd-party widgets are lazy loaded where possible, because they all suck and they mess with page speed pretty badly. There are lots of other tricks to get to a perfect 100 score on Google Lighthouse. Fortunately for us, "below the fold" is generally easy to handle across all of our sites, by the nature of the type of sites we are building. It doesn't work for all kinds of sites, but for ours it works very well.
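To give a rough idea of the image/widget side (illustrative only; the widget URL and element id aren't real, and this isn't our actual CMS output):

```html
<!-- Below-the-fold images use the browser's native lazy loading. -->
<img src="/img/item-4.jpg" loading="lazy" width="640" height="360" alt="Item 4">

<!-- Third-party widget: only inject its embed script once the slot nears the viewport. -->
<div id="comments-slot"></div>
<script>
  new IntersectionObserver((entries, obs) => {
    if (!entries[0].isIntersecting) return;
    obs.disconnect();
    const s = document.createElement('script');
    s.src = 'https://example-widget.invalid/embed.js'; // placeholder URL
    s.async = true;
    document.body.appendChild(s);
  }, { rootMargin: '300px' }).observe(document.querySelector('#comments-slot'));
</script>
```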
You obviously didn't read what I wrote or understand it. I specifically said this:
>Sure, you can load scripts in-line for stuff below the fold, but make sure they don't actually get parsed by the browser until that feature is likely to be visible on the screen.
I specifically said you could load a script in-line for stuff "below the fold" as long as the browser doesn't parse it until it's used. That's very different than doing an HTTP request for a script file while scrolling.
But you know what? Forget it. I'm done trying to explain things to people who think they already know it all and didn't even understand my original comment.
> users can wait an extra second [every time they scroll to load new content]
which is exactly the bad experience the commenters above are talking about.
They understood your comment, and they disagree. This is not the good advice you think it is, unless your main goal is to score 100 on lighthouse for SEO purposes, not UX.
The loading spinner is specifically for people with shitty internet connections when loading dynamic data after the initial page load. And you're completely misunderstanding practically everything I wrote and replacing what I wrote with your assumptions. Go ahead, it's the internet, bash away all you want. But I know what I did, I know it works, and I know it's not janky at all - it's your assumptions that are wrong. The advice is good, your understanding of it is not. You don't need to reply, I won't be trying to explain any of this any further just so you can misunderstand everything I wrote, again.
People scroll pages to skim. This also sounds like it might break CTRL-F.
If I can't skim your page instantly I will more than likely churn my visit.
Doesn't matter that I have a good computer on a 1Gbps connection, you ruined my experience. I'd rather wait 1 second for the full page to load than wait through a series of 100ms delays, on what should have been a fully loaded page, for content to actually load at arbitrary points in time.
Maybe you missed the part where I said we're using SSR for content? That solves the CTRL-F problem easily.
You (and a lot of others here) are making a ton of wrong assumptions, imagining things I never said, and making up your own problems that don't exist in my code just to try to bash me, without even really understanding anything that I wrote in my comment. This entire thread sucks and is full of low-quality trolls. I've been doing front-end for ~30 years, I know what I'm doing. Don't bother replying, I won't be responding to further wrong assumptions and bashing.
How do you know that was a React issue and not a memory leak, or an error in your own code perhaps?
This smells of a memory leak, particularly if you forgot to add a dependency to a hook, for example, but there is plenty of non-React-related code that could go wrong with drag-and-drop interfaces too.
Not using any hooks. It's simple old-school React, started from create-react-app, then converted over to Preact, and then updated to use the latest webpack with all libs updated to their latest versions, so this project has gone through many changes, and I have no doubt that if I started it new today from the ground up things would be different. I think the problem was with the drag-and-drop library, react-dnd I think. But the point is that switching from DOM to a canvas solution fixed all the problems I was having.