> You can improve your network request speed, but you can’t bring the time taken down to literally zero, i.e. the speed if your page was already prefetched while the user was looking at search results.
Maybe I'm just old, but I don't understand why this is a worthwhile goal. Even with its bloat, the internet today is much, much faster than it was 15 years ago, and I'm quite happy to wait a few seconds for a page to load. Besides, doesn't pre-fetching every result just waste bandwidth on mobile?
Well, for one thing, internet speed depends on the quality of your connection, which in the case of mobile networks varies wildly.
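On the bandwidth point specifically: the waste is partly avoidable, since browsers expose the Save-Data preference and the Network Information API, so a page can decline to prefetch on constrained connections. A minimal sketch of that gating logic — the `shouldPrefetch` helper and its cutoffs are my own illustration, not any standard:

```typescript
// Illustrative sketch: decide whether to speculatively prefetch a link
// target, erring toward saving bandwidth on constrained connections.
// The helper and its thresholds are assumptions, not a standard API.

interface ConnectionInfo {
  saveData: boolean;     // user opted into reduced data usage (Save-Data)
  effectiveType: string; // "slow-2g" | "2g" | "3g" | "4g"
}

function shouldPrefetch(conn: ConnectionInfo): boolean {
  if (conn.saveData) return false; // respect the user's data-saving choice
  if (conn.effectiveType === "slow-2g" || conn.effectiveType === "2g") {
    return false; // connection too slow/costly to spend on speculation
  }
  return true; // fast enough that a wasted prefetch is cheap
}

// In a real page this would gate injecting a <link rel="prefetch">, e.g.:
// if (shouldPrefetch(navigator.connection)) { /* add the link element */ }
```

So "prefetch everything, always" isn't the only design available; the trade-off can be tuned per connection.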
But personally, I’m a sucker for low latency across all of tech, gaming included. That’s why I use Safari instead of Chrome, Terminal.app instead of iTerm, sometimes C++ instead of Rust (compile times), and basically anything instead of Java (startup time), among other preferences. I even prefer reading ebooks on ordinary screens rather than e-ink displays, simply because I don’t want to wait between pressing “next page” and seeing the page appear. Unsurprisingly, then, I really appreciate snappy websites in general, including links that load instantly thanks to prefetching. I can’t say I appreciate AMP as it exists today, because the quick load comes with a host of UX issues, but I have hope for the future.
Take that perspective how you will. I think I’m a bit of an outlier in just how much latency bothers me, but pretty much everyone consciously or subconsciously appreciates when it goes down. Probably including you: the current speed might seem “fast enough” now, but if faster loading becomes the norm and the other issues are dealt with, I bet you’d have a hard time going back.