
Seeing this reminded me of an interesting topic: browser-level caching of common external JS libraries to achieve big performance improvements: https://github.com/w3c/webappsec-subresource-integrity/issue...



And that reminds me of the rebuttal: such "embedding/blessing" of libraries would give incumbents a considerable tailwind and stifle innovation. If React/styled-components/Apollo/Highcharts is already available on the client, it becomes much harder to consider alternatives, and any new contenders (Vue/Solid/urql/emotion, etc.) would never get traction.


Yeah, cross-domain public caching had mostly theoretical benefits and never really worked that well in practice. Glad browser vendors moved away from it.


I'd like to read more about why it only has theoretical benefits. Do you have any links to share where I can learn more? Thanks!


Check the link in my top comment. Most of the problems are related to privacy.

A website could always tell whether you fetched a resource over the network or served it from cache (for example, by timing the load). A cache hit means you previously visited some site that used the same library. Combine enough such inferences across popular libraries and they can be used for many things: fingerprinting, marketing profiles, even inferring a user's identity.
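A minimal sketch of the timing side of that inference. The 20 ms threshold, the fetch-based probe, and the "likely-cached" labels are all illustrative assumptions, not a real attack implementation:

```javascript
// Classify a resource load as a probable cache hit or a network fetch.
// The threshold is an arbitrary assumption for illustration.
function classifyLoadTime(elapsedMs, thresholdMs = 20) {
  return elapsedMs < thresholdMs ? "likely-cached" : "likely-network";
}

// Browser-side probe (shown for shape; requires a browser environment):
// time a fetch of a popular shared-library URL and classify the result.
async function probe(url) {
  const start = performance.now();
  await fetch(url, { mode: "no-cors" });
  return classifyLoadTime(performance.now() - start);
}

console.log(classifyLoadTime(3));   // "likely-cached"
console.log(classifyLoadTime(180)); // "likely-network"
```

Repeating such probes over many well-known library URLs yields a vector of "has visited a site using X" bits, which is the fingerprinting concern that pushed browsers toward per-site (partitioned) HTTP caches.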





