
Sounds like a bad idea to me.

I would expect latency (network round trip time) to make this entire exercise worthless. Most polyfills are 1kb or less. Splitting polyfill code amongst a bunch of small subresources that are loaded from a 3rd party domain sounds like it would be a net loss to performance. Especially since your page won’t be interactive until those resources have downloaded.

Your page will almost certainly load faster if you just put those polyfills in your main js bundle. It’ll be simpler and more reliable too.
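For instance, a minimal sketch of that approach, assuming a core-js / whatwg-fetch based setup (the package choices here are illustrative, not from the original comment):

  // entry.js: a sketch, import only the polyfills the app actually needs
  // so they ship inside the main bundle (assumes core-js and whatwg-fetch
  // are installed; package names are illustrative)
  import 'core-js/features/promise';
  import 'core-js/features/array/includes';
  import 'whatwg-fetch';

  // ...rest of the application code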




In practice, back when this wasn't a Chinese adware service, using the CDN did prove to be faster.

You weren't loading a "bunch" of polyfill script files. You specified what you needed via a query parameter in the URL, and the service combined that with the User-Agent of the request to determine which polyfills were actually required, returning a single minified file containing only those.
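Roughly, usage looked like this (the URL and feature names below are a sketch of the service's v3-style API, not copied from the comment):

  <!-- One request: the service reads the features list and the request's
       User-Agent, then returns a single minified file containing only the
       polyfills that browser is missing (often empty for modern browsers) -->
  <script src="https://polyfill.io/v3/polyfill.min.js?features=Promise,fetch,Array.prototype.includes"></script>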

Because this request went to a separate domain, it didn't run into the head-of-line blocking / max-connections-per-domain limits of HTTP/1.1, which was still the more common protocol when this service came out.



