Hacker News

With the advent of ES6 modules' selective imports and tree shaking, that approach is quickly becoming the better one. With the old CommonJS modules, you need to be concerned about overall code size, which is where small, single-purpose modules excel, and why this approach has proliferated to this degree.


I've been reading about tree shaking. I'm not at my laptop at the moment, so I can't settle this question by testing it. I'll toss it to the community:

https://news.ycombinator.com/item?id=11349606

Basically, how does tree shaking deal with dynamic inclusions? Are dynamic inclusions simply not allowed? But in that case, what about eval? Is eval just not allowed to import anything?

I've been reading posts like this one, but they're pretty unsatisfying on technical detail: https://medium.com/@Rich_Harris/tree-shaking-versus-dead-cod...

Maybe someone else was wondering the same thing, so I decided to post it here before wandering off to the ES6 spec to figure it out.


> Are dynamic inclusions simply not allowed?

Correct. One of the motivating factors for ES6 modules was to create a format that can be statically analyzed and optimized.

> Is eval just not allowed to import anything?

Correct.

See this excellent writeup for more details regarding ES6 modules: http://www.2ality.com/2014/09/es6-modules-final.html


Does code size really matter for node.js? And how common was CommonJS (no pun intended) on the client before ES6? Also, doesn't CommonJS bundling add significant overhead when we're talking about 5-line function modules?


Quite common actually (see Browserify)! In fact, the increasingly widespread use of npm and CommonJS on the client is one of the factors that motivated the npm team to transition to flat, auto-deduplicated modules in version 3.


There is little reason to bundle node.js code. It's an optimization, and a dubious one. In my experience, speed of execution isn't impacted at all. I haven't tested the memory footprint, but it seems vanishingly unlikely that dead code elimination would have any substantial effect.

There's probably not any overhead in bundling, though. Not in speed or memory, at least. The overhead is in the complexity: the programmer now has one more place to check for faults, and debugging stack traces now point to a bundled version of code instead of the original sources.

The case where none of this is true is when require() ends up being called thousands of times. Startup time will be correspondingly slow, and bundling is the cure. But that should only be done as a last resort, not a preemptive measure.


That entirely depends on your bundler. Newer bundlers like http://rollupjs.org/ combine everything to avoid this.
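A rough sketch of what that combining looks like: two hypothetical source files flattened into a single scope, with the unused export shaken out. Not actual rollup output, just the general shape:

```javascript
// --- before bundling (two hypothetical files) ---
// math.js:  export const add = (a, b) => a + b;
//           export const sub = (a, b) => a - b;  // never imported
// main.js:  import { add } from './math.js';
//           console.log(add(2, 3));

// --- after rollup-style bundling: one scope, no module wrappers,
// --- and `sub` eliminated because nothing imports it.
const add = (a, b) => a + b;
console.log(add(2, 3)); // prints 5
```

Because everything lives in one scope, there's no per-module function wrapper or runtime `require` shim, which is the overhead the parent comment is asking about.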




