The surprise isn't that the memory associated with `logIt` is leaked.
The surprise is that `logIt` holds on to the giant `str` object in its environment, despite the fact that there is literally no way for the `str` variable to ever be used again.
V8 is smart enough to not make `logIt` hold on to `str` if there are no closures at all which refer to `str`. It's the introduction of the unrelated `doSomethingWithStr` closure that forces `str` into the lexical environment.
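For reference, the shape of the example in question is roughly this (a sketch; the exact string size and interval are just illustrative):

```js
function run() {
  var str = new Array(1000000).join('*');  // a large string

  var doSomethingWithStr = function () {
    // This closure references str, so str ends up in the shared
    // lexical environment of every closure created inside run().
    if (str === 'something') console.log("str was something");
  };
  doSomethingWithStr();

  var logIt = function () {
    // logIt never mentions str, yet it keeps the whole shared
    // environment (including str) alive for as long as it lives.
    console.log('interval');
  };
  setInterval(logIt, 100);
}

run();
```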
BTW, the original bug in the Meteor codebase wasn't a setInterval.
It was that the code which replaced a certain type of object (a Spark renderer) with a new instance of that object accidentally ended up with a reference to the preceding renderer in a closure assigned somewhere on the new one, even though that particular closure never actually used the reference.
So instead of replacing renderer #N with renderer #(N+1) and GCing #N, we ended up with a stream of renderers that could never be GCed... even though the reference keeping the old ones alive was literally impossible to ever use.
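In rough form the bug looked something like this; the names are hypothetical, not the actual Spark code:

```js
// Each replacement accidentally keeps renderer #N alive from a closure
// attached to renderer #(N+1), so the whole chain stays reachable.
var currentRenderer = null;

function replaceRenderer() {
  var previous = currentRenderer;

  // Some helper in this scope refers to `previous`, which forces
  // `previous` into the shared lexical environment...
  var tearDownPrevious = function () {
    if (previous) previous.destroyed = true;
  };
  tearDownPrevious();

  currentRenderer = {
    // ...so this closure, which never uses `previous`, still retains it:
    // renderer #(N+1) -> closure -> environment -> renderer #N, all the
    // way back to renderer #1.
    render: function () {
      console.log('rendering');
    }
  };
}
```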
But you're still clearly retaining a reference to the lexical frame, so I'd think the "zeroth-order" assumption would be that you'd be keeping everything in that frame.
It's nice that JavaScript will do enough tree-shaking that it will eliminate the single-closure case, but that's an optimization, not something you should be counting on.
Similarly, I bet if you had a call to "eval" in there, you'd keep the entire frame, but I don't understand the exact semantics of JavaScript's eval enough to really say.
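Concretely, the sort of thing I mean (a hypothetical sketch; whether the whole frame really is retained depends on the engine):

```js
function run() {
  var str = new Array(1000000).join('*');

  var debug = function (expr) {
    // A direct eval can name any variable in scope, so the engine has
    // to assume str (and everything else here) might be needed.
    return eval(expr);
  };
  debug('1 + 1');

  var logIt = function () {
    console.log('interval');  // still never mentions str
  };
  setInterval(logIt, 100);
}
```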
I suppose this boils down to whether your mental model is more "frame-like" or more "upvalue-like". If you think of keeping around the containing frame, it's not at all surprising that you retain the variable. If you think of the closure as a discrete set of upvalues, then it's a surprise.
> I'd think the "zeroth-order" assumption would be that you'd be keeping everything in that frame.
Why would that be a reasonable assumption? If a variable cannot ever be referenced, it should be garbage collected. In the example, "str" cannot ever be referenced. This is why we use garbage collected languages.
> but that's an optimization, not something you should be counting on.
It's not an optimization, it's correct GC behavior, and it's absolutely something you should be counting on if you're using a GC language.
I suppose I don't find it surprising, because I've used a lot of R, and R can't even optimize the simple case. There it's for good reason: since you can extract the environment of any function, once a closure escapes you have a handle on its entire containing frame.
I think I've used Scheme implementations that have the same "feature", with less of an excuse.
It's a similar "failure of GC" if you create an object with two fields, like `obj = {a: giant_value, b: 5}`, and then have your closure return `obj.b`. I bet the `a` slot would not be GC'd, even though there's no way to reach it.
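Roughly this (a hypothetical sketch, assuming the usual per-object reachability):

```js
function makeGetter() {
  var obj = { a: new Array(1000000).join('*'), b: 5 };
  return function () {
    // Only obj.b is ever read, but reachability is tracked per object,
    // not per field, so the giant value in obj.a stays alive too.
    return obj.b;
  };
}

var getB = makeGetter();
console.log(getB());  // 5, while obj.a sits in memory, unreachable in practice
```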
If your mental model is that you're holding onto the lexical frame, it's pretty much the same behavior.
That allocates a closure. Now if you want it to sit around forever, then hey, it's working as intended. Otherwise, it does "leak" some memory that will never get reclaimed by the GC.