That doesn't look "functional" at all. The "slamTimTam" function is modifying an object it's been given. Modifying an object that you've taken as an argument should almost never happen anywhere in any codebase, much less a functional one.
Although it's not totally your fault: TFA itself seems completely confused about what "map" actually does. I can't really tell whether the author knows what immutability is or not.
In fact one of my biggest pet peeves is "fluent" style code that modifies the object it's being called on. D3 in JavaScript is a perfect example of this abominable perversion of that "functional" style.
And further - there isn't really anything wrong with modifying an object. Functional programs are nice because they remove lots of extraneous state, but there are many situations where you still have to track state somewhere - doing it in a contained object that is changed is fine.
The issue (particularly in JS) is that assignment is by reference, not by copy - in that case modifying an object can be dangerous because other parts of the code may be holding onto the same reference, and you've accidentally introduced subtle shared state.
Ideally, all users would be aware and would make sure that assignment only happens with a call to something like copy()/clone(), but the language doesn't really have the tooling to enforce this (unlike many other languages, e.g. C++/Rust/others).
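A minimal sketch of the hazard (variable names are made up for illustration):

```javascript
// Assignment shares the reference, not the object.
const config = { retries: 3 };
const alias = config;        // copies the reference, not the object
alias.retries = 10;
console.log(config.retries); // 10 — "someone else's" object changed too

// The defensive copy has to be explicit:
const copy = structuredClone(config); // or { ...config } for a shallow copy
copy.retries = 1;
console.log(config.retries); // still 10
```

Nothing in the language stops the first assignment from looking exactly like the second; the discipline is entirely on the programmer.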
Yes. Furthermore I am not really even sure what the motivation was in the first place. I have found generators to be most useful in my career in exactly one circumstance--when I want to abstract some complicated batching, filtering, or exit condition logic out of my main processing loop. As a specific example, generators are great for taking an input stream and returning the first 10000 packets matching '^foo.*' in batches of 10.
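That batching/filtering pattern might be sketched like this (matchingBatches and the sample data are hypothetical, not from the article):

```javascript
// Pull the first `limit` items matching `pattern` out of any
// iterable, yielding them in batches of `size`.
function* matchingBatches(source, pattern, limit, size) {
  let batch = [];
  let taken = 0;
  for (const item of source) {
    if (taken >= limit) break;          // exit condition lives here...
    if (!pattern.test(item)) continue;  // ...and so does the filtering
    batch.push(item);
    taken += 1;
    if (batch.length === size) {
      yield batch;
      batch = [];
    }
  }
  if (batch.length > 0) yield batch;    // flush the final partial batch
}

// The main processing loop stays free of batching/filter/exit logic:
const packets = ["foo.a", "bar.x", "foo.b", "baz.y", "foo.c"];
for (const batch of matchingBatches(packets, /^foo/, 10000, 2)) {
  console.log(batch); // ["foo.a", "foo.b"], then ["foo.c"]
}
```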
In the article, it seems like we are just looping over a set of 11 cookies and doing exactly the same thing to all of them. Not sure what I'm missing. The stuff after that regarding infinite sequences is pretty good though!
> In the article, it seems like we are just looping over a set of 11 cookies and doing exactly the same thing to all of them. Not sure what I'm missing.
You're missing that we're not just looping over a set of 11 cookies and doing exactly the same thing to all of them. We're streaming over a lazy sequence of cookies and doing the same thing to them until we stop (in the example when they hit 5 cookies eaten instead of eating them all and, presumably, becoming ill).
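A rough sketch of that idea (names made up; the article's actual code may differ):

```javascript
// A lazy sequence that could go on forever, consumed only until we stop.
function* cookies() {
  let i = 1;
  while (true) yield `cookie ${i++}`; // never materializes 11 cookies up front
}

let eaten = 0;
for (const cookie of cookies()) {
  console.log(`eating ${cookie}`);
  if (++eaten === 5) break; // the consumer decides when to stop
}
// The generator is never even asked for a 6th cookie.
```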
Generators act like streams, and are useful in any place you might see this pattern:
    value, state = next_value(state)
    -- alternatively, if we can mutate the state directly --
    value = next_value(state)
But they permit you a greater deal of flexibility about the way "state" mutates than a typical state structure might (or at least greater ease of use).
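Both forms might look like this in JS (a sketch with made-up names):

```javascript
// Explicit state threading: value, state = next_value(state)
function nextValue(state) {
  return [state * 2, state + 1]; // [value, new state]
}
let state = 0;
let value;
[value, state] = nextValue(state); // value = 0, state = 1
[value, state] = nextValue(state); // value = 2, state = 2

// Generator equivalent: the "state" lives in the suspended stack frame
// and can mutate however the body likes between yields.
function* values() {
  let s = 0;
  while (true) {
    yield s * 2;
    s += 1;
  }
}
const it = values();
console.log(it.next().value); // 0
console.log(it.next().value); // 2
```

The caller of `values()` never sees or threads the state at all, which is the flexibility being described.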
Do JS generators resume from a yield point? Reading this thread, I see no mention of the fact that generators usually(?) resume from whatever branching/looping state they were in at yield, which is hard to simulate in general. A simple for(i){yield} - yes, just store an index somewhere and "resume" at it in your stream-like closure. A complex stateful algorithm with lots of cached lexical context - well, manageable with a corresponding data structure, but more expensive in terms of mapping it to a state table.
The whole point of a generator is its stack frame and its “instruction pointer”. You can implement a state machine right in regular code via cheap primitives. Otherwise the difference is just syntactical, afaiu.
I mean, you could look at the examples in the article and see that, yes, they resume from a yield point. But here's a simple illustrative example:
    function* bar() {
      yield 3;
      console.log("Resumed after the 3");
      yield 4;
      console.log("Resumed after the 4");
    }

    function* foo() {
      yield 1;
      console.log("Resumed after the 1");
      yield 2;
      console.log("Resumed after the 2");
      yield* bar();
    }

    for (let n of foo()) {
      console.log(n);
    }

Output:

    1
    Resumed after the 1
    2
    Resumed after the 2
    3
    Resumed after the 3
    4
    Resumed after the 4
It is a weird example. But that transform does not work generally. Consider if you want to stop when insertIntoMouth() returns a particular value. Or if there is some filter() step, so that you don't know that `n` input items will map to `n` output items.
The difference here is push vs pull (also called eager vs lazy). It is often easier to write pull computations that only do the work that is actually needed.
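A sketch of the pull style, with hand-rolled lazy map/filter generators (not the eager Array methods):

```javascript
// Work only happens when the consumer pulls a value.
function* filter(pred, src) { for (const x of src) if (pred(x)) yield x; }
function* map(fn, src)      { for (const x of src) yield fn(x); }
function* naturals()        { let n = 1; while (true) yield n++; }

let calls = 0;
const square = n => { calls++; return n * n; };

// Squares of even numbers, stopping once a result exceeds 50.
// Note that n inputs do not map to n outputs: the filter drops half of them.
for (const sq of map(square, filter(n => n % 2 === 0, naturals()))) {
  if (sq > 50) break;
  console.log(sq); // 4, 16, 36
}
console.log(calls); // 4 — square() ran only until the exit condition tripped
```

An eager (push) version would have to decide up front how much of the infinite source to realize; the pull version lets the consumer's break answer that question.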