As a noob I copied code from Railscasts or Stack Overflow or docs or IRC without understanding it just to get things working. And then at some point I was doing less and less of it, and then rarely at all.
But what if the code I copied wasn't correct?! Didn't the sky fall down? Well, things would break and I'd have to figure out why, or steal a better solution, and then I could observe the delta between what didn't work and what did. And boom, learning happened.
LLMs just speed that cycle up tremendously. The concern trolling over LLMs basically imagines a hypothetical person who can't learn anything and doesn't care. More power to them imo if they can build what they want without understanding it. That's a cracked lazy person we all should fear.