
If you give GPT-3 code with a bug in it and ask it to find the bug, it can't really do that. I'm pretty sure that if you gave it all the data and asked it why things aren't working the way they should, it wouldn't have any actual understanding.
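
For example (a made-up sketch in Python, not code I actually tested it on), the kind of bug I mean is an off-by-one in a sliding-window sum, where the output is only slightly wrong and nothing about the code pattern-matches a canned answer:

    def moving_sum(xs, window):
        # Intended: return the sum of every length-`window` slice of xs.
        out = [sum(xs[:window])]
        total = out[0]
        for i in range(window, len(xs)):
            # Bug: should subtract xs[i - window]; the off-by-one makes
            # the window drift, so results are only subtly wrong.
            total += xs[i] - xs[i - window + 1]
            out.append(total)
        return out

Spotting that requires reasoning about the indices, not recognizing a familiar shape.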

There's a depth to explaining things that GPT still can't reach. It's still astonishing, and it has completely changed my idea of what AI can do, like writing plays with incredible context (better than most humans!), but there are still major limits.



>If you give GPT-3 code with a bug in it and ask it to find the bug, it can't really do that.

The hell are you talking about? I've been doing this literally any time I need something fixed and it does just fine.


It doesn't solve non-trivial bugs. It can fix bugs that match patterns that have been asked about a lot on Stack Overflow or somewhere similar.
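
For instance (a hypothetical sketch, picking one of the most-asked Python questions on Stack Overflow), it reliably flags the mutable-default-argument bug, because the function signature alone matches thousands of existing answers:

    def append_item(item, items=[]):
        # Bug: the default list is created once and shared across calls.
        items.append(item)
        return items

    print(append_item(1))  # [1]
    print(append_item(2))  # [1, 2] -- state leaks between calls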


I love the sound of goal posts as they go whooshing past.


I think it's more likely that the GP worded their initial statement poorly than that they're actually moving the goalposts. They were probably having trouble with a few thorny bugs, tried ChatGPT, got nowhere, and forgot to qualify their initial statement with "for the few non-trivial bugs I tried".

From the external point of view, the goalposts moved, but within the GP's poorly expressed mental model, they haven't. But that's just a guess.


It can definitely find and fix bugs. Of course it's not great at it and cannot be relied upon.



