This mostly means that LLMs are good at simpler forms of pattern matching, and have a much harder time actually reasoning at significant depth. (That's not easy even for human intellect, the finest we currently have.)
