
> Those examples, if included in training data, would confuse an LLM and likely lead to poor results.

I don't think LLMs are so good that they can't be confused by logical inconsistencies in the training data.


