Why does it? With how much it hallucinates, it's likely to be objectively worse than existing RegEx filters.
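
To be concrete, the kind of existing regex filter being compared against is something like the rough Python sketch below. The pattern list and function name are made up for illustration, not taken from any real filter:

    import re

    # Toy example of a hand-written regex filter: brittle and narrow,
    # but deterministic and cheap to run.
    SPAM_PATTERNS = [
        re.compile(r"\bfree\s+(money|prize|gift)\b", re.IGNORECASE),
        re.compile(r"\bclick\s+here\s+now\b", re.IGNORECASE),
        re.compile(r"\bguaranteed\s+returns?\b", re.IGNORECASE),
    ]

    def looks_like_spam(message):
        # Flag the message if any pattern matches anywhere in it.
        return any(p.search(message) for p in SPAM_PATTERNS)

    print(looks_like_spam("Click here now to claim your free prize"))  # True
    print(looks_like_spam("Lunch at noon?"))                           # False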


Regex is too specific. LLMs seem like they could help us build more general kinds of filters, if we can get the hallucinations under control. Maybe we could teach one model to generate spam in order to teach another model how to recognize it.
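
A very rough sketch of that generate-then-recognize idea, under some big assumptions: the LLM call is stubbed out with canned strings (in practice you would prompt whatever text-generation model you have), and a plain TF-IDF + logistic regression classifier stands in for the "recognizer" rather than a second LLM. All names here are made up for illustration:

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    def generate_spam_examples(n):
        # Stand-in for an LLM prompted to write spam; returns canned samples here.
        templates = [
            "Congratulations! You have been selected to receive a free prize, click now",
            "Limited time offer: double your crypto investment, guaranteed returns",
            "Your account has been suspended, verify your password at this link",
        ]
        return [templates[i % len(templates)] for i in range(n)]

    # Real (non-spam) messages would come from your own mail corpus.
    ham = [
        "Can we move the standup to 10am tomorrow?",
        "Attached is the draft report, let me know if you spot any issues",
        "Thanks for the feedback, I'll push a fix this afternoon",
    ]

    spam = generate_spam_examples(30)
    texts = spam + ham * 10          # crude balancing for this toy example
    labels = [1] * len(spam) + [0] * (len(ham) * 10)

    # The "recognizer": a classical text classifier trained on the generated spam.
    clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    clf.fit(texts, labels)

    print(clf.predict(["Click here to claim your free prize"]))       # likely [1]
    print(clf.predict(["Let's review the deploy checklist today"]))   # likely [0]

The appeal over pure regex is that the classifier generalizes from examples instead of matching exact phrases, though how well that works depends entirely on how realistic the generated spam is.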



