Have you thought of adding a trap that an LLM might trip on but a human wouldn't?


The problem isn't the LLM; the lies are the problem.

I don't think an employer would mind a résumé that is factually correct but edited by an LLM, in the style of "here is my résumé; emphasize the items that match this job offer, and also fix my grammar and spelling".

Here, the candidates are using an LLM to invent experience that matches the job offer, producing a fake résumé. A human doing the same wouldn't make it any better.



