Hacker News

That's the giveaway for some of the high-profile LLMs. The ones I run locally with ollama can be VERY close to perfectly human, subtle typos and all.


What's your local setup? I run llama3:8b and it works well.
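For anyone wanting to try the same setup, a minimal sketch of getting llama3:8b running with ollama (assumes ollama is already installed; the model tag format may differ across ollama versions):

```shell
# One-time download of the model weights
ollama pull llama3:8b

# Start an interactive chat session in the terminal
ollama run llama3:8b
```

ollama also serves a local HTTP API (on port 11434 by default), so the same model can be driven programmatically instead of through the interactive prompt.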




