
Is it an evaluable fact that any particular LLM "is substantially less often correct"? Less often correct than what?

The comparison to biology is to ask whether what are termed "hallucinations" differ from what human minds do.



