Hacker News

If Google's LLM is trained on the SeqTrack manual, then yes.

If it's not, then it will just invent some plausible-sounding bullshit that doesn't actually work.

After the fifth time you get burnt by this, the whole LLM experience starts to sour.



Yes. I wonder what LLM providers' approach is to ensuring the quality of their training material.



