Hacker News

Curious to see how quickly each LLM picks up the new codecs/options.




I use the Warp terminal and I can ask it to run --help and it figures it out

The canonical (if that's the right word for a two-year-old technique) solution is to paste the whole manual into the context before asking questions.
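That pattern is a few lines of glue: capture the tool's --help output and prepend it to the question before sending it to whatever chat API you use. A minimal sketch (the final LLM call is left out, since it depends on your provider; the example grounds the prompt in Python's own CLI flags):

```python
import subprocess
import sys

def build_prompt(command: list[str], question: str) -> str:
    """Run `<command> --help` and prepend its output to a question."""
    help_text = subprocess.run(
        command + ["--help"],
        capture_output=True,
        text=True,
    ).stdout
    return (
        "Here is the tool's manual:\n\n"
        f"{help_text}\n\n"
        f"Question: {question}"
    )

# Example: ask about the current Python interpreter's flags.
prompt = build_prompt([sys.executable], "Which flag enables unbuffered output?")
```

The resulting string would then be sent as the user message; for large manuals you may need to truncate to fit the model's context window.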

Gemini can now load context from a URL in the API (https://ai.google.dev/gemini-api/docs/url-context), but I'm not sure if that has made it to the web interfaces yet.
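Per the linked docs, the feature is enabled by adding a url_context tool to the request body of a generateContent call. A sketch of the JSON payload (the prompt text and URL here are placeholders, not from the docs):

```python
import json

# Request body for a generateContent call with the url_context tool enabled,
# sent to the endpoint described in the linked Gemini API docs:
#   POST .../v1beta/models/{model}:generateContent
body = {
    "contents": [{
        "parts": [{
            # Placeholder prompt: the model fetches the URL itself.
            "text": "Summarize the options documented at https://example.com/manual"
        }]
    }],
    "tools": [{"url_context": {}}],
}
payload = json.dumps(body)
```

The model reads the page server-side, so you avoid pasting the manual into the prompt yourself.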


