
Bacteria are not intelligent - they have no intent to live, learn, or evolve. However, given enough time, resources, and the right stimuli, bacteria nonetheless evolved into us humans, who do have the intent to live, learn, and evolve. It is trivial to hook up an LLM to its own outputs or to those of another LLM. People messing around with LLMs on Twitter are casually playing with rudimentary approaches to adding goal-directed behavior to them. In the end, does it matter whether or not it actually has intent, if we decide to make it act as if it does anyway?
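
For what it's worth, the "hook it up to its own outputs" part really is only a few lines. Here's a rough Python sketch, where generate() is a placeholder standing in for whatever model API you'd actually call (that part is an assumption, not any specific library):

    def generate(prompt: str) -> str:
        # Placeholder model: in practice this would call a real LLM API.
        # Here it just appends a marker so the loop runs end to end.
        return prompt + " ..."

    def self_loop(seed: str, steps: int = 5) -> list[str]:
        # Repeatedly feed the model's last output back in as the next prompt.
        history = [seed]
        for _ in range(steps):
            history.append(generate(history[-1]))
        return history

    print(self_loop("Plan your next action."))

Swap generate() for two different models and you have the "LLM talking to another LLM" setup; add a scoring step on history and you're most of the way to the goal-directed loops people are demoing.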

