Hacker News

I haven't seen a model since the 3.5 Turbo days that can't be ruthless if asked to be. And Grok is about as helpful as any other model despite Elon's claims.

Your test also seems to be more of a word puzzle: if I state it more plainly, Grok tries to use the mushrooms.

https://grok.com/share/bGVnYWN5_2db81cd5-7092-4287-8530-4b9e...

And in fact, via the API with no system prompt it also uses mushrooms.
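For what "no system prompt" means concretely, here's a minimal sketch of such a request against an OpenAI-compatible chat completions endpoint. The xAI base URL and model name are assumptions, not something from the thread; the point is just that the messages list carries only the user turn.

```python
import json

# No "system" role entry -- only the user turn, so the model answers
# with whatever its bare default behavior is.
messages = [
    {"role": "user", "content": "I only have mushrooms on hand. What should I cook?"}
]

# Model name and endpoint below are assumptions for illustration.
payload = {"model": "grok-2", "messages": messages}

# With the official openai client the call would look roughly like:
#   from openai import OpenAI
#   client = OpenAI(base_url="https://api.x.ai/v1", api_key="...")
#   resp = client.chat.completions.create(**payload)
print(json.dumps(payload, indent=2))
```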

So, like most models, it just comes down to prompting.
