Hacker News

GPT-3 and these kinds of natural-language generators can already present opinions if prompted to do so. Depending on what you mean by "opinion", you might also require some memory so that it stays consistent in the opinions it presents to you. (Not that humans are always consistent either.)
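The "memory for consistency" idea can be sketched in a few lines: carry previously stated opinions in the prompt context so later answers agree with earlier ones. This is a minimal toy, with the actual model call stubbed out and all names hypothetical; a real system would send the assembled prompt to GPT-3 or a similar API.

```python
class OpinionatedBot:
    """Toy sketch: remembered opinions are prepended to each prompt."""

    def __init__(self):
        self.opinions = {}  # topic -> opinion stated earlier

    def generate(self, full_prompt):
        # Stub standing in for a real language-model call.
        # If the context contains a remembered opinion, restate it.
        for line in full_prompt.splitlines():
            if line.startswith("Remembered opinion:"):
                return line.split(":", 1)[1].strip()
        return "I think it is promising."  # placeholder first-time answer

    def ask(self, topic):
        prompt = f"Question: what is your opinion on {topic}?\n"
        if topic in self.opinions:
            prompt = f"Remembered opinion: {self.opinions[topic]}\n" + prompt
        answer = self.generate(prompt)
        self.opinions[topic] = answer  # store so later answers agree
        return answer

bot = OpinionatedBot()
first = bot.ask("static typing")
second = bot.ask("static typing")
assert first == second  # the stored opinion keeps the answers stable
```

Without the stored context, nothing ties one sampled answer to the next, which is exactly why a stateless generator can contradict itself between prompts.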

If you dig any further than this into the question, you quickly get back to the age-old question of "when is it 'real' consciousness and not just an automaton that acts and sounds conscious?"

True. I'm just thinking of what makes AI different from people. Could "opinions" be such a thing?

When we say somebody has an "opinion", we often mean they have some hidden agenda for raising it. It is not a fact but an opinion. It is frequently about what should be done, and to whom, and is thus most often self-serving.

Humans are intention-driven creatures that continuously try to advance their own agendas. So I wonder: are there AIs that would exhibit similar behavior, trying to influence the behavior of others with their "opinions"? And do we need such AIs? Are they not evil?
