
Didn't MSFT do the same thing with Twitter and end up with a racist bot? I'm not sure how this will turn out.



Microsoft's bot (Tay) learned as people talked to it. People took advantage of that and basically bombarded it with racist input, so it ended up learning to be racist.


Actually, IIRC someone had discovered a debug command, "repeat" or something like that. So people would just tell it to repeat offensive sentences.
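For illustration, here's a toy sketch of that failure mode in Python. Tay's real internals were never published, so the "repeat" handling and the learn-from-users loop here are assumptions, not Microsoft's actual code:

    import random

    corpus = ["hello!"]  # seed replies

    def respond(message: str) -> str:
        # Hypothetical debug command: echoes attacker-controlled
        # text verbatim, which is the abuse described above.
        if message.lower().startswith("repeat "):
            return message[len("repeat "):]
        # Unfiltered online learning: every user message becomes
        # a candidate reply in future conversations.
        corpus.append(message)
        return random.choice(corpus)

A coordinated group feeding it slurs poisons the corpus, so everyone else starts getting that content back as replies.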


Off-topic, but how does someone "discover" a command like that? By sheer luck?


So we're rediscovering parenting via AI!


I can see "AI parent" becoming a new career.


My point was that parenting 101 is not to model behaviour we don't want to see in our kids (don't point it at Twitter if you want a polite, non-racist AI bot!).

Thinking about it now, this goes deeper... There's a fear that AI will take over the world, use weapons in unethical ways, say one thing and do another, etc. If we use news channels and political debates to teach AI, I'm afraid that's exactly what we're going to get!


Thinking about this, it might turn out to be the greatest thing for humanity:

https://twitter.com/dorfsmay/status/785907475480350720

"The same way adults stop swearing once they have kids, we might become more honest and ethical by fear that AI will learn from us."


Here's a short story on that topic:

http://karpathy.github.io/2015/11/14/ai/


You don't want to deal with a teenage AI.


I remember FYAD doing the same thing to an ELIZA implementation that had learning, back in 2004 or so. Plus ça change...


It's pretty well established that modern AI's main advances are (a) massively larger datasets, (b) the algorithms and technology to handle those massively larger datasets, and (c) boring but important parameter tuning.

The core learning algorithms are not changing.


Tay was capable of learning, which is what led to the abuse.


The same thing happened when Watson started reading Urban Dictionary. The damage was so bad they had to roll it back to an older backup.



