
Maybe an AI is best able to detect when another AI did someone's homework.

The purpose of an assignment is not to produce a document, it's for the student to learn how to synthesize information.

Having an AI produce the document provides nothing to the student, or to the rest of society, which needs the student to be good for something.

In this case it was even stupider. It wasn't a test, it was an informational question about themselves.

What in the ever-loving hell is the defense or value in having an AI fabricate a random answer to a question about what you think about something? At least outside of some marketing data-gathering context where there is a reason to give false information.



> Maybe an AI is best able to detect when another AI did someone's homework.

That's a losing battle. The best CAPTCHA solvers are now better than humans.



