
yeah! i think AI tools must be transparent about their inputs, period.

it feels too unfair to lead a new art style and simply be copied with machine precision and speed… opting out of contributing to the neural network's training data should be a thing, but i do not know how the output could be reverse engineered to show a work was used




Yes, there should be opt-outs for ML training. They could take many forms - robots.txt rules, special HTML tags, HTTP headers, plain-text tags, or a centralised registry. You can take any work out of the training set without diminishing the end result. But doing so would mean being left out of the new art movement: your name will not be conjured, your style will not be replicated, your artistic influence will thin out.
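For example, a minimal sketch of a couple of those forms (GPTBot is OpenAI's crawler, which does honour robots.txt; the "noai" / "noimageai" values in the meta tag and the X-Robots-Tag header are a newer, non-standard convention that crawlers are free to ignore):

    # robots.txt -- opt the whole site out for one ML crawler
    User-agent: GPTBot
    Disallow: /

    <!-- per-page HTML meta tag (non-standard "noai" convention) -->
    <meta name="robots" content="noai, noimageai">

    # HTTP response header equivalent of the meta tag
    X-Robots-Tag: noai

None of these are binding, of course; they only work if the crawler chooses to honour them.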

If an artist wants her works to have the fate of the BBC archives, which removed millions of hours of radio and TV shows from the internet, then go ahead. That historic BBC content was never shared, liked, or commented on, and has had no influence since the internet became a thing. Cultural suicide to protect ancient copyrights.



