
> Microsoft wouldn't be able to pull that code out of already trained

I imagine they could; they just wouldn't want to, because it might require retraining the model from scratch, or at least from a not-very-recent checkpoint.



Yeah, that was actually what I was trying to get at. Microsoft would have to get rid of the model trained on the protected content, remove that content from the training set, and start training a new model over from that point.
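The cleanup step being described could be sketched as a simple corpus filter before retraining. This is purely illustrative (the names `filter_corpus`, `documents`, and `protected_ids` are hypothetical); a real training pipeline would do this at dataset-shard scale, not over an in-memory list:

```python
# Hypothetical sketch: strip protected content from a training corpus
# before retraining from scratch or from an earlier checkpoint.

def filter_corpus(documents, protected_ids):
    """Keep only documents whose ids are not in the protected set."""
    return [doc for doc in documents if doc["id"] not in protected_ids]

corpus = [
    {"id": "repo-1", "text": "permissively licensed code"},
    {"id": "repo-2", "text": "protected code"},
    {"id": "repo-3", "text": "public domain code"},
]

# Drop the protected entries; training would then resume from a
# checkpoint made before the protected data entered the training mix.
clean_corpus = filter_corpus(corpus, {"repo-2"})
print([doc["id"] for doc in clean_corpus])  # ['repo-1', 'repo-3']
```

The expensive part is not the filtering itself but the retraining it forces, which is why a checkpoint predating the protected data matters.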



