> Microsoft wouldn't be able to pull that code out of already trained models
I imagine they could, they just wouldn't want to. Because it might require retraining the model from scratch, or at least from some not-very-recent checkpoint.
Yeah, that's actually what I was trying to get at: Microsoft would have to get rid of the model trained on the protected content, remove that content from the training set, and train a new model from scratch.
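To make the "remove and retrain" idea concrete, here's a toy sketch of that workflow. Everything in it is illustrative (the field names, the checkpoint records, the function itself), not any real training API; the point is just that you can only resume from the newest checkpoint saved *before* the protected data entered training, and if none exists you're back to square one:

```python
def scrub_and_retrain(dataset, checkpoints, first_bad_step):
    """Drop protected samples, then find the last clean checkpoint to resume from.

    dataset:        list of dicts, each with a "protected" flag (illustrative)
    checkpoints:    list of dicts with a "step" field (illustrative)
    first_bad_step: training step at which protected data first appeared
    """
    # 1. Remove the protected content from the training set.
    clean_data = [sample for sample in dataset if not sample["protected"]]

    # 2. The newest checkpoint saved before the protected data entered
    #    training is the latest state that can be reused.
    clean_ckpts = [c for c in checkpoints if c["step"] < first_bad_step]
    resume_from = max(clean_ckpts, key=lambda c: c["step"], default=None)

    # 3. If no clean checkpoint exists, resume_from is None:
    #    training has to restart from scratch.
    return resume_from, clean_data


dataset = [
    {"protected": False, "text": "ok sample"},
    {"protected": True, "text": "protected sample"},
]
checkpoints = [{"step": 0}, {"step": 100}, {"step": 200}]

resume_from, clean_data = scrub_and_retrain(dataset, checkpoints, first_bad_step=150)
# resume_from is the step-100 checkpoint; the step-200 one is tainted,
# and the protected sample is gone from clean_data.
```

The expensive part, of course, is step 3 in practice: if the protected content was in the corpus from the beginning, there is no clean checkpoint at all, which is exactly the "retrain from scratch" cost being discussed.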