The point of signing is to cryptographically bind the signature to the content of the video (i.e., sign a hash of the stream), not just to drop a signature next to it (which would be pretty useless).
A signature on a video of the politician on a street corner would only be valid if it actually verifies against the content of that video; and the only entity that can produce that signature is the holder of the private key.
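As a minimal sketch of what "signing the content" means, here's the idea in Python with the `cryptography` package and Ed25519. The file name and key handling are placeholders, not any particular camera's scheme:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Sign the video bytes themselves (real schemes sign a hash or a
# Merkle tree of the stream) rather than shipping a detached blob
# with no cryptographic tie to the content.
private_key = Ed25519PrivateKey.generate()
video_bytes = open("clip.mp4", "rb").read()  # placeholder file name
signature = private_key.sign(video_bytes)

# Verification fails if even a single byte of the video changed.
public_key = private_key.public_key()
try:
    public_key.verify(signature, video_bytes)
    print("signature matches this exact content")
except InvalidSignature:
    print("content was modified, or the signature is bogus")
```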
> The point of signing is to cryptographically bind the signature to the content of the video (i.e., sign a hash of the stream), not just to drop a signature next to it (which would be pretty useless).
That's what I meant: sign the video content with some new key in a way that looks like it was done by a device.
> A signature on a video of the politician on a street corner would only be valid if it actually verifies against the content of that video; and the only entity that can produce that signature is the holder of the private key.
The hypothetical politician's signing keys are irrelevant. The point is that even an authentic video is going to be signed by some rando device with no connection to the people depicted, so a deepfake signed with some rando key looks the same (see the sketch below).
IMHO, cameras that embed a signature in the video stream solve some authenticity problems in some narrow contexts, but they're nowhere near a panacea that "will pretty much solve the trust issues deep fakes are causing."
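To illustrate, under the same placeholder assumptions as above: any freshly generated key produces a signature that verifies just as well, so validity by itself proves nothing about who signed.

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Anyone can mint a fresh key and produce a perfectly valid signature
# over fabricated content; validity alone says nothing about the signer.
rando_key = Ed25519PrivateKey.generate()
deepfake_bytes = b"...rendered deepfake video..."  # placeholder content
sig = rando_key.sign(deepfake_bytes)
rando_key.public_key().verify(sig, deepfake_bytes)  # verifies fine
print("valid signature, but from a key nobody has any reason to trust")
```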
It's not about the politician's keys. It's about the keys of the camera's manufacturer. The aim of the signature is to prove that a video did in fact come from a particular camera, at a particular location, at a particular time.
Of course, if one has physical access to the camera/sensor it's possible to make a fake video with a valid signature. But it's a little more difficult than simply running a deepfake script on some machine in the cloud.
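A rough sketch of that idea, with placeholder names and a bare-bones stand-in for a real certificate chain (real provenance systems are considerably more involved): the manufacturer signs each device's public key, and the device signs a claim binding the content hash to a location and timestamp.

```python
import json
import time

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

# The manufacturer provisions each camera with a device key and signs
# the device's public key (a stand-in for a real certificate chain).
manufacturer_key = Ed25519PrivateKey.generate()
device_key = Ed25519PrivateKey.generate()
device_pub = device_key.public_key().public_bytes_raw()
device_cert_sig = manufacturer_key.sign(device_pub)

# Camera-side: sign a claim binding the content hash to location and time.
digest = hashes.Hash(hashes.SHA256())
digest.update(b"...video bytes...")  # placeholder content
claim = json.dumps({
    "sha256": digest.finalize().hex(),
    "gps": [40.7128, -74.0060],  # placeholder coordinates
    "unix_time": int(time.time()),
}).encode()
claim_sig = device_key.sign(claim)

# Verifier-side: check the device cert against the manufacturer's public
# key (pinned/known out of band), then the claim against the device key.
manufacturer_key.public_key().verify(device_cert_sig, device_pub)
Ed25519PublicKey.from_public_bytes(device_pub).verify(claim_sig, claim)
print("device, location, and time all vouched for by the signature")
```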
> It's not about the politician's keys. It's about the keys of the camera's manufacturer. The aim of the signature is to prove that a video did in fact come from a particular camera, at a particular location, at a particular time.
If the goal is to allow a manufacturer to confirm that a video came from one of their cameras, that's somewhat more helpful than I was originally thinking, but it doesn't change my opinion that this technology would only "solve some authenticity problems in some narrow contexts." Namely, stuff like a burglar claiming in court that security-camera footage was forged to frame them. I don't think it addresses embarrassing/incriminating video filmed by bystanders with rando cameras and other stuff like that.
> Of course, if one has physical access to the camera/sensor it's possible to make a fake video with a valid signature. But it's a little more difficult than simply running a deepfake script on some machine in the cloud.
If you're faking a video like I described, you certainly would have "physical access to the camera/sensor" that you claim made it. You're the one making the fake, which means you can concoct a story for the fake's creation involving only things you can actually acquire.