I find (2) to be kind of a BS narrative. It’s not like they are open-sourcing the code. All big tech companies have security audit programs. Does that assuage your security concerns when Google, Meta, etc do it? Basically, what they are saying is they are doing it on the backend and you should give them credit for doing it on the frontend because some security researchers are going to get a Disneyland tour.
Personally, I actually am somewhat encouraged that tech companies are subject to legal process (in the form of lawsuit discovery etc). Tech companies have paid billion-dollar settlements for allegedly breaking the law in segments of their business that don’t generate billions in profits. Obviously there’s a lot of room for them to push the limits of the law, there’s disagreement about what the law even is, and there may be ways for them to get away with things. But the fact that audits are happening, and that code and logs can end up in court, acts as some kind of constraint on their behavior.
“We’ll take the extraordinary step of making software images of every production build of PCC publicly available for security research. This promise, too, is an enforceable guarantee: user devices will be willing to send data only to PCC nodes that can cryptographically attest to running publicly listed software” [1].
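To make the quoted guarantee concrete, here is a minimal sketch of the client-side check it describes: the device refuses to send data unless the node attests to running a software image whose measurement appears on a public list. All names and the HMAC-based "signature" are illustrative stand-ins, not Apple's actual PCC attestation protocol (which is rooted in hardware):

```python
import hashlib
import hmac

# Hypothetical public transparency list of production build measurements
# (illustrative values, not real PCC image hashes).
PUBLIC_IMAGE_LIST = {
    hashlib.sha256(b"pcc-build-1.0").hexdigest(),
    hashlib.sha256(b"pcc-build-1.1").hexdigest(),
}

def node_attestation(build_blob: bytes, key: bytes) -> dict:
    """Simulate a node producing a signed measurement of its software image."""
    measurement = hashlib.sha256(build_blob).hexdigest()
    signature = hmac.new(key, measurement.encode(), hashlib.sha256).hexdigest()
    return {"measurement": measurement, "signature": signature}

def device_will_send(att: dict, key: bytes) -> bool:
    """Device-side check: signature must verify AND the measured build
    must appear on the public list; otherwise no data is sent."""
    expected = hmac.new(key, att["measurement"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, att["signature"]):
        return False  # attestation not authentic
    return att["measurement"] in PUBLIC_IMAGE_LIST  # build must be publicly listed

shared_key = b"demo-attestation-key"  # stands in for hardware-rooted trust
good = node_attestation(b"pcc-build-1.1", shared_key)
bad = node_attestation(b"secret-modified-build", shared_key)
print(device_will_send(good, shared_key))  # True: listed build
print(device_will_send(bad, shared_key))   # False: unlisted build
```

The point of the design is that the enforcement lives on the device: a server running an unlisted build simply never receives user data, regardless of what the operator claims.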