> Because there's going to be something that looks like a violation of the Fair Housing Act (f.ex., or any number of other laws promoting equality) and "Our algorithms handle it" isn't going to satisfy a judge.
What is an open algorithm? Pseudocode? Actual implementation code? A certificate from some third party auditor? A certificate from some formal verification system?
We could start by forcing Google and Facebook to disclose who paid for what advertising.
It’s something a lot of European countries have rather strict laws on, yet even if you’re in a partnership with Google, you can’t find out. If you have a record company with a YouTube channel, Google will pay you for your views, but they won’t disclose what share of the revenue they’re paying you. It could be 90%, it could be 1%; you have no idea. That’s crazy to me, though at least it’s relatively harmless in this example.
They won’t tell you who paid for political content either, nor will they disclose how many users it reached or why those specific users were targeted. Many European countries have rather strict laws on political advertising, but those laws are completely ineffective when this data is hidden by Big Tech.
It doesn’t have to be about political advertisements or elections. It could be anything, really. I could run a hiring campaign on Facebook that targeted only young men. That would be illegal, but there would be no evidence I did it, because Facebook won’t release that information. Facebook got in trouble for making those targeting tools available, but as far as I know, no one who used them got caught.
That’s rather less harmless. Wouldn’t that be a good place to start introducing some transparency?