I was actually surprised by the generally positive reaction the GDPR got in recent threads here on HN. I guess the suspicion of data hoarding overcame conspiracy theories about government regulation or EU protectionism.
BUT it’s important to note that the GDPR would probably not have had an effect on the specific situation with Cambridge Analytica. CA is obviously toast, if not by law then by the attention alone. Facebook, however, would likely still have been allowed to share data under the GDPR as they did with CA: they got the users’ permission initially, and there isn’t much you can do to protect yourself against malicious actors.
>they got the users’ permission initially, and there isn’t much you can do to protect yourself against malicious actors.
The EU is clearly moving against that blatant circumvention. I don't know exactly what they are going to do, but the whole "just sign all your rights to privacy away with one click" approach is something they want to change.
I think the most likely outcome is one where each specific use of your data would need explicit approval. Moreover, the prompt cannot be disingenuous legalese; it needs to be clear and concise. I fear it might just become another Cookie Law, but it might still be useful. For example, imagine getting a prompt like:
"Facebook discovered that you have Chronic Illness 1. Facebook requests permission to share this information with Insurance Company in your State. Do you approve?"
I think the insurance company would care a whole lot!
Facebook's big data is getting to the point where they can predict things like pregnancies and illnesses by parsing minor changes in behavior and correlating them against the larger data set. This is of course super interesting, but it also gives you results like "suddenly this guy is 42% more likely to die in the next 6 months and doesn't know it". There are no certainties, but to an actuarial entity like an insurance company?
That's more than worth getting your lobbyists to repeal any shred of requirement that you have to keep faith with such a person. Insurance combined with big data and stripped regulations makes such an industry purely a financial play: handled properly they can, for a time, collect money and never pay any of it out, until it becomes obvious that's what they're doing.
Those are the entities most interested in having Facebook tell them you're probably getting sick. And why would Facebook ever tell you? That's their inference. You never said a thing about it, and indeed they could be wrong. But don't bet on it.
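To make the mechanism concrete: at its simplest, this kind of inference is just a model mapping behavioral-change signals to a risk probability. Here's a minimal toy sketch with a hand-weighted logistic function; the features, weights, and numbers are entirely invented for illustration, not anything Facebook actually uses:

```python
import math

def risk_score(features, weights, bias=-3.0):
    # Toy logistic model: combine behavioral-change signals into a
    # probability. All weights here are made up for illustration.
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical signals, each normalized to 0..1: drop in posting
# frequency, new late-night activity, rise in health-related searches.
weights = [2.0, 1.5, 3.0]
baseline = risk_score([0.1, 0.1, 0.0], weights)
shifted = risk_score([0.8, 0.7, 0.9], weights)
print(f"baseline risk: {baseline:.2f}, after behavior shift: {shifted:.2f}")
```

The point is that none of the individual signals is sensitive on its own; it's the correlation against the big data set that produces the "42% more likely" number, which is exactly the kind of derived inference the user never disclosed.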
Regarding "they got the users’ permission initially": this is true for the users who signed up for it, not for anybody in their social graph. The GDPR treats data about a user as data belonging to that user, and those people have definitely not consented to having their data mined for this use case.
Next, as I understand it, the consent was for research purposes, not for CA's targeting. So under the GDPR, Cambridge Analytica could be fined 4% of global annual turnover or €20M, whichever is HIGHER [1].
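That "whichever is higher" clause (the GDPR's upper fine tier) is just a max over two quantities, so €20M acts as a floor rather than a cap. A quick sketch:

```python
def gdpr_max_fine(global_annual_turnover_eur):
    # Upper-tier GDPR fine: EUR 20M or 4% of worldwide annual
    # turnover, whichever is higher.
    return max(20_000_000, 0.04 * global_annual_turnover_eur)

# Small firm: 4% of turnover is below the floor, so EUR 20M applies.
print(gdpr_max_fine(100_000_000))
# Large firm: 4% dominates, here EUR 1.6B.
print(gdpr_max_fine(40_000_000_000))
```

So for a small company the €20M floor can vastly exceed 4% of its turnover, which is what makes the fine existential for firms like CA.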
AFAIK Facebook shared data of non-consenting individuals ("friends") of consenting users. Under the GDPR this would be borderline illegal at best. Likewise, data from consenting parties was used in a manner they had not consented to (that crosses the line) and handed over to a fourth party. Finally, FB did not ensure proper data handling (crossing the line again). If regulators were willing, they'd have a leg to stand on.