I'm wondering why it took slightly but not significantly more than the 3 months. On one hand, adding a button doesn't have to take 3 months despite all the necessary reviews etc., not if it's actually considered a priority. On the other hand, if the benefits were considered worth the fine/fee, you'd expect a bigger delay.
Personally I wouldn't read too much into the timing. Although the end result looks small, the change affects multiple large projects, each with hundreds of millions of users, and likely has knock-on effects in other parts of Google. Sprinkle in a bit of legal review plus big-company bureaucracy and I can easily imagine it taking 3 months.
They'll gladly pay (maybe) 2 million in fines so that later they can pretend complying was extremely difficult for them. Drop in the bucket in the long run.
I can guarantee that the vast majority of implementation time was more likely due to verification and auditing of the solution, rather than the solution itself.
The company I work at would break if cookies weren't allowed. It would be a mad scramble to change many many things to just function at all. 3 months seems pretty fast to me to be honest.
Also there is likely lots of code that would break without a tracking cookie passed to it because tons of code is written with the assumption the cookie would be there.
well, it wasn't just the last 3 months that there was supposed to be a way to disable those, just that it has to be a single button now (which afaik was previously already the case anyway)
> This update meant we needed to re-engineer the way cookies work on Google sites, and to make deep, coordinated changes to critical Google infrastructure.
No they didn't have to. Rejecting all cookies was possible before, it was just hidden behind convoluted and confusing menus. To make this functionality available with a single click of a button they didn't have to invent any new technology or process or backend service. Just move the onClick handler to a different HTML element on a different page.
Yes, you will need to whitelist many websites; yes, you will also be appalled at the sheer number of cookie requests you receive, and thank God every day that you have installed NoScript.
Google cookies do not work that way. Ironically, one of the challenges is that some of them are firewalled from each other so that it's harder to aggregate a holistic picture of a user in one location. You know, for privacy reasons.
And not even moving an onClick handler is trivial given the layers of abstraction that Google's UIs are built on top of.
It's D) The hard part is the backend data storage, not the frontend.
People's intuition of how user data storage and retrieval works at Google scale doesn't align with the reality. Cost of storage (immediate and secondary), effect on creation of new permissions, privacy protection, user security (can this new option somehow be used to leak userdata or trick users into doing something they didn't want to do?), and (if they're only rolling this out in Europe) region gating all have to be accounted for.
Not at all. The previous pop-up gave you the opportunity to switch off every item individually. What do you think would happen if new items were added?
The new button sets a new default that affects new options for data collection. And yes, adding one new field that affects user configuration across all Google properties (or even two major Google properties, such as the main site and YouTube) is a one-quarter project. Privacy review, legal review, data storage assessment to make sure an additional Boolean isn't going to blow a limit somewhere, thinking through all permutations of options users could already have set to make sure that this new feature doesn't break existing user configurations... Hell, they probably had to do analysis to make sure that you couldn't uniquely identify users based on the setting because of how many / how few people would click reject all. And did they roll it out everywhere or just in Europe? A Europe-only rollout requires additional work.
It's a big corporation, extrapolating how it works from small corporations leads to inaccurate conclusions.
That may well be. However, I’d argue that effects like these are just the cost of doing business at their scale – and therefore entirely their problem. No one outside of the affected corporation should need to care at all.
Oh, definitely. This overview was an explanation for why "Just adding one new option" is a one-quarter project and not nearly as simple as people from smaller operations assume it will be.
... in fact, I think this is indicative of the risk of disruption Google now faces. At their scale, they are no longer agile. When a company is willing to spend money on a daily fine for exceeding the target deadline by three months, they're signalling strongly that smaller, more focused operations can out-maneuver them.
Google has a very strong interest in developing tracking techniques that do not require cookies, so they can appear to be on the side of the public with respect to privacy, while also securing a competitive advantage for tracking in a post-cookie world.
The good thing about the GDPR is that it's about the act of tracking and collecting data as opposed to focusing on specific technical means, so moving away from cookies won't allow you to work around the law.
This is absolutely true. I took a survey through MTurk from Google, from back when they were reworking the "ad" icon on search results. They wanted me to quickly select the first non-ad, with different "ad" icon styles (same color as rest of text, different shapes, etc).
Basically true but not the terms I'd use; they aren't really A/B tests but staged rollouts, though the process and tooling required is similar. We did staged rollouts of _everything_ back when I worked on Google Search that wasn't a trivial bug fix. We'd move it to 1% for a day, check metrics, increase to 10%, hold a couple more days and check metrics, then go to 100%. Very sensitive or risky launches might hold at a full 50% for some time. UI changes were "dark launched" behind a flag that we incrementally flipped on. The reason is that no test suite captures reality, and this discipline forces you to account for easy rollbacks (just turn the flag off) and handle "skew" (the case where a user starts a session where the flag is off but then starts talking to a machine where it is on, or vice versa). This was in addition to the binary that was released multiple times a week and rolled out slowly over the course of the day, often after multiple versions had been tested in experiments with statistically significant samples.
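The percentage ramp-up described above is easy to sketch. This is just an illustration of the general technique (deterministic hash bucketing), not Google's actual tooling; the function and flag names are made up:

```python
import hashlib

def in_rollout(user_id: str, flag_name: str, rollout_pct: float) -> bool:
    """Deterministically bucket a user into a flag's rollout percentage.

    Hashing (flag_name, user_id) together keeps the decision stable across
    requests (no flapping mid-session) while decorrelating different flags,
    so the same 1% of users isn't always the guinea pig for every launch.
    """
    digest = hashlib.sha256(f"{flag_name}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) % 10000  # 0..9999, i.e. 0.01% granularity
    return bucket < rollout_pct * 100

# Ramp-up schedule: raise rollout_pct from 1 -> 10 -> 50 -> 100 over days.
print(in_rollout("user-42", "new-search-ui", 100.0))  # True at 100%
```

Rolling back is then just setting the percentage back to 0, which matches the "just turn the flag off" discipline above.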
It is "funny" reading that: the amount of energy they spend on staying only just barely illegal, so that they can squeeze as many dark patterns onto their users as possible before they get a fine. Rinse and repeat.
For normal companies / webmasters it is quite simple though. You don't have anything to gain from extorting your users. So please respect your users instead.
Don't use shady practices that require you to bring up a cookie-banner in the first place. Just don't.
Reliably identifying a (unique) visitor is pretty difficult using ip logs, though.
Cookies generally make this much easier, at the very least identifying a visit.
With IP logs, you're not just dealing with the fact that IP addresses are often shared between people (e.g. behind a NAT); you now also need to record IP addresses, which arguably is an even bigger privacy violation than just using cookies.
I’m not saying that one method is always better than the other, but it’s definitely not as black-and-white as you make it out to be.
> Why do you think you need to uniquely identify all your visitors?
Why would I not? A brick-and-mortar store can tell when the same person walks in twice; it's easy to imagine webmasters wanting that too (to say nothing of having a chance of distinguishing real people from bots and scrapers).
Most brick-and-mortar stores cannot reliably tell whether any single person has visited the store before unless they track them in some way. Probably the only case is a store owner who is also the only worker and has an excellent memory for faces and details.
How fine-grained does the detail actually need to be? I can't see how knowing whether one person or two people in a house visited your site helps. They could have easily just switched to their phone.
Yea on top of that you can fingerprint the TCP/IP/TLS settings of the user’s connection as an additional point of data: https://nmap.org/book/osdetect-fingerprint-format.html. My gut feel is that browser+tcp+ip+tls fingerprinting can get you pretty damn close to uniquely identifying users without needing cookies.
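As a rough illustration of what "browser+tcp+ip+tls fingerprinting" amounts to in practice, the combination boils down to hashing whatever passively observable attributes you can collect into one identifier. The signal names and values below are made up for the example:

```python
import hashlib

def passive_fingerprint(signals: dict) -> str:
    """Combine passively observable connection attributes into one ID.

    Real systems use things like the TLS ClientHello cipher/extension
    ordering (JA3-style), TCP TTL and window size (which hint at the OS),
    and HTTP headers; the dict keys here are illustrative placeholders.
    """
    # Canonicalize (sorted keys) so the same signals always hash the same way.
    canonical = "|".join(f"{k}={signals[k]}" for k in sorted(signals))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

fp = passive_fingerprint({
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) ...",
    "accept_language": "en-US,en;q=0.9",
    "tls_ja3": "771,4865-4866-4867,...",  # TLS ClientHello fingerprint
    "tcp_ttl": 64,                        # hints at OS family
    "tcp_window": 65535,
})
print(fp)  # stable for as long as none of the inputs change
```

The weakness is visible right in the code: change any one input (browser update, VPN, different network) and the whole identifier changes, which is part of why fingerprints alone are less commercially reliable than cookies.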
> if it was that effective, why wouldn't just google switch to it and stop wasting money on ideas like FLoC, and ditch 3rd party cookies?
First, I would be shocked if Google isn't using fingerprinting data in conjunction with cookies.
But also, this comment misunderstands what FLoC was. FLoC was, at least in theory, an attempt to get the benefits of targeted advertising without uniquely identifying users. That's what the "C" in FLoC refers to - users aren't targeted individually, but rather by the cohort they belong to.
FLoC unfortunately had many issues, one of which is that there were concerns that the cohorts were too granular and could still effectively denanonymize users. There's some research indicating that this was the case - FLoC cohorts revealed more info than they intended to, but also still less than individualized profiles do.
The stated goal of FLoC was actually more privacy-focused than the status quo (individualized profiles). Unfortunately, that's not what ended up happening - or at least, the general public didn't trust that it was.
fingerprinting probably brings in more identifiable bits, but without the cookies it would not be commercially reliable data to determine uniqueness.
i am aware of what FLoC was and tried to be. it was a horrible idea, with its default opt-in putting the onus on website operators to add headers to opt out of a mass surveillance exercise generating money for the surveillance capitalists.
Well, you asked (perhaps rhetorically) why Google didn't switch to using fingerprinting instead of FLoC. And the answer is that FLoC and fingerprinting suit different use cases. Fingerprinting doesn't serve the goals of FLoC, and FLoC doesn't provide the same data that fingerprinting would.
In addition to the sibling poster’s comment, publications had data on what parts of town and stores sold their publications at newsstands and the names and addresses of subscribers.
Before you could get free publications like MacWeek, you had to fill out demographic data to qualify.
Nielsen ratings are for small subsets of users who explicitly opt in to tracking. That's not the same as "uniquely identify all end users of my website" it's like paying user testers.
Nielsen tried its best to get a demographically representative sample of households. It wasn't a purely self-selected sample. I doubt people were incentivized by the $5 they put in envelopes.
But the reality of the "total information awareness" that the profiling systems build up on you, your actions on site, all the sites you go to... that is beyond the level of the worst stalkers. Shady doesn't begin to cover it!
If you don't consent to my connection to your server, all you have to do is return HTTP 403 Forbidden. Alternatively, you may simply not reply to my HTTP requests at all.
You can flip this the other way though — “this site uses cookies for ad targeting, business metrics, and UX improvement blah blah.” If you the user don’t consent all you have to do is terminate your connection.
They'd love that, wouldn't they? Their way or the highway, take it or leave it. They know that people depend on internet services and have no choice but to accept whatever bullshit terms they add to their little document which nobody reads. A document they need to "accept" before they're even allowed to read it.
Now we can deny consent, know that they're legally required to respect our choice and enjoy the fact they can't punish us for it by denying service. Whole world needs to adopt and expand upon these laws, and also ramp up enforcement and start fining all these non-compliant cookie banner-displaying web sites out of existence.
They don't get to flip that script. They're not entitled to data about us. It's our information, they don't have an assumed right to collect it for any purpose no matter how benign. That they even think they can do it is offensive. The audacity of these people. Any data we do give them is to be used for our benefit. It's not meant to fuel some surveillance capitalism nightmare.
In this context "they" is "web developers" and the Orwellian scenario you're describing is "The way the web has worked since cookie jars were implemented."
Yes, I don't doubt web developers would mostly prefer not to be bothered and just keep doing what they've been doing. Much cheaper than doing a system audit to figure out if you're GDPR-compliant, concluding with your lawyers that you have no idea if you're GDPR-compliant, and putting up a cookie banner nobody wants just in case. That's assuming you can afford lawyers.
At this point, it's safer to partner with a host that'll manage all that stuff for you than run an independent site. Which, ultimately, the Googles of the world love to hear, because they can afford the lawyers.
The GDPR is an amazing bit of jiu-jitsu to build a moat around competition. All the big names have to do is spend a pittance every so often to stay in compliance and they win the market, while small operators decide the risk ain't worth it because "Just don't collect user data" is actually quite hard to guarantee on a system built atop infinite cheap copying and arbitrary site access.
Not even the apache default logging config is GDPR-compliant. I don't know how anybody expects indie site admins to keep up.
Just don't collect any data and enjoy instant compliance. You're not supposed to hire expensive lawyers to tell you how you can spy on users safely, you're supposed to not do it in the first place. Most read-only websites have no actual need to collect data.
I believe the GDPR specifically forbids that sort of configuration. By which I mean, I'm not allowed to deny you service if you aren't happy with consenting to my collection of your data.
It forbids denying me service as punishment for not consenting to your data collection. You can totally make your public server private though. Configure it so that it only responds to certain computers, then invite people or charge them for access. If you do it like this, the only way people can talk to your server is if you allow them to. They won't be strangers and will have your consent.
You still need their permission to collect data and you can't revoke their access as punishment for refusing.
This whole thing reeks of taking the easy way out and dumping the problem on the user. Why can't you analyze usage patterns in a controlled environment to identify the typical number of page loads?
You end up at a statistical answer like 20 hits to our home page equal 6.3 users, statistically speaking.
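A minimal sketch of what that statistical answer looks like, assuming you've calibrated an average hits-per-user rate from some controlled measurement (the rate here is a made-up example value):

```python
def estimate_unique_users(total_hits: int, avg_hits_per_user: float) -> float:
    """Naive estimate: divide raw hit count by a calibrated hits-per-user rate.

    No cookies, no per-user records - just an aggregate counter plus one
    calibration constant measured elsewhere (e.g. a controlled user study).
    """
    return total_hits / avg_hits_per_user

# 20 home-page hits with ~3.17 hits per user on average -> roughly 6.3 users
print(round(estimate_unique_users(20, 3.17), 1))  # 6.3
```

The point being: you trade precision for privacy, and for most "how is traffic trending" questions that trade is fine.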
Why care? Despite all the pearl-clutching, knowing how many visitors you have is only marginally useful data at best, so it doesn't need to be precise.
Are you being disingenuous? Do you really think that commercial websites can operate without knowing if their visitor count dropped by 15% over 30 days, or what their visitors come looking for, or through which channels they find and buy your products?
Stop overfocusing on cookies and instead remember to add this geolocation process, and why you do it, to your privacy policy. Please name the geolocation service provider (sub-processor) you share the user's IP with, so your users can audit how their data is used. Please do a privacy assessment to check whether your sub-processor does anything else with that data, like selling the info "IP f.o.o was seen by our customer bar.com" to data brokers. Please ask the users for their consent to be geolocated, and don't do it for those who say no. And please offer them an option to change the geolocation data in case it is wrong.
If you decide to hire a sub-processor incorporated in a country that does not respect Article 8 of the Charter of Fundamental Rights of the European Union (the right to data privacy), you have to ask the user before sharing their data (their data meaning "IP f.o.o accessed our service bar.com") with the geolocation service. See GDPR Article 45 ff. Please consider using a provider from a country that respects the fundamental right to data privacy. Note that the "Privacy Shield" collapsed due to the USA's trend towards surveillance capitalism, and that the USA is not considered a safe harbor for personal data.
There are some exceptions where you don't need consent: you could argue you need to geolocate the users to comply with embargoes, because your company is American and you are not allowed to do business with people living in some geographic regions, like Crimea. But even if you don't need consent, you must still disclose in your privacy policy that you process your users' personal data that way, and why, so your users can decide not to use your service if they don't agree. That may seem to contradict your business interest, but that is consumer protection in a nutshell for you.
Note that, if you now use this data to show graphs to your marketing team and have meetings about improving advertisements by targeting regions, you are in violation of GDPR Article 5, because the purpose you stated (embargoes) does not match what you actually do (targeted marketing). This is a principle Americans often find hard to grasp: just because you have the data for some reason, that doesn't mean you can do whatever you want with it. This becomes clearer if you don't think of personal data as a thing in the possession of those who collect it, but as a good that stays in the possession of the person it is about and gets licensed to those who use it, with a bound purpose. Consent management and the privacy policy are then similar to a license agreement.
Now if you ask your users nicely for consent to be geolocated, and if you have a sane reason for wanting that data, the users may even agree. Just tell them about your awesome marketing department and how much they love region-targeted marketing, and if they don't bite, offer them a goody and they will agree. Hey, they will even be offended by mistakes in your provider's geo-IP db and fix those for you. Note that this is part of the right to data privacy: if you gather and process personal data about the user, namely that they are from somewhere, they have a right to know that and tell you "well no, that is a mistake, i am from elsewhere". If you never tell them that you geolocate them, this is impossible.
The key problem is: most people who want that data (let's avoid the word "you" here) likely don't have a sane reason; they are just nosy and want to track their users out of curiosity. They know their tracking is kinda sus, so they don't want to tell the consumers about it, or ask for permission, or offer any goodies, and they don't care about a small error rate in their big data swamp. Instead they hide behind some "everyone does it" defense and act surprised if people consider them shady. Or worse, they require the data to offer user-unfriendly anti-features like content not being available in some regions (which actually could be a reason not to ask for consent: contracts with third parties like movie corporations requiring geolocation as part of online movie distribution), but in practice all that does is lead consumers to pay third parties to move their traffic around the globe, wasting resources to break the anti-feature.
But i digress. The key takeaway is: don't overfocus on cookies; state how and why you process personal data and it becomes obvious whether you should ask for consent. An http server does not need a consent banner to process the http client's IP; it could not answer the client's request without it. The client gave it the IP for a very specific reason. But that reason and that process do not mean you can take the IPs from the server's logs and do with them whatever you want. That data does not belong to you, even if you process it. So please don't do that without asking for consent, or at least explaining why you do it. That is our fundamental right as data subjects.
> Stop overfocusing on cookies and instead remember to add this geolocation process, and why you do it, to your privacy policy. Please name the geolocation service provider (sub-processor) you share the users IP with, so your users can audit how their data is used.
Or don't be lazy/cheap and use a geolocation implementation where you don't need to share the IP at all.
Geolocation can be done without a subprocessor. It goes against the grain of outsourcing everything under the sun but all you need is a high-quality local database of IP address ranges.
And yeah, by all means, include that in your privacy policy.
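A self-hosted lookup can be as simple as a binary search over sorted address ranges. The table below is made-up illustrative data, not a real geo-IP database; in practice you'd load a downloaded range file into the same shape:

```python
import bisect
import ipaddress

# Hypothetical local database: sorted (range_start, range_end, country) rows.
# The ranges and country codes here are invented for the example.
GEO_DB = [
    (int(ipaddress.ip_address("5.0.0.0")),  int(ipaddress.ip_address("5.255.255.255")),  "DE"),
    (int(ipaddress.ip_address("8.8.8.0")),  int(ipaddress.ip_address("8.8.8.255")),      "US"),
    (int(ipaddress.ip_address("62.0.0.0")), int(ipaddress.ip_address("62.127.255.255")), "FR"),
]
STARTS = [row[0] for row in GEO_DB]

def lookup_country(ip: str):
    """Resolve an IP to a country using only the local table.

    The address never leaves the server, so there is no sub-processor
    to name in the privacy policy for this step.
    """
    addr = int(ipaddress.ip_address(ip))
    i = bisect.bisect_right(STARTS, addr) - 1
    if i >= 0 and GEO_DB[i][0] <= addr <= GEO_DB[i][1]:
        return GEO_DB[i][2]
    return None

print(lookup_country("8.8.8.8"))  # US (per the toy table above)
```

The trade-off versus an API sub-processor is keeping the range database up to date yourself, but that's a periodic file download rather than per-request data sharing.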
I don't know the details of the GDPR (as they don't apply to me), but a logged IP address belonging to the original requestor seems odd. Do you also own the footprints you physically left behind on the ground as you walked around?
You may not own the physical footprints, but you have a fundamental right to the personal data encoded in them. A corporate service collecting footprints, offering insights and analysis about those who made them, creating maps and profiling their interests based on where they walk, does concern the basic right to data privacy, even if the whole thing is analog, and not out-sourced, and uses no cookies. This is why i compared it to licensing. A musician may not own every physical record, but they have rights to the music they created. Not a perfect metaphor, but good enough to better understand the concept.
Yep, and we clever HN people know that. But the web isn't all clever hackers. There are lots of small businesses with WordPress websites who just want to see a graph on a page. They will keep adding cookie banner plugins so long as they can keep seeing that graph.
I switched to plausible because the added privacy is also a major UX benefit: there's no GDPR banner on my website.
I would claim, based exclusively on experience, that most small business websites are built by web design companies. It is people working in the web shop industry that make decisions about build tools.
Also in my experience, there isn't usually a GDPR banner on small business websites built on WordPress. A pizzeria, barber, local grocery store, or some other small business doesn't really have or need much on their website: a few images, a page for location/contact, a page about the employees/founders, and for the more advanced ones a web shop. No GDPR banner needed for any of that, and for those web shops, once a person is registered, the consents are usually given as part of the registration process.
The typical model that I see for small businesses is also one where no one needs or wants a graph of visitors. The customer pays the web shop a one-time fee to build the site, and then it mostly sits there until the customer decides to build a completely new website because the old one is getting too old. No A/B testing, no optimization for user retention, no frontend interactions, no user flows. Mostly all the customer wants is to not be paying a web developer any more money. This is why some shops will utilize "proprietary modules" in order to keep the customer on their (usually partners') servers, in case the customer might get ideas about moving the site to a better/cheaper hosting provider.
The most common situation that I know of where people request visitor graphs is either when a customer runs a marketing campaign and wants to see the effect on their website, or when a conglomerate has acquired a bunch of smaller companies and wants to make decisions about closing down/merging specific websites. In those cases the analytics usually get added when that need arises.
> Shady practices like knowing how many visitors you get on your website?
Perhaps I am taking that statement too literally, but in the last century I could just look at my server logs to find out how many, from which IP addresses, and the referrer. No need for cookies for that info at all.
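For example, a few lines of scripting over a Combined Log Format access log already give you unique IPs and top referrers, no cookies involved. The regex and the sample lines here are illustrative, not a complete log parser:

```python
import re
from collections import Counter

# Combined Log Format: ip - - [time] "request" status size "referrer" "agent"
LOG_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \d+ \S+ "([^"]*)"')

def summarize(log_lines):
    """Count unique client IPs and the top referrers from an access log."""
    ips, referrers = set(), Counter()
    for line in log_lines:
        m = LOG_RE.match(line)
        if m:
            ips.add(m.group(1))
            if m.group(2) not in ("", "-"):  # "-" means no referrer sent
                referrers[m.group(2)] += 1
    return len(ips), referrers.most_common(3)

sample = [
    '1.2.3.4 - - [01/Jan/2024:00:00:00 +0000] "GET / HTTP/1.1" 200 512 "https://news.ycombinator.com/" "Mozilla/5.0"',
    '1.2.3.4 - - [01/Jan/2024:00:00:05 +0000] "GET /about HTTP/1.1" 200 256 "-" "Mozilla/5.0"',
    '5.6.7.8 - - [01/Jan/2024:00:01:00 +0000] "GET / HTTP/1.1" 200 512 "https://example.com/" "curl/8.0"',
]
print(summarize(sample))  # (2, [('https://news.ycombinator.com/', 1), ('https://example.com/', 1)])
```

(As noted downthread, storing those IPs is itself personal data processing under the GDPR, so "just use the logs" isn't automatically the privacy-friendly option either.)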
A huge number of website operators either don't have access to these logs or wouldn't know how to access them. And even given access, grep'ing a log file gives you far less rich information than a Google Analytics dashboard.
You could just do "Set-Cookie: visited=true; Max-Age=<interval>". No unique id, but you can still count uniques by checking requests for the lack of that cookie. This cookie is not personal information and cannot be used to identify a person, not even indirectly, and thus needs no consent. This is basically what most of those "cookie banners" do anyway: set a preferences cookie that cannot be linked back to a person, if done properly.
Or if you want to avoid the cookie altogether, you could use some static, cachable resource with a cache expiration date. Basically the good old counting pixel. Almost the same as the non-identifying cookie, except caches are more likely to be automatically evicted by browsers.
The only thing that matters about cookies is whether they are necessary, not whether they contain identifying information. Even duration doesn't matter. They should be explained to the user, but consent is not necessary.
Some cookies are even mentioned specifically as allowed. The example given is keeping track of a shopping cart across visits. Do that, and you have your uniques. While hinted at, it does not specifically mention those have to be session cookies: you could have a banner with "accept cookies", then use session cookies whether or not accept is pressed. It even seems to be common practice to hide explanations behind a "more info" button.
>The only thing that matters about cookies is whether they are necessary, not whether they contain identifying information.
Incorrect, kinda.
The GDPR concerns personal information, and information that can identify people directly (e.g. location data) or indirectly (e.g. an "opaque" unique id, as it can potentially be linked back to a person, or an IP address, as it can potentially be linked back to a person with the help of a court order compelling an ISP to hand over subscriber information to a complainant or law enforcement, and that subscriber may live alone).[0] The GDPR does not concern itself with data that cannot be used to identify a person and is not personal data.
The earlier ePrivacy Directive (better known as the "cookie law", although the section concerning "cookies" is only a small part, and does not even mention cookies explicitly) is a vague thing, on the other hand.
Specifically, it says under "Art 5 - Confidentiality of communications" that
"Member States shall ensure that the storing of information, or the gaining of access to information already stored, in the terminal equipment of a subscriber or user is only allowed on condition that the subscriber or user concerned has given his or her consent, having been provided with clear and comprehensive information, in accordance with Directive 95/46/EC, inter alia, about the purposes of the processing. This shall not prevent any technical storage or access for the sole purpose of carrying out the transmission of a communication over an electronic communications network, or as strictly necessary in order for the provider of an information society service explicitly requested by the subscriber or user to provide the service."
Some people therefore say this rules out all non-"necessary" cookies (unless there is explicit consent). However, this is not the intention of the directive, nor how legal experts evaluated it, nor, in particular, how courts evaluated it. If you followed that maximal reading of the text, then you couldn't legally serve anything to a user (as the user's browser might temporarily or permanently store that information without user intervention), couldn't "make" a browser cache stuff, couldn't even store that a user opted against tracking cookies. Instead, it has to be seen under the "confidentiality" umbrella of that Article, meaning the "information" mentioned has to be information that concerns the user. Non-identifying (neither directly nor indirectly) cookies do not fit that interpretation, and courts have acknowledged that (and, because it's the EU and it's vast, some courts went against it too).
The proposed ePrivacy Regulation (successor to the ePrivacy Directive) is meant to make things less vague and simpler, especially in regard to cookies, and explicitly allows anonymous user counting via cookies, among other things. While the ePR has not passed, courts did take notice, and consider it when they evaluate the intent of the lawmakers as it pertains to the still-reigning ePrivacy Directive.
>They should be explained to the user, but consent is not necessary.
Correct. You still have to inform people, even if your cookie use is merely "we do not use cookies to track or identify users".
Maybe surprisingly to some, the aforementioned access logs upthread are likely illegal without user consent, because they usually contain IP addresses of users. The "visited=true" non-identifying cookie, on the other hand, is not (in courts with reasonably knowledgeable judges, at least).
Yes, it's not the official website, but also yes, it's the same text of the official directive recitals, except on this unofficial website you can properly link it without fuss.
it all depends on whether you are a logged-in user with a session or not. you can log in to an account from any number of devices, but you are still only one user in the metrics.
They don't, no. If optimising for that kind of thing is necessary for a business, then that business is in my opinion one that can go away.
It's like how search results are almost entirely rubbish now, because things are optimised for what Google looks for. So similarly, I have no sympathy for sites that need that kind of analytics.
Are crawler access patterns really that non-uniform across pages, and large enough to make this a problem? And for crawlers that are not immediately identifiable from the user agent? Are you sure you are not just counting users without JS / with a blocker that interferes with your tracking / with intermittent connections as bots?
If you really care what users like on your site why not ask them?
Services like this make it trivial to land in court, because they nudge their customers to collect data under the pretense of error analysis, a valid business interest not requiring consent, and then use the data for behavior analysis and profiling. If, as a user, you can't turn off the latter without damaging the former, you got sold shit and should take your business elsewhere.
the apps i work on don't collect "behavioural data", so i don't have a strong opinion about it. however i think there are some crucial differences here.
1. sentry.io breadcrumbs are just a nicer interface to one's own log messages, and log messages are useful and necessary to have a well functioning app. where do log messages end and profiling/behaviour data begin? that's a rather fuzzy line.
2. even if one "logs" every breath the user takes (probably covered in the ToS), it's still only limited to one app and one service, while cookies are trivial to abuse for cross-site/cross-app tracking both inside and outside a company.
concerning the fuzzy line: log messages shouldn't include personal data (and in Sentry's defense, they are trying to be helpful when it comes to that). Yet many people prefer to throw everything into the logs, arguing that more helps more and debugging without data is horrible. And suddenly the logs become a rich data swamp, and all that is needed is a nicer interface. So a lot of analysis that would otherwise require specific implementations or even user consent instead becomes data analysis of debug logs. That creates more incentive to throw everything into the swamp. And it makes it easier to forget it's personal data: "If it's in the logs, i am not accessing the database i need permission to access." A lot of questionable personal data processing can be moved to the backroom of the backend, but that doesn't make the processing less questionable, just easier to hide from those subject to it, making it more illegal. Which is what i am warning about.
EU privacy regulations focus on the purpose of personal data processing. If a company makes a contract with their users that says they log personal data for the purpose of debugging, and then they use it for web analytics, that is not allowed; it's a violation of the contract. And, like you stated, many just write consent into the ToS. But let us look at the privacy-friendly case where the users are asked if they agree to other behavior analytics not related to debugging. And suddenly the log interface isn't so nice anymore.
In a perfect world personal data is labeled with the purposes it can be used for.
If such issues are not relevant to the company you work for, be grateful, and don't take the warnings personally. But by all that is holy to you, don't tell people the log interface is a great substitute for web analytics.
How can you claim it is limited to one app, if section 6.2 of the ToS of the service you and thousands of other companies use to manage logs says you allow them to create aggregations and summaries and distribute them to third parties?
Don't forget that the internet is full of websites that aren't owned by HN users. People used to just slap Google Analytics on there and call it a day.
Again, nothing shady, just bad practices. A bit like putting everyone in CC instead of BCC.
Besides, there's only so much you can do with backend logs. It doesn't work so well for small but meaningful frontend interactions, user flows and the like.
I certainly need some sort of stats to operate my website, although I don't need nearly as much info as GA collects.
Also a nitpick: the GDPR is about collecting data, not setting cookies
I disagree, just slapping GA on websites is shady shit. Not because the individual webmaster is doing shady stuff with that data but because this gives Google a ridiculous amount of data.
Not sure, but I don't think so. I think it's only if that site itself can link the IP to a name / user. For example, storing all the real world addresses in the world doesn't require a GDPR notice, but they're all related to people.
IP addresses match the definition in GDPR article 4 point 1: they are "an identification number, location data, an online identifier". Well the second one has to be stretched a bit, so an address matches location data, but the other two fit quite well.
If those are the only criteria, then I think it doesn't fit.
When I still lived in the UK 6 years ago, IP addresses were rarely fixed, and changed every time you connected. In fact, people paid extra to get a fixed IP.
I guess it's a bit of a grey area though, since sometimes it fits, sometimes it doesn't.
I still don't understand why this is a website thing and not a user-agent^W^W web browser thing.
The EU could have requested browser vendors to implement a mechanism to accept or reject cookies. We wouldn't have those oddly designed (and not infrequently infested with dark patterns in an attempt to sway users towards accepting everything) bars and popups, and it would've been 100% reliable (and even the reject decision would be remembered correctly) instead of hoping that the website actually respects the choice instead of having a banner that does absolutely nothing.
That doesn't seem like a big issue. Have browsers refuse all cookies by default, and let the server send headers that say "please allow these cookies, they're actually necessary". Browsers can either trust that list, or present it to the user and let them decide.
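The scheme above could be sketched in a few lines. This is purely hypothetical: the header name "Cookie-Purpose" and the helper function are invented here for illustration; no such standard exists.

```python
def filter_necessary_cookies(cookie_names, cookie_purpose_header):
    """Browser-side sketch: keep only cookies the server declared 'necessary'.

    cookie_purpose_header is a hypothetical response header such as
    "sid=necessary, _tracker=advertising".
    """
    purposes = {}
    for entry in cookie_purpose_header.split(","):
        name, _, purpose = entry.strip().partition("=")
        purposes[name] = purpose
    # Distrust by default: anything not labelled "necessary" is dropped
    # (or could instead be surfaced to the user for a decision).
    return [c for c in cookie_names if purposes.get(c) == "necessary"]
```

A browser applying this would silently accept `sid` and refuse `_tracker`, with the remaining question (whether the labels are honest) being a legal matter rather than a technical one.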
Yeah, I know what you meant. And every site will list all cookies as necessary. So then you need to pass a law that says sites need to be honest about which are necessary and which aren’t…because browsers can’t actually determine this.
Right, the crucial part is legislation. The context is that the original comment said "EU could have requested browser vendors to implement a mechanism to accept or reject cookies".
What we're missing today is:
- An automated way of having browsers know what cookies are used for (What I'm suggesting is quite easy), and
- Legislation to prevent companies from lying (What the general context is about)
That would be the general fix for the entire situation.
No, we’re only missing this browser stuff you’re talking about. But again, if you’re going to legislate this and then trust companies to follow it, it’s just easier to have them implement it.
You know, that's the funny thing - this is actually pretty close to how one could do it right now. Nobody actually forces sites to show a big cookie banner; they choose to do it.
The user could have a small button on the side to open a dialog and to activate more cookies than just the necessary ones if one really wants to (I think).
It's of course not how you describe it on a technical level but from a UX standpoint.
True. Everyone (myself included) calls this a "cookie popup", but it's about tracking. "Do not track" was a better name, but otherwise a poor implementation.
I wondered why not make such tracking (or "cookie") prompts a part of webbrowser UI, standardized and sure to be compliant. With enforcement of what browser can enforce (disabling persistent storage), and leaving the rest (e.g. disabling IP-address based tracking) to the website.
The legislation doesn’t want to require specific technical implementations. They want to specify the legal requirements as independent as possible from whatever underlying technical mechanisms are used. That makes it more future-proof and at the same time also simpler than having to come up with a technical standard.
One potential reason is responsibility - if the browser just doesn't implement the required UI for that, who would be at fault? Would the browser vendor be fined, and for how much?
Now it's up to those doing the tracking to tell exactly what they collect and who they share it with, and ask for permission. If they screw it up, they get fined proportionately.
Make it opt-in and it's solved. Server doesn't receive "yes you can" header because browser didn't implement it? Cool, no tracking. Or if there is tracking, website is at fault.
Only the same thing if you do not agree. You cannot consent to just "any tracking"; the EU wants it to be explicit what you consent to (data collected and who it is shared with).
"Do Not Track" setting does not do anything but sending a header to all websites you visit. Plus, it was blamed to have an unfavorable default.
The whole popup mess these days is essentially about those facts - there are no easy per-site controls with no default state (with a possible default if the user consciously and explicitly configures their system as "accept all everywhere" or "reject all everywhere" - but this can't be out-of-the-box). Cookie prompts are just like that: per-site and without a default (you have to actively make a decision every time - accept or reject).
I'm arguing that it would've been better if all those consent prompts would've been unified and a part of browser UI (rather than website UI). Because of two reasons: 1) the UI and UX would be uniform, consistent, and not prone to design whims and dark patterns; and 2) because this way browsers would be able to guarantee that the choice is partially respected by actually not accepting cookies. DNT header would be still needed to tell website to disable server-side tracking (and one has to believe that website respects it - turning this into a legal matter), but the whole design is different.
(Just not like that annoying geolocation or camera popup prompts lol. A bar on top of the webpage would've been a sane option, I guess.)
Since the law requires it to be opt-in, having Do Not Track be the automated default without any prompts is not problematic but the whole point of the thing.
The default state should be DNT. Actually, using cookies for tracking, or any kind of fingerprinting for that matter, should be opt-in. The only reason it was opt-out is because adtech runs the web, and EU lawmakers are technically illiterate and influenced by the same tech giants they're supposed to regulate.
So there are two things here: making a browser setting opt-in, and making it actually enforceable. The DNT header was a step in the right direction that got swallowed by corporate influence.
I wouldn't trust Google with something like this. They might just implement it in an almost-correct but useless way in Chrome and use their leverage and lawyers to stall for as long as possible.
sofixa's point that this isn't really about "cookies" aside, why should the state legislate that a mostly-well-behaved actor has to do additional work to deal with bad actors, instead of (at least trying to) address the bad actors directly?
HN already whines about how hard GDPR is to comply with, can you imagine how bizarro it would be if the EU regulators were chasing after Firefox and not Facebook?
> why should the state legislate a mostly-well-behaved actor have to do additional work to deal with bad actors
This is pretty much just how regulated industries work. One jerk does something shady or dangerous and then everyone else has to do extra paperwork and inspections in order to prove they aren't doing shady or dangerous things too. It isn't the regulator's fault; it's the jerk's fault.
Somewhat ironic that the cookie banner of theverge.com is using the same tactics/patterns (or even worse, according to UBlock Origin) that Google was fined for.
Yes, and it’s also not conforming to the regulations that made them add the banner in the first place!
I have no idea why anyone chooses to add a banner and then add one that is obviously in violation. Why not then just not add one? Is it because these cases are not yet enforced/fined so they think “let’s put an obnoxious non-compliant one for now, so we keep ad revenue, and only switch to a compliant one if we are actually sanctioned to do so, or the company across the street is fined to oblivion for doing it?”
They probably don't actually realize it's not compliant. They just saw everyone else doing it and mindlessly followed the herd. Users probably also think they are compliant and blame the EU.
I can believe this argument for a handmade mom & pop site. I will absolutely not believe it for any large tech company such as Google or Facebook or "consent management platform" providers such as TrustArc - that scum knowingly breaches the regulation because their entire business relies on it.
Worse, it's actually just a request to disable tracking, just like the unsubscribe link is a "if you get around to it in 6-8 weeks put my address on a do-not-contact list"
Without the very real fear of meaningful fines for tracking after rejection, it's just lip service
It would be fascinating to see the design review document and resulting launch metrics for this. Somewhere deep in Google there is a written justification for the previous dark pattern.
Or they CC'd lawyers in all the discussions, ostensibly to receive legal review but actually so they could later try to hide those discussions from discovery under the pretext of it being privileged attorney-client communication.
This isn't about just the button. Until a few years ago, "you're not the customer; you're the product" was the norm for a typical user's interaction with the internet, and they were powerless to change that.
Then the GDPR came along, declaring that users have a fundamental right to their data, and as such, they can no longer be forced to be "the product" without their consent. One of its most empowering rules, however, is in Article 7 (4):
"When assessing whether consent is freely given, utmost account shall be taken of whether, inter alia, the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract."
So consent must be (1) freely given, and (2) it's not free if you're blocking access to service A by requiring consent for service B, when B is unnecessary for performing A.
Hence, a search engine cannot force you to consent to tracking for advertising purposes, because technically, the search engine doesn't need it.
So how can the search engine make money? One popular way that has already been ruled as legal is to offer two plans: a paid plan with no ads and tracking, or a free plan with ads and tracking (in essence, it's a paid plan and you're paying for it with tracking).
> So how can the search engine make money? One popular way that has already been ruled as legal is to offer two plans: a paid plan with no ads and tracking, or a free plan with ads and tracking (in essence, it's a paid plan and you're paying for it with tracking).
Google search was likely profitable with keyword ads long before any tracking was involved.
> One popular way that has already been ruled as legal is to offer two plans: a paid plan with no ads and tracking, or a free plan with ads and tracking (in essence, it's a paid plan and you're paying for it with tracking).
Where has that been ruled legal, and do you have a link to the ruling? I've only seen German newspapers do this, so my assumption has been that it's just the German authorities turning a blind eye to it. If it's really a legit option, it seems like a miracle that nobody else is using this.
> Where has that been ruled legal, and do you have a link to the ruling?
This was communicated to me by a law firm specializing this area. This was with regards to the newspaper "derstandard.at".
Here's a source in German which includes the ruling as a PDF [1].
I see that in the meantime, this "pay or okay" model has been questioned again, but in any case, a ruling exists. And, as I was told, the national DPAs don't just rule as they see fit, but rather coordinate with other DPAs, in order to harmonize the enforcement across the EU.
> If it's really a legit option, it seems like a miracle that nobody else is using this.
None of the big sites are using it because it's frequently far more profitable to track people. Just look at various revenue-per-user stats.
And the users most willing to spend money for no-tracking tend to be the users who also spend money on other things, so they're the users you'd want to advertise to the most.
> None of the big sites are using it because it's frequently far more profitable to track people.
That doesn't explain it. If the only options on accessing one of the "big sites" were to see ads and be tracked, or to pay €200 / year and not be, it is pretty obvious that fewer users would opt out of the tracking than now (when it is free).
If a consent dialogue that required one more click for refusing than accepting is not legal, how can this be?
One reason that most sites might choose to make their money selling ads over selling subscriptions is that the former is much less of a hassle when it comes to taxes.
If your site makes its money selling ads, that is most likely of interest only to the tax authorities where your company is located. It doesn't matter where your site users are located. Your advertising income will just be another entry on normal business income taxes.
If your site makes its money selling subscriptions, that money is business income on your normal business income taxes but also likely will be subject to sales taxes in the places where your subscribers are located.
If you have customers in the EU, you'll need to collect and file quarterly VAT there (the EU has its act together so you only have to register in one country, and file a quarterly report breaking down sales by country, and that one you registered in will deal with distributing the money to the rest).
If you have customers in the UK you will have to register separately with them, because the morons implementing Brexit failed to arrange to stay in the unified online VAT system of the EU. (Not that you can currently actually register in the UK. We applied for our VAT registration number something like two years ago and they keep saying that HMRC is very busy and will eventually get to it. In the meantime we are supposed to collect and hold UK VAT, but not supposed to call it VAT).
If you have customers in the US you might have to deal with sales tax in 45 states. Unlike the EU VAT the rates are not uniform within a given state. You have to deal with different rates in each county, possibly each city, and possibly even in different areas within a city--and tax boundaries do not always line up with zip code boundaries so to do it right you have to collect exact address information for your subscribers.
These companies already make money directly from consumers all across the EU and would have the processes in place. And again, the point isn't even particularly to make money from selling the subscriptions. It's that if it's legitimate to demand either payment or tracking, then it's trivial for companies to force consent to tracking.
> If a consent dialogue that required one more click for refusing than accepting is not legal, how can this be?
In the derstandard.at model, the site is inaccessible (and also does not set any cookies, or perform any other tracking options) until you choose either A or B, so the amount of clicks are the same.
> That doesn't explain it. If the only options on accessing one of the "big sites" were to see ads and be tracked, or to pay €200 / year and not be, it is pretty obvious that fewer users would opt out of the tracking than now (when it is free).
It's too soon to say as the option only appeared recently, but taken to the extreme, if every single user of Google services opted out of tracking, Google would probably end up with a revenue problem and re-assess the situation.
> In the derstandard.at model, the site is inaccessible (and also does not set any cookies, or perform any other tracking options) until you choose either A or B, so the amount of clicks are the same.
The amount of clicks is definitely not the same; choosing to subscribe will certainly require filling in a dozen fields of personal information, provide credit card details, etc. The friction difference between 1 vs. 2 clicks is trivial compared to the friction added by a full subscription flow.
I don't speak legalese but your link does not seem to be about any legally binding ruling but simply an interpretation by the data protection agency. It also mentions that users have alternate ways to read the newspaper (offline) so it is not just a choice between paying and tracking.
The UK's ICO took a different stance[0] when the Washington Post tried to do this a few years back. For companies that want to do business in the UK, it probably makes sense to follow that more conservative decision.
The decision that you link also seems very much at-odds with the text of the GDPR (in both the German and English versions):
> (42) Consent should not be regarded as freely given if the data subject has no genuine or free choice or is unable to refuse or withdraw consent without detriment.
Interestingly, the decision that you linked prefers to rely on the case law of the data protection authority when interpreting the question of consent, in particular referring to rulings that predate the GDPR, despite its refinement of the concept of consent. It also focuses upon 'wesentlicher Nachteil' (significant detriment) where the original text of the GDPR prohibits just 'Nachteil' detriment. I find these choices rather suspicious, and wouldn't be comfortable with relying on them holding if challenged in other EU states.
Indeed, I overlooked that the ICO took a different stance. I have to admit that I did not research this myself, I just took the word of the (top-tier) law firm back then.
> The decision that you link also seems very much at-odds with the text of the GDPR (in both the German and English versions):
>> (42) Consent should not be regarded as freely given if the data subject has no genuine or free choice or is unable to refuse or withdraw consent without detriment.*
I don't see why they should be at odds with the decision? Quoting:
"Gibt eine betroffene Person keine Einwilligung ab, so besteht die erste Konsequenz darin, dass diese ein O*-Abo abschließen kann. Dieses O*-Abo ist – wie festgestellt – frei von Werbung, frei von Daten-Tracking und frei von der Setzung von Fremdcookies. Das O*-Abo ist mit einem Preis von 6 Euro monatlich ab dem zweiten Monat auch keine unverhältnismäßig teure Alternative."
It's obviously without question that one can charge something for a service rendered -- the newspaper does not have to offer this for free.
And in the above text, the authority then opined that charging 6 Euro for a monthly subscription to read a newspaper is not unreasonable, hence the decision to opt into tracking is a free one, because the alternative is simply to pay 6 Euro for a month's access.
Indeed, that is what they decided. However, I still find the interpretation very surprising.
The GDPR reads as if its authors had a different understanding. For example, the 'data minimisation' principle indicates that you should not be collecting personal data unless doing so would prevent the desired activity; serving an article, or even serving an article with adverts, can both be achieved without this data. There's also a question over whether the services are the same: a subscription appears to be materially different to one-shot access to an article, a question which is entirely overlooked in their analysis.
Beyond the weakening of 'detriment' to 'significant detriment' by reference to their pre-GDPR decisions, I also find the result unusual. The practical interpretation is that, if a user wishes to read 100 articles from 100 service providers all using this scheme, the cost of privacy is a significant detriment at €600. I am not confident that other data protection authorities will be as eager to jump to this same weakened interpretation.
There is probably a safer middle ground: giving the user the option to pay for either the article or the subscription, and allowing for either direct payment or payment by proxy through an advertising broker. In the latter context, the user's relationship and data-flow is controlled by the advertising broker, and the service provider needs no data relationship with the user. Of course, this shifts the controller liability into the advertisers -- something that they would probably prefer to avoid -- and I'm not aware of any services offering this or concrete decision on it yet.
> The GDPR reads as if its authors had a different understanding. For example, the 'data minimisation' principle indicates that you should not be collecting personal data unless doing so would prevent the desired activity
Indeed. It could reasonably be assumed that this is the main reason why the paid version does not use any tracking.
> There's also a question over whether the services are the same: a subscription appears to be materially different to one-shot access to an article, a question which is entirely overlooked in their analysis.
Well, the service is being offered on a monthly basis alone. The customer may only desire a one-shot access, but that offer is simply not on the table.
I may only be interested in one-shot access to [some-Netflix-movie], but the smallest access unit Netflix is willing to sell me is a month. Same goes for certain gym memberships, etc.
> The practical interpretation is that, if a user wishes to read 100 articles from 100 service providers all using this scheme, the cost of privacy is a significant detriment at €600.
Accessing 100 different service providers is on the customer, though?
Same example as above. Say the customer wants to watch just one movie on Netflix, Disney+, and 98 other providers, all charging $10/month. $1000 per month sounds a lot but that's entirely on the customer; they could also just spend only $10 and watch 100 movies on Netflix.
> they no longer can be forced to be "the product" without their consent.
Don't use the service.
> when B is unnecessary for performing A.
Technically unnecessary, the same way that the existence of goods in a shop is not dependent on whether or not I pay for them. But it sure as hell is necessary for the shop to keep functioning.
The overwhelming majority of people are happy to sell their data for free services. How many Facebook movies need to be made before people are convinced that 1) the average person knows they are being tracked and 2) they would still rather be tracked than pay.
I mean ffs, Netflix is about to roll out an ad supported plan. The average person does not value their privacy nearly as much as the outraged HN'er would believe.
The whole idea of the law (at least as I see it) is to state that companies shouldn’t be allowed to trade in people’s information unless both parties of the transaction know exactly what the transaction is. And that means even people who don’t care.
That basically means the business model “give people a service in exchange for personal info they think they don’t mind sharing” isn’t ok.
It also, by extension, means that you are telling consenting people that some agreements aren’t ok for them to enter into. Which is also fine.
> The average person does not value their privacy nearly as much as the outraged HN'er would believe.
I think you have it backwards. Regulation is needed because people don’t care. Not the other way around.
Seat belt laws are there because people don’t care about personal safety enough.
This is not how Democracy works. If, as you admit, billions of people don’t really care…who are you to force your will upon them? You know what’s better for them than they do?
Wait isn’t this exactly what democracy is, i.e some legislative body writing laws that apply to everyone?
One might argue (perhaps especially in the case of the EU) that the legislative body doesn’t have public support some times. But that somehow laws are undemocratic because some fraction of the population doesn’t want them or doesn’t care? What’s the reasoning there? It’s even very common in a democracy to see lawmakers introduce laws that have a majority oppose them, because e.g it’s needed for society to function (tax increase would be the typical example that is rarely popular with 50%).
I don’t care about half the laws in the book most likely. But I elect representatives I trust to make things I might not like too. Democracy.
Also “billions”? These laws are made for half a billion Europeans, and similar laws exist e.g. in California. What’s going on with the rest of the billions is not really the concern of EU or California voters because … Democracy.
1) The true extent of the tracking was disclosed. Before the GDPR that wasn't the case.
2) In practice, this only works for small services with lots of competition. For services which have a monopoly and/or oligopoly (and all of them are equally bad), this isn't a solution.
Maybe the GDPR can be relaxed in a few years/decades after other regulations against monopolies take effect or the monopoly/oligopoly problem self-resolves, but in the meantime I think the current state of the GDPR is valuable.
Now can Google please also respect the DNT header and automatically reject all cookies if present? That would demonstrate real leadership and putting the user experience first.
Where can I buy a subscription that gives me all of Google's services with no ads & stalking? As far as I know you can pay for some stuff such as Workspace or YT Premium but some services such as Search or Maps are still ad-supported and no doubt stalk you anyway (since there's at least one ad-supported service, the incentive to stalk the user remains).
We've already encountered that with "Do Not Track"—as soon as you have anything that doesn't require user intervention, websites start arguing that it doesn't reflect the users' intention, and so they have to protect us from the nasty browsers by tracking us.
To be fair, the DNT launch was botched from the beginning, starting more as hack than an industry-wide consensus [1]. While it eventually got implemented by browsers, it lacked adoption, and had risks with fingerprinting [2]. The nail in the coffin was when Internet Explorer 10 decided to enable it by default [3], completely disregarding user intent.
Certainly not at the near 100% level that the default setting suggests. Microsoft poisoned the well with DNT and worsened privacy on the web for everyone.
I can believe that there are some people who don't care if they're tracked, but do you believe that there's anyone who wants to be tracked?
Maybe someone out there somewhere does, but surely such people, who actively want to be tracked, are in the distinctly small minority. In that case, why should the onus be on everyone else to communicate their intent, rather than on the few users affected to communicate their intent?
>why should the onus be on everyone else to communicate their intent, rather than on the few users affected to communicate their intent?
Because this effectively bans any kind of tracking cookies which, while most are kind of awful, have legitimate reasons for their existence. Shifting the conversation from a user choice to an effective ban is a completely different conversation with pros and cons that must be considered separately.
> Because this effectively bans any kind of tracking cookies which, while most are kind of awful, there are legitimate reasons for their existence. Shifting the conversation from a user choice to an effective ban is a completely different conversation with pros and cons that must be considered separately.
It doesn't at all ban them—it just makes them only effective for users who explicitly opt in. And if that's too much of a burden to impose on those very few users, then why is it reasonable to impose the burden on the vast majority of users who don't want to be tracked?
I still don't understand how this has anything to do with "user intent". What makes you think that the default user intent is to allow tracking? Would it have been better if the browser asked the user to choose? Do you think user intent would have been respected if it was presented as an opt-in setting? (ie. 99% of user would just click ok without opting in)
The reason why this flag doesn't work has nothing to do with user intent. We wouldn't see all these GDPR banners that make it difficult to opt out if anyone actually cared about user intent.
> "We make money by selling a snippet of code to websites that integrates with Super Agent. Essentially, websites can have a JS snippet unique to them so that when a user with Super Agent visits, cookie preferences are applied automatically without having to ask anything."

https://www.super-agent.com/faq
Cookies are just completely broken. The EU should never have got involved in the way that it did. No matter how positive the intentions, the web is a worse experience as a result, with marginal privacy gains.
The focus on cookies was always a bit off, more a result of too much technical detail causing laws to miss their intent. The legislature moves slowly; over time, this will be fixed. However, legislation regulating how web services have to handle data privacy was very necessary (and the people of the USA should really consider amending their constitution by also demanding a basic human right to data privacy). The key elements are "informed choice" and "consent to data gathering/processing", which have little to do with cookies.

Let's say you buy a smartphone from China and it comes with a keyboard app that sends all your inputs to a Chinese company so they can make predictions and offer autocompletion. You kind of want that app to display a banner asking you if that is okay. And you kind of want a privacy policy attached that explains they will create user-specific profiles, sell them to advertisers, and share them with the Chinese Ministry of State Security. I think you want that banner.

Now Google Analytics isn't much different. It tracks you all over the web, creates profiles of your browsing habits, sells those to advertisers, and shares them with the American National Security Agency. Sure, it also shows statistics to the website owners, the same way that keyboard app has an autocomplete function, but you kind of want to be informed about those other functions and have the option to say no, don't you? That is why 'consent management' is so important for data privacy.
I'm really hoping Do Not Track becomes legally binding. (Also, how is it not already treated like a piece of a contract negotiation? It is machine-readable and sent on every request. Hidden website EULAs are already treated like contracts.)
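Honoring the signal server-side would be nearly trivial, which is part of what makes the situation frustrating. A minimal sketch, assuming an Express-style lowercase header map (the function name is mine; the `DNT` header is real, as is the newer `Sec-GPC` header from the Global Privacy Control effort):

```typescript
// Minimal sketch of honoring Do Not Track (and Global Privacy Control)
// server-side. "1" means the user objects to tracking; absence means
// no preference was expressed.
function shouldDisableTracking(
  headers: Record<string, string | undefined>
): boolean {
  return headers["dnt"] === "1" || headers["sec-gpc"] === "1";
}
```

A site that wanted to respect the signal would simply skip loading its analytics and advertising scripts whenever this returns `true`, with no banner needed.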
I would rather use Lynx than any more creepy JavaScript.
When I want "experience" (a concept I loathe because it is a euphemism in all senses, and somehow arrogant and naive at the same time), that is the role of a desktop program. And it had better ask me and inform me whenever it wants to perform a network request.
I saw this the other day on YouTube: it appeared, I saw the middle Reject All button, clicked it, closed and reset the browser, tried again and it didn't appear!
Were they testing bots?!?
Regarding the Reject All button: it's about bloody time. These tech giants have plenty of other surveillance methods at their disposal if they want to go down that creepy criminal online stalking route, exploiting people's lack of knowledge to make sure shareholders and employees are getting their top dollar.
No, there is more surveillance than meets the eye and more hacking in plain sight than most people realise. After all, when is a company not organised crime?
Dude chill. As the parent comment said, it was probably just an A/B test. Google does that a lot, even with their old consent dialog, sometimes they used buttons and sometimes they used radio checkboxes.
Does this "reject all" accept the "legitimate uses" which very clearly break GDPR?
Most GDPR dialog windows I've encountered have made it harder to opt out of what they incorrectly claim to be "legitimate interests". According to the GDPR, a "legitimate interest" has very clear requirements, which are by no means met here. These "legitimate interests" very often include things like "creating a personalized ad profile and tying it to external data lakes and devices", which is by no means necessary in order to provide the service.
I'm waiting for what has become the de-facto abuse of GDPR to have a serious reckoning.
The legitimate interest ticks are just another way for the scum of the web to break the law. I hope the makers of these popups will at some point get fined to hell because of their sneaky attempts to smuggle tracking into the browsers of people who click the "fuck you and fuck your cookies" button.
If France was breathing down Google's neck in the design of this feature (or as Google puts it, "providing specific direction"), I assume this does not have any legitimate-interest bullshit.
I've noticed when I go to configure my cookies on European sites, most of them default to all the tracking stuff turned off. Is that typical? Is it required by the law? They still do everything they can in the UI to encourage you to "accept all", but there's generally a single button to click to "reject all" and it'd be more work to pick and choose.
Yeah, it's a requirement by law: tracking must be disabled unless explicitly allowed, and accepting all must be as easy as denying. However, a lot of sites offer an easy one-click accept-all while the deny-all is behind a two-step "configure" + "confirm selection", sometimes even with a fake save timer.
Thanks! I looked for info on this and failed; is there some reference I can share with people about how this part of the law works? Apparently it's not completely obvious, given that Google had failed to comply with it.
More privacy if you assume they actually respect the GDPR internally (but this occurrence is just yet another example of bad faith that suggests that the entire thing might be a charade and they stalk you anyway regardless of your settings).
update: they added it and with EU vpn you can press the reject button!
A. Will other websites follow? (It's so stupid that we had an "accept all" or "more information" button but never a reject button. It's crazy.)
B. Does it do anything really? I never save cookies in the first place.
In my opinion the whole 'cookie banner industry' sucks. From a customer perspective I'm completely annoyed with different types of banners wasting my time with searching the right button, waiting because of artificial delays, clicking through layers of fake settings to find the reject option and other dark patterns.
However, from a provider perspective things are not better: an unnecessary waste of time and money looking for plugins and services to deal with cookie walls to avoid GDPR problems.
The best thing would be the EU enforcing the use of a standard browser API to ask for tracking, just as simple as asking me if I want to share my location or webcam, with an option to remember for this domain.
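Such an API does not exist today, but it could look something like the existing permission prompts. The sketch below is entirely hypothetical (`TrackingConsent`, `request`, and the remember-per-domain store are all invented); a real implementation would be asynchronous browser UI rather than an injected callback.

```typescript
// Hypothetical browser tracking-consent API, modeled on the location/camera
// permission prompts. The askUser callback stands in for browser UI so the
// flow can be exercised without a real browser.
type Decision = "granted" | "denied";

class TrackingConsent {
  private remembered = new Map<string, Decision>();

  request(
    domain: string,
    askUser: () => { decision: Decision; remember: boolean }
  ): Decision {
    const saved = this.remembered.get(domain);
    if (saved) return saved; // remembered for this domain: no prompt shown
    const { decision, remember } = askUser();
    if (remember) this.remembered.set(domain, decision);
    return decision;
  }
}
```

The key property is the "remember for this domain" path: after one answer, subsequent visits never show a prompt at all, which is exactly what cookie banners fail to provide across sites.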
I really want the web browser industry to come together and form a new kind of P3P standard, complete with some example libraries for people to use on the backend.
Sadly, the browser market is dominated by Google, who has a direct interest in tracking people; Apple, who sticks to proprietary protocols unless they absolutely have to; Microsoft, whose stalking exceeds even Google's at this point; and then a tiny sliver of well-meaning but overall badly managed open source projects.
Relying on the DNT header is difficult as "tracking" can be interpreted in a number of ways, especially by the data vampires of the advertising industry, who have developed many nice words to make their business sound so harmless. We need a better protocol, implemented across the board, to automate away these ridiculous popups. If a sufficiently flexible protocol exists, I'm sure it'll be taken up by either Europe's DPAs or even new legislation, though existing legislation should already be sufficient.
The EU should not, and generally doesn't want to, specify which technologies get used because technologies develop faster than bureaucracy. The hastily thrown together Brexit accords mention Netscape Navigator and ancient, insecure, outdated cryptography because they decided to include that in legislation many years ago and the accord was just a combination of existing EU and UK laws thrown together. We don't want that to happen again, especially on a larger scale.
Though the current state of the web is depressing, the rise of piracy and TOS-breaking VPN use among the general public has made it quite easy and inconspicuous for people to use VPN services.
You can effectively get the protection of any EU-based citizen by setting up a VPN with an endpoint in Europe. There are dirt-cheap options thanks to the many YouTube sponsor discounts, and there are privacy-first VPN providers like Mullvad who are pricier but better if you care about your traffic.
Someone needs to step up to Google on this side of the Atlantic. They are constantly in the news regarding their tracking. Tracking should be off by default, and all cookies should be banned except session cookies that expire when the browser closes; we already have saved passwords and logins in browsers.
At least in the past, I remember seeing the same YouTube recommendations even after deleting all cookies etc. from the browser. So Google was fingerprinting you one way or the other, and I doubt this changed. So what is the point of not using cookies? They are tracking you anyway.
Any company that didn't have this set up already can reside at the bottom of the ocean with the rest of the bottom feeders (no disrespect to the mussels)
That's the intent of the law/regulation, but indeed I very much doubt they will just comply and call it a day.
Most likely in a couple years it will emerge that there's one more way of stalking that conveniently wasn't being disabled by this "disable all" button.
It's sad that even several instances of bad faith along with a business model based on breaching the GDPR is still not enough for the record-breaking "4% of global turnover" fines to actually appear.
GDPR applies even if you use pen and paper; you still need to ask for permission. But in this case it was a dark pattern: Google had no choice but to ask for permission, yet made it hard to deny.
The [...] you omitted is "and to the processing other than by automated means of personal data which form part of a filing system or are intended to form part of a filing system." - if your company takes notes on your customers with pen and paper and puts these notes in a drawer for further use in your business processes, GDPR definitely does apply.
A random real example: I used to work in a building which had a paper logbook where people signed their name and the time when taking/returning keys for the meeting rooms. That logbook falls under GDPR as it contains personally identifiable information; there's a legitimate-interest use case justifying it, but if the company suddenly wanted to use the stored data for some other purpose, that might be restricted.
GDPR is applied to personal data in general. It is "General Data Protection Regulation".
And it states in (15):
--- start quote ---
In order to prevent creating a serious risk of circumvention, the protection of natural persons should be technologically neutral and should not depend on the techniques used. The protection of natural persons should apply to the processing of personal data by automated means, as well as to manual processing, if the personal data are contained or are intended to be contained in a filing system
--- end quote ---
And in Article 2, emphasis mine. It also lists what it doesn't apply to.
--- start quote ---
1. This Regulation applies to the processing of personal data wholly or partly by automated means and to the processing other than by automated means of personal data which form part of a filing system or are intended to form part of a filing system.
2. This Regulation does not apply to the processing of personal data:
(a) in the course of an activity which falls outside the scope of Union law;
(b) by the Member States when carrying out activities which fall within the scope of Chapter 2 of Title V of the TEU;
(c) by a natural person in the course of a purely personal or household activity;
(d) by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security.
--- end quote ---
And in Article 4. Definitions
--- start quote ---
(2) ‘processing’ means any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction;
--- end quote ---
It is available on Android. I don't think they can legally track you until you provide consent. They might still do it, but I'll take that chance to make the web less hostile.
Rejecting all cookies would be disastrous. You wouldn't be able to log in to any website (unless it uses JavaScript storage for logging in, which is less secure than cookies).
And by the way, you can already disable all cookies for a particular website, or for all websites, in both FF and Chromium. Just click the lock (HTTPS) icon > More info and choose the cookie setting.
Extensions that make the browser reject cookies entirely exist, are trivial, and have nothing to do with opting out of consent walls. Virtually nobody uses such extensions because
> You wouldn't be able to log in to any website
Rather, it's about interacting with these consent walls in an automatic manner to block the tracking cookies that aren't necessary for the website to function. You know, the part that needs consent.
But so long as the general public, heck, even techies continue to believe that antiquated lawmakers had no idea what the heck they were talking about when they made all cookies require opt-in (spoiler: that's not what it says), I guess we'll continue to accept consent walls because there is no critical mass to oppose sites that employ them.
I mean the EU can force websites to do whatever. That includes implementing a feature. As we saw with cookie consent law and GDPR. And as we're going to see with interop requirements for chat apps.
Cookie banners only exist because websites want to collect data and track users. The banner is a symptom of the real issue, which is what the regulation aims to fix.
These consent forms are such a sabotage on the original idea.
The idea being, as far as i remember, you set your preferences once.
In a container on your machines under your control.
Then the site and the container negotiate. Either the site is willing to accept your preferences, or it declines to show itself to you, or it presents you a "negotiated-down" version.
No clicking. No visible banners. No large forms and lawyer legalese.
Just a privacy level setting for the web via standardized API. Europe at least tried but dropped the ball.
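The negotiation described above can be sketched in a few lines. Everything here is invented for illustration (the privacy levels, `SiteOffer`, `negotiate`); the original proposals along these lines (P3P and friends) were richer, but the core logic is this simple:

```typescript
// Sketch of the banner-free "negotiation" idea: the user agent holds a
// privacy level set once, the site declares what it requires, and the
// outcome is computed without any visible prompt.
type PrivacyLevel = 0 | 1 | 2; // 0 = no tracking, 1 = functional only, 2 = full

interface SiteOffer {
  required: PrivacyLevel;     // the minimum level the site insists on
  degradedAvailable: boolean; // can the site serve a reduced version?
}

type Outcome = "full" | "degraded" | "refused";

function negotiate(userMax: PrivacyLevel, offer: SiteOffer): Outcome {
  if (userMax >= offer.required) return "full"; // site accepts your preferences
  if (offer.degradedAvailable) return "degraded"; // negotiated-down version
  return "refused"; // site declines to show itself to you
}
```

The point is that the user's setting lives in one place under their control, and every site comparison happens automatically, so no per-site clicking is ever needed.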