
I'm no Facebook fan, but the chronology seems remarkable here. Facebook were irritating Rupert Murdoch, and traditional media in general, by refusing to pay fees for articles that appeared in people's feeds. About a month before everybody started running hit-pieces on Facebook, the negotiations broke down, with Facebook telling the news guys to go take a hike.

Suddenly, everybody is absolutely horrified by the fact that Facebook is selling people's data. Despite the fact that it's always been their explicit business model.

Facebook is absolutely a horrible idea, but the news media are hardly being impartial. It's incumbent publishers trying to knock out a new rival, using their reputation and power to get the newcomer dragged through the mud by politicians. Nobody should be surprised that the most news-sensitive, not the most privacy-sensitive, governments were the ones to attack Facebook most directly. The UK, for instance, is perhaps the most invasive surveillance state west of China - and yet, they've been leading the charge against Facebook. Despite the fact that the same government has floated ideas about monetizing government data about its own citizens!




I mean, that's just revisionist history. Yes, everyone knew Facebook was selling your data. That's always been clear, but just since 2016 I can think of numerous legitimate stories:

1. Russian interference in the election and the fact that FB knew about it and did nothing.

2. FB running psychological experiments in the newsfeed not only to see if they can make you spend more time there (questionably ok) but also to see if they could affect your mood (definitely not ok).

3. Facebook censoring news in the newsfeed based on political leanings.

4. FB tracking you even after you log out.

5. Zuck lying on Capitol Hill about what he knew and when.

6. FB hiring a PR firm to write negative articles about an investor.

Also to be clear, even if Murdoch was the catalyst for the media turning a critical eye on FB, which is at best unsubstantiated conjecture, it is undeniable that very valid, very concerning stories have been revealed as a result of the media's renewed scrutiny.


> 1. Russian interference in the election and the fact that FB knew about it and did nothing.

"the fact that FB knew about it and did nothing" is unsubstantiated. The person I trust the most on this is Alex Stamos, who actually left Facebook due to tensions over this, and he's gone on record saying there was no obstruction.

> 2. FB running psychological experiments in the newsfeed not only to see if they can make you spend more time there (questionably ok) but also to see if they could affect your mood (definitely not ok).

From 2014. I'd also disagree with calling A/B tests to increase time spent "questionably ok", though.

https://www.theguardian.com/technology/2014/jul/02/facebook-...

> 3. Facebook censoring news in the newsfeed based on political leanings.

I can't find anything online about this? Unless you're talking about filter bubbles, in which case I think that's a reasonable criticism although I think describing it as "censorship" is a huge stretch.

> 4. FB tracking you even after you log out.

Again, from 2013.

https://www.dailydot.com/news/facebook-shadow-profiles-priva...

> 5. Zuck lying on Capitol Hill about what he knew and when.

Can't find any news articles about this.

> 6. FB hiring a PR firm to write negative articles about an investor.

Fair criticism.


I also have to add, most if not all of these apply to other big internet companies. Hell, Google took pride in all the information it could figure out about you, even when you were logged out, to target your search results; it didn't even show up when called to testify. And YouTube is as much a highly-optimized entertainment and news source as Facebook is, with similar faults.

The fact the media are sharks doesn't absolve Facebook's sins. The media ARE sharks though.


Facebook is in a negative news cycle. That means three things:

a) Journalists ignore your PR team.

b) Journalists just parrot and rewrite what other journalists report.

c) Some journalists are doing real investigative reporting to reinforce these views.

How does it end? The news cycle changes. People forget things. Worse things happen somewhere else.

I’m a lot more concerned about what is being ignored. Facebook has apps which people can, despite what is claimed, fairly easily stop using. Google owns the full stack, operating system, browser, ad platform. It’s a lot more difficult for users to fully exit that environment in the event of abuse.


It could end when new problems stop being revealed in the press, and when Facebook starts to make real progress towards addressing these problems. Or Facebook can just pretend it's all unfair treatment from the mean news media. I got news for you, Facebook - there are real problems that need to be addressed.


> Google owns the full stack, operating system, browser, ad platform.

So does Apple, so does Microsoft. Why bring up Google specifically?


Neither of those are ad-tech companies that make their money from spying on users?


Targeted ads are a significant revenue source for all of these companies. That also doesn't explain why Google's stack is harder to leave than the rest.


> Targeted ads are a significant revenue source for all of these companies

I'd like to see some documentation for that claim. Company financial reports? What percentage of Apple or Microsoft revenue comes from selling ads or user data? (Not revenue raised indirectly by them spending on their own advertising)

> that also doesn't explain why Google's stack is harder to leave than the rest.

Parent comment wasn't comparing the difficulty of leaving Google with the difficulty of leaving Apple/Microsoft. The comment was about leaving Google in comparison to leaving Facebook, in the context of a discussion about personal privacy.

I think bringing up Apple & Microsoft is a non sequitur in this context.


I don't believe they release the numbers, but they have app search ads and News app ads. They shut down iAd, but it looks like they want to take another crack at it. https://www.wsj.com/articles/apple-looks-to-expand-advertisi...

Microsoft owns LinkedIn and Bing, which are very large ad-revenue services.

> I think bringing up Apple & Microsoft is a non sequitur in this context.

Bringing up Google alone was the non sequitur. You could argue it's nearly impossible to leave the big three altogether, but switching between them isn't a significant burden.


How does Apple make money from targeted ads?


Same way everyone else does. Apple has search ads and news ads; they had iAd for a while, but that was shut down and it seems they're pushing a new replacement: https://www.wsj.com/articles/apple-looks-to-expand-advertisi...


> Also to be clear, even if Murdoch was the catalyst for the media turning a critical eye on FB, which is at best unsubstantiated conjecture

Murdoch’s threats are well documented. At a 2016 meeting, “Murdoch conveyed in stark terms, Zuckerberg could expect News Corp executives to become much more public in their denunciations and much more open in their lobbying.” [0]

[0] http://uk.businessinsider.com/rupert-murdoch-reportedly-thre...


> I mean that's just revisionist history. Yes everyone knew Facebook was selling your data.

The fact that they sell data is probably the only thing that hasn't been a purported scandal. Rather, what the media started claiming was a scandal in 2016 was that Facebook had an API in 2007. Which somehow became an issue as soon as FB kicked media companies out of the newsfeed.


When we do it: A/B Testing

When FB does it: Psychological Experiments


I think most people would categorize this as Psychological Experiments vs A/B testing. Most companies don't A/B test whether they can intentionally manipulate people into generally feeling negative.

> To test that, Facebook data scientists tweaked the newsfeed algorithms of roughly 0.04 percent of Facebook users, or 698,003 people, for one week in January 2012. During the experiment, half of those subjects saw fewer positive posts than usual, while half of them saw fewer negative ones. To evaluate how that change affected mood, researchers also tracked the number of positive and negative words in subjects' status updates during that week-long period. Put it all together, and you have a pretty clear causal relationship. The results were published last month in the Proceedings of the National Academy of Sciences, which, for what it's worth, is a pretty prestigious journal.

https://www.washingtonpost.com/news/the-intersect/wp/2014/07...
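
For what it's worth, the mechanics of that design are easy to sketch. Here's a rough Python illustration of the assign/expose/measure shape the article describes (the word lists, function names, and drop rate are all made up for illustration; the actual study used the LIWC lexicon and Facebook's internal feed ranking, nothing like this):

    import random

    # Hypothetical stand-ins for the LIWC sentiment lexicon the study used.
    POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
    NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate"}

    def assign_condition(user_id):
        """Put roughly 0.04% of users into one of two experimental arms."""
        rng = random.Random(user_id)   # deterministic: same user, same arm
        if rng.random() > 0.0004:      # ~99.96% of users are untouched
            return "control"
        return "fewer_positive" if user_id % 2 == 0 else "fewer_negative"

    def filter_feed(posts, condition, drop_rate=0.1):
        """Probabilistically drop posts matching the suppressed sentiment."""
        suppressed = {"fewer_positive": POSITIVE_WORDS,
                      "fewer_negative": NEGATIVE_WORDS}.get(condition, set())
        return [p for p in posts
                if not (set(p.lower().split()) & suppressed
                        and random.random() < drop_rate)]

    def sentiment_counts(status_updates):
        """Outcome metric: positive/negative words the user then writes."""
        pos = neg = 0
        for update in status_updates:
            for word in update.lower().split():
                pos += word in POSITIVE_WORDS
                neg += word in NEGATIVE_WORDS
        return pos, neg

Comparing sentiment_counts between the two arms and the control group is what gives the causal read the article mentions.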


OK so they published the results of an A/B test. My point is that those two phrases mean the same thing, not that it wasn't a psychological experiment.


One describes the how and the other describes the what and the why. They're not the same thing at all.


Yup, but "do people click on a blue button more than a green button?" could also be accurately described as a psychological experiment. Every single A/B test that involves user interactions could be described as a psychological experiment. Yet we don't hear the mass media outrage directed at literally every other website in existence.
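
For comparison, here's what a garden-variety button-color A/B test looks like mechanically, as a minimal Python sketch (the names and the 50/50 split are invented for illustration):

    import hashlib

    def variant_for(user_id):
        """Deterministically split users 50/50 by hashing their ID."""
        digest = hashlib.sha256(str(user_id).encode()).digest()
        return "blue_button" if digest[0] % 2 == 0 else "green_button"

    # Toy click log; a real test would compare click-through rates per variant.
    clicks = {"blue_button": 0, "green_button": 0}

    def record_click(user_id):
        clicks[variant_for(user_id)] += 1

The deterministic hash just keeps a returning user in the same variant, which is standard bucketing practice.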


> Yet we don't hear the mass media outrage directed at literally every other website in existence.

Sorry, let's rephrase. "Harmful psychological testing"

Sure, A/B tests have a psychological component, although the tests don't specifically test for that, or they'd be after the reason WHY people clicked on blue vs green. They don't care why; they only care about which got more clicks, in order to choose which one to go with. They usually also tend to actually test something, like a feature. What feature was FB A/B testing? They weren't testing one. It was a pure psychological test, with harmful results. You honestly don't see a distinction?


> Yup, but "do people click on a blue button more than a green button?" could also be accurately described as a psychological experiment

No it wouldn't. Only to a person who knows nothing about psychology or about experiments would that phrase accurately describe a psychological experiment.


Which is which? Facebook’s study was “Does emotional contagion occur?” Right? So that’s “what”, not “why” or “how”.


The "how" is the psychological experiment. The hypothesis is the "what", as well as the "why" we are doing this.

There were a lot of ethical failures on the part of Facebook, such as the failure to get informed consent, the failure to review the experiment with an independent review board until after it had occurred, misleading said review board to say that the dataset was previously approved, etc., and all for results that the authors think were not very valuable: https://www.theatlantic.com/technology/archive/2014/06/every...


> also to see if they could affect your mood (definitely not ok).

Why is it ok for a bakery to spray the smell of fresh baked goods to affect your mood, but not ok for Facebook to attempt the same?


Scope and direction. The bakery can do everything in its power and still only affect a small subset of people in a certain location. Facebook, using everything in its power, can arguably affect a significant portion of living humans at the same time.

Second, direction of effect. A baker can spray scent to make you think about food: positive or, at worst, neutral effects. Facebook can manipulate what you see to induce rage, fear, hate, xenophobia, anything it wants, if it is effective.

Again, I think it really just comes down to scope. Regardless of what precedent there is for corporate interactions with the populace, we are at an unprecedented time with regards to the reach, immediacy, and ubiquity a corporate entity can exert on its users. And when you have a company whose product incentivises immense social pressure on others to join you can end up with huge amounts of influence in a very short time.

Again, looking at it from the precedents of the past you can say there's nothing wrong or different about it but we do not live in the past and these entities are having large and possibly damaging effects on our societies and the precedent with regards to that is to attempt to eliminate it, either through media shaming or governmental regulation.


Your first argument is implying that the ethics of an experiment is dependent on its sample size. That seems ridiculous.

Your second argument is just incorrect. Given that obesity is arguably the foremost health issue in the country, inducing appetite is not "at worst neutral."

The bakery example actually seems pretty comparable to FB. IMO they are both fine because the magnitudes of the negative effects (being made hungry by the smell of a bakery, being made angry by seeing a FB post) are both insignificantly small.


I believe the argument being made is more about the user's magnitude of engagement. Users interact with Facebook on a very close level - there's a level of emotional intimacy that users have with Facebook the platform that they don't have with the corner bakery. To draw an analogy, imagine if a bank suddenly decided to run an experiment to tell people that they had lost all their money and that there was no recourse. Wouldn't that be a pretty fucked up thing for a bank to do?


The bank would be lying to you and could cause real financial damage, e.g. a misled customer sells their car to pay rent because they thought they were broke. The bank equivalent of what FB did would be experimenting with different interior designs to make people less likely to come into the branch vs use ATMs. FB is a content delivery platform and it should be free to experiment with different content delivery algorithms. FB has no duty to show specific things to specific users the way a bank has a duty to provide accurate account information. If FB had such a duty, it could never change its algorithm. That would be ridiculous.


Just because FB doesn't publish the information itself doesn't mean it isn't liable for that information being on its platform.

I think your argument would make sense if Facebook were a "dumb" content delivery platform and had no advertisements. But it isn't: it's a smart one, collecting data on its users constantly, and making that information available to anyone who wants it in order to push possibly misleading content to its users.


I don't know, ethics? Why is it not ok for me to forcibly shoot you up with psychedelics anytime you come into my store? Probably because you didn't consent to it and it's potentially dangerous. There is a reason why you have to sign waivers before participating in most studies.


There's plenty to pick from when it comes to FB's behavior, but...:

> FB running psychological experiments in the newsfeed [...] to see if they could affect your mood (definitely not ok)

I don't understand how anyone could have a problem with this in principle. At Facebook's scale any change they make to their website will affect people's mood in one way or the other.

Would you prefer they didn't do experiments and just rolled out those changes without testing and hoped for the best? That would also be a sort of experiment, the experiment of closing their eyes to data they know is there.

The trolley problem doesn't go away just because you close your eyes as you blindly flip the switch.

Now, if we find that they ran such an experiment and found it was more likely to make people sad but also to make them spend more money, we'd have grounds to criticize them.


It wasn't A/B testing, or testing a feature; it was a psychological study (http://www.pnas.org/content/111/24/8788.full) whose goal was to manipulate emotions, not to measure the effects of Facebook features on emotions. There was also no informed consent from the users.


How can the goal of a study be to manipulate emotions, not to measure the effect of feature X on emotions? I haven't read the paper, but the abstract barely even claims the latter. Honest question.


In principle, no, but in execution? Sorry, no. FB completely dropped the ball. When you have that many users you can't just say "let me experiment and see if I can make people sad." The range of people on your service is too broad for that. All it takes is one suicide that can be accurately traced back to someone in that segment, and FB is fucked, deservedly.

For something like that, the name of the game is informed consent. Let people opt into or out of the experiment. That's what people conducting an actual study would do, not just randomly start experimenting on the public, which is essentially what FB did.


Well, sure, Facebook are a bunch of tech bros without any real interest in ethics. But I think the point is, they played that way right from the start - and most of those things are just what everybody does. It's not a story when a tech company tracks you when you're not even using their product. It's just ordinary operating procedure. Everything that's been aired about Facebook is a bit like that - in real terms, obviously awful, but in the world we live in? It's hardly chopping up a journalist and putting him in a suitcase. It's milquetoast, everybody-in-SV-is-doing-it stuff.

So color me revisionist, but I smell a rat when it makes the headlines.


I mean, if all of SV is doing what FB has been accused of, that is a story in and of itself. Maybe FB does typify or even exemplify SV tech-bro culture, but just because companies were once able to play fast and loose with the rules doesn't mean it's behavior that the rest of the world is okay with. Startups are mainstream now. And with all that new attention and money comes scrutiny, because you aren't just operating out of someone's garage anymore. Move fast and break things doesn't fly anymore when the things you can break affect millions to billions of lives and dollars.


This is just incredibly revisionist. Do many people actually believe this crap here? In SV?

Stop jumping in front of the rotting corpse of FB. No other software firm hires right-wing PR firms to propagate anti-Semitic conspiracy theories. They have repeatedly played fast and loose with the law. With the amount of revenue and influence under their control, they're no longer a tech startup that creates wealth by innovating or discovering a useful technology or service. They're a media company with tremendous influence, and they need to be treated as such. Which is why the press is trying to hold them accountable. Open your eyes and stop imagining conspiracy theories where none exist.


This is a great example of defining deviancy down.


> Suddenly, everybody is absolutely horrified by the fact that Facebook is selling people's data

Facebook does not sell data. Facebook simply provides a platform for advertising, and to their credit, because of Facebook's scale it's easy for people to run ad campaigns.

Other than that I absolutely agree with the arguments here. Traditional media is hardly impartial or even a good source of news.


Dude, it's so disingenuous to say the media is reporting on Facebook because Facebook told the media to "take a hike". Facebook still has the users and could still pivot, but this denial is hurting them at this point, not helping. The media is a whole separate problem and still makes lots of money from Facebook.


I'm no big Facebook fan, and as much as I like the idea of an impartial media, every new Facebook hit piece has seemed like a non-story run by a PR firm rather than an organic story.

I wonder if it's even some mix: an organized PR attack that makes the number and depth of stories seem like a critical mass, and so the rest of the news gangs up on them cuz it's the trending topic that gets the eyeballs. Self-fulfilling prophecy, etc.


"Selling people's data" is a very abstract concept. As technologists, we are able to imagine concrete implications. Everyone else brushes it off as they've "got nothing to hide" until hit with the actual repercussions.


Maybe someone hired Definers to do hit pieces on Facebook. That would be beautiful irony.





