
If FB are blocking things that they deem bad, then they are clearly setting themselves up as arbiters of morality, even if they say they aren't.



Facebook has opinions about the type of content they'd prefer not to facilitate on the Facebook platform. That doesn't make them "arbiters of morality"; it just makes them curators of their product identity.


So they might say "we don't want $X type of content on our platform, as it would negatively affect our product identity".

But that's in fact a moral judgement! It's saying "some people think $X is bad, and we agree with them (or at least choose to pretend we do, so as not to soil our image)".


I would disagree that it is a judgement based on morality. It is a judgement made by the logic of corporations, which are machines made up of people that take on a life of their own.

While the actions of organizations like this have moral consequences, and the people that are part of it have their own morals, the machinery itself does not use morality to make decisions, except in hindsight as justification.

The moral consequences may be beneficial or harmful; it is all the same to a profit machine that acts in its own interests, which transcend the people it is made of, its workers and users.


Saying that corporations should act to maximise their profits is in fact also a moral judgement.


I never put a value judgement like "should" on it. It just seems to me to be the dynamic that happens with organizations. They take on a life of their own, independent of the people they are composed of.


It's not one or the other: they can care about their product identity and, by the same stroke, arbitrate morality by doing so. (Product identity takes popular morality as an input.)


By that definition, they're no more an arbiter of morality than anyone else who does this.


What do you mean? Not everyone has the power they do.


The point is that if they're simply responding to consumer demand, then it's collectively us who are the censors.


If we were the censor, we would have the government do it (which represents us). Facebook's opinion of what they perceive to be morally dubious does not represent anyone but Facebook themselves.


> If we were the censor, we would have the government do it (which represents us).

Well, sort of. We can't legally, right? The Constitution prevents that. If Facebook is simply censoring the things that consumers ask it to, it is absolutely serving itself (in a profit-driven way), but it isn't asserting Facebook's morals. It's asserting something more like "American consumer morals".


Facebook's clients are not individuals; they're ad-centric. Facebook is serving what it believes to be advertisers' interests, which may not align with most people's.

Plus, the reason we can't legally is because we agree it shouldn't be done.


> Facebook's clients are not individuals; they're ad-centric. Facebook is serving what it believes to be advertisers' interests, which may not align with most people's.

But this is still ultimately about consumer morals, since it's about content that advertisers don't want to be associated with because that content will reflect badly on them in the eyes of the consumer.

> Plus, the reason we can't legally is because we agree it shouldn't be done.

We (generally) agree that the government shouldn't engage in censorship, yes. The claim that individuals and groups should not be able to moderate the content that appears on their own websites is much more controversial.



