
Yeah, the difference in my mind is that the phone company doesn't moderate content or exercise editorial control over what gets sent over its lines, but Facebook does. So if Facebook decided that this kind of user-generated content was acceptable to publish, it should be liable for the resulting problems.



I'm not sure Facebook is acting like an editor here. Editors approve every post, and Facebook can't practically do that. I'm critical of Facebook in this case because it was a large group and thus Facebook should have been aware of it, but it seems like a tall order to ask Facebook to moderate everything. Isn't this what Section 230 is about? Before it, you either moderated everything or nothing at all. 230 lets you at least try to moderate (because let's be real, you can't moderate a billion people).


Facebook does act like an editor by way of their algorithms. Facebook's (and other social media's) timelines are no longer limited to the people you explicitly follow.

Whether they have the ability to moderate or not is irrelevant. If you can't afford the obligations of a publisher, don't be a publisher.


But how do you codify that in law? I clearly want some moderation (taking down illegal content, borderline-illegal content, etc.), but if your choice is "moderate everything or nothing at all" (as I understand the pre-Section-230 situation), then no one is going to moderate anything at all. I'm not trying to defend Facebook here, but completely ignoring the nuance of the situation is disingenuous. You can't truly moderate a billion people; you can only do your best (and I'm not saying Facebook is doing its best). Removing the nuance of the situation to make an easy argument is exactly the problem that got us here in the first place, so let's not continue it.


I'm not talking about moderation here. I'm talking about the fact that Facebook promotes certain content in users' feeds (including content from authors these users have no direct relationship with: they're not friends with the author, nor do they follow them) in order to generate "engagement". This should stop.

Regarding moderation, it's true that you can't moderate billions of people with 100% accuracy, but you can discourage them from posting undesirable content in the first place by attaching real consequences such as a permanent ban (or a monetary loss, by charging an entry fee to create an account), or by making them earn the privilege of posting content (for example, not being able to post links until your account has a certain reputation for good behavior).

Discouraging people from posting bad content, and not amplifying the reach of bad content for engagement's sake should go a long way.
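The reputation-gating idea above can be sketched in a few lines. This is a hypothetical illustration; the `Account` fields and the `LINK_THRESHOLD` value are assumptions, not any real platform's API.

```python
from dataclasses import dataclass

@dataclass
class Account:
    reputation: int = 0   # earned over time through good behavior (illustrative)
    banned: bool = False

# Assumed minimum reputation before an account may post links.
LINK_THRESHOLD = 50

def can_post(account: Account, contains_link: bool) -> bool:
    """Gate posting privileges on account standing, as the comment proposes."""
    if account.banned:
        return False
    # New accounts can post plain text but must earn the right to post links.
    if contains_link and account.reputation < LINK_THRESHOLD:
        return False
    return True
```

The point of such a gate is that it raises the cost of spam and throwaway abuse without requiring human review of every post.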


There's something to this. The things people hate about Facebook aren't caused by some user sitting down and typing a post, and in fact that kind of content gets buried in the feed anyway. The problems start with "sharing the story at the top of the feed" and "your friend liked X article which is really an ad". It's a more aggressive version of 90s email chain-letters.


Absolutely. I don't think most of these people woke up one day and suddenly decided "hey let's storm the Capitol".

Instead, these people were groomed over a period of months or years by way of recommending conspiratorial or outrageous content and it finally blew up.

So not only did Facebook create the problem in the first place, they also had plenty of early warnings about what's been going on, but it's hard to consider an increase in "engagement" (thus revenue) as a "warning" and even harder to act upon it.

This also raises another question about the effectiveness of our intelligence services, if large-scale domestic terrorism was organized entirely in public on a platform they had privileged access to.


You're looking at it wrong.

This is free expression; they are providing a platform as free as going out into the street and expressing yourself.

If the government wants to censor this type of expression, it must pass laws, and it must dictate what you can and can't say. With such a move, social media platforms would become either forbidden or ineffective (as every post would need to be approved before being released).

If this is the world you want to live in, it's fine, but you need to understand the consequences and the ways to get there.


> This is free expression, they are providing a platform as free as you going in the street and expressing yourself.

The street does not automatically reshape itself to promote the most offensive graffiti to as many people as possible.

I wouldn't have a problem with Facebook being hands-off if feeds were limited to content you explicitly followed, in reverse-chronological order (as they used to be), since all you'd need to filter out the bullshit would be to not follow bullshit sources. Of course, Facebook's revenue would drop off a cliff if it made this change, because bullshit is what drives Facebook's revenue, not the friends-and-family pictures users originally came for.
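The two feed models being contrasted here can be sketched side by side. The `Post` fields and the engagement score are illustrative assumptions, not Facebook's actual ranking system.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int           # e.g. Unix time
    engagement_score: float  # assumed output of some engagement-prediction model

def chronological_feed(posts, followed):
    """The older model: only accounts you follow, newest first."""
    return sorted((p for p in posts if p.author in followed),
                  key=lambda p: p.timestamp, reverse=True)

def engagement_feed(posts):
    """The model criticized above: any post, ranked by predicted engagement."""
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)
```

In the first model, the user controls the filter (follow or don't follow); in the second, a high-engagement post from a stranger can outrank everything from people you actually chose to follow.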


Phones don't allow complete strangers to discover each other and work themselves into a frenzy over their shared angst.


Phones do in fact allow this, the difference is that it doesn't scale up in a 1:many relationship, and the phone users are not incentivized to use the phone more for this purpose over, say, calling their mom.





