But there are healthy connections and unhealthy connections. Might there be a way to encourage the healthy ones and discourage the unhealthy ones?
Just because social media works the way it does now doesn't mean it always has to work that way. These things are constructed by humans and we can change them if we want. There is nothing intrinsic in Facebook or YouTube that says they must promote "engagement" above all else.
Maybe they'd make a few fewer dollars; wouldn't that be sad.
> But there are healthy connections and unhealthy connections. Might there be a way to encourage the healthy ones and discourage the unhealthy ones?
There has been talk of Facebook actually using algorithms to make mental health interventions. The problem is that I don't think many users would find an automatically generated warning credible.
> Just because social media works the way it does now doesn't mean it always has to work that way.
Either you have something like a single, state-approved connection system, or each social media network will serve up anything that users will buy. That's driven both by the profit motive and by the fact that someone, somewhere thinks X sort of content is totally fine; both factors are in play. What's the plan to counter that?
They can make the interventions much more subtle and palatable than a dry warning. Subtly changing the feed to nudge the person onto another path could do this. It is manipulation, but maybe some people need it.
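To make the nudging idea concrete, here's a minimal sketch of what a feed re-ranking nudge could look like. Everything here is hypothetical: the `engagement` and `health` scores, the `rerank` function, and the `nudge_strength` parameter are invented for illustration, not taken from any real platform's ranking system.

```python
# Hypothetical sketch: re-rank a feed by blending an engagement score with a
# "healthiness" score, so the nudge shifts ordering gently rather than
# replacing the feed outright. All names and numbers are invented.

def rerank(feed, nudge_strength=0.2):
    """Sort items by engagement, softly adjusted toward healthier content.

    feed: list of dicts with 'engagement' and 'health' scores in [0, 1].
    nudge_strength: 0.0 keeps pure engagement ranking; 1.0 ranks by health alone.
    """
    def score(item):
        return ((1 - nudge_strength) * item["engagement"]
                + nudge_strength * item["health"])
    return sorted(feed, key=score, reverse=True)

feed = [
    {"id": "outrage_rant", "engagement": 0.9, "health": 0.1},
    {"id": "friend_update", "engagement": 0.8, "health": 0.9},
]

# With a small nudge, the healthier item edges out the high-engagement rant.
print([item["id"] for item in rerank(feed, nudge_strength=0.2)])
# With no nudge, pure engagement wins.
print([item["id"] for item in rerank(feed, nudge_strength=0.0)])
```

The point of the blend is exactly the subtlety argued for above: a small `nudge_strength` only flips the order of items whose scores are already close, so the user sees a feed that still feels like their feed.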