Well, neither "caused" those events per se; they were just places where people gathered, talked about doing them, and riled each other up.
That's one of my frustrations with this: these issues should be dealt with by police and governments and society as a whole. The total failure of those leaves us looking to Big Tech firms to do things they were never designed to do (change human nature, police speech AND make a profit). Things they really can't do without collaboration and intrusive behaviour. But enough of my hair splitting...
Re the percentages: any platform over a certain size will have racists (and pedophiles and other illegal or undesirable users). The question isn't whether someone on Facebook is being racist. The question is whether Facebook encourages it and whether they effectively moderate it when it is reported/discovered.
Facebook is huge and has much less racism etc. than Parler did, exactly because they ban it and moderate it. They're not perfect, but they're making an effort. That's all you can reasonably expect from a platform that allows user content.
Parler encouraged and refused to moderate the same content. That's how it grew from zero users up; that's how it marketed itself. It could have moderated (it removed all sorts of legal content that didn't fit its brand), but it took little to no action on violent right-wing groups. Quite the opposite: it encouraged and courted them.
Percentage measures reflect the effectiveness of the mods and the actions of the people running the platform, and that's what we can reasonably judge them on. Absolute measures are at least partly just down to the size of the platform.
The percentage also matters because Parler is small right now. If we ignore it because it's "only a night of rioting", what happens when Parler has 330 million users like Twitter? Or 2.7 billion like Facebook? You'll get a lot more than one genocide out of a mega-Parler...
Density also matters. If I browse Reddit, Facebook, Twitter, and Parler at random, how long before I see crazy conspiracy theories or calls to violence? That's what's concerning here as well: echo chambers. All the others are much broader, but on Parler you got a very right-wing, fact-free, and urgent set of inputs. That's much more likely to radicalise someone than the same amount of BS spread over a much bigger collection of cake recipes, kitten pics, and funny gifs. That's the other reason percentages (a density measure) matter.
At least that's how I understand it.
Back to my soapbox for a moment: I think we have a real problem with people who are not used to social media having no BS filter. That's the root cause of both the riots and the Burmese genocide (I assume that's what you're referring to; correct me if not).
We don't have a solution to that. Facebook et al. tried to solve it with dilution and moderation. Parler saw it as an opportunity to profit from and lean into. Those are two morally different business models, even if it's a difference of degree rather than of ultimate outcome.