Judging from what's happened with many similar activists, I'm guessing a hard-line anti-censorship mindset isn't compatible with today's social/political landscape in which rampant misinformation on the internet has direct effects on meatspace.
Of course, this is purely conjecture. It could be completely unrelated.
i used to be into hardline freedom of speech... now i acknowledge that real life nuances are a lot more complicated.
censorship is messy and complicated and usually involves a dangerous concentration of power, sure...
but truly free and anonymous speech that originates from places immune to its effects, or that can be falsely attributed for the purposes of subversion, can also result in a dangerous concentration of power.
what's the difference between a censored truth and a chorus of convincing, anonymously originated lies that buries the truth?
Just kinda spitballing here, so forgive the lack of empathy, but it seems to me like the overall System is perfectly capable of correcting itself when people succumb to "misinformation" to the point it harms them. Yeah we don't want anyone to get harmed, sure sure, yeah, of course, that would be simply... awful. Yet... we learn best from failure, correct?
In other words, at some point every concerned individual needs to let those who are insistent on failing (or destined to fail) do so, and let others learn from their mistakes.
Everyone's gotta stop trying to save everyone else.
I don't disagree with this, but there's a threshold beyond which the misinformation becomes the prevailing "truth" for a portion of the population, and the system is no longer able to self-correct.
If there is a force actively working towards this as a goal, do you not think that force should be actively opposed?
The problem here is each political leaning will say the same of the other regarding misinformation - which is always subjective.
The only difference is that one side has staked their existence on upholding free speech, to never silence the other, but not vice-versa. It’s not a fair fight.
I always think: “how do you get to Hitler’s Germany”. And it doesn’t come from the group that upholds free speech.
It comes from the group that says censorship has become an unfortunate necessity.
We should respect those who hold true to their principles even when doing so does them a disservice.
I agree with the concerns you bring up, but isn't framing this as a "free speech/censorship" problem too broad, when the main concern is propaganda/misinformation being essentially broadcast (and amplified by engagement algorithms) over quasi-monopolistic tech platforms?
> (...) but it seems to me like the overall System is perfectly capable of correcting itself when people succumb to "misinformation" to the point it harms them.
For this hypothesis to be valid, you'd require a population which:
a) had decent critical thinking,
b) consumed reliable information from reliable sources,
c) wasn't targeted by bad actors who hijack information channels to saturate them with disinformation,
d) wasn't radicalized to the point where even basic health and safety precautions are attacked as being partisan politics.
What we have been seeing for the past year or so is that the system is unable to self-correct if attacked hard enough. We have also seen that the system does have some capacity to self-heal if the volume of disinformation is actively tuned down.
I would argue that you don't need any of these assumptions to hold. All populations throughout history have had partially excellent, partially catastrophically flawed perceptions of reality and truth. And being "high info" or "low info" has little bearing on this, since it's easy to find "pop science" factoids that are well known and accepted, and that have been highlighted in TED Talks (not TEDx) and popularized to millions, yet are false.

The fact is, you can still reproduce, still code a program, repair a car, do whatever your job is, even if you believe the earth is flat; you can't be an astrophysicist, but if you believed in a flat earth you weren't going to be one anyway. Such beliefs only cause social problems if the people holding them are, e.g., colleagues of yours and are pushy about their views; and I think that's what the "pro-censorship" crowd is trying to address.

Try to have a "deep enough" conversation with random strangers today, and you'll see it's not their "facts" that are the problem; it's that most people's thinking process just isn't rigorous. Internet censorship simply can't fix that: censor certain views, and you'll just find that people shift to adopting equally unrigorous views on the opposite side, and may be just as pushy if that's their temperament.
> you'll just find that people shift to adopting equally unrigorous views on the opposite side.
This is a feature, not a bug. The purpose of internet censorship as well as the entire "misinformation" discourse is to make sure the propaganda from your side wins.
That seems too provocative to me. It seems simply that the "elite", social media activists, and FB/YouTube employees are guilty of just the same kind of non-rigorous thinking. They think misinformation posted online causes a phase-shift of rational people into irrational/"mentally ill" people, and feel a responsibility to "limit the damage"; but human crowds have never been rational, period. Another comment in this thread says: "Sloganeering is actually critical for mass movements for political change", which is completely true, and I think proves my point; practically all discussions online are filled with complete misinformation, yet often aggregate around reasonable conclusions (and occasionally, unreasonable), whether it's on healthcare reform, privacy, medicine, whatever.
It's not just "sloganeering", it's an attempt to enforce a narrative and crush the opposing narrative. It's coercive.
This is particularly important when you have a grievance culture -- it very much depends on who is getting the sympathy. The lens of concern needs to be focused with laserlike narrowness on the approved victims, and not on other victims, and outrage must be focused on approved perpetrators. This is a key part of Chomsky and Herman's propaganda model in Manufacturing Consent.
Take the very different coverage of Antifa, BLM and Capitol riots. All three attacked government buildings and two of these were nationwide and resulted in multiple deaths. One of them caused billions of dollars in property damage, mass arson, etc.
But the coverage was very different. No one called the Capitol riot "mostly peaceful". If they did, they would no doubt be accused of spreading "misinformation". So to merely call this "sloganeering" is to hide the coercive nature of the misinformation discourse.
i'm arguing that the situation is more complicated. we still live in a world that is somewhat naive about e-communication, while the internet has completely rewritten the rules of the game, bringing e-communication into focus.
when the us constitution was written, it wasn't written with the idea in mind that anyone in the world could anonymously participate in the local political process. that would have been crazy talk!
so i think there may be some weirdness in terms of keeping the peace in the short term, as more naive generations die off and more savvy generations come up. i also think that some ideas we thought were principles were actually implementation strategies built for a very different world, and that perhaps we'll need to look at what the underlying principles were and how they might be upheld in a world without information borders.
perhaps freedom of speech, which was written with the idea of preventing government from getting too powerful and controlling people, would need to be recast as preventing any entity from amassing undue power by consolidating information capabilities to control people. from that, maybe you build up a freedom of speech paired with a required assertion of identity...
but honestly, i don't know. it sure does seem that the old principles were written for a different game though.
> when the us constitution was written, it wasn't written with the idea in mind that anyone in the world could anonymously participate in the local political process. that would have been crazy talk!
I thought the Federalist Papers were published anonymously.
I understand that you said "anyone in the world". I'm certainly not a scholar of American history, but surely there were European influences being exerted (and likely anonymously, too) on the Colonies around the time of the Federalist Papers.
Far-reaching anonymous political speech isn't a new thing. The speed and ease of disseminating speech is new, for sure.
> I thought the Federalist Papers were published anonymously.
They were not. They were published by three well-known political figures under a pseudonym that was widely known among their peers. Everyone knew it was one of three, and in most cases everyone who mattered in the discussion knew that Hamilton was the most likely author of most of them.
> Far-reaching anonymous political speech isn't a new thing. The speed and ease of disseminating speech is new, for sure.
yeah i suppose you're right, foreign intelligence operations designed to influence local politics have existed long before the internet. i think maybe the difference is, it can now be done at scale on a grassroots level at substantially reduced cost.
mix that with massive populations that are naive to common internet discussion traps and well, here we are.
Somebody bears ultimate responsibility for filtering fact from fiction.
It could be the individual consuming the news (which is hard work, and requires consciously counteracting confirmation bias as well as overt attempts at information manipulation).
Or it can be some centralized authority (social media companies, the government, etc.) and you have to ensure that the interests of that authority are aligned with the public interest.
The last several years have illustrated the downsides of the former approach but I'm not at all convinced that the latter is less brittle.
Ethically? Not unless they make it very clear what they are doing. Well, I mean, how clear it has to be would, I guess, vary with the level of restriction?
The child looks at the forest and sees the forest. The adult looks at the forest and sees all the trees and plants and wildlife and features of the land. The old person looks at the forest and sees the forest.
It's possible to lose sight of what really matters when overwhelmed with nuance.
It's still an open question whether we're actually seeing the result of too much free speech, or the result of overly centralized powers popularizing extreme viewpoints to drive engagement. Faceboot et al have essentially installed themselves as middlemen in everyone's interpersonal relationships, and have thus hijacked our sense of social proof.
It all boils down to ad revenue and clicks. It’s no coincidence that all this started happening at the same time old media was threatened by the digital age.
Any argument that appeals to fear and relies on boogeymen is trying to short-circuit the logical, rational parts of your brain. They're trying to make you part of the mob. You would do well to take such arguments and their prophets and burn them with hot coals. Literally tar and feather them and run them out of town.
Funny enough, if I recall correctly, that quote was specifically in the context of USENET.
And the story of USENET since then might be educational... USENET itself, and the infrastructure running under it, are hard to censor, but many of the individual service providers that ran USENET endpoints and provided them for their customers went "This is more trouble than it's worth" and stopped providing that service. It's harder to get on USENET now than it was in the days AOL offered it.
The Net interprets censorship as damage, but a critical mass of service providers concluding something isn't worth the resources can have the same effect as censorship.
Market dynamics support corporations which can generate the largest numbers of ad clicks. You can't run an organization which doesn't ultimately conform to market dynamics. That leads to polarization, hate, and misinformation. Do we want to go down that path?
Are corporations people? Should corporations have freedom-of-speech? Should corporations be free to lie? Should government employees? Should corporations be free to engage in speech which is known to actively harm people?
I really don't know.
I do feel like some forms of intentional lying should be illegal. If a government employee says something, or an academic does, I should be able to trust they're not being intentionally untruthful. Where does that line lie? I don't know.
I also feel like individuals should have real freedom of speech. Not just freedom from government prosecution after speech, but freedom from economic retaliation, and to some extent, social ostracization.
I feel like we need a serious discussion here, though.
All are in some ways restricting certain freedoms, while enabling others.
And what freedoms are you afraid will be taken that have not already been taken in some form?
I'm absolutely not disagreeing with you. I support full and absolute internet freedom. I have the ability to put absolutely anything I want on the internet with zero restrictions, but depending on what it is and how I put it there, there are many ways it may then cause me to lose freedom in some way.
A commenter on The Reg speculated that it's because (according to this commenter) Gilmore has lately veered into arguing publicly for legalising marijuana, and the EFF doesn't want to be associated with that.
If he has, that feels more plausible a cause than him being "an old-school internet freedom activist" per se.
It looks like there is a subtext that there was some contentious issue they couldn't agree upon. I wonder what it is.