
> An authoritarian oppressive government that has the support of the majority of the population is still an authoritarian and oppressive government.

This is indeed the neoconservative view. An analogous statement by a sectarian neoconservative might be "A heathen government that has the support of the majority of people is still a heathen government".

> Either it's okay to aid and abet political repression, or it's not.

A binary over-simplification.

What is political repression? Does America's Gitmo count? Our imprisonment of nonviolent drug offenders? Our criminalization of sex work? How is it that the source of all the righteous indignation happens to lie on the other side of the world?

> ...they're not just hapless victims of their government, then they're welcome to put pressure on said government

So the Chinese people are being tested now to see if they have the mettle to demand a free society in the image of the USA? Your remark actually supports my point that the neoconservative view entails judgment (and the process of dehumanization) of the population that is first framed as victims, then shamed for not prioritizing one political cause above all else.

Note, the idea that China should adopt policies in the image of the USA is where the neocon view merges with the white supremacist view: the idea that the precise nuances of Western democracy are innately superior due to the unique cultural circumstances that gave rise to the US, circumstances that make them less likely to occur elsewhere.



The notion that the Chinese people actually have agency in choosing their government and that they want it this way is so far beyond ludicrous that I hardly know how to respond. How can you say that about a government that imprisons political dissidents and suppresses any information that could be a threat to its authoritarian rule? China is a one-party state with a supreme leader for life. So what are you talking about?!


I think you are buying into two major fallacies and succumbing to one major bias. First, the fallacies:

- The fallacy that leaders, even dictators, have anything resembling absolute power or are able to govern without some degree of consensus from other powerful interests within the country.

- The fallacy that a system with two very similar parties (such as in the US) is significantly different from a nation with a single party.

The bias is the idea that there is something more sinister about the reported information-suppression activity done by China than about the same activity done in the US. States must do this to maintain/launder their reputation/legitimacy in the eyes of the majority.

Look how the US has treated Julian Assange. If anything, China simply has more people with the level of courage necessary to take the kinds of risks that Assange took, and so must in some ways apply authoritarian power to stop them.


> What is political repression? Does America's Gitmo count? Our imprisonment of nonviolent drug offenders? Our criminalization of sex work?

That is for everybody to answer on their own, in a way that is consistent with their ethical principles on other things.

> How is it that the source of all the righteous indignation happens to lie on the other side of the world?

Because the people who are doing these things for China live on this side of the world, and are mostly American citizens. So we judge them for what they do in China by American standards.

> So the Chinese people are being tested now to see if they have the mettle to demand a free society in the image of the USA?

It's not a test. The Chinese people (or their government) can demand whatever society they want. What they don't get to demand is American engineers helping them build a society that is built on foundations that are unethical by American standards. And what we do get to demand is that American engineers are not complicit in this.

By your standards, it sounds like castigating IBM for helping the German government with the Holocaust was also "neoconservative" - after all, how dare we judge their society for what they do to some unfortunate people with broad popular support?


> society that is built on foundations that are unethical by American standards.

I acknowledge the truth of this point, but I would argue that Google engineers are helping the US Government build a society that is unethical by American standards. I will include some examples below (though this is not the main point of my reply).

> By your standards, it sounds like castigating IBM for helping the German government with the Holocaust was also "neoconservative" - after all, how dare we judge their society for what they do to some unfortunate people with broad popular support?

I see how the example of IBM looms large as a lesson from history on what tech companies should strive to avoid doing. It is my understanding that the logistics of the Nazi genocide would not have been possible without IBM technology, and that top officials at IBM had reason to know how the technology was being used.

But I'd argue that the main reason IBM sold the technology to the German government was not corporate greed causing a moral blind spot; it was simply that the actions of the German government were not viewed at the time as out of bounds. Nations had been rounding up and isolating "troublesome" minority populations throughout history. It is, and was, in effect an aspect of running a state.

My point here is that choices that are uncontroversial and broadly morally acceptable turn into things that are later viewed as having been abhorrent. It used to be the case that one form of compensation to soldiers after they fought a battle was that they were allowed to rape, etc. It used to be morally acceptable for a husband to dole out corporal punishment toward his wife and children. People who did these things did them with a clean conscience, and were respected and upstanding members of society.

Some kinds of technology are ripe for abuse by governments. I'd argue that for the most part, any product that is useful for advertising is ripe for abuse. Google and Facebook are thus both creating tech that offers the ideal suite of services for governments to abuse.

How might governments abuse the services? China demanding censorship is one way, but the US Government has many points of access and influence into Google and Facebook, to the point where I'd argue the scope and reach of US government abuses go well beyond search result censorship and much more closely resemble the kind of social credit system China has also been derided for aspiring to build.

Just to give a few examples, neither Google nor Facebook offers any sort of warrant canary for user account data, and both have created custom interfaces that government officials with warrants can use to undertake unfettered data mining of public and private information about users, as well as of the extensive metadata that both firms collect for ad-targeting purposes.

Recently, Facebook has been blamed for allowing so-called election meddling to occur and "not responding soon enough". Since the election meddling was simply spending money on ads that supported various fringe political groups, we can infer that what the US Government is asking for is the ability to suppress such content if it gets too popular for any reason, not just when the ad impressions were paid for in Russian currency or the transaction originated somewhere geographically close to Russia.

If there has ever been a scenario ripe for abuse this is it.

Surely similar conversations are going on at Google, but thanks to the cozy relationship with government established by Eric Schmidt, the mechanics of the content suppression mechanism were uncontroversial and have been available to government for a while.

If we put on our sci-fi hats for a moment and imagine a sci-fi description of an algorithmic "feed" such as the FB news feed or a Google search results page, we inevitably realize that of course the government will dictate a lot about how the algorithm works.

We get a glimpse into the liability angle by looking at Tesla's "auto pilot" algorithms. Just as Zuck admitted that FB should "probably be regulated", Tesla has the same ultimate desire with respect to auto-pilot algorithms. If someone is killed by a self-driving car, it's extremely useful to be able to eliminate any negligence that might be found in the software design or QA process, since liability would otherwise rest there and the firm(s) responsible would have to pay whenever someone was killed.

But if the algorithm is regulated, liability can rest somewhere else, and not with the most risky and experimental aspect of the car's safety.

Notably, regulators are irate about Facebook's handling of Russian ad spend, as well as about any self-driving vehicle that crashes. Why? Because it is the threat of liability that most effectively nudges the firms toward allowing the government more and more control over the algorithmic details.

So I think the biggest threat is the US Government's surveillance programs and cozy relationship with big Silicon Valley firms. There will always be well-funded groups trying to find any way to criticize or embarrass China. Among them is Trump, who started a trade war and has scapegoated China as the reason the US rust-belt economy is weak.

So while there is plenty of reason to be concerned about China's treatment of its citizens, the rhetoric used to express that concern is of great benefit to the neocons who are fanning the flames of conflict.

Thus it is preferable to find a way to achieve the moral objective you cite without further empowering US warmongers/neocons.

One suggestion I have is to simply offer an uncensored version of China Google hosted in the US, outside the Great Firewall. Then, if someone searched for 天安門廣場 on both systems and a few results on page 1 were missing, it would be completely obvious what the Chinese government wants censored; a rough sketch of that comparison is below.
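A minimal sketch of that comparison, assuming two hypothetical search endpoints and a made-up JSON response shape (the URLs, parameters, and "results"/"url" fields here are illustrative placeholders, not real Google APIs):

    import requests  # any HTTP client would do

    # Hypothetical endpoints: one censored deployment inside the Great Firewall,
    # one uncensored mirror hosted in the US. Placeholder URLs, not real services.
    UNCENSORED = "https://uncensored.example.com/search"
    CENSORED = "https://censored.example.com/search"

    def page_one_urls(endpoint, query):
        """Return the set of result URLs on page 1 for a query."""
        resp = requests.get(endpoint, params={"q": query, "page": 1}, timeout=10)
        resp.raise_for_status()
        return {item["url"] for item in resp.json()["results"]}

    def suppressed_results(query):
        """URLs present in the uncensored index but missing from the censored one."""
        return page_one_urls(UNCENSORED, query) - page_one_urls(CENSORED, query)

    for url in sorted(suppressed_results("天安門廣場")):
        print("likely censored:", url)

Anything on page 1 of the uncensored system that is absent from the censored one becomes a public, easily crawled record of what the Chinese government asked to have removed.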

I actually built this the last time Google had a presence in China and was excited to create mechanisms to easily crawl and distribute the censored data, but then Google pulled back from China (for competitive, not ethical, reasons at the time) and so there was no use for it anymore.



