I don't think that's the case. In theory it can be used to "find deviants", but has that actually ever been done and to what degree? Improving advertisements isn't horrible.

It has a lot of applications in science, could vastly improve medicine, it's used to predict energy usage for power plants, filter spam, recommend movies or products, improve harvests for farming, etc. But I guess those things don't sell papers.

And lastly, calling a technology "bad" isn't actually helpful. We can't stop bad people from using the technology or "uninvent" it; you'll just hamper the people using it for good. See the Halo Effect, where people told about the risks of nuclear power also perceive its benefits as smaller (http://lesswrong.com/lw/lj/the_halo_effect/).




By "finding deviants", I mean exactly what the NSA is doing, and what every Keystone police department will be doing in ten years. And yes, improving advertisements is horrible - this is precisely one of the anti-human facets I am talking about.

> It has a lot of applications in science, could vastly improve medicine, it's used to predict energy usage for power plants, filter spam, recommend movies or products, improve harvests for farming, etc

You're right. My initial comment was way too harsh and too broad, indicting the whole field rather than just the in-our-face web-scene applications.

> lastly calling a technology "bad" isn't actually helpful. We can't stop bad people from using the technology

Of course you can't uninvent something, but you can slow down or speed up its progress. Path dependence is a thing. Currently, most people do not feel empowered with respect to computers. The more we put them at the mercy of seemingly unpredictable computer systems that are actually trying to outwit them, the harder it is going to be for them to get over those feelings of disempowerment so that we can get to a place where people expect their computers to work for them.


I've seen no evidence that the NSA does that, and really it makes no sense. Doing a search for "abnormal" behavior would mostly get you people who visit odd websites, are awake at atypical hours, or spend too much time online - nothing remotely useful. There is a lot of hype around data mining, but there are diminishing returns on what can be done with it.
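
To make that concrete, here's roughly what a search for "abnormal" behavior amounts to - a minimal outlier-flagging sketch in Python, where the users and the "hours online" numbers are entirely made up:

    # Flag statistical outliers in a single activity feature.
    # Users and numbers are invented for illustration.
    import statistics

    # hours online per day for a handful of hypothetical users
    hours_online = {
        "alice": 2.0, "bob": 3.5, "carol": 2.5, "dave": 3.0,
        "erin": 2.8, "frank": 3.2,
        "night_owl": 11.0,   # heavy but perfectly harmless usage
    }

    mean = statistics.mean(hours_online.values())
    stdev = statistics.stdev(hours_online.values())

    # anyone more than two standard deviations from the mean gets "flagged"
    flagged = [u for u, h in hours_online.items() if abs(h - mean) > 2 * stdev]
    print(flagged)  # ['night_owl']

All the flag tells you is that someone's usage is atypical, which is exactly the problem.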

I agree that ads are bad, but they aren't that bad. Targeted advertisement is mostly just finding things that are relevant to the website or viewer. And it's so easy to block ads on the internet that I don't have much sympathy for people who don't block them and still complain about them. Ads were way worse in the age of television, where 25% of all content was commercials.
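
By "finding things that are relevant" I mean something about as simple as this toy sketch - the ads, keywords, and page text are invented:

    # Score each ad by how many of its keywords appear on the page,
    # then show the best match.
    page_text = "review of trail running shoes for long distance runners"
    page_words = set(page_text.split())

    ad_keywords = {
        "running-shoes-sale": {"running", "shoes", "marathon"},
        "office-chairs":      {"office", "chair", "ergonomic"},
        "energy-gels":        {"runners", "energy", "nutrition"},
    }

    # pick the ad whose keywords overlap the page the most
    best_ad = max(ad_keywords, key=lambda ad: len(ad_keywords[ad] & page_words))
    print(best_ad)  # -> running-shoes-sale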

>Of course you can't uninvent something, but you can slow down or speed up its progress. Path dependence is a thing.

Perhaps, but for the most part the technology already exists. At its core it's just basic statistics, or sometimes even simpler techniques. Diminishing returns were hit long ago - I can't think of much that would improve the "bad" applications more than the "good".
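
For a sense of what I mean by "basic statistics": a workable spam filter is little more than word counts with smoothing. A toy naive Bayes sketch, with invented training messages:

    # The "model" is just word counts per class plus add-one smoothing.
    from collections import Counter
    import math

    spam = ["win money now", "cheap money offer", "win a prize now"]
    ham  = ["meeting at noon", "lunch money for the trip", "see you at the meeting"]

    def word_counts(docs):
        return Counter(w for d in docs for w in d.split())

    spam_counts, ham_counts = word_counts(spam), word_counts(ham)
    spam_total, ham_total = sum(spam_counts.values()), sum(ham_counts.values())
    vocab = set(spam_counts) | set(ham_counts)

    def log_prob(msg, counts, total):
        # log P(words | class) with add-one smoothing; equal priors assumed
        return sum(math.log((counts[w] + 1) / (total + len(vocab)))
                   for w in msg.split())

    msg = "win cheap money"
    spam_score = log_prob(msg, spam_counts, spam_total)
    ham_score = log_prob(msg, ham_counts, ham_total)
    print("spam" if spam_score > ham_score else "ham")  # -> spam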

>The more we put them at the mercy of seemingly unpredictable computer systems that are actually trying to outwit them, the harder it is going to be for them to get over those feelings of disempowerment so that we can get to a place where people expect their computers to work for them.

I... what? On the rare occasions when machine learning is used in apps, it's to improve the user experience, or to do something that would otherwise be too complex or inefficient to code by hand. I'm sure someone, somewhere, has abused the technology in their app - just like every programming technology ever. But in general I don't see how this is bad, or even how it makes computers more "unpredictable".



