Automatically “block” people in images using a pretrained neural network (github.com/minimaxir)
110 points by thefox on March 30, 2018 | hide | past | favorite | 58 comments



The road to hell is paved with good intentions. And I think software engineers are this century's key perpetrators of "good intentions." Here's a hint: if you saw it in a dystopic thriller you probably don't need to build it, even out of morbid curiosity.


I have not watched the dystopic thriller you mention, so I don't know what kind of drawbacks you are expecting, but I think this technology might have some useful applications.

E.g. an image-upload site could answer requests for deletion on privacy grounds not by taking the whole image down, but by obscuring only parts of the image.

Or imagine you're taking a selfie in a crowded place and don't want all those other people diverting attention away from you.

Basically, this could be used whenever someone might go for manually painting over parts of an image to remove content. Something similar for screenshots of conversations might be useful as well.


>I have not watched the dystopic thriller you mention

https://en.wikipedia.org/wiki/White_Christmas_(Black_Mirror)

Netflix link: https://www.netflix.com/watch/80073158

Worth a watch, and the series is not chronological so no worries about prior knowledge.


tl;dw: the concept has been toyed with in a number of episodes:

Imagine the death penalty was replaced by a simple "ban" from everybody else's perception. Instead of being put to death, you just appear as an outline filled with static and your voice appears as garbled noise.

Alternatively, imagine "parental controls" for a child's perception that automatically filters out "unsafe" content like sex, violence, hobos, hookers and hate speech.

Or imagine altering a soldier's perception to make enemy combatants appear less human so empathy doesn't get in the way of a clean kill.


SPOILER warning!


Not really. I'm not giving away which of these happens in which episode and a lot of the technology is recurring throughout the different stories. In most episodes the general idea of how the tech works is given away fairly early on and the real plot points are more about how it affects the characters.

In fact the parental controls idea is literally this: http://j.aufbix.org/plif/archive/wc161.gif and I'm pretty sure the soldier one was taken from an episode of Outer Limits or Twilight Zone.


Every invention or technology will eventually also be used to harm people, even those that were never meant for it, from electricity to airplanes. Unfortunately, seeing something in a dystopic thriller also means someone else, not just the writer, has already thought about using it. Making more people aware can do more good than harm IMO.

This technology is in its infancy; I fear the day a policeman wearing an encoded transmitter will be able, in real time, to force all security cameras in the area to exclude his image and fill it in with normal background when he knows he has abused his power or is about to.


I disagree with your use of guilt by association with Black Mirror.

There are both horrifying and also amazing technologies shown in that show. In terms of masking people in images, that would be the White Christmas episode.

The ability to do excellent image segmentation was not the scary technology there; it was the ability of a government to control your vision and hearing at the neural level.


> Here's a hint: if you saw it in a dystopic thriller you probably don't need to build it, even out of morbid curiosity.

But, for example, you could make a Chrome extension to remove Trump from all pictures on the internet! I'm sure that type of thing would make some folks happy.


I’d like to see an Opera extension to remove all the politics and pathos from the internet. If someone capable reads this, thanks in advance for supporting Opera too!


Pretty sure this is satire. Pretty sure there are no "good intentions" here.


Speak for yourself, I’d love to build a ton of shit I’ve seen in Black Mirror.


You can't stop progress, even if it's negative. Especially something that is relatively easy to make. If nukes were easy to make, we'd all be dead.


You can't. But you can choose not to be part of it. I politely refused interviews with Google and Facebook for that very reason in the past.

The world is what you make of it

People keep saying you can't change the world. That you can't do anything about it.

Well yes you can. Don't be part of it.

Oh yes, there is a price to pay.


You can absolutely work to redefine what “progress” means, who technology will serve, and how its bounty and harms are distributed.

If you live in a Western democracy, you’re reaping the benefit of a series of such struggles from the 17th century forward: the Enlightenment, various democratic revolutions, labor organization, etc. The fatalism, the “unchallengeable trajectory” solipsism that characterizes so much of the thinking around tech, is the thing that’s new.


Reminds me of Boston Dynamics. Ten years ago, I was in awe and so entertained when watching their videos. Now, I am terrified and disgusted.


I was in awe while also being completely terrified. It didn't take Black Mirror or a viral ad to convince me that autonomous weapon systems are bad. I grew up watching Terminator and RoboCop. My first thought when seeing BigDog was "this is awesome but also we're all going to die".

When people didn't get it, I just showed them Petman: https://www.youtube.com/watch?v=tFrjrgBV8K0


Unless you have a nearly unique skill set, refusing to build something does not even slightly impede it from getting built.


The age old justification of "if I don't do it, someone else will". You can justify many things with that.

I happen to think this particular technology is cool and it's silly to be afraid of it, but no need to employ that justification.


But I think some things are justifiable by that argument. A great example of that is the public disclosure of software vulnerabilities. If you’re not malicious, and you find a bad bug, chances are a malicious actor will eventually find the bug too.


Doing X to mitigate the harm of someone else with bad intentions doing the same is not in the same category as doing X because "it's cool" or "money".


The equivalent example would be: "I might as well hack this account and take all the money because if I don't, someone else will".

I have no problem with: "I should inform the developers of the vulnerability (perhaps publicly to pressure them) so that it will be fixed because someone will hack it otherwise".


We were literally talking about building technologies that might be used in harmful ways. Discovering and developing software exploits is a direct example of this.


Further, if enough people refuse to accept that justification, you _can_ make a meaningful difference. Maybe you can't stop it, but as the number of willing engineers goes down, the number of things that can be built in any given time goes down. This gives society and the law time to catch up with and process the implications of the technology.


If you saw it in a dystopic thriller and it's possible to build it then it's guaranteed that it will be built because the cat is out of the bag.


It only gets built if some over-zealous nerds build it. That's the point.


Or someone that's "just doing their job".


Just doing your job is not a justification or defense.

Everyone has a choice to make as to what role they want to play in society. Some people have chosen to build the tools of our destruction in exchange for pieces of silver, but that was their choice; "just doing their job" does not absolve them of it.


I agree, your sentiment is precisely what I was implying with the quotes.

These things don't always communicate perfectly through text or punctuation.


This would combine well with an in-painting algorithm, or the video equivalents: https://research.adobe.com/project/content-aware-fill/ https://www.youtube.com/watch?v=j3uCV0JYMJ4


Will someone kindly smash this repo https://github.com/fivemok/image-inpainting together with the submission?



I'd imagine it'd be relatively easy; they both seem to depend on a mask outlining the part of the image to replace.
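The glue between the two could be sketched like this, assuming the blocker can hand over its boolean person mask (`naive_inpaint` is a hypothetical stand-in that fills with the mean colour; a real pipeline would pass the same mask to an actual inpainting model):

```python
import numpy as np

def naive_inpaint(image, mask):
    """Toy stand-in for a real inpainting step: fill masked pixels with
    the mean colour of the unmasked pixels.

    image: (H, W, 3) uint8 array
    mask:  (H, W) bool array, True where the person was detected
    """
    out = image.copy()
    # image[~mask] selects the (N, 3) unmasked pixels; average them per channel
    fill = image[~mask].mean(axis=0)
    out[mask] = fill.astype(image.dtype)
    return out
```

The key point is that both projects agree on the mask's shape, so swapping the flat-colour block for an inpainted fill is a one-line change at the end of the pipeline.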


Why are we building the technology of a dystopia? What happened to shows like Star Trek which gave us more helpful goals to work toward?


The reason I made this was that I noticed Mask R-CNN’s masking and object identification capabilities were pretty good and I needed a refresher on Python-based image manipulation. https://twitter.com/minimaxir/status/976277067070300160?s=21

And then I remembered Black Mirror and it gave me an idea.

The step toward dystopia was not the primary motivation for this project.
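For anyone curious, the blocking step itself is conceptually tiny. This is a sketch, assuming the matterport-style Mask R-CNN result format (an (H, W, N) boolean `masks` array plus an (N,) array of COCO `class_ids`); the names are illustrative, not the repo's exact code:

```python
import numpy as np

PERSON_CLASS_ID = 1  # "person" index in the COCO classes the pretrained model uses

def block_people(image, masks, class_ids, color=(255, 255, 255)):
    """Paint over every pixel belonging to a detected person.

    image:     (H, W, 3) uint8 array
    masks:     (H, W, N) bool array, one slice per detected instance
    class_ids: (N,) array of COCO class indices
    """
    out = image.copy()
    person = masks[:, :, class_ids == PERSON_CLASS_ID]  # keep only person instances
    combined = person.any(axis=-1)                      # union of all person masks
    out[combined] = color
    return out
```

All the heavy lifting is in the pretrained network producing `masks`; the "block" is just a boolean index assignment.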


This is cool. I remember trying to build a Chrome extension for something like this a while ago, for detecting and blocking images of animals people have phobias of (e.g. snakes, spiders, etc.). I tried using Python's dlib object detector and shape landmark predictor, and it was a very stressful exercise in futility. Now I will have a look at Mask R-CNN, as it might help solve my problem and my few minutes of total brain fuckery after unluckily seeing a picture or video with any kind of snake in it. The big ones are of course worse.


The Star Trek whose utopia needed a world war against the ethnic cleansing of genetically engineered Nazi supermen and an alien invasion to "civilize" humanity in the post-apocalyptic aftermath? After which humankind just somehow evolved beyond the petty desires and violent impulses that had hitherto been baked into its primate genome for millions of years?

I love Star Trek but one person's utopia is another person's dystopia, it's just a matter of perspective.

And as far as technology goes, the original series predicted a couple of things, and we got a Space Shuttle named after the Enterprise, but most of it is magic and therefore useless to the real world. No matter how hard we work or how much we believe in ourselves we're never getting matter replication or transporters or holodecks that will create sentient life on command.

We could look at the ideals of Star Trek, but remember that what we see is quasi-militaristic, autocratic and there is no real sign anywhere of privacy or freedom from the surveillance infrastructure of the computer. Even the transporters keep copies of people's entire genetic code.


Depends. In TNG the outlook tends to be far more rosy than in DS9 or DIS. There were a few episodes which hinted at darker tendencies in Starfleet but those usually turned out to be corrupt individuals or some kind of alien influence.

Having grown up with TNG my impression of the Federation was much like what the US was pretending to be prior to 9/11 (i.e. prior to "world police"), a beacon of liberty, equality and hope for mankind, but with all the ills of real-world USA eliminated by scientific progress and a more idealistic and altruistic society.

The show had a few militaristic undertones initially but they mostly went away in season 2 and later, except for the ranks. The ranks and offensive capabilities are lampshaded a number of times but Picard and other characters frequently prove that those are just born out of necessity and for self-defense, never to attack. The Borg represent such a threat exactly because they are not interested in diplomacy and Starfleet has to fight an all-out war to survive them -- unlike the Romulans or Klingons who even at the worst of times mostly didn't outright attack them.


“I thought what I'd do was, I'd pretend I was one of those deaf-mutes.”


Also, that Black Mirror episode.

Edit: Turns out it's exactly what the author intended. Damn.


Not exactly “intended”; rather, for want of a better phrase, the person you replied to chose a Ghost in the Shell reference based on a J.D. Salinger quote.

The “author”, meanwhile, wrote the article we’re discussing in this HN thread.


I thought of that too, but I didn’t know a quote I could use.


This doesn't seem to work for me. I ran "python person_blocker.py -i images/img1.jpg -l" after installing dependencies and I only get:

/home/d33tah/virtualenv-py3/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.

  from ._conv import register_converters as _register_converters

Using TensorFlow backend.

2018-03-31 12:53:18.777911: I tensorflow/core/platform/cpu_feature_guard.cc:140] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA

/usr/lib/python3/dist-packages/scipy/misc/pilutil.py:480: FutureWarning: Conversion of the second argument of issubdtype from `int` to `np.signedinteger` is deprecated. In future, it will be treated as `np.int64 == np.dtype(int).type`.

  if issubdtype(ts, int):

/usr/lib/python3/dist-packages/scipy/misc/pilutil.py:483: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.

  elif issubdtype(type(size), float):


Author here: All of these are warnings, they shouldn't block the script. Check the current directory for person_blocked.jpg and person_blocked.gif.


Thanks! That helped.


What's next, cameras that refuse to take pictures of specific things? I find this unsettlingly creepy.


Scanners have been refusing to scan certain things for ages.


It's always a "fun" party trick at the office to show people that you can't scan money.


In a Stallman-esque world, this is just a matter of installing custom firmware on the scanner, right?


DJI builds drones that refuse to take off in certain places, so it's not entirely unprecedented...


This reminds me of the character Laughing Man in the anime Ghost in the Shell:

https://en.m.wikipedia.org/wiki/Ghost_in_the_Shell:_Stand_Al...


In some cases the shape is too small and leaves a few pixels visible around the person, or its corners are too sharp. Would it be possible to add an option to enlarge the shape by a few pixels?

[Bonus points for another option to make soft shadows near the border of the shape, so it is not so sharp.]
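Both requests amount to post-processing the model's boolean mask: dilate it to cover the stray edge pixels, then blur it into a soft alpha. A sketch, assuming SciPy is available (`grow_and_feather` is a hypothetical helper, not part of the repo):

```python
import numpy as np
from scipy.ndimage import binary_dilation, gaussian_filter

def grow_and_feather(mask, grow_px=3, feather_px=2):
    """Enlarge a boolean mask by `grow_px` pixels, then blur it into a
    soft 0..1 alpha map so the blocked region has a feathered border.
    """
    grown = binary_dilation(mask, iterations=grow_px)
    alpha = gaussian_filter(grown.astype(float), sigma=feather_px)
    return np.clip(alpha, 0.0, 1.0)
```

The soft mask would then be composited instead of hard-assigned, e.g. `out = alpha[..., None] * color + (1 - alpha[..., None]) * image`.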


We are now a few steps away from Peril Sensitive Sunglasses. [1]

1: http://www.hhgproject.org/entries/perilsensitivesunglasses.h...


What purpose does this serve? I'm genuinely curious what are the applications of this?


Stuffing people into Memory Holes.


Imagine wearing AR glasses and blocking out people you find annoying or disagree with.


Or animals that frighten you.


Prepare your bicycle!!!



