Something really interesting here, which you'd miss if you only read the title and comments, is that this is an optical thing. It's not software running on the camera; it's physical.
This seems to be a new PR spin on the same technology that was posted a while ago. 3D printed optical neural networks. I'm surprised I haven't seen more interest considering the energy efficiency and speed of computation.
I don't see any technical reason you couldn't implement a ReLU in optics. There is a whole area, nonlinear optics (I think that may be what you intended when you said "simple media"?). Well, let me see: https://arxiv.org/abs/2201.03787
Thanks, the paper you linked seems to describe something with low losses (though I can't find explicit statements to that effect when skimming). I was under the impression that nonlinear optics would have huge losses, because it requires multiphoton interactions (which should usually have low cross sections), so I'll have to figure out which part of my intuition doesn't hold.
Because you brought it up in the context of "I'm surprised I haven't seen more interest in..." I do assume that it is something neat with an interesting application and cool engineering behind it.
However,
"3D printed optical neural network" sounds like a term someone came up with by selecting randomly from a bag of PR buzzwords and trying to come up with the most incoherent phrase possible, to the point where if I saw a headline about it I'd assume it was a scam or something.
It also blurs the line (hrumf) between what is hardware and what is software. I mean, software designed the hardware to behave according to a certain... algorithm?
While that idea might seem somewhat out-there, it's fairly straightforward once you think about it. We know the transfer function for light through matter, and can calculate its derivative. Therefore, we can use ML to design matter shapes that have desired properties.
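To make that concrete, here is a minimal sketch of the idea, with everything (grid size, wavelength, pitch, target pattern) made up for illustration rather than taken from the paper: a single phase mask is optimized by plain gradient descent, using angular-spectrum propagation as the differentiable forward model and its adjoint to compute the gradient.

    import numpy as np

    N = 64                                        # toy grid size (made up)
    wavelength, dx, dist = 633e-9, 10e-6, 5e-3    # HeNe-ish wavelength, pixel pitch, distance

    # Angular-spectrum transfer function for free-space propagation
    fx = np.fft.fftfreq(N, dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = np.maximum(1 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2, 0)
    H = np.exp(2j * np.pi * dist / wavelength * np.sqrt(arg))

    prop = lambda u: np.fft.ifft2(H * np.fft.fft2(u))               # forward model P
    prop_adj = lambda u: np.fft.ifft2(np.conj(H) * np.fft.fft2(u))  # adjoint P^H

    u0 = np.ones((N, N), dtype=complex)           # uniform illumination
    target = np.zeros((N, N))
    target[24:40, 24:40] = 1.0                    # desired intensity: a bright square

    phi = np.zeros((N, N))                        # the "matter shape" being designed
    lr = 0.1
    for step in range(500):
        masked = u0 * np.exp(1j * phi)            # field just after the phase mask
        out = prop(masked)
        I = np.abs(out) ** 2
        # Analytic gradient of sum((I - target)^2) w.r.t. phi, via the adjoint
        g_out = 2 * (I - target) * out            # dL/d(out*)
        g_masked = prop_adj(g_out)                # dL/d(masked*)
        phi -= lr * 2 * np.imag(g_masked * np.conj(masked))

    print("final loss:", np.sum((np.abs(prop(u0 * np.exp(1j * phi))) ** 2 - target) ** 2))

The paper stacks several such surfaces and then 3D prints them; the principle (differentiate the optical transfer function, descend the gradient) is the same.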
All computers are effectively physical systems that control their noise levels to achieve logical operations. In this case, it's an analog system with no moving parts, but I imagine that given the existence of spatial light modulators and MEMS mirrors, you could probably reprogram the system in real time to erase whatever you wanted on the fly.
Interesting article. Though I don't think the cryptography angle will pan out. I wonder if it was added because crypto has been a buzzword of late, and the researchers just really wanted to build this camera.
Maybe HN should make the comment button only available after first clicking the link. I see more and more of this "omg title" behavior on HN these days
Not trolling, just pointing out how you could've just flagged the comment if it was really an issue for you. Instead you ended up breaking the rule yourself just to beat your drum about it
"There, Cugel finds two bizarre villages, one occupied by wearers of the magic violet lenses, the other by peasants who work on behalf of the lens-wearers, in hopes of being promoted to their ranks. The lenses cause their wearers to see, not their squalid surroundings, but the Overworld, a vastly superior version of reality where a hut is a palace, gruel is a magnificent feast, and peasant women are princesses — "seeing the world through rose-colored glasses" on a grand scale."
Also similar to The Wizard of Oz, where everyone has to wear green spectacles, ostensibly to protect their eyes, when in fact it was to make the Emerald City seem greener and more spectacular than it actually was.
"Joo Janta 200 Super-Chromatic Peril Sensitive Sunglasses have been specially designed to help people develop a relaxed attitude to danger. At the first hint of trouble, they turn totally black and thus prevent you from seeing anything that might alarm you."
--Douglas Adams
Maybe this tech is a continuum, but we've skipped past Adams straight to 2.0, and Overworld is 3.0.
Similar thing with the MASS system in a Black Mirror episode (Men Against Fire) where visual reality could be substituted by your implants (or rather using your implants, and by the people in control of them).
And again in the Christmas special, which was more similar to this device in that it would block out certain things, or everything (though it was in software, and again under external control). Which sounds horrifying enough but was far from the worst thing in the episode.
In the same vein, a short sci-fi film "The Nostalgist" [0]. This film really opened my eyes regarding why we may not want devices that alter our perception of reality.
"Privacy protection is a growing concern in the digital era, with machine vision techniques widely used throughout public and private settings. Existing methods address this growing problem by, e.g., encrypting camera images or obscuring/blurring the imaged information through digital algorithms. Here, we demonstrate a camera design that performs class-specific imaging of target objects with instantaneous all-optical erasure of other classes of objects. This diffractive camera consists of transmissive surfaces structured using deep learning to perform selective imaging of target classes of objects positioned at its input field-of-view. After their fabrication, the thin diffractive layers collectively perform optical mode filtering to accurately form images of the objects that belong to a target data class or group of classes, while instantaneously erasing objects of the other data classes at the output field-of-view."
This only works for the privacy-minded but naive among us. If you want to exclude something from a picture or video, do NOT record it, at all, EVER! If it can record anything, it can record the wrong thing.
What happens when a camera is deployed in a location that wants added "security" (which is really surveillance, not security) but where cameras are normally barred for privacy reasons (bathrooms, locker rooms, fitting rooms)? The claim will be "proven" that it cannot "see" your private parts because it is programmed not to. I guarantee the AI will fail at some point or be vulnerable to some attack.
Or what if it is connected to a radar/laser speed-enforcement camera and takes your car's photo because it detects the car behind you speeding, but it cannot "take" that part of the photo because it mis-detected you as the speeder?
This technology is fraught with problems when it comes to evidence in a court of law. What is not there is just as important as what is there, and if you are erasing what is there, you can no longer trust what is not there.
Hmm. Maybe a shield could be designed to sit in front of a user's face or something. If it recognizes a friend, it lets through light that is on a trajectory to hit the friend's eyeballs. Otherwise, it blocks everything.
So this is really cool and useful, but it's important to keep in mind that since this is a diffractive structure, it probably only works with coherent light (what you get from a laser). Most normal light sources produce incoherent light, and that tends not to work so well with complex diffractive structures.
And, with a bit of poisoning in the image training data, all of the security cameras at $Critical_Facility will be utterly blind to anyone who wears a North Korean Military Intelligence full-dress uniform...
I'm not sure the authors appreciate the impact of their own invention. This isn't a camera that censors things: it's a passive image segmentation model that runs in real time and consumes zero power. This would have huge implications for robotics applications.
> Since the characteristic information of undesired classes of objects is all-optically erased at the camera output through light diffraction, this AI-designed camera never records their direct images. Therefore, the protection of privacy is maximized since an adversarial attack that has access to the recorded images of this camera cannot bring the information back. This feature can also reduce cameras’ data storage and transmission load since the images of undesired objects are not recorded.
That seems overstated. In the third example image pair, I can easily see a shadow of the input 5 in the output. I'm pretty sure the 9 is also there in the fourth pair, but the shadow is not as clear.
Get the digital equivalent of fnord in optical algorithms, feel free to rob/~murder~ assassinate with no evidence. Bonus points for when implants become widespread, then people won't be able to see you either!
What guarantees do you have that information doesn't still bleed through -- e.g. that compressed sensing techniques could still recreate meaningful parts of the obscured image?
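As a toy illustration of that worry (my own linear model, not the paper's physics): if the "erasure" merely attenuates certain image modes rather than exactly nulling them, a pseudo-inverse pulls them back out, limited only by sensor noise.

    import numpy as np

    # Toy model: a fixed linear "eraser" whose erased modes are attenuated
    # 1000x instead of being exactly nulled. With no sensor noise, inversion
    # recovers them perfectly; with noise, recovery degrades by roughly the
    # attenuation factor. All sizes here are hypothetical.
    rng = np.random.default_rng(1)
    n = 32
    U, _ = np.linalg.qr(rng.normal(size=(n, n)))
    V, _ = np.linalg.qr(rng.normal(size=(n, n)))
    s = np.ones(n)
    s[n // 2:] = 1e-3                       # leaky "erased" modes
    M = U @ (s[:, None] * V.T)              # M = U diag(s) V^T

    x = rng.normal(size=n)                  # scene with energy in erased modes
    for noise in (0.0, 1e-4):
        y = M @ x + noise * rng.normal(size=n)   # recorded camera output
        x_rec = np.linalg.pinv(M) @ y            # attacker's reconstruction
        print(f"noise={noise:g}  recovery error={np.linalg.norm(x_rec - x):.3g}")

So the paper's "cannot bring the information back" claim hinges on the attenuation being deep enough that the residue drowns in the noise floor.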
This was not designed by an "AI"; it was designed through gradient descent optimization. It is an interesting application, but it has nothing to do with AI.
The real problem is that humans have always deluded themselves that some technology was a 'truth technology'. It's been done with everything from typewriters to cameras.
However, the camera has always lied, always rendered a counterfactual to a hypothesized truth state, denied access to fundamental reality.
That a camera may now lie in a slightly different fashion does not alter that.
The camera tells the truth better than our eyes do. It can't represent color, or especially dynamic range, that well, and framing has a way of bending truth, but our eyes are far worse.
Everything we think we see is biased by the viewer, subject to hundreds of unconscious filters and processing steps, and colored by our also imperfect memory. And then we're also biased to ignore all of that and think our vision and memories are objective and faultless truth.
The proofs of this are everywhere, but the most obvious are optical illusions, youtube videos on visual attention, studies on unreliability of witness statements, etc.
About 6 years ago I sat on a jury. I was told by the defense that the plaintiff, who was suing after being critically injured in a workplace accident, was overstating his injuries. They showed video evidence of the plaintiff washing his car. The plaintiff's attorney pointed out that there were timestamps on those car-washing videos and that each wash took place over 4 hours, because he had to rest due to the pain from his sustained injuries. Aside from the facts of the case, which were clearly in favor of the plaintiff, this attempt at deception pushed the jury to award more money than it likely would have otherwise.
Now that storytime is out of the way, this particular AI reminds me of a photo taken at a lynching.
If you obscure the top half of this photo from view, how does that change your perspective regarding what is going on at the time? IMO, recordings need to record what is, not what we believe we want to see.
On the flip side, it was possible to make a lynching look like a harmless social gathering with the technology of the era: point the camera selectively or take a pair of scissors to the image. Loss of timestamps is easily achieved by using any recorded media without timestamps, or any third-rate video editing software. Beria was airbrushed out of reproductions of photos that had circulated for years before he fell out of favour, but the revised images look convincing enough in the absence of that context. Cameras have never given a complete picture of everything going on (and people start worrying about panopticons with proposals that fall well short of that).
Anybody who wants to show only what they want seen can choose whether or not to record, or can edit the recording after the fact. The actual use cases for tech that can decide whether something is worth recording in real time are likely to be comparatively mundane...
Reminds me of this really cool video about using Fourier Optics for optical pattern recognition.[1] The video happens to have one of the best explanations of Fourier transforms I've yet encountered.
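The core trick simulates in a few lines: a matched filter in the Fourier plane turns pattern recognition into cross-correlation, which a 4f optical system computes in a single pass. A generic FFT sketch (toy sizes, not taken from the video):

    import numpy as np

    N = 64
    scene = np.zeros((N, N))
    template = np.ones((8, 8))                # hypothetical target pattern
    scene[40:48, 20:28] = template            # plant it in the scene

    # Matched filter: multiply by the conjugate of the template's spectrum in
    # the Fourier plane, then transform back; the result is the
    # cross-correlation of scene and template, with a bright peak at the match.
    T = np.fft.fft2(template, s=scene.shape)
    corr = np.abs(np.fft.ifft2(np.fft.fft2(scene) * np.conj(T)))

    peak = np.unravel_index(np.argmax(corr), corr.shape)
    print("correlation peak at", peak)        # expect (40, 20): the template's corner

In the optical version, the two FFTs are lenses and the multiplication is a transparency sitting in the Fourier plane, so the whole recognition happens at the speed of light.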
> AI-based camera design was also used to build encryption cameras, providing an additional layer of security and privacy protection. Such an encryption camera, designed using AI-optimized diffractive layers, optically performs a selected linear transformation,
Differentiable schemes do not generally make for secure cryptography.
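To illustrate why: if the camera applies a fixed linear transform y = Ax, anyone with chosen-plaintext access recovers A column by column and then inverts it. A toy sketch, with a dense random matrix standing in for the unknown optical transform and a hypothetical image size:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 16                                    # hypothetical flattened-image size
    A = rng.normal(size=(n, n))               # stand-in for the fixed optical transform
    camera = lambda x: A @ x                  # attacker only needs black-box access

    # Chosen-plaintext attack: imaging each basis vector reads off one column of A
    A_est = np.column_stack([camera(np.eye(n)[:, i]) for i in range(n)])

    secret = rng.normal(size=n)               # a scene the camera "encrypted"
    recovered = np.linalg.solve(A_est, camera(secret))
    print("max error:", np.max(np.abs(recovered - secret)))

A fixed, known-structure linear map is a codebook, not a cipher; real cryptography needs nonlinearity and keys that aren't baked into hardware anyone can probe.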
Given the rise of UFO/UAP sightings, I've always wondered why there isn't just an army of cameras pointed at strategic regions around the globe 24/7 (that aren't government owned). A camera like this would be great for catching only what's really interesting.
Wouldn’t the underlying data that the camera parsed to determine what to record still be recorded or be otherwise retrievable somewhere? Meaning everything is still recorded in some way?
Isn't "erased" a bit misleading from the paper? I understand that the camera does not see anything but objects of interest in which it was trained and manufactured.
This is not going to turn out well. Do we really want to edit reality in this way? This is like the printer that automatically watermarks your prints - for your security and protection! Coming to a child protection law near you real soon.
Want to take a picture of that Ferrari? That’ll be an extra $5.
No, you really can’t take photos in airports.
That's a police officer(s). (Literally) nothing to see here; move along.
Vampires/ghosts: a class of people whose faces are in a master redact database. You know, like some real CIA Jason Bourne stuff.
Military installation? What military installation? Replace with slave labor camp or, in a more economically favorable rendition, "sweatshop."
I was just told by Amazon that I can't sell a used book there because the publisher, Morgan Kaufmann, is currently not accepting applications to sell and denied my request to sell.
I was pretty stupefied. Amazon and the publisher have colluded, for whatever reason, to police how used books are sold.
Who knows? Although, I actually bought this particular book off of Amazon.
In this case, after some research, I actually think this is related to the book sometimes being used as a textbook. I wouldn't really call it a textbook in the usual 30-editions sense, as it's just a really good book on its topic (and only on its second edition, which is the latest). Apparently publishers want to funnel used sales of such books through certain approved sellers? I imagine it's to keep the price artificially high and let the publisher recover some of the money themselves. Seems ridiculous, and it would be surprising if it's legal. It's basically a racket.
You have to in order to sell on their platform. When I tried to add it to my Amazon Seller inventory, it required approval. Selling on AbeBooks (owned by Amazon) requires a subscription now.
> This is like the printer that automatically watermarks your prints - for your security and protection! Coming to a child protection law near you real soon.
You mean like most (or maybe all) color laser printers sold in the last couple decades?
It's already happening with AR; there was a demo on Twitter showing how using Apple's new AR SDK you can just plaster over things you don't want to see. This for me puts AR right up there with AI as a huge risk to society, for precisely the reasons you point out. "Pay $9.99 a month not to see homeless people" "Pay $2.99 a month to see enhanced street signs so only you can find your way quickly" etc
Better hope that's context-sensitive. Street meth addicts commit enough random assaults and smash enough car windows now, just imagine what they would do if they were literally invisible.
You're naive if you think this way. You're not forced to use AR, yet. That's different from never being forced to.
Smartphones are already all but mandatory for certain locations/demographics. Not owning one carries an immense penalty when it comes to access to government services, banking, and general daily functioning.
Such technology could also represent a huge improvement in quality of life. Imagine augmented reality glasses with built-in ad blockers.
It all depends on who's in control of the device. It'll be great if we're in control and the technology is used to empower us. It'll suck if the corporations and governments are in control and the technology is used to manipulate public perception and extract revenue.
We already have enough people acting like their own experiences are indicative of the norm, or are evidence of something happening or not happening, and spreading that message; making this even easier seems like it would be a real problem.
Are news bubbles a problem? Imagine if people actually block out reality on an even more direct level and what that means to their perception of the world. What if people can opt into trusty AR programs to "show" them the stuff in the world they're missing (the conservative conspiracy or liberal agenda), and those also selectively omit some other things?
I'm not sure who you're arguing against. Where did I say we shouldn't allow AR or this technology? I stated why it might be a problem (which is what you asked for), and why we should possibly pay attention to the negative consequences that might follow.
I mean, maybe some people are homeless by choice, but some are due to misfortune and poverty. Using technology to turn a blind eye to poverty in our communities seems bad. Also you may trip over a homeless person.
It would defeat the entire point of having them: as an example of why to obey. It's not like you can't scale a tax with housing requirements. You could give them jobs too. It would take a bit of getting used to, but if the only thing a person wants is drugs, their potential productivity could be 10x that of ours combined.
You may trip over a homeless person if you can't see them.
Also papering over social problems means those social problems will bite you in the ass eventually. Never underestimate the power of a bunch of desperate people.
If you don't want a TV that tracks your viewing habits, forces automatic software updates, shows ads, and does other objectionable things while claiming it's "smart", don't buy one.
Well yeah, those TVs are cheap because they use crappy panels and get subsidized by ads. If you just want a cheap panel and driver, you can message a wholesaler on panelook and buy them for like $200 plus some change for a sound bar.
A fast 4K monitor costs ~$700 because it’s a much better product.
The example of yellow tracking dots in printers has already been mentioned. Our governments had zero problems mandating the use of that specific technology. Same with kill switches in cars so that police can remotely disable your vehicle.
Maybe we could have laws that protect us from these kinds of cameras instead of mandating them. I'm against saying "No, you can't do this," but I'm all for "You must show us how you do this" or "This thing must be optional."
It's moving it away from your control. Right now we have the option to edit the images and videos we capture. This kind of technology allows those same choices to be made by someone else without any regard for your wishes. Your options can be limited to editing only what they allow to be captured in the first place.
Would a government even want that? Not even from a moral stance, just from a strategic point of view. We know from the social media experiment how bad things can go if you hook an entire population on some "product".
Sure, you can get everyone to wear AR glasses (even if only through incentives, because they're so cool), and they'll censor out undesired things. That's as much a form of control as it is a form of being controlled. Hooking your entire population's vision to the internet means it could be maliciously used by bad actors.
It’s not that sophisticated and is a physical artifact that had to be forged for some purpose. Effectively it’s like having a zero power neural network. You could make something like a motion sensor that only spots human faces, but very low power.