Snap's speed? How about actually using the camera on Android phones? Snapchat doesn't even USE the camera on Android phones. Instead, it takes a screenshot of the screen, resulting in terrible quality. It's really embarrassing that they're still doing this at a company this big...



Former Snap employee here (albeit 3 years ago). They care a lot about their camera, particularly speed. They do, in fact, use the "proper" camera APIs on the devices where it's fast enough, and they'll hide the latency by using a screenshot first and then subbing in the higher quality image later. But at the time this was only done for whitelisted devices that could do this operation fast enough. A lot of devices have very slow cameras.

You might disagree with how they weigh speed vs quality, but I assure you they take it very seriously.
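Mechanically, the trick looks something like this today (a from-memory sketch using modern CameraX, not actual Snap code; showSnap() and isFastCaptureDevice() are made-up stand-ins):

    import android.graphics.Bitmap
    import android.os.Build
    import androidx.camera.core.ImageCapture
    import androidx.camera.core.ImageCaptureException
    import androidx.camera.core.ImageProxy
    import androidx.camera.view.PreviewView
    import androidx.core.content.ContextCompat

    // Hypothetical helpers, for illustration only:
    fun showSnap(bitmap: Bitmap) { /* render/send the snap */ }
    fun isFastCaptureDevice(model: String): Boolean = false // whitelist lookup

    fun captureSnap(previewView: PreviewView, imageCapture: ImageCapture) {
        // 1. Grab the frame already on screen; effectively instant.
        previewView.bitmap?.let { showSnap(it) }

        // 2. On whitelisted (fast) devices, also request a real photo
        //    and swap it in once it arrives.
        if (isFastCaptureDevice(Build.MODEL)) {
            imageCapture.takePicture(
                ContextCompat.getMainExecutor(previewView.context),
                object : ImageCapture.OnImageCapturedCallback() {
                    override fun onCaptureSuccess(image: ImageProxy) {
                        showSnap(image.toBitmap()) // replace the quick frame
                        image.close()
                    }
                    override fun onError(e: ImageCaptureException) {
                        // keep the viewfinder frame; a soft snap beats no snap
                    }
                }
            )
        }
    }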


Thanks for the explanation. My Huawei had the best camera on the market when it was released, but it is obviously not whitelisted. I guess it's difficult to test a Huawei in the USA.


What does “best camera” mean though? Sounds like the poster is talking more about latency and speed than quality, which may be more about the CPU/OS optimization than the camera image quality.


What do you mean by slow cameras? The people I hear complaining the most are people with high-end phones.

So does that mean high-end phones aren't that fast after all? (And what is high-end, actually? Price, performance, both?)

Maybe it's latency: opening the camera, or getting the image itself?


Just out of curiosity, is the whitelist public?


The reason is fairly simple. Most phones have a slight to extreme (on the order of seconds) delay when taking photos. Snaps are about moments. Instagram takes an actual photo, and it often results in much worse photos overall (the picture quality is better, but the moment is often missed).

I'm not sure about this, but I think I remember Snapchat even having implemented actual photo taking at one point and being frustrated with it. Of course, a reasonable compromise would be letting users choose, but it seems like that's just not trendy enough.


A lot of Android phones have a manufacturer-provided camera app with special abilities: for example, it uses private APIs to adjust focus, do zero-shutter-delay capture, select which camera to use on 'quad camera' phones, auto-expose on faces in the scene, do various HDR in signal-processing hardware that regular apps don't have access to, etc.

Any app that just uses the Android camera API doesn't have access to this stuff, and usually ends up with very poor quality images as a result. On my phone, apps that use the Android API can't adjust focus at all, which makes even things like a barcode scanning app useless. Things that say "please take a photo of your ID card in this box" are unusable!
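You can at least see what the public API claims your device supports with a quick Camera2 probe (a minimal sketch; a reported minimum focus distance of 0 means the framework is declaring the lens fixed-focus):

    import android.content.Context
    import android.hardware.camera2.CameraCharacteristics
    import android.hardware.camera2.CameraManager

    fun logFocusSupport(context: Context) {
        val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
        for (id in manager.cameraIdList) {
            val chars = manager.getCameraCharacteristics(id)
            val afModes = chars.get(CameraCharacteristics.CONTROL_AF_AVAILABLE_MODES)
            val minFocus = chars.get(CameraCharacteristics.LENS_INFO_MINIMUM_FOCUS_DISTANCE)
            // minFocus == 0f means "fixed focus": AF requests are meaningless.
            println("camera $id: afModes=${afModes?.toList()} minFocusDiopters=$minFocus")
        }
    }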


Which device is this? Focus control has been a pretty basic API element since v1. Unless the device is erroneously reporting that it has a fixed-focus lens, that is super weird.

Having said that, the camera subsystem is most certainly subpar on Android. It is crash-prone, only recovering after a full system reset, which makes developing anything novel a giant pain in the ass. It is also full of random bullshit even if you're using Google's anointed in-house devices. I remember an update suddenly yielding frame-rate capability values multiplied by 1000, chaos-monkey style. Code ends up littered with "if this bullshit device on this bullshit version, then...". It's a gigantic waste of time.
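The fix for that fps episode ends up looking something like this (a sketch of the heuristic only; real quirk tables also key on model and OS version):

    import android.hardware.camera2.CameraCharacteristics
    import android.util.Range

    // Some devices report AE target fps ranges as e.g. 15000..30000
    // instead of 15..30. Heuristically scale them back down.
    fun normalizedFpsRanges(chars: CameraCharacteristics): List<Range<Int>> {
        val raw = chars.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES)
            ?: return emptyList()
        return raw.map { r ->
            if (r.upper >= 1000) Range(r.lower / 1000, r.upper / 1000) else r
        }
    }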

And then as Google started trying to differentiate their own devices with exclusive imaging features, things really started going off the rails.


So in such cases, why not just use a camera activity, or at least expose a configurable choice for which experience the Android user prefers? It is unfortunate that the camera API clearly needs more work, though.
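For reference, delegating to a camera activity is a one-intent affair (a minimal sketch using the classic startActivityForResult flow; newer code would use ActivityResultContracts.TakePicture):

    import android.app.Activity
    import android.content.Intent
    import android.provider.MediaStore

    const val REQUEST_IMAGE_CAPTURE = 1 // arbitrary request code

    // Hands capture off to the device's own camera app, which gets the
    // vendor's private processing (focus, HDR, etc.) for free.
    fun launchSystemCamera(activity: Activity) {
        val intent = Intent(MediaStore.ACTION_IMAGE_CAPTURE)
        if (intent.resolveActivity(activity.packageManager) != null) {
            activity.startActivityForResult(intent, REQUEST_IMAGE_CAPTURE)
        }
    }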


On the one hand, this seems objectively bonkers, but on the other hand, I can kind of see how it'd make sense. If everybody using Snap uses a phone, and all the phones have more or less similar resolutions, and Snap doesn't allow any sort of zooming, then really the pixels on the original screen are exactly what you need, so why bother taking a new, different picture?

I'm pretty sure I can answer that question (because using a camera as a camera gets you a better picture, because some other phones that will view your photo will have higher resolution, etc), but I do appreciate a clever hack. I'd love to know the actual reasoning behind this. Maybe there was some weird permissions issue between displaying what the camera sees and "taking photos" that affected some users at some point?


> I'd love to know the actual reasoning behind this. Maybe there was some weird permissions issue between displaying what the camera sees and "taking photos" that affected some users at some point?

My bet is that it's simply faster to take a screenshot of the camera's viewfinder than to use the camera to take a photo. Opening the camera already has some delay, and taking a photo introduces more delay before you get the actual bytes to process. Taking a screenshot tends to be instant, so to take photos as fast as possible, you screenshot the viewport.
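In Android terms, the "screenshot" can be as simple as copying the preview frame out of the viewfinder (a sketch, assuming the classic camera-preview-in-a-TextureView setup):

    import android.graphics.Bitmap
    import android.view.TextureView

    // Copies whatever frame the preview surface is showing right now,
    // skipping the capture pipeline entirely. Resolution is capped at
    // the preview stream, which is why these "photos" look soft.
    fun grabViewfinderFrame(viewfinder: TextureView): Bitmap? = viewfinder.bitmap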

Although this wouldn't work for videos, which I'm sure they use the actual camera for, instead of taking ~25-30 screenshots per second.


And my bet is that they wanted to keep the user's expectation of what they're sharing as close as possible to what actually gets shared.

A typical user believes they're sharing a screen-resolution image with no hidden details in it.


> videos, which I'm sure they use the actual camera for

This is why, when you start recording a video on Snapchat, there's a delay before the recording starts, and usually also a short delay before audio recording starts (especially noticeable if you have audio playing on your phone at the same time).


Surely snaps are not about quality but about being able to quickly capture an image across devices?

Plus, given that people can't zoom in, what's the problem?


"Snap.. a camera company". Hilarious.


That was my first impression also.


Wait, can you explain this to me?


Snapchat "takes a picture" by just grabbing a frame from the viewfinder instead of, you know, using the proper API.

This could have made sense back when the Android imaging situation was more of a mess, but the current APIs obviate the need for it.
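For example, CameraX now has a latency-optimized capture mode, plus zero-shutter-lag on supported devices (a minimal sketch, assuming a recent CameraX version):

    import androidx.camera.core.ImageCapture

    val imageCapture = ImageCapture.Builder()
        .setCaptureMode(ImageCapture.CAPTURE_MODE_MINIMIZE_LATENCY)
        // or ImageCapture.CAPTURE_MODE_ZERO_SHUTTER_LAG where supported
        .build()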


It definitely made sense because Snapchat is only used for ephemeral photos, and taking a "proper" photo is extremely slow on many Android phones (or at least it was until recently).

That said, they should have just implemented both solutions and let the user pick.


It's likely just a case of "if it ain't broken, there's no budget".


It is pretty broken though, since the image looks terrible because it skips a lot of the image processing steps.




