Snaps speed? How about you actually use the camera on Android phones? Snapchat doesn't even USE the camera on Android phones. Instead, it takes a screenshot of the screen, resulting in terrible quality. It's really embarrassing that a company this big is still doing this...
Former Snap employee here (I left about 3 years ago). They care a lot about their camera, particularly speed. They do, in fact, use the "proper" camera APIs on devices where they're fast enough, and they hide the latency by showing a screenshot first and then subbing in the higher-quality image later (roughly the pattern sketched below). At the time, though, this was only done for whitelisted devices that could do the operation fast enough; a lot of devices have very slow cameras.
You might disagree with how they weigh speed vs quality, but I assure you they take it very seriously.
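For anyone curious what that looks like in practice, here's a rough sketch (not Snap's actual code) of the screenshot-first, swap-in-later pattern on Android. `takeFullResPhoto` is a made-up stand-in for whatever real still-capture path the app uses (Camera2, CameraX, etc.):

```kotlin
import android.graphics.Bitmap
import android.view.TextureView
import android.widget.ImageView

// Sketch only: show the viewfinder "screenshot" instantly, then quietly replace it
// with the real capture once (and if) it arrives quickly enough.
fun snapWithFastPreview(
    previewView: TextureView,                      // the live viewfinder
    imageView: ImageView,                          // where the taken snap is displayed
    takeFullResPhoto: ((Bitmap) -> Unit) -> Unit   // hypothetical real capture path
) {
    // 1. Grab the current viewfinder frame synchronously -- effectively a screenshot
    //    of what the user is already looking at. This returns in a few milliseconds.
    previewView.bitmap?.let { imageView.setImageBitmap(it) }

    // 2. Kick off a real still capture in the background. On devices where this is
    //    fast (the whitelist), silently swap in the higher-quality image when it lands.
    takeFullResPhoto { fullRes ->
        imageView.post { imageView.setImageBitmap(fullRes) }
    }
}
```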
Thanks for the explanation. My Huawei had the best camera on the market when it was released, but it is obviously not whitelisted. I guess it's difficult to test a Huawei in the USA.
What does “best camera” mean though? Sounds like the poster is talking more about latency and speed than quality, which may be more about the CPU/OS optimization than the camera image quality.
The reason is fairly simple. Most phones have a slight to extreme (on the order of seconds) delay when taking photos. Snaps are about moments. Instagram takes an actual photo, which often results in worse photos overall (the image quality is better, but the moment is often missed).
I'm not sure about this, but I think I remember Snapchat even implementing actual photo capture at some point and it being a source of frustration. Of course, a reasonable compromise would be letting users choose, but it seems like that's just not trendy enough.
A lot of Android phones have a manufacturer-provided camera app with special abilities - for example, it uses private APIs to adjust focus, do zero-shutter-delay capture, select which camera to use on 'quad camera' phones, do 'auto expose on faces in the scene', do various HDR in signal-processing hardware that regular apps don't have access to, etc.
Any app which just uses the Android camera API doesn't have access to this stuff - and usually ends up with very poor-quality images as a result. On my phone, apps that use the Android API can't adjust focus at all, which makes even something like a barcode scanning app useless. Things that say "please take a photo of your ID card in this box" are unusable!
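For reference, this is roughly all the focus control a third-party app gets through the public Camera2 API - whether it does anything is up to the device's HAL, and on phones that only report a fixed-focus mode to non-OEM apps it's effectively a no-op (`builder` and `session` are assumed to be an already-configured preview request builder and capture session):

```kotlin
import android.hardware.camera2.CameraCaptureSession
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraMetadata
import android.hardware.camera2.CaptureRequest

fun triggerAutofocus(
    characteristics: CameraCharacteristics,
    builder: CaptureRequest.Builder,
    session: CameraCaptureSession
) {
    val afModes = characteristics.get(CameraCharacteristics.CONTROL_AF_AVAILABLE_MODES)
    if (afModes == null || afModes.all { it == CameraMetadata.CONTROL_AF_MODE_OFF }) {
        // The HAL exposes no autofocus to regular apps -- the situation described above.
        return
    }
    // Ask for a single autofocus sweep on the next frame.
    builder.set(CaptureRequest.CONTROL_AF_MODE, CameraMetadata.CONTROL_AF_MODE_AUTO)
    builder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_START)
    session.capture(builder.build(), null, null)
}
```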
Which device is this? Focus control has been a pretty basic API element since v1. Unless the device is reporting that it has a fixed lens erroneously, that is super weird.
Having said that, the Camera subsystem is most certainly subpar on Android. It is crash prone, only recovering after a full system reset, which makes developing anything novel a giant pain in the ass. It's also full of random bullshit even if you're using Google's anointed, in-house devices. I remember an update suddenly yielding frame-rate capability values multiplied by 1000, chaos monkey style. Code ends up being littered with "If this bullshit device on this bullshit version, then..." (roughly the kind of thing sketched below). It's a gigantic waste of time.
And then as Google started trying to differentiate their own devices with exclusive imaging features, things really started going off the rails.
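Purely as an illustration of the kind of per-device hack that ends up in the code (the 1000x scaling is just the example from the comment above; no specific models are being blamed here), it tends to look something like this:

```kotlin
import android.hardware.camera2.CameraCharacteristics
import android.util.Range

// Normalize AE target FPS ranges on devices whose HAL reports them scaled by 1000
// (e.g. 30000 instead of 30). Sketch only -- real code would typically also key off
// Build.MODEL / Build.VERSION for the specific offenders.
fun normalizedFpsRanges(characteristics: CameraCharacteristics): List<Range<Int>> {
    val raw = characteristics.get(
        CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES
    ) ?: return emptyList()

    return raw.map { range ->
        // "If this bullshit device on this bullshit version, then..."
        if (range.upper > 1000) Range(range.lower / 1000, range.upper / 1000) else range
    }
}
```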
So in such cases, why not just use a camera activity, or at least expose a configurable choice for which experience the Android user prefers? It is unfortunate that the camera API clearly needs more work, though.
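The camera-activity route is about a dozen lines - hand off to whatever camera app the OEM ships via ACTION_IMAGE_CAPTURE and take what it returns. The catch is that you lose the in-app viewfinder, lenses, AR, etc., which is presumably why Snapchat doesn't do it. A minimal sketch:

```kotlin
import android.content.Intent
import android.graphics.Bitmap
import android.provider.MediaStore
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity

class OemCameraExample : AppCompatActivity() {

    private val takePicture =
        registerForActivityResult(ActivityResultContracts.StartActivityForResult()) { result ->
            // Without EXTRA_OUTPUT, the OEM camera app returns a small thumbnail in the
            // "data" extra; supply EXTRA_OUTPUT with a file Uri to get the full-size image.
            val thumbnail = result.data?.extras?.get("data") as? Bitmap
            // ...hand `thumbnail` (or the full-size file) over to the rest of the app.
        }

    fun launchOemCamera() {
        takePicture.launch(Intent(MediaStore.ACTION_IMAGE_CAPTURE))
    }
}
```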
On the one hand, this seems objectively bonkers, but on the other hand, I can kind of see how it'd make sense. If everybody using Snap uses a phone, and all the phones have more or less similar resolutions, and Snap doesn't allow any sort of zooming, then really the pixels on the original screen are exactly what you need, so why bother taking a new, different picture?
I'm pretty sure I can answer that question (because using a camera as a camera gets you a better picture, because some other phones that will view your photo will have higher resolution, etc), but I do appreciate a clever hack. I'd love to know the actual reasoning behind this. Maybe there was some weird permissions issue between displaying what the camera sees and "taking photos" that affected some users at some point?
> I'd love to know the actual reasoning behind this. Maybe there was some weird permissions issue between displaying what the camera sees and "taking photos" that affected some users at some point?
My bet is that it's simply faster to take a screenshot of the camera's viewfinder than to use the camera to take a photo. Opening the camera already has some delay, and taking a photo introduces more delay before you get the actual bytes to do any processing (see the sketch below). Taking a screenshot tends to be instant, though, so the fastest way to "take a photo" is to screenshot the viewport.
Although this wouldn't work for videos, which I'm sure they use the actual camera for, instead of taking ~25/30 screenshots per second.
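To make the latency point concrete, here's a rough Camera2 sketch of what getting "the actual bytes" from a real still capture involves - a capture request, the shutter, and an ImageReader callback - versus the single synchronous call it takes to grab the viewfinder bitmap (`cameraDevice`, `session`, `imageReader`, and `handler` are assumed to be configured elsewhere, with the ImageReader's surface attached to the session):

```kotlin
import android.hardware.camera2.CameraCaptureSession
import android.hardware.camera2.CameraDevice
import android.media.ImageReader
import android.os.Handler

fun captureStill(
    cameraDevice: CameraDevice,
    session: CameraCaptureSession,
    imageReader: ImageReader,
    handler: Handler,
    onJpeg: (ByteArray) -> Unit
) {
    imageReader.setOnImageAvailableListener({ reader ->
        // Only here -- often a noticeable fraction of a second after the tap, longer on
        // slow devices -- do the actual JPEG bytes show up.
        reader.acquireLatestImage()?.use { image ->
            val buffer = image.planes[0].buffer
            val bytes = ByteArray(buffer.remaining()).also { buffer.get(it) }
            onJpeg(bytes)
        }
    }, handler)

    val request = cameraDevice
        .createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE)
        .apply { addTarget(imageReader.surface) }
        .build()
    session.capture(request, null, handler)
}

// By contrast, the "screenshot" path is a single call along the lines of TextureView.getBitmap().
```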
>videos, which I'm sure they are using the actual camera
This is why, when you start recording a video on Snapchat, there's a delay before the recording starts, and usually also a short delay before audio recording starts (especially noticeable if you have audio playing from your phone at the same time).
It definitely made sense because Snapchat is only used for ephemeral photos, and taking a "proper" photo is extremely slow on many Android phones (or at least it was until recently).
That said, they should have just implemented both solutions and let the user pick.