> It's important to note that recent iPhones contain the same M1 chip as last year's MacBooks. They are not limited by processing power.
Correct me if I'm wrong, but isn't there still a huge issue with silicon processing capability when it comes to image sensors (which is why cameras buffer images between shots)? I unfortunately don't remember the context, but it said something along the lines of it being (very far) from possible to actually get all the sensor data into the processor, with the result that you only capture a small fraction of the incident light/photons in practice.
This isn't true, really: there are massive amounts of data being pushed about in a modern camera, through custom image pipelines that do many gigabytes per second of throughput, but none of those are really bottlenecks for the sensor data per se.
Most of it ends up in a RAM-like write buffer, because SD cards, and even faster formats like CFexpress, can't keep up with tens of 50-megapixel shots coming through every second.
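As a rough back-of-the-envelope sketch of why the buffer is needed (the bit depth, burst rate, and card write speeds below are assumptions for illustration, not any particular camera's or card's specs):

```python
# Why a fast in-camera buffer sits between the sensor and the card.
# All figures are illustrative assumptions.

MEGAPIXELS = 50e6        # sensor resolution (pixels)
BITS_PER_PIXEL = 14      # assumed raw bit depth
FPS = 20                 # assumed burst rate

SD_UHS_II_WRITE = 250e6  # ~250 MB/s sustained (assumed)
CFEXPRESS_WRITE = 1.2e9  # ~1.2 GB/s sustained (assumed)

bytes_per_frame = MEGAPIXELS * BITS_PER_PIXEL / 8
burst_rate = bytes_per_frame * FPS  # bytes/s coming off the sensor

print(f"Raw frame size   : {bytes_per_frame / 1e6:.0f} MB")
print(f"Burst data rate  : {burst_rate / 1e9:.2f} GB/s")
print(f"SD card shortfall: {(burst_rate - SD_UHS_II_WRITE) / 1e9:.2f} GB/s")
print(f"CFexpress shortfall: {(burst_rate - CFEXPRESS_WRITE) / 1e9:.2f} GB/s")
```

Under those assumptions the sensor produces roughly 1.75 GB/s during a burst, and even a fast card falls short by hundreds of MB/s, so the buffer absorbs the difference until the burst ends.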
There are sensor readout speed limits, which is why you don't see cameras exceed 30 frames per second of 8K recording, but there's no reason why you couldn't read out the entire full-well capacity of the sensor in each of those frames.
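For a sense of scale, here's a similarly rough sketch of what full-sensor readout at 8K/30 implies (pixel count, bit depth, and frame rate are again assumptions for illustration):

```python
# Rough scale of full-sensor readout at 8K/30.
# Figures are illustrative assumptions, not a specific sensor's datasheet.

WIDTH, HEIGHT = 7680, 4320   # 8K frame, ~33 MP
BITS_PER_PIXEL = 12          # assumed readout bit depth
FPS = 30

pixels = WIDTH * HEIGHT
readout_rate = pixels * BITS_PER_PIXEL / 8 * FPS  # bytes/s off the sensor

print(f"Pixels per frame: {pixels / 1e6:.1f} MP")
print(f"Sensor readout  : {readout_rate / 1e9:.2f} GB/s")
```

That works out to around 1.5 GB/s of continuous readout under these assumptions; the ceiling is how fast the readout path can run, not how much charge each pixel collected.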