Oh interesting! This seems like it would be a good fit ;)
Especially for really old setups that had RGB color wheels and multiple exposures, exactly like a multispectral astro image might. Phase One also has a multispectral capture system for cultural heritage, which just shoots individual IIQs to my knowledge… It would work great too for multiple pixel-shift shots.
Possibly, the engineers just didn’t know about it when they were asked to write the firmware? It’s funny, I think most RAW formats are just weird TIFFs to some degree, so why not use this instead.
TIFF is almost a multidimensional array serialization format. It's obviously centered on images, but it can have many layers, usually RGBA, though they can have other interpretations. It supports some level of streamed writing and random access over HTTP or other ranged protocols.
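For instance, here's a rough sketch of stuffing multiple exposures into one multi-page TIFF, assuming Python's tifffile library (file name, band count, and sizes are just made up for illustration):

    import numpy as np
    import tifffile

    # Pretend these are five exposures, one per filter-wheel position.
    frames = np.random.randint(0, 65535, size=(5, 512, 512), dtype=np.uint16)

    # Each 2D frame becomes its own page (IFD) in a single file.
    tifffile.imwrite("multispectral.tif", frames)

    # Individual pages can be read back without loading the whole file.
    with tifffile.TiffFile("multispectral.tif") as tif:
        band = tif.pages[3].asarray()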
> Possibly, the engineers just didn’t know about it when they were asked to write the firmware?
Considering how often I've witnessed engineers trying to build something to solve a problem instead of sitting down and researching whether someone else has already done it, and likely better, I really wouldn't be surprised if that's the answer to most questions in this thread.