Hi developers,
I am new to iOS and image processing and have some questions.
I saw that iOS 10 is going to support RAW capture, and I wonder if this could make a real long exposure possible.
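From reading the AVCapturePhotoOutput documentation, I imagine requesting a RAW frame would look roughly like the sketch below. I haven't tried it on a device yet, so I'm not sure I have the types and delegate callbacks exactly right:

    import AVFoundation

    // Rough sketch (untested): ask AVCapturePhotoOutput for a RAW capture
    // instead of the processed JPEG.
    func captureRawFrame(from photoOutput: AVCapturePhotoOutput,
                         delegate: AVCapturePhotoCaptureDelegate) {
        // Ask which RAW (Bayer) pixel formats the device supports.
        guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first else {
            print("RAW capture not supported on this device")
            return
        }
        // Request a RAW capture with that format.
        let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
        photoOutput.capturePhoto(with: settings, delegate: delegate)
        // If I understand the docs, the RAW sample buffer then comes back
        // through the AVCapturePhotoCaptureDelegate callbacks.
    }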
It looks like the long exposure camera apps in the App Store (e.g. Slow Shutter Cam) do their post-processing on several frames that are not RAW data. Therefore, the result is affected by the processing inherited from Apple's algorithms and by JPEG compression, which is the part we can't control before iOS 10.
As far as I know, the RAW format is basically everything the CMOS sensor captures (which I assume to be the RGB energy intensities). If we had several RAW frames, would summing their intensities get us closer to a real long exposure, the same way light accumulates during a physically long exposure? Something like the toy sketch below is what I have in mind.
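    // Toy illustration of what I mean by "summing intensities": if each RAW
    // frame gave us linear sensor values (one Float per photosite), adding them
    // per pixel should behave like photons accumulating during one long
    // exposure. This is just my own sketch, not any Apple API.
    func simulateLongExposure(frames: [[Float]]) -> [Float] {
        guard let first = frames.first else { return [] }
        var accumulated = [Float](repeating: 0, count: first.count)
        for frame in frames {
            for i in 0..<accumulated.count {
                // Linear light adds directly, unlike gamma-encoded JPEG values.
                accumulated[i] += frame[i]
            }
        }
        return accumulated
    }

Is this reasoning correct, or is there something about the RAW data (noise, clipping, alignment between frames) that would stop this from matching a true long exposure?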
Thanks.