When Apple introduced the iPhone 11 generation, it advertised the new camera system with a feature called “Deep Fusion.” That feature is now expected to be released soon.
The Verge claims to have learned that Apple will ship a previously unreleased feature for the iPhone 11 and iPhone 11 Pro in the next iOS 13 beta: “Deep Fusion.” The function is supposed to deliver even better shots in low or poor light by having the A13 Bionic chip further process the image. The feature works in the background and significantly increases image quality.
The Verge explains how Deep Fusion works as follows (quote):
- By the time you press the shutter button, the camera has already taken three images using a fast shutter speed to freeze the motion in the shot. When you press the shutter button, it takes three additional shots and then a longer exposure to capture details.
- These three regular shots and the long exposure are merged into what Apple calls a “synthetic long” – a big difference from Smart HDR.
- Deep Fusion selects the short-exposure image with the most detail and blends it with the synthetic long exposure. Unlike Smart HDR, Deep Fusion blends only these two images, nothing more. These two images are also processed for noise differently than in Smart HDR, in a way that suits Deep Fusion better.
- The images go through four stages of detail processing, pixel by pixel, each one tailored to increasing levels of detail: skies and walls are at the lowest level, while skin, hair, fabrics and so on are at the highest level. This creates a series of weights for the blend of the two images, taking detail from one and tone, color, and brightness from the other.
- The final image is created.
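The steps above can be illustrated with a toy sketch. This is not Apple's actual algorithm (which runs on the A13's neural engine and is unpublished); it is a minimal, hypothetical NumPy illustration of the described idea: merge the regular shots and the long exposure into a “synthetic long,” pick the most detailed short frame, and blend the two per pixel so that high-detail regions draw from the short frame and flat regions draw from the synthetic long. The function names, the gradient-based detail metric, and the single-weight blend are all simplifying assumptions.

```python
import numpy as np

def sharpness(frame):
    """Crude detail metric: mean squared gradient magnitude."""
    gy, gx = np.gradient(frame.astype(np.float64))
    return float((gx**2 + gy**2).mean())

def deep_fusion_sketch(short_frames, regular_frames, long_exposure):
    """Conceptual sketch of the pipeline quoted above (hypothetical).

    short_frames:   three fast-shutter frames taken before the button press
    regular_frames: three frames taken at the button press
    long_exposure:  one longer exposure that captures detail
    All frames are 2-D float arrays in [0, 1] (grayscale for simplicity).
    """
    # 1. Merge the regular shots and the long exposure into a
    #    "synthetic long" (here: a plain per-pixel average).
    synthetic_long = np.mean(regular_frames + [long_exposure], axis=0)

    # 2. Pick the short-exposure frame with the most detail.
    best_short = max(short_frames, key=sharpness)

    # 3. Build per-pixel blend weights from local detail: high-detail
    #    regions (hair, fabric) lean on the short frame, flat regions
    #    (skies, walls) lean on the synthetic long.
    gy, gx = np.gradient(best_short.astype(np.float64))
    detail = np.sqrt(gx**2 + gy**2)
    weights = detail / (detail.max() + 1e-12)  # normalize to [0, 1]

    # 4. Blend: detail from the short frame, tone and brightness
    #    from the synthetic long.
    return weights * best_short + (1.0 - weights) * synthetic_long
```

Because the result is a convex per-pixel combination of two valid images, the output stays in the same value range as the inputs; the real system's "four stages of detail processing" would replace the single gradient-based weight map with learned, multi-scale weights.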
Exactly which beta this will be remains unclear for now; “Deep Fusion” will probably arrive with the iOS 13.2 beta. The new developer preview should be released soon. (Photo by Vershinin89 / Bigstockphoto)