How iPhone 11 Pro Max Deep Fusion works in iOS 13.2

Imran Hussain

Deep Fusion is Apple's computational photography mode, exclusive to the iPhone 11 Pro and its Max variant. It uses machine learning to process photos pixel by pixel in real time, identifying subjects and scenery to improve detail. This is made possible by the powerful new A13 Bionic chip and its Neural Engine. Here are the details on how it works and how it differs from Smart HDR.


Deep Fusion on iPhone 11 Pro Max

Apple announced Deep Fusion during the iPhone 11 Pro launch event. Phil Schiller spoke about the feature on stage, calling it 'computational photography mad science', and demonstrated the output with a photo of a man wearing a sweater. Zooming in on the sweater revealed an incredible amount of detail, which is not the kind of fidelity you expect a smartphone to capture.

If you have heard of Smart HDR, the iPhone 11 Pro Max's Deep Fusion will sound very familiar. Like Smart HDR, Deep Fusion takes multiple photos at varying exposures. Here is how Deep Fusion uses these photos.

  • Before the shutter button is pressed, 4 short exposure frames and 4 standard exposure frames are captured into the buffer.
  • When the shutter button is pressed, 1 long exposure image is captured.
  • Using machine learning, the Neural Engine selects the short exposure frame with the most detail and combines it with the long exposure image. It then analyzes each of the roughly 24 million pixels in the resulting image individually. Different types of processing are applied to subjects and backgrounds such as skin, hair and sky. The image is sharpened and more detail is recovered by using the best parts of the different captured frames. Tone, color and other attributes like highlights and shadows are adjusted accordingly, and the final image is optimized for low noise and high detail. All of this is done using 4 neural networks.

Even though the frames are captured in real time, processing the final photo takes about a second. Photos captured using this technique also have larger file sizes than normally captured photos.
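The steps above can be sketched in code. This is a simplified illustration, not Apple's actual pipeline: it picks the short exposure frame with the most detail, then blends it pixel by pixel with the long exposure. The frames are modeled as flat lists of brightness values, and the detail score and blend weights are made-up placeholders.

```python
# Illustrative sketch of multi-frame fusion (NOT Apple's real algorithm).
# Frames are flat lists of pixel brightness values (0-255).

def sharpness(frame):
    """Rough detail score: sum of absolute differences between neighboring pixels."""
    return sum(abs(a - b) for a, b in zip(frame, frame[1:]))

def deep_fusion_sketch(short_frames, long_frame):
    # 1. Select the short-exposure frame with the most detail.
    best_short = max(short_frames, key=sharpness)
    # 2. Blend pixel by pixel: lean on the short frame where it differs
    #    strongly from the long frame (i.e. where it carries detail),
    #    and on the long frame elsewhere. Weights/threshold are hypothetical.
    fused = []
    for s, l in zip(best_short, long_frame):
        w = 0.7 if abs(s - l) > 10 else 0.3
        fused.append(round(w * s + (1 - w) * l))
    return fused
```

The real pipeline runs per-region neural networks rather than a fixed blend weight, but the overall shape — frame selection followed by per-pixel combination — is the same idea.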

Smart HDR, on the other hand, combines all of the frames shot before and after the shutter is pressed, rather than just a selected pair. Noise processing is also applied differently to Smart HDR photos, since the mode operates in sufficiently bright conditions.

Which iPhone 11 Pro lenses work with Deep Fusion?

To use Deep Fusion, you must capture a photo in mid to low light conditions. If the scene is too bright, Deep Fusion will not engage and Smart HDR will be used instead. If the light is too low, Night Mode will automatically kick in. Across the three lenses of the iPhone 11 Pro and its larger Max variant, this is how Deep Fusion, Smart HDR and Night Mode are distributed:

  • Standard wide lens
    • Supports Deep Fusion for low to medium light conditions
    • Supports Smart HDR for bright conditions
    • Supports Night Mode for very low light conditions (below 10 lux)
  • Telephoto lens
    • Supports Deep Fusion for low to medium light conditions
    • Supports Smart HDR for bright conditions
    • No Night Mode support. The 2x zoom in Night Mode doesn't use the telephoto lens; that zoom is digital, not optical.
  • Ultra-wide lens
    • Does not support Deep Fusion
    • Supports Smart HDR for bright conditions
    • No Night Mode support
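The lens and light rules above amount to a simple decision table, sketched below. Only the 10 lux Night Mode cutoff for the wide lens comes from the article; the mid-light boundary is a placeholder, since Apple does not publish its thresholds.

```python
# Hypothetical mode picker mirroring the lens table above.
# The 10-lux Night Mode cutoff is from the article; BRIGHT_LUX is an
# assumed placeholder for the Deep Fusion / Smart HDR boundary.

BRIGHT_LUX = 600  # assumption: above this, the scene counts as "bright"

def camera_mode(lens, lux):
    if lens == "ultra-wide":
        return "Smart HDR"          # ultra-wide supports neither Deep Fusion nor Night Mode
    if lens == "wide" and lux < 10:
        return "Night Mode"         # very low light, wide lens only
    if lux < BRIGHT_LUX:
        return "Deep Fusion"        # low to medium light on wide/telephoto
    return "Smart HDR"              # bright conditions
```

Note how the telephoto lens falls through to Deep Fusion even in very low light, matching the lack of Night Mode support listed above.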

To use Deep Fusion, make sure that the 'Photos Capture Outside the Frame' option is disabled in Settings > Camera.

Here are some of the best Deep Fusion photo comparisons from around the web, which showcase how effective Apple's new computational photography feature is.

Deep Fusion will be released for iPhone 11 Pro Max through iOS 13.2 software update later this year.
