
Apple just released iOS 13.2 beta 1 with support for Deep Fusion camera tech

Published Oct 2nd, 2019 1:04PM EDT
iOS 13.2 Beta 1 Release (Image: Apple)


Without question, the biggest selling point of Apple’s new iPhone 11 lineup is the devices’ new and more advanced cameras. More specifically, the main attraction of the iPhone 11 cameras is Night mode, a feature that lets users take remarkably crisp photos even in low-light environments.

While the early reviews of the iPhone 11’s camera performance were overwhelmingly positive, one area where reviewers found room for improvement was photos taken in medium-light environments. Not to worry: Apple has a new image processing framework dubbed Deep Fusion designed to fix that. Even better, Apple just released a new iOS beta with Deep Fusion. So if you want to take advantage of even more impressive camera performance, along with the usual assortment of security patches and bug fixes, you’ll want to download the new update right away.

Notably, Apple’s Deep Fusion technology isn’t a feature that can be toggled on and off. Instead, the iPhone actively detects the lighting conditions in a given environment and puts Deep Fusion to work automatically, if need be.

Deep Fusion, Apple notes, “is a new image processing system enabled by the Neural Engine of A13 Bionic. Deep Fusion uses advanced machine learning to do pixel-by-pixel processing of photos, optimizing for texture, details and noise in every part of the photo.”

As to how it all works, The Verge explains (a rough code sketch follows the list):

  1. By the time you press the shutter button, the camera has already grabbed three frames at a fast shutter speed to freeze motion in the shot. When you press the shutter, it takes three additional shots and then one longer-exposure shot to capture detail.
  2. Those three regular shots and long-exposure shot are merged into what Apple calls a “synthetic long.” This is a major difference from Smart HDR.
  3. Deep Fusion picks the short-exposure image with the most detail and merges it with the synthetic long exposure. Unlike Smart HDR, Deep Fusion only merges these two frames, not more. These two images are also processed for noise differently than Smart HDR, in a way that’s better for Deep Fusion.
  4. The images are run through four detail processing steps, pixel by pixel, each tailored to increasing amounts of detail — the sky and walls are in the lowest band, while skin, hair, fabrics, and so on are the highest level. This generates a series of weightings for how to blend the two images — taking detail from one and tone, color, and luminance from the other.
  5. The final image is generated.
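
To make those steps concrete, here is a minimal Swift sketch of the pipeline The Verge describes. To be clear, this is a conceptual illustration, not Apple’s code: Deep Fusion runs inside the camera pipeline on the A13’s Neural Engine and exposes no public API, so the `Frame` type, the synthetic-long weighting, the variance-based sharpness score, and the per-pixel blend heuristic below are all invented stand-ins.

```swift
import Foundation

// Illustrative stand-in for a captured frame: grayscale luminance values in 0...1.
struct Frame {
    var pixels: [Double]
}

// Step 2: merge the three regular shots and the long exposure into a
// "synthetic long." The 50/50 weighting here is invented for illustration.
func syntheticLong(regular: [Frame], longExposure: Frame) -> Frame {
    var out = [Double](repeating: 0, count: longExposure.pixels.count)
    for i in out.indices {
        let regularMean = regular.map { $0.pixels[i] }.reduce(0, +) / Double(regular.count)
        out[i] = 0.5 * regularMean + 0.5 * longExposure.pixels[i]
    }
    return Frame(pixels: out)
}

// A crude sharpness score (overall variance) used in step 3 to pick the most
// detailed short-exposure frame; Apple's actual metric is not public.
func detailScore(_ frame: Frame) -> Double {
    let mean = frame.pixels.reduce(0, +) / Double(frame.pixels.count)
    return frame.pixels.reduce(0) { $0 + ($1 - mean) * ($1 - mean) }
}

// Steps 3-5: pick the sharpest short frame, then blend it with the synthetic
// long pixel by pixel. High-detail pixels lean on the short frame for texture;
// flat regions take tone and luminance from the synthetic long.
func deepFusionSketch(shortFrames: [Frame], regular: [Frame], longExposure: Frame) -> Frame {
    let sharpest = shortFrames.max { detailScore($0) < detailScore($1) }!  // assumes at least one short frame
    let synthetic = syntheticLong(regular: regular, longExposure: longExposure)
    var out = [Double](repeating: 0, count: sharpest.pixels.count)
    for i in out.indices {
        // Toy per-pixel "detail" weight: how much the two frames disagree locally.
        let weight = min(1.0, abs(sharpest.pixels[i] - synthetic.pixels[i]) * 4.0)
        out[i] = weight * sharpest.pixels[i] + (1.0 - weight) * synthetic.pixels[i]
    }
    return Frame(pixels: out)
}
```

In the real pipeline, those per-pixel weights come from a machine learning model that sorts the scene into detail bands (sky and walls at the low end; skin, hair, and fabric at the high end) rather than from a simple contrast heuristic like the one above.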

To download the latest beta, you can hop over to Settings > General > Software Update on your iOS device and then hit “Download and Install” toward the bottom of the page. Note that the update will only appear if you have Apple’s beta profile installed on your device. As usual, you’ll want to make sure that your device is fully backed up before you install a new iOS update of any kind.

Yoni Heisler, Contributing Writer

Yoni Heisler has been writing about Apple and the tech industry at large for over 15 years. A lifelong Mac user and Apple expert, his writing has appeared in Edible Apple, Network World, MacLife, Macworld UK, and TUAW.

When not analyzing the latest happenings with Apple, Yoni enjoys catching improv shows in Chicago, playing soccer, and cultivating new TV show addictions.