Deep Fusion, the computational photography feature that did not ship with the new iPhone 11 phones at launch, is now available in the iOS 13.2 public beta (via The Verge). Computational photography uses software to improve image quality beyond what the camera hardware alone can capture.
When Apple announced the iPhone 11 and iPhone 11 Pro on stage last month, it showed off the cameras, and rightfully so. With two lenses on the 11 and three on the 11 Pro, iPhone owners are finally getting what some of the Android world has had for a couple of years: an ultra-wide option.
Night mode on the iPhone 11 is also drastically improved. But one feature that Phil Schiller got really excited about is what Apple calls Deep Fusion, possibly one of the most Apple names for something, up there with Retina, True Tone and AirPort.
It’s something Google has done very well on the Pixel phones, but Apple is catching up. Interestingly, Deep Fusion is not user-accessible but is simply built into the camera app. It will know when to apply its magic – very Apple.
It’ll use AI and software to produce the best possible shots in lower light, though it is separate from Night Mode. We can’t wait to try it out – our best camera feature roundup will be due an update.
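Apple hasn’t published how Deep Fusion works under the hood, but the general idea behind this kind of multi-frame computational photography is well known: the camera captures several exposures of the same scene and merges them so random sensor noise cancels out. The sketch below is purely illustrative (the pixel values, noise level and nine-frame count are assumptions, not Apple’s pipeline) and shows why averaging frames cleans up a low-light shot.

```python
# Illustrative sketch only: Deep Fusion's real pipeline is proprietary and
# far more sophisticated (per-pixel, machine-learning-driven merging).
# This demo shows the core principle of multi-frame fusion: averaging
# several noisy exposures reduces random sensor noise.
import random

random.seed(0)

TRUE_PIXEL = 100.0  # the "real" brightness of one pixel in the scene

def capture_frame(noise=10.0):
    """Simulate one noisy low-light exposure of a single pixel."""
    return TRUE_PIXEL + random.gauss(0, noise)

def fuse(frames):
    """Naive fusion: average the same pixel across all captured frames."""
    return sum(frames) / len(frames)

single = capture_frame()                                # one noisy shot
fused = fuse([capture_frame() for _ in range(9)])       # nine-frame merge

# The fused value is typically much closer to the true brightness,
# because independent noise shrinks as more frames are averaged.
print(abs(single - TRUE_PIXEL), abs(fused - TRUE_PIXEL))
```

Averaging nine frames cuts the noise standard deviation by a factor of three (the square root of the frame count), which is why burst-and-merge approaches like this are the backbone of low-light modes on both Pixel and iPhone cameras.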