Preorders for Apple’s new iPhone 11 and iPhone 11 Pro models have only just begun, and the devices aren’t yet available to us mere mortals, but that hasn’t stopped at least one person from getting her hands on one and sharing some initial examples of how well the new Night Mode holds up in the real world.
Apple of course showed off Night Mode photography when it unveiled the iPhone 11 earlier this week, explaining how it will allow pictures to be taken in extremely low-light conditions without the need for a flash. Its sample photos were impressive, but they were also presumably taken under ideal conditions.
Welcome to the Real World
However, 9to5Mac has now discovered what may be the very first real-world image shot on the iPhone 11 using Night Mode, and it looks like it really will be everything that Apple is promising.
While not offering any explanation for how she got her hands on the new iPhone 11, Canadian model and agency founder Coco Rocha shared a comparison on Wednesday night of photos shot on an iPhone 11 and an iPhone X, showing a woman standing outdoors on an urban street corner. The images reveal that the iPhone 11’s ability to make extremely dark photos look normal is actually quite impressive, both for what it does and what it doesn’t do.

https://twitter.com/cocorocha/status/1171957593998352384
Many recent machine learning implementations of Night Mode photography have had a tendency to blow out the exposure a bit too much. In short, some of these night modes end up looking like a weird sort of faux daylight from a 1960s movie set, rather than preserving the appropriate levels of highlight and shadow that make it clear you’re still shooting in a darker setting.
Apple’s implementation of Night Mode, on the other hand, looks significantly more natural, leaving you with photos that simply look like night-time shots with better lighting. In fact, without seeing the before-and-after comparisons, it wouldn’t be evident that any special computational photography was used at all. It’s also especially useful that Apple’s Night Mode promises to come on automatically whenever it’s needed, although by all reports you’ll still be able to override this behaviour.
How It Works
According to Apple, Night Mode uses advanced machine learning techniques to fuse the best possible image from a series of photos taken in rapid succession at different exposure levels. Optical image stabilization steadies the lens during the shooting process, and iOS 13, together with the A13 Bionic’s Neural Engine, analyzes the images, realigning them to correct for movement, discarding blurry sections, and piecing together the sharpest portions.
The Night Mode processing then adjusts contrast to balance out the image, fine-tunes the colors for a more natural look, applies noise reduction, and enhances highlights and shadows. All of this happens in a fraction of a second, thanks to the power of the A13 chip.
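Apple hasn’t published how its pipeline works internally, but the core idea of fusing the sharpest parts of several aligned frames can be sketched in a few lines. This is a minimal illustration, not Apple’s algorithm: the per-pixel sharpness heuristic (a simple Laplacian) and the function names are assumptions for the example, and real pipelines add alignment, denoising, and tone mapping on top.

```python
import numpy as np

def sharpness_map(frame):
    # Approximate local sharpness with the magnitude of a discrete
    # Laplacian: sharp regions have strong second derivatives.
    lap = (np.roll(frame, 1, 0) + np.roll(frame, -1, 0)
           + np.roll(frame, 1, 1) + np.roll(frame, -1, 1)
           - 4 * frame)
    return np.abs(lap) + 1e-6  # small floor avoids all-zero weights

def fuse_frames(frames):
    # Blend already-aligned frames, weighting each pixel by how sharp
    # it is in that frame, so blurry regions contribute little.
    stack = np.stack(frames)                         # (n, h, w)
    weights = np.stack([sharpness_map(f) for f in frames])
    weights /= weights.sum(axis=0, keepdims=True)    # normalize per pixel
    return (weights * stack).sum(axis=0)

# Example: fuse three simulated grayscale exposures.
rng = np.random.default_rng(0)
frames = [rng.random((8, 8)) for _ in range(3)]
fused = fuse_frames(frames)
```

Because the fusion is a per-pixel weighted average, the result always stays within the range of the input frames at each pixel, which is one reason this style of blending looks natural rather than artificially brightened.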
While Apple’s new iPhone 11 Pro models will offer a triple-lens camera system, adding a 2X telephoto lens for better optical zoom, it appears that the new Night Mode and other computational photography features like Deep Fusion will be available on all iPhone 11 models, all of which are expected to begin arriving in stores and in customers’ hands next Friday.