Multi-Camera Support in iOS 13 Means Third-Party Apps Can Get More Creative

iPhone 11 concept render. Credit: Input

Although Apple showed us a huge list of upcoming iOS 13 features during Monday’s WWDC Keynote — so many in fact that it could barely talk about them all on stage — we often hear about even more new stuff once developers get into the actual sessions with Apple engineers.

This is especially true for aspects of iOS that provide new under-the-hood features for developers to take advantage of. This stuff is rarely splashy enough to show off on stage, but it can make a huge difference in terms of what can be offered by third-party apps, which are basically at the mercy of what Apple lets them do with its operating system.

One interesting feature that came to light is that iOS 13 will support multi-camera capture, allowing apps to pull in photos and video from multiple cameras simultaneously on the iPhone XS, iPhone XS Max, iPhone XR, and the latest iPad Pro. 9to5Mac, which first reported on the session, offers some insight into how this will work and the practical ways developers may be able to use it.
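
In the iOS 13 SDK, this capability surfaces as AVFoundation's new AVCaptureMultiCamSession class. As a minimal sketch, an app can check whether the device it's running on supports multi-camera capture before setting anything up:

```swift
import AVFoundation

// AVCaptureMultiCamSession is new in iOS 13. On unsupported hardware
// (anything older than the 2018 A12-based devices) this reports false.
if AVCaptureMultiCamSession.isMultiCamSupported {
    let session = AVCaptureMultiCamSession()
    // Configure camera inputs and outputs here.
} else {
    // Fall back to a traditional single-camera AVCaptureSession.
}
```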

The most obvious application is picture-in-picture support, which was briefly shown during Apple’s presentation: a video recording app can simultaneously capture the user’s face with the front TrueDepth camera while recording full video from the main camera on the back. Even more interesting, the capture framework was shown recording and storing video data from both streams in such a way that users can switch between the two cameras during playback in the iOS Photos app.
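
For the curious, here is a rough sketch of how that front-and-back setup might be wired together with AVCaptureMultiCamSession. The structure follows Apple's documented pattern of adding inputs and outputs without automatic connections, but the specific function names and error handling are illustrative, not Apple's sample code:

```swift
import AVFoundation

enum MultiCamError: Error { case camerasUnavailable }

// A sketch: one multi-cam session capturing the back wide-angle camera
// and the front TrueDepth camera at the same time.
func makeDualCameraSession() throws -> AVCaptureMultiCamSession {
    let session = AVCaptureMultiCamSession()
    session.beginConfiguration()
    defer { session.commitConfiguration() }

    guard
        let back = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back),
        let front = AVCaptureDevice.default(.builtInTrueDepthCamera, for: .video, position: .front)
    else { throw MultiCamError.camerasUnavailable }

    let backInput = try AVCaptureDeviceInput(device: back)
    let frontInput = try AVCaptureDeviceInput(device: front)

    // Multi-cam sessions require explicit connections, so inputs and
    // outputs are added without the automatic ones.
    session.addInputWithNoConnections(backInput)
    session.addInputWithNoConnections(frontInput)

    let backOutput = AVCaptureVideoDataOutput()
    let frontOutput = AVCaptureVideoDataOutput()
    session.addOutputWithNoConnections(backOutput)
    session.addOutputWithNoConnections(frontOutput)

    // Connect each camera's video port to its own output stream.
    if let backPort = backInput.ports(for: .video, sourceDeviceType: back.deviceType,
                                      sourceDevicePosition: .back).first {
        session.addConnection(AVCaptureConnection(inputPorts: [backPort], output: backOutput))
    }
    if let frontPort = frontInput.ports(for: .video, sourceDeviceType: front.deviceType,
                                        sourceDevicePosition: .front).first {
        session.addConnection(AVCaptureConnection(inputPorts: [frontPort], output: frontOutput))
    }

    return session
}
```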

The new system also gives developers a great deal of control over the cameras, and even allows them to capture separate streams from both of the rear cameras — something that will likely be even cooler when Apple rolls out the expected triple-lens camera system on its new iPhones later this year, which could theoretically allow developers to capture up to four distinct video streams. What they choose to do with these is another matter, of course.
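
Enumerating the distinct rear cameras is straightforward with AVCaptureDevice.DiscoverySession. Here's a sketch of what that looks like; notably, the iOS 13 SDK already defines a .builtInUltraWideCamera device type for hardware that has one:

```swift
import AVFoundation

// Enumerate the physically distinct rear cameras. On XS-class hardware
// this finds the wide-angle and telephoto lenses; a triple-lens phone
// would also surface an ultra-wide device.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera, .builtInTelephotoCamera, .builtInUltraWideCamera],
    mediaType: .video,
    position: .back
)

for camera in discovery.devices {
    print("Found rear camera: \(camera.localizedName)")
    // Each device can back its own AVCaptureDeviceInput on the
    // multi-cam session, yielding one stream per physical lens.
}
```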

The ability to simultaneously capture from the front and back cameras has obvious user-facing benefits, as already shown. Capturing multiple streams from the back cameras, meanwhile, will likely empower more advanced photography and videography apps, which can use their own algorithms to analyze and consolidate those images for creative results that go beyond what’s baked into Apple’s hardware, or simply offer more granular control over things like exposure and simulated aperture (depth of field).
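
As a sketch of what that granular control looks like in practice, AVCaptureDevice already lets an app lock a camera and set a custom shutter duration and ISO (physical aperture on iPhones is fixed, so exposure time and sensitivity are the knobs apps actually get):

```swift
import AVFoundation

// A sketch of manual exposure control on a single captured camera.
// Real code should also clamp the duration to the active format's
// min/maxExposureDuration range.
func setManualExposure(on device: AVCaptureDevice, iso: Float) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    let shutter = CMTime(value: 1, timescale: 120)  // 1/120 s
    let clampedISO = min(max(iso, device.activeFormat.minISO),
                         device.activeFormat.maxISO)
    device.setExposureModeCustom(duration: shutter,
                                 iso: clampedISO,
                                 completionHandler: nil)
}
```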

The new API will also allow apps to capture metadata from each camera simultaneously, as well as use multiple microphones for depth. That said, the new feature isn’t without its limitations: apps won’t be recording 4K 60fps video from multiple cameras simultaneously, and it won’t be possible to run multiple multi-camera sessions at once or to use multiple cameras across multiple apps at the same time. However, as Apple’s iPhone and iPad hardware grows more powerful, it’s safe to assume these limitations will loosen; as it stands, the feature is limited to the 2018 iPhone and iPad Pro lineup, likely because Apple’s A12 Bionic is the first chip that can even begin to handle it.
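
Those limits are actually visible to developers: AVCaptureMultiCamSession reports normalized "cost" values describing how close a given configuration is to what the hardware can sustain. A sketch of checking them before starting capture:

```swift
import AVFoundation

let session = AVCaptureMultiCamSession()
// ... add camera inputs and outputs as in the earlier sketch ...

// Both costs are normalized: a value above 1.0 means the current
// configuration exceeds what the device can run.
if session.hardwareCost > 1.0 || session.systemPressureCost > 1.0 {
    // Trim the configuration (lower resolutions or frame rates)
    // before trying to start the session.
} else {
    session.startRunning()
}
```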

Again, most of this will likely not mean much in terms of specific features the end user will see, but it opens up huge creative potential for developers, and it’s almost certain that we’re going to see a new wave of intelligent photography apps coming out on the heels of iOS 13’s release later this year.

