Surprise iOS 13.1 Feature May Allow Exciting Video Invisibility Effect

iPhone 11 (Credit: Moby Geek)

Apple’s first iOS 13.1 and iPadOS 13.1 betas, which were surprisingly released earlier this week, may have revealed upcoming advanced photo and video editing features.

Hidden away within the beta’s release notes is a new addition focused on video encoding. Specifically, the iOS and iPadOS betas introduce HEVC video encoding with alpha channel support.

That may not sound that exciting on its own. But the possibilities of that small addition are incredibly interesting.
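For developers, that addition would most likely surface through AVFoundation, where the iOS 13 SDK exposes an HEVC-with-alpha codec type. The sketch below is illustrative only — the function name, output URL, and dimensions are assumptions, not anything taken from Apple’s release notes — but it shows roughly how a writer might be configured to encode alpha-carrying video:

```swift
import AVFoundation

// Sketch: set up an AVAssetWriter whose video input encodes frames that
// carry an alpha channel, using the HEVC-with-alpha codec type (iOS 13+).
// The output URL and dimensions are placeholders supplied by the caller.
func makeAlphaVideoWriter(outputURL: URL, width: Int, height: Int) throws -> (AVAssetWriter, AVAssetWriterInput) {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)

    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.hevcWithAlpha, // the alpha-capable HEVC codec
        AVVideoWidthKey: width,
        AVVideoHeightKey: height
    ]

    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    input.expectsMediaDataInRealTime = true
    writer.add(input)
    return (writer, input)
}
```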

Real-Time Video Editing

An alpha channel, in the most basic terms, is a component stored alongside an image’s color channels that describes how transparent or opaque each pixel is. Using an alpha channel allows parts of an image or video to be made invisible, semi-transparent, or removed entirely.
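To make the idea concrete, the classic “over” compositing rule combines a foreground pixel with a background pixel weighted by the foreground’s alpha. The small sketch below shows that arithmetic in isolation; the RGBA type and 0-to-1 channel range are illustrative assumptions rather than anything specific to iOS.

```swift
// Sketch: Porter-Duff "over" compositing with straight (non-premultiplied) alpha.
// Channel values are in 0...1; alpha 1 is fully opaque, alpha 0 fully transparent.
struct RGBA {
    var r, g, b, a: Double
}

func composite(_ fg: RGBA, over bg: RGBA) -> RGBA {
    // Resulting coverage after layering the foreground over the background.
    let outA = fg.a + bg.a * (1 - fg.a)
    guard outA > 0 else { return RGBA(r: 0, g: 0, b: 0, a: 0) }

    // Blend each color channel, weighted by how much of each layer shows through.
    func blend(_ f: Double, _ b: Double) -> Double {
        (f * fg.a + b * bg.a * (1 - fg.a)) / outA
    }
    return RGBA(r: blend(fg.r, bg.r),
                g: blend(fg.g, bg.g),
                b: blend(fg.b, bg.b),
                a: outA)
}
```

Setting a pixel’s alpha to 0 makes it drop out of the result entirely, which is exactly what makes background removal and replacement possible.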

In the context of smartphones, adding encoding that supports alpha channels could allow future iPhones to edit video or images in a variety of ways on the fly.

That could mean that a user will be able to take the background of an image and replace it with something entirely different. You could swap out the wall of your bedroom with a tropical rainforest, for example.

Apple could obviously combine this technology with its current depth-mapping Portrait mode platform. A user wouldn’t need to specifically mark parts of an image as the background; the iPhone could seamlessly do that work for them.
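As a rough illustration of how those pieces could already fit together, the sketch below blends a Portrait-mode photo over a new background by treating its person matte as an alpha mask, using Core Image’s CIBlendWithMask filter. The function name, and the assumption that the caller has obtained the matte (for example, from the portrait effects matte captured alongside the photo on supported devices), are ours, not Apple’s.

```swift
import CoreImage

// Sketch: swap the background of a Portrait-mode photo using its person
// matte as the mask. `photo`, `newBackground`, and `matte` are CIImages
// the caller has already loaded.
func replaceBackground(of photo: CIImage, with newBackground: CIImage, matte: CIImage) -> CIImage? {
    let blend = CIFilter(name: "CIBlendWithMask")
    blend?.setValue(photo, forKey: kCIInputImageKey)                   // foreground (subject)
    blend?.setValue(newBackground, forKey: kCIInputBackgroundImageKey) // replacement backdrop
    blend?.setValue(matte, forKey: kCIInputMaskImageKey)               // alpha-style mask
    return blend?.outputImage
}
```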

Of course, all of this would simply be speculation if it weren’t for a quiet Apple acquisition made in 2018.

Last fall, Apple reportedly acquired a Danish augmented reality startup called Spektral. The firm was working on technology that uses machine learning and computer vision to allow real-time capture and editing of images — including swapping out the background of an image while leaving the subject in it.

At the time, Spektral’s acquisition was thought to be related to Apple’s AR efforts. And that’s likely to still be the case. But combined with under-the-hood additions like alpha channel support, it isn’t hard to see how Apple could implement Spektral’s technology on its iPhones and iPads using ARKit.

Apple’s Camera-Focused iPhones

Taken together with rumors about Apple’s upcoming iPhones, it’s clear that the Cupertino tech giant is continuing to invest heavily in both computational photography and augmented reality.

Upcoming iPhone 11 devices are expected to sport a triple-lens camera setup, along with a slew of computational photography features. While we haven’t heard much about Spektral-related technology, Apple could feasibly introduce features related to that acquisition in the next iPhones.

Future iPhones may even pack a time-of-flight laser camera that would be able to provide accurate 3D depth maps of an environment. Such a sensor would likely significantly enhance both augmented reality and real-time video and photo editing features.
