Apple Is Prepping a New ‘World-Facing’ 3D Camera for This Year’s iPhone

iPhone 12 Concept (Image Credit: PhoneArena)

Ever since Apple launched its TrueDepth camera system on the iPhone X back in 2017, we've been expecting the company to up its game with the rear camera as well, adding similar depth-sensing technology that would allow for even more creative photography.

For example, the infrared sensors in the TrueDepth camera have allowed Apple to enable Portrait Mode for selfies on all iPhone X and iPhone 11 models, a feature that requires at least a dual-lens rear camera to offer the same capabilities. (The iPhone XR can take Portrait Mode photos with its single rear lens, but it relies on machine learning facial recognition, so it doesn't work with pets or other objects.)

While the current infrared sensors work well enough at the short range of a front-facing selfie camera, the general consensus is that Apple would need a much more capable 3D laser camera system to offer the same capabilities over longer distances. We first heard that Apple was working on exactly this back in late 2017, shortly after the debut of the original iPhone X.

In fact, at that point many analysts expected it would arrive by the end of 2019, but while Apple boosted the iPhone 11 Pro Max to a triple-lens camera system, the rumoured 3D Time-of-Flight (ToF) sensor was nowhere to be found. Still, even before the iPhone 11 lineup debuted, the writing was on the wall that Apple was working on bringing it to the 2020 iPhone, having asked its supply chain to provide the necessary sensors as far back as last summer.

Now Fast Company has weighed in with even more evidence that the new camera system could arrive this fall, in what it refers to as a "world-facing" 3D camera. According to the report, Apple has already tapped San Jose-based Lumentum to supply the VCSEL laser components, which makes sense, since Lumentum has been providing the sensors for the TrueDepth camera for over three years now.

What Apple Could Do With It

As the report notes, Apple won't be the first smartphone maker out of the gate with this feature; Samsung added it to last year's Galaxy Note 10+, and it's now a standard feature on the recently unveiled Galaxy S20+ and Galaxy S20 Ultra. However, as Fast Company suggests, it's very likely that Apple will take the technology to the next level thanks to the more advanced machine learning capabilities offered by the Neural Engine in its A-series chips.

Apple may find novel ways to leverage the technology for new user experiences, and if history is a guide, it will likely be a bit more showy in the way it brands and markets those experiences.

After all, Apple wasn't the first smartphone maker to offer Night Mode in its camera system either, but its software-based computational photography feature has soundly beaten its rivals by delivering better overall results when it comes to balancing the exposure and colour of photos taken in extremely low-light conditions.

On the surface, a 3D camera system could certainly be used to enhance the iPhone's Portrait Mode, offering better-looking bokeh effects, but that seems like a rather pedestrian use for the technology, since the iPhone 11 is already very good at this thanks to its multi-lens camera system. It would also seriously underutilize the power of true 3D mapping.

For example, one scenario that Fast Company points out from a strictly photographic perspective would be the ability to disconnect foreground and background images when shooting videos.

Imagine a video where a skateboarder in the middle of a jump is disconnected from the background, rendered in full 3D and slow motion.

However, the real power of a 3D ToF camera system will likely be in enhancing the kind of augmented reality experiences that Apple can offer through the iPhone camera. Placing virtual objects in real-world spaces is one of the biggest applications of AR, but thus far Apple's ARKit frameworks have had to make their best guesses, using machine learning to analyze what's coming through the camera lenses. An actual laser depth sensor would allow pinpoint accuracy in determining how far away real-world objects like walls, floors, light fixtures, and furniture actually are.
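To illustrate the idea, here's a minimal sketch of how an AR app might anchor a virtual object on a real-world surface using ARKit's existing raycasting API. A depth sensor wouldn't change code like this so much as make the underlying surface estimates far more accurate; the function name and the simple box object are purely illustrative, not anything Apple has announced.

```swift
import ARKit
import RealityKit

// Illustrative helper (hypothetical name): cast a ray from a screen tap
// into the AR scene and anchor a small virtual box where it hits a surface.
func placeVirtualBox(at screenPoint: CGPoint, in arView: ARView) {
    // Ask ARKit for real-world surfaces along the ray. Today these hits are
    // estimated from camera imagery; a ToF/laser depth sensor would make
    // the distance estimates far more precise.
    let results = arView.raycast(from: screenPoint,
                                 allowing: .estimatedPlane,
                                 alignment: .horizontal)
    guard let hit = results.first else { return }

    // Anchor a 10 cm box at the hit location in world space.
    let anchor = AnchorEntity(world: hit.worldTransform)
    let box = ModelEntity(mesh: .generateBox(size: 0.1))
    anchor.addChild(box)
    arView.scene.addAnchor(anchor)
}
```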

This would not only empower practical apps like IKEA's to give you a better idea of how furniture will look in your living room, but could also enable a wide variety of fun and whimsical AR experiences, from placing holograms in your surroundings to, of course, AR gaming, where the new level of precision should make the results look considerably more realistic than they have thus far.

[The information provided in this article has NOT been confirmed by Apple and may be speculation. Provided details may not be factual. Take all rumors, tech or otherwise, with a grain of salt.]
