New iPad Pro Cameras May Lag Behind, But the LiDAR Scanner Is a Game Changer

iPad Pro LiDAR closeup. Credit: Apple

While the new iPad Pro, with its LiDAR Scanner and dual-lens camera system, is an impressive upgrade over the 2018 iPad Pro in many ways, we’ve already observed that its cameras still lag behind the iPhone’s. Now a new in-depth analysis by the developers of the Halide Camera app gives us more insight into exactly where the new iPad Pro’s cameras stand in comparison to what Apple has packed into the iPhone 11.

The team behind Halide Camera is, naturally, made up of photography experts in their own right, and in a new blog post they’ve taken a deep dive into exactly what the LiDAR Scanner and the new cameras on the iPad Pro are actually capable of.

The Main (Wide-Angle) Lens

Right off the bat, Halide’s Sebastiaan de With confirms what we already suspected from the spec sheets about the standard wide-angle camera, stating that the sensor and lens is “either identical or almost identical to the last generation iPad Pro.” In fact, de With notes that the 28mm wide-angle camera is similar to every camera Apple has ever put on an iPad, going right back to the iPad 2, and describes it as “table stakes.”

de With did find that the 2020 iPad Pro makes some minor changes in processing, such as slightly wider light sensitivity, but suggests that these are likely purely software-based, especially since the iPad Pro’s chipset is essentially identical to the prior model’s.

Most notably, however, the wide-angle lens is nowhere close to what the iPhone 11, or even the iPhone XS, packed in, despite the similar 12MP resolution. Back in 2018, Apple raised the bar by moving both the iPhone XS and iPhone XR to a slightly wider 26mm lens, which of course carried over to last year’s iPhone 11 and iPhone 11 Pro models.

If you need something to compare it to, it’s the iPhone 8 camera. Don’t expect parity with Apple’s latest iPhone 11 Pro shooters, but it’s still a great set of cameras.

Sebastiaan de With, Halide Designer

Basically, the iPad Pro still retains the same main camera that was found on the iPhone 8, although that’s certainly not a bad thing; this year’s iPhone SE uses the same camera, and beefs it up thanks to the newer A13 chip. Similarly, the iPad Pro’s A12Z and A12X chips offer photographic improvements over what the iPhone 8’s A11 was capable of.

The Ultra Wide

You could be forgiven for thinking that the addition of an ultra-wide camera lens would put the new iPad Pro on par with the dual-camera iPhone 11, but the spec sheets made it clear right from the outset that this isn’t the case. For one, the iPhone 11 (and iPhone 11 Pro) sports a 12MP shooter for the ultra-wide, while the new iPad Pro’s version comes in at only 10MP.

There’s also a 1mm difference in equivalent focal length between the iPhone 11’s and the 2020 iPad Pro’s ultra-wide lenses, although as de With points out, it’s not noticeable in daily usage. Apple’s non-Pro iPads do still use 8MP sensors, but only because their cameras haven’t been upgraded in several years; the iPad Pro’s 10MP ultra-wide actually represents the lowest resolution on a newly introduced rear camera since the 2014 iPhone 6’s 8MP shooter.
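To put that 1mm in perspective, here’s a quick back-of-the-envelope calculation in Swift; the 13mm (iPhone 11) and 14mm (iPad Pro) 35mm-equivalent focal lengths are our assumption based on the 1mm gap noted above, used purely for illustration:

```swift
import Foundation

// Diagonal field of view from a 35mm-equivalent focal length;
// the full-frame diagonal is roughly 43.3mm.
func diagonalFOV(equivalentFocalLength f: Double) -> Double {
    let fullFrameDiagonal = 43.3 // mm
    return 2 * atan(fullFrameDiagonal / (2 * f)) * 180 / .pi
}

print(diagonalFOV(equivalentFocalLength: 13)) // ≈ 118° (iPhone 11 ultra-wide, assumed)
print(diagonalFOV(equivalentFocalLength: 14)) // ≈ 114° (iPad Pro ultra-wide, assumed)
```

A difference of roughly four degrees across a field of view this wide is hard to spot in practice, which lines up with de With’s observation.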

In short, the iPhone 11 and 11 Pro pack a significantly larger (and better) sensor with its wide-angle camera, compared to iPad. The ultra-wide sensor on iPhone is comparable to the ultra-wide on iPad in quality, but the iPad is lower resolution.

Sebastiaan de With, Halide Designer

Most significantly, however, de With notes that the hardware “just isn’t there to support night mode, Deep Fusion, and even portrait mode,” even beyond the fact that the new iPad Pro doesn’t include the newer A13 chip. He adds that the only way the iPhone XR was able to pull off single-camera Portrait Mode was through its larger sensor and by “tapping into focus pixels.” (This may in turn suggest that, despite identical spec sheets, the new iPhone SE’s camera also has a larger sensor than the iPhone 8’s.)
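For context on what “supporting” Portrait Mode means at the API level, here’s a rough sketch, and emphatically not Halide’s actual code, of how a third-party camera app requests the depth data behind Portrait-style effects via AVFoundation. The key detail is the final line: on hardware that can’t produce usable depth, delivery is simply reported as unsupported:

```swift
import AVFoundation

let session = AVCaptureSession()
let photoOutput = AVCapturePhotoOutput()

func configureDepthCapture() {
    session.beginConfiguration()
    defer { session.commitConfiguration() }
    session.sessionPreset = .photo

    // Ask for the dual wide camera on the back, the kind of pairing
    // Portrait Mode normally relies on.
    guard let camera = AVCaptureDevice.default(.builtInDualWideCamera,
                                               for: .video,
                                               position: .back),
          let input = try? AVCaptureDeviceInput(device: camera),
          session.canAddInput(input),
          session.canAddOutput(photoOutput)
    else { return }

    session.addInput(input)
    session.addOutput(photoOutput)

    // This flag is the gatekeeper: it only reads true when the active
    // camera hardware can deliver depth alongside the photo.
    photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported
}
```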

The LiDAR Scanner

Even though the cameras may not be anything to write home about compared to Apple’s recent iPhones, there’s one huge addition to the 2020 iPad Pro that more than makes up for it: the new LiDAR Scanner.

We’ve already seen some impressive examples of what the new scanner will be able to do for Augmented Reality, but naturally de With is more concerned with answering the question of whether it can be used to power Portrait Mode and other photographic features.

Although Apple hasn’t included Portrait Mode on the iPad Pro, many have speculated that this is simply a software limitation, and if anybody could find a way to implement the feature in third-party software, it would be Halide, which famously expanded Portrait Mode on the iPhone XR to handle non-human subjects.

Unfortunately, de With notes that it’s unlikely they’ll be able to work the same magic with the 2020 iPad Pro, revealing that Apple probably left out Portrait Mode for bigger reasons than simply lacking the engineering resources to finish it in time.

The problem, de With notes, is that the depth data captured by the LiDAR Scanner simply isn’t high-resolution enough. Referring to iFixit’s teardown as an illustration, the Halide designer explains that the dot pattern projected by the LiDAR Scanner, which is designed to measure room-sized objects, doesn’t have the resolution to accurately measure a face the way the TrueDepth camera does on the front of the iPhone and iPad.

That said, de With stops short of saying it’s impossible, noting that “machine learning is amazing” and pointing to the fact that the single-lens iPhone XR was able to produce Portrait Mode photos from very rough depth data. However, de With suggests that if putting Portrait Mode on the iPad Pro were actually a priority, Apple would more likely just use the dual cameras and ignore the LiDAR Scanner entirely. And it’s probably not a priority.

What Else Can LiDAR Do?

Halide’s developers are still exploring other ways to take advantage of the LiDAR Scanner for photography, but de With notes that right now there are no APIs that let them directly access the underlying depth data. The only access to the LiDAR Scanner at this point is through the ARKit 3.5 framework, which is of course designed to deliver processed 3D models for augmented reality, not to expose what the scanner captures directly.
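To illustrate the limitation, here’s a minimal sketch of that sole access path: ARKit 3.5’s scene reconstruction. The API names are real; the view controller around them is ours. Note that what comes back is already-triangulated mesh geometry, not the raw depth samples a camera app would want:

```swift
import UIKit
import ARKit

class ScanViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self

        let configuration = ARWorldTrackingConfiguration()
        // Scene reconstruction only exists on LiDAR-equipped devices.
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            configuration.sceneReconstruction = .mesh
        }
        session.run(configuration)
    }

    // ARKit hands back ARMeshAnchors: processed 3D geometry,
    // not per-pixel depth.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let mesh as ARMeshAnchor in anchors {
            print("Mesh anchor with \(mesh.geometry.vertices.count) vertices")
        }
    }
}
```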

However, by rethinking photographic capture, the Halide team built a proof-of-concept app they’re calling Esper, which experiments with real-time 3D capture by combining data from the camera and the LiDAR sensor on a larger scale. It’s a solution designed to capture a space like a room, rather than to take photos of individual people or pets. The smallest object they were able to make the LiDAR sensor work with in this context was a chair.
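Halide hasn’t published Esper’s internals, so the following is purely illustrative, but the building blocks are visible in the same ARKit session: every ARFrame carries both the camera image and the mesh anchors reconstructed so far, which is exactly the pairing a room-scale capture needs (the RoomCaptureDelegate class is hypothetical):

```swift
import ARKit

class RoomCaptureDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Each frame supplies the camera image and the LiDAR-derived
        // mesh together, captured at the same moment.
        let cameraImage: CVPixelBuffer = frame.capturedImage
        let meshAnchors = frame.anchors.compactMap { $0 as? ARMeshAnchor }

        // A real capture pipeline would use frame.camera's intrinsics
        // and transform to project mesh vertices into the image and
        // build a textured model; here we only note what's in hand.
        _ = (cameraImage, meshAnchors)
    }
}
```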

The end result is essentially an extension of augmented reality, but in reverse. Instead of placing an object within a 3D space, Esper captures a visual model of a 3D space that real-world objects can be placed in. For example, you can capture a room, complete with depth data, and then allow real objects and people to move around within it.

As de With notes, it’s not “traditional” photography, but it’s a good example of how the LiDAR sensor will open the door to even more powerful and creative applications and allow us to reconsider how we perceive and capture the world around us.

Halide’s full blog post is worth a read if you want to see more examples and dig into some deeper technical details, and the team also goes on to answer readers’ questions about how it all fits together.
