Of all the new features Apple announced yesterday for the new iPad Pro, the new LiDAR scanner is probably the most interesting. We’ve been hearing about the possibility of technology like this coming to the iPad since at least last summer, and there have been rumours for even longer that it would come to the iPhone this year.
However, there’s still understandably a bit of confusion about what the LiDAR camera will do on the iPad Pro, or even how the technology works. While we can imagine a whole bunch of different applications going forward, it seems that right now Apple is keeping the focus pretty narrow.
What Is LiDAR?
LiDAR stands for Light Detection and Ranging, a technology that has actually been around since the 1960s. Early LiDAR systems used high-powered lasers to calculate distances by bouncing a beam off surrounding objects and measuring how long the reflected signal took to return, and many high-end systems used by surveyors and topographers still work this way. More modern consumer implementations use lower-powered infrared lasers and flash pulse technology, and have evolved from simply measuring point-to-point distances to generating 3D models of the objects around them through the use of multiple beams.
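The underlying ranging principle is simple enough to sketch in a few lines: light travels at a known speed, so the distance to an object is half of the round-trip time multiplied by that speed. The following is a minimal illustration of that calculation; the pulse timing value is made up for the example, not taken from any real sensor.

```python
# Illustrative sketch of the time-of-flight principle behind LiDAR:
# one-way distance = (speed of light x round-trip time) / 2.

SPEED_OF_LIGHT = 299_792_458  # metres per second, in a vacuum

def distance_from_round_trip(seconds: float) -> float:
    """Return the one-way distance for a pulse's round-trip time."""
    return SPEED_OF_LIGHT * seconds / 2

# A pulse that returns after ~33 nanoseconds came from roughly 5 metres away.
print(round(distance_from_round_trip(33e-9), 2))  # prints 4.95
```

The nanosecond scale of these timings is why real scanners need very fast, specialised photon-counting hardware rather than an ordinary camera sensor.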
One of the most common consumer applications for LiDAR you’ve probably already heard about is in self-driving car systems, as well as the steering and safety features already available in modern cars.
Technically speaking, a LiDAR scanner isn’t the same thing as the sort of time-of-flight (ToF) 3D camera that we’ve been hearing rumours about; in fact, it’s considerably better. The ToF camera sensors being developed by Sony are less expensive, but they provide lower accuracy than LiDAR since they use a single, wider beam.
It’s Not For Photography (Yet)
When we first heard reports that future iPhones would gain ToF or LiDAR sensors, one of the most obvious possibilities that sprang to mind was improvements to photographic effects like Portrait Mode and Portrait Lighting.
While it’s true that LiDAR could bring an unprecedented level of accuracy to these features, which currently rely on a combination of multiple lenses and machine learning analysis to create simulated depth-of-field and lighting effects, Apple seems to have no interest in bringing even basic Portrait Mode photography to the iPad. That’s something it could easily have done even with the single-lens 2018 iPad Pro, since the iPhone XR had introduced the feature only a month earlier.
Apple’s announcement of the 2020 iPad Pro makes no mention of Portrait Mode anywhere, so we’re left to assume that even with the dual-lens system and LiDAR scanner, it won’t offer these kinds of photographic effects. In fact, the only thing Apple has said about the new LiDAR scanner in regard to photography is that it will “open up more pro workflows and support pro photo and video apps,” though it’s not entirely clear what it means by this.
New Augmented Reality Experiences
What Apple is focusing on instead with the new LiDAR scanner is creating “new breakthrough AR experiences,” and it has already demonstrated this by showing how the Apple Arcade hit Hot Lava will gain a new AR mode that will “transform your living room into a lava-filled obstacle course.”
In an online briefing to journalists, reported by The Verge, Apple gave a demonstration of Hot Lava in action, showing how it could place pools of lava and other objects around a living room with pinpoint accuracy.
However, it’s obviously not just for games, although that’s certainly a cool way to show off what the new LiDAR scanner can do. Apple also demonstrated Shapr3D, a CAD app that can build accurate 3D models of rooms which can then be edited to drop in objects and plan renovations and additions, so you could figure out where to place a closet and see what it would look like.
The original ARKit inspired a number of apps for laying out floor plans and measuring rooms, but again the accuracy offered by LiDAR should turn these from fun pastimes into genuine professional tools.
Apple also showed how the LiDAR scanner can be used in medicine with Complete Anatomy, an app that uses the scanner to accurately measure the range of motion of a patient’s arm in real time, which could be a huge benefit to doctors and physiotherapists alike.
All of these demonstrations were from real-world developers of apps that are already on the App Store, although updates with the LiDAR-enabled AR features aren’t expected to arrive until “later this year.”
The IKEA Place app, which was one of the early poster children of Apple’s ARKit, will also be upgraded to leverage the new LiDAR scanner to not only place individual pieces of furniture in a room more accurately, but actually let users start furnishing entire rooms intelligently via a new Studio Mode that will automatically drop whole virtual Room Sets into different areas of their home.
Apple is also using the LiDAR scanner to enhance the Measure app that has been included on the iPhone and iPad since iOS 12. While the app is surprisingly good at measuring objects and distances considering that it relies only on the camera system, bringing a LiDAR scanner into the equation should result in measurements that are both faster and more accurate. For example, while the Measure app previously worked best with straight lines and regularly shaped objects like boxes, Apple notes that it will now be much easier to measure a person’s height, and that a new Ruler View will provide more granular measurements.