Over the past few iPhone releases it’s become clear that Apple puts a great deal of effort each year into improving the photography and imaging capabilities of its smartphones.
Although Apple has never competed on raw spec sheets (it was far from the first smartphone maker to adopt a triple-lens camera system, and its megapixel counts still pale in comparison to those of many rivals), it puts a lot of emphasis on computational photography and other features that leverage the power of its A-series chips and their Neural Engines. There's no doubt that this approach allows iPhone users to capture some pretty great photos.
However, as important as it is to take great pictures with your iPhone, there’s actually a lot more to Apple’s focus on imaging technology. As we saw with the addition of the LiDAR Scanner to this year’s iPad Pro models, Apple’s also been pushing the technology to facilitate a whole bunch of new augmented reality experiences, which are even more important as it prepares to release its AR glasses and headsets.
Now a new patent reveals what could be Apple’s next frontier in the future of its work on imaging technologies.
As shared by AppleInsider, Apple has applied for a patent covering infrared and thermal imaging. While the title, "Method and System for Determining at least one Property Related to at least Part of a Real Environment," is certainly a mouthful, the 22,000-word application goes into extensive detail on how computer vision algorithms can be aided by thermal imaging to make it much easier to detect and track moving objects in busy and poorly lit environments.
While this isn't necessarily a problem that photographers need solving, it's a crucial factor in creating effective real-world augmented reality technologies, especially those that are going to be worn in front of your eyeballs. As you can probably imagine, a system that tries to identify important objects as you walk around a busy urban neighbourhood would quickly get disorienting and frustrating if it couldn't accurately track their movements.
As the patent application notes, there have already been some attempts made to solve this problem, but they’re fairly complex and prone to failure. Most of the algorithms developed thus far tend to assume a relatively static environment where only the camera itself is moving — movement which can easily be tracked through positioning technologies like GPS, accelerometers, and gyroscopes.
While this generally works fine in scenarios like navigation systems, where the person or vehicle is the primary moving object and anything else is largely irrelevant, these algorithms quickly fall down when it comes to tracking an unrelated object moving across the camera's field of view.
A few more advanced solutions have been developed to address this, but they suffer from what Apple calls "increased complexity and computational cost," since they involve very complex techniques that try to analyze the optical data and motion being captured by the cameras.
Instead, Apple proposes that a thermal imaging or infrared camera could be used to augment the visual imaging by allowing the system to more effectively identify objects based on their thermal properties. For example, a vehicle would offer a considerably hotter profile than a pedestrian, and an animal would offer a different thermal profile from something like a basketball.
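The basic idea can be sketched as a candidate filter: a thermal reading for a detected region narrows down what the object could be before the visual pipeline makes its final call. The code below is purely illustrative; the category names and temperature bands are invented for this sketch and aren't taken from Apple's filing.

```python
# Hypothetical thermal signatures: (min_temp, max_temp) in degrees Celsius.
# These bands are illustrative assumptions, not values from the patent.
THERMAL_PROFILES = {
    "vehicle_engine": (60.0, 120.0),   # running engines and exhausts run hot
    "person": (30.0, 38.0),            # human skin temperature band
    "animal": (35.0, 41.0),            # often slightly warmer than people
    "inanimate": (-20.0, 29.0),        # objects near ambient temperature
}

def classify_by_heat(avg_temp_c):
    """Return every category whose thermal band contains the reading.

    Overlapping matches (e.g. person vs. animal) would be resolved by
    the visual detector; the thermal channel just narrows the candidates.
    """
    return [
        label
        for label, (lo, hi) in THERMAL_PROFILES.items()
        if lo <= avg_temp_c <= hi
    ]
```

So a region averaging 85 °C would be flagged as a likely vehicle, while a 36.5 °C reading would leave the person-versus-animal call to the camera, which is far cheaper than teasing apart every moving object from optical flow alone.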
Although the patent application itself focuses on augmented reality scenarios, it's not hard to see how a thermal camera might also help ordinary photography in low-light situations.
As with all Apple patents, the usual caveats apply here, of course. Many of Apple's patents never turn into actual products, and this one in particular is still at the application stage. Still, it wouldn't be surprising to see the LiDAR Scanner that's coming to this year's iPhone 12 Pro models joined someday by a thermal imaging scanner.