A new report reveals that Apple plans to break new ground with its much-rumoured mixed reality headset by eliminating the need for traditional hardware controllers in favour of sensor-based gesture recognition.
According to reputable analyst Ming-Chi Kuo, Apple is planning to pack in four sets of 3D sensors that use technology similar to the iPhone’s TrueDepth camera.
In much the same way that your iPhone can animate your facial expressions onto an Animoji avatar, these sensors will allow the AR/VR headset to track hand movements with extreme precision.
It’s not clear whether this would completely obviate the need for a physical controller to go with the headset, especially for gaming applications, but it could certainly eliminate the need for such controllers for everyday applications. We’ve heard sporadic rumours that Apple was considering a Vive-like controller, but that could be merely a fallback plan.
Certainly, the use of 3D sensors is a much better way to go, and Apple has already proven this technology with iPhone features like Face ID and through-device AR using the LiDAR scanners on recent iPhone and iPad models. So, it’s not a big stretch to see this being used for gesture recognition.
In fact, the more challenging aspect of implementing this kind of gesture recognition isn’t merely the 3D sensors to determine hand positions and movements, but rather interpreting those movements into something meaningful. This is where Apple’s Neural Engine silicon and machine learning come into the picture.
The goal, according to Kuo, is to make the user experience “more intuitive and vivid,” which makes a lot of sense. Kuo gives the example of a virtual balloon in your hand flying away as you unclench your fingers as the sort of gesture that’s much harder to replicate with a physical hardware controller.
“Capturing the details of hand movement can provide a more intuitive and vivid human-machine UI.” – Ming-Chi Kuo
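Kuo's balloon example hints at the interpretation layer that sits on top of the raw sensor data: the headset has to turn tracked landmark positions into a discrete gesture before anything on screen can react. A production system would presumably run a trained model on Apple's Neural Engine, but the basic idea can be sketched in a few lines of Python. Everything below (the landmark layout, the threshold, the function names) is an illustrative assumption, not Apple's actual API:

```python
import math

# Hypothetical 3D hand landmarks in metres, as a depth-sensor pipeline
# might report them. The palm centre and five fingertip positions are
# (x, y, z) tuples; all names and values here are illustrative only.

def hand_openness(palm, fingertips):
    """Mean fingertip-to-palm distance: grows as the hand opens."""
    return sum(math.dist(palm, tip) for tip in fingertips) / len(fingertips)

def classify_gesture(palm, fingertips, open_threshold=0.07):
    """Label the hand 'open' (balloon released) or 'clenched' (balloon held)."""
    return "open" if hand_openness(palm, fingertips) > open_threshold else "clenched"

palm = (0.0, 0.0, 0.0)
clenched = [(0.02, 0.01, 0.0)] * 5    # fingertips curled in near the palm
open_hand = [(0.09, 0.02, 0.01)] * 5  # fingertips extended away from the palm

print(classify_gesture(palm, clenched))   # clenched
print(classify_gesture(palm, open_hand))  # open
```

A real pipeline would classify dozens of joint positions over time rather than a single distance threshold, which is exactly why Kuo points to machine learning as the hard part.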
To be clear, Apple’s headset wouldn’t be the first to tackle hand tracking, but in true Apple fashion, it would take it an order of magnitude beyond what anybody else is doing. For instance, Meta (née Facebook) has Quest headsets that can handle hand tracking, but it’s a secondary feature that’s not especially precise. For one, it uses conventional monochrome cameras and rudimentary AI, which is about as different from Apple’s 3D structured light sensors as primitive 2011-era face unlocking solutions in Android 4.0 were from Apple’s Face ID.
Packed with Sensors
As impressive as this sounds, these structured light sensors are just one relatively small part of a much more ambitious project that Kuo predicts will include sensors for just about everything you can think of to create a truly immersive user experience.
For instance, the 3D sensors will also be used for object detection, presumably allowing for everything from placing augmented reality objects on surfaces, like the iPhone and iPad can do now, to actually identifying physical objects within your surroundings — not unlike the Visual Look Up feature in iOS 15. It’s actually pretty easy to see how Apple’s iPhone and iPad technologies have already been building toward this for years.
Kuo also adds that the headset is expected to feature eye tracking, iris recognition, voice recognition and control, and even skin detection, expression detection, and spatial detection. It’s hard to even imagine what all of these could be used for, although some reports have suggested that iris detection could serve as a form of user authentication.
According to Kuo, the key to the success of the mixed reality headset will be creating a tight human-machine user interface. Apple’s ultimate goal is to create a wearable that works so seamlessly and intuitively that it feels like an extension of the user’s own body.
Lightweight and Powerful
To this end, Apple is also working on making the headset as thin and light as possible, since the bulkier it is, the less natural it’s going to feel. While Kuo had previously reported that Apple wants to get the weight under 150 grams, it looks like it may not be able to pull this off for the first-generation model, which is more likely to come in at around 300–400 grams.
Even that will still be lighter than most of the alternatives, however. For example, the Quest 2 weighs in at 503 grams, and the original Quest was even heavier at 571 grams, or 1.25 pounds. HTC’s Vive and Microsoft’s HoloLens are in the same weight class.
Of course, Apple doesn’t plan to settle for a 300-gram headset, and Kuo says it’s already working on a “significantly lighter” second-generation model that’s slated to arrive sometime in 2024. That model would also include a new battery system and an even faster processor, naturally.
When Is Apple’s Headset Supposed to Be Released?
The first-generation headset is expected to be announced late next year, although it may not go on sale until early 2023. It’s expected to pack in multiple processors comparable to Apple’s M1 silicon, along with high-resolution 4K Micro OLED displays. The asking price will be well upwards of $1,000, but Kuo still estimates Apple could sell up to 3.5 million units, which would be much more impressive than the 180,000 units that Bloomberg’s Mark Gurman predicted earlier this year.
More significantly, however, Apple isn’t planning on releasing this as just another gadget. While the first-generation headset will likely be more of a proof of concept, in the same way the first iPhone and Apple Watch were, Apple’s end game is to change the face of mobile computing; Kuo predicts that by 2030 we may all be wearing Apple Glasses instead of toting around iPhones.
[The information provided in this article has NOT been confirmed by Apple and may be speculation. Provided details may not be factual. Take all rumors, tech or otherwise, with a grain of salt.]