Waving at the Future: Will Apple’s Smart Glasses Borrow the Vision Pro’s Best Trick?
AI-generated concept of Apple’s rumored smart glasses and hand-gesture controls [iDrop News / AI]
While Apple is believed to have shelved work on a lighter “Vision Air” to focus on smart glasses, a new rumor suggests the new glasses could borrow at least one feature from Apple’s full-sized headset.
Following an unusual report that Apple was shutting down the Vision Pro, the folks at MacRumors say they’ve also learned from an inside source that Apple’s AI glasses will feature hand-gesture-based input similar to how users interact with the Vision Pro.
It’s unclear if this “inside source” is the same one that told MacRumors the Vision Pro team has been “redistributed to other teams within Apple,” but it may still be worth taking this one with a grain of salt. As I said yesterday, it’s highly unlikely that Apple is abandoning its spatial computing platform, and while I don’t doubt the veracity of what MacRumors has been told, there seems to be a lot of important context missing here — including where those members of the Vision Pro team are going.
After all, it’s not exactly news that Apple has shelved its more ambitious “Apple Glass” project. The long-rumored set of augmented reality glasses was always a bit of a moonshot with today’s technology, but there’s every reason to believe Apple hasn’t given up; it has merely paused the project until the state of the art catches up with its vision.
The same is undoubtedly true of the Vision Pro. It’s an incredible piece of technology, but it’s also bulky and expensive. Apple has reportedly struggled to overcome both of those obstacles — and I have no doubt it will eventually do so — but there’s a point at which it’s time to cut bait.
Smart Glasses: A Much Easier Win — But Not a Vision Pro

Apple’s vision for spatial computing isn’t dead. It’s just evolving more slowly than some expected. The Vision Pro was chapter one, but for now Apple may have gone as far down that road as it wants to.
That’s especially true as its competitors pivot toward more affordable smart glasses. Meta’s Ray-Bans are all the rage right now, and Apple is reportedly eager to come up with its own answer to those — and it’s uniquely positioned to do so in a far less creepy way than Meta has managed so far.
If the Vision Pro team is moving to the smart glasses side — and it seems likely that at least some of its members are — then it stands to reason they’ll bring that expertise along to influence some of the features of the smart glasses.
It’s not hard to believe the smart glasses team is at least considering gesture controls, but it’s important to remember that a set of smart glasses will be a fundamentally more limited piece of hardware compared to the Vision Pro.
While Apple’s smart glasses are widely believed to rely on a paired iPhone to do the heavy lifting — no one expects Apple to fit an A-series or M-series chip into them — there’s still the challenge of round-tripping these requests over Bluetooth. It’s unclear whether Apple intends to use a Wi-Fi connection, as the higher power requirements might be too much for the necessarily smaller battery in a pair of smart glasses. The company reportedly toyed with 802.11ay a few years ago, but there’s no indication that ever went anywhere — and while Wi-Fi 7 on the latest iPhone 17 Pro offers better latency, it still hasn’t reached the power-sipping “WiGig” dreams of old.
While Apple could overcome some of this with clever on-device processing — it might be possible for a variation on the Vision Pro R1 chip to fit into the smart glasses — the cameras would still present another challenge. At best, Apple’s smart glasses are expected to include two cameras: a high-resolution one to capture photos and videos and a lower-resolution wide-angle camera for reading hand gestures and providing additional visual input for Siri. MacRumors suggests the same low-res cameras will also be on a future set of AirPods Pro to bring hand gesture support to them as well.
This sounds feasible at first glance, until you realize that the Vision Pro has eight external cameras (not counting the four inside that track your eyes). Two of those are used to stream the external world to your eyes, while six are dedicated to tracking hand movements and mapping the surrounding environment. This gives visionOS multiple angles from which to recognize and interpret hand gestures.
That’s not to say a single camera can’t get the job done, but it would likely be far less accurate and reliable. Apple may be pursuing some limited hand gestures, similar to what it’s done with the Apple Watch. It’s unlikely these will be as extensive as those on the Vision Pro, but it’s also fair to say they don’t need to be.
Unlike the mythic Apple Glass, these smart glasses won’t include any display technology, so you won’t be using your hands to select menus and move around virtual objects in your field of view. Simple gestures should suffice for tasks such as capturing photos, answering calls, or summoning Siri, and Apple could combine these with head gestures, similar to what it’s done on the AirPods Pro.
Still, there’s plenty of reason to be skeptical here. Bloomberg’s Mark Gurman echoes the notion that this can’t be done reliably with a single camera today, while adding that he’s also heard nothing about it from his sources.
The technology to do this reliably with a single camera, no neural band and no eye-scanning doesn’t exist today as far as I know. I’ve also heard nothing to suggest the first version has any sophisticated form of gestures as this describes. I am extremely skeptical. https://t.co/27RGM2YhgA— Mark Gurman (@markgurman) April 30, 2026
Of course, “sources” are funny things when it comes to the rumor mill; they can be plugged in at different levels within Apple, a notoriously secretive and siloed company that isn’t above running “sting operations” to identify leakers. Gurman’s collection of sources has proven more reliable than not over the years, although he’s still gotten a few things wrong. The insider speaking with MacRumors appears to be someone with new information, but as with yesterday’s Vision Pro report, it paints an incomplete picture that leaves too much room for interpretation. While we have little doubt that Apple is working on gestures — evidence of that goes back nearly two decades — the limitations of technology and physics mean that plenty of things Apple wants to do get shelved or scaled down until it can figure out how to do them right.
[The information provided in this article has NOT been confirmed by Apple and may be speculation. Provided details may not be factual. Take all rumors, tech or otherwise, with a grain of salt.]

