PSA | AirTag ‘Lost Mode’ Could Be Used to Send Good Samaritans to a Malicious Website

Apple AirTag in Keychain Case | Credit: Dreii / Shutterstock

A security flaw found in the “Lost Mode” feature of Apple’s AirTags could potentially lead to trouble for those who might genuinely want to reunite an AirTag with its owner, according to one security researcher.

As reported by KrebsOnSecurity, cybersecurity consultant Bobby Rauch found a vulnerability in how Apple’s Lost Mode works that leaves the door open for malicious actors to “weaponize” AirTags by using them as “effective physical trojan horses.”

The problem lies not so much with the AirTag itself as with the software surrounding it, which appears to have a serious design flaw.

As you may already be aware, scanning an AirTag with any NFC-capable device — even an old Windows phone — will prompt your browser to open a web page at found.apple.com to provide more details about the AirTag in question.

Normally, this only shows the serial number of the AirTag, which can just as easily be found inside the battery compartment. However, when someone places their AirTag in “Lost Mode,” it may also show contact information such as a phone number, or an email address if they’re running iOS 15.

As Rauch discovered, however, Apple isn’t actually doing anything to sanitize or clean up these phone numbers or email addresses — it just takes whatever the user enters when enabling Lost Mode and displays it on the found.apple.com webpage for that specific AirTag, exactly as-is.

As a result, a malicious AirTag user could inject arbitrary computer code into the phone number field — code that could redirect the finder’s browser away from the proper found.apple.com page to pretty much anywhere else.
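To make the mechanics concrete, here’s a minimal, hypothetical sketch, not Rauch’s actual payload and not Apple’s real page code, of how a “phone number” that gets interpolated into a web page without escaping can carry a redirect, and how simple escaping would neutralize it:

```python
import html

# Hypothetical value an attacker might save in the Lost Mode phone number field.
# If the page inserts it verbatim, the browser runs the script and redirects.
malicious_phone = "<script>window.location='https://phishing.example/icloud'</script>"

def render_contact_unsafe(value: str) -> str:
    # Simulates a page that trusts the stored value as-is (the flaw).
    return f"<p>Contact the owner: {value}</p>"

def render_contact_escaped(value: str) -> str:
    # Escaping turns <, >, and quotes into inert entities, so the payload
    # is displayed as plain text instead of being executed.
    return f"<p>Contact the owner: {html.escape(value)}</p>"

print(render_contact_unsafe(malicious_phone))   # script tag survives: stored XSS
print(render_contact_escaped(malicious_phone))  # rendered harmlessly as text
```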

I can’t remember another instance where these sort of small consumer-grade tracking devices at a low cost like this could be weaponized.

Bobby Rauch, Cybersecurity Consultant

This could be a fake Apple iCloud login page as a basic phishing attempt, or even a page that attempts to deliver malicious software to the target device.

While iOS devices would hopefully be less vulnerable to such attacks, it’s worth remembering that any NFC-capable device can scan an AirTag, and many Android devices aren’t nearly as protected, especially considering how many could be running much older versions of Android.

Should You Be Worried?

It’s the role of security consultants to sound alarm bells about potential problems, and as a result, these issues can sometimes seem more serious than they really are.

While the scenario painted by Rauch is absolutely plausible, there’s room for debate as to how practical it is. Certainly, AirTags are low-cost devices, but they still cost enough that it seems unlikely many hackers will scatter dozens of them around at random in hopes of catching a few unsuspecting Good Samaritans.

More importantly, however, this problem should be fairly simple for Apple to fix, since it exists entirely on the back-end. This is not a vulnerability in the AirTag’s firmware, but rather a flaw in the found.apple.com webpage, which is entirely under Apple’s control.

Unlike hacking an actual AirTag, which is a considerably more complex process, the attack outlined by Rauch doesn’t change the initial destination that the AirTag will send someone to. They’ll still end up on found.apple.com before anything else happens; it’s just that malicious data in the phone number or email address field could in turn take the user somewhere else.

Based on that, the fix should be as simple as Apple making sure that the found.apple.com page sanitizes the data sent to it, stripping out any arbitrary code and ensuring that what gets displayed is, in fact, nothing more than a phone number or email address.
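As a rough sketch of what that kind of server-side check could look like, assuming, hypothetically, that the contact field is validated against a strict allow-list before it’s ever stored or rendered (the patterns and function name below are illustrative, not Apple’s actual code):

```python
import re

# Illustrative allow-list patterns: an optional "+" followed by digits and a few
# phone symbols, or a basic email shape. Anything else is rejected outright.
PHONE_RE = re.compile(r"^\+?[0-9][0-9 ().-]{4,19}$")
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def sanitize_lost_mode_contact(value: str) -> str:
    """Return the contact string only if it looks like a phone number or an
    email address; otherwise raise, so arbitrary markup never reaches the page."""
    value = value.strip()
    if PHONE_RE.match(value) or EMAIL_RE.match(value):
        return value
    raise ValueError("Contact field must be a plain phone number or email address")

print(sanitize_lost_mode_contact("+1 (555) 010-2000"))    # accepted

try:
    sanitize_lost_mode_contact("<script>alert(1)</script>")
except ValueError as err:
    print(err)                                            # rejected before it can be displayed
```

Whether Apple does this by rejecting bad input outright or by escaping it before display is a detail; either approach closes the hole described here.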

In fact, it’s rather astonishing that Apple didn’t think of this in the first place, as proper data sanitization is kind of “Programming 101.” Presumably, however, Apple was content to handle this on the iOS side, where the Find My app only accepts properly formatted email addresses or phone numbers.

Unfortunately, as Rauch discovered, it’s possible to intercept the Lost Mode request on its way to Apple’s servers and inject a malicious HTML XSS payload into the phone number field.

An attacker can carry out Stored XSS on this https://found.apple.com page, by injecting a malicious payload into the Airtag “Lost Mode” phone number field. A victim will believe they are being asked to sign into iCloud so they can get in contact with the owner of the Airtag, when in fact, the attacker has redirected them to a credential hijacking page.

Bobby Rauch, Cybersecurity Consultant

It’s important to keep in mind that this is still more complicated than simply pasting some code into the phone number field. Specialized tools are required, which puts it in the realm of serious hackers. It’s not something people are likely going to be doing to prank their friends.

Again, though, it’s questionable whether it’s going to be worth the cost and effort for hackers to attempt to do such a thing with AirTags. Not only is there a tangible cost to purchasing AirTags, but they’d also have to associate them with an Apple ID to trigger Lost Mode in the first place. This would require a throwaway iCloud account on a “burner” iPhone, since anything else could easily be traced back to them.

Then they’d have to weigh the likelihood of somebody actually finding the AirTag, knowing that they can scan it with their smartphone, and falling prey to whatever attack they have in mind.

For example, in the case of a phishing attack, the user would have to not only believe that they’re required to supply their login credentials to help reunite a lost AirTag with its owner, but also be willing to actually do this. Many may just give up, thinking it’s too much hassle.

An attacker can create weaponized Airtags, and leave them around, victimizing innocent people who are simply trying to help a person find their lost Airtag.

Bobby Rauch, Cybersecurity Consultant

Similarly, an attack that attempts to exploit a device by injecting malicious code would have to be taking advantage of a known vulnerability for a specific platform.

Although KrebsOnSecurity cites the example of malware distributed via USB sticks, these are naturally more targeted attacks, often with higher payoffs. Everyone knows what a USB flash drive is for, and an average person can be more easily enticed to plug it into a computer out of curiosity, especially if it’s found in a company parking lot with a label like “Employee Salaries” on it. It’s also possible to do a lot more over a direct USB connection into a Windows PC than it is via a website on an iPhone or Android smartphone.

By contrast, most people wouldn’t know what to do with an AirTag, and many likely still don’t even know what an AirTag is in the first place.

There’s also a higher chance that anybody who does know enough to scan an AirTag with their smartphone won’t be as easily duped by phishing attempts.

Why Hasn’t It Been Fixed?

The real mystery here, however, is why it’s taken Apple so long to address this issue. According to Rauch, he notified Apple about this vulnerability on June 20, but for three months he was told only that it was still being investigated.

Rauch also adds that Apple never even acknowledged basic questions about the bug, such as whether they had a timeline for fixing it, or whether he would get a “bug bounty” for the discovery, or at least be credited by name in the accompanying security advisory.

He also told Apple that he planned to go public with his findings within 90 days (a fairly standard window for security researchers to give companies time to address issues), but he claims the response he got to that was “We’d appreciate it if you didn’t leak this.”

Rauch acknowledged to KrebsOnSecurity that this is likely a pretty low-priority issue for Apple, but he’s also at a loss for why it hasn’t been fixed already, since it should be fairly simple for Apple to just restrict or clean up the data that gets submitted as phone numbers or email addresses when enabling Lost Mode.

For now, you may want to be a bit more wary about scanning AirTags you find randomly lying around, especially if you find one that’s not attached to anything of value. While it seems unlikely that hackers would spend the money needed to scatter dozens of AirTags around at all, it’s even more of a stretch to believe that anybody would invest the resources to pair an AirTag with an expensive item just to capture someone’s iCloud password.

