iOS 13 and macOS Catalina Now Feature Built-in Dog and Cat Detectors

Image: Pet detector in iOS 13 (Credit: Pooch Selfie)

Apple has included new machine learning algorithms in iOS 13 that will allow your iOS and iPadOS apps to detect your four-legged buddies.

Of course, Apple’s native Photos app has long been able to use computer vision to detect dogs and cats in images. You can try that out for yourself — just type “dog” or “cat” into the search bar in the Photos app.

The change in iOS 13 is the ability for third-party developers to use Apple’s Vision framework in their own apps. In other words, a third-party developer will be able to make their camera app as good at detecting your furry friends as Apple’s native apps.

As for the benefit to consumers, the change could herald a new wave of App Store apps that are just as smart — and just as good at detecting dogs and cats — as Apple’s own native apps.

Animal detection isn’t the only capability baked into the Vision developer framework — the framework can also detect objects, faces, text and barcodes, horizons, and even the areas of an image where humans are most likely to look.
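To give a sense of how those capabilities are exposed, here’s a rough sketch running a few of the other Vision requests against a single image. The request class names are Apple’s; the surrounding code is a simplified illustration, not a complete app.

```swift
import Vision

// Illustrative sketch: several of the other Vision requests mentioned above,
// run against one image with a single handler.
func analyze(_ image: CGImage) throws {
    let text = VNRecognizeTextRequest()                            // text recognition (iOS 13+)
    let barcodes = VNDetectBarcodesRequest()                       // barcode detection
    let horizon = VNDetectHorizonRequest()                         // horizon angle
    let saliency = VNGenerateAttentionBasedSaliencyImageRequest()  // where people tend to look (iOS 13+)

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([text, barcodes, horizon, saliency])

    if let observation = horizon.results?.first as? VNHorizonObservation {
        print("Horizon tilted by \(observation.angle) radians")
    }
}
```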

But the fact that developer APIs like VNAnimalDetectorCat and VNAnimalDetectorDog exist in Apple’s developer documentation is too pure to pass up — as first noted by developer Frank Krueger. The APIs are even labeled as “a cat detector” and “a dog detector,” respectively.

While it may be the cutest developer-specific change in the latest batch of Apple software updates, it isn’t the only one. In fact, the Vision framework is just one of several formerly internal machine learning features that Apple has opened to third-party developers as part of Core ML.

Others include Natural Language and Speech frameworks that work similarly to Vision. And they aren’t just for iOS and iPadOS — Apple has baked the new frameworks into tvOS 13 and macOS Catalina, as well.
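As a hedged sketch of what one of those sibling frameworks looks like in practice, here’s roughly how the Natural Language framework can identify a string’s language and pick out named entities. The class names are Apple’s; the sample text and the rest of the code are illustrative assumptions.

```swift
import NaturalLanguage

// Illustrative sketch: language identification and named-entity tagging with
// the Natural Language framework (a text-focused counterpart to Vision).
let text = "Apple opened the Vision framework to developers at WWDC in San Jose."

let recognizer = NLLanguageRecognizer()
recognizer.processString(text)
print(recognizer.dominantLanguage?.rawValue ?? "unknown")   // e.g. "en"

let tagger = NLTagger(tagSchemes: [.nameType])
tagger.string = text
let names: Set<NLTag> = [.personalName, .placeName, .organizationName]
tagger.enumerateTags(in: text.startIndex..<text.endIndex,
                     unit: .word,
                     scheme: .nameType,
                     options: [.omitWhitespace, .omitPunctuation]) { tag, range in
    if let tag = tag, names.contains(tag) {
        print("\(text[range]): \(tag.rawValue)")            // e.g. "Apple: OrganizationName"
    }
    return true                                             // keep enumerating
}
```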

What this means for developers is that it’s now a lot easier for them to build these intelligent machine learning algorithms into their Apple OS apps. The animal detector, for example, can be built into an app in as little as four lines of code, as this developer presentation given at WWDC ’19 points out.
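For the curious, the detector in question is Vision’s VNRecognizeAnimalsRequest. A minimal sketch, assuming an image has already been loaded as a CGImage and with error handling trimmed, looks something like this:

```swift
import Vision

// A minimal sketch of the new animal detector in iOS 13 / macOS Catalina.
// Assumes the image is already loaded as a CGImage; errors simply propagate.
func findPets(in image: CGImage) throws {
    let request = VNRecognizeAnimalsRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    guard let observations = request.results as? [VNRecognizedObjectObservation] else { return }
    for observation in observations {
        // Each detection comes with a bounding box and labels such as "Cat" or "Dog".
        for label in observation.labels {
            print("Found \(label.identifier) (confidence \(label.confidence))")
        }
    }
}
```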
