A Blog by Jonathan Low

 

Jun 25, 2020

Apple's AI Plan: A Thousand Small Conveniences

Strategically, this follows the classically successful tech implementation approach: rather than create a separate, daunting silo, put it in everything. Speed and convenience beat scientific purity every time. JL

James Vincent reports in The Verge:

Sprinkled throughout Apple’s announcements about iOS, iPadOS, and macOS were a number of features and updates that have machine learning at their heart: facial recognition for HomeKit; sleep tracking and hand washing for Apple Watch; translation; handwriting recognition; sound alerts. What these updates show is Apple’s interest in using machine learning to deliver small conveniences rather than a grand, unifying “AI” project. "Why blind them with science when you can charm them with convenience?"
AI has become an integral part of every tech company’s pitch to consumers. Fail to hype up machine learning or neural networks when unveiling a new product, and you might as well be hawking hand-cranked calculators. This can lead to overpromising. But judging by its recent WWDC performance, Apple has adopted a smarter and quieter approach.


Sprinkled throughout Apple’s announcements about iOS, iPadOS, and macOS were a number of features and updates that have machine learning at their heart. Some weren’t announced onstage, and some features that almost certainly use AI weren’t identified as such, but here’s a quick recap of the more prominent mentions that we spotted:
  • Facial recognition for HomeKit. HomeKit-enabled smart cameras will use photos you’ve tagged on your phone to identify who’s at your door and even announce them by name.
  • Native sleep tracking for the Apple Watch. This uses machine learning to classify your movements and detect when you’re sleeping. The same mechanism also allows the Apple Watch to track new activities like dancing and...
  • Handwashing. The Apple Watch detects not only the motion but also the sound of handwashing, starting a countdown timer to make sure you’re washing for as long as needed.
  • App Library suggestions. A folder in the new App Library layout will use “on-device intelligence” to show apps you’re “likely to need next.” It’s small but potentially useful.
  • Translate app. This works completely offline, thanks to on-device machine learning. It detects the languages being spoken and can even do live translations of conversations.
  • Sound alerts in iOS 14. This accessibility feature wasn’t mentioned onstage, but it will let your iPhone listen for things like doorbells, sirens, dogs barking, or babies crying (a sketch of how this kind of on-device sound detection might be wired up follows this list).
  • Handwriting recognition for iPad. This wasn’t specifically identified as an AI-powered feature, but we’d bet dollars to donuts it is. AI is fantastic at image recognition tasks, and identifying both Chinese and English characters is a fitting challenge.
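To make the idea concrete, here is a rough sketch of how an app might wire up on-device sound detection of the sort described above, using Apple's SoundAnalysis and Core ML frameworks. Apple hasn't published how the Watch or iOS features are actually implemented, and the SoundEvents model name below is a hypothetical stand-in for a bundled Core ML sound classifier, so treat this as an illustration of the general pattern rather than Apple's code.

  import AVFoundation
  import CoreML
  import SoundAnalysis

  // Receives classification results as the analyzer processes audio windows.
  final class SoundEventObserver: NSObject, SNResultsObserving {
      func request(_ request: SNRequest, didProduce result: SNResult) {
          guard let result = result as? SNClassificationResult,
                let top = result.classifications.first,
                top.confidence > 0.8 else { return }
          // e.g. "doorbell", "dog_bark", "running_water" -- the label set depends on the model.
          print("Heard \(top.identifier) (confidence \(top.confidence))")
      }

      func request(_ request: SNRequest, didFailWithError error: Error) {
          print("Sound analysis failed: \(error)")
      }
  }

  // Taps the microphone and streams buffers into an on-device classifier.
  // Requires microphone permission; "SoundEvents" is a hypothetical Core ML model in the app bundle.
  func startListeningForSoundEvents() throws -> (AVAudioEngine, SoundEventObserver) {
      let engine = AVAudioEngine()
      let format = engine.inputNode.outputFormat(forBus: 0)

      let model = try SoundEvents(configuration: MLModelConfiguration()).model
      let request = try SNClassifySoundRequest(mlModel: model)

      let observer = SoundEventObserver()
      let analyzer = SNAudioStreamAnalyzer(format: format)
      try analyzer.add(request, withObserver: observer)

      // Feed each microphone buffer to the analyzer; all inference happens on the device.
      engine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
          analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
      }

      try engine.start()
      // The caller should keep the returned engine and observer alive while detection runs.
      return (engine, observer)
  }

The salient point is that nothing in this pipeline leaves the phone: the microphone buffers go straight into a local model, which is the property the list above keeps emphasizing.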
There are absences in this list, most notably Siri, Apple’s perennially disappointing digital assistant. Although Siri is AI-heavy, it mostly got cosmetic updates this year (oh, and “20 times more facts,” whatever that means). A new interface is a welcome change for sure, but it’s small fry when you compare Siri’s overall performance with that of other AI assistants.
What these updates do show, though, is Apple’s interest in using machine learning to deliver small conveniences rather than the grand, unifying “AI” project some tech companies have promised with their digital assistants: software that would seamlessly improve your life by scheduling your calendar, preempting your commute, and so on.
This latter project was always going to fail, because AI, for all its prowess, is basically just extremely good pattern-matching software. Pattern matching has myriad uses, some of them incredibly unexpected, but it doesn’t mean computers can parse the very human complexities of something as ordinary as your calendar appointments, a task that relies on numerous unspoken rules about your priorities, routine, likes and dislikes, and more.
The best example of Apple’s approach is the new handwashing feature on the Apple Watch, which uses AI to identify when you’re scrubbing your mitts and starts a timer. It’s a small and silly feature, but one that asks little of the user while delivering a useful function.
This is a strong tactic for Apple that plays to the company’s long-held reputation — deserved or not — for delivering software that “just works.” It also avoids the sort of iterative, tech-for-tech’s-sake update that can fall flat with the average consumer, like Samsung’s Bixby.
However, there’s a risk to this approach, too. Focus too much on convenience, and you can end up overlooking customers’ need for privacy, a mistake that Amazon seems to frequently make, like when it began delivering packages inside your house. This could be a danger for Apple’s machine learning work as well, despite the company’s continued focus on privacy.
Adding facial recognition to HomeKit cameras, for example, is definitely convenient. The software even connects to the HomePod to announce guests by name like a digital butler. But how will users feel about data from their photos being used to identify people through third-party cameras? Videos will be encrypted via Apple’s “Secure Video” framework, but some might still feel queasy about the arrangement. Apple will need to manage this closely, never taking too much control out of the hands of its users, if it wants to continue walking the line between convenience and meddling.
If it can do that, though, its AI features have a greater chance of slipping unobtrusively into people’s lives. This, for the moment, is the sweet spot for machine learning. AI is too dumb to manage your schedule, but it’s smart enough to remind you to wash your hands.
