A Blog by Jonathan Low


May 2, 2018

How Alexa Keeps Beating Siri As Most Useful Virtual Assistant

The difference underscores fundamentally opposing strategic belief systems: Apple's insistence on strict internal 'not-invented-here' control, versus Amazon's openness to partnering with outsiders to achieve results faster. JL 


Christina Bonnington reports in Slate:

While Apple took the early lead, its cautious approach—particularly when it comes to integrating information and functionality from third-party apps—has left it lagging. Amazon has leapt ahead thanks to its broad library of third-party developer skills. Siri's abilities are limited to apps you have downloaded that are also supported by SiriKit, and you need to specify which app you want to use. Where Siri seems stuck on the phone, Alexa has expanded inside and outside the home thanks to partnerships with third-party devices.
Tech companies are investing major resources into developing virtual assistants, making calculated bets that these digital entities will be the future of mobile computing. Apple first made the idea mainstream when it launched Siri with the iPhone 4S in 2011. Google followed swiftly with the launch of its A.I.-fueled Google Now (which has since evolved into Google Assistant) in 2012. Apple had a three-year head start on other major competitors such as Amazon’s Alexa and Microsoft’s Cortana. But while Apple took the early lead in the space, its cautious approach—particularly when it comes to integrating information and functionality from third-party apps—has left it lagging. Of all these digital assistants, Amazon has leapt ahead thanks to its broad library of third-party developer skills, and its latest updates broaden that lead even further—especially compared to Siri.
On its Alexa developer blog this week, Amazon outlined several new capabilities rolling out to Alexa users. Soon, Alexa will be able to remember information you tell her ("Alexa, remember that John's birthday is May 25") as well as maintain context across follow-up questions ("Alexa, how is the weather in New York City?" and then "What about this weekend?"). Most important, however, is a capability Amazon calls Skills Arbitration. Over the next few weeks, the company is rolling out the ability for Alexa users to automatically find, enable, and launch skills using natural phrases and requests. Alexa will use machine learning to carry out those tasks. The example the company gives is of a person asking Alexa how to remove an oil stain from a shirt. Such a query has normally been met with a response like, "Sorry, I don't know that one." With Skills Arbitration, however, Alexa instead discovers the Tide Stain Remover skill and recites instructions on how to remove the stain.
For users, this kind of experience is straightforward and frictionless. You ask a question and (as Apple used to say of its services) it just works. Until now, virtual assistants hadn't been ready for this experience. Users needed to hunt through Alexa's Skills section, download a particular skill, and then remember to summon or open that skill before using it. This technique was arguably less elegant than Apple's approach. While Siri may have fewer capabilities at her disposal—currently there are nine types of third-party apps that can utilize Siri integration—once such an app is downloaded onto your iOS device, there's no need for any additional download in order to take advantage of Siri's voice-based command center. You can simply say "Hey Siri, send Maria $5 on Venmo" or "Start a run with Runtastic." But again, Siri's abilities there are limited to apps you have downloaded that are also supported by SiriKit—and you need to specify which app you want to use when you make a query. With Amazon's Skills Arbitration update, while some functions will still require Alexa users to note which app they want to use ("Alexa, order a Lyft"), that won't always be necessary. Now, Alexa will process your query, decide if the answer lies within a particular skill, and then automatically download that skill for you if you don't already have it.

Amazon's progress has been faster and broader than Apple's with Siri. Apple kept tight control of its digital assistant following its launch, typically integrating new areas of knowledge and capabilities—things such as sports stats or the ability to find movie tickets—once a year, with new versions of iOS each fall. Save for a few exceptions, third-party developers were left out of the picture until Apple introduced SiriKit with iOS 10 in 2016. With the debut of Amazon's third-party skills, Alexa gained abilities like pizza-ordering and Uber-hailing months before Siri was able to accomplish similar feats.
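The flow described above—process the query, decide whether a skill answers it, and enable that skill automatically—can be illustrated with a minimal sketch. Amazon's actual arbitration uses machine learning over its real skill catalog; here a simple keyword-overlap score and a made-up catalog stand in, and all names (`CATALOG`, `arbitrate`) are hypothetical:

```python
# Hypothetical sketch of Skills Arbitration: route a free-form query to the
# best-matching skill in a catalog, enabling it on demand with no manual
# install step. Amazon's real system uses machine learning; a bag-of-words
# overlap score stands in here purely for illustration.

CATALOG = {
    "Tide Stain Remover": "remove oil grass wine stain shirt fabric clothes",
    "Lyft": "order ride car lyft pickup",
    "Weather": "weather forecast rain temperature",
}

enabled_skills = set()

def arbitrate(query: str, threshold: int = 2):
    """Return the best-matching skill name, auto-enabling it if needed."""
    words = set(query.lower().split())
    best, best_score = None, 0
    for skill, keywords in CATALOG.items():
        score = len(words & set(keywords.split()))
        if score > best_score:
            best, best_score = skill, score
    if best is None or best_score < threshold:
        return None  # falls back to "Sorry, I don't know that one."
    enabled_skills.add(best)  # auto-enable: the user never browses a store
    return best

print(arbitrate("how do I remove an oil stain from my shirt"))
# → Tide Stain Remover
```

The point of the sketch is the last two steps: the match and the enablement happen inside one query, which is what removes the hunt-download-summon friction the article describes.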
(Siri did beat Alexa to a handful of functions, such as the ability to place calls and send messages—an easier feat for Apple since Siri is integrated into iOS, as are iMessage and phone calling.) But where Siri's progress seems to have stalled for months at a time, Alexa has constantly gained notable new apps and functionalities—aided by the variety of APIs, SDKs, and Skills Kits that Amazon has made available to developers. And where Siri seems largely stuck on the phone, Alexa has expanded into additional areas both inside and outside the home thanks to partnerships with third-party hardware devices.
Alexa also recently expanded in one other very important way: personalization. With Blueprints, Alexa users can craft a limited set of their own customized skills—without any programming knowledge whatsoever. Blueprints is the first time a digital assistant maker has given users full control over what questions their assistant can answer, and what those answers are. Paired with Skills Arbitration, these updates may make Alexa the most useful, all-encompassing virtual assistant currently available. Unless Apple makes sweeping updates to Siri in iOS 12, the gap between these two virtual assistants will likely continue to grow.
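Conceptually, a Blueprint is a template the user fills with question-and-answer pairs; no code is involved on the user's side. The sketch below (not Amazon's API—`make_custom_skill` and the sample Q&A are invented for illustration) shows what such a template reduces to under the hood: a lookup built from user-supplied pairs, with a fallback when nothing matches.

```python
# Hypothetical sketch of a Blueprints-style custom skill: the user supplies
# question/answer pairs, and a fixed template turns them into a responder.
# No Amazon API is used; names and data here are illustrative only.

def make_custom_skill(qa_pairs: dict):
    """Build a responder from user-supplied Q&A pairs (the 'blueprint')."""
    # Normalize questions so "Where is the spare key?" matches
    # "where is the spare key" at query time.
    normalized = {q.lower().rstrip("?"): a for q, a in qa_pairs.items()}

    def respond(question: str) -> str:
        return normalized.get(question.lower().rstrip("?"),
                              "Sorry, I don't know that one.")
    return respond

house_skill = make_custom_skill({
    "Where is the spare key?": "Under the blue flowerpot.",
    "What is the Wi-Fi password?": "hunter2",
})

print(house_skill("where is the spare key"))
# → Under the blue flowerpot.
```

This is why Blueprints requires no programming knowledge: the user only ever edits the data (the Q&A pairs), never the logic.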
