Apple has a problem with Siri. The voice-activated virtual assistant, which has been built into Apple products since the iPhone 4S, is losing ground to the competition.
Amazon’s relentless development and marketing of its Echo and Echo Dot smart speakers have pushed its virtual digital assistant Alexa far ahead of other similar products. Now Amazon and Microsoft have agreed to link Alexa and Cortana, which resides in Windows 10.
Meanwhile, the Google Assistant, which resides on the Google Home speaker, is starting to grab market share and gain new capabilities.
While Siri is available on most iPhones and iPads, its usefulness is limited compared with the competing assistants. Worse, even as Siri's features have improved, its integration with iOS and its use of artificial intelligence haven't kept pace.
But Apple is taking action to change this situation. The company has quietly assigned Siri development to the team of Craig Federighi, Apple's senior vice president for software engineering. While Apple didn't issue a formal announcement, it was Federighi who announced changes to Siri at Apple's Worldwide Developers Conference in June. He also runs the teams that develop iOS and macOS.
Having Siri as part of the same group that develops Apple's operating systems means the virtual assistant can be much more deeply integrated into the software and hardware that Apple sells. Deeper integration will mean that Siri can control much of what iOS and macOS can do, and that it will be able to work with third-party apps that already work with those operating systems.
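To give a sense of what third-party integration already looks like in practice, the sketch below shows how an app can expose an action to Siri through Apple's SiriKit Intents framework, which shipped with iOS 10. It is a minimal sketch assuming the messaging intent domain; the class names are illustrative and this is not a description of Apple's internal plans for deeper integration.

```swift
import Foundation
import Intents

// A minimal sketch, assuming the SiriKit messaging domain (available since iOS 10).
// Conforming to INSendMessageIntentHandling lets Siri hand a spoken request such as
// "Send a message with <AppName>" to this handler. Class names are illustrative.
class SendMessageIntentHandler: NSObject, INSendMessageIntentHandling {

    // Siri calls confirm(intent:completion:) so the app can verify it is able
    // to carry out the request before anything is committed.
    func confirm(intent: INSendMessageIntent,
                 completion: @escaping (INSendMessageIntentResponse) -> Void) {
        completion(INSendMessageIntentResponse(code: .ready, userActivity: nil))
    }

    // handle(intent:completion:) performs the action and reports the outcome to Siri.
    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // A real app would pass the intent's recipients and content to its
        // messaging backend here; this sketch simply reports success.
        completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
    }
}

// The Intents extension's principal class routes supported intents to a handler.
class IntentHandler: INExtension {
    override func handler(for intent: INIntent) -> Any {
        return SendMessageIntentHandler()
    }
}
```

In this model, Siri does the speech recognition and natural-language parsing, then hands the app a structured intent to confirm and handle; deeper OS-level integration would widen the range of requests Siri can route this way.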
Federighi has said that Apple is adding machine learning to make Siri smarter. At WWDC he said that Siri would learn to recognize individual voices and preferences, making it possible for Siri to figure out what you want more quickly.
Apple's Siri team has explained in its Machine Learning Journal how such deep learning can make Siri more useful by, among other things, improving Siri's voice. At the end of the paper you can find examples of how Siri's voice has changed with each iteration of iOS, including the very natural-sounding voice in iOS 11.
But there’s a lot more to catching up with competing digital assistants than having a nice voice. Part of the process is making Siri smart enough to quickly understand your spoken requests on the first try.
Right now, Siri can do some things very well, including updating users on the latest sports scores and finding and playing their favorite music.
But there are many areas in which Siri isn't of much use. For example, I asked all three digital assistants (Siri, Cortana and Alexa) the same question: "How has machine learning changed Siri?"
Siri presented three results that covered machine learning as a general topic but included nothing about Siri itself. Alexa provided a series of machine learning facts from its skill set, but nothing more useful.
Originally published on eWeek