Apple Reveals How Siri Manages to Stay Ahead of the Competition
The first virtual assistant on a smartphone debuted with Apple’s iPhone 4s and was called Siri. Fast forward to 2017, and lots of phone manufacturers are aiming to improve their own blend of virtual assistants, but Apple has decided to reveal why Siri still remains the strongest of them all.
Siri Speaks More Languages, Tailored for More Countries, Than Any Other Assistant – the Competition Hasn’t Come Close
Google Assistant and Microsoft’s Cortana are Siri’s closest competitors, and down the road we also expect Samsung’s Bixby and countless others to join the skirmish. Out of all of them, Siri is definitely the oldest, and Apple’s speech team head Alex Acero explained to Reuters how Siri has managed to stay a step ahead of the competition.
For one thing, Siri is able to speak 21 languages localized for 36 countries. While the United States is the largest smartphone market in terms of revenue, Apple’s hefty sales numbers around the globe are a very good reason why the company has continued to enjoy a huge level of success in the smartphone business, so supporting those regions’ native languages in Siri is going to be important.
In comparison, Microsoft’s Cortana has eight languages tailored for 13 countries, while Google’s Assistant, which debuted on the Pixel phone but has since moved to other Android devices, speaks four languages. To add to this growing list, Siri will soon start to learn Shanghainese, which, for those who are not familiar, is a dialect of Wu Chinese.
From personal experience, I have found that Siri is more accurate than Google’s speech-to-text assistant at transcribing spoken words. Even though English is the most widely spoken language, accent differences pose a huge hurdle when you attempt to communicate with a virtual assistant. While there is still significant room for improvement on Siri’s part, it makes dictation a lot simpler.
To add a new language to its compatibility list, Apple first has people read passages in a variety of accents and dialects. After collecting this record of captured sounds, the company builds a language model that tries to predict word sequences. Apple also collects a small percentage of recordings from customers, and through timely software updates, customers can expect better functionality from the virtual assistant.
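The word-sequence prediction described above is characteristic of statistical language modeling. As a rough illustration only (this is not Apple’s implementation, and a production system would use vastly larger corpora, smoothing, and neural models), a toy bigram model counts which word tends to follow which in the collected transcripts and uses those counts to predict the next word:

```python
from collections import Counter, defaultdict

def train_bigram_model(sentences):
    """Count word-pair frequencies from transcribed passages.

    Toy sketch: a real speech system would use far larger corpora,
    smoothing for unseen pairs, and more sophisticated models.
    """
    counts = defaultdict(Counter)
    for sentence in sentences:
        words = ["<s>"] + sentence.lower().split() + ["</s>"]
        for prev, cur in zip(words, words[1:]):
            counts[prev][cur] += 1
    return counts

def predict_next(counts, word):
    """Return the most frequently observed word after `word`, or None."""
    following = counts.get(word)
    if not following:
        return None
    return following.most_common(1)[0][0]

# Hypothetical mini-corpus standing in for the "read passages"
corpus = [
    "set a timer for ten minutes",
    "set a reminder for noon",
    "set an alarm for seven",
]
model = train_bigram_model(corpus)
print(predict_next(model, "set"))  # "a" — seen twice, vs. "an" once
```

Feeding such a model transcripts from many accents and dialects, as the article describes, helps the recognizer pick the likeliest word sequence when the raw audio is ambiguous.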
Apple is investing large sums of money in machine learning and AI, and our assumption is that the tech giant is looking to further improve Siri. If there is one request I would make of Apple, it is to make dictation a smoother ride than it is right now, on both the iPhone and the Mac.