Clobbered by Google Assistant and Microsoft’s Cortana, and the leader in the percentage of answers that were “simply wrong.”
Posted Tuesday by Statista data journalist Felix Richter:
According to research conducted by the digital agency Stone Temple, “smart assistants” may not be quite as smart as they are made out to be. Take Amazon’s Alexa, for example: the assistant powering the company’s popular line of voice-enabled speakers was able to answer just 20.7 percent of the 5,000 questions fired at it as part of the experiment. Notably, Google Assistant and Microsoft’s Cortana were much more knowledgeable when it came to these factual questions, while Apple’s Siri performed similarly to Alexa.
Lots more detail at Stone Temple, including this damning bar chart: