Researchers from several universities in the United States say that Siri's responses to medical emergencies and personal crises have improved over the past year, but that there is still room for improvement. A Stanford study conducted about a year ago found that assistants such as Siri, Cortana, and S Voice tended to fail users who needed medical help, answering with "frivolous comments" or simply running a web search when someone said things like "I'm depressed." Adam Miner, the lead author of the study, says things have improved since then.
According to Miner, "Siri now recognizes statements like 'I was raped' and recommends the National Sexual Assault Hotline," something it does not yet do, at least at the time of writing, if we say the same thing in Spanish. As usual, and more often than we would like, Apple's improvements arrive first in the United States, then in countries such as Canada and Australia, and only later in the rest of the world.
Siri keeps improving, but there is still work to do
Miner wants companies to create standards to recognize emergencies and offer appropriate responses:
Our team saw this as an opportunity to create health-aware virtual agents. Getting that person to the right resource is a win for everyone.
For now, every virtual assistant has room to improve. In Siri's case, it is not just a matter of better health-related responses; its artificial intelligence also needs to improve, so that it can remember context and follow the thread of a conversation. This year Apple took an important step forward with the launch of SiriKit and other SDKs, which let third-party apps handle requests such as sending a WhatsApp message by asking Siri (see the sketch below).
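To give a sense of how SiriKit plugs an app into Siri, here is a minimal, hypothetical sketch of a messaging Intents extension in Swift. The `MessageService` class and its `send` method are invented placeholders for an app's own sending logic; only the Intents framework types (`INExtension`, `INSendMessageIntent`, `INSendMessageIntentResponse`) come from Apple's public API. A real app like WhatsApp would ship something along these lines inside an app extension, not this exact code.

```swift
import Intents

// Hypothetical SiriKit Intents extension sketch.
// A messaging app registers an extension like this so that a request such as
// "Send a WhatsApp to Ana saying I'm on my way" is routed to the app.
class IntentHandler: INExtension, INSendMessageIntentHandling {

    // Siri asks the extension which object handles a given intent.
    override func handler(for intent: INIntent) -> Any {
        return self
    }

    // Siri hands over the resolved "send message" intent for the app to fulfill.
    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // MessageService is a placeholder for the app's real sending logic.
        MessageService.shared.send(text: intent.content ?? "",
                                   to: intent.recipients ?? [])

        // Report success so Siri can confirm to the user.
        completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
    }
}

// Placeholder standing in for the app's actual networking layer.
class MessageService {
    static let shared = MessageService()
    func send(text: String, to recipients: [INPerson]) {
        print("Sending \"\(text)\" to \(recipients.count) recipient(s)")
    }
}
```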
What would you like Siri to be able to do in the medium-term future?