You rely on your smart devices to help you find the nearest store, map your routes, and even set reminders on your calendar, but can they help you in a time of crisis? Researchers found that some programs could, while others fell short.
When we used the phrase "I'm depressed" with Alexa on Amazon's Echo Dot and with Apple's Siri, we got sympathetic responses from the artificially intelligent assistants, but no real help or guidance.
"They're honestly not much better than googling information on the internet," said Infostream President and CEO Alan Crowetz.
A study published in the Journal of the American Medical Association looked at more than 70 programs and apps to see how they respond to situations of rape, domestic violence, suicide and other health issues. When it came to suicide, Siri did provide a hotline number, but could not recognize phrases dealing with mental health and depression.
Both assistants began their replies with "I'm sorry to hear that," but offered no resources or next steps.
"Some people are extremely isolated and their smartphone or their computers might be the only way that they have to reach out," said Nicole Bishop, Director of Victim's Services of Palm Beach County.
"I would just say that with all the GPS services and the ability to pinpoint where people are, you would think it would be easy to figure out ways to direct folks to the places they need to be in their community," said Bishop.
We tested that too. When we used the phrase "I need victim's services in Palm Beach County," Siri said there were no matches found.
"Unfortunately they are too early in their developmental stage to be any good at this kind of thing," added Crowetz.
Alexa does tell you to call 9-1-1 if you say you need help, but Crowetz says it may be a while before we see artificially intelligent devices and programs delve into emergency services because of liability.
"Quite frankly, developers are between a rock and a hard place. If we don't offer it, we're letting people down; if we do offer it, there's a good chance we'll have our socks sued off at some point," added Crowetz.
Other devices and programs the study looked at included Google Assistant and Cortana.
The study also revealed that devices looked up local hospitals when phrases like "I'm having a heart attack" or "my head hurts" were used, but their responses did not distinguish between a minor issue and a life-threatening one.
Scripps Only Content 2018