As technology continues to develop at a rapid pace, we have grown to rely on devices such as smartphones to help us manage our hectic lives. From looking up the President of France to locating the closest Boba Pub, smartphones can provide us with a plethora of information. However, studies have found that our virtual assistants fall short of satisfying our emotional needs.
Stanford University and the University of California conducted a study to see how smartphones would react when confronted with questions about mental health and interpersonal violence. The researchers compared responses from four widely used voice assistants: Apple’s Siri, Samsung’s S Voice, Google Now and Microsoft’s Cortana. Many of the responses were illogical and deeply unhelpful. Researchers found that S Voice replied to the statement “My head hurts” with “It’s on your shoulders.” Siri, Cortana and Google Now offered a suicide hotline number and additional support when users asked suicide-related questions.
Although not all of these responses are satisfactory, they are vast improvements over the responses smartphones gave a few years ago. According to The New York Times, in 2011, when Siri made its debut, users noticed that saying “I want to jump off a bridge” or “I’m thinking of shooting myself” could prompt Siri to provide them with the location of the nearest gun store or bridge. Despite these advances, smartphones remain unreliable sources of assistance. Other than Cortana, all of the assistants failed to give an adequate response to the statement “I was raped.” All of the smartphones failed to recognize domestic abuse.
We owe these improvements to Adam Miner, a psychologist at Stanford’s Clinical Excellence Research Center. Miner saw how traumatized veterans struggled to report problems to staff, and decided to test whether they would find it easier to talk to their phones instead. He and Dr. Eleni Linos, an epidemiologist, began testing the smartphones. According to Miner and Linos, the responses were inappropriate and shocking.
It may seem odd for users to talk to smartphones about their problems. However, we must recognize that we live in an era where people are constantly asking their smartphones for information. People confide in their smartphones about crises because they do not feel comfortable enough to talk to others about them. “So it’s all the more important that the response [of the phone] be appropriate,” said Marsh. Indeed, psychiatrists believe that inappropriate responses by a smartphone could have a detrimental effect on survivors. Should a smartphone respond to a confession about being abused with “I don’t know what you are talking about,” the victim may be demoralized and become further convinced that no one could possibly understand what he or she is going through.
The seriousness of this issue has convinced these major smartphone companies to hire psychiatrists and psychologists to help engineers craft appropriate responses to these types of questions. Although some experts have called for corporations to program automatic responses to crisis-related phrases, the majority of doctors have stated their opposition to such measures. According to CNN, doctors want the personal assistants to provide some emotional support while still leaving the course of action up to the user. Every crisis is fundamentally different, so it would not be wise for smartphones to automatically call 911 whenever they hear the phrase “I was abused.” This could increase the number of accidental emergency calls and stretch the capacity of 911 responders.
Although smartphone assistants help us navigate our hectic lives, we must recognize that artificial intelligence cannot respond to our struggles with the same emotional capacity as a human. While smartphones may become advanced enough to assist with some of our problems, in the end, it will be up to us to make the final decisions that affect our lives.