Prompted by growing concerns, researchers analysed whether the way we bark out orders at digital assistants like Siri, Alexa, and Google Assistant is making us less polite.
To find out, the researchers didn't ask Siri or Alexa. Instead, they surveyed and observed 274 people, and concluded that artificially intelligent digital assistants are not making adult humans ruder to other humans.
However, "worried parents and news outlets alike have fretted about how the personification of digital assistants affects our politeness, yet we have found little reason to worry about adults becoming ruder as a result of ordering around Siri or Alexa. In other words, there is no need for adults to say "please" and "thank you" when using a digital assistant," said James Gaskin, associate professor of information systems at BYU.
Gaskin and lead author Nathan Burton had actually expected to find the opposite: that the way people treat AIs would carry over into their lives and interpersonal interactions. According to their assessment, digital assistants in their current form are not personified enough by adult users to affect human-to-human interactions.
But that may not be the case with children. Parental concerns have already prompted both Google and Amazon to make adjustments to their digital assistants, with both now offering features that thank and compliment children when they make requests politely.
In the study, presented at the Americas Conference on Information Systems, Gaskin and Burton did not examine children but assessed young adults, who generally have already formed their behavioral habits.
The researchers believe that if they repeated the study with kids, they would find different results.
They also say that as artificial intelligence becomes more anthropomorphic in form, such as the new Vector Robot, which has expressive eyes, a moving head and arm-like parts, the effects on human interactions will increase, because people will be more likely to perceive the robots as having and understanding emotion.
"The Vector Robot appears to do a good job of embodying a digital assistant in a way that is easily personifiable if we did the same type of study using a Vector Robot, I believe we would have found a much stronger effect on human interactions," Burton said.