Clinicians and doctors, take note: when it comes to honest disclosure, patients prefer a computer-simulated human to the real thing. According to a first-of-its-kind study conducted at the University of Southern California (USC) and published in the journal Computers in Human Behavior, virtual humans (VHs) elicited more honest answers from patients about their symptoms, regardless of how potentially embarrassing the information was.
Using SimSensei, USC researchers render a virtual human to interact with patients. [image via USC Institute for Creative Technologies]
The researchers, led by Gale Lucas, a social psychologist at USC's Institute for Creative Technologies (ICT), recruited 239 adults, ages 18-65, to interact with a virtual human as if they were being admitted into a clinic or hospital. Subjects interacted through SimSensei, an application developed at the ICT that renders a computer-generated person and uses real-time sensing and behavior recognition to detect signs of depression and other mental illness.
The participants were randomly told that the VH was either fully automated or remotely controlled by a human whom they would never see or meet. After analyzing video of the human-VH interactions, the researchers found that believing they were speaking with a computer program made participants more honest and open in their responses, even to personal and potentially embarrassing prompts like "Tell me about the hardest decision you've ever had to make" and "Tell me about an event, or something that you wish you could erase from your memory."
Participants who thought they were being watched were less forthcoming, with one commenting, “I wish you hadn’t told me that other people were in the other room listening in. It was weird, like, I don’t even know these people. I would have said a lot more stuff if they weren’t there.”
“Because some participants were led to believe that a researcher in the other room was watching their responses, and others were told that the very same virtual human was completely automated, we can identify that the effect in this study was driven by the belief they were interacting with just a computer,” Lucas says via email.
Lucas and her team hope that VHs will work alongside human doctors and clinicians to provide more comfort to patients in clinical settings, and perhaps even help cut provider costs. The study suggests, for example, that VHs could be used to reach patients in remote or sparsely populated locations where providing traditional health screening services would be costly.
“Virtual humans could be useful in eliciting more honest responses from, for example, cancer patients who are taking part in clinical trials or studies,” Lucas says. “However, research suggests that cancer patients can also have especially heightened fears of disclosing information to healthcare providers. Cancer patients in clinical trials might, for example, be afraid to disclose information about side effects because they worry the cancer might progress if doctors, in turn, decide to reduce or discontinue that treatment.”
But is this really a case of humans bonding with a computer-generated face, or are we simply more honest when we think we're anonymous? “In our research, the effect of being unobserved on disclosure held even though participants knew that their responses would be viewed by researchers in the future. They knew that they were being videotaped and that the tape of their session would be viewed by researchers later,” Lucas says.