observer
07-15-2008, 06:11 AM
Edna sits on an examining table ready and alert -- she wants answers about the lump in her breast.
For each of the 21 medical students who enter the room, Edna's fears are still to be discovered.
They each see the same 55-year-old woman, each meet with the same brown eyes.
They all hear the same Southern twang in her voice and the same tremor of fear when she asks if she could have cancer.
The only difference is that 12 of the students see a dark-skinned version of Edna, and the other nine students see a light-skinned version.
Edna is a computer-animated image projected life-size onto a white wall, used in a study monitoring medical students' interactions with virtual patients. The study, three of whose five authors are from UF, found that white medical students were less empathetic toward black virtual patients in one-on-one interviews.
"Bias in the real world is translating to the virtual world," said Benjamin Lok, an author of the study.
The existence of racial partiality in the medical field is a problem acknowledged since the 2002 government study "Unequal Treatment."
"We're not claiming that we have found any new bias," said Lok, an assistant professor in UF's Department of Computer and Information Science and Engineering.
The fact that bias is still present in mock doctor-patient scenarios shows that students treat virtual patients in a realistic manner, he said.
"Our goal is to try to mimic as close as possible a doctor talking to a patient," Lok said.
For the medical students who participated in the study, the interaction with Edna was made as lifelike as possible. Infrared lights and sensors allowed Edna to detect students' positions and look at them as they moved.
A hat embedded with a microphone enabled students to talk with her. Edna was programmed to offer about 200 different responses to as many as 500 questions.
After the interviews, medical and nonmedical people rated students' empathy on a scale of one to seven. The raters could not see Edna's skin tone.
Empathy scores were compared to results of a post-interview subconscious bias-detecting test created by a University of Washington psychologist.
Brent Rossen, a UF CISE graduate student and an author of the study, hopes the virtual patient program will be used as a tool to alleviate bias.
"The way to get rid of racial biases is to train them out," Rossen said. "It's not about somebody's skin tone; it's about what you're not used to."
Full Article
http://www.cnn.com/2008/HEALTH/07/09/virtual.patient.bias/index.html