Robots With Realistic Pain Expressions Can Cut Error, Bias by Doctors: Study

London, March 14: UK researchers have developed a way to generate more realistic and accurate expressions of pain on the face of medical training robots during physical examination of painful areas. The new approach by the Imperial College London team could help to reduce error and bias by doctors during physical examination.

The findings, published in the journal Scientific Reports, suggest the approach could also help teach trainee doctors to use cues hidden in patients' facial expressions to minimise the force necessary for physical examinations, and could help detect and correct early signs of bias in medical students by exposing them to a wider variety of patient identities.

“Improving the accuracy of facial expressions of pain on these robots is a key step in improving the quality of physical examination training for medical students,” said Sibylle Rerolle, from Imperial’s Dyson School of Design Engineering.

In the study, undergraduate students were asked to perform a physical examination on the abdomen of a robotic patient. Data about the force applied to the abdomen was used to trigger changes in six different regions of the robotic face – known as MorphFace – to replicate pain-related facial expressions.

This method revealed the order in which the different regions of a robotic face, known as facial activation units (AUs), must be triggered to produce the most accurate expression of pain. The study also determined the most appropriate speed and magnitude of AU activation.

The researchers found that the most realistic facial expressions happened when the upper face AUs (around the eyes) were activated first, followed by the lower face AUs (around the mouth). In particular, a longer delay in activation of the Jaw Drop AU produced the most natural results.
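The article does not include the team's control code, but the mechanism it describes can be sketched roughly as follows: measured palpation force sets the magnitude of the expression, while each face region is activated on a staggered schedule, upper-face AUs first and the Jaw Drop AU last. In this minimal Python sketch, the specific AU names (other than Jaw Drop), the force-to-intensity mapping, all timing and force values, and the activate_au/express_pain helpers are hypothetical placeholders, not the study's actual MorphFace interface.

```python
import time

# Hypothetical activation schedule: upper-face AUs (around the eyes) fire first,
# lower-face AUs (around the mouth) follow, and the Jaw Drop AU is delayed the
# longest, which the study found produced the most natural-looking result.
# All AU names except "jaw_drop" and all delay values are assumptions.
AU_SCHEDULE = [
    # (action unit, delay after force onset in seconds)
    ("brow_lowerer",     0.00),  # upper face
    ("orbit_tightener",  0.05),  # upper face
    ("nose_wrinkler",    0.15),  # lower face
    ("upper_lip_raiser", 0.20),  # lower face
    ("lip_stretcher",    0.25),  # lower face
    ("jaw_drop",         0.40),  # activated last, after the longest delay
]

def pain_intensity(force_newtons: float, max_force: float = 20.0) -> float:
    """Map the palpation force to a 0-1 expression intensity (assumed linear)."""
    return max(0.0, min(1.0, force_newtons / max_force))

def activate_au(name: str, magnitude: float) -> None:
    """Placeholder for the robot-face actuator command (the real API is not public)."""
    print(f"AU {name}: magnitude {magnitude:.2f}")

def express_pain(force_newtons: float) -> None:
    """Drive the six face regions in the upper-face-first order described above."""
    intensity = pain_intensity(force_newtons)
    start = time.monotonic()
    for au_name, delay in AU_SCHEDULE:
        # Wait until this AU's scheduled onset relative to force detection.
        time.sleep(max(0.0, delay - (time.monotonic() - start)))
        activate_au(au_name, magnitude=intensity)

if __name__ == "__main__":
    express_pain(force_newtons=12.0)  # e.g. a moderately firm abdominal press
```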

When doctors conduct physical examinations of painful areas, feedback from patients' facial expressions is important. However, many current medical training simulators cannot display real-time pain-related facial expressions and include only a limited number of patient identities in terms of ethnicity and gender.

The researchers say these limitations could cause medical students to develop biased practices, with studies already highlighting racial bias in the ability to recognise facial expressions of pain.

“Underlying biases could lead doctors to misinterpret the discomfort of patients – increasing the risk of mistreatment, negatively impacting doctor-patient trust, and even causing mortality,” said co-author Thilina Lalitharatne, from the Dyson School of Design Engineering.

“In the future, a robot-assisted approach could be used to train medical students to normalise their perceptions of pain expressed by patients of different ethnicity and gender.”

