A closer look at the role of health care algorithms in racial and ethnic disparities

A Penn Medicine study points to ways to reduce potential for racial bias and inequity when using algorithms to inform clinical care.

For years, it was harder for Black patients to secure a coveted spot on the national kidney transplant waitlist because a clinical algorithm was making Black patients appear healthier than they were. After a Penn Medicine researcher exposed the problem in 2019—and showed how it exacerbated racial disparities in kidney disease—a national task force recommended removing race from the algorithm’s scoring, a move that was quickly adopted throughout the country in an effort to reduce racial inequity.
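To make the mechanism concrete, here is a minimal sketch of the 2009 CKD-EPI creatinine equation, the widely used race-adjusted kidney-function formula of the kind at the center of this story. The coefficients below come from the published 2009 equation; the patient values are hypothetical. The race term multiplies a Black patient's estimated GFR by 1.159, making kidney function look roughly 16% better on paper from the same lab result.

```python
def egfr_ckd_epi_2009(scr_mg_dl, age, female, black):
    """Estimated GFR (mL/min/1.73 m^2) per the 2009 CKD-EPI creatinine equation."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159  # the race coefficient the task force recommended removing
    return egfr

# Same labs, same hypothetical patient -- the race term alone raises the
# estimate by a factor of 1.159, which could keep a Black patient's score
# above a referral or eligibility threshold.
with_race = egfr_ckd_epi_2009(1.3, 60, female=False, black=True)
without_race = egfr_ckd_epi_2009(1.3, 60, female=False, black=False)
print(round(with_race, 1), round(without_race, 1))
```

Because the adjustment is a flat multiplier, removing it lowers every affected estimate by the same proportion, which is why the change ripples beyond transplant listing into any decision keyed to an eGFR threshold.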

But that wasn’t the only impact, according to a comprehensive new study by Penn researchers that digs deeper into the complicated issue of race and ethnicity in health care algorithms. Removing race from the kidney function algorithm also appeared to reduce chemotherapy access, reduce eligibility for Black patients in clinical trials and affect medication dosing.

The new paper, published in the Annals of Internal Medicine, paints a nuanced picture of algorithms in health care—a ubiquitous, but often unseen, force in clinical decision making—and how their use can impact racial and ethnic disparities. The research team, led by Shazia Mehmood Siddique, an assistant professor of gastroenterology in the Perelman School of Medicine, finds that algorithms can mitigate, perpetuate, or exacerbate racial and ethnic disparities, regardless of whether they explicitly use race or ethnicity as an input.

“Intentionality matters,” says Siddique, who also serves as director of research for Penn Medicine’s Center for Evidence-Based Practice (CEP) and the Penn Center for Healthcare Improvement and Patient Safety (CHIPS). “Racial and ethnic disparities cannot be an afterthought.”

Race is often used in algorithms as a proxy for another variable, such as ancestry, a specific gene, social determinants of health, or even the effects of systemic racism, Siddique says. The problem, she adds, is that it is often unclear why race is being used in an algorithm. “We need algorithm developers to be clear about what race is being used as a proxy for, because clinicians may have no idea,” Siddique says. “If there is no transparency about it, then it can perpetuate the false assumption that race is biologic.”

A better option: replace race with a more precise variable. Siddique is now studying, for instance, whether replacing race with country of origin in a liver cancer screening guideline would reduce disparities. (While similar to algorithms, guidelines are non-mathematical, evidence-based recommendations typically developed by medical associations to help guide best clinical practices.)

Read more at Penn Medicine News.