From predicting who will be a repeat offender to who’s the best candidate for a job, computer algorithms are now making complex decisions in lieu of humans. But increasingly, many of these algorithms are being found to replicate the same racial, socioeconomic or gender-based biases they were built to overcome. This racial bias extends to software widely used in the healthcare industry, potentially affecting access to care for millions of Americans, according to a new study by researchers at the University of California, Berkeley, the University of Chicago Booth School of Business and Partners HealthCare in Boston.
“The algorithms encode racial bias by using health care costs to determine patient ‘risk,’ or who was most likely to benefit from care management programs,” said Dr. Ziad Obermeyer, acting associate professor of health policy and management at UC Berkeley and lead author of the paper. “Because of the structural inequalities in our health care system, blacks at a given level of health end up generating lower costs than whites. As a result, black patients were much sicker at a given level of the algorithm’s predicted risk.”
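The proxy problem Obermeyer describes can be illustrated with a toy simulation. The numbers below are hypothetical, not from the study: two groups of patients have identical distributions of health need, but one group generates lower costs at the same level of sickness. Ranking patients by predicted cost then under-selects that group relative to ranking by true need.

```python
# Toy illustration (hypothetical numbers, not from the study) of how
# ranking patients by cost instead of health need under-selects a group
# that generates lower costs at the same level of sickness.
import random

random.seed(0)

patients = []
for _ in range(10000):
    group = random.choice(["A", "B"])   # two groups, identical health distributions
    need = random.uniform(0, 10)        # true health need (higher = sicker)
    # Assumption for illustration: group B generates ~30% lower cost
    # at the same level of need (e.g., due to unequal access to care).
    cost = need * (0.7 if group == "B" else 1.0)
    patients.append((group, need, cost))

def share_of_b(selected):
    """Fraction of the selected patients who belong to group B."""
    return sum(1 for g, _, _ in selected if g == "B") / len(selected)

k = 1000  # program capacity: the top 10% are admitted

by_cost = sorted(patients, key=lambda p: p[2], reverse=True)[:k]
by_need = sorted(patients, key=lambda p: p[1], reverse=True)[:k]

print(f"share of group B admitted when ranking by cost: {share_of_b(by_cost):.2f}")
print(f"share of group B admitted when ranking by need: {share_of_b(by_need):.2f}")
```

Because group B's costs top out lower than group A's, the cost-based cutoff excludes B almost entirely, while a need-based cutoff admits both groups at roughly their population share. This is the same mechanism, in miniature, by which a cost-trained algorithm can rate a sicker black patient as lower "risk" than a healthier white one.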
The new study, published Oct. 25 in the journal Science, found that a type of software program that determines who gets access to high-risk health care management programs routinely lets healthier whites into the programs ahead of blacks who are less healthy. Fixing this bias in the algorithm could more than double the number of black patients automatically admitted to these programs, the study revealed.