By Tom Simonite
In October, a bombshell academic study questioned whether widely used software could cause racial bias in US health care. It found that an algorithm some providers use to prioritize access to extra help with conditions such as diabetes systematically favors white patients’ needs over those of black patients. Democratic presidential candidate and senator Cory Booker (D-New Jersey) and Senate colleague Ron Wyden (D-Oregon) are now demanding answers.
On Tuesday, Booker and Wyden released letters to the Federal Trade Commission and Centers for Medicare and Medicaid Services asking the agencies how they look for and prevent bias in health care algorithms. They asked the FTC to investigate whether decision-making algorithms discriminate against marginalized communities. The lawmakers also wrote to five of the largest health care companies asking about their internal safeguards against bias in their technology.
“In using algorithms, organizations often attempt to remove human flaws and biases,” Booker and Wyden wrote. “Unfortunately, both the people who design these complex systems, and the massive sets of data that are used, have many historical and human biases built in.” The letters were sent to health companies Blue Cross Blue Shield, Cigna Corporation, Humana, Aetna, and UnitedHealth Group.
The study that prompted Booker and Wyden’s letters found racial bias in the output of patient management software from UnitedHealth subsidiary Optum. It is used to predict the health care needs of 70 million patients across the US, but data from a major hospital showed that it understated the severity of black patients’ disease, assigning them lower scores than white patients with the same medical conditions.
That skew could have serious, even fatal, consequences, because some health systems use the scores to determine who gets access to special programs for people with complex chronic conditions such as diabetes or kidney disease. At the large academic hospital where the study was conducted, the authors calculated that the algorithm’s bias effectively reduced the proportion of black patients receiving extra help by more than half, from almost 50 percent to less than 20 percent.
Booker and Wyden are not the first to suggest those results should stir government action. Last month, New York state’s health and financial services regulators wrote a joint letter to UnitedHealth warning that “these discriminatory results, whether intentional or not, are unacceptable and are unlawful in New York.” The agencies asked the company not to use any algorithms or data analysis unless it could show they were free from racially disparate impacts. UnitedHealth declined to answer questions about how it was responding to the study that showed bias in its technology’s output, or subsequent pressure from regulators.
In April, Booker and Wyden introduced a Senate bill called the Algorithmic Accountability Act that would require organizations using automation in decision-making to evaluate their technology for discrimination. US representative Yvette Clarke (D-New York) introduced a version in the House.